Online privacy: one size fits all? – PL 02 2012


14 June 2012 | 15:45-17:15
Programme overview 2012

People

Key Participants

  • Rosa Barcelo, DG INFSO, European Commission
  • Hans-Olof Lindblom, Data Inspection Board of Sweden
  • Henrik Nilsson, Member of the Swedish Bar Association and partner at Com advokatbyrå
  • Rainer Stentzel, Federal Ministry of the Interior of Germany
  • Katarzyna Szymielewicz, EDRi
  • Pat Walshe, GSMA

Moderator

  • Douwe Korff, London Metropolitan University

Transcript

Provided by: Caption First, Inc., P.O. Box 3066, Monument, CO 80132, Phone: +001-719-481-9835, www.captionfirst.com


This text is being provided in a rough draft format. Communication Access Realtime Translation (CART) is provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings.


>> OLA BERGSTROM: Okay, ladies and gentlemen, please take your seats. Thank you.

I hope you had a pleasant lunch and nice discussions at the workshops. We will now continue with the next plenary session. Online privacy, does one size fit all? I’ll leave the floor to the moderator, Douwe Korff.

>> DOUWE KORFF: Okay. Can everybody sit down, please? People at the back. Okay. Thank you.

Thank you all for showing up at this plenary. My name is Douwe Korff. I’m a human rights lawyer, and sometimes a bit of an activist, but I’ll be very quiet at this session because I’m only moderating.

We have an excellent panel for a very broad theme, and that is online privacy, one size fits all?

There are a number of issues we can bring under that, and the panelists will tell you a bit more about them. But they include questions such as: should we differentiate between high and low risk profiling? Should we differentiate between identifiable and hardly or quasi-identifiable data? What should we do about cookies? There are different kinds of cookies, as you just learned. And most importantly, when we are talking about developing standards, should we try and stick to the existing standards, in particular the Convention of the Council of Europe, or look at other instruments such as the Madrid Principles and the developing European regulation?

I’m not going to talk about it myself. I’ll just ask the panelists to, very briefly, introduce themselves and mention what more specific issues they would like us to discuss. I would like this to be an interactive session. So they will be brief, they promised me, and we will hand it over to you. And I hope that we get lots of questions, and those questions will enable them to give you more in-depth answers.

Please, panelists, from left to right, introduce yourself and then briefly say what you want to talk about.

>> RAINER STENTZEL: I’m from the German Ministry of the Interior, where I’m responsible for data protection and the reform of data protection, especially on the European level. And I’m the head of the German delegation in the Council’s DAPIX Working Party.

I’d like to talk about how it works in the Working Party and why this question is very important for us.

>> HANS-OLOF LINDBLOM: I’m chief legal advisor at the Data Inspection Board in Sweden. Next year, the Data Inspection Board will be celebrating 40 years. We had a data act already in ’73, and I think reality has changed since then. We didn’t have the Internet, we didn’t have people communicating digitally, we didn’t have digital trade, and not everything went over the net at that time.

And what I would really like to talk about today is that the online world is everybody’s reality. How do we protect the fundamental right of privacy? What rules do we really need for that? Should we have one rule for everything, or detailed handling rules, or should we focus on the different kinds of risks that come with the processing – the ordinary processing and the high-risk processing?

So that’s what I would like to talk about.

>> DOUWE KORFF: The very first law, in Hessen, was a bit before you, but that wasn’t a country. Sweden was the very first country, you are right.

>> HENRIK NILSSON: I’m an attorney, a partner in a Stockholm media law firm, specializing in issues such as privacy. I represent business interests. Typically my clients want to use more information rather than less. I like to believe I can see other sides of these arguments.

I have a background in public authorities, having worked at, among others, one of the host organisations today, the Swedish Post and Telecom Agency. And I also have a civil society background, having been a longstanding member of the Internet Society, and I’ve been on the board of the Swedish chapter of the Internet Society for a number of years.

I’d like to talk later on, when we get to our main points, about the connection to the main topic for this session, really, one size fits all?

As a lawyer, I’m supposed, I presume, to be a big friend of regulation. But, typically, I find regulation can be hindering, obstructing more than helping, many times – even though I don’t want to live in a lawless society. You wouldn’t want a traffic light on every street corner of every deserted street in every part of the country. The question is, do you want IT regulations regulating all kinds of online behavior? I think there are lots of questions to be discussed in that matter.

Thank you.

>> DOUWE KORFF: Thank you.

Rosa.

>> ROSA BARCELO: I’m Rosa Barcelo. I work for the European Commission, in the Directorate-General for the Information Society, DG INFSO. I work with online privacy, and I’m primarily responsible for that.

I wanted to say thank you now. And to Sophie, I think she is online. As to what I would like to hear about – rather than answering that directly, I wanted to ask: have you sometimes heard these headlines saying that the law killed digital? And then, on the other hand: well, finally, Europe got rid of cookies. I’m sure you might have identified that this is Article 5.3 of the e-Privacy Directive. It talks about cookies. It requires consent to place a cookie on a telephone or computer and provides some exceptions.
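To make the mechanics of that rule concrete, here is a minimal sketch of the Article 5.3 logic as just described – consent before non-essential cookies, with some exemptions. The category names and the function are hypothetical illustrations, not taken from the directive's text.

```python
# Minimal sketch of the Article 5.3 logic described above: a site may
# set cookies that are strictly necessary for the requested service,
# but needs the user's consent for anything else. Category names and
# the function are hypothetical illustrations, not the directive's text.

STRICTLY_NECESSARY = {"session", "shopping_cart"}   # exempt from consent
NON_ESSENTIAL = {"analytics", "advertising"}        # consent required

def cookies_allowed(user_consented: bool) -> set:
    """Decide which cookie categories may be set for this visitor."""
    allowed = set(STRICTLY_NECESSARY)
    if user_consented:
        allowed |= NON_ESSENTIAL
    return allowed

print(cookies_allowed(user_consented=False))  # only the exempt categories
print(cookies_allowed(user_consented=True))   # all categories
```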

So I am personally attached to this topic. We worked a lot on it. So I would like to hear a lot about the privacy concerns that originated it. So the past and the present: what are we seeing, what is being done to comply, what is the state of implementation?

In the UK we have seen a lot of pop-ups, splash windows asking for consent, that were not there before. Which types of consent are we seeing? And then I’d like to look at the future. Are information and consent a good tool for this type of privacy? Does this concept fulfill what originated the rule? So that would be my choice.

>> DOUWE KORFF: Good topics. Thank you.

>> KATARZYNA SZYMIELEWICZ: Hello. I’m Kasha for short. I run a digital rights organisation that is part of EDRi, and I’m here as an EDRi representative. So I hope I can present you with a civil society perspective. I would like to talk about why we need a very strong, coherent framework on data protection, at least in Europe, because globally it might be too much to ask, and why that framework should cover US-based companies. So I would like to talk about the existing proposals, including the modernization of Convention 108. But I’m also keen to talk about profiling as the great challenge for our society, probably the biggest challenge for data protection in the future, including the cookie debate.

>> PAT WALSHE: I’m with the GSMA, the GSM Association. That’s a global trade body for mobile phone operators around the world.

I wanted to say I enjoyed the Nordic Youth delegation video this morning. It was great. And it’s great to see what the Internet is now, and that older people have to be included in the digital society – I consider myself included. But some of the things I want to talk about today – and I’ve known Rosa for a long time – are that, from a mobile perspective, it’s clearly global, but it’s extremely complex. And I think the key thing for me is that it’s governed by a patchwork of geographically bound laws. But data flows and relationships are dynamic, immediate, now. And they transcend the borders of those geographic laws.

My expectations, needs and wants don’t stop at the English channel when I download an app that was then provided by someone in Timbuktu. So one app might, for example, involve four different parties, taking my device data.

To go back to the point that Rosa raised about Article 5.3: it’s accessing information stored on subscriber equipment, which is my device, and it’s sending it across the world, and other parties are sharing it for advertising and other purposes. So what do we do about providing people with capabilities? First to understand that there are privacy consequences, but also, what do we do about giving them simple, clear tools to participate in the management of their privacy? And, you know, I’ll talk a bit about some of the work that we have done as an association. These are privacy design guidelines for mobile apps that we have members implementing at the moment. I’d like to touch on some of the bigger picture issues that I think we’re in danger of losing sight of. Am I concerned about the fact that Amazon knew, or knows, that I watched a video last night? I quite like that fact. I like that personalization.

But I think we have to recognize the fact that there is data generated that has significant utility value in terms of delivering social and economic good, and I’d like to talk about some of that later on if I get a chance, including the question of how we anonymize this data. I’m concerned that the proposals in the European regulation will in fact burden users, not just business – burden the user of a mobile device so much that they would disengage, both from engaging online and from engaging in managing their privacy. And I’d like to touch on that, because I think it’s flawed in many ways.

>> DOUWE KORFF: That ties in with the issue of what constitutes consent. When do you need it or not need it? There is a menu of topics that you can choose from. Before I start with that, I have to admit, Rosa, that I’m an impostor. Sophie should have been here. Hello, Sophie, if you can hear us. I’m trying to represent you and be as polite as you are. Thank you very much.

Could I open the floor, both here and – where are the remote watchers? You are the remote moderator? There. Okay. In that case, I’ll open the floor here first. A question from you.

>> MODERATOR: Without a microphone?

>> DOUWE KORFF: No. There should be a microphone. It’s on its way. I made the same error in the previous meeting. Say who you are, and then put the question on the record so it can go up.

>> AUDIENCE: I don’t want to be annoying by coming in at every panel, but they keep touching on my issues. I’m Alexander Alvaro, I’m Vice President of the European Parliament, and I was the draftsperson who did the e-Privacy Directive – especially Article 5.3, which has been mentioned, and which is sometimes called the cookie law.

In two ways, it shows the hysteria in the industry. It was never targeted to deal with cookies. Now that it’s been done, I can say what our intention was. What we wanted to do – because we had the discussion in Germany about a Trojan which could be installed by the state on personal computers – was to avoid the state somehow being able to introduce some sort of spyware on your computer. But at that time our interior minister was smarter than I thought, and he found out. The problem, then, was that we didn’t want to regulate cookies in these terms at all. And that’s why it was foreseen in the discussions to include in the article a passage saying that by confirming in a browser or similar technology, one time for many cases, that you would accept cookies, the requirement of consent would be fulfilled.

At that time the Czech presidency and the Commission, which were negotiating it, didn’t get the implication of what would happen if we didn’t do that. Now we have a lot of implementation problems in the Member States. So Article 5.3 has to be read in conjunction with – I believe recital 66 or 76 of that directive, I’m not sure – where this is explained to citizens; it should have been in the article.

And just to wrap it up: that actually shows what happens when bad legislation, due to a certain lack of knowledge in one of the European institutions, hits the public – it creates chaos. So we actually should take time and have more experts when working on legislative issues like these, because then the problem of Article 5.3 would never have arisen.

>> DOUWE KORFF: Thank you very much. And we will put things like that to the panel like the needs for technical expertise in law and lawmakers. I think it’s an important issue.

If I may collect one or more issues before I turn to the panel. Any further questions right now? Please raise your hands. Everything is already clear? We have no questions about cookies at all? That’s amazing.

Okay. In that case, I’ll ask the panel right now to address this issue of how we can make sure that legislation actually relates to the technologies as they really operate in practice. Because I do think that is a big problem. Maybe we can already start discussing this issue of consent.

Can you say people have really consented by accepting certain defaults? I can ask anybody on the panel who wants to pick up those issues.

>> RAINER STENTZEL: We have to think about the environment for which we want to draft our law. When we started with privacy and data protection law in Europe, we had not taken the Internet and its whole environment into account, because it came later. And now we have the Internet in almost every area of life. If we drive a car, we are connected to the Internet. For electric smart grids and network management, we need the Internet. The Internet will be part of our life everywhere. And we use the Internet; we have our computers with us. And so if we are regulating privacy and the processing of data now, we are regulating almost everything.

And we are not only regulating the things we have in mind, like Google or Facebook and cookies; we are regulating the bakery just next to us, and we are regulating Mrs. Lindqvist, and public authorities across every sector. And that is a very big problem. If we would like to have one law, technologically neutral in a sense, we reach our limits. And if you say we would like a strong regulation on data protection applied to all the areas that we have, that means we would like a strong regulation for most parts of our life. And that might cause a problem. So I think it is the right direction to say, well, we shouldn’t focus just on the processing of one single set of data, and then look at whether we can slice it into some stages of processing – collecting, and further on – and then try to work out who is responsible for it, who is the controller and the processor and these things.

These things don’t work in our world as we have it now, with clouds and everything – well, it might work, but it’s getting harder from day to day. And I think that is the problem and the challenge. If we would like to have a new law or a modern law, we should focus more on how data processing actually affects our privacy.

Because we had a huge debate on Google Street View one or two years ago in Germany, and it was not really affecting my private sphere. It was about the house, and everybody could take a photo of this house. It was just public. But people were complaining about it as a problem of privacy.

On the other hand, we have all kinds of profiling. We don’t see how Google is profiling, for example, our searches and what they are doing with that kind of data. And this might affect my private sphere very heavily. So I think this should be the focus: to have a risk-based model, and maybe to figure out some measures to assess how much the private sphere is really affected or not. And this must be the future. We cannot, as lawmakers, just have a look at the environment and say: well, this is the new technical development, and for the next ten years it will be the basis.

>> DOUWE KORFF: Are you going back to sectoral rather than universal data protection rules?

>> RAINER STENTZEL: No, not necessarily. I think we need general rules, that is clear. If we cover almost everything, and every piece of data has a relation to a person, we must have general rules. But that doesn’t mean that we do not also need detailed rules for risky processing of data.

>> DOUWE KORFF: I’ll ask your fellow panel members if they agree with this approach. Let’s take the data protection rules and criteria. They are pretty well accepted as the general standard. What do we put underneath it? Pat?

>> PAT WALSHE: Well, you raised the point of Germany. I think we have one of the founding fathers of data protection regulation, the German DPA in Hesse in the mid ’80s, who recognized the importance of context. He was saying – and he wrote about it in 1987 – that informed consent depends on the socioeconomic context of the activity. He was writing this in 1987; here we are today, still struggling to come to terms with what that means. Even today’s policy presumes that I can negotiate my privacy in a binary, one-off way, and I cannot. Because my privacy interests, which are my expectations, needs and wants, emerge at different points in a relationship with a company, with a service, with an application.

So, for example, if I want to know where the nearest ATM is because I want cash, and I say okay and use the app, do I really need a cumbersome, legalistic privacy policy that tells me: thank you for asking where the cash machine is, but we want you to read this notice and then consent?

You know, on a notice on my device the other day, I had to update terms and conditions in a privacy policy and it amounted to a little over 21,000 words. I was a brunette when I started reading it. It took that long to get to the end. And we can’t do that.

So, if I come back to Germany: location. We have three sets of law in Germany. We have the main Data Protection Act, the Telemedia Act and, if you’re a telecom operator, the Telecommunications Act. Which means that the way the law is implemented in that particular state denies citizens access to services. What do I mean by that? I mean that if you’re in a particular sector, the privacy compliance barriers are so high that you do not get into that market.

And that is what is happening in Germany in relation to location privacy. If you’re a mobile operator, you have to get written consent. You have to send a text message after the user has been located five times. But if you use GPS or wireless LAN, that is okay. Well, how can that be? I believe that if you treat functionally equivalent data in functionally equivalent ways and give people consistent privacy experiences, they will become familiar with the fact that there are implications and consequences of engaging in things, and with how they can manage that.

So we have to look back at context again. Because, you know, my interests are shaped, and they are shaped at multiple layers and levels.

>> DOUWE KORFF: Don’t you have the problem that if you say you have to look at context, everybody is going to scream: my context is so innocuous, I can get away with everything; regulate him tightly?

>> PAT WALSHE: What the law says is that you have to help them become aware. You have to give them choice. Those words already exist. It’s the fact that they haven’t been implemented in a way that is meaningful to consumers that has got us into this situation.

Let me, if I can, look at the proposal in the EU data protection regulation. There are proposed sanctions. One of the sanctions, for example: if you are a global company, you can be fined half a percent of your global turnover for failing to meet the information provisions of the regulation – for failing, under articles 12 and 14, to provide this very long list of information about who you are, what data, what specific data, who you are sharing data with, whether the data is going overseas.

So what is going to happen? If you were subject to a possible fine of half a percent of your global turnover, would you make sure that you do give someone the 21,000-word privacy policy, or would you think: just leave that bit to the side? So there is a danger that we overburden users with information in which they just drown, which means that rather than enhancing privacy, we will undermine privacy. People will get as bored as I do with cookie notices – not another one – and you click, click, click and go through. So, actually, it undermines privacy.

So I think we have to think differently about this. If this is about giving people the capabilities to make informed decisions in a meaningful way, then what I see at the moment doesn’t facilitate that.

>> DOUWE KORFF: Can I ask a regulator, perhaps: how do you deal with this? Because if you say you need consent – in order to give consent, I need information. But if you give all the information, people are not going to read it, and therefore they click, saying they have given consent, and it’s a pure formality. How do you deal with that as a regulator?

>> HANS-OLOF LINDBLOM: Maybe we have solved those questions in some cases, really. Because we have had, since 2007, simplified rules concerning unstructured processing of personal data. For instance, when you’re processing personal data on the Internet, or e-mailing a mate or friend, also at the office, the simple rule says that you’re allowed to do that without taking any notice of the detailed handling rules for structured processing of personal data. You’re allowed to do that as long as you don’t invade somebody else’s privacy.

And that is a simplified rule for the everyday ordinary processing.

And our experience in Sweden – and I guess you have it as well in other countries – is that legislation must always reflect today’s values. And it must be rules that you can abide by. I think Swedes are very law-abiding persons; they really do want to do the right, lawful thing. If you look at today’s detailed handling rules, it’s very hard to comply with those rules. If I’m writing an e-mail to you, telling you about Henrik or something, well, I’m processing personal data about Henrik. Do I need his consent? Do I have to inform him, and so on?

And by these simplified rules, we make this ordinary, non-risky processing legal, as long as you don’t intrude on his privacy.

>> DOUWE KORFF: That was of course initiated by the French. They have operated this since the late ’70s: the simplified norms, mostly for the public sector of course. They were useful, because if you stuck to the simplified rules, you were exempt from all kinds of formalities. You say: I’m acting in accordance with the standing rules, and everybody understood. Is that a good way to do it, or is it different from the way they do it in Sweden?

>> I wouldn’t say it’s the same as the French have. I think the French, generally speaking, are very complicated.

>> DOUWE KORFF: We all know that.

(Laughter and applause)

>> I think the Swedish notion is more that you don’t have to think of all the detailed rules. You just have to think: do I violate any person’s privacy or not? I mean, you have to think for yourself. You don’t have to think: do I have to give an information notice to that person? You should think: well, I should make a balance of interests by myself. Do I have a legitimate interest in processing this data about him? Or would Henrik mind if I put anything about him on the Internet, and so on? These are very easy rules for data processing.

Another thing is that you should really focus the detailed rules on the sensitive issues, like the profiling business, big companies keeping everybody under surveillance, or data retention. That is risky business.

But at the same time, alongside those more simplified rules in Sweden, we have the ordinary ones for companies’ more sophisticated personal filing systems. But the problem that we see today is when individuals don’t want to do the right thing, and defame each other on the Internet and so on.

And the legislation we have for data protection doesn’t help us there. I mean, we can’t really help them in those cases. So that is a problem today, and how should we deal with that? Maybe some other legislation, more about protecting private life or something, more criminal –

>> DOUWE KORFF: Do you think the draft regulation goes in the right direction, in the light of what you just said?

>> I can say that we welcome the Commission’s proposal concerning the strengthening of individuals’ rights and so on.

>> DOUWE KORFF: Right.

>> But we do miss the simplified rules for the everyday processing.

>> DOUWE KORFF: Okay.

>> I mean, you have to be able to live by the rules. And we see problems with that. But there are very good points in the proposal. I mean, privacy by design – you were talking about that – that’s great. Why not have privacy built into devices? And we have the privacy impact assessments, yes, that is good. And for the worst scenario, some sanctions, okay.

And we also believe that harmonizing the rules for Europe is a good thing. I mean, it creates a lot of problems today when there are different rules in Germany compared to France and the Netherlands and Sweden and so on. That makes it difficult. We welcome the –

>> DOUWE KORFF: And the strengthened role of what is going to be a Data Protection Board, to issue more specific guidance in particular areas.

Please, anybody there who wants to come in, let’s just make it a conversation. Please put your hand up. Would that help? How happy would you be with a light touch for non-sensitive processing and a stricter hand where it becomes sensitive?

>> KATARZYNA SZYMIELEWICZ: I’m trying to understand your question.

>> DOUWE KORFF: Risky processing against not so risky processing.

>> KATARZYNA SZYMIELEWICZ: Like separating the two?

>> DOUWE KORFF: Light regulation on the one and strong regulation on the other.

>> KATARZYNA SZYMIELEWICZ: I don’t really think there is any point in opposing light against heavy regulation; the question is what is effective. What I’d rather see is a coherent, principle-based framework, whether that is a regulation or a directive. I would prefer a regulation, because it solves many issues with interpretation, covering all aspects of data processing, whether high risk or low risk.

But I would like to come back to the discussion that started here: how to shape a good law that keeps pace with technology on the one hand, and what kind of data protection we want in general – a sectoral or a more general framework – on the other. I think these are two debates, but with a strong analogy. We can get it right only if we base it on sound principles. If we go into details, if we try to regulate every possible piece of human activity, or if we try to predict every piece of technological development, we go nowhere with this.

Because we create a difficult framework that cannot be applied by businesses or citizens. So I think we can easily go into the absurd talking about lengthy privacy policies. We can go into the absurd by talking about regulating every new technology developed. But we have to get the principles right – that’s why I talk about the coherent, strong framework. And I think cookies might be an example of how we went into absurd discussions: whether we want to click for every cookie or not, instead of talking about the principle, which should be the discussion about consent. Do we want consent for every privacy-related technical development, or only and exclusively for data processing? Is profiling data processing? What is the definition of consent? This is the discussion that we should be having.

I’m giving this example to show that I’m against that kind of detailed, very fractured discussion about technicalities, which loses the points we should be making here.

>> HENRIK NILSSON: I’d like to go back to the question of one-size-fits-all legislation. I’m generally suspicious of trying to solve problems by regulation. I think we need to be humble in our attempts at social engineering.

It was interesting to hear the Member of the European Parliament discuss how the latest legislation went wrong in the cookie example and came out in a way that was not intended. And we have had a very interesting review here from the GSM Association about the difficulties of the consent concept – how do you really gauge that?

And then we hear from Hans of the data protection authority that one way to sidestep that is to exempt large areas of common life from the regulations. And I think you could push this further by trying to push the power to the individual. If you have a strong emphasis on transparency requirements, if you really push out the information about such processing as takes place, then in many cases you move the power over to the user, to the data subject. The more information that is available, the better possibility the data subject has to make an informed choice. And so, rather than having to gauge somehow whether this individual has consented or not, you leave it to the person that is impacted.

If you get the proper information about the processing taking place, you have quite a large opportunity to behave in different ways in response to it, or not be bothered by the impact of the processing. I think this really helps to cut away a lot of difficult analysis, if you can empower the user.

>> DOUWE KORFF: Transparency in some ways is more important than formal consent.

>> Yes.

>> DOUWE KORFF: I’ll give you a chance, but there were some remote participation questions, and I think it’s good to involve as many people outside of the panel as possible. Please.

>> MODERATOR: Thank you. We have a question from a remote hub in Ukraine, from Edward. Many companies make extensive use of cloud services, and the companies have a need to store data. He wants to know whether (inaudible) –

>> DOUWE KORFF: He wants to know whether? It’s about cloud data. But what does he want to know about cloud data?

>> MODERATOR: He wants to know whether the companies have a need to store customer data, and whether that regulation of processing of personal data is low or have – is low, the implementation of cloud services.

>> DOUWE KORFF: The cloud – cloud data is a very good topic to bring up. I’m happy to take that up, even if I don’t completely understand the question. Access to cloud data is an important one. I’ll pick it up and we will deal with it in a minute.

I haven’t forgotten you two. But there is a question there, if the microphone can go to the back. The gentleman with the white hair. I hope you don’t mind me saying that. At the back.

>> AUDIENCE: Thank you. My name is Bertrand de La Chapelle. I’m with the International Diplomatic Academy.

One element: at the moment we are using personal data as a general term. Aren’t we going in a direction where there are many, many different types of data? To make a very short list: you have the data recorded, for instance, when you register on a platform. It can be credit card information and your personal information, like name and address and so on.

But then you have very, very sensitive data, like health records in some situations, or financial records.

But then you also have the behavioral data collected through cookies and others.

And the last category – and there are many others, I suppose – is the data people post themselves, especially on social media platforms. I wonder, in the latter case, whether we should be talking about a new definition of intimacy. It is something that we are posting voluntarily, and the questions are slightly different.

The second thing that I would like to raise and ask the panel about: on a social media platform, the terms of service are actually the law of the digital territory. As long as you are on the platform, it’s basically the terms of service that apply.

Is there any direction where we can have, on the one hand, better organized scrutiny of the terms of service by civil society entities or supervising groups, which would take the responsibility of saying: okay, the privacy part of these terms of service is acceptable? Then the consent of the users would be easier to get, because they would trust one or another validation group.

>> DOUWE KORFF: Okay.

>> BERTRAND DE LA CHAPELLE: The final element: in certain cases, shouldn’t there be dispute resolution mechanisms on the platforms themselves? I don’t know whether at the moment you have a way of recourse towards the platform instead of going to a court or a data protection authority. So: internal dispute resolution mechanisms.

>> DOUWE KORFF: Thank you very much. I think what we’re getting, both from the Ukrainian intervention and from here, is more complexity. We find that the processing of personal data, or data generally, is getting more and more complicated over the Internet, more and more happening in the cloud rather than in a territorially defined space. And you have to make distinctions between very risky and not-so-risky processing. How do you come up with a regulation that covers all of that and is manageable and technology-neutral and technology-proof at the same time? I’m happy that I don’t have to answer that question.

Rosa, you can answer that.

>> ROSA BARCELO: I want to make two points. One is about applying a different framework, a different set of rules, for different types of data, as several people here have suggested. If you look at the proposed regulation, it already contains the seeds of that. For example, having a different set of legal grounds for processing sensitive data and normal data – that is one.

And then privacy impact assessments: you don’t always have to do a privacy impact assessment. There are certain cases, where there is an enhanced risk to privacy, in which you have to. So it already has this notion. It’s perfectible; it can be strengthened now that it goes to the Parliament and the Council. This can be further expanded, in my view. But it’s already there.

The second point: I would like to go back to Alexander, who I think made two very good points. First is this notion of how policymaking is being done. He was saying that when the e-Privacy Directive was passed, not enough was done on the cookie provision, and I tend to agree that it was done very quickly. And, Alexander, this provision was not put in by the European Commission. If you look at the proposal from the Commission, it was not there. It was added by the Parliament – by one of the Committees, I think, I’m not sure. So these discussions should have happened there, and they didn’t.

But I would like to say that the Commission, in proposing the draft regulation, has committed to assess whether the e-Privacy Directive is still fit and combines well with the regulation. In this context, it’s a good opportunity to do what Alexander was suggesting.

Look at it again. Last week the Article 29 Working Party issued an opinion on the exceptions to Article 5.3. If you read it carefully, you see in one of the parts that the Commission, in revising the e-Privacy Directive, should look and see whether these exceptions are too narrowly defined. Maybe there is a bunch of cookies that should not be covered. And it’s something that we should all be doing together.

And the final point is on consent. When you look at consent, there is something that says we can consent through browsers. And I think it is not yet explored how new technologies can allow us to give consent that fulfills the three criteria – freely given, specific, informed. And I would like to mention something that is happening right now which is promising: there is a standardization initiative on Do Not Track being discussed at the moment. This would enable your browser to send a signal to all websites: I want to be tracked, or I don’t want to be tracked. It might turn out good or bad, but this is a direction that I think is worth looking at, and we should think about how we could use these technologies.
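To make that concrete, here is a minimal sketch of the Do Not Track mechanism as described: the browser attaches a one-bit header to each web request, and each site decides whether to honour it. The handler function and header values are illustrative assumptions, not part of any finished standard.

```python
# Minimal sketch of the Do Not Track (DNT) signal described above.
# The browser adds a one-bit header to every HTTP request; each
# server then decides how to honour it. Names here are illustrative.

def honours_dnt(request_headers: dict) -> bool:
    """Return True if the client asked not to be tracked."""
    # In the W3C drafts of the time, "DNT: 1" means "do not track me";
    # absence of the header means no preference was expressed.
    return request_headers.get("DNT") == "1"

# Example request, as a browser with DNT enabled would send it.
request = {
    "Host": "example.com",
    "User-Agent": "ExampleBrowser/1.0",
    "DNT": "1",   # the user-set tracking preference
}

if honours_dnt(request):
    print("Skip setting tracking cookies for this visitor.")
else:
    print("No tracking preference expressed.")
```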

>> DOUWE KORFF: Pat and then Kasha and then a question from the room.

>> PAT WALSHE: To that point, Rosa, I think it’s already out of date before it’s off and flying. Do Not Track is still mostly about browsers, but on a device there are so many other ways in which data is accessed and collected and tracked and profiled and targeted.

So will DNT fix that? I think those failures are driving innovation in privacy. So you have Mozilla designing a system called Boot to Gecko, a mobile OS, and you’ll be able to set DNT at the heart of the OS, and I think that’s a very exciting thing.

But that has to work in cooperation with the industry. What is going to happen if an industry has built up systems and invested lots of money in being able to communicate with the customer, and suddenly here is a system that says you can’t?

So there is a bit of negotiation going on there, and settling down.

I want to touch on the other points: we welcome the regulation. The directive has been implemented in 27 different ways, and it’s a nightmare if you operate across European Member States to comply and deal with it. So you have one regulation – that is great. It needs to be much simpler than it is. You have the principle of accountability; I think it’s unfortunate that more wasn’t made of that, and that so many burdens have been imposed on data controllers and on the regulators. It’s estimated that it will cost 28 million pounds every year to comply with the new obligations.

But then I have to say: we have a regulation, lovely. But then we still have the e-Privacy Directive, which has been implemented in 27 different ways. One of the key public policy objectives of the European Commission is to manage urban traffic flow, to reduce fuel consumption and air pollution and noise pollution. That achieves many things, and most importantly it’s a social good for individuals who live in those polluted environments. But that needs data to support it. If I’m driving in and out across France and Belgium and Germany, which laws apply? How do you work through this, and why do we need two separate pieces of legislation?

>> DOUWE KORFF: I couldn’t agree more. You cannot have a regulation and leave the e-Privacy Directive untouched. But I’m sure the Commission will get there.

>> KATARZYNA SZYMIELEWICZ: Three quick comments on the cookie debate in general. Talking about consent and the browser: I’m afraid we no longer have a real debate on whether that should be the browser or something else, because nobody can conceive of a mechanism based on clicking for every single cookie. But the real debate, which I see in Poland, is how we define consent. The industry is extremely wary of agreeing that it should be an active behavior. What we see now is the Government implementing the directive in a way that admits that doing nothing with my browser means I consent. That’s a dangerous trick that has happened in the debate, and I draw your attention to it, because I think that is where the real problem lies.

The second issue: transparency versus consent. Well, we could argue for transparency if we could agree that users are already empowered to change services, to choose different services, to go somewhere else. That is not really the case. We are in a situation where we are more or less trapped within certain business models and services, and that’s why I think we really have to work on the consent basis, in order to empower users who, just given the information, can really do nothing.

And that brings me to the broader point on high-risk and low-risk data. Well, as a customer, as a user, I really don’t know what happens with my data and whether the processing of my data becomes high risk for me, in the sense that I do not know what kind of profiling exercises are being done with my data and whether that becomes risky for my personal life. It’s not as easy as when it’s health –

>> DOUWE KORFF: I agree. That’s a very good point. But the lady has been very patient. I think it’s her turn. Please.

>> AUDIENCE: I’m –

>> DOUWE KORFF: You have to say who you are.

>> AUDIENCE: I’m Ana Olmos, from Spain. A session on this topic was held at the Spanish IGF, and I’m happy with the last intervention; she mentioned practically everything that I wanted to say.

First, consent: consent has to be available in different formats for different people. We have to distinguish between different things. Not all cookies are the same. Not all cookies serve the same purpose. There are so many differences between a cookie that is establishing a session and a cookie that is tracking my movements from one web page to another, across different behaviors. Users need to be informed of these differences and be aware of them.
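To illustrate the distinction just drawn, here is a minimal sketch contrasting a first-party session cookie with a third-party tracking cookie; all names, domains and values are hypothetical examples.

```python
# Illustrative sketch of the two cookie types contrasted above.
# All names, domains and values are hypothetical examples.
from http.cookies import SimpleCookie

# 1) A session cookie: scoped to the site being visited and discarded
#    when the browser closes (no Expires/Max-Age attribute set).
session = SimpleCookie()
session["sid"] = "a3f9c1"             # opaque session identifier
session["sid"]["path"] = "/"
session["sid"]["httponly"] = True     # not readable by page scripts

# 2) A tracking cookie: set by a third-party ad domain embedded in
#    many sites, and long-lived so visits can be correlated over time.
tracker = SimpleCookie()
tracker["uid"] = "u-77421"            # stable cross-site identifier
tracker["uid"]["domain"] = "ads.example-network.com"
tracker["uid"]["max-age"] = 60 * 60 * 24 * 365 * 2   # two years

print(session.output())   # e.g. Set-Cookie: sid=a3f9c1; HttpOnly; Path=/
print(tracker.output())
```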

When we are talking about transparency and educated users, we need them to understand what giving away their data means – big data, like you were saying, profiling. And the fact that one piece of data, taken separately, may not be a big issue; but if you are compiling a lot of them together, then it might become one. Those are very, very important issues.

But then we also have to take into account the business models that are built, and what the impact of all of these decisions is on business models – not necessarily on the ones that already exist; we have to think of innovation and ways for innovation. The problem when we regulate things in an absolute manner is that we may not have imagined what other uses may come out of a certain technology, a certain application or a certain practice. So whatever the measure, it should empower users, make informed users, be transparent about consequences, not only facts – not only that this is being picked up. What is it being picked up for? What can it be used for? Who is going to have access to this? These kinds of questions need to be answerable.

I agree that the way to do this is the real challenge. Even if we agree on the principles, making it happen is not going to be easy.

>> DOUWE KORFF: Anybody else from the room? Raise your hand if you want to participate. Just wait another minute. We have another participant there.

>> I’m from the Swedish Ministry of Enterprise, Energy and Communications. I didn’t mean to pick this up just after Mr. Alvaro spoke, but since the Commission brought it up: the cookie regulation was not a suggestion from the Commission, but it was not a suggestion from the Council either. I know that both the Czech and Swedish presidencies tried to get it taken out in the Council. But I might have that a little wrong, so we should sit down and straighten this thing out.

And if it was a mistake – if it was a mistake – then this review coming up in just a couple of years’ time is an opportunity to correct it.

>> DOUWE KORFF: We should deal with how we can get it right for the future. And I think we are getting there. We are moving towards a mixture. We say we need basic principles – we seem to agree on that. We need differentiation between types of risk, and Rosa pointed out that the regulation provides for that. We need privacy by design. Data protection seals I would like to bring in. And it would be interesting to hear something about what the Commission might think of actually doing – giving yourselves the power to do something, or possibly not do something, in the regulation with privacy seals and certifications of privacy by design. That would be useful.

But before that, I’m going to give Rainer the possibility to say something on that.

>> RAINER STENTZEL: I want to come back to defining risky processing. You asked: should it be the data subject, the person, who assesses whether the data processing is risky or not? I would say no, it shouldn’t be the data subject, and consent shouldn’t be the tool – consent shouldn’t be the only solution for risky processing. The question is how to define risky processing and how to find indicators for it. And one suggestion in the proposal was to talk about different types of data.

Well, in Germany, we were never friends of a closed catalog of types of data, because it always depends on the context. Take health data: when I broke my leg, everybody could see it; and I’m wearing glasses, so there must be something wrong with my eyes. So there are indicators for sensitive data, but it’s not a closed shop, not a closed catalog.

So are there other indicators? Maybe one indicator might be, as was said, when it comes to intimacy, or spheres of private life. That is an interesting approach, and we should think more about these indicators and how to implement them into our legal system. Well, most of the discussions about privacy take five minutes, and then we are talking about consent or not, and cookies and all these things. But we need more imagination to create our new data protection law for the future. It’s not only consent. It’s not only separating data and trying to find solutions for whether this one is the controller or the processor or another one.

The question from the Ukraine about the cloud was a good question, I think. This was daily work in the Working Group.

What does this rule mean, for example, for cloud computing? Nobody can answer some of these questions, because when it comes to cloud computing, who is the controller? Am I the controller, as someone who is putting something into the cloud? Is it Apple, or whoever is serving the cloud? Or is it the company that is managing the server? Is it third-country processing, because the cloud is in – well, maybe it is in the European Union, but maybe the data flows all over the world because it’s net-based? These are very crucial questions, and with the set of tools we have in our existing law, we cannot handle these questions for the future. And so we need more imagination to think about new indicators for the new law.

And the last point –

>> DOUWE KORFF: That is risky. Professor Simitis was always against the category of sensitive data.

>> RAINER STENTZEL: The last point, on harmonization. I think on a legal basis it’s right and necessary. We have it already with the existing directive – so says the European Court of Justice. But it’s not a question only of legal harmonization. It’s more crucial when it comes to the interpretation of the law. And there we have the situation that we are all depending on the data protection authorities, the DPAs, in 27 Member States. We in Germany have some experience with 17 DPAs, and they do not interpret the law in the same way. So nothing is won if we have a harmonized law but very different, independent interpretations of it. And so I think the goal of the Commission is right: we need some kind of harmonized interpretation. The question is whether it should be the Commission who has the last word on it.

But there is a problem here: we have a tension between harmonization of the interpretation of the law and the independence of the data protection authorities. And we have to find a solution to this question as well.

>> DOUWE KORFF: Very important one. I’m going to pass on to Pat. You had a comment?

>> PAT WALSHE: Well, it was in relation to consent. I agree that we need shades of consent. There is a concern that the regulation requires explicit consent and says that data controllers must retain evidence of consent. Well, if you have to retain evidence of consent, why have explicit consent? One seems to undermine the other. And I think you’re right, and so was the gentleman in the audience earlier: the world we live in today, this complex ecosystem, has created new categories of data and new contexts that give rise to privacy implications and privacy consequences that can’t be prescribed for in law. And I think that is where context becomes more important. We have been looking at ways – we brought engineers and others together and have been working with academics across the world – to have privacy icons, rather than the regulation, which I think is a hard form of paternalism, to help people make decisions when it matters. And who are we to say when it matters? For example, take the article on profiling. It says that you must notify individuals of the envisaged effects of profiling. Okay, if I’m profiling someone for a credit check, one effect can be that if they fail it, they won’t get a mobile phone. But I can’t envisage how the lenders would react to that fact. So, if I haven’t communicated all of that, I could also be subjected to a fine. Some of these things are far too subjective to be meaningful for the individual or for the data controller. So this comes back down to consent: how can someone consent to something that they have no chance of understanding in the first place, because the law is gobbledygook?

>> DOUWE KORFF: Sometimes people who use the profile don’t understand the algorithm because it’s complicated.

>> PAT WALSHE: We have proposals for the definitions of personal data to include more types of data. After the earthquake in Haiti, the ability to use data was crucial to understand where people had moved to, so as to deliver life-saving aid to them. If that happened here, under the proposed rules, it would be a challenge to use that data in a way that delivers that very real, urgent social good. And I think we have to be careful, don’t we?

>> DOUWE KORFF: One second – Rosa first, and then Kasha.

>> ROSA BARCELO: One comment is on transparency – transparency and communication. I completely agree that it’s key, and I would like to give an example in the context of cookies and online advertising. The round table on OBA, online behavioral advertising, in the United States is working to provide an icon. This is supposed to show people: this is an ad, and this ad has been served to you on the basis of a profile. If we could reach a situation where people, when seeing this icon, understand it the same way as the sign for the toilet – everybody knows what it means when they see it – then we would have achieved transparency. We are not there yet, but I think it’s a long process.

On privacy by design, I would like to play the devil’s advocate. Privacy by design is the obligation to build privacy principles into processes and technologies. Well, almost everybody in the industry agrees that this is a good principle. Now I would like to ask: how would you enforce this principle if you don’t have underlying rules that define what it means in each case?

So if you only have privacy by design and nothing more, how can you enforce it? But then, when you define underlying rules, you enter into technology. You have to define technology, and you have a problem. So it’s very difficult to reconcile these two: privacy by design while staying technologically neutral is really, really hard.

>> DOUWE KORFF: It might be good exactly to follow up on this point. You have an answer and you have an answer. Please.

>> Well, it depends on the sanctions, really. But privacy by design – a couple of days ago we had a good example of what we should call privacy by design. A company was collecting a lot of data on every car owner in this country. They used data from the statistics authority concerning the median income in the area where the owner of the car lives, and so on. And they encrypted all these data, so they had data that did not pinpoint the individual person, but the kind of car you have and what area you live in. And then they have this very sophisticated device, looking like a camera – but it’s not a normal camera, because you can’t generate any picture from it. The camera can recognize the registration number on your car, and that will be encrypted, or hashed as we say in Sweden, together with the other data, and they will recognize: oh, that is a car of the model, say, Mercedes-Benz, and the owner lives in an area with a certain median income, and they can show some commercial on a screen which pops up when you enter the garage with your car. Very sophisticated. They weren’t able to go back from the encrypted data and see who that individual was, and it became a very good example of privacy by design. They couldn’t live up to the demands in the directive: if a car owner were to request information – what personal information are you processing about me? – they couldn’t answer that.

And that was the good thing, that was the privacy by design.

So I think it could be possible to enforce that type of thing.

>> DOUWE KORFF: Okay.

>> Without detailed rules.
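As a rough illustration of the hashing scheme described in that example, here is a minimal sketch, with entirely hypothetical plate numbers and demographics: a keyed one-way hash lets the device match later sightings of the same car without being able to recover the owner’s identity.

```python
# Minimal sketch of the privacy-by-design scheme described above,
# with hypothetical values. The registration number is replaced by a
# keyed one-way hash: records can be matched to later sightings of
# the same car, but not reversed to identify the individual owner.
import hashlib
import hmac

SECRET_KEY = b"device-local-secret"   # assumed to stay inside the camera

def pseudonymize_plate(plate: str) -> str:
    """Return a keyed hash of a registration number (not reversible)."""
    return hmac.new(SECRET_KEY, plate.encode(), hashlib.sha256).hexdigest()

# What the device stores: no plate, only the hash plus coarse,
# non-identifying attributes joined from public statistics.
record = {
    "plate_hash": pseudonymize_plate("ABC123"),   # hypothetical plate
    "car_model": "Mercedes-Benz",
    "area_income_band": "median",                 # from statistics authority
}

# A later sighting of the same car yields the same hash, so the garage
# can pick an ad for that profile without knowing who the driver is.
assert record["plate_hash"] == pseudonymize_plate("ABC123")
print(record)
```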

>> DOUWE KORFF: We have an intervention and we have a remote intervention.

>> MODERATOR: We’re going to – close your microphone. We are trying to figure it out. Edward’s question from Ukraine: he says that online interaction with customer service departments is accompanied by hidden transfers of data. Edward believes that software developers should be required to display to the user each transfer of customer data. Does the panel agree?

>> DOUWE KORFF: Okay. This is about hidden transfers of data. In other words, you think you’re just talking to one person, but in fact all kinds of information get passed beyond that, for secondary purposes. I think it’s interesting.

So, that kind of system should have something built in along the lines that you have identified. To what extent you could do that, that’s a story in itself.

Good. Anything more – Kasha and Rosa – on privacy by design?

>> KATARZYNA SZYMIELEWICZ: I’m going back to points made by Rainer and Pat. On the point that the definitions and concepts need to be revised, I cannot agree more. The concepts that we have nowadays are not relevant to the digital reality, so here we do agree.

But one more example of that, for me, is the concept of risky or not risky, sensitive or not sensitive data. To repeat my point and make it clearer: not only health or race or sexual orientation data are sensitive. Your non-sensitive data might become problematic if somebody is performing complex profiling and traces back your sexual orientation or your religion or health issues on the basis of, on the surface, unrelated data.

So, just to make the point: technology gives us so many possibilities of processing data that the very concept of risky versus non-risky is, I think, flawed and old-fashioned, and we should look at it from a different perspective. I’m not saying that I should judge this as an individual, not at all. It’s just more complex than it used to be.

On profiling: the definition in the proposed regulation is not sufficient, because it does not cover mere profiling as an activity performed on my data and on categories of data; it only refers to the results. That is not enough. We should be able, as individuals, to control profiling in itself, with exceptions, just as we have with simple data processing. There might be exceptions for law enforcement or emergency situations. But as a principle, we should be able to control these processes, because they do affect our private life. They have a lot of discriminatory potential, a lot of potential consequences for society. And it’s an increasingly serious issue.

>> DOUWE KORFF: Rosa and then you, Pat.

>> ROSA BARCELO: On privacy by design, I would like to put a question in general. The following: imagine that the Irish data protection authority, which is responsible for Facebook, because Facebook is established in Ireland, decides that privacy by design means the default settings on the social network should not share anything. But then you decide the opposite. So, if you don’t know exactly what it means, you might create in the internal market an immediate... I leave it there.

>> DOUWE KORFF: Okay.

>> Just a quick response to that. According to the proposal, that would be an example of a case that the data protection board would have to take up, to decide the consistent interpretation of the law and what measures to take.

>> DOUWE KORFF: The regulation has a good process.

>> PAT WALSHE: Just a couple of things. One on design and default, on Rosa’s and Kasha’s point. Alexander, we met a year ago and we talked about technology; we talked about a GPS-enabled camera. You can go and buy a camera that can connect to the Internet. It says on the side of the box: always know where you were when you took this photo. And it says on the side of the box that this is a geotagging camera. Should the default be off or on? So I was surprised when a regulator in the room said it should be off. But it says on the side of the box, and that’s why I bought it. You would have to be an idiot not to realise it is there to capture your location data. And it’s not covered by the law, because it sits outside of the legal framework at the moment. That’s where codes of conduct come in; that is an example of what they can do.

And then, in relation to profiling: the way mobile technology works, you have to understand who is connected to one of your base stations, so that if that base station is reaching capacity, you can shift people to another base station. You have to understand that so that you can connect calls. So, would that sit within the exemption concept you were thinking of in terms of profiling? Because that is a legitimate business need; it’s a commonly accepted practice. We need new ways of thinking about this. Otherwise, people will be denied opportunities and services.

>> DOUWE KORFF: Now, that –

>> You have to decide what the opportunities are.

>> DOUWE KORFF: That brings us back to the question: are the existing standards good enough, at least as a starting point? Can we improve on Convention 108 and the EU regulation? Or do we need a different approach altogether?

We have been talking a lot so far, and we have spent three quarters of our time finding out how difficult things are. I think everybody agrees it’s very difficult.

I would like you to concentrate on the solutions and, in particular, on whether the solution lies in new developments, like the Madrid Principles and the like, or perhaps the Obama initiative, or whether it lies in the EU regulation, or in my old friend Convention 108. What do you think is the answer? I’ll just go around like that, and we will finish our time with that. One and a half minutes each.

>> Well, what is the solution?

I think one of the problems we have now is that it’s not only companies and public authorities that are regulated, as in former times, when they had the big computer in the cellar and enough money to process data. Now we are all regulated, because we all have our little computers in our pockets and so on. So, if we are all covered by this law, along with all or many of our activities, the Swedish approach might be the solution for dealing with this problem. Many things are covered by the law now that we do not really need to regulate. If we look at cases like Lindqvist, many people will agree that this is not really a problem of privacy and data protection; there are other problems. And so I think we need this distinction between simplified rules and specific rules, and then we have to think about what a specific rule should be. What kind of categories should we build or design a set of specific rules for?

For example, President Obama’s administration is focusing on the private sector, on companies. I think that is a good distinction, public versus private. And then they make a further distinction, between providers or companies on the one hand and consumers on the other. That is also fair, because they have different roles in the online environment.

And so I think we have to think about categories for specific rules, and one of them should be, I think, consumers versus companies or providers. Then we have to think about different accountabilities and responsibilities.

>> DOUWE KORFF: Can that be done on the basis of existing rules or a new conceptual framework?

>> We are now in the legislative process with the regulation and the whole package, and we are thinking about how to improve the proposals; maybe we could add some rules or change some rules. But this is an ongoing process. We are in the middle of it.

>> DOUWE KORFF: Okay.

>> I couldn’t argue against what my colleague is saying here. I don’t think I can add much, except that we have to conclude that we need rules that we can abide by and that reflect today’s values. And we should welcome the modernization of the framework, because it dates back to what we were thinking in the ’80s, and the world doesn’t look the same.

Convention 108 also dates back to the ’80s. Good luck, I say; it’s a great challenge to find adequate rules for this.

But we should also think about focusing on the threats, the risks and the misuse, and how to handle those.

>> DOUWE KORFF: That’s a fair point.

>> RAINER STENTZEL: We have been talking about how difficult it is, because it is difficult. I think the first response to that would be to take it slower. From a legal perspective, we can parse the texts and go back to the sources to find out the meaning of different concepts. But in a policy discussion, I find it fascinating that everybody has a clear idea of what privacy is. I think what privacy is remains extremely subjective. It’s completely outrageous to think that you can define it in such a way as to make a meaningful regulation. I think you have to be very, very careful in setting your standards here. An overarching topic of the seminar is: what is privacy? I think that varies enormously, so please be careful in regulating this. Obviously, people are not that afraid of these risks, considering their exposure to the potential privacy consequences. Everybody has a potential surveillance device in their pocket. Facebook has close to a billion members. This is a sign to me that people are not overly concerned. It’s not that they are ignorant. I’m sure there are ignorant people, but I think it’s a matter of having some kind of idea that this is a risk they are prepared to take.

>> ROSA BARCELO: I think the Commission’s approach in the proposal, of having big principles, the big lines, in the regulation, is the way forward to modernize. But the regulation alone will not give all the answers. There will be a lot of need for additional tools, probably non-legislative, and some guidance by the board. If we want to keep a regulation that is completely technology neutral, we will need to fill in the gaps, the details, in other ways. It could be codes of conduct; it could be other tools that complete the picture, in order to answer how to do it in each specific case. That’s how I see the whole picture.

>> DOUWE KORFF: Thank you. Kasha.

>> KATARZYNA SZYMIELEWICZ: It’s easier to start from what will not solve the problem, and I think self-regulation will not solve the problem. We cannot think of self-regulation as the answer for the future of data protection. Of course we can think of it as an additional tool, and I agree with Rosa that it might be a good tool for complementing standards. But the standards have to be there, in the law, and the more general, flexible and basic they are, the better for us. We can never build a law that gives precise answers to every single question; we will have to rely on the courts, maybe on the European Commission, maybe on the privacy boards. That is obvious; interpretation will always be there. So I would focus on the principles. In that context, of course, I strongly advocate for the proposed regulation and the modernization of Convention 108, with a specific focus on avoiding loopholes.

We need to make the regulation work and not have circumvention, especially by European companies. I see it as a basic requirement to regulate US-based companies under European law, which we are trying to do now, but we have to do it successfully.

And a final point on the privacy issue. Privacy is a difficult concept. Maybe we should move slowly towards the concept of controlling information: it is my freedom, my autonomy in controlling information, that is at stake. Privacy might be subjective, but as a citizen I need clear standards to be able to control my information. It can be as simple as that.

>> DOUWE KORFF: Data protection is about more than privacy.

We have reached 17:50.

>> PAT WALSHE: I’d like to make the most of all of that, if I could. We produced a set of mobile privacy principles because we have members in regions where there are no data protection and privacy frameworks, but where consumers, mobile users, have the same privacy expectations and concerns that we see here. I agree, I think it’s important that the law applies to companies overseas that capture and process the data of European citizens; absolutely, we support that. And I think there is an opportunity here in the regulation and in the initiatives that I see: you’ve got GPEN, you have APEC, and 19 Latin American countries that came together to establish a privacy enforcement network. I think we need to utilize those mechanisms.

I’m afraid that I do believe in self-regulation, but I believe it should complement basic principles. Because if we say that it’s all about the law, and you don’t have a code of practice or a guideline, then, you know, do we really want to be going through this process again in three years, when we go through another technological revolution?

Trust me, these things are sensor-enabled. This thing knows which way I’m facing. It knows where I’ve been. It knows where I’m not, which is quite important. So this raises questions about what location data means, and that can be dealt with by those tools, not by the law.

>> DOUWE KORFF: Thank you, everybody, for being here. Those of you whose hands I haven’t seen, my apologies. I hope you found it interesting. I think we haven’t resolved all data protection issues in one and a half hours, but we clarified how important data protection is, especially in a globalized world with cloud computing. We found out that the drafting of the regulation is a fantastic opportunity to go back to absolute basics on everything. However, we mustn’t forget the basic principles that were first established in Convention 108. And I should mention the original drafter; the poor man is dead now, but his work has stood the test of time. So we have something concrete to build on, but also a lot of very complex issues to resolve, and we need to resolve them in many different fora and at many different levels. Whatever framework we come up with has to be flexible enough to deal with technological change and the enormous scope in which this operates. Many of you have come to these meetings for many years, and we will need every one of you to participate in the ongoing debate as well.

Thank you for being here and let’s continue the debate for the rest of the evening. Thank you.

(Applause)

>> OLA BERGSTROM: Thank you very much.