The privacy standards that we want – WS 01 2011

From EuroDIG Wiki

30 May 2011 | 15:00-16:30
Programme overview 2011

Session teaser

Data protection legal frameworks are currently under review in several international fora, with a view to meeting the challenges resulting from globalisation as well as from the increasing emergence and use of new technologies. Tomorrow's legal frameworks should be able to protect privacy with regard to new IT developments, irrespective of where they are used. EuroDIG can discuss the shape of the legal frameworks to come, addressing challenges and delivering messages.

People

Key Participants

  • Marie Georges, Council of Europe
  • Katarzyna Szymielewicz, Panoptykon Foundation
  • Nevena Ruzić, Commissioner for Information of Public Importance and Personal Data Protection (Serbia)
  • Milan Nikolić, Telenor

Co-moderators

  • Sophie Kwasny, Council of Europe
  • Sorina Teleanu, Parliament of Romania

Session report

Data autonomy: right to oblivion and consent

A common feature that was underlined for both issues is the need to raise awareness of users’ rights and responsibilities (i.e. the consequences of their actions), and the functioning of the system they are using. This should be a shared responsibility between public authorities, industry and civil society.

The “right to oblivion” should be safeguarded and different regimes should be established depending on the purpose of the related data processing. Users who publish their personal data on the Internet should have the possibility to have it deleted (despite the possible current technical difficulties).

Consent cannot be the sole legal basis on which to process personal data, as it is not always necessary (it depends on the type of data concerned and the purpose of the processing at stake) and because it may not necessarily be free. Individuals should be informed about the processing (purpose, duration, etc.) in order either to give informed consent or to exercise their right of control.

Effectiveness: global standards and chain of actors

The multiplicity and lack of transparency of the layers of actors involved in the design and implementation of equipment and data processing, together with the variety of jurisdictions, suggest that users might not be as protected as they should be. Agreeing on common values and principles at the global level, taking into account possible regional differences, would enhance their protection.

Users are a key part of the chain of actors. While industry considers that users should individually act upon their privacy without systematically relying on other actors' responsibilities, it was recalled that respect for privacy and personal data is a human right which must be protected and respected by all those who process data. This chain of actors also comprises non-technical actors, namely the regulators and supervisory authorities, who play a key role in ensuring adequate protection of individuals and should therefore cooperate with other agencies and stakeholders.

Freedom and privacy in service-driven architectures: behavioural targeting, search engines and social networks

Data increasingly appears to be the price users pay for services; yet users should be given a real choice to give away their data or, on the contrary, to object to the collection and processing of their data. There is a tension between the right to privacy and new business models, a tension which should not lead to diminished protection of privacy. The trust relationship between businesses and users was stressed: this trust is a vital element of business continuity, and, depending on the nature of the service and the business model, some providers will prefer not to store data beyond what is necessary.

Transcript

Provided by: Caption First, Inc., P.O. Box 3066, Monument, CO 80132, Phone: +001-719-481-9835, www.captionfirst.com


This text is being provided in a rough draft format. Communication Access Realtime Translation (CART) is provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings.


>> MODERATOR: Good afternoon everybody. We’re going to start. We just have a problem with the reiteration we are trying to sort out. This is the workshop about privacy standards. What is the idea? The idea is to tackle the new notion of privacy. The new principles, which are in the field, knowing that several international organisations are currently reviewing their framework, their legal framework.

We wanted to see the opportunity of this multi-stakeholder platform to have your views on what privacy should be. And we have a panel that represents experts in the field. So I’ll have the pleasure to introduce you to our panelists.

>> MODERATOR: Good afternoon. I start with Marie Georges, who has experience as a privacy expert. She has been working in the promotion of the issues of freedom in IT, internationally, and working on the drafting of the EU directive on data protection. And she is here as the Council of Europe expert.

We have a representative of civil society, and if this is not the right pronunciation, I'm sorry: Katarzyna Szymielewicz, Executive Director of an NGO defending human rights in the context of modern surveillance.

Then we have Nevena Ruzic. She is head of the Office of Public Information here in Serbia.

And we have an industry representative. These are our speakers for today.

>> MODERATOR: One thing concerning the theme, the privacy standards that we want: as you can imagine, it's very, very large. So we had to make choices. The organizing team thought that if we wanted the session to be relevant, if we wanted exchanges that focus on some issues, we would have to identify themes. There are three themes that we would like the panelists to address, and we will see what your views are on those three themes that we will present to you.

>> MODERATOR: So we have three themes. The first one is the right to oblivion and consent. We will have the speakers speak about it, and then we will take questions from the floor.

The second one is effectiveness: global standards and the chain of actors.

And the third one is behavioural targeting, search engines and social networks. We will start with the first topic, the right to oblivion and consent, and I invite Marie to start. Marie, please.

>> MARIE GEORGES: Good afternoon. Yes, in the world consultation that the Council of Europe is conducting at the moment, we have over 50 responses so far, from many, many countries, even from outside Europe: the United States, Senegal, Mexico and others. So it is a world consultation.

And one question that the Council of Europe asked was whether people thought we should add to the Convention a principle about the right to oblivion. For the last year it has been a hyped-up topic, not always very well understood, on the international agenda and at the last IGF meeting: the right to be forgotten.

And, of course, this comes from the fact that we are now empowered to publish a lot of things: to publish ourselves, to publish our work, and to be published by others. And because, on the Internet, many things can remain for years and years, the question was raised in this context.

I would just like to point out that I won't give you a complete answer today; that is not my purpose. I just want to react and to discuss why this question comes up.

I would like to draw your attention to the fact that in democracies, for many, many years, 100 years at least, there has been a right to oblivion. For instance, you can't sue someone after a certain number of years. In which areas? If it's a very important case, you normally have a lot of time. But it can be three years, it can be ten years.

And I think the only common situation in which there is no limit is crimes against humanity. So you can sue someone even after 60 years. But in other cases, there is a time limit.

Also, in many countries, you have amnesty, decided by consensus, by law. It is designed to reconcile people with what has been done in certain cases. We forget it. We can no longer say about a person that she has done this or that, and so forth.

So, you see, in a democracy you do not carry all your life what you did 20 or 30 years ago; you may change your ideas. This is something quite important.

Of course, this comparison is not complete, because there is a need for proceedings in many cases. But it has something to do with some of the principles that we already have in the Convention.

For instance, you can't keep data for more time than is necessary for the purpose. People have the right of access to their data, to correct them, and to delete them. Of course, it depends on the circumstances. You know, it's a basic right.

And then you have to apply it according to the circumstances, the legitimate purpose for which the data were kept. And you have obligations regarding historical archives.

In some countries, it is decided by law which data or documents will be kept, and for how long. Nobody can access them except for some research, and then later they are made public. You see, we have all these nuances.

So, the reason for the consultation is that some people are convinced that there is now a need for a right to oblivion. I don't think any constitution has it in, but the concept is at that level.

And some others say we have to still think about it. They didn’t say no. They said we have to continue to think about it.

Of course, there are some very precise situations where you can already do things. If you write something and it is published on the Internet, people who want to put it online have to ask for your consent. That already exists.

If you publish something yourself, you should have the right to take it off. When I say that, it is because, I mean, it is clear, I think, in Europe and worldwide, that this is not the case. There are services which keep the information.

If someone publishes something about you and you don't agree, there is already a question of freedom of speech. But there is a balance to strike. You can ask the person to withdraw the information. If the person does not, you have to go to court. In our democracies, the limits of free speech are judged by courts. So you see, there are already some situations covered.

But of course it can be massive and it can be difficult for an individual to do it, you know, for a lot of cases.

Anyhow, you see how far the consent of the individual goes, and this is also a question, because consent as a legal basis was not in the Convention. So the question is: shall we add in the Convention, alongside the question of legitimate purpose, a list of situations in which there is a legitimate basis? Over the years, we have seen that consent is one of those bases. But there can be abuse of consent, right?

So, the responses we had to the consultation showed that people think we should not, in the future, give too much emphasis to consent. There are very few situations in which data are collected on this basis, and it should of course be informed consent that you can withdraw at any time. But there are situations in which you are the only one who can choose.

So, that is what I could say under this topic.

>> MODERATOR: Thank you, Marie.

Now we move to civil society. Katarzyna Szymielewicz.

>> KATARZYNA SZYMIELEWICZ: A few points, repeating the basics, especially on consent. From my perspective the issue of consent is urgent, of course, because of how much information flows online, which is obviously relevant in this context. But I have to say it: every click we make and every move that we make online can create a trace, sometimes a digital fingerprint, and the big question is what we consent to, what we accept as essential features of the Internet, and what could happen. I think far too much information flows without the awareness of users. So we need to work more on making users aware of the data flows that they generate in their day-to-day use of the Internet.

And the question arises if we see this as a data protection issue, if we agree that sometimes our digital fingerprint can identify us; sometimes it is simply personal data.

How do we go about consent without destroying the Internet? I think it is obvious that we don't want a situation where every click that we make comes with a consent prompt, with a pop-up that blocks us from using the Internet as we want.

But on the other hand, the solution is not what we have now: leaving people in complete ignorance and depriving them of the right to consent.

So, how do we deal with that? I think this is precisely the exercise we have to face, and we have to try to figure out together how to do it well. I would be considering something like a pop-up window that comes up once in a while, where the user is forced to choose his or her options: for example, what type of information we consent to give, what purposes we consent to, what types of entities we consent to in terms of processing; decisions like that.

At present on the market, I think we see two solutions being proposed. One is the classic terms and conditions: social networking sites offer us 40-page-long terms and conditions. This is not the right solution. I can't imagine any of us reading long documents every time we want to give informed consent.

On the other hand, in the advertising business we have different ideas of having icons and very concise, symbolic information that should be conveyed to users. This is another extreme. And I would say we need something in between: information that users can digest, but information that really gives them an idea of what the concerns are, what the implications are if you give your consent.

Users should be able to say: I don't want to give my data for this service. And we need to think about the method; I think methods like "do not track me" are less efficient.

We don't have educated users. I'm not talking about this room, but most people outside of this room don't know what flows online. If we don't force them to make choices, if we don't put them in some kind of decision situation, we will have a situation of ignorance.

About the right to oblivion, two quick points. I totally agree that we need a simple principle: the data that I have placed online I have the right to withdraw, and the service provider has no counter-argument, apart from one. Have in mind that there is something like data retention, so we shouldn't be able to delete our account immediately and forever if there are reasons like law enforcement; that is one rationale for keeping data for some time after an account is deleted.

And the second thing I want to underline is caching. It's not only the services where we put our data; it's the copies of Web sites and archives that are created online, mostly by search engines. This is a separate issue that needs to be addressed if we want the right to be effective.

>> MODERATOR: Thank you for this very clear view. Now we move to the data protection authority view. Nevena, please.

>> NEVENA RUZIC: Trying to continue from what was mentioned: she referred to the educated user. That is very important, and I will try to combine both consent and the right to oblivion, because there is a common feature there: both require a positive action by the individual user. Only in rare cases is the right to oblivion triggered automatically, as in court cases, like the right to rehabilitation, or the deletion of files in some other cases of data processing. But this demands a certain amount of knowledge from the user. So consent should be specific and free.

And the right to oblivion also demands that the user start some action and demand the deletion of the data; he or she has a right to be forgotten, and forgiven, perhaps.

We need an educated user. How do we do that? We do that jointly. So there is a role for the State: to regulate, but also to protect. And most importantly, and this is an element we sometimes forget, to educate. This is a positive action of the State, and it is highly necessary. Of course, this cannot be done only by State authorities. The law always lags behind, so the authorities have to work together with technical people, mathematicians and programmers. So it has to be done as a joint venture. But it's not only about technical staff or civil servants; it's also about engineers and human rights activists who think about the right to oblivion and consent. They both have a meaning.

If there is something positive as a general principle behind it, it's not only about developing a standard but also about thinking in general how to implement it.

And this is sometimes more difficult than trying to define a sentence or two about the meaning of consent or the right to oblivion.

>> MODERATOR: Thank you. We heard about users, governments, State authorities. Let's hear the industry view.

>> Thank you. How do we make the Internet forget information I have trustfully given to it?

Mostly, the responsibility lies at the entry point. That means we have to practically enforce basic personal data protection principles at the entry point; that is, to minimize the collection and processing of my personal information.

Then what is most important? Restricting dissemination to third parties. Because if my information goes without any control from the entry point into the cloud, there is no way I can stop it anymore. Of course we have to give full information to the data subject about for what purposes and how the personal information will be used, and of course create security measures at entry service points. We have to have processes, people and technology in place at the entry point. I think all other efforts to force the Internet to forget are a waste of time and resources, because you can't control the information unless you stop it at the entry point.

All of these principles are already in legislation. So what we are talking about is an implementation challenge: how to implement this. That is the major challenge in all similar situations. From an industry point of view, and from a service provider point of view, confidence and trust are the cornerstone of a long-term sustainable business. And once, as a provider, you lose it, you can practically close down your business for a long period of time, until the public forgets.

And as we know, we are talking about how to make the cloud forget, and the Internet never forgets. So once you blow it, you're already in big trouble.

So for service providers, full protection of user privacy is not only a compliance and reputation issue; it is practically a business sustainability issue, and serious service providers are well aware of that. They are investing a lot of effort in how to protect the privacy of the user.

So my advice to all people, when they ask me how to make the Internet forget, is: well, find a credible service provider. That is a major step for you.

Because credible service providers act very rationally. They don't retain personal information in huge volumes or over long periods of time, because they have no interest in that and, again, it's against the law.

And then credible service providers, they don’t have hidden agendas with personal data. They don’t trade with them. They use it just for their basic business. And of course serious service providers, they play ethically.

So the question is how to assess the credibility of the service provider. That is very challenging, because not every user is trained and aware that he needs to check the credibility of the provider.

And then the Internet can be so seductive, tempting us with different offers. Sometimes we are willing to gamble with the credibility of the provider in order to get something that is offered on the Internet and looks very, very nice.

And then again, we don't have KPIs. We don't have benchmarks. We don't have quality data available to the public, so that we can say this provider is credible and this one is not. So my advice would be: go for a brand name, look at the history of the provider, and then really think about the right to protection in the industry, because it is, I think, beneficial both for industry and for users. And a blacklist is worth considering.

>> MODERATOR: Now I open the floor for questions. Questions? Yes.

>> My name is Rainer Stentzel, from the German Ministry of the Interior. I have two remarks on what you said about the right to oblivion and maybe the challenges of data protection and privacy in Europe and the Council of Europe. Well, whether it's good or bad that the Internet doesn't forget depends on the case and the perspective. The case you mentioned was that someone committed a crime, and it's okay that the State and the law enforcement agencies can prosecute it for maybe ten years. After that, it is, well, not forgotten, but it is not possible anymore to prosecute it. Now take the perspective of the victim. What should the victim do? After ten years, is it still allowed to write on a blog about the crime, about someone who ruined their life, or anything else?

I think it’s very complicated to find one single legal rule or principle which deals with the right to oblivion or forgetting on the Internet.

So the German Ministry of the Interior has just started a contest with three categories. One is about awareness, education of the user; but not education by us, the State, but education by the community itself. So everyone is welcome to bring in ideas and make people aware of what it means that the Internet will never forget anything.

The second category is about social behavior and rules. I think it's much more important that we think about social behaviors, that we learn how to deal with phenomena like the Internet not forgetting, and that in the way we behave on the Internet we say: well, after a while some information is no longer that important. Because, I mean, everyone has some information about everyone on the Internet, and it's getting normal that it is there. Maybe.

And the third category is about technical solutions; we are looking for something like, well, a digital eraser or something like that. Maybe there are ideas out there.

The second point: as you said, what is the best educated user? If we think about privacy, you would say: well, think twice before you put information on the Internet, because it's there forever, and maybe it's better for your privacy not to write on blogs and social networks and so on.

But from the privacy perspective, what is the ideal user? The one who doesn't use the Internet? And I think that can't be the solution.

And the real challenge we have now, compared to 1990 and 1995 when we started with the directive, is that we no longer have a clear distinction between companies and users, people like me and you. Because we have all become, in terms of the directive, controllers. If we use a social network, we are processing data, and if we take the directive literally, we should be subject to oversight by the data protection authorities.

We have to think about limitations and so on. And I think that is a real challenge, because we cannot really draft a framework with the same words as we have now in the directive as if they were completely applicable to users.

And that’s a real challenge. And how do we deal with this?

>> MODERATOR: Thank you. This was not exactly a question. It was more of a comment.

Marie? Okay. Marie?

>> MARIE GEORGES: You are completely right; I mean, on one part you're right, and on the other I don't agree. Where you are right: users nowadays are also creating data, keeping them, storing them, like the companies we heard from yesterday. But in each case you can make a distinction.

I don't agree with you that it is difficult to make the distinction. With the company, I mean the platform of a social network, in each situation you can make a distinction: what is the responsibility of the social network platform and what is the responsibility of the user? If the user published within his group, it is his responsibility for what he did.

If he didn't have enough possibilities to restrict the information, if he was not informed that when you click there it is published to everyone and not only to the group, then you may have the responsibility of the platform. So that is how I would start to discuss it. But it will belong to –

>> MODERATOR: Thank you, Marie. Could you please be brief, please?

>> I'm Jona Holderle from the European Youth Press. I have a short question, a little bit provocative. But is it a problem that we don't delete data after a while? Is it really a problem? I think we should focus more on the data that are not open, and get into a kind of, as I called it yesterday, opt-in mentality, especially in social networks, and not the privacy opt-out mentality that we have at the moment. Privacy settings are quite nice, but you always have to opt out of public settings.

>> MODERATOR: Thank you. Do you want to reply?

>> From my point of view, it's not a big problem that data is not erased. It's not a problem that data is collected. The problem is how the data is used, whether it is used or not. If data are stored in some archive on tapes for five or six years, okay, it is legal. But it's still not such a big thing, because they are dormant.

So I would not say it is such a big thing; I don't know that I got the point. Is this a question for the industry? I repeat: it is in the interest of industry to get rid of data as soon as possible, to release the resources, and to do business.

I'm talking about the ethical and credible businesses, not a business with a hidden agenda, a business which is practically illegally trading information about users.

>> MODERATOR: But I also think it could be a problem, depending on the nature of the data. For some data that we give away freely, it is not a problem to have them online for eternity. And there are some data that are very sensitive and should be deleted, like, for example, criminal records. Because I do believe that a person has a right to correct his or her behavior in the future, and if you don't give them this chance, and maybe this is just one example, I think it would be a really unfair thing to do to the person. Because everybody has the right to be forgotten, to a certain extent, after a certain period of time.

So I really think there is no formula, and I would agree with what you said at the beginning: there is no formula for establishing whether there is consent, whether we need that consent. We need to know the case, and whether there is a right to oblivion.

But I can think of groups of cases where we can develop some standards on the right to oblivion: for instance, if the data refer to children, or to criminal records, or if they are just everyday data. And what is everyday data? Maybe some classifications are needed: some data are not sensitive, and others are more sensitive. So...

>> Yes, you have to see which kind of service it is, whether it is the core, the key issue or not. There is the issue of the right to oblivion about things that have been made public, because so many things can be published. That's why the organizers of this table chose this word, chose this theme.

About the other point: I don't know what you think about search engines keeping your requests for more than a year, in relation to freedom of information.

The finality principle would suggest that the request should be off the system once you close the PC, at least, or when you change the service. But they are kept for 13 months. It's a real question for those data, huh?

Now, on the opposite side, I am very impressed by people who are working on the idea that, on the basis of the right of access to non-open data, we should be able to get those data in an open format, so that we keep our data and we can make use of them; you know, like open data for government. Why don't we do it also for ourselves, with our data? That is another issue that is coming, as I see people looking at new services. And it is not about making a market survey. You get your data; it's portable. The right of access could give you access to your data, but you don't use it; you use it only when there is a problem. It's a consumer right, I would say.

But maybe there is an interest in developing your own service: what you do with the Internet, and what the others do with it legitimately. Maybe you want to make statistics, and evaluate how you use it, and so on and so forth.

>> AUDIENCE: Okay. I would like to – I’m Katawa.

About the right to oblivion: you would have someone's absolute right to pull out of the Internet what they published before. A person who makes a speech and regrets it could try to demand that the Internet papers remove it, or even that it disappears later from history books. Are you really asking for that?

>> No. I talked about taking back personal data. For the essence of the discussion: we keep mixing into this discussion very different kinds of information we can imagine online. One is personal data, so something about you, or something somebody leaves about you on a social network.

The second situation is information about you that does not necessarily identify you, but somehow concerns you. For example, with criminal records, I guess the big issue is when they are identified. But I can also imagine a story online which does not necessarily mention the person, but where, with the context, it is possible to identify the person. So journalistic stories, maybe about politicians, who have a lot written about them without necessarily being named. I would say this is a second category of information.

And then you have copyrighted material, or all sorts of artistic creations that are maybe not copyrighted, which in my head is not personal data. It's a very different situation. When I write a book and publish it and it's online, the question of whether I can have it removed is not a data protection question for me. It's a question of, I don't know, access to knowledge, or access to information, or other frames we can use. But no, I would not put that situation in the same category as deleting my own personal data.

And the second layer to discuss is who we are talking about. Individuals like us; but some of us might be public individuals, and then you have different situations. If you are a politician, you are a public individual and data protection does not apply in full. The same if you are the president of a company or a Prime Minister; there are different situations in which that protection applies. So am I saying that all information, not even just personal data, but all information related to a person, should be deleted? No. I'm talking about data that we as users give to social services, and then it is discovered that this data is still there, used for profiling, because someone kept it, because it's too expensive to delete. And I know some companies just disregard this, because it's cheaper to keep the data there and not look for a problem.

But the answer is no. It’s a more complicated discussion, but I don’t think we will go into books and articles and things like that.

>> MODERATOR: I'm sorry, because I see several hands raised. This is the difficulty of the session: we have three themes to address, and we made a choice. We would like to address the other themes; if we continue on this theme, the right to oblivion and consent, we will not address the other ones.

So if we have time at the end, we will come back to your questions.

Now we will ask our panelists to give us their thoughts on the effectiveness of data protection, with two sub-questions: the need for global standards, and the chain of actors.

I will first give the floor to –

>> So the question is the effectiveness of data protection and global standards with regard to the chain of actors.

Think of the chain of technical, economic and other actors involved when you go on the Internet – there are a lot of them. You have your device. There is a feature to prevent the person you are calling from seeing your number; you may remember that this comes from a position taken by the DPAs in Europe back in the '80s, which became a recommendation of the Council of Europe, was then passed into law in Europe, and became a technical norm. It has to be implemented in your device, and you can use it.

Nowadays, the same situation arises for other functions, and it seems that technical norms are not fixed for the moment. As you know, last week we had some examples of very important makers of devices and platforms which were keeping location data without any legitimacy.

So you have the device, you have the access provider – thank you, Milan. You have the Web site where you go. But the Web site is on a host, okay? Then if you want to pay – if you want to enter your credit card number and so forth – it goes through a certain process with other parties, to banks, and to your own bank, another actor. And then if you want a good delivered to your house, another actor comes in to get the data so that the good reaches your house. And if you have a problem, there may be a call center where you call a person, asking where are the goods I bought, and so forth.

You also have the backups. All those people have backups, because if something happens to their server they can find the data again and maintain the information system.

You may have people taking care of it and they can be in another company.

Well, and the user at the end, okay?

All of those, and I could give you examples, may be in different countries today. Even for the device: the device is designed somewhere, the hardware may be made on the opposite side of the earth – you say yes? – and the software may be coded in a third country.

So I took one example; I could take the whole chain and give you many more. So I understand a lot of people are looking for common principles on data protection worldwide. And of course the DPAs – since last year, I mean, since '85 – have been looking to the Council of Europe for a complete set of worldwide basic principles.

But then you have a second level, the implementation of those principles: according to the kind of service and so forth, they may apply differently depending on the purpose.

So there might be also a need for a secondary level of making common recommendations, so we apply the same things.

Now, there are also the busy flights that people can – I’m too long.

Cooperation, because the regimes are not the same – hence the need for cooperation among the DPAs, and so forth.

Just one word: how many countries in the world have a data protection law and DPAs? 60?

43 are in Europe. That's 27 in the EU, plus three in the EEA, plus 13 outside the EU and EEA.

>> MODERATOR: Thank you very much.

Please, your views on the chain of actors and global standards?

>> Thank you. As someone working on security applications and making a living out of data protection, this is my favorite subject. Every chain of actors has a beginning, and the beginning is the users. So I want to say a couple of words about end-users, and about the fact that they also have a role in the protection of their own privacy.

The average user is usually unaware and uneducated about privacy risks and his privacy rights. He is usually careless, addicted to the culture of free, which is practically the basis of the Internet and one of its main drivers.

Then, as I mentioned, he is exposed to offers from the Internet, and those offers are differently motivated: sometimes commercial, but also fraudulent, even criminal. And of course the end-user is sometimes greedy. Every good service provider, luckily, has millions of such average users, so the service provider alone cannot protect the users from themselves, regardless of how capable that service provider is. We need to take care of end-users as a community – solution providers, gatekeepers and whoever else is needed to help.

Basically, we can try to point them in the right direction: first of all, to raise awareness about the risks and about their privacy rights, because very few people know much about their rights.

We also need to focus more on education, and of course on the legislation and the technologies that are utilized.

So the question is who should direct this effort; it should not be a single stakeholder. And I think we will talk more about this topic. Thank you.

>> KATARZYNA SZYMIELEWICZ: Thank you for those few sentences about users and their rights. I will say more about the chain of actors itself, and I think it's a huge challenge. As a user and as a rights defender, I don't think it's merely the user's responsibility. We need investigative journalists or scientists to go through the whole chain of actors and work out what is going on.

Marie said a lot, but I would add the whole ecosystem of advertising itself, on its own. There are tons of actors that possess our data. And really, until not long ago, I was not aware of that – until I met someone in the US who had done deep research and started showing me. So if I don't know that, who does?

I think we are at the point where we are completely forgetting the basic principle that we have there.

The user has the right to be informed about who is using their data and for what purpose. Not only consent is needed; the right to information is the basis, the essence. We should be aware of where the data goes and for what purpose each party uses it. At the moment that is not the case – as an Internet user, I don't own my data.

That's where we have to start. Who should have the responsibility? Probably the first entity we interact with, whoever it is – search engine, Web site or mail provider – whoever starts taking the data and plans to pass it further, or to build our profiles, which I think we will discuss more in the other part, where there is also a privacy issue. Should they be obliged to inform users about those further steps?

And the second issue is jurisdiction. We keep saying that we want a multi-stakeholder solution, that we want cooperation.

You mentioned how few DPAs we have. The concept of data protection, as you were saying, is just the biggest challenge here. Because as long as parts of the world have a different concept of protecting data, and the major companies providing services are based there, we have a huge problem.

And I know the European Commission and the US are thinking about it. But multi-stakeholderism is a bit fake as long as we still have one main actor.

So should the rules of jurisdiction change? If I were the one to decide, I would say that if you target people in a country, you should have to follow the laws of that country. How to enforce that in China and Africa and other places, that's the question. But it's a big one.

>> Data protection authority certainly will always raise this question of jurisdiction and the need for international transparency.

>> If I could first refer to the chain of actors: I think we can always be in a position where one link breaks the chain. And regarding that missing link, if we focus on end-users and think that they should be aware of their privacy, then we can think of civil society, whose role is to inform and educate the end-user. And then the recommendation to business would be more privacy.

And then we can think of governments that have actually failed in fulfilling their regulatory obligations. So I think it's always something happening at the same time, on different levels and different layers. Perhaps in the same space, but it is action going on along several tracks.

When we talk about global standards, we want them to be adopted not only by States, but by other stakeholders. But then, we live in different legal systems, different cultures. We live in societies where asking someone about his health or marriage is completely normal, and in other countries it's absolutely not acceptable.

So I think the focus should also be on those differences in order to reach global standards. What we do online, we do while jumping all the time from one jurisdiction to another, and an individual user cannot be expected to know each and every legal rule.

But there are certain standards we can think about, not only from the European perspective but also from the global perspective. What is, for example, a credible business, which was mentioned earlier? This could also be a standard. And how do we estimate whether a business entity is credible or not? We need certain signs, something like a label, to know what is a credible business and what is not.

And then, as to standards, we can talk about privacy standards, about standards for the protection of privacy, and about how we implement those standards.

What is an independent data protection agency? How do we assess the independence of the agency? Is it because it's not attached to the government, because the members are appointed not by the government but by the Parliament?

But what about financial dependence? So there is a question whether there are also standards for what counts as an independent data protection agency, and to what extent. And this is also a matter of different cultures. In some countries, even if an agency is on the government payroll, it doesn't necessarily mean it's not independent. It can be independent.

And in other countries the members can even be chosen by general election – we can hold an election regarding the budget and regarding the people we send. But that doesn't mean the agency would be independent in a clear sense.

So I think it’s always an open question, something that we have to think about over and over again and perhaps change it every now and then.

>> MODERATOR: Thank you. And thank you for raising this matter of the independence of DPAs and their financial status, because it's one of the questions that was sent to us by the hub. So, for those following there, we have already replied to one of your questions – thank you for that.

Are there any questions from the floor on this?

>> EVAN: Hi. I'm Evan from the Party of Serbia. I would like to ask you: let's say I have a good solution to completely protect my data on the Internet, and it's totally noncommercial – it's free and open source. Would corporations in Serbia be willing to promote that, and to suggest to their end-users not to use commercial sites like Facebook or other sites on the Internet, because of that protection of their private data?

>> First of all, I think the industry will very seriously examine each proposal. As far as my company is concerned, we have very rigid rules about procurement, because of anticorruption.

Also, yes, we had a kind of business playground where different people from outside could offer their business ideas, and those ideas went through the machinery: they would be evaluated and then a final decision could be made.

So in general, yes, I think industry is interested in this kind of thing.

The other thing to consider is that, in order for such a solution to be accepted, you should also think about whether to leave a back door. I don't know if you are willing to take that into your organisation. Is it okay? Yes?

>> MODERATOR: It's interesting what you've said, because at the G8 and e-G8 last week I was surprised by what Malcolm was saying, especially about open source and everything. It raised the question for me of whether we, the data protection people, DPAs and all of those – I'm not a DPA at all – are most of the time fixing problems that come from the marketplace. And perhaps on certain things, not on all of them, we should leave more room for other types of services, for which no marketplace yet exists but which are developing, and which on certain points are better than those we see. For the moment, I don't think the data protection community is proficient on this topic, but I think it's going to be raised more and more.

I have examples in my head precisely, but I leave it to you and others. You are not the only one.

>> MODERATOR: Are there any other questions from the floor on this issue of chain of actors and intermediaries?

>> LOUISE BENNETT: Louise Bennett from the BCS in the UK.

I think there are very important issues that you haven't covered yet in the chain of actors, and they concern whether the data about yourself is actually accurate in the first place. There is the question of whether it is accurate and true, and there is the question of your privacy: do you know it, and do other people know it? And there are problems in social networking where people can say things about you that are not true, and it's very hard to eliminate them.

A very striking example in the UK involved a woman who was a pediatrician – that is, a child doctor – who was identified as a pedophile. People came to her house and attacked her. It was misinformation: she was a pediatrician, not a pedophile. They had got it off the Net, and it was very hard to deal with.

>> There are two issues, right? With the chain of actors, what you described is more about the right to delete incorrect information than about the problem of chains of actors reproducing the information. Maybe I'm wrong, but I think the problem is not a powerful medium, like a powerful social network, carrying a mistake. That is a different situation from what I think we are talking about, which is that certain information about you – your profile, the traces you generate, or your IP – is being processed by huge numbers of entities, and you don't know that. And of course there is no way to exercise a right to correct information if we don't know that the information is being processed.

So this is just the outcome of the very basic problem, which is that there is no information for the user that his data is being processed at all.

>> I’m Christopher –

>> Just an addition. In the case you were referring to, in some countries this falls within the work of data protection authorities, which have enforcement powers; it is not exactly a free speech matter. Maybe someone says somebody is doing something and it's not the right person – in that case, it's about suppression of the information. It's a problem.

>> CHRISTOFFER KARSBERG: I’m Christoffer Karsberg from the Swedish Telecom Agency. I have more of a comment or reflection. It’s not a question, actually. Maybe we can all reflect on this.

About the chain of actors: PTS, my agency, has as one of its aims to protect customers' and users' privacy from misbehaving actors. And then we have the Data Protection Agency in Sweden, which has the overall responsibility for data protection. So we also have a chain of actors on the regulator side of the coin, so to speak.

So we have a role to play, and a role in cooperating between the agencies, so that we can follow the chain and see that customers and users are protected all through the chain in the best way we can. That's one remark.

And the other one is to encourage the privacy-by-design idea: that everybody who designs along this chain does not use personal data unnecessarily for transactions and tracking and so on – in other words, uses data in a privacy-protecting way. Thank you.
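[Editor's note: the data-minimisation side of privacy by design described above can be sketched in code. The snippet below is a hypothetical illustration in Python; the function name and the subscriber number are invented. It shows one common technique: storing a salted one-way hash instead of a raw identifier, so that different services cannot link the same user.]

```python
import hashlib
import secrets

def pseudonymise(identifier: str, salt: bytes) -> str:
    """Return a one-way pseudonym so the raw identifier need not be stored."""
    return hashlib.sha256(salt + identifier.encode("utf-8")).hexdigest()

# Each service keeps its own secret salt, so the same user yields
# different, unlinkable pseudonyms at different services.
salt_a = secrets.token_bytes(16)
salt_b = secrets.token_bytes(16)

user = "+381641234567"  # hypothetical subscriber number, invented for illustration

stable = pseudonymise(user, salt_a) == pseudonymise(user, salt_a)    # stable within one service
linkable = pseudonymise(user, salt_a) == pseudonymise(user, salt_b)  # not linkable across services
print(stable, linkable)
```

The design choice is the per-service salt: the pseudonym remains usable for internal purposes such as billing, while cross-service tracking of the raw identifier is prevented.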

>> MODERATOR: Is there any reaction to those reflections? Were there any questions?

>> I could comment on this design concept. One of my points was about our own marketing. They have the need to market; they have very tough targets and benchmarks. We discuss products and new services, and unfortunately security and privacy features are not the most visible ones. Because if you invest in them, you will lag behind competitors who don't really care. So if we are talking about privacy by design, it should be an effort for everybody, and I don't know how far we are from this concept.

But in the end, it's all about the money. I'm putting it very bluntly, but that is the case: if I cannot sell our privacy by design to our customers, then practically we are going to lose the race to the competitors who don't care about it.

>> This also shows why your role in your firm is important, as a privacy officer.

>> Yes.

>> They hate me there.

>> MODERATOR: Any reaction on that, or shall we now move to the third subject of our session, if you agree? Yes?

So we will go to freedom and privacy in service-driven architectures: targeting, search engines and social networks. To some extent some of these issues have been addressed, but let's look at them specifically.

>> Well, it's my favorite topic and I could spend hours discussing it, but I think we need to be brief. Milan gave the opening thought: it's all about money. This is privacy in service-driven architectures. We have services, which are about money. And we have to notice a very important fact: users supposedly want services for free. That perception – or misconception, if you like – is the thing we have to address to start with.

I don't agree; I don't think that users really want services for free – simply, nobody has ever asked them. Yes, on the Internet it became obvious that online we have things "for free". But they are not free: we pay with privacy and personal data. We pay by being subjected to targeting and profiling. And at present we have no alternative.

And I think this is the very first thing we need to start changing. Businesses, when I talk with them about this, say it's not possible: if they start offering something else – for example, you pay for the service you receive but you keep your data – it becomes very complicated; they cannot keep their business model. So I can imagine we get to a point where this is a challenge.

But if we really want data protection and reform, sorry, but I think we have to go this way. We have to reconsider the models so that people, users, can choose what model they want. I think most people would pay with data for search engines and social networks. But I know some users who would go as far as wanting to create their own service.

Like the ideas where you keep your own data, and it's open source and for free. Different models are possible, other than the model where big companies keep our data. Without consent we cannot move forward. Profiling is one thing I want to address here.

Companies say this is not personal data. Yes, formally it's not personal data – there is no IP number. But still, my detailed profile identifies me, and it even does better than identify me. This is a powerful tool. So this is about freedom and privacy, and it should be regulated.

I would compare it simply to my right to refuse a market survey in the street: I don't have to answer questions about what I want to buy. Online, we're not offered this choice. We are being profiled all the time, on the grounds that it's good for us, good for the service we receive, good for the Internet. Maybe it is. But all I'm saying is: let users decide. Let them say whether they want personalised services, or whether they want a blind Internet that doesn't recognize them, that doesn't know what they want or like, but keeps their privacy. So this is a big thing.

And of course we have been talking about the chain of actors – the same discussion applies. We haven't mentioned cookies yet, and maybe I'll leave that to other speakers, but this is a big change: how to regulate cookies. Whether consenting to every single cookie is feasible and makes sense is debatable, but certain types of consent should be considered and implemented.

Thank you.

>> MODERATOR: Thank you, Kasha. Before leaving the floor for another question, let's go to Nevena.

>> NEVENA RUZIĆ: I think you mentioned the data that is collected and what is needed. So we get the data, and then we go back to the beginning and we can talk about consent. But it's all about standards – this is why we need those standards. In Europe, unlike for example in the States, there is a duty to protect, and it's usually a duty of the State; this positive action of the State is necessary. That's why we need the Council of Europe convention as a binding document. That's why we need an additional instrument that gives powers to independent agencies. We need checks and balances on all sides. But we also have to update the convention so it can respond to new technological developments, because unfortunately laws and provisions always lag behind technical innovations, and how to achieve that is always difficult.

So, unlike for example in the States, we are lucky because we can demand protection from the State. We cannot leave everything to business actors, and we cannot leave everything to users – regardless of whether we consider users ignorant or not, greedy or not – because the society has a duty to protect.

>> Milan, your turn.

>> MILAN NIKOLIC: Thank you. I'll reveal a big secret: first-class service providers monitor what other services do, because they have an interest in who has a quality service. They need to elicit market needs and improve product offers. And, what is very often neglected, they also want to enhance the security of users and prevent fraud and cybercrime – which is, for example, mandatory under telecommunications regulation.

So this is also a kind of profiling, and those activities are for the benefit of users – only, of course, if they are conducted for a legitimate business purpose. If the purpose is illegitimate, they are not for good purposes.

If the data is collected for those purposes only, it should not be sent to third parties or become the subject of a secret threat. And of course it should not be related...

So, in any case, we can consider this a kind of profiling, but we need to keep our privacy.

And, of course, a person must have an undeniable right to exclude himself from this machinery.

As for other parts of privacy protection: the big mistake is not so much that we collect and analyze information; it's how we use the information and the intelligence extracted from it. But this brings us back to the question of the capabilities of the service providers – and there are others that fit in that category. As always, we are looking for a balance between our right to privacy and our wish to have the best possible services. And we can find that balance.

>> MODERATOR: Thank you – because the Council of Europe convention has been mentioned, and we need to update it.

Let’s go to Marie.

>> MARIE GEORGES: Well, this time I have very few things to say because everything has been said around the table.

First of all, many things were already said. There are different kinds of profiling: information coming from you, and information about a group of persons of which you are part, so that you are targeted because you belong to it. If it's only used for marketing, that is one problem; but if it's used in decision-making, the issue is real.

So I would like to draw your attention to a recommendation that the Council of Europe is discussing on the question of profiling. Clearly, for marketing there is a right to object; that's already in the law, I think, across Europe.

But the earlier question about standardization in the chain was very interesting; I will mention it as another topic. Look at what is in a cookie, and how it has been normalized over the years.

Can you even judge the identity of the issuer, or the duration? You have a whole bunch of numbers, but you don't know what they mean and you are not told the purpose. And the data protection side then has to cope with that.

So it's "only" a question of cookies, but still. I would like to add two things. Under consideration at the moment within the Council of Europe is the relation between free speech, data protection and social networks. And there is also a recommendation in the process of being adopted – within some months, yes – on search engines. So I think it will come in time.
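[Editor's note: the point about the opacity of cookies is easy to demonstrate. The sketch below – hypothetical cookie values, using Python's standard http.cookies module – parses a typical tracking cookie: the machine-readable attributes describe scope and lifetime, but nothing in the cookie states who issued it or for what purpose.]

```python
from http.cookies import SimpleCookie

# A hypothetical tracking cookie, written the way a server would send it.
header = "uid=a81f4c0e9d23; Path=/; Max-Age=63072000; Domain=.example.com"

cookie = SimpleCookie()
cookie.load(header)
morsel = cookie["uid"]

print(morsel.value)        # "a81f4c0e9d23" – an opaque identifier, meaningless to the user
print(morsel["max-age"])   # "63072000" seconds – retained for two years
print(morsel["domain"])    # ".example.com" – readable by every subdomain
# The attributes describe scope and lifetime, but nothing in the cookie
# declares who issued it or for what purpose it will be used.
```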

>> MODERATOR: Thank you.

I'll start with a question from the hub: how can data protection work in the context of social networks located outside national borders and with no legal presence? Who would like to reply?

>> Just building on what was said already – or even what I said, I guess: jurisdiction is the big question. Nobody knows how to effectively change the rules of jurisdiction, how to force the actors to comply with our ideas. But I guess there is no other way than changing the law. As long as the companies are not established in the EU, as NGOs we're helpless – and as DPAs too, I think.

>> Maybe I could add something, coming back to the credible business point. If I join a social network which is physically and logically placed in a region where I cannot expect much protection for my privacy, then it's largely my choice whether to join or not.

And indeed, there is a lot of outsourcing activity, a lot of distributed services, but the key issue here is the guarantees for protection. If individuals decide to join, that is their own decision.

>> MODERATOR: Thank you.

Do you want to say something?

>> I agree with you, completely. I would like to add just one thing: the more harmonized the solutions, the less the contradictions matter. From my experience of 30 years in data protection, I can tell you that all over the world the question of culture matters very little. The people who have responsibility for data protection – those who run the DPAs and the authorities – meet, and within two hours they agree in 89 – well, 95 percent of the cases.

So there is a large margin of agreement worldwide: the principle of purpose, the principle of data collected for a purpose, the duration, all of this; no data processing behind the back of the person, with the person not knowing.

All of this, any human being with some thought agrees with; it is not a matter of culture. There are some areas in which it is not only a question of culture but a question of the organisation of the State, the organisation of democracy and so forth – solutions on which we sometimes don't agree. But three months later, two more agree and pass it into law in their country. So I am rather optimistic. Consider how the international telecom organisations lobbied all the countries to have telecom regulators – I think we are up to 170 now; it's a question of doing it. I told you the number of countries with data protection laws, but you should know that there are already some in Africa, three or four in South America, and a few in Asia. So it's not none. It's a beginning.

And there is a huge push to be made. I think it has been announced that they are trying to promote the convention and its mechanisms, and that at the level of implementation there will be more harmonisation. But there is a need for the Council of Europe to do that, because it is not something for purely commercial organisations. So I'm looking to them.

>> MODERATOR: Questions from the floor?

>> AUDIENCE: Hi, Boris here. Just one extension to that question. For example, Facebook is an American company, but now we have cloud computing and CDNs, and data is distributed over a lot of countries. Maybe when we open one page, some data comes from America, some from Europe. Who is then responsible?

>> Facebook is responsible for everything.

>> AUDIENCE: Yes, but do American rules apply, or the rules of the country –

>> Facebook.

>> – where the services are located?

>> Facebook.

>> AUDIENCE: Who controls that?

>> We are aware that the service spans the globe, but you join Facebook – then it's Facebook.

>> And which DPA controls that? In which country?

>> It's the State of California, and that is the law that applies to Facebook.

>> You believe –

>> It's a difficult question. If there is cloud computing in Europe, one could perhaps argue that European rules might apply, but not necessarily. The courts haven't gone down that path and questioned the outsourcing entirely.

>> And they say –

>> It seems to me that there is also not much courage on the part of the judiciary.

>> Yes.

>> Because – because there is no rule on applicable law, really, in the convention. It was drafted on the basis that if all the parties have laws with the same general principles, then no matter which applies, we will be okay.

But if you take the rule on applicable law in the directive, you will see that if the data controller is outside the EU but uses means on the territory of the Union for the purposes of its data processing, European law applies – except that no one has the courage to say that your laptop is such a means. They interpret the means as being the controller, which is absurd, because the laptop is here, on the territory. So nobody has dared, and everybody wanted instead to change the rule on applicable law on this point, to say that if you target the market you have to comply – the applicable-law concept of consumer law.

But that approach was not ratified: the consumer convention saying that the law of the consumer's country applies was never signed and ratified.

So in my view it's more a question of courage; in practice one could apply the existing concept which is in the directive. But nobody...

>> Okay. We have an even more complicated situation. Take the Czech Republic, which completely abandoned data retention. I just want to wrap up the issue. The point is that data today is a currency – that's why we keep having this argument that data equals money. And in the scope of human rights, we are now talking about our personal data as money, and in my opinion that is not acceptable, because by default my data, every piece of data that I use, should be encrypted by design, as people mentioned.

And there is no strategic intention – of course, there is no corporate intention – to do that.

So everything we say here has a two-sided effect, because we are speaking about the importance of regulation while ignoring the important fact that there is no hard, low-level protection built in by design – protection against exploiting private data.

When we have hard-coded encryption by design, then we can speak about limitation and the price.

This is really funny.

Okay.

>> Thank you.

>> MODERATOR: That was my point: we should reclaim the right to our data – reclaim it from being a currency online to being something we own. It's like coming back to the principle of what data is and how we understand it.

I want to point out proportionality in the context of data retention. As things stand, if nobody effectively controls what sorts of data are requested from us for the purpose of service provision, then we also have no idea what is being retained.

So if we allow the search engine mentioned by Marie, for example, to store our search results or our other personal information, or profiles sufficient to identify somebody, this is an issue.

There are two issues: my data, but also my security as a person vis-à-vis the State. The State might come and request the data while it is retained. So we have a complex issue and a good reason to reconsider the principle.

As users, we need trust. It’s difficult to judge whether –

So perhaps we need some sort of standardization, maybe international standardization, of what service should require what sort of data, so that we as users can be confident about whether we are giving too much when we give it.

>> Just one comment for my company. Personal data are not a currency for us, because we are not selling them. We are using them for our own purposes – it’s customer-centered: just give me your name and here you are in your seat. I don’t deny that there is a possibility of someone selling that data, but that would be a crime, so let’s call it a crime. For a telecom operator it’s not a currency, because the data stays within the company. It’s part of doing business.

>> I just want to refer to what you said, that we need encryption: it again makes us dependent on people who are knowledgeable about it. Not many of us know how to use such services. And if we went back to raw digits, instead of letters and numbers used in a meaningful way, not many people would continue to use the Internet. So there is a need for balance.

So, as to what we can do, what we should protect and how to protect it: we need to know how to protect ourselves, but in order to know that, we have to be informed. Not every one of us can be as knowledgeable as you, and this is also something we should bear in mind. No schooling is necessary to use the Internet; even highly educated people use it, but that doesn’t mean they understand protocols and code.

>> May I? Sorry. I’m Jake. I come from an office that specializes in IT law. I would really like each and every one of you to give me one short answer to two questions. First of all, what should I say to my clients when they ask me: can I expect any privacy on the Internet?

And the second question is: what is your personal stand? I don’t mean your corporate stand, but: are we governed by the policies of the companies, or are we governed by the policies of the government? Right now, as things stand.

Thank you.

>> Who would like to start?

>> Since the corporate sector was mentioned: first of all, when you connect to the Internet, you have a right to privacy. That’s my personal opinion and experience, because I’m in the sector and I’m paranoid.

And the second thing: are we governed by corporate law? I don’t know. Corporate law is definitely not there yet; we rely on government laws. There might be some kind of conspiracy, but this is self-regulating. That is a very good question.

Corporations in Serbia are not so powerful that they need their own rules. But personally, since you asked for my personal experience or opinion, I don’t think that is the case.

And I believe we are governed by State laws.

That was Peter, that was not the position of my company.

If you ask – okay, your question was what users should expect, right? Not how it is. I think we should expect to have privacy. And of course we don’t have it now. That is the answer to the first question.

And the second one, about corporate rules: it depends what services you use. If you use services provided in your own country, then you’re lucky, because your country’s rules apply. But there is a jurisdiction problem and a lack of harmonization: if you use services provided by US companies, different rules apply.

>> It depends on what level. I mean, we can expect some. That will be my answer.

And to the second question, yes, I would agree: it really depends which services we use. It could be both, it could be a combination; in some cases it could be corporate only and in some cases government only. It really depends on the service.

In some countries, access provision is owned by the State and everything is filtered by the State, so there it is definitely a government policy. Maybe that is not the best example, but in other cases it’s all about corporations. So it’s not so easy; you may interpret the question as you wish.

>> Marie?

>> MARIE GEORGES: If you have friends in secret services, ask them.

Okay. That’s one way to answer your question.

It is true that the general situation is bad. You can find personal PIN codes and passwords for a lot of people, in the clear, in many, many cases. We are lucky that that data is not misused.

So when I hear people talking about encryption: there are certain services that should simply be encrypted by default, without the user having to bother too much.

Some years ago, this was clear to everybody, especially for enterprises, since we exchange information over networks. And under older laws in western countries – I know, because the laws have changed – the use of encryption was not allowed. Since the end of the ’90s, the use of encryption has been liberalized. And now, in this situation, we fear that if individuals use encryption, they are going to be suspect.

So we have to address this situation. We need some kind of alliance, I think, within the IT community, which knows what is going on and how things are made and fixed, to better explain to politicians and others how things are done. And of course, if certain persons are under suspicion, law enforcement and security services need the means to intervene against those people. But the other way round, blanket collaboration, no. They should ask the service provider, the host and so forth for the data – not the blanket situation that is actually going on.

>> MODERATOR: Thank you. We have run out of time. So one last question.

>> Just one point: in Germany, last year, five million people moved from one social network to another. And why did they do it? Because of privacy rules? No. These five million users all went to Facebook, and the other social networks, which have better privacy and data security standards, lost their position on the market. So in theory I would say that data protection and privacy have a very, very high priority in Europe. But when it comes to practice, when you ask people what they are doing about it, they do nothing.

And my impression is that this is maybe because data protection and privacy are already very, very complicated in Germany and in Europe. I lived in America for a year and I tried to explain to people there how privacy and data protection work in Europe, and it is really a mess to explain.

And so – you talked a lot about new categories, new kinds of rules, which would make the world even more complicated. Do you think that is the right way, and how could we make it a little easier, or better to understand? And also convince others, like Americans and other people in the world, that our privacy rules are the best ones?

>> You are saying that I propose more and more laws. I propose reforming existing laws, which does not mean that I want them duplicated.

We will not settle this now; we can talk later, if you want. But in my opinion, we have complicated law at the moment that does not cover real-life situations. We should get rid of many formalities and many overly detailed principles – I can imagine there are a few – and we should add or reformulate principles that apply to the Internet in a new way.

Now we have a huge gray zone. We have principles that are overly complicated, and a huge discussion about how these overly complicated rules apply to the complicated Internet. It’s a complicated matrix.

This is a huge challenge; I’m not saying it isn’t. But many people have tried to tackle it, and maybe we can achieve something in a few years’ time. So yes, we need simpler, more transparent and fewer rules – but rules that cover the situations arising on the Internet.

>> MODERATOR: Thank you. Any other final remarks?

>> Well, I know certain Americans who come here to understand and to learn, and back in their country they say they have learned what not to do.

So I think at the international level it’s good to start by talking about cases: in this case, what do you do? That way people can start to understand.

Then you can go up to the formulation of basic principles. As a matter of fact, the basic principles in the convention were invented by the Americans. They did not implement them broadly, for political reasons at the time, and they are trying to fix that now, but only for the commercial sphere, with the proposal that is before Congress from Mr. Kerry, a Democrat, and Mr. McCain, a Republican. So they are trying to, I would say, lessen the gap. But they haven’t done it yet.

So I don’t say it is that complicated. That is also why the convention of the Council of Europe, as I understand it, was kept to very basic principles. But after that, you have to implement them in each situation. And because we have IT in all sectors, in all of our activities, and we don’t always work with the same people and so forth, it becomes complicated if you want to know everything. Of course.

>> MODERATOR: Thank you.

Anything else?

>> It’s also a question of complicated to whom. Is it more complicated for the end-user? Should it be more complicated for the private sector? Perhaps, but not to the extent that they could not function, that they would be overloaded with tasks.

Should it be more complicated for the State? I agree with that. Yes, the State, exactly, because of its duty.

So maybe the burden should be distributed: maybe the first link in the chain should be the State, and then the individual. But it really depends how complicated it is for which actor – I don’t think it’s the same complicated situation for each and every one of them.

>> MODERATOR: I think we should end this now. Thank you for being here with us. And one final word.

>> SOPHIE KWASNY: I just invite you to come and have a look, in the building, at what has been exchanged in the session – a very nice sketch that will inspire you. So thank you very much to our panelists for their ideas. Thank you.

(Applause)