How do transnational data flows affect users’ trust? – PL 1 Part 2 2016


9 June 2016 | 14:30-16:00
Programme overview 2016

Session teaser

We take up the discussion where the morning session ended and re-visit the more controversial issues. [Open discussion format – NO Panelists!]

Session description

We continue the discussion started in the morning session [1] and dive deeper into the more controversial issues.

Keywords

Big Data, Transnational Data Flows, International agreements, data-related technologies, algorithmic technology and decision making, artificial intelligence, privacy, security, user trust, innovation, European economic divide

Format

Please try out new interactive formats. EuroDIG is about dialogue, not about statements.

Further reading

Links to relevant websites, declarations, books, documents. Please note we cannot offer web space, only links to external resources are possible.

People

  • Focal Point: Thomas Grob, Deutsche Telekom AG, Germany/Switzerland
  • Key participants: Everyone interested in continuing the discussion.
  • Moderator: Frederic Donck (ISOC)
  • Remote moderator: Allon Bar
  • Org team
  • Ana Kakalashvili, GIZ GmbH, Georgia/Germany
  • Marco Pancini, Google, Belgium
  • Frédéric Donck, ISOC, Belgium
  • Karen McCabe, IEEE, USA
  • Justin Caso, IEEE, USA
  • Reporter: Thomas Grob (Deutsche Telekom AG)

Current discussion

See the discussion tab on the upper left side of this page.

Conference call. Schedules and minutes

  • dates for virtual meetings or coordination calls
  • short summary of calls or email exchange
  • be as open and transparent as possible in order to allow others to get involved and contact you
  • use the wiki not only as the place to publish results but also to summarize and publish the discussion process

Mailing list

Contact: pl1@eurodig.org

Messages

  • There is no trade-off between Privacy & Security
  • Security needs to be a collaborative effort / Subsidiarity works: intervene at the least intrusive level possible!
  • The multistakeholder model offers the tools to solve complex issues – approach needs to be open, transparent, inclusive, accountable – needs active engagement; we need to do more!
  • Transparency and Openness are meaningless if people do not understand what is being disclosed or if there is no alternative option.
  • Openness requires shared responsibility: companies and governments may not solely and completely be held responsible for what people do online.

Video Record

See the video record on our YouTube channel

Transcript

Provided by: Caption First, Inc., P.O. Box 3066, Monument, CO 80132, Phone: +001-719-481-9835, www.captionfirst.com


This text is being provided in a rough draft format. Communication Access Realtime Translation (CART) is provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings.


>> EMILY TAYLOR: Good afternoon. We’re just about to start the next session. As the room is not particularly packed, do you think that those of you who are sitting in the far seats, would you mind just moving down a little bit to the front where there’s a little bit better light? Because this is going to be another open type of format and it would really be nice to have as many people as possible towards the front.

Thank you. There are just a couple of logistical things to make you aware of. On the schedule, Commissioner Oettinger was going to give an address. Unfortunately, he has been delayed. He will be joining us after the coffee break at 4:30, so if you want to listen to Commissioner Oettinger, his remarks will be made at 4:30, and then we’ll go into the Internet of Things panel, with the lightning talk just before that. I hope that was clear.

Anyway, so Frédéric?

>> FRÉDÉRIC DONCK: Thank you very much, Emily. Back on stage, thank you for joining us for the second part of the session that we started this morning. I don’t know how many of you were in the session this morning. By hands, just to have an idea.

So, yeah. Half of you, at least. Good. So you might have seen, there is no rule: you really don’t have any plan when you address issues like trust and data flows and privacy. We went in many different directions. So I’d like to first remind all of us what happened this morning. I have Thomas Grob, who was the reporter of this morning’s session. Hopefully he took some notes. Thomas, it would be great if you were able to sum up a little bit what happened this morning and give us your five bullet points, and we’ll take it from there. Thank you, Thomas.

>> THOMAS GROB: Thank you, Frédéric. Maybe we can have the slide up here?

So in my role as focal point, I have been organising this whole session (that’s the wrong slide), and I’m also the reporter, so I have been taking (that’s the right slide, thank you) some notes this morning, and this is basically what I’ve heard and what, as of now, would be the messages going out from this session as the messages from Brussels, from EuroDIG.

Now, of course, I don’t have a monopoly on this messaging, and that’s why it’s up here on the screen, so let’s use those notes to maybe refine the points I have written down. I’ll quickly walk you through them. The first bullet point basically reflects what I’ve heard: it’s not an alternative between privacy and security, and reducing privacy for more security is not an option. That was stated several times, at least I think so.

We had Kathy Brown stating very strongly that security needs to be a collaborative effort so that it is reflected in the second bullet point up here.

We have also heard that complex issues at stake here can be solved best in a multistakeholder approach which obviously needs to be open, transparent, inclusive and accountable. We have also heard that this needs a lot more engagement. It shouldn’t only happen on paper, but we actually need to act now, so we could also discuss in this upcoming session what does this mean, what needs to be done, and who can act on what stages.

Then we had a very strong point, which I clearly heard from Matthias, saying transparency and openness are useless as long as users do not understand what is being disclosed, so I think that is also a very important point that we should take away from this session and maybe discuss what does this mean, what kind of information is actually meaningful and what can be done with it.

And then last but not least, I’ve heard from several speakers this morning that openness requires a shared responsibility, so we cannot say it’s only the Governments or only the companies or only the users. Each in his respective or her respective role obviously needs to take responsibility, and I think that leaves also room for interpretation, and maybe we can dive deeper into this, and with those few explanations, I pass the floor back to my Moderator. Thank you.

>> FRÉDÉRIC DONCK: Thank you very much, Thomas.

>> Great summary.

>> FRÉDÉRIC DONCK: That is great, because first, I believe you’re right, these were key issues, and I’d like to come back to the first one, add some comments, and turn it over to you to see how we can take it from here. We’re clear here in this room, this might not be as clear elsewhere, but we’re clear in this room: no trade-off between privacy and security. Yet we might want to say a little bit about what we understand security to be. Some speakers this morning insisted on this: you’ve got network security, you’ve got homeland security, security of your data, security of premises, nuclear premises, whatever, so let’s be clear about what we’re speaking of.

We might start by saying that, and I’m a bit puzzled about this. Seriously, we’re European, and in Europe we’re very strong. We have very strong requirements and principles that protect the privacy of citizens, and yet at the same time we see that in many countries, Governments are still able to just access the data of citizens without any control, without any guarantee, without even any oversight. How do we see this being balanced, actually? At the European level we have nothing that impedes Governments from doing that. I mentioned this morning what’s happening in the U.K., but I might have referred to many different countries in Europe. There have been data released recently showing that most European countries, not all of them but most of them, when it’s about security, homeland security, just allow Governments to access citizens’ data without much guarantee.

How do you see this? When we talk about the balance between homeland or State security and privacy, what is it that you feel we should say strongly? How do you see that going forward, concretely? Does someone want to take the floor from here? Because we claim a balance between, or a reconciliation between, privacy and security, but concretely, what is it we can do? Matthias, do you want to say something?

Sorry, you’ve got mics there.

>> There are some here as well, so if you want to... It’s great to have the lights up, thank you. If anybody wants to take a mic or say something, please. That would be brilliant.

>> First off, thanks for pointing to that issue, because I think it is really important, and you used the example of Great Britain, but you could point to other examples, and that’s certainly true. In Germany, this Monday there was a leak of the proposed law to reform the German Foreign Intelligence Agency, the BND, and basically all the criticism that Civil Society organisations voiced in the past concerning its implications has not been heard. Looking at that draft law, it’s worse than most people would have imagined it could be.

And the consequence would be that, as you just suggested, the data of many citizens are not secure from Government access, from security agencies’ access. You’re asking what we can do about that? Because there shouldn’t be a trade-off between security, or safety, and privacy.

I think the only thing that we can do about this, in a democracy, is to have the rule of law here, meaning that of course, we need to debate draft laws like this, and if we are not satisfied with the way they are drafted, or if they are implemented, then in the end, we have to contest them. First we have to do that in the Parliamentary process and then if that is not successful, afterwards we have to go to court.

I’m also on the Board of the German Section of reporters without borders and this is exactly for example what we did last year. We sued the BND in Germany because we think their indiscriminate surveillance is overbroad and they’re using a method of collecting metadata that’s not in line with the law. It’s basically data retention without data retention law.

I don’t know whether we all agree on that but you said in Europe, we have strict laws. We also have, I would hope, some trust in the Democratic process and in the rule of law, so this is what we can do. We can contest these suggestions in the Parliamentary process, in the law making process, and then we can contest these laws, if they are enacted, in the courts, and then hopefully, in the end, we get a decision that is more friendly to fundamental civil liberties than what some politicians think is necessary to do right now.

>> Matthias, can I ask you a question? Thanks for raising that issue about the situation in Germany. One of the things that I’ve been observing, because there’s been a lot of debate in the U.K. around the Investigatory Powers Bill, the proposed law which Frédéric brought up, is that at the same time, before the Court of Justice of the European Union, there is a sort of hangover case based on the previous piece of legislation, raising questions very much relating to data retention and whether or not it is legal.

One of the things that surprised me was the number of European Member States that have sought to intervene in order to protect the ability for the secret services to access and keep this data. I wondered whether you had any reflections on this, and what you think might happen.

>> MATTHIAS SPIELKAMP: I can only say that much. It’s a struggle, you know? It’s a very hard fight that is being fought right now between, how can I call them, politicians on the one hand who are proposing that we need to have more State power, more law enforcement possibilities, and on the other hand, people who are defending civil liberties, and it is a very, very difficult time, because we all know that people are afraid of terrorist acts. People are afraid of becoming victims of that. And this is actually where I sometimes think that even if I agree with the statement that we cannot make a trade-off between privacy and security, we are making that trade-off very often.

Because if you have these laws that, in my opinion, in many experts’ opinion, do not respect the rule of law and Democratic values, then you have already made that trade off. And then it is that struggle to reattain that balance, and whether we can be successful, it’s really open.

The case that we brought in Germany to the Federal Administrative Court, we feel that we’ll lose it and that we’ll have to go to the Constitutional Court afterwards. This is all an expensive process, a process that needs lots of resources, and usually Civil Society organisations are not very big and well funded, but it’s the only way to go, I guess, in convincing politicians that it’s a bad idea to enact these laws.

>> Sorry.

>> I’d like to jump in on this precisely. Are you suggesting that the last place where democracy might happen when we discuss this is court? I mean, that is when the bad things have already happened. Could we have some way of doing this before that happens? I know I’m driving this towards the multistakeholder model, but what I want to say is that in most countries, even where everybody agrees on the multistakeholder model, it’s just about consulting people, period. Don’t you think there’s a time, though, when we might claim it’s part of the process when it’s about law?

>> MATTHIAS SPIELKAMP: It definitely is, completely. What happened again in Germany, you have to understand I’m most familiar with examples from Germany, but when it comes to data retention I think we all know the case at the European Court of Justice, so what happened there is that there was actually an alliance between Civil Society organisations and companies, because they said data retention is not something that we want to do. It’s not something that we want to do with our customers’ data. We don’t want to retain the data if we don’t need it. They also argued about costs, that it costs them a lot of money.

This is not my first concern, but I’m willing to take that into account, if you can build a stronger alliance.

So there was already in the sense a multistakeholder discussion because many stakeholders were concerned. It was the Governments. It was the private sector. It was the Civil Society, and in many cases, it was also academia, because they argued that it’s mostly useless. If you have data retention, you know, the empirical evidence shows that it usually is not very useful to combat crime.

So it is kind of a multistakeholder process, but, yes, Frédéric. Unfortunately, in some cases, it seems that in the end, we need to rely on the courts to save some parts of democracy.

>> Could I just maybe throw out a couple of... Thank you very much, Matthias, for those very informative remarks. Could I just pick up a couple of themes from your remarks to ask others in the audience?

There are people here, I can see, who are much more familiar with the technical side of things, and could we just explore together how effective these Big Data techniques are at actually combating terror, combating crime? Can anybody help us, guide us on that?

This seems to be an area that isn’t much explored. So that’s one question to you. I’m hoping for your feedback on that.

Another question is: You showed a lot of confidence in the democratic process, and Frédéric, you mentioned the courts as seeming to be the only recourse here. One of the things I want to know is: Is the democratic process really fit for purpose here? When you look in the U.K. at the level of debate over the Investigatory Powers Bill, the main opposition parties abstained when it came before Parliament for its second reading, the debate probably lasted an hour or two at most, and the chamber was mostly empty. In Committee, very, very few amendments have been made.

And this has been described as one of the most intrusive pieces of legislation ever proposed by a democratic country. You’d expect a bit more debate, so are the politicians up to it? Are the democratic processes really working? Does anybody feel like chiming into the debate?

I know it’s after lunch and everyone is thinking, I’m going to just digest here quietly, but could we... Okay, the next stage is that I’m going to come around and... Thank you very much.

>> Hi, everybody. So I can’t really speak on the technical implications because I’m nothing of an IT expert, but out of solidarity I just wanted to jump in on some of the things that have been said here by the previous speakers, and link them to, I would say, the terrorist threat climate we see rising in Europe and how this can actually be one of the things that will make some of the Governments, and even perhaps courts, eventually take a much more, I would say, damaging approach when it comes to privacy and to ensuring that democratic principles are being preserved in Europe.

I mean, a clear example of that, for instance, could be the recent adoption by the European Parliament of the passenger name records. For the people who have followed the debate: for years, Parliament stood against the wishes of Governments trying to collect data on travelers in Europe, and following the terrorist attacks in Paris and afterwards in Brussels, obviously one of the implications was that the Parliament quite swiftly decided to take a much more protective approach. And, I mean, there are good foundations for that.

>> It’s understandable, isn’t it?

>> Exactly. But the problem is, who can actually assure us that tomorrow, you know, Governments and the European institutions will not take a much stricter approach when it comes to encryption, data retention and so on? Perhaps it’s really also about exploring ways in which we can preserve users’ privacy while at the same time obviously countering big threats to our society.

>> Thank you very much. Anybody else want to make some contribution?

>> Yeah, my name is Wolfgang Kleinwächter. I wanted to make a comment about what Kathy said this morning about the collaborative approach to issues like security, also on a more, let’s say, political or national level. What I’ve seen in the last couple of years is that the issue of Internet security has really moved from a more, let’s say, technical security understanding to a question of what we can do against criminals, and has now reached the level of war among states, so these are totally different layers when it comes to security.

When it comes to the question of national security, we meanwhile have a number of bodies dealing with this issue on a global level, in particular the so-called Group of Governmental Experts, which operates under the First Committee of the United Nations General Assembly, and we see now that foreign Ministries, the G7, the BRICS countries, at the level of Foreign Ministers or other Ministers, are dealing with this issue. And the funny thing, or contradiction, that I see is that in particular the Western democracies support the multistakeholder principle in their documents, like the G7 document from Takamatsu or what the G7 leaders in Japan adopted three or four weeks ago. They have nice words for the multistakeholder principle, but the final document from Takamatsu, the Declaration on the connected world, and the G7 principles for cyber, which were adopted three weeks ago by the Heads of State, or the new G7 Working Group on Cyber, have nothing to do with the inclusion of stakeholders.

There was no call for public comment. There was no draft which could have been circulated among stakeholder groups for the Takamatsu document. It was just written by some people who worked directly in the office of the Prime Minister or the President, without any consultation, and I think this is a real contradiction. That’s why I wholeheartedly support what Kathy Brown has said today. Even if we trust that democratic Governments will do their best to save cyberpeace and to work in the interest of democratic society, you know, the outcome really leaves holes which can be filled by contributions from the other stakeholders.

We are all interested in national security. Civil Society is interested in national security. Business is interested in national security. Also the technical community. Why not use this knowledge and come to results which at the end of the day would be much more sustainable than just a one-stakeholder approach? And I think we take this for granted in the Internet Governance community and the ICANN community and the IGF, but these groups have had nothing to do with the Internet in the last 15 years.

So that means we have to reinvent the wheel in a certain way now on this higher level. I think this is really a good subject, and again, I’m very thankful that Kathy raised this but this is an agenda for the next five years. Germany will host the G20 next year, and there’s a discussion now in Germany that they will organise a multistakeholder conference on Cyber Issues under the G20.

I think this is probably a good opportunity, you know, to jump in. Italy will chair the G7 next year. I hope Mr. Renzi will organise a multistakeholder conference under the G7. I think this community should be very sensitive to this and go to these places. They will not come to Europe, so we have to go to these places and say it’s not enough to pay lip service to the multistakeholder model; you have to practice what you preach. Thank you.

>> Thank you very much for that. You raised some very important points there; in a way there are almost two parallel processes going on. You get the democratic Governments talking to us, saying, yeah, carry on, you’re doing a great job, and then meanwhile keeping the security dialogue to themselves, if I can over-summarize what you’re saying.

So how do we match up the two? Is there anybody here from Government? What do you think of the proposition that national security should be a multistakeholder process?

Sorry to put you on the spot.

>> Diane Steinour from NTIA in Washington. So obviously, we do deal with these issues. We do have stakeholder consultations in our cyber dialogues. They’re a new feature of our interaction with other Governments. I think your critique, or the gentleman’s critique, of the G7 process is probably pretty fair, but we do have Advisory Committees made up of stakeholders that advise different agencies in our Government, so there is general knowledge of what stakeholder positions are when we’re in a negotiating phase of outcome documents for those processes.

And I think that’s sometimes a function of the host Government and how they want to conduct the negotiations, so yes, stakeholder interactions and consultations are always very key. We have a cybersecurity dialogue now with the EU. I believe there’s a call tomorrow to try to organise the next one in December, so you might want to talk to the External Action Service people about how you might feed in. We generally do talk about Internet Governance issues in those types of consultations. We have them with the Nordics and the Baltics as well on a regular basis, and occasionally with the French and the Germans, and on a European basis.

>> There’s one thing, perhaps, if I could just probe a bit further: Do you think it’s right that Governments should give up that responsibility to secure... I’m just trying to think of what the opposing point of view would be. In Human Rights law, and in many of the European laws, you get carve-outs for states on issues of national security. In other words, a real recognition that it is actually down to States to secure their citizens.

If you get rid of that, in a way, that strong link between the State and the responsibility of securing the citizens, and throw it open to “everybody,” is there a risk that it’s nobody?

>> I’m trying to ascertain your point. What you’re offering is that you would like to see a multistakeholder model for solving security problems?

>> No, I’m saying the proposition is: Let’s have multistakeholder governance in National Security. Isn’t there another point of view of that? Which is: No. This is the preserve of states. We’ll consult with you, but ultimately it’s our

>> It depends on the sovereignty of the region and I guess I could spend all day trying to find out from all of you guys the whole complex situation of substantially and

>> Not making you speak for the U.S. Government.

>> And I won’t.

>> What would your take be on it?

Okay, anybody else? Sir?

>> So actually, if we look at this question, I would argue that the states are doing a very poor job, like, all the states are doing a very poor job at just providing that security, because the framing of what security is at this point, in terms of national security, maybe security of, like, the main economic assets et cetera, is a very narrow definition of security.

So look at security as the security of everyone, of, like, the end users, the end users who interact with these bigger systems as well. Everybody uses a laptop, including the people who run the systems that control our economies et cetera. If we take that into account, then it becomes a whole question of security versus security, and not of security versus privacy or any kind of, like, fake juxtaposition, which is the frame the debate has been going on in right now, where we’re sitting in this room saying yada, yada, yada, Human Rights, and in some other room people are saying: Yes, but we actually have to solve real-world problems like terrorism and other stuff, you know? Your Human Rights, I mean, I’m white. I don’t even have Human Rights. I have privileges, you know?

And that’s what we have to do, I think: really change the narrative of this debate, and talk about how cybersecurity is not only about protecting the interests of companies or businesses or Governments or national security, but the interests of the end users, of the people. And in that way, Human Rights and security are aligned instead of juxtaposed. There are only a few situations in which there will be a trade-off between one kind of security and another kind of security.

>> FRÉDÉRIC DONCK: Thank you very much. Thank you. I like this conversation a lot, and you’re right, we might spend a bit of time, maybe not here and not today, but we might spend a bit of time defining security, because there are so many angles. You know, security as protecting Government data, protecting the nuclear premises here in Belgium. You’ve got a lot of different angles, but it is also protecting citizens, protecting citizens’ information, so there are many angles. We might spend a bit of time on this, because I’ve seen nowhere a clear definition of what cybersecurity is. That would be my first point.

The other big challenge certainly is security and the Internet, the open Internet: how could we keep those really big foundations of the open Internet, end-to-end accessibility, together with security? How can we combine both? That would be the greatest challenge.

Now, I’d like to come back to you with... I was surprised this morning. I thought that there would be someone in the audience who would jump onto Privacy Shield, and I haven’t heard this word. I love this word, Privacy Shield. Some ambitious

>> Superhero.

>> FRÉDÉRIC DONCK: Yeah, Privacy Shield. When I turn this word over in my mind, I just come back with a smile. So anyway, we got something, and this is not that black and white: Europe saying, I need to protect my citizens who don’t have guarantees. So again, the Privacy Shield, as I understand from many comments, including from the EDPS in Europe, won’t fly. It just needs to be reviewed very thoroughly.

Maybe things, as I mentioned this morning, are not that clear. Yes, in Europe we’ve got very strong regulation, the GDPR, to protect privacy, but at the same time, at the European level, you have an open bridge for Governments to tap into citizens’ data.

While in the U.S., you’ve got court oversight over many of those issues. This is something that needs to be recalled. Also, when it’s about corporations, just trying to strike a balance here: there are very strong corporate rules, and data officers in those big companies in the U.S. take privacy very seriously. Take the Apple case with the FBI. There are people saying this was a commercial, marketing approach, yet those guys just wanted to protect the data of their users.

While maybe in Europe we sometimes feel that this is only being adopted formally, so you’ve got 10 boxes you tick and you’ve got security. So I would like to hear from you: well, first, is it that black and white, that Europe needs to teach the rest of the world how to take care of its citizens? And what about privacy? How do you see it going forward?

Because if you listen to the experts, they’re saying this won’t last much longer. It won’t fly. So does anybody want to try to take on Privacy Shield from this room? Do you want to address this? Or do we go to the next point? Up to you.

Any takers?

Come on, Privacy Shield. Please, stand up.

>> Hi. I’ve been following this Privacy Shield and its previous iteration, and I think the problem with these kinds of agreements is that you’re trying to combine two systems that are inherently incompatible: you have the American legislature and the American way of doing Internet things, and then you have the European way of doing things. And those two systems don’t match. That’s the reason we’ve had such different paths of development within the community.

And what they should be doing, I mean, how I think this matter should be resolved, is to basically admit that, and not try to put two incompatible things together, hoping that by some magic it will work. Because right now, what’s happening after the Safe Harbor agreement, now they’ve come up with this, and I agree, it sounds very Marvel-like, Privacy Shield and Captain America. I wonder which superhero represents Europe in that case. Let’s hope it’s Iron Man.

But what I’m trying to say is that now they’re playing the waiting game, so they’ll pitch a new proposal and wait it out until it falls in the European court, and then they’ll pitch a new one, and so on, and so on, but I think we’ll never get to a point where you could say: Okay, now we’ve solved this problem completely. Let’s move on.

>> FRÉDÉRIC DONCK: But then, because we speak of Privacy Shield, indeed we speak EU-U.S., but there is the rest of the world. The Internet is global. Maybe we should step back a little bit and think about privacy globally. I have an anecdote. A friend of mine who is not living in the U.S. but in Latin America told me that privacy in his country is very flexible. He told me that in order to receive a coupon for a 10% reduction on something on the Internet, people will give every figure you want, including the birth date of their mother, et cetera.

So it’s another example of how privacy might be seen globally. So what could we do about that? I see someone from the Council of Europe, and I know she’s thrilled to speak about Convention 108, and it might be one avenue that we might investigate. I don’t want to put you on the hot seat, Sophie, but you might take it, actually.

>> I was in a flash session on Data Protection

>> FRÉDÉRIC DONCK: Now you’re here.

>> I haven’t had the chance to listen to the previous interventions, so indeed, the theme being Transnational Data Flows and users’ trust, we have a Convention in the Council of Europe and the principle of the Convention was basically to allow for the free flow of data. When the Convention was developed over 35 years ago, at the same time the OECD was working also on its privacy guidelines and the experts from the world that participated in the OECD work were also participating in the Council of Europe on the preparation of this Convention.

So really, this idea of enabling flows of data was crucial for the Convention. We have tried to promote the Convention as broadly as possible, because basically the more countries we have in the Convention, the more the scope of free flow is extended. At the moment, well, in September, we’ll have 48 countries, so gradually more and more non-European countries. But we are now facing a new difficulty which we didn’t have previously, which is that although the 28 Member States of the European Union are party to the Convention themselves, they actually cannot implement this free flow of data that we have in the Convention.

They’re restricted by the EU regime. This is something that the EU, negotiating in the Council of Europe on the modernization of the Convention, has repeated very strongly and has actually translated into what will be the text of the future Convention. So we do have this restriction.

Nevertheless, what I really want to highlight is that the EU, and you see it in the regulation, in the GDPR, in one of the recitals, is supporting the promotion of the Convention towards those third countries as a first step in the alignment of Data Protection legislation.

So I think that, from a EuroDIG perspective and for the discussion we are having, the Council of Europe Convention is a good instrument for the EU and for the European countries in basically bringing in other regions of the world. We have more and more African countries coming in, but also Latin America. We have a meeting next week in Strasbourg where we’ll also have Asian countries, so really the Convention is seen, I think, by the EU and the EU Member States as a tool for bringing other regions of the world to this common understanding of Data Protection. Thank you.

>> FRÉDÉRIC DONCK: Thank you, Sophie.

>> Ana?

>> Hi again. So I want to take this discussion from privacy more towards innovation. That’s one of our topics as well. And I think if we talk about privacy, we might sometimes be crashing into the wall, going into depression, et cetera, et cetera. But if we talk about innovation, we might be more cheered up, and let me say what I mean.

I think everything would be better regarding data regulations, data retention, if we had more innovation within Europe, if within the countries we saw more startups, for example in Germany. Startups like in California, but we need them locally. If I’m concerned about my privacy, because I’m using the services that a company in the U.S. is offering and I’m wondering how my data is transferred from Europe to the U.S., then the best solution is to have a local startup, a local company, that would offer me the same.

So that’s one thing that’s something we need to discuss. Why don’t we have more startups, more companies, more businesses?

>> What’s the answer to your question? Because when you look at the United States, you see these incredibly vibrant, world-beating companies coming out, and an incredibly successful innovation story. What’s happening in Europe? What’s the answer?

>> Ana: So, yeah, why is the U.S. blooming? And why do we always need to use Spotify, which was created in Sweden and has now flown to California, or... There was this European search engine, Quaero, which died, which was supposed to be the competitor of Google. Quaero? Which wasn’t successful. So every time I think of that, the first thing is that

[Off Microphone]

Yeah, pronouncing it. The first thing is that we need more venture capital. We do have stronger hubs, in London, where compared to the rest of Europe the businesses are doing better, but we need more investment in startups.

Second is that

>> I’m going to say, there are a couple of people who I think wanted to come in on this. Sir

>> Let me finish the sentence, and then you can always stand. The second thing is that Europe is very diverse. So if I’m a business startup in Holland, the first thing is to make my business in the Dutch language, then in English, and then, if I want to operate in Germany, in German. So language is a barrier, the diversity is a barrier, and the third one is a bit more of a mentality thing. It starts from education.

I don’t think Europeans are as business oriented as people in the U.S., so maybe we need to address the educational component of innovation, so that kids at school start to learn coding and the basic things. And that we have more venture capitalists. And, of course, as a company, for example from the U.S., if I want to make a business in Europe, it gets so complicated with the different legislations in Europe. Even though the Commission is trying to harmonize it, it’s still sometimes just very stressful to understand where I am allowed to do what.

So another thing that has to be done: more harmonization of legislation within the European Union, as much as we can. So those are the points, and I would love to hear more about innovation rather than again and again discussing privacy.

>> Sir?

>> Hi. I’m Krishna Kumar from Berlin, and I come from India. I would actually like to go back to the privacy topic and add my perspective to what Frédéric said, particularly with the example of the coupons and the compromise on privacy. That is mainly because people have other important things to care about, which they rate higher than privacy, like getting food on the table and saving money for their families, and coming from a developing country I understand this far more than people here. You can do it in your head: compare the GDP per capita of Europe and the GDP per capita in developing countries. It’s not that people there don’t care about privacy. They have other key issues to care about before they care about privacy, but that doesn’t mean their privacy can be exploited right now by companies from the West. Thank you.

>> FRÉDÉRIC DONCK: Thank you, Krishna.

>> That reminds me of the expression by George Bernard Shaw in “Pygmalion,” My Fair Lady, where he has Eliza Doolittle’s father say: I can’t afford middle-class morality. And there we are talking about a kind of hierarchy of needs, and what gets compromised and what people choose to do. Al Gore also said, as well as saying that we’re living in a stalker economy and that businesses are collecting way more than they should, that there is a point where we need to reach a choke point, and we really need to stop choosing convenience over our fundamental rights every single time.

There are two people waiting there and then I’m going to come to you, Sir.

>> Yeah. I’m from Deutsche Telekom. You just mentioned it might be important to have European service providers that then also do the data processing in Europe. Certainly that could be a good point, but a European service provider does not necessarily process the data in Europe. You could do that anywhere, which brings me to the point that I would like to make: transparency and the consumer’s choice. Particularly when data flows are complex, when you can process data anywhere, when you can have a server, I don’t know, in Hong Kong and still provide the service in Belgium, you need more transparency. That point was made in the morning already. What we need to think about in that context is paying with data.

It’s the point that monetary payments are more and more being substituted by the submission of personal data. That’s totally fine. That’s a business model. What we have, and that’s something we haven’t discussed today, is therefore actually a convergence of privacy and consumer protection. They are more and more converging or overlapping, but nevertheless the rules are not applicable. If we look into consumer protection rules, the horizontal ones, there are tools which you can reasonably translate to contracts that are based on the submission of personal data: provide clear information right before the contract is concluded, so that the consumer is actually enabled to make an informed decision, for example: Yes, I can enter this contract because this provider ensures my data are processed in Europe. I think that is the key, maybe much better than trying to somehow regulate the flow of data: to have, right at the beginning, clear and understandable information when you conclude a contract that is based on data submission for commercial purposes.

>> Can I ask something? So you’re from Deutsche Telekom, is that right?

>> Right.

>> Do you provide cloud services?

>> Yes.

>> Since the Schrems decision, have you, like other cloud providers, started to offer guaranteed EU processing, or storage? Data at rest in the EU? Should we put it like that? Is that one of your offerings?

>> I’m not sure if that is contractually agreed, but that is one of the business models we are thinking about, and what we also do is process data in Germany, for Deutsche Telekom, but also across Europe, and I think that is also a differentiator, of course, if you advertise it towards business customers in particular. If you say your data are safer when you’re with us because we process them in Europe, that is maybe a benefit, an incentive to, yeah, provide higher protection standards.

>> Just to step out of the Moderator role for a second and to share an anecdote: as a software provider who is also reselling cloud hosting, to be more precise, working for public sector clients in the United Kingdom, I have noticed that whereas I never, never got asked before, in the last six months I’ve been asked by public sector clients: And can you guarantee EU hosting for the cloud provision?

And I never would have expected that to come through from the clients before, but it seems to be happening. So we have this parallel conversation about the importance of maintaining Transnational Data Flows, while commercial companies are getting into it: with Amazon, you can elect to have your data hubs within the European Union if you want to, and I’m sure other providers are doing the same.

So there is that sort of contradictory position, and also apparently at least two clients who are actually interested in having their data hosted here. Just to share the anecdote.
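As a purely illustrative aside on how that kind of "EU hosting" guarantee is typically requested in practice, the minimal sketch below assumes the AWS SDK for Python (boto3) and uses a placeholder bucket name; it shows a client pinning object storage to an EU region at creation time, and other providers expose an equivalent region choice:

  # Minimal sketch (not from the session): pinning data at rest to an EU region
  # with boto3, the AWS SDK for Python. Bucket name and region are illustrative.
  import boto3

  s3 = boto3.client("s3", region_name="eu-central-1")  # Frankfurt region

  s3.create_bucket(
      Bucket="example-eurodig-client-data",  # placeholder; bucket names are globally unique
      CreateBucketConfiguration={"LocationConstraint": "eu-central-1"},
  )

  # Objects written to this bucket are stored in the chosen EU region;
  # contractual and access-control guarantees are a separate matter.
  s3.put_object(Bucket="example-eurodig-client-data", Key="report.txt", Body=b"hello")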

There was a gentleman behind you. Then you, Sir.

>> I’m just going to speak briefly and relate to both of the previous speakers. I think with the notion of an informed user making smart decisions based on information gathered, you’re talking about a pink unicorn. Internet users never read EULAs. We’re not using the Internet in a rational, calm manner. We use it passionately. We use it for a purpose. We use it for a reason, and if that reason coincides with our rights being violated, especially rights to privacy, rights to Data Protection, protection of personal data, I think that’s a risk we’re willing to take, because we don’t know what loss will come of us losing privacy and losing our personal data.

And I think the companies, I’m not sure what they should do, but on the other hand, saying that you’ve put labels on things saying they’re dangerous and so you’ve done your job, I think that’s a moot point. I mean, the companies could maybe invest in a long-term education process, and not just the companies. I mean, I’m not saying Deutsche Telekom should go into public schooling developing a programme

>> I didn’t see him leaping up to...

>> No, but I’m saying that there’s no silver bullet, but we should think of the different interest groups that are involved in this conversation, and we should each decide on a specific goal that a certain interest group should pursue, in order to make the world a better place. Thank you.

>> Thank you.

>> Thank you. I would like to come back to the innovation issue, which I’m more interested in. I am from the Confederation of European Computer User Associations. Let me come back to innovation. Lots of interesting things come out of Europe. Skype came originally from Estonia. Then it traveled over to Denmark and then to the USA. This is in addition to the other examples here.

I think Europeans are pretty good at inventing but they are poor at selling and marketing. That’s where the Americans are 10 times better than we are. So that’s something, it goes across the Atlantic and they make the money. That is the point.

There are exceptions, like SAP in Germany and Airbus; those are two exceptions Europeans should be proud of. With Airbus, it took over from the U.K.; the designs all came from the U.K., and now the U.K. makes the wings for the Airbus. So we are not bad at inventing, but we’re bad at marketing and selling, at least on a global scale. To come back to why we have so few startups in Europe, part of it lies in the European culture, the European mentality.

I lived for many years in North America so I know both sides of this. In Europe, if you start something, a startup, and you fail, you feel that is sort of a death sentence. You are finished. That is the feeling many people have.

When this happens in Silicon Valley they say try it again. You learn from your mistakes and don’t make it again. So this is a difference in culture, mentality, which we need to overcome. Thank you.

>> Thank you.

>> FRÉDÉRIC DONCK: Thank you. I’d like to continue with what you were saying and link it to what Ana was saying about innovation and privacy. I have a simple yet candid question, and I’ve heard it many times: will my privacy be a competitive advantage for some businesses? Or, to put it the other way around: is such a business completely doomed? Is business, or the business ecosystem, privacy-hostile? Could we find a way to reconcile privacy, innovation, and business? How do you see it? Please.

>> I’m from Greenhost, a hosting provider based in the Netherlands, and privacy and security are some of our core values and things we put forward. There’s definitely, I wouldn’t say that it is the bulk of the market, but it’s definitely a niche that attracts certain people, people who also have the chance or the opportunity to choose that.

It’s also useful for a lot of companies to choose that, so it definitely can have a benefit. But in the end, I think there are a lot of other points that are better selling points to people, and price, especially on the Internet, is still a huge selling point, and in an environment where “free” is the norm for a lot of services, and money is made by selling people’s profiles or search history et cetera, there’s not a lot you can build that can beat that kind of free.

There’s this interesting question you raise, I think, in a way: Can we do something different in Europe, right? I mean, it’s interesting to just navel-gaze at how successful Silicon Valley might be, but everybody who has walked the streets of San Francisco and seen all the people who were less successful can see that we’ve built a different society in most of Europe, and that has its costs but also its benefits.

So the question then becomes: if we are more risk-averse, that can also have benefits, because if it’s not the case that, like, 19 out of 20 startups fail, then those startups that we do have could be more successful, with relatively more money, without all the losses that come with it.

So what, then, is the kind of added value that the European startup can have, right? I think that’s the kind of question you should ask, and not how we can, like, move something from one cultural context to another cultural context and, well, just see it fail, as it does when people actually try to do that.

>> Thank you.

>> FRÉDÉRIC DONCK: Thank you. So, but again, let’s continue on this. Ana this morning said privacy is a privilege, so maybe at the end of the day we might need to be paying for privacy.

And it’s also about the consent of the consumer. You’re right, sometimes I feel we should all be postgraduates from Harvard to understand the 10 pages from Facebook or Google when it’s about the rules and the contract we enter into when we start contracting with them. This is about the contract.

So is there hope that people can give meaningful consent, that maybe under user pressure, finally, and I’m talking about edge providers, it might be about my mobile provider telling me exactly, transparently, what they do with my bits and bytes when I use my mobile. Are they using the IP or that kind of stuff? Is there hope for meaningful consent to really become something? Back to you, Ana.

>> Ana: Yeah, following up on that comment. As I said during our previous panel, I am optimistic that users will become more aware, and once that happens, and I said that in the morning as well, we will all realize that we will go and take services from the businesses that have privacy, as well as transparency, in their key business model.

So, back again to Airbnb. What I did was look for another apartment host provider that didn’t require my passport scan, and I looked, they have pretty simple demands, which is okay. And I think that from the users must come the demand to the businesses that your terms and conditions should be easy to understand.

Once we reach this point, once the users make the demand that they be more transparent, they will adapt their businesses. As was said on the panel, businesses are built on users’ demand, so what we demand and how we behave online is going to shape how businesses behave, I hope.

And the Government should play along and also push this alongside the users, with regulations et cetera, et cetera, and Civil Society should make sure that users become more and more aware of what’s going on with their data when they tick the box of the terms and conditions, so that you don’t give up your privacy to someone else.

Yes, Airbnb might have said in its terms and conditions that they will secure it, but at the same time, the second thing is that I really need to be sure that when the data is not treated properly, they are accountable. So these are the few points. Maybe chaotically said, but if it makes sense, I hope you will follow up on these points and tell us more.

>> Thank you.

>> FRÉDÉRIC DONCK: Please.

>> Thanks. Only a short remark. I don’t think it’s too complicated in some regards, since it would already be a big step forward to simply indicate whether data are processed only for service delivery or whether they are also processed for other purposes. We have these discussions ongoing on a new level and a new directive, but I think that is, for example, one key issue. If I know I submit my data because the provider can then deliver the service to me, fine. If I know, okay, I submit my data and then something else happens with them, in an anonymous way or whatever, that would be an information requirement that requires transparency.

At this point, I think that’s already a big step forward, and one needs to think about reasonable translations, how you can summarize it, which information is crucial. That is definitely something different from an explanation of all the details, which is required for other purposes.

But some bullet points on the main characteristics of the contract: how data are processed, if they are; for example, where they are processed, and whether it goes beyond service delivery. I think there’s quite some room for improvement, and there are possibilities.
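To make that suggestion concrete, here is a minimal, purely hypothetical sketch of what such a short, machine-readable pre-contract disclosure could look like; the field names are illustrative assumptions, not any existing standard or the speaker's proposal:

  # Hypothetical sketch of a pre-contract processing disclosure; field names are invented.
  from dataclasses import dataclass, field
  from typing import List

  @dataclass
  class ProcessingDisclosure:
      """Summary a consumer could read before concluding a data-based contract."""
      service_delivery_only: bool                       # data used solely to deliver the service?
      other_purposes: List[str] = field(default_factory=list)       # e.g. ["advertising"]
      processing_locations: List[str] = field(default_factory=list) # e.g. ["EU"] or ["EU", "US"]
      retention_period_days: int = 0                    # 0 = deleted when the contract ends

  # Example of the kind of bullet-point summary the speaker describes.
  disclosure = ProcessingDisclosure(
      service_delivery_only=False,
      other_purposes=["anonymised usage statistics"],
      processing_locations=["EU"],
      retention_period_days=365,
  )
  print(disclosure)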

>> Thank you for that. Matthias, can I ask you a question? You’ve done quite a lot of work, I guess through Reporters Without Borders, on terms of service, and we’re hearing issues around user consent and the sorts of purposes that are expressed and unexpressed in those.

Like, what we don’t know is how long the data is kept for. Things that we would ask our Government. We know that when we talk about Transnational Data Flows, the idea is that we should treat every bit of data the same, but actually, when we’re talking about the content layer, it seems to me that there are different types of data. If you and I have a private conversation, that seems to me to be private, and I wonder whether we could start to build up a dialogue around: could different types of conversation be kept differently, or treated differently, by providers? So just your general views on terms of service, and how they could be improved.

>> MATTHIAS SPIELKAMP: As a matter of fact, at Reporters Without Borders we don’t look into these things, but with iRights we do, because we’ve mainly been looking at copyright issues in the past, where terms of service are also important, but we started doing that for privacy and security reasons as well.

But I think, you know, what we are facing here is really a conundrum, because when we are referring to these Transnational Data Flows, we have to accept, well, we don’t have to accept, but it’s a thought that we need to look at very closely: the whole debate about Safe Harbor and Privacy Shield focuses on the exchange of data between Europe and the United States.

Now, as we discussed before, when we look, for example, at the investigatory powers that law enforcement and surveillance agencies have, they’re strong around Europe. Inside Europe, we have the digital single market, so if data is transmitted inside Europe, for example from Belgium to Romania, or from Britain to Germany, there are no legal requirements whatsoever, but this could open the data that is stored in a different country to different kinds of regulation, right? Because we have no single market when it comes to national security.

There’s no single regulation on that, so we have to be really clear on what we’re talking about here. But trying to answer your question, I don’t really think that we can regulate the kind of privacy that you’re referring to with terms of service. If we want to have a private conversation, we have to be clear that if we have an intermediary there, then this intermediary will, in most cases, be able to store the transferred data, which is the communication between the two of us in that case, and in many cases they have to do it anyway, because everything that you do on the Internet is copying, right?

So you need to store data. If I write you an email, some intermediaries have to store that email on their servers, so if you want to ensure you have a private conversation, you have to use encryption. Now, am I one of the people who says encryption is the solution to everything? No, I don’t think encryption is the solution to everything.

Encryption is, as someone said at the Council of Europe in a debate about freedom of expression, a medication. It’s not a cure, right? So the problem that we have is that we need societies that do not decide that the State can snoop on the citizens, right?

If we say that in case you want to prevent that, use encryption, you’re looking in the wrong direction. I’m a big fan of encryption. I’ve been using it for 15 years. My first PGP key dates from 1999. At the time, I could hardly use encryption, because there was nobody to communicate with in an encrypted way, right? Now the number has increased after the Snowden revelations, but I would say it’s with about 1% of the people I communicate with that I can communicate encrypted, because you always have to be two people to use encryption in a conversation.

>> How many people in the audience regularly use encryption for emails? Impressive.

>> Okay, but I want to cut it short. The thing is that here, again, there is user responsibility, definitely. You can use Signal. You can use other kinds of messengers that make it really easy for you to use encrypted communication, but at the same time, this is like civil disobedience in a sense, and it’s not a solution to the problem that we’re facing.
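As an illustration of the point that encrypted conversation always takes two parties, here is a minimal sketch assuming the third-party PyNaCl library, which implements the kind of public-key cryptography such messengers build on; the names and the message are of course invented:

  # Minimal sketch (assuming PyNaCl): each side needs a key pair, and each side
  # needs the other's public key, before anything can be exchanged confidentially.
  from nacl.public import PrivateKey, Box

  alice_key = PrivateKey.generate()   # Alice's key pair
  bob_key = PrivateKey.generate()     # Bob's key pair

  # Alice encrypts to Bob: she needs her private key AND Bob's public key.
  alice_box = Box(alice_key, bob_key.public_key)
  ciphertext = alice_box.encrypt(b"Shall we meet at EuroDIG?")

  # Bob decrypts with his private key and Alice's public key.
  bob_box = Box(bob_key, alice_key.public_key)
  print(bob_box.decrypt(ciphertext))  # b'Shall we meet at EuroDIG?'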

>> Please. Can we get a microphone to you?

>> Sorry. Just two quick points. I think one of the problems with privacy is that it’s an intangible item, and we, as a species, suck at valuing something that’s not tangible. We would rather have something in our hands than in our head.

The other thing, regarding notification about potentially dangerous technologies and potential abuse of our data: it just piles on top of this. Imagine if you had a car that, every time you started it, said to you: You can now kill a person with this. Be careful. Or: you can now die because you ran off a cliff. Those notifications wouldn’t sound nice, and you can wash your hands saying: Okay, we told them that they can die doing this, but it doesn’t raise awareness. In the long run, it just raises tolerance towards exactly the thing it wants to prevent.

>> But we don’t let people drive cars unless they’ve taken a driving test, right? Those notices are, at the moment, a substitute for all the training we put into people so that they aren’t zooming around the street crazily because they don’t know how to drive. We don’t do that with the Internet, so we have these notices instead, like the warning on a packet of cigarettes: “This will kill you.” Yeah, yeah, yeah.

>> Even with a driving license, accidents still happen. You have a colossal amount of

>> Think what it would be like without

>> Okay, thank you.

>> Is there anyone else?

>> Can I ask if it’s possible, because we’ve got 10 minutes left, I still don’t see the five items that we have anymore. Maybe we could put them on the screen again, if that’s at all possible. Is it there on the screen? No. If not, it’s okay; we can continue the conversation like this. I just wanted to make sure there’s no frustration.

>> I think that there’s an attempt to

>> FRÉDÉRIC DONCK: So we covered it pretty well, not everything, but

>> Can you see the slide?

>> FRÉDÉRIC DONCK: Well, actually, I can see it.

>> We can see it but I don’t know if you can.

>> FRÉDÉRIC DONCK: Could you?

>> I think you’ve just upgraded to Windows 10.

[Laughter]

>> While we’re waiting, should we hear from this gentleman?

>> Yeah. Hello again. I’m a practical guy. I would like to come back to the other issue brought up this morning, about AirBnB registering guests and asking for a copy of the passport. We have also discussed the difference between the physical world and the Internet world. Actually, when I checked in at the hotel here yesterday, I was asked for my passport. They took the passport and recorded some of the information; someone in the hotel immediately took a photocopy of it.

So this is actually standard procedure in hotels; only in this case, you were asked for it in advance. However, if you go to a B&B, you often never see the landlord. You get a number by email, there’s a key box, you key in the number and you get the key. You never see anybody.

So this is the only way the B&B can get the information that the hotel is required to collect, and I am told that in Belgium the passport information is a requirement of law enforcement. So basically, I don’t see the difference. Please help me to understand.

>> Can anybody help with that one on the passport? The gentleman is making the point that actually, we should chill out a bit about it, because if we go and stay anywhere, we’re asked for our passports. Sir?

>> Hi. No, I don’t have an explanation, but I think it is rather a psychological issue. If we go to a hotel and give our passport and they make a copy of it, we are somehow sure they will keep it there, and it’s not going to end up somewhere on the Internet in the hands of God only knows who.

But I think it’s

>> We’re probably wrong about that, aren’t we?

>> I don’t know. It might be, but that’s the psychological difference I see between the case from earlier this morning and the case this afternoon.

>> Thank you very much.

>> FRÉDÉRIC DONCK: Let’s get back to this, because we don’t know what they do with this information. I don’t know if I’m the only one to be shocked, and I use this example often, that whenever I call my service providers in Belgium, I get a wonderful voice telling me that my conversation will be recorded for quality purposes. What does that mean? I have no idea.

And I have no idea what they will keep, how long they will keep it, or where they will store it. Which part of me yelling that they don’t understand will they keep? It’s about transparency and the ability for users to control their data and to get their hands on it, right?

>> Okay.

>> I don’t think it’s about transparency again, you know? Because what happens if they make it transparent to you? “Okay, hi Frédéric, this conversation might be recorded for training purposes. We’ll keep it for six weeks and then we destroy it. Trust us that we really do it. And otherwise, some people can probably look into it for quality measurement.”

“But if you don’t agree to being recorded, we’ll terminate the call right now.” Right? So what is transparency then worth?

>> FRÉDÉRIC DONCK: Sometimes I have the choice indeed.

>> Sometimes you have the choice to

>> FRÉDÉRIC DONCK: To refuse.

>> To not have it, yes. But I’m referring to the idea that transparency is a value in itself. Of course it’s necessary for making a decision; if you don’t know what you’re deciding about, you can’t decide, and I 100% agree with that. But what comes after that? What if the transparency is there but you don’t really have a choice? “If you don’t let your conversation be recorded, you won’t be able to access our service,” right?

So I’d just like to pour a little bit of water into that glass of wine that is called transparency.

>> FRÉDÉRIC DONCK: You’re right.

>> This is also illusory consent, isn’t it? Because actually, somebody tweeted this morning: AirBnB’s standard terms, sorry to pick on them, run to 30,000 words. So say I’ve read the 30,000 words and I say, actually, I don’t consent to that, but I would still quite like to use your service. What choice do I have? We still haven’t quite evolved a sense of user empowerment in terms of: I would like the service, but I would also like not to have certain aspects of your P and S. I’m putting this out there. There’s a row of people. You’ve got the mic, so go first and pass it forward.

>> Just quickly, it basically comes down to trust. That’s all there is. If you trust the service, no matter what it says in those 30,000 words, you’ll use it. If you don’t trust it, it doesn’t matter what they do to improve their standards or their safety; you’re not going to use it, because you have this maybe even illogical fear or reason for not using it.

>> Thank you. Pass it forward.

>> There’s one here as well.

>> I think there are maybe two different kinds of trust. One is informed trust and the other is blind trust. I think end users generally need to be informed to make that decision, right? Simply saying, “well, I hope it works and I like the service, so it’s all fine”, yes, you can call that trust, but I don’t think it’s very intelligent. So again, I think transparency is key, but it’s a complementary key. You stressed the point with the phone call: I don’t want to make you nervous, but at least you’re informed. There are many services where you’re not informed and your call is still recorded and used as information, right? Here, at least, transparency is better than having no information while your data is still used.

>> So I think trust here might be pretty much beside the point in a lot of situations, because sometimes you have the feeling that you can’t actually escape using these kinds of platforms; they are the only platforms that offer the service you want. There’s not a lot of choice. There might be a second one, there might be a third one, but people want to be on Facebook. They’re just not going to flip through those 100 pages. They’re just going to click “okay.” So it’s more about power; that is one of the problems here.

It’s the power of that platform. There’s no kind of remedial process if, for instance, AirBnB says, “well, we’re going to kick you off,” or Facebook says, “we’re going to block your account,” you know? Then you’re not part of it anymore, and you can’t just go to some small corner shop instead, because, well, all your friends are on Facebook, right?

So it’s that kind of power. Usually power relations can be negotiated through transparency, or through some kind of legal process, et cetera. But with the platforms basically being their own judge, jury and executioner, that becomes a very tough job.

>> I think that’s a very practical point. It reminds me of an anecdote a friend told me. It’s a Woody Allen thing. Do you remember this one? A man goes to the doctor and says: my brother thinks he’s a chicken. It’s a real problem. It’s causing a big problem in our family. He thinks he’s a chicken.

And the doctor says: just tell him he’s not a chicken; he’s obviously not a chicken. And he goes: yeah, but we really like the eggs, you know? It’s the same with these platforms: we love them. They’re so popular because they’re really good, and lots and lots of other people and all our friends are on them. So we get caught up about the terms of service, but ultimately, what they’re delivering to us is something we all use.

And if we didn’t, then they wouldn’t have that power, huh? Right.

Ana, I saw you getting the microphone. Would you like to

>> ANA KAKALASHVILI: Yeah, yeah. Anyone who wants to speak, please take the liberty.

>> Hi, I’m from Finland. Just a quick note on these terms of service. It’s actually arguable whether terms you don’t understand can be meaningfully agreed to. So how about this: just clicking would not constitute any kind of agreement. It’s a bit of a caricature, but you might require that users have to type out what they are agreeing to, rather than just clicking on something.

>> Also, of course, in the European environment, terms of service with consumers have to conform to the relevant Consumer Protection and Unfair Contract Terms legislation. I’m not really aware, maybe some others in the room are aware, whether the terms of service have ever been challenged on the basis of unfair contract terms. Does anyone know?

>> In any case, people should not have to explicitly agree to more than they actually read and understand. Most of it should come from law, and the exceptions that are spelled out should be short and few enough that people have a genuine ability to understand and agree to them.
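
A minimal sketch of the idea suggested above: agreement would require retyping each short, explicitly spelled-out term rather than just clicking a button. The function name and the sample terms below are hypothetical, invented only for illustration.

```python
# Sketch of a "type it out to agree" consent flow. The terms and names here
# are illustrative, not any real service's conditions.

KEY_TERMS = [
    "My passport copy is shared with local law enforcement.",
    "My booking history is kept for 6 months.",
]

def collect_explicit_consent(terms=KEY_TERMS) -> bool:
    """Return True only if the user retypes every term verbatim."""
    for term in terms:
        print(f"Please retype to confirm: {term}")
        typed = input("> ").strip()
        if typed != term:
            print("Text does not match; consent not recorded.")
            return False
    return True

if __name__ == "__main__":
    if collect_explicit_consent():
        print("Consent recorded for the listed terms only.")
    else:
        print("No agreement was formed.")
```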

>> Thank you. Thank you very much.

>> FRÉDÉRIC DONCK: I will be the party pooper right now, because we are slowly coming to the end of the session. So I don’t know how we want to wrap it up.

Come on, we’ll take

No, I just want to see if anybody wants to add something to this conversation. We’ve revisited so many issues, but I like the idea that we have those five bullets, which is actually a good starting point, because we revisited most of these issues there.

I liked the last conversation about innovation and privacy. We might turn it into what we talked about this morning, the Internet of opportunity, so we might just add another bullet in the future. But I really like these, and I hope you do, too.

Is there anybody who wants to take the last word? It should be you, Nigel. You just moved, so take the mic. It’s a good idea.

>> NIGEL: Sorry, I certainly don’t want to take the last word, but I did wonder. I wasn’t here right at the beginning of this conversation, and I have no problem with any of the points there, but where I think this debate sometimes falls over is between domestic, regional and global approaches. In other words, we’re very good at talking about privacy, particularly in Europe, and there’s no reason why we shouldn’t be. We have a strong heritage in the work that the Council of Europe in particular has done, and of course the European Union, with the directive and now the updated regulation.

But what has really been lacking is a global approach. We haven’t been able to, for whatever reason, really enthuse other parts of the world in the same way. Now, the OECD has done work and come up with its privacy guidelines, and the Council of Europe has done great work in promoting this, but we still have this lacuna, this gap. We have this global gap where so many countries have not adopted these principles, and so I think something that captures that is needed as we go forward on these issues. But that’s just a thought.

>> Thank you for that.

>> FRÉDÉRIC DONCK: Thank you.

>> Perhaps it’s also fair to mention in your summary, Nigel, that there’s been some very good thinking about privacy and security by design, particularly in relation to the Internet of Things, coming out of the U.S. authorities. I’ve been very impressed with the thought papers coming from, I think, the FTC, or the FCC, I always get them muddled up. But yes, I think you’re right about the importance of the global conversation. Isn’t it also the case, going back to the point made by the speaker earlier, that in some countries people are contending with much more immediate and much more difficult issues: in Internet terms, basic connectivity, but in human terms, some real challenges driven by poverty.

And perhaps one of the things we can do, as a resource for other countries, is to develop the thinking as we see it, and encourage others to join the conversation and share their experiences.

I’d like to thank everybody in the audience for your contributions. This has been a much more intimate conversation than this morning’s, but it’s been thought-provoking and valuable, and I really thank you for that. We’ve ranged around transnational data flows, and issues about privacy and security and how these are not in contention with one another but actually part of the same whole.

We’ve talked a lot about terms of service, and also a lot about innovation and trying to get a much more positive conversation going within Europe about that, and much more belief in ourselves. Sir, you wanted to have a final word? I think there should be a microphone somewhere. Oh, there.

>> Okay. I’m with Electronic Frontier Finland. I think we as Europeans should set an example for the Americans and create a social media platform that respects privacy, for example. I think it can be done without selling all the customer data. I think it can be done nicely.

And I think there should be a much bigger effort. I’ve been wondering why there isn’t anything like that in Europe, and since this is about flows of data, we should have the data flow our way for a change.

>> Maybe that’s a difference, because in the U.S., instead of wondering why someone else hasn’t done it, the response might be: I’m going to do it. So maybe that’s your takeaway from this; these are very important points.

So I think that brings us to the top of the hour and the coffee break, for half an hour until 4:30, when we’ll reconvene to hear Commissioner Oettinger and Ross LeJeunesse from Google talk about the Internet of Things. Thank you very much.

>> FRÉDÉRIC DONCK: Thank you, Emily, very much.

[Applause]

Session Twitter hashtag

Hashtag: #eurodig16 #trust