Will users’ trust impact on transnational data flows? – PL 01 Part 1 2016

From EuroDIG Wiki

9 June 2016 | 11:30-13:00
Programme overview 2016

Session teaser

This session will explore the socio-economic effects of transnational data flows and discuss implications for policy making.

Session description

In this plenary session we will discuss the role of the Internet in economic development and innovation. We will specifically address concerns regarding privacy, trust and transparency. In an ever more complex world, everyone appreciates apps and services that make our lives easier. At the same time, individuals face serious challenges when trying to keep track of their “digital trail”, let alone understanding how their data is processed or even knowing where it is actually stored.

The panel will address risks and opportunities while tackling the key questions:

  • How is digitization transforming the economy and society?
  • Is there a need for a global policy framework for data processing, or will open standards suffice?
  • Has the EU become a fortress while attempting to protect its citizens’ rights online?
  • How can user trust be gained and maintained?


Keywords

Big Data, transnational data flows, international agreements, data-related technologies, algorithmic technology and decision making, artificial intelligence, privacy, security, user trust, innovation, European economic divide


Format

We start with a keynote by Kathy Brown. She will then join her fellow panelists for an initial round of input statements. After this we will quickly open up the discussion to the floor and have a highly interactive session. There is also the opportunity to continue the discussion in a follow-up session in the afternoon, which will be completely open (no panelists) for everyone interested in continuing the debate.

Further reading



People

  • Focal Point: Thomas Grob, Deutsche Telekom AG, Germany/Switzerland
  • Key participants
  • Kathy Brown, President and CEO, ISOC
  • Ana Kakalashvili, GIZ GmbH
  • Ross LaJeunesse, Global Head of International Relations, Google Inc.
  • Matthias Spielkamp, Journalist/AlgorithmWatch
  • Pearse O'Donohue, DG CONNECT, European Commission
  • Moderator: Frédéric Donck (ISOC) / Emily Taylor
  • Remote moderator: Su Sonia Herring
  • Org team
  • Ana Kakalashvili, GIZ GmbH, Georgia/Germany
  • Marco Pancini, Google, Belgium
  • Frédéric Donck, ISOC, Belgium
  • Karen McCabe, IEEE, USA
  • Justin Caso, IEEE, USA
  • Reporter: Thomas Grob, Deutsche Telekom AG



Mailing list

Contact: pl1@eurodig.org


Messages

  • There is no trade-off between privacy and security.
  • Security needs to be a collaborative effort. Subsidiarity works: intervene at the least intrusive level possible!
  • The multistakeholder model offers the tools to solve complex issues. The approach needs to be open, transparent, inclusive and accountable, and it needs active engagement; we need to do more!
  • Transparency and openness are meaningless if people do not understand what is being disclosed, or if there is no alternative option.
  • Openness requires shared responsibility: companies and governments cannot be held solely and completely responsible for what people do online.

Video record

See the video record on our YouTube channel.


Transcript

Provided by: Caption First, Inc., P.O. Box 3066, Monument, CO 80132, Phone: +001-719-481-9835, www.captionfirst.com

This text is being provided in a rough draft format. Communication Access Realtime Translation (CART) is provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings.

>> MODERATOR: So, hello, everybody. I would suggest we start this session. This is the first plenary, and I thank you for joining us for this extraordinary plenary. We will address so many key issues: data flows, digitalization for society, and, I guess, from our panel, issues like data protection, privacy and of course trust. I will get back to this in a few seconds, and I will introduce the great panel that we have today to try with you to investigate those issues. Before I do that, I would like to ask Kathy Brown, who is the Internet Society president and CEO: Kathy, if you don’t mind, could you join me here and offer us your keynote? Thank you very much.

>> KATHY BROWN: Hello, good morning. I understand when I was sitting there you couldn’t hear very well. Are you okay now?

Hello, everyone, thank you so much for having me here, for inviting me to this panel. I think it’s going to be a good one. I thought I would first thank all of you for all of the conversations, all of the commitment that folks here show to our ever important principles of an open, global, fully connected Internet. I’m just on my way back from the African Internet Summit in Botswana. And it’s very heartening to me to see the energy that is generated around the world, listening this morning to the young people. Africa is a young continent, and it is in the hands of the young that the Internet there is being developed. Their commitment to community and to a multistakeholder approach to building their future is quite inspiring. I commend to all of you, from wherever we sit in the world, to turn around and look at our neighbors around the world and see that indeed we still have this global, distributed, open Internet that people are quite deeply committed to. And I applaud Europe and all of you here for that commitment. You all know that ISOC has, I think it’s 34 chapters now here in Europe, as we keep increasing. And a lot of ISOC’s work is done by the chapters, who are addressing the critical issues, talking to their governments, talking to each other, organizing efforts like this to make sure that the things we care most about stay on the table and are at the front of the agenda.

So what are we in the Internet Society focusing on this year? Two things: access and trust. Those are the issues we believe are, if you will, the existential issues facing the Internet after its, depending on which founder you talk to, 25- or 30-year history. It’s amazing what has happened over these years. We know, and won’t repeat, the kind of growth we have seen in parts of the world, but our concern is that only half of the world is connected. And we can’t have this. We cannot have only half the world connected to what is now the essential nervous system of our economies: the way people communicate one to the other, the way we have been able to actually change the way we live, the way we think, the way we gather information. We cannot have half the world not connected. The other issue we think is of existential value is the security issue. The trust that people must have in order to use the Internet has, we believe, come under attack. Every day we see some criminal hacking of the Internet, people losing their own data.

We are faced with corporate action that, at least for some who have not given permission for use of their data, is unacceptable and needs to be looked at. And finally we see government action that unfortunately includes not only surveillance but the use of our data without due process. These are serious issues. We see from the polling that takes place, the user response, and the conversations that are happening all over that these kinds of things need to be addressed. So today what I want to do, very quickly, to set up this panel is to try to connect two ideas: one, the multistakeholder idea, which we talked about a lot this morning, and two, this idea of addressing security and trust.

Let me start with the multistakeholder issue. Indeed it is a post-WSIS world, as Frederic said this morning. In November of last year the world’s governments stood up and said: yes, indeed, we agree with what we did ten years ago, and so-called governance of the Internet should be a multistakeholder, bottom-up process and not a governmental, top-down process. There was unanimous acceptance of that report. We should applaud the work of this community and communities around the world that we have achieved that. We do have a consensus, at least on paper, that this is the way we should go. And now is the time to put some meat on those bones. We really need now to make it happen through the practical application of this process. I’m pleased that UNESCO is looking into it with research. I think it’s time that we all paid much better attention to the elements of what it is we have won here and how to put them into action. The Internet Society published a paper just this last quarter on why the multistakeholder approach works. We have now translated it into all of the UN languages. It is our intention to take this framework and push and turn it into actual, real procedural reform, so that the processes we use across governments, across companies, across our own communities become truly multistakeholder. What do we mean by that? We mean open, transparent, inclusive and accountable. We have a long way to go to get our processes to actually look like that.

And they must go beyond dialogue and consultation to actual decision making. There is some good thinking going on about how to make the legal kinds of processes that we all now use more stakeholder oriented and more collaborative, so that they actually result in participatory deliberation and decision making. Now, let’s connect the dots to security and trust. Even our allies, in governments and elsewhere, who agree with us that this collaborative approach is the better and best approach for the Internet, when we come to security they say: no, wait a minute, we will take care of that, thank you. The governments will decide, both inside their borders and at a global level, how we are going to deal with this.

And we believe at the Internet Society that this is a totally wrong approach, and that we need to take the same aggressive advocacy that won the multistakeholder approach to begin with and apply it to what we call collaborative security. Olaf will run a session later on in the programme on what we are calling the elements of collaborative security. I will try to lay out the elements of what we believe has to happen in collaborative forums: what we have to talk about, where, what and how, so that we get decisions that don’t balance privacy and security but that result in both.

I just feel convinced that this is the topic of our day. This is what people who use the Internet care about, it’s what governments are talking about, and it is what people all over the world are worried about. So as we gather together in our various forums around the world, I hope to bring this issue to the top of the list, to the top of our agenda. The issue of security and trust is the issue of the 21st century on the Internet. We must address it together, and we must use collaborative processes in the spirit of the multistakeholder approach that we have won so hard, and we have to do it loudly, boldly, so that it’s clear to the whole of the community how we proceed. So thank you for this opportunity. I hope to open this conversation up, and I’m looking to my distinguished panel to help develop some of these ideas. Thank you. (Applause)

>> MODERATOR: Thank you. Thank you very much, Kathy, for those very inspiring words. Let me introduce the panelists. We have Ross LaJeunesse. Next to you, Ana Kakalashvili, a student, but you’re also working in the legal department of GIZ GmbH. Thanks for joining. Pearse O’Donohue, thanks for joining us. There are also people in the room here who will help me drive these conversations, starting with you, Emily Taylor; I don’t know if I have to reintroduce you again. As our remote moderator we have Su Sonia Herring. Thank you to our remote participants as well. Last but not least we have a reporter who was instrumental in the organization of this panel, Thomas Grob; you are working with Deutsche Telekom AG. So thank you for this. Let’s start.

There is something I would like to insist on again. We are at EuroDIG, so I would like this to be a two-way conversation; you, the audience, are the sixth panelist today. This is the Internet of opportunity. I’d like to hear from you very rapidly: when you hear “Internet of opportunity”, what comes to your mind? Please? Who wants to start? Just one word: Internet of opportunity, what does it tell you? Don’t be shy.

>> Things are possible that were never possible before and we need to use those benefits.

>> MODERATOR: Thank you, Marc, this is more than one word, but we will take it. What else?

>> WSIS.

>> MODERATOR: WSIS. Someone else? Equal opportunities.

>> (Speaker off microphone).

>> MODERATOR: Okay. I have the transcript here that supposedly captures everything you’re saying. Let’s continue. We have a very important audience today. I would like to hear about the challenges that you see when you hear of the Internet of opportunity. Positive, negative, challenges, what comes to your mind?

>> (Speaker off microphone).

>> MODERATOR: Say it again?

>> (Speaker off microphone).

>> MODERATOR: Exactly. Data localization. Someone else?

>> Governments, thank you.

>> MODERATOR: I guess we have enough. I would like to turn again to our panel, and I would like to start with you, Ross. You heard some of those words when we said Internet of opportunity. How does it resonate with you as a global player?

>> ROSS LaJEUNESSE: Thanks very much. It’s a great honor to be here. I was pleased to hear some of the words from the audience. I think too often when we think about the Internet we think about companies like Google. And the Internet is much broader and bigger and more important than any one company, any one platform like ours. But especially when we talk about government approaches to regulation, it’s often the big American platforms that first come to mind. How do we effectively regulate them, how do we effectively deal with them? And if we are to take to heart Kathy’s admonition, which I do, that we need to be moving to a world of true multistakeholderism, multistakeholderism not just on paper but in action, in addressing the problems and the challenges as well as the opportunities faced on the Internet, then there’s no other way but the one that Kathy outlined. As she said, we spent a lot of effort and energy to get governments to recognize that fact. Now it’s time to engage with them and make sure that they actually put it into action.

Anyway, I was pleased to hear those words because it’s not really about the big platforms. When you look at the majority of the economic benefits, for example, being achieved through the Internet, most of them are going to traditional companies. 76% of the benefits of technology and digitalization are being achieved by what we would consider old-world or non-tech industries. People don’t realize that. So in our mind it’s really about a couple of things: it’s about small enterprises, it’s about partnerships, and it’s really about the reality that pretty much every business today is an Internet business. It’s just a matter of whether or not they recognize that and realize that. And as we move to a world where it must be a multistakeholder model of governance and engagement, I’m sorry to say that we see far too few companies engaged.

How many companies are even here at EuroDIG right now when many of them I think should be here engaging with Civil Society and with academics, the technical community and government officials? Part of the problem is far too few recognize they have such an instrumental stake in how Internet policy is developed and that really, really has to change if we are going to address the challenges and take advantage of the opportunities we have in front of us.

>> MODERATOR: Very good. Thank you. I’d like to turn to you, Ana, and I would like to hear your perspective, a user’s or youth perspective: how do you see the benefits or the challenges of the Internet?

>> ANA KAKALASHVILI: Yes. Hello, everyone. As I’m here as the youngest panelist, I will be talking from my perspective. Let me say that besides being a youth, I’m a customer; I’m a citizen of my country. But again, as a young person I’m embracing this digital evolution, as I call it. And I love that this society is becoming digital. I love that my life is becoming easier with many apps, and I love new startups. They are so interesting, driven by amazing new people. I sometimes think: how did they come up with this idea? And I like that it’s being more and more supported. But at the same time, I’m confused. And let me emphasize confusion. I’m not scared of the future, nor of the present, of my digital life. I send millions of pieces of data every day, like you all do, like you do right now. I’m just confused.

So what is happening with it? Let me give an example that happened just a few days ago. I’m going on vacation, and my friend and I wanted to book an apartment on Airbnb. You go somewhere to stay in someone’s home. And we gave our credit card information, our name, our address, and then they asked us: can we have your passport? And then we went, hmm, that’s not good. That’s not good at all. So we closed that tab and we started looking for alternatives. The alternatives, I think we will come back to that later, but let me finish my thought.

But then we thought: would our friends who have nothing to do with Internet governance or any cyber dialogues, who are far from that, who are just using the Internet and not sitting here like me and not being here for many years, would they have done the same? And I think the answer, and maybe many of you won’t like it, is that they wouldn’t have done the same. They would have given up their passport. That’s where I say, and I said that already, that privacy is becoming a big privilege. It’s like education. Not everyone is educated. For example, in Georgia, the country where I come from, if you speak five languages, that’s a matter of money as well. So that was my pick-up line: privacy is becoming a privilege.

>> MODERATOR: Thank you for that. We will come back to trust and privacy with this panel in a few seconds. Pearse, from a policy maker’s perspective, what do you want to say?

>> PEARSE O’DONOHUE: I have to start with what Kathy said; we have to ensure that half of the world is not cut off from the central driving force behind our economic development. I’m obviously talking from a European perspective on it, but I know it applies outside, and we are also very keen not to take just a European perspective. But in relation to that point, for any policy that we try to develop to actually reach out to the other half who may be cut off from the economy, it might be a question of money; it might be a question of education or whatever. Trust and security is actually the key issue, because people will not engage with the system and entrust their data to it if they don’t have trust. You asked for words from the audience.

As we are trying to extend the benefits of the Internet to everybody, we cannot have a situation in which unnecessary, unjustified rules are locking data and data services into one geographical location, because then we are tying our hands behind our back. We are preventing the economy from developing and preventing many individuals and small companies from benefitting. Perhaps the third point I would react to was one of the things that Ross said about businesses and their participation. It is also important because of a lot of the policies which I might talk about briefly later on: our mandate is not to develop the ICT sector as such. Our mandate is to develop all of society and all of the economy, so all of the businesses in the traditional sectors, some of which are users of these technologies but many of which are not, who can grow, who can put forward new ideas, who can lower costs for others. So the business involvement in this kind of dialogue is very important for the European Commission as well.

>> MODERATOR: Thank you, Pearse. Be sure I will come back to you about data localization in Europe, for sure. Matthias, you come from a country, Germany, where things like privacy protection are taken very seriously. What is your take on what you have heard?

>> MATTHIAS SPIELKAMP: The thing is, I heard someone say once that technology doesn’t really work yet. And when you ask all of us about the opportunities and benefits of the Internet and what the challenges are, I think we live in a world of separations, because for me, coming from Germany, I have a hard time thinking about a world in which the Internet, with all its benefits and opportunities, doesn’t exist or didn’t exist, because all the things that we enjoy now, the level of communication that we have with, you know, just anyone, anywhere in the world who has access to the Internet, is something that we take as a given.

At the same time, as Kathy and others were already referring to, we have different situations in many other countries, where people do not have that opportunity, or these opportunities and benefits, now.

So that is a big issue, but looking from a German perspective, what we have in Germany is a separation between people for whom the use of the Internet is something that is completely normal in their daily life: they don’t recognize they are doing it anymore, but they are also not questioning it. On the other hand, we in Germany also have people who have not taken up using the Internet, or if they do take it up, it’s in the sense that they need it for some specific reason: using whatever lets them communicate with their nieces and nephews. It’s there, it works, and you don’t need to think about it.

But if we start thinking about it, and if we start thinking about users’ trust, how it is achieved and how we can keep users’ trust in the Internet, then of course we have to talk about these governance issues and privacy and security. That said, I have a specific view, for example, on privacy. Yes, it is true that Germans are known to be very sensitive about that. I do think that there is a lot of confusion about what privacy actually entails on the Internet, you know. What is the problem if someone posts, on their own account, an image on the Internet that, for example, their parents would not have posted? Is that a privacy issue? I don’t think it’s a privacy issue, because it’s an issue of cultural development that they decided that this is something they are willing to share where someone else was not willing to share it. Is it then, on the other hand, an issue with a company that collects all that data? It depends on what they do with it, and it depends on what level of transparency I have, or what action or possibilities I have to change that. I think we have to look at this on a case-by-case basis, because otherwise we will get tangled up in that huge debate about what privacy is to many different people.

>> MODERATOR: I agree. But we might address it today if we have some time. Back to you, Kathy. How do you want to react to what you heard?

>> KATHY BROWN: Who said this: many things can be true at the same time. And if that isn’t so on the Internet, it’s not so anywhere. There are many things, complexities, that arise from humankind using a tool, a platform, a connection, that is completely open to anyone, anywhere, to say and do what they would choose to do but for barriers that are put in the way. That’s quite a thing. That has never happened before. It is certainly going to cause the world, communities, states, people some confusion. It is certainly going to call upon us to figure out some norms of behavior, of the way we want to operate as communities, because we are communal people; we are communal in the whole nature of things we do, and we are no different on the Internet.

Indeed it makes us more communal on the one hand and more separated on the other, because there we are, sitting with our computer or with our device, not actually talking and looking into the eyes of someone, and yet being able to reach all over the world. So we believe deeply that the benefits far outweigh the challenges, but that the challenges are real. And if we don’t meet the challenges, the benefits could be badly affected.

So let me just take up something you just said. We see every problem before us and think: how are we going to solve that? If you’re someone from the UN, you think you need a treaty. If you’re from a national regulatory place, you think you need a regulation. If you’re a mother, you think you need a good swat and to say: please don’t put that up on the Internet. Wherever we are sitting, we think we have a notion of how to solve that problem, which is why we keep saying that in this complex ecosystem we need to sit together and deliberate together, because we will have different points of view. The principle you will see in this great paper called Collaborative Security, and these are my additions, says this: when we are trying to solve these problems, and here in particular we are talking about security, there is no single piece of legislation, there is no single technical bit, and this thing is not going to be solved by anyone alone. Typically, for greater effectiveness and efficiency, solutions should be defined and implemented by the smallest, lowest or least centralized competent community at the point in the system where they have the most impact. Does that sound like an engineer talking? It is, but it’s a good social as well as engineering concept: that you govern, so-called govern, at the smallest place where the centralized community is affected. Teaching kids how to be responsible on the Internet: I would like the teachers, the parents and those mentors with kids to be able to have that. If we ask what we do with respect to Internet community governance, I think we can tackle this if we start to look at principles of how to meet those challenges to ensure the actual benefits.

>> MODERATOR: Thank you very much. Thank you very much, Kathy; we will come back for sure in a few minutes to those security issues that you underlined. So let’s keep in mind, at a general level, that no one will be able to solve all those issues on their own; therefore we fully support the multistakeholder approach. I’d like to get back to the key words, and warn you, dear audience, that I will get back to you in a few minutes. The word trust is an overarching issue and a concern among many of us. Kathy just said this morning there are concerns from users: breaches, you don’t know where your data goes, who stores it, where it is, who uses it, how it will be used. I’d like to hear a bit from you, Ross: how do you want to address this, how do you feel we should tackle trust, build trust, how do you feel we should do that?

>> ROSS LaJEUNESSE: We certainly have cared about and focused on our users’ trust from the very beginning. If we hadn’t done that, we wouldn’t have grown to the size we are, the number of users that we have.

From the very beginning, from my very first day at Google, which was a little over eight years ago, there are a couple of things drilled into your head as an employee of that company, and one of them is: always put the user first. If you put the user first when you’re designing a product, when you put the user first when you’re doing any of your jobs, including mine, everything else is going to take care of itself. And our users do care about their privacy, they do care about their security, and they do need to trust us, because if they don’t, they will go somewhere else, and we have always known that. And so we build our products with that in mind. I hope many of you have been able to stop by the My Account booth in the foyer of this meeting. That is not only about building user trust but about making it as easy and efficient as possible for our users to exercise control over their data in every interaction they have with us. That’s about building trust as well.

So we as a company have always focused on that and seen it as vital to our existence and our success. And it continues to guide us to this day. Even as we have grown to be a much larger company than we were eight years ago, it continues to be one of the key messages that everyone at Google operates under. I want to get back to this: multistakeholder engagement is also a vital way of building trust. The benefit of real engagement, understanding the concerns of the user community and the concerns of businesses, builds trust. So this multistakeholder way of engagement is not only the smart way of building Internet policy, it’s the only way to build that trust that users must have: trust in government, trust in businesses and the platforms that they use, trust in society, and generally trust in this thing called the Internet.

>> MODERATOR: Thank you very much. I’d like to take both your points and turn to you, Pearse. You’re partly responsible for the whole European strategy on data flows, and we see that Europe, when it comes to privacy, is about protecting users. How does it work in a global environment?

>> PEARSE O’DONOHUE: Of course those risks do exist; that is something we are conscious of. But again, it goes back to the fact that the alternative is probably worse. If people do not have, for example, evidence, proof, that there is a high level of privacy, that there is a high level of protection of their personal data, they are unlikely to engage with or trust the system at all. They are unlikely to migrate onto the Internet and benefit from the services that are provided. So we are conscious of the risks, but at the same time the alternative may be greater. We have no embarrassment in saying that it is written, it’s hardwired, into our treaties. And the Court just recently confirmed that there should be this high level of protection of data. What is important is that the European economy, the European Internet economy, is open to all.

If you are prepared to engage with the European Internet economy respecting our rules on data protection, then there is no discrimination in that sense; there is no barrier. We are finding that some countries find it difficult to understand how important it is for EU citizens to have their data protected, so we have to try to explain that to them, but nevertheless it is something that is, shall we say, the necessary price that European citizens ask for in order to have an open Internet economy. Just as importantly, that is going to be the nature of these services anyway. I know that a lot of the focus has been on mass surveillance, as opposed to lawful, due-process access to data when that is needed for understandable security reasons. As Kathy also mentioned, the biggest threat is cybercrime; it’s still hacking; it’s still incorrect and improper use of people’s data or of businesses’ data. Again, if they are not protected, if we don’t have the rules, then the system is not going to work. The Internet economy will not work. And therefore Europe and other regions will not be able to share, there will be obstacles at national or local level, and society will not be able to benefit from this open global structure that Kathy described.

>> MODERATOR: Thank you, Pearse. As you see, we are slowly coming to the point where we will be talking about privacy and security. We might just balance these, if that is the right word to use. I know, by the way, that in Europe we have very strong protection for privacy, but I haven’t seen anything preventing governments from accessing citizens’ data wholesale. Even worse, I haven’t seen anything that would prevent them from doing that without a court order. To have a bit more of your feedback, audience: we are at the juncture, and there was a strong statement this morning that privacy cannot be breached for security. How do you want to tackle this?

Emily, could you help me get the audience's reaction? I would like to get back to you, Kathy, in a few moments, after the audience. Emily, could you help me get a bit of feedback from the audience? You've been silent for a while. I heard you clapping this morning when many people in the audience said, on privacy and security, that we should not just sell our privacy for the holy grail of security. Please. And Olivia, I hope you want to report a little bit on what is happening right now in the U.K. Please introduce yourself and give us your thoughts.

>> Hi, I’m from Digital Rights, Bulgaria.

>> MODERATOR: Speak louder.

>> I thought I was talking into the microphone.

>> Actually, I want to make a remark on the remark the young lady made about Airbnb. When you have a business that runs in large part on trust and money, you want to know that identity can be verified, the same as when, let's say, conducting a bank transaction. You need identity verification. So basically, how would you imagine somebody letting you into their house without being worried about their possessions or their life or whatnot? And isn't that statement kind of claiming that verification of identity online is not acceptable because it's not data-protection friendly? That's my question.

>> MODERATOR: Thank you. I will try to channel this to the panel. For sure we don't need so much information from users. In the Internet Society we once promoted a vision of an environment where you just need to share the minimum necessary to go online. You might keep your anonymity, which is one of the basic rights when you go on the Internet. How do you want to react to this? This is really about the business models. As we have all been saying for ages, data is the oil of the economy, but how do we protect the environment, if I may use the same analogy?

>> ROSS LaJEUNESSE: Sure. I think the first important thing to remember is that there is a very clear distinction between government surveillance and company use of data. Government surveillance: no transparency, very opaque rules, sometimes the rules aren't even followed, and there isn't a whole lot of information about what is going on. Google's approach, and I can't speak for other companies, goes back to that issue of user trust. From the very beginning we looked at it with a couple of different principles in mind. The first is that you give users transparency about the data that is being collected; the more transparency the better. If you visited our My Account booth, you know that; we have done this forever, but now we are making it even simpler for users, because too many don't do this. You can have complete transparency on the data that is being collected, and you then have controls over what data is collected or not collected; we give users complete control over that. And then you make it easy for users who do want to leave. You can do that with two clicks on Google: take all your data and take it somewhere else.

The thing that needs to be kept in mind is that even though users have the option of using all Google services anonymously, the vast majority of users choose to have their data used. The reason is that the collection of that data makes the products work better. If I am searching for coffee on my mobile device and it sees that I am walking, the device, the platform, the tool generally knows that Ross is walking in this neighborhood and is probably looking for a coffee shop. If it were to give me back the Wikipedia entry on the history of coffee, that wouldn't be what I want. So there is the question of what users want in that moment, and in order to figure that out there is collection of data. Now, if I'm walking around and I really don't want my data being collected, if I want to use the service anonymously or I don't want the tool using my location, I have that option. What that means is that it is going to send me that Wikipedia page on coffee, or something else irrelevant to my needs at that moment. There are a couple of different ways of looking at it, but it's important to see that all of this sits under the focus of trust. If users don't trust us, they will go somewhere else. And we have never lost sight of that in the entire existence of the company.

>> EMILY TAYLOR: Ross, could I just explore some of these points a little bit with you? How many people in the room, albeit we are not particularly demographically representative, how many people have read Google's terms of service? Quite a few, probably undermining my point. But one of the things that surprises me in your terms of service is that you retain the right to scan people's e-mails. I think most people would view e-mail correspondence as something inherently private, particularly in that it might include privileged or private communications. You talked about user consent and all of that, and I felt so happy and so relieved, but then I thought about this: you're scanning the e-mails of everybody who uses Gmail. Do you think they have consented to that?

>> ROSS LaJEUNESSE: I think it's a dangerous proposition to say that companies have the sole and complete responsibility for what users do online. We have tried, and continue to try, to make it as transparent as possible for users to understand what is going on, and to give them those tools, but there is a point at which the user needs to be informed and take responsibility for his or her actions online.

>> EMILY TAYLOR: Do you think they will go elsewhere, though? One of the things that is such a conundrum to those of us concerned about surveillance is that we see survey after survey showing that users actually tend to trust private companies less than they trust governments with their data, and yet they still won't move elsewhere. I heard your colleague once say "with great power comes great responsibility" (that's a Spiderman quote), but can we always put the onus onto users? When you see that people are actually losing trust in you but they're still not moving, isn't that a reason to change your practices a bit and not collect so much?

>> MODERATOR: Let's look at the terms of service, for example. What would you say in response to that?

>> ROSS LaJEUNESSE: First of all, there has been a slight accusation that we are not taking responsibility for our platforms and our relationships with users, and I completely disagree with that. We spend a lot of time and energy trying to make sure that users retain their trust in us. We heard concerns that the terms of service were opaque and very difficult, and we put a lot of effort into simplifying them. When we tried to do that, it was the DPAs in Europe that took us to task for the way we did it. So we do our very best, because we try to do the right thing, but also because maintaining that trust with the user is in our business interest. So I guess I would just respond to the implied accusation that we aren't taking it seriously, or the "great responsibility": I think we live that every day.

>> MODERATOR: Thank you, Emily, for jumping in, and thank you for your response. I'd like to turn back to you all. We were talking about privacy and security and how we might need to balance them. This is a key question; people keep asking how we could do this. Do you have an idea you want to share at this stage before I turn to you, Kathy?

>> I want to share my confusion, or build on that confusion, because we may have a problem in understanding what we are all talking about. When I think about security, I'm thinking about data security and network security; other people are talking about state security mechanisms. I won't speak for Kathy, but that's how I understand her comments. There is no tradeoff between privacy and security: you wouldn't have people using the systems unless you had security. And way back, when there were the original Snowden revelations, the policy response was that we had to rebuild this trust, not by making apologies for a mass surveillance structure, because mass surveillance was not acceptable, but because we had to restore trust, we had to restore security in that data, in order for the data economy to move forward. If I can add to that: the attention was put on Ross and Google as a provider by the very interesting question from the floor, which related to Ana's example of Airbnb. I don't pretend to know Airbnb's position, but one slight area of confusion, to be sure we all mean the same thing, is this: whereas the Internet opens up whole new vistas for sharing, for contact, for understanding of cultures, and Kathy described in her opening remarks how this is the first time this has been possible for everybody, at the same time the Internet is not some new dimension or new world or universe.

A lot of what we have been doing is saying that if there are protections in the physical world, they don't all transfer to the virtual or online world, but we should think that the same laws, or at least the same principles of law, should apply, and that starts with privacy and privacy protection. Of course the rules have to be adapted. One of the biggest challenges we have is that we have new ways of doing business and new services to individuals which are limited and constrained by the fact that the legal framework in which they operate is an analogue, paper-based, pre-Internet one. I don't know, but somebody here will no doubt be able to answer for me: perhaps, in the context of Ana's example, the country where you were looking for accommodation requires those who provide an accommodation service to keep a copy of the identity of the user. So is it necessary, or is it a request or an obligation on the provider of that service? We have to make connections between the current rules in the physical world and the online world.

>> MODERATOR: Thank you.

>> MATTHIAS SPIELKAMP: I would like to offer one thought of my own here. Kathy named the four principles: openness, transparency, accountability, inclusiveness. Leaving aside inclusiveness for a moment, if we look at open, transparent, accountable, this is where the trust issue really lies; it is one of the most important aspects. Emily asked how transparent and accountable Google is, and Ross answered: we have our terms of service, and so on. But the complexity of these systems is becoming so great that it is much harder to achieve anything with openness and transparency alone. You can be as open and transparent as you want; if people don't understand what is going on, it doesn't help. If you have your terms of service published, that doesn't mean that anyone can understand them, because most of the time they are written for lawyers anyway, and that is probably how they are supposed to be written, because lawyers need to be able to challenge them. What we are looking for is intelligibility: people need to understand how the systems work. If we look at the kind of influence that companies, governments, and the automated systems they provide have on people's lives right now, from the trending topics on Facebook to the way electronic data exchange impacts people's freedom of movement because all their data is stored by border control services, then we as citizens have to understand how these systems function. And, as my moment of advertisement here, this is why we founded our initiative: because we say transparency is not enough, we need intelligibility. We need a discussion about what kinds of decisions we want to leave to automated systems and what kinds we want to keep in our human systems, and of course everything has to be in line with due process and the rule of law.

>> MODERATOR: Thank you. Ana.

>> ANA KAKALASHVILI: Yep. So I left off on a pessimistic note, but now I want to turn it optimistic. I think users are growing: we are becoming more aware every day, and I believe we will become more aware still, because the Internet is something new and we will get to know it more and more. We will understand it and make our choices more consciously. So I believe that at some point users will look at a business and say: if it does not have privacy at the heart of its business model, next to security, then that business is simply dead. I think, and hope, that we are heading towards that stage. And at the same time I want to go back to the multistakeholder talks and turn them a bit into multi-responsibility talks.

We had the WSIS meeting last month, where we also talked about Internet trust, and there I said, and I will repeat it here, that users have their own responsibility. You need to start by recognizing that the offline world is not far away from the online world. Would you have sent a letter with sensitive content by postcard a few years ago? I don't think so. What you do is put it in an envelope. Of course it can still be opened, but it is harder. The same goes for communication online.

A few years back I would send an e-mail without any encryption. As I became more aware of what goes on with data flows, I now try to keep my e-mails that contain sensitive information encrypted; that is how I protect my privacy. And that is the responsibility of users. The companies, and we will get to the controversial points later, but the companies, Apple, Google, that offer us big products and services, also offer us significant measures and tools to protect privacy. They tell us to set up passwords so no one can open our devices. What do we do? 12345678. If you care about your privacy, make it more sophisticated. Then firewalls and antivirus software: take the time to care about them. That is the user's responsibility. Private companies have to take care of our privacy as well, and I think enough has been said, so I won't add any more. As for regulators: write policy.

>> MODERATOR: Let's bring it to another level. Kathy, we keep remembering here that the Internet is a user-centric network, so it is all about the users. What I hear is that around the world we might have different understandings. I understand that some countries say: I want to protect my own citizens, I want to protect myself against other governments, so I close my borders. How do you see this? Is there a risk of fragmentation of the Internet when we talk about trust?

>> KATHY BROWN: I actually don't think so. I think users have an innate desire to control their own lives, their information, their identity, and who they are and how they operate. It is part of us as human beings, expressed in different ways across different cultures. But you see it in the desire of users to be taken seriously about their own personal credo, about how they want to live their lives. I'll say that.

I want to add another little wrinkle to this, and I think you teed it up very nicely. I think we have caused a lot of complexity for users around the world because of the very competition we see in the Internet space. In the Internet economy there are lots of different choices one could pursue. What happens, as usual, is that one or two or three rise to the top, and people then use them without really thinking about their use.

I don't disagree that there is a user responsibility. I think, however, it is almost naive of us, and I loved your word "privileged", to think that all of these folks are going to sit down and read these things and actually understand them. I don't think they will.

I have an aspiration, with respect to our own security and safety, that the Internet develops some, if you will, standards by which I can simply understand what I am doing. When I drive a car almost anywhere in the world, I know there is an emergency brake here. There are some standardized things that happen with technology that, over time, we have come to understand, and that allow us to more easily do something we might want to do: stop the thing. We are not there yet. We are not at the point where users have an intuitive understanding, the way a driver knows that a red light means stop. With other technologies that we have taken into the very way we live, we managed to figure out some norms about how we want to use them. That is point one. Point two: is it the role of government to do all that? I think it is very dangerous to say, please decide for us how we want to be safe. On the other hand, we see that happening more and more collectively. Do we leave it to competing companies? Yes, but, and I hear the European Union saying this, there is a price to their freedom to compete, a price to creativity, and that is: you are going to guarantee some level of security for your people.

Can we leave it to technology to fix? Well, there are fixes in technology; there are ways to think about this, encryption being one. We can encrypt so that, frankly, you cannot read my e-mail. Or we can start thinking about authenticating identity online, and I am going to throw it over to Olaf, in his next session, to talk about some of the things we are thinking about with respect to authentication of your own identity. Airbnb can know who I am when I say who I am; there is a way for you to know it without me giving you my passport. We can think about some technological enhancements. We need to think about all of these together; not one of them alone is going to get us to a place where we feel much more relaxed about our digital economy and digital lives. We need to do all of it. I hope we take this up seriously now as the next phase of our thinking in trying to solve these hard questions.

>> MODERATOR: Thank you, Kathy. I see we have questions from the audience.

>> Yes, it's Phyllis, and she wants to read out a question to the panelists.

>> Can you hear me? Hello? Can you hear me?

>> MODERATOR: We cannot hear you. Could you read Phyllis’s question?

>> Data collection and distribution is a subject that needs further attention. "Users can go elsewhere" can be a motivation for businesses to build trust, but when it comes to governments, users do not have this luxury. If your government asks you to provide certain information, like voter information, there is no choice but to hand it over. Any comments on this from the panel, and especially from Pearse O'Donohue, on how similar regulation for better security, to protect the privacy of individuals, can be encouraged at the government agency level?

>> MODERATOR: So, Pearse, thank you for taking this question. I don't want you to be the only one on the hot seat. Is there any government in the room, someone who can provide some answers? I would expect some government representatives in this room, from France or the U.K., who can tell us whether they need to access data only under certain conditions, or no data at all. I would like some of you to stand up for this. But Pearse, can you start?

>> PEARSE O'DONOHUE: Thank you for at least trying. Do we have someone?

>> MODERATOR: Do we have someone? Sorry, I have the light in my face. Please stand up and come to the mic if you want to say some words about it. In the meantime, go, please.

>> PEARSE O'DONOHUE: Because, of course, a lot of this is at the level of our member states. It is not a perfect match with what Kathy was saying, but in trying to deal with issues at the lowest possible level, the Union is built on the principle of subsidiarity. In areas such as privacy we are increasingly imposing, or agreeing at the European level, measures to ensure there is a high level of protection, but the individual transactions are the member states' job.

What we are looking at is how member states do that, and some are working positively: we have one state which applies the "once-only" principle. Once you have provided your data in a transaction with a government agency, it should be safely stored and linked to a secure electronic ID, so that you do not have to provide these details again; the agencies using the secure government system have that data at hand. At the European level it is clear that administrations, public services and national governments all have a huge role in trust in, and use of, the Internet. For example, access to government services online can greatly facilitate individuals' lives; it can mean that members of society actually have a greater understanding of what their governments can provide for them, and greater access to those services. And by developing governmental services, and therefore the presence of public administration online, they also raise expectations of security.

So we are trying to ensure that there is that bottom line, and when I say bottom line I mean a high level of protection. Given the size of public services in any national economy, they have a role in creating a critical mass where, through use, even through habit, people become more comfortable using these services. It is in itself an education in what you can trust. And that, we hope, and I hope I am not sounding too naive about it, is a virtuous circle of trust in the Internet world.

>> MODERATOR: Do you want to say something?

>> Yeah, to the question, because if I understood it correctly it was emphasizing the difference between the levels of control that different entities have over your life: on the one hand a government entity, where you do not really have a choice except to emigrate; on the other hand a private company, where you can choose whether you want to use the services or not. This is a question we are debating in our group of people who are thinking about that accountability question. Of course that is important: the level of power that the entity is able to exert over you. If we say we need intelligibility, that need is much greater for, say, an algorithmic system that decides whether someone will get a visa to a certain country than for something like discriminatory pricing based on the clothes you wear and the impression you give.

Do we conceive of the latter as a problem? Maybe not. If we do not see it as a problem, we do not need that level of accountability and intelligibility. If we do not have a choice, then the required level becomes much higher. And then we have the very, very difficult question of what level of power it is. Because just saying, if you don't want to use Google, go someplace else; if you don't want to use Facebook, go someplace else, always depends on the market power that these companies have. And this is something that is being debated at the European level. I don't think anyone would contest that this is the question behind it. What the decision is in the end, whether there is an overwhelming market power that someone is using to the users' detriment, I don't know the correct word for it right now, that is an open question.

>> MODERATOR: Thank you very much. I don't know if any of you want to react. I see we have two questions. We have seven minutes left. Let's go.

>> Good morning. I will try to be brief; it will be hard. I am certainly not from the French government, nor the British one, but I am from the data protection authority of one of the European member states. And I can answer the question we just had: if citizens, data subjects, have problems with their data, or concerns about how governments are processing their data, turn to us. You have every possibility to turn to us, and I can assure you that handling complaints is a big priority for us. Although I can tell you, regarding law enforcement purposes, it is my personal assessment that we do not receive that many, which is quite surprising. But that is another question.

I also would like to make some brief comments, if I may, because I am here representing the committee of the Council of Europe's data protection convention. I really would like to endorse, very loudly, this approach of collaborative security, because that is what we are also fighting for and personally advocating. This is what we have to do. In my opinion security and privacy are two sides of the same coin, and that coin is democracy. So we have to fight for it, and we certainly have some very good examples: we have the reform of the convention and, for example, the recommendation on policing and special investigation techniques, which involved data protection and privacy experts. But my question to the panel would be: how can we make this more beneficial for everyday citizens? How can we make more use of these already existing processes?

>> MODERATOR: Thank you, very much. Can I keep your question for the last question that I will ask the panel to answer? I will keep this. Thank you, very much. Do we have another question?

>> Yes, hello. The comments I'd like to make are my own personal views. I think the issue we face is one of accountability. We hear that governments should stay out of this sphere: anything Internet-related, data-related, we don't want governments in there, they're bad. There are a couple of issues with that. First, it is a gross generalization and oversimplification. Not all governments are bad. Perhaps there are some entities in some jurisdictions that there are concerns about, and this is oversimplified to include all government representatives. And then we are told to stay out of the sphere, but when something goes wrong our citizens come to us and say, fix it. How can we fix it? You've blocked us out. At the end of the day, if a government is going to do something, there is always a legislative process that has to take place. And if it is not right, or not liked, at the very least peers in other member states can put pressure on the government to try to adjust its principles.

We have seen numerous examples of that. Private companies are more or less exempt from it. Sometimes there is pressure, but they are not held to the same levels of criticism or rigor. At the end of the day, if a company wishes to change its terms of service, it can happen overnight with no public process. We have seen that. Some companies have changed; other companies have said, take it or leave it. That is their right. And yet there has been such a concentration of services, we have seen so many mergers of these services, and we have become so dependent on them, that in some cases private entities hold more sway over us than governments do, but they are not held to the same level of accountability. We cannot just say it is down to the users or the people; we are dealing with people who may not be educated or literate in these sorts of things.

Sorry, I know I am taking a bit long, but I am trying to stress the point that this is something collaborative. I think we all have a role to play, but we need to know whom to hold accountable. And part of this is the concept of a social contract between people and those that represent or govern them. I think the Internet needs a new social contract. When we go into digital space, we need to know what basic rights and what basic expectations we can adhere to. So perhaps it is time to discuss a unified social contract for the Internet that we all adhere to and are held accountable to.

>> MODERATOR: Thank you very much. You are bringing a new discussion here, but I am afraid we are short of time. I would like to take the question from the gentleman a few minutes ago and turn it to the panelists. Especially you, Ross. We have had an extraordinary panel; we have investigated trust and security and covered a lot of issues. One or two lines from your perspective, Ross.

>> ROSS LaJEUNESSE: I'd like to return to Kathy's point, because I think it is very important. Kathy makes the argument very effectively that the multistakeholder model is now the norm, at least on paper. But in order for it to truly be the way forward, it is really incumbent on all of us to be participants in it, and it really is the only way to achieve success. We are starting to see examples of it, for example our approach to extremism online, which is really a multistakeholder approach: saying we don't know how to solve this problem alone, help us, and it has been collaborative. But we need to do more. We need to make it not just the model on paper but the model in reality.

>> MODERATOR: Thank you, Ross. Ana?

>> ANA KAKALASHVILI: So I have been asked many times what these multistakeholder dialogues bring, and I say: yes, we come together and discuss things and then go home. And I think that is when the most important part starts: someone from Google, someone from ISOC, all of us trying to live up to our responsibility to keep this Internet open, neutral and innovative. That is the most important thing. So this will be my closing remark.

>> MODERATOR: Thank you very much. Pearse?

>> PEARSE O'DONOHUE: Thank you. Starting from the question about national data protection: it is important to remind ourselves that there are mechanisms, and people, deliberately working for the citizen. The network of data protection authorities in the European Union has an ever more important role; they are independent of their governments and, just as importantly, independent of the European Commission, implementing a strong set of laws designed to protect the individual, and they take pride in their work. The citizen also has an avenue to take if he or she feels that their rights have been in some way infringed, and some of the very important recent developments have come about through court action, or action brought by citizens. But precisely because we have these strong mechanisms in place, we have to realize that unjustified restrictions on data are so out of place.

There is a myth that data is somehow more secure if it is in one place, or even in one country. Cloud service providers, if we take an independent, objective look at their services and technologies, can clearly give stronger guarantees with regard to the protection and security of data if it is dispersed across several storage sites and data centres. So the geographical restriction of data is in some cases working against its stated purpose, which is the security and protection of that data. That is why the commissioner will be speaking about this, and why, at the end of the year, we are going to come forward with proposals to remove these restrictions, while maintaining a high level of network security and data protection.

>> MODERATOR: Thank you, very much. Matthias?

>> MATTHIAS SPIELKAMP: I am not alone in this room in feeling a certain level of frustration after a conversation like this; many of you think we didn't hear answers to the questions we were posing, and yes, I do feel the same. At the same time, because we are here at EuroDIG, an Internet governance forum, we are used to that. This is what we need to keep doing. I do think we need regulation at the national level and the regional level, but we also need responsibility from companies and from users, from all of us. What the balance between all this is, that is what we have been debating for a long time, and we will keep debating it. That is why I think this is the right place to ask these questions and to try to find some answers, even if we don't have them right now. But the most important thing is that we further develop the tools we can use to increase our understanding of what we are dealing with, because if we don't understand what we are dealing with, for example automated systems and the transnational data flows that are in the title of our session, then we don't really know how to cope with it. So I think there is still a lot to learn, but in the end we have to come up with decisions.

>> MODERATOR: Kathy, I have a special one for you. You started this conversation, which I followed up by asking the audience what they feel when talking about an Internet of opportunities. So in this post-WSIS world, how do you see us facing this Internet of opportunities and restoring trust in the Internet?

>> KATHY BROWN: I believe deeply that this is an Internet of opportunity. I see it everywhere around the world. Honestly, it sounds trite, but it inspires me every day to see how people use one of the greatest inventions of humankind to keep innovating. It’s an overused word, but the very nature of the Internet is about innovation. It’s about innovating itself to keep changing and growing and becoming safer, in my view. We will innovate our way to safety, if we allow ourselves to. With respect to the human issues that revolve around the use of this tool, we need to use the same tools of innovation to come up with new ideas for old issues, and new ideas for moving forward. I’ll stick to my suggestion that innovation happens when clusters of people who care about something, who share a mutual goal, work together to use their human intelligence and creativity to create the next thing. We are at that pivotal point where we need to do this creation again and again and again. Let’s remind ourselves that we are not going to be incumbents on the Internet. We don’t need to defend what we did five years ago. We need to keep thinking about what the next step is. I believe we have the tools to do that, and I leave this feeling not frustrated but hopeful. If we can get the frameworks to define the issues and get to work, we will solve it.

>> MODERATOR: Thank you very much. I promised you a great debate. We have a panel that was able to jump from one topic to another, so help me in thanking them.


>> Sorry to interrupt, and thank you for a wonderful panel, to all of the panelists and to you, Frederic. Just to say it’s now lunchtime, and some housekeeping announcements. Unfortunately the Commissioner has been delayed, which means we have another half hour of lunch break. And Ross, listen up, because this affects you. We will start the afternoon session just before your lightning talk, when we will hear from the Commissioner. So we will slightly rearrange the programme. There is a lot of furious paddling going on under the water. Enjoy your lunch and be back here at 2:30.

>> Thank you.

Session twitter hashtag

Hashtag: #eurodig16 #trust