The Internet is broken – Bringing back trust in the Internet – PL 03 2014

From EuroDIG Wiki

12 June 2014 | 17:30-20:00
Programme overview 2014

Session subject

  • Why is the Internet broken? Personal data collection, tracking and surveillance, security, human rights.
  • What can and should Europe do in order to restore users' trust in the Internet?
  • What could be the role of different stakeholders in this respect?

Session description

Personal data of Internet users are collected and online activities are traced in order to deliver a better response, better service, better experience. Recent revelations of mass surveillance have shown that governments undertake Internet surveillance activities against their own citizens as well as nationals of foreign countries. They justify such surveillance by the need to ensure security and counter criminal and terrorist activities. Such tracking and surveillance actions undermine human rights, especially when it comes to privacy, personal data protection and freedom of expression. As a consequence, Internet users have lost their trust in using the Internet as a free medium of communication and as a tool for exercising their rights. What can and should Europe do in order to restore users’ trust in the Internet? How can policy makers, the business sector, the technical community and civil society in Europe contribute to bringing back trust in the Internet and ensure that privacy is protected and the right to control our own personal data can be effectively exercised? How can a proper balance be ensured between the legitimate interests of business and governments, on the one hand, and the rights and interests of Internet users, on the other hand? What are the solutions for rebuilding trust: new or enhanced regulatory approaches meant to ensure the effective protection of human rights in the digital society? New technical mechanisms and applications meant to ensure that each and every one of us can control our personal data online? More education and awareness raising to empower users and give them the means to better protect their privacy online? And what about jurisdictional issues – how can we ensure that the rights granted by European legal frameworks are protected beyond virtual European borders when personal data flows frontierless across the optic fibre or vanishes in the clouds?

People

Format of the session

Plenary. Short introductory remarks by each speaker (based on specific questions tailored to each stakeholder group represented), followed by discussions with the audience.

Protocol. Discussions


Further reading

Messages

Reporter: Lee Hibbard, Council of Europe

  1. Need greater respect for the rule of law and its enforcement so that democratic oversight is effective. This necessitates transparent application of the law, including due process safeguards, to ensure that both governments and business are accountable for their actions, for example vis-à-vis the activities of national security institutions. Institutions that do not respect (work outside of) the rule of law should be dismantled.
  2. Business should be more transparent and accountable. They should make greater efforts to meet the needs and expectations of both users and governments.
  3. Users should be able to regain control of the privacy of their data. They should be able / empowered to use privacy enhancing technologies. They should insist on their human rights being respected which includes demands (on data controllers) to be more transparent and asking why enforcement is not carried out.
  4. There is a moral responsibility to protect people on the Internet, including those who take great risks to blow the whistle on practices which do not respect human rights. Internet users should be mobilised to discuss and shape debate on the protection of those who take such risks in the public interest.

Video Record

https://www.youtube.com/watch?v=iYRuzb-re88

Transcript

Provided by: Caption First, Inc., P.O. Box 3066, Monument, CO 80132, Phone: +001-719-481-9835, www.captionfirst.com


This text is being provided in a rough draft format. Communication Access Realtime Translation (CART) is provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings.


>> VERONICA CRETU: Dear colleagues, we are going to start up in a few minutes. Two or three minutes. Please take your seats.

Dear participants, please kindly take your seats. We are going to start the final plenary session of day 1.

I think we will start. Good afternoon everyone.

Okay. Our panelists are here. So we are going to take advantage of this and really start, because we are running out of time.

Good afternoon once again. Thank you very much for joining us in this last plenary session of the day 1.

Thank you all for being here. I know it has been a very exciting day today with several sessions and discussions. Really, valuable input being generated during the workshops, so thank you very much for making it to the final plenary session of today.

My name is Veronica Cretu. I come from the Republic of Moldova, where I’m leading an NGO called Open Government Institute and I’ve been involved in Internet governance and open governance for the past several years. I have a very good, excellent and valuable panel today. And before we start, I would quickly want to ask each one of them to introduce themselves.

Mics are there. Please.

>> JAN-PHILIPP ALBRECHT: Thank you very much. My name is Jan-Philipp Albrecht. I’m a member of the –

(No microphone)

I’m a member of the European Parliament and Rapporteur of the European Parliament for the General Data Protection Regulation. And a member of the European Parliament since 2009. Just recently newly elected.

>> CORNELIA KUTTERER: I’m Cornelia Kutterer, digital policy director for Europe, the Middle East and Africa for Microsoft, and based in Brussels.

>> JACOB APPELBAUM: I’m Jacob Appelbaum, a developer with the Tor project, and I’m a journalist working here in Germany.

>> MATTHIAS TRAIMER: I’m working in the Austrian chancellery. I’m there for media, law policy and the society.

>> BEN SCOTT: I direct a programme called Stiftung Neue Verantwortung. And I’m also a senior adviser at the Open Technology Institute in Washington.

>> VERONICA CRETU: Thank you all very much and thank you for making it to the session today and for accepting to be on the panel.

I’m standing here on the stage. I would love to come down, but as I’m not that tall and I need to make some signs to the people in the back of the room so that they know what they are supposed to do at certain moments during the session, I’ll be standing here.

For those participants, for you in the room, there will be an opportunity to contribute to the session as well. We have created a wiki page for the session. The session is an ambitious one, focusing on trust and the way to bring trust back into the Internet. So the questions that the panelists are going to address today are also questions that you yourselves could be answering. You just go online and edit, use the edit option, and insert your answers.

And you could also leave questions for the panelists, so that we could either address them during the session, if time allows, or we could take them further and address them as part of other fora related to Internet governance.

Some very important details: the reporter for the session is Lee Hibbard, sitting right here, on my right. And we have the remote moderator, Sorina Teleanu. The main theme is trust, and the session is entitled “The Internet is broken – bringing back trust in the Internet.” Nowadays, the paradox is that even though the world has become a much better place, with Governments aiming for accountability and transparency, at the same time we have been witnessing unprecedented cases of restrictive laws being passed, civic space being reduced and limited, and massive surveillance being conducted. The arguments that this is needed to protect us and guarantee security are probably no longer all that relevant. And the question is: is this contributing to building really open societies? Why don’t Governments trust us, the citizens? Why are they spying on us?

As John Kerry said during an open Government related event in London at the end of last year: yes, it all goes back to the Second World War, to the Cold War, to 9/11; massive data collection and surveillance is there. But he did acknowledge that this has reached beyond the limits. It has gone too far. So trust is, indeed, a powerful tool. And today more than ever we really need to find ways to redress this situation.

But how do we do that? Probably the most important thing is to depart from the notion of trust. And before I go to the panelists, I would like to pick a couple of reflections from people in the room.

Can you hear me? Okay. I’ll take another microphone. Do we have another microphone?

So before we move to the panelists, I would like to pick on some of your reflections, you in the room, what is trust for you? How do you define it? I’ll just randomly pick the lady here. What is trust for you?

>> AUDIENCE: Trust for me is that I’m not afraid to use some, I don’t know, some sources, some information, you know, something like that.

>> VERONICA CRETU: Thank you very much. You are not afraid. Interesting.

Gentleman here, what is trust for you?

>> AUDIENCE: From my point of view, trust on the Internet is broken so we have to take care of what we are doing on the Internet.

>> VERONICA CRETU: We have to take care about what we are doing on the Internet.

Lady here, what is trust for you?

>> AUDIENCE: Well, I’m from Finland and there we have a tendency to trust each other very, very much so. So I would trust – I would say trust is the basis of a society.

>> VERONICA CRETU: Stephen Covey is the author of the book called “The Speed of Trust” and I want to share with you briefly some of the tips he has given when he addressed trust. The video, can we have the video?

(Video)

>> When I am asked, “Just give me the big ideas on your trust work – in summary, what are the big ideas behind it?” I say there are three big ideas. The first big idea is that trust is an economic driver, not just a social virtue. In other words, there is a business case for trust. And it’s a compelling case.

The second point is that trust is the number one competency of leadership needed today: the ability to create and grow and extend trust. If you get good at trust, it will make you better. Your ability to collaborate goes up; the ability to engage people, innovate, attract and retain people goes up. Your ability to sell more goes up. They buy more frequently and in larger amounts. They refer business to you. They give you the benefit of the doubt. When there is no trust, just the opposite. So if you get good at trust, it will make you better at everything that you do.

The third thing is that trust is a learnable competency. We can learn it, master it, move the needle on it: the ability to create trust, the ability to create smart trust, and the ability to do it with partners and customers. You do it by focusing on credibility and behaviors. Credibility and behavior are how you create trust. So it’s a learnable competency. Many people never thought of it that way; some thought of it as something that you either have or don’t have. That is the starting point. In the same way you can diminish trust, you can deliberately grow it on purpose. And when we put the three things together – the business and leadership case for trust and then the mechanics of how you build trust – you will move the needle, you’ll get better results and better business outcomes, and you’ll find yourselves moving at an incredible speed. And I call it the speed of trust.

>> VERONICA CRETU: Thank you to the person in the back. We see that trust is an economic driver, the number one competency of leadership needed today, and a learnable competency; and it’s critical when it comes to the online space. I think these are critical elements for us to be able to manage and build an open, free, secure and accessible Internet for all.

So I want to move to the panelists and ask them to reflect and share their thoughts on what trust means for them and how we can specifically bring back the trust, especially of our European users, that has been lost.

So can I ask each one of you to take turns, a couple of minutes, to reflect on this, on this aspect?

>> JAN-PHILIPP ALBRECHT: I would start in passing the microphone. For me, it’s not just about a feeling of trust; trust for most of us, first of all, is a working society in which I can rely on the application of the rules we decide together. That’s Democracy.

And I think that, first of all, we need to restore trust in Democracy and in the rule of law, and then we can talk about trust in the framework which we have in a Democratic, rule-of-law-based society. So I don’t think that trust can be restored by just talking to each other or by just having nice advertisements or nice pictures. Trust is linked directly to the question of whether we respect the rules we set together in a Democratic way.

>> VERONICA CRETU: Thank you.

>> CORNELIA KUTTERER: So for me personally, trust is not a feeling, it’s experience based on patterns of behavior that has been shown before. And hence in the context of the broken Internet, which is probably not the broken Internet for a company like Microsoft, that means that we will do everything we can to show our customers that we are bound by the rule of law and we do everything to actually force everybody else to use the rule of law in their behavior. And what I mean by saying that is where Governments try to access our data on transmission, we enforce through encryption them to go through legal procedures to get the data. And where the procedures are not according to the rule of law, we will challenge that.

So these are two of those areas where we try to restore trust by also initiating dialogue between Governments. That is one of the biggest challenges we still have.

And also by going to look into our contractual assurances that we can give based on rule of law to our customers.
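The point above about encryption forcing Governments into legal procedures can be illustrated with a small sketch. This is not any company’s actual implementation – just a generic client-side example using Python’s standard ssl module, showing how a connection can be configured so that data in transit is unreadable to a passive network observer; the host name is a placeholder.

```python
import socket
import ssl

def make_strict_tls_context() -> ssl.SSLContext:
    """Build a client-side TLS context that verifies server certificates
    and refuses legacy protocol versions, so intercepting the wire yields
    only ciphertext and data must be requested through legal process."""
    ctx = ssl.create_default_context()            # CERT_REQUIRED + hostname checks
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # reject SSLv3 / TLS 1.0 / 1.1
    return ctx

def negotiated_tls_version(host: str, port: int = 443) -> str:
    """Open a verified TLS connection and report the negotiated version.
    (Requires network access; host is a placeholder, not a real endpoint.)"""
    ctx = make_strict_tls_context()
    with socket.create_connection((host, port), timeout=10) as raw:
        # wrap_socket performs the handshake and validates the server's
        # certificate chain and host name before any data is exchanged
        with ctx.wrap_socket(raw, server_hostname=host) as tls:
            return tls.version()

ctx = make_strict_tls_context()
print(ctx.check_hostname)   # server identity is validated
print(ctx.verify_mode)      # unverified certificates are rejected
# negotiated_tls_version("example.com")  # uncomment where network access exists
```

The design point is simply that certificate verification and a modern protocol floor are on by default in `create_default_context()`; the sketch makes those guarantees explicit and testable.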

>> VERONICA CRETU: Thank you.

>> JACOB APPELBAUM: I think, you know, trust, as Jan said, it really has a lot to do with Democratic oversight and it has a lot to do with the ground rules.

I think, though, we could say a few specific things. One of them is a preemptive freedom from suspicion. For example, if we are treated as if we were not trusted, it’s difficult to return trust. And that’s a matter of repetition and grounding. So if by default we assume that people are free, then it follows that we can trust them without suspicion.

This is about respecting basic human rights, which is sometimes in contradiction to laws. To that end, we have to consider that trust is really not something that we should just give freely but should be earned. So part of that is to look at actions of Governments and corporations and see where they promote transparency and accountability and honesty.

There is a great quote from the ’80s: “Trust but verify.” So anyone who is not helping with that process deserves our suspicion. We know that companies often operate in places with secret laws and secret interpretations of basic words – like the word “relevant” being defined, in secret, to mean anything in American law. So we should look for people who talk about doing what is morally correct as well as what is legal, and who look to preemptively change the laws not when there is a clear business case, but when it would create a better world, having that as the goal instead of just having profit as the goal.

So those are the few things for the basis of trust, and from there, it depends on trust in which context. So I think we can move on from there.

>> MATTHIAS TRAIMER: Yes, we have heard so many wise words about trust that we can all share these ideas.

I’m from Vienna, the city of Sigmund Freud. He said that children are attached to the people who give them nutrition. So it’s always the mother’s fault, all the time. And another, Erik Erikson, said it’s not that; it’s that children in the first one or two years learn whom they should trust – who fulfills their wishes and desires in the best way. So we see already that it has to do with a process: on the one hand a natural, maybe human resource, and on the other hand also something that we have to learn. We won’t find the final answer here.

But my role is – since I was invited as a Government representative, unfortunately not from the Ministry of the Interior – I’m not a member of the secret service. Or if I were, I wouldn’t tell you.

No, I’m dealing with human rights. And for me, what is important when we talk about trust, and also the role of the State, is this concept of the European Convention on Human Rights. It deals with the so-called positive obligation. What does that mean, not to be too abstract? It means, in the context of human rights, that the State has not only the obligation to refrain from intervening, but to be really active in delivering – as the Court said in the Dink case – an environment where you can act free of fear and where you can make full use of your rights, especially freedom of speech.

So it’s a paradox that we speak about this topic, because the State has acted in the way of surveillance. But my message should be: the State should do, of course, not only the exact opposite – what can the State do to build up trust? And I say my message is that the State has an obligation to build up this trust. And how, we can speak about a little bit later.

>> VERONICA CRETU: Thank you.

>> BEN SCOTT: The benefit of being at the end of the panel is you get time to think about what you’re going to say.

I’ll repeat something that has come up, I think, for every speaker, which is that trust is deeply linked to the law. And for me, it’s about transparent law consistently applied. Or, in other words, justice. But it’s more than just law, because it’s also a very human thing. It shapes how we act in the world. It’s about the expectations that we have of one another in societies, about shared values that shape our behaviors.

So to show what I mean by that, I’ll give you a non-Internet-based example. How many in the room are parents?

(Showing of hands)

Okay. For those of you who are not parents, you’ll have to use your imagination.

If you are a parent, you know that on a sunny day like this, you go out with your kids in the stroller and you get on the bus or the subway and you park the stroller in the place where the stroller gets parked, and the diaper bag is hanging over the stroller, but the kid wants to sit in the back of the bus. Do you leave the stroller with the diaper bag parked in the middle of the bus and take the kids to the back of the bus?

I always consider myself a trusting person and I felt like I ought to be able to leave the diaper bag on the stroller. You know how many times it took for me to never do that again? One time the diaper bag was stolen and now I can’t trust anybody on any bus.

The common expectation of shared values that shapes the behavior of bus riders has been violated for me to the point where I – it may be the rest of my life before I leave the diaper bag on the stroller again.

The problem with the NSA affair is that everybody’s diaper bag got stolen all at once.

(Applause)

And now nobody wants to get on the bus. This is a fundamental social problem that, yes, the application of law transparently and consistently applied will help to rebuild trust. But fundamentally we have to reset how we think about our common expectations of one another in society.

>> VERONICA CRETU: Thank you all very much for these valuable contributions.

Indeed, projects like WikiLeaks and all the disclosures from the NSA have revealed how Governments and private organisations monitor and influence our everyday activity, and that is indeed frustrating. We all know that every time we send an e-mail, the NSA can read it. Every time we chat with someone, the NSA can track it. Every site we visit, every online service we use, the NSA knows it.

And here are some interesting statistics; I’m not sure people are aware of them. Just as an example, each day in 2011 the NSA was able to collect about 200 million SMS messages, five million missed-call alerts, details of 1.6 million border crossings, more than 1,150,000 names from electronic business cards (which also included the ability to extract images), over 800,000 financial transactions (either through text-to-text payments or by linking credit cards to phone users), and geolocation data from more than 7,600 text messages. These are incredible data. This is really something.

And I remember, referring again to the Summit on Open Government, where Governments were supposed to talk about their initiatives on being open and accountable, respecting citizens, respecting the rule of law and so on and so forth. One of the Government officials – I remember it was someone from Europe – was really frustrated with the Google Street View programme. He was saying: “I’m very frustrated that my house at the beach is available in every single detail to anyone who has access to the Internet.”

So even if we accept that today’s technologies give us an added value and contribute to the increase of the quality of life, we should also look at the freedoms that are being taken away from us.

And definitely it’s not only the U.S. doing it. My Government is doing it, your Government is doing it. The degree to which this is happening differs; it depends on the resources that they have, but still it is happening.

And in line with this, I would like to address all of you on the panel. With all of this in the background, how do you still ensure a proper balance between the legitimate interests of business and Governments on one hand and the rights and interests of Internet users on the other?

What is your view on this? How can we ensure balance? Is there any balance possible at all?

So your thoughts on this?

Or maybe we start from Ben, he was complaining he was the last one.

>> BEN SCOTT: I wasn’t complaining. So the question is how do we get the balance back?

>> VERONICA CRETU: How to balance.

>> BEN SCOTT: It’s all about the transparent application of the rule of law. The problem is that the rule of law was not designed for the Internet era. And we now have a very big job ahead of us, which is to modernize privacy policy for the digital age. And no country on earth has laws that are sufficient to the task. It’s not just a job for the United States Government, it’s a job for every Government. And we need to think about it in all of its facets and need to understand the pervasiveness of the Internet in order to get a handle on all the different areas in which we need to be rethinking and resetting law, in order to account for the power of technology and the ways in which people use it.

>> MATTHIAS TRAIMER: First of all, I like very much what you said about it being all about the rule of law, and especially the linkage that we had in the first round between what you said about trust and the rule of law and Democracy.

I have to contradict you when you say the rule of law was not formulated, or constructed, for the time of the Internet. I think the rule of law is a principle no matter whether it’s online or offline. In principle, we say that the human rights that we have online are the same as we have offline. Of course we have specific situations. But my problem is that at all kinds of conferences like this – EuroDIG, and be it also on the national or international level – it’s always this general view of things. The problem is, of course, whenever you ask a lawyer and you want to know, well, do I get my money back or not, what is the answer? “It depends.” And this, I mean, is also the issue here.

Of course the courts – both the Luxembourg Court and the European Court of Human Rights – say, so far, that if you do something in general, without making distinctions – for example, a general filter, a general blocking obligation like we had before the Court of Justice – this is of course against human rights, or against the EU Charter. But in principle we have a perfect academic theory: the European Convention on Human Rights.

And coming back to the problem that I wanted to express: we often speak here in terms of, well, it’s important that we guarantee safety, transparency, that we guarantee a fair balance. There are hundreds and hundreds of people working on standards, and not all the standards are as bad as the impression that is often given.

When you ask what the State should do, or what the State’s responsibility is – for example, starting with network security: the EU has been working on it for many years and has founded an agency, ENISA; there are directives, the NIS Directive is currently being discussed, and of course there is a lot of criticism as well. But a lot of work has been done.

The question of trust, I do not have to talk here about all these issues about data protection, about the whole e-Commerce law and especially all the initiatives in the private/public partnerships. So a lot is going on. And that is often not mentioned in this general discussion.

So I would also put “the Internet is broken” with a question mark. Where is it broken, and in what way? Maybe a crime has happened there – you can say somebody did something against the law, was spying on people without any legislative basis for that. But the Internet is not broken. Everybody’s addicted to it. Everybody’s sitting here. It’s terrible; we all get up in the morning and the first thing we look at is our mobiles.

So it’s not broken. If we talk about it, we have to speak about the concrete cases. And I’m – what? I said the Internet is not broken.

>> AUDIENCE: (Off microphone.)

>> MATTHIAS TRAIMER: To my mind, the Internet is not broken. We have, of course, problematic cases; for example, if I give it to Jacob, I think he can tell about a lot of the problematic cases.

>> Am I the individual problematic case?

>> VERONICA CRETU: Thank you. And just a note: immediately after the panelists, we will address this question and go to the audience, and you’ll be able to formulate your specific questions or comments. So please bear with us.

>> JACOB APPELBAUM: So I have a couple of points, and to respond to you: one of them is this notion that everybody’s doing it – and you mentioned your country. And my guess is that “it” is different. Your country probably doesn’t design unlawful interceptions; you just get the trickle-down from my country. So we have to talk about the “it.” We have to talk about the fact that it’s sabotage, and not everybody is sabotaging. For the most part, those standards happened in Europe and in the United States, and the sabotage was mostly driven by the United States – specifically, the NSA.

So on the notion that everybody is doing the spying: the scale is really important to think about when we talk about dragnet surveillance. You know, Sweden’s FRA – they are using QUANTUM-related attacks, QUANTUMHAND, QUANTUMINSERT, these kinds of attacks, because GCHQ and the NSA and others are working with them. So “everybody’s doing it” – but it’s not clear, for example, that many of the people targeted by that, or who run those networks, know that their machines are doing that. So many people are participating without their knowledge. But really, the “everybody” that is doing it is a small set of intelligence agencies working in a small conspiracy across the planet, and I don’t feel like everybody in this room knows about it.

The rule of law is not always the most important thing. In some cases, people argue that an executive order in the United States, even if it has a classified annex, trumps Constitutionally debated laws that are passed by Congress. And they would argue that it’s legal.

And so I mean what we have to look at is principles as well. So in the United States, we have a tradition against general warrants.

And often, if there is a law that says that something is legal, we must look at the Constitutionality and the spirit of the foundation of our Republic, not just at the legality of this thing. And what is clear is that there are a lot of people who did not consider the Constitutionality – or they did, and thought what they were doing was more important. And Justice Brandeis talked about this years ago.

And so when we ask about the balance, it’s not a balance between States’ abilities to wiretap, to sabotage standards or to attack people, and my right to privacy. It is about the fact that States have positive and negative obligations that have something to do with transparency and accountability and Democratic oversight, and they are not living up to those things.

And my right to be left alone is something else entirely. That’s a totally different topic.

The fact of the matter is that a lot of these people, like General Alexander, are criminals and are engaging in criminal activities. When the NSA kills people, it is committing an assassination, which is illegal under international law, illegal under American law, and immoral. So it goes far beyond the Internet, and it’s critical that we recognize it has nothing to do with the privacy of people. It has to do with due process and fundamental liberties.

(Applause)

>> VERONICA CRETU: Thank you.

>> CORNELIA KUTTERER: Well, so I’m saying the same thing that I wanted to say, but I think now it’s even more important to say it. I think the equation between Governments and companies on one side and users and citizens on the other is not a correct one. You have Governments. You have companies that hold data. And then you have users. And we have different relations to them. Some of the Governments are our customers. But we also see them as someone that we have to defend our customers from when they are not acting according to the rule of law.

And I would at least agree with you insofar as there is need for Government reform in the United States, and Microsoft has been partnering with others in demanding these reforms.

We have also pressed the United States to allow us to be transparent about requests. And to a certain extent we can now at least tell parts of that. As an interested person, I always found this interesting: our first law enforcement data access request report was called “Law Enforcement Data Access Report” and not “Transparency Report,” and it had a footnote that it did not include FISA requests, because we couldn’t say whether we had received them or not. So we do not stand on one side with our customers or users on the other. And where Governments illegally access data that we hold on behalf of our customers – hacking our systems or the data in transfer, or gaining any other form of illegal access – we will challenge that.

Eventually, that is why, in my first statement, I was saying we have reinforced across our services the need for encryption, so that it is not easy for anybody to do that, be it a Government or any other illegal actor or hacker.

The situation for us in rebuilding trust is something that we see globally. I do agree that, just because of the amount of data involved, it has a different scale. At the same time, when you look at our data access reports, you can also see that it is not actually the case that there is a huge number of cases. So I want to say that we have to look at specific details. And unfortunately, very often on those panels we cannot actually specify what part of the equation we are discussing.

And the last point on the rule of law I want to make is that reacting to this by starting to develop catch-22 laws does not necessarily help – laws that put companies in a situation where they must break either this law or that law. It’s not putting forward a solution. It might be understandable, but it’s not really what a Government should do. So we are also asking Governments to have that dialogue on the different aspects that need to be solved.

To give you one example: in the Internet law that was recently adopted, they refrained from data localization requirements but enacted an additional data access provision. So the paradigm shift in the world we live in is a challenge to the jurisdictional realities of the rule of law, in particular because most international law was based on territoriality. By forcing Governments to use MLATs, we will hopefully also bring them to a state where they enhance them and make those procedures better. And that is a role that we are willing to play, to rebuild trust.

>> VERONICA CRETU: Thank you, really valuable to hear the Microsoft perspective.

And you have the last word?

>> JAN-PHILIPP ALBRECHT: Yes, I think the question was mainly on the balance, which I would like to come to now. But first of all I would like to point to a book written 20 years ago, which I really like: “The Road Ahead,” by Bill Gates. Already in that book, 20 years ago, Bill Gates was writing about the technical necessities to safeguard the rule of law and the trust of users in the Internet. And those technical capacities he was writing about never really came to be implemented in most of the systems we have used for the last 20 years. That is, I think, partly because too few politicians have read it. But also because, differently than with other technologies – like the car, where it was obvious that you need a brake, and that if you manipulate the brake you might be injured or lose your life – in this case it was possible to just ignore the missing technical capacities, because the implications were not really visible and nobody really asked. And as I said, politicians obviously didn’t demand anything, because perhaps they didn’t know about these technical issues.

And so my impression is that what is broken is the application of due process as it has been set, and the application of standards. And that is something which is necessary to know if you talk about the balance today. Because it is not only an issue of Governments not acting right; it is perhaps also an issue of legal standards and technical standards not being adjusted to the realities in which we have been living for quite a long time now.

And the problem is that rebalancing things can be harder than balancing them from the beginning, because quite a lot of people just get used to the balance we had in the years before, even though it might be completely wrong.

And just to make it once again a bit more concrete with the example I started with: here in Germany, and in most countries in Europe, public authorities use information systems which cannot assure that their own laws are complied with, because the authorities do not have the capacity to look into the hardware, the software, the processing systems and the processing parameters which are applied.

So, technically, they tell their citizens that the laws are applied, but even public authorities cannot really assure it.

So how should a citizen, a user, or even a company be able to assure that basic standards and the rule of law are applied? It’s not possible.

So talking about the balance – about the values to be trusted or balanced against each other – is first of all talking about whether we have the right system in place and whether we apply this system properly. And I think that is what we need to talk about together, very honestly. Because otherwise we will not fix it.

>> VERONICA CRETU: Thank you very much. All this requires a very comprehensive, holistic approach, and probably there is no single answer or single solution to that.

Because we are going to end in about 20 minutes maximum, I would like to give the people in the room the opportunity to reflect, ask questions, and be as active as possible. So the gentleman, please: introduce yourself and state who the question is for – all the panelists, or one specific person.

>> AUDIENCE: Thank you for the panel and for the professional panelists. I’m (inaudible) from the Electronic Frontier Foundation. I would just like to give a comment to you.

In your efforts to restore trust, please do not consider restoring as going back in history; consider the future as a potential, what you can do. So it’s not only that you need to go back in time – things do change, and that is not always a bad thing. The future can also hold bright and nice things.

So thank you.

>> VERONICA CRETU: Thank you for this comment.

Please.

>> AUDIENCE: My name is Alva. I work for the Organisation Raply. I have two questions for Mr. Jacob Appelbaum. I would like to know if you can comment on the plans in Russia, if you know them. The German delegation is planning to meet Snowden in Moscow, and I would like to know if you can comment on that. Thank you.

>> VERONICA CRETU: Let’s take one more question. Sorry.

>> AUDIENCE: I also have a question.

>> VERONICA CRETU: Let’s take this one.

>> AUDIENCE: I’m Anafol, political adviser to the Social Democrats in Parliament. What do you expect the erosion of trust in the Internet is going to mean? Or how are we going to deal with this? Is it going to mean that we will face a decentralization of the Net in the future, or should we go for a new kind of global regulation – regulation of the global commons in terms of the Internet? What is the path ahead?

>> VERONICA CRETU: Thank you very much. And let’s take one question.

>> AUDIENCE: I’m a taxpayer, so this is a request for comment from the entire panel.

We heard about encryption, and you mentioned encryption in particular. To what extent is encryption useful to tackle the issue that we have at hand right now? Because I strongly doubt that, if everybody used strong encryption across the board, the agencies would eventually say: we can’t spy on you any longer because everything is encrypted. My guess is that they would go to the Government and say: well, we need an increased budget to still be able to decrypt the communication. So, essentially, I am going to pay for the decryption of my own messages.

So I just wonder how we should tackle this dilemma. And I guess it’s not going to be tackled simply by the increased use of stronger encryption. That’s my point.

>> VERONICA CRETU: Thank you. We have three questions. One is related to Snowden. Would you like to – and please, number four.

>> AUDIENCE: So my question to you, Microsoft, is kind of a strange one. But I have a friend named Caspar Bowden. He is one of the greatest people to have worked at Microsoft. He was fired for doing the legal analysis predicting that things like PRISM and FISA surveillance would exist. So I wonder if you would bring Caspar back to clean house.

(Applause)

And, specifically, whether you are going to rearchitect Skype to be peer-to-peer again, with an open protocol that we can verify, with free or at least verifiable source code, so we can begin to trust Microsoft again. Because you did rearchitect it for PRISM.

(Applause)

And so my question is, ultimately: is Microsoft going to leave the PRISM programme and programmes like it, and build software that makes programmes like it impossible – that makes it impossible for Microsoft to betray its users? That is my question to Microsoft.

(Applause)

>> VERONICA CRETU: Thank you.

>> AUDIENCE: Sorry about that.

>> CORNELIA KUTTERER: I would not be sitting here if I were not okay with taking the more difficult questions. And I would like to note that Sophie, who talked me into this, is not here – I don’t know why. But she also told me that a number of other companies declined. So take it as something positive.

So, on Caspar: I have worked with Caspar. When I started at Microsoft, coming from a Civil Society organisation, I remember talking to Jose from Privacy International, who gave me the advice that Caspar is a good guy and that I should be talking to him. And I have been talking to him quite a bit.

So Caspar and I, I think – I hope – are still on good terms. And I cannot possibly comment on the reason he left Microsoft; I simply do not know the exact details. So this is a question I’m afraid I cannot answer.

On the other point, leaving PRISM: we learned of the existence of a PRISM programme under the NSA for the first time, like everybody else, with the revelation in The Guardian, and –

>> AUDIENCE: I mean, just the name PRISM? You said you protect information in transit. That was about another story. Do you protect data –

>> CORNELIA KUTTERER: So here is the thing. What I can say is that apparently there have been requests, and if you look at the documents, you can see that it has been struck out. But they kept the LCA piece, which identifies Microsoft as the company that declined to work with the NSA on anything.

What we have done so far is sue the Government and ask to be able to publish the numbers of requests we have received. We can even say now that, over a certain period, we received a certain percentage of national security orders in comparison to law enforcement requests. And we can also say that none of those have actually impacted European business customers.

And just claiming things doesn’t make them right. This has been, over the last 12 months, really a step-by-step process. I’m not saying that Microsoft in 2013 said: now we are doing this, this, this and this. There was involvement, discussion – lengthy discussions internally.

We started first of all with transparency; that was one of the first steps. Then the revelations continued: the PowerPoint slide that was revealed concerned the transit links of Google and Yahoo. And that triggered us: hmmm, let’s encrypt there, reinforce mechanisms there.

And then we looked at the transborder access part of the design, and at whether we should actually challenge such orders when they reach for customer data stored outside of the United States – so as to give assurance in our contracts, for European business customers who would like that assurance, that we would go against such an order.

We are currently doing this in a different area: in a normal law enforcement procedure, with a warrant issued in a New York District Court case. Last Friday we submitted our response and our challenge. We have also taken additional steps, going to EU Governments to help us with an amicus brief. We have one of the key negotiators of the MLAT between Ireland and the United States, who negotiated that procedure, helping us with the amicus brief. We have U.S. companies and Civil Society organisations in the United States filing amicus briefs to help us. So we are taking all these steps.

Now, I want to go back to the taxpayer. I’m very sensitive to that, certainly because I live in Belgium. On encryption – to a certain extent I understand the comment. Security is a process, so any step you take will have to be enhanced. And it probably depends on your own Government’s ability and eagerness to carry out that kind of access. So it is again a matter of getting Governments to behave under the rule of law. And I’m turning to policymakers in that respect, where I think we at Microsoft are simply on the same page and in the same boat.

It doesn’t help to think that we are the adversary here. I think we are simply trying to do what our customers want us to do.

>> VERONICA CRETU: Thank you.

To the encryption, decryption question, does anyone else want to pick up on that?

>> Encryption works, as far as I can tell from the things I’ve seen. I’m one of the people that works on the NSA documents; that’s why I know that the Government owns Microsoft inside and out. There are people that don’t care about the lawfulness of the back door. Factoring, with a reasonably sized key – that is to say not 1024-bit RSA; 4096-bit seems safe – and discrete logs are hard problems. And I really believe that.

My entire life right now, and all the things I do, are predicated on certain mathematical problems being hard. And I probably would not be sitting here free if it were not for mathematics. So I put my money where my mouth is in this case. So when companies don’t rearchitect to build peer-to-peer, end-to-end solutions, we should not trust them.

And no amount of money is going to decrypt that. They will have to break into your house. We should raise the cost of surveillance from passive to active, instead of letting them get it for free or go to a company to coerce them. We should make sure that they have to do an attack that can be detected, which may allow, for example, bringing it to a court.

The important part is that the cryptography is not the weakest link, and the mathematics are not bad. I’ve seen, time and time again, interactions where there is a person targeted for surveillance and the NSA will say: sorry, we can’t break that. It’s off the record. That gives me great hope that policy and technology can meet in the middle. When companies don’t make that move to meet in the middle, we should not trust them.
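[Editor’s note: the factoring claim made above can be illustrated with a toy RSA round trip in Python. This is purely illustrative – the primes, exponent and message below are made-up toy values; real deployments use vetted libraries and 2048- or 4096-bit moduli, exactly as the speaker notes.]

```python
# Toy RSA round trip showing why factoring hardness matters.
# Illustrative only: real systems use vetted libraries and
# 2048/4096-bit keys, never tiny primes like these.

p, q = 61, 53            # two small primes (kept secret)
n = p * q                # public modulus; security rests on n being hard to factor
phi = (p - 1) * (q - 1)  # Euler's totient of n
e = 17                   # public exponent, coprime with phi
d = pow(e, -1, phi)      # private exponent: modular inverse of e mod phi (Python 3.8+)

m = 42                   # message, must be smaller than n
c = pow(m, e, n)         # encrypt: c = m^e mod n
recovered = pow(c, d, n) # decrypt: m = c^d mod n
print(recovered)         # → 42

# An eavesdropper who could factor n back into p and q would
# recompute phi and d; with a 4096-bit n, that is infeasible.
```

The point of the passive-versus-active argument: an attacker who cannot factor n must instead mount a detectable active attack (implant, break-in, coercion) rather than silently decrypting intercepted traffic.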

(Applause)

>> VERONICA CRETU: Thank you. Does anyone else want to reflect on this last question of encryption and decryption? Or shall we go to the question of fragmentation?

>> There is this term “balkanization.” Some Member States of the European Union, and also the Chancellor of this wonderful country here in Germany, have said that we need European solutions, European storage of data. That is not the position of the Austrian Government. For the Austrian Government, this kind of fragmentation, putting a fence around the Internet, is not the correct way.

We also believe that a safe environment, by way of encryption and so on, is the solution we have to work on. What the Austrian Government especially is working on – one of the few areas where my Government was really in the lead – is a trustworthy eID, accepted everywhere in Europe for services. So this is one of the aspects.

But there was also the question of which direction things are going with trust, and so on.

And I think it’s nothing new when I just quote Neelie Kroes, who said that PRISM was a wake-up call. And the answer of Governments in a democratic society can only be one thing: transparency, transparency, transparency. That means really rethinking, for example, the legal framework as regards access to information. We are actually heavily discussing in Austria introducing a new Freedom of Information Act, and stakeholders are not satisfied so far with the draft. Authorities, to my mind, are still too troubled about the questions they could get. But it’s really access to information, this transparency, that must be the answer of the state.

And then, really, what I just want to repeat: trust is directly linked to the concept of human rights. If people can’t trust in an effective remedy, if people can’t trust in the institutions, if they can’t trust in the procedural legal framework, then trust remains a theoretical concept.

Last word. I work in legislation. What was said here is right: the legislature has a big problem – it is always behind all of these developments. Therefore, from the very beginning I was quite supportive of this idea of multistakeholderism, in the sense that legislators should really listen to what is going on in this process and make use of the various voices. There is no chance for legislators if laws exclude the voices of the stakeholders. And therefore, as a matter of principle, I think there should also be a change in how legislation is done in the future.

>> VERONICA CRETU: And there is a lot of work going on in countries around Europe exactly on this issue: how to build and institutionalize processes for rethinking the design and codesigning things together with all stakeholders.

I see a remote participation question.

>> SORINA TELEANU: Thank you. We have comments and questions.

“I trust the Internet to be entirely human, meaning trustworthy. Trust is the ability to have the integrity expectations of my directions in life met and to have some control over that.”

And now two questions. “Does the panel believe that – (Off microphone.)

“Does the panel believe that national laws cannot discriminate on the right to privacy solely by the nationality of the target and still be consistent with human rights?”

And the second one. “Would the panel support the idea of stronger standards for greater transparency, in terms of making privacy practices clearer to users, not just written in long terminology that hides more than it reveals?”

Thank you.

>> VERONICA CRETU: Thank you Sorina.

>> I think those questions are valid, and also connected to the earlier question on territoriality.

Because, for example, in Europe, we have Constitutions and laws and all of them say the right to data protection is a human right.

The Constitution of this country, by the way, turns 60 this year, and it’s also in there. It’s a human right; it derives from the human rights to dignity and personality. It’s written down: data protection is a right for everybody.

In the European Union it would not be constitutional to distinguish the protection of personal data along the lines of nationality. And that is exactly what I see as the solution. If we would at least enforce that clearly within the territory of the European Union – for all its institutions and all companies or actors coming to the EU market – then we would already be far along, because the EU is the biggest market in the world and quite a strong player: 28 Member States strong.

And the international community, by the way – also referring to the question from the previous round – is dominated by a majority of non-democratic, repressive States. So to imagine that in a few years we would just create global standards and enforce them equally in due process is a bit blunt. I would say that, first of all, we should simply enforce the laws and the Constitutions which apply on our own territories. That’s what we need to do.

And that brings me to the second question, which refers very clearly to transparency – one of the most important issues here, because obviously we are not even aware of many things. As consumers, who are also the ones who have to bring lawsuits, we first of all need to know that our rights have been infringed before we can go to court and enforce them. So without transparency there is quite a loophole in the system of the rule of law. We do, of course, need better rules on transparency with regard to our privacy rights; they are already there and could be improved. But above all there is a lack of enforcement. We have a huge lack of enforcement.

For example, in our legal framework here in the European Union, we have 28 different national laws, framed on the one side by the Council of Europe Convention and on the other side by the Directive of the EU. And they are so different, and so weak in enforcement, that I would say that for two decades – since the existence of the Directive – it has hardly been enforced.

So what we need is due processes that work, where citizens and consumers can insist on the application of their rights, and where it is sanctioned if authorities or companies infringe these rights.

And that’s the problem of today: it is taking place all the time. I can tell you hundreds of examples which I have experienced myself, and I would say that in the area of the right to privacy and the right to data protection, at the moment we have a lack of enforcement of about 80 to 90 percent. And that is not something I would say I could trust in.

>> VERONICA CRETU: Thank you very much. We are really running out of time. And there are two more people standing here.

Very quickly, one minute maximum to formulate your question or your reflection. And please introduce yourselves and say who is the question addressed to.

>> AUDIENCE: I was afraid you would say that about the one minute. I’m Stefan Herwig. I’m an Internet analyst, but I’m also here for Civil Society.

I’d like to go back to the topic of the panel: is the Internet broken? I don’t think the Internet is broken. The technical layer is working well. The host provider layer is working extremely well economically: Google and Apple have just become the most valuable brands, overtaking Coca-Cola, which had been the most valuable brand for 20 years – last year they overtook Coca-Cola by a long shot.

What is not working is the regulatory level. John Perry Barlow, you have been proven wrong that we can sustain the Internet with a minimal regulatory level.

And I would like to say more, but I only have one minute. This regulatory level has never worked. So what is broken is the vision of the Internet that we developed 10, 20 years ago. We should now re-envision this societal domain.

(Applause)

>> VERONICA CRETU: Thank you very much. Please, one minute maximum.

>> AUDIENCE: Good evening. I’m B. Corval. I work at the International Centre for New Media in Austria, and there is an initiative called “my data belongs to me.” It’s a shift of paradigms: the issue is not protection, not safeguarding, but property and ownership. If the people whose behavior the data originates from are the owners, then they must also hold the property rights and the power to decide about the data – including the right to alter it, to augment it, and to exclude others from doing these things. And of course this approach would also have to be reflected in the technical infrastructure.

So I would like to invite the panel to comment on this new approach to “my data belongs to me” and invite the audience to get involved in this initiative.

Thank you.

>> VERONICA CRETU: Thank you. I think we could spend a couple more days in this room and still not exhaust all the questions and reflections. But we don’t have that much time anymore.

I would like to ask the panelists to have their last minute, last one-minute reflection, based on the questions that have just been addressed. Please.

>> Let me say that every time I’m in a session or a panel like this, the problem seems more overwhelming. So let me attempt to make an asset out of a liability. I said at the beginning that I thought the definition of trust was the transparent rule of law consistently applied. Now I’ll amend that: it’s the transparent rule of law, consistently applied, adequate to control the power wielded by public and private authority. Our problem is not that we have no structures, laws or principles, nor that we have no secure technologies; it’s that they are not adequate. And that adequacy problem is revealed in a hundred different places, and we talked about them today: corporate accountability, secret law, public debate, the crypto wars, the breakages we see in the security of our own machines. All of these can be seen together as one overwhelming problem, or we can pick them apart and say: we didn’t get into this mess overnight, and there are many things that need to be fixed to get us where we want to go.

So any of us, in our own areas of interest, can take the problem and work on it diligently. It’s not all or nothing. We can begin incrementally to repair what is broken and to restore trust. And each of us will have a role to play in that – political, technical, social, journalistic, scientific; pick your poison.

>> I have nothing new to say. I can only repeat that trust is something all stakeholders have to worry about; the state’s role as such is really to deliver a stable environment for human rights, so that you can communicate both publicly and privately without fear.

And one thing I really want to say is that it is one of the big, big issues that users and people get the chance to understand their rights in a much better way than is done today. It’s a catastrophe how, for example, the human rights aspect is covered in schools. The so-called average citizen has no real idea of the most basic things – for instance, that when an authority wants to intervene, it needs a law, and this law needs a legitimate aim.

But the main thing is nobody talks about the proportionality.

And the whole PRISM thing was not only a case of a merely pretended legitimate aim; it was also a complete disregard of proportionality. And these are the things we really should talk about: how to bring this back into the schools, because this is one of the basic things people should really have a right to learn. And the Council of Europe is working hard on that with the user handbook.

>> VERONICA CRETU: I think we didn’t cover the question on Snowden. In your last minute of reflection, quickly, update us on this.

>> Make some news?

>> No. You get a headline.

>> I think it’s important to talk about Snowden. I’ve got a little Snowden head here, symbolizing him sitting here. And I think it’s important to note that we have heard a lot from people on the panel about accountability, human rights – all that lip service we hear from people all the time. And no one on this panel or in Europe has really helped him to get here. No one is really looking to help him – except, for example, Jan to some extent, but he can only do so much because of the European Parliament’s limited legal powers. Nobody talked about getting Snowden here and protecting him and his basic human rights.

And so this question about Snowden in Russia – it’s an open question. It’s clear that if Germany steps up, that might happen. And that should happen; it’s clear that it should.

I actually went to the Austrian Parliament and asked, and I asked several different parliaments, including people in the Council of Europe and the European Parliament. All of them said that they are afraid of the United States Government and that they couldn’t keep him safe if he came here.

I asked Neelie Kroes: why don’t you help Snowden? And she said: what would we gain by doing that? What more do we need to learn? That is not the reason we take a moral action. We take a moral action because it’s the right thing to do, not because of some capitalist conception of benefit. Right? We save a young man’s life.

(Applause)

So, to that end, the question on data protection is straightforward. It’s not about data protection; it’s about people protection. So we apply simple tests, and when people are not protected, or secret services get to use a loophole in the Constitutions that protect our privacy, we have to recognise that we need to change those things, in the service of fighting for the user. And so things like giving Snowden asylum are important, but we should also work to stop the spying.

What do we suppose the new BND building going up here is for? The BND and the NSA are best friends. If we want to talk about privacy and human rights, we have to start to dismantle the institutions that operate outside of the rule of law.

(Applause)

>> CORNELIA KUTTERER: So I have already given some of the examples and how Microsoft has reacted to the revelations and how we have tried to step forward in restoring trust by certain steps.

I basically want to use the time to go back to the question of who owns your data. That is straightforward for Microsoft, because in our privacy policies we make it very clear: the customer owns his data. That is not always clear, and it is not in all privacy policies of all companies – a choice for the customer to make, whom to trust in that respect.

We are doing interesting research in that area on how to architect those controls so that customers can define more granularly which data uses they agree to and which they do not.

And there is a lot of work ongoing. So I’m happy to connect afterwards with the person who raised this topic.

>> VERONICA CRETU: Thank you very much.

>> Yes. Perhaps I would first of all like to get to that question, because I don’t think that the ownership concept addresses the problem we citizens face at the moment, or the rule of law question.

It is a concept which perhaps additionally makes sense with regard to the rights and the role of the consumer in this emerging data economy, insofar as I can then somehow say under which conditions I give my data and for which specific purposes.

But it’s not really sufficient to talk about that, because, as we have already heard, it’s not only about protecting data or having rights on economic issues; it’s also about human dignity. It’s the right to be left alone, the right to privacy, the right to one’s own personality – or, as the German Constitutional Court put it 30 years ago, the right to informational self-determination, which I like, even if it is perhaps not a very nice word.

So it’s really about the question: how do we protect human rights? How do we protect the humans in information technology, in our lives and in the general environment in which we live?

And there, I think, it’s really about the question whether the rules which we as a society give ourselves are in place and really respected. And I have to say that at the moment our biggest problem is that they are not respected, and that we therefore need to get out and get this done again – get politicians to take the right decisions. And, for example with Snowden, that is not very easy, because in a democracy you in most situations need a majority. And until now there is obviously no majority for getting the rules applied again. And obviously we will only get that majority if there is a majority outside calling for it to happen.

So, to finalize this round, that might be the conclusion: get out and get people engaged on this issue. Because it is about very fundamental issues – not only about the Internet as we knew it perhaps 20 years ago, something we could use or not use, but about everyone’s life, every day, every hour, every second.

>> VERONICA CRETU: Thank you very much. Thank you all for the great questions and the panelists for the excellent contributions. And several ideas that I’m sure everyone in the room is going to reflect on after this session.

We have Lee Hibbard. Lee, I think you’ve got two minutes for your brief report on the session? Could you please come on the stage or shall we bring the microphone to you?

>> LEE HIBBARD: It’s a very difficult task summarizing that panel. Thank you very much. It was very good.

And I’ve got about four or five bullets, more or less. I hope this encapsulates some of the applause and some of the major issues you mentioned.

Clearly, in my opinion, the question of the transparent application of the rule of law and due process safeguards is key. That’s something I think you all agreed upon. Therefore, making Governments and businesses accountable for their actions and activities, particularly vis-à-vis national security issues, is key. And it has driven the discussion.

That said, you have been saying that the regulation is broken: lack of enforcement, lack of respect for rules, the question of democratic oversight. And there is a need to dismantle institutions that don’t respect the rule of law and work outside it. That’s the first bullet.

The second bullet, what I took away, is about getting businesses to be more transparent and more accountable, getting companies to meet users and maybe other actors in the middle. And, you know, there was a push, and people were clapping with regard to some of the issues about pushing back at Microsoft. And I think it’s clear that more transparency and more accountability of companies is key.

We heard about transparency reporting, how accurate those reports are and what is reflected in them. I think that is key. That’s the second bullet.

Third would be about giving control back to users, in particular using encryption. You talked about the need to know about rights, insistence on rights, including the right not to trust companies. So I’m talking not only about human rights, but also about the right to say no, I don’t believe that.

And then you went on to talk about how to enforce those rights, in particular the privacy-enhancing technologies you referred to. And, of course, better transparency.

And I think the fourth and maybe final bullet is about the moral aspect: moral action to protect people. You mentioned Snowden, but it could apply to other people who are considered whistleblowers or human rights defenders. And there is a need to mobilize the action of more people. Maybe people are not aware enough. Maybe we need to do more to get X to come to country Y. And that’s something which came through quite strongly.

Those are the four bullets which I hope encapsulate the major conversations you’ve had today.

Thank you.

(Applause)

>> VERONICA CRETU: Thank you, Lee, very much. And thank you to the panelists. Before giving the floor for some housekeeping announcements, I would like to end the session by quoting Tim Berners-Lee, who in one of his recent interviews shared the following: “Unless we have an open, neutral Internet we can rely on without worrying about what’s happening at the back door, we can’t have open government, good democracy, good healthcare, connected communities, and diversity of culture. It’s not naive to think we can have that. But it’s naive to think we can just sit back and get it.”

Thank you very much. The floor is yours, Wolf, and I think our panelists deserve a huge hand of applause.

(Applause)

>> WOLF LUDWIG: Thanks. This is the last session, and my thanks also go to the panelists.

I have some last housekeeping announcements to make. We have to leave this building by 7:30 at the latest, because security is only paid until then. We will have an incredible mess if we are not out by 7:30.

And you may recall there is a social evening tonight, which is not here but in another part of Berlin. Outside you will find this nice little brochure with a description of the location where we meet. It’s located close to the underground station Friedrichstrasse. There we will have the social evening.

You may be aware that tomorrow we will continue. You are invited to the first side event, starting at 8 o’clock in the morning, not in this room but in the Strassaman, a side room, one of the workshop rooms here. It will start at 8 sharp.

Then we will continue with another keynote at 9 o’clock, here. And then, after the keynote, with our next plenary, session 4, which is an economy session. So try to be here tomorrow morning again, right on time, and contribute as you did today. And you are all invited to the social evening now.

As I said, you will find any further instructions for the venue in this brochure.

Thanks and see you later.

Pictures from working group


Session Twitter hashtag

Hashtag: #eurodig_pl3