How to improve users protection online? – WS 05 2012


15 June 2012 | 11:30-13:00
Programme overview 2012


Key Participants

  • Wolfgang Benedek, University of Graz
  • Olivier Crepin-Leblond, ICANN’s At-large Advisory Committee
  • Johan Hallenborg, Ministry for Foreign Affairs of Sweden
  • Meryem Marzouki, CNRS and University Pierre et Marie Curie
  • Marco Pancini, Google
  • Matthias Traimer, Federal Chancellery Austria


Provided by: Caption First, Inc., P.O. Box 3066, Monument, CO 80132, Phone: +001-719-481-9835

This text is being provided in a rough draft format. Communication Access Realtime Translation (CART) is provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings.

WS 5: How to Improve Users’ Protection Online?


>> MATTHIAS TRAIMER: Welcome to workshop 5. I’m from the Austrian government, and I’m happy to be the moderator for this workshop. The title of this morning is How to Improve Users’ Protection Online. So it’s not only about the user perspective as such; it’s all about understanding what kind of rights exist – what do we mean when we speak about users’ rights and protection? And who are the users? Because a user is not always a natural person, a private person. As you know, there is also a big variety as regards the expectations of these users. Apart from understanding what rights we have, the logical question, then, is how to exercise these rights, how efficient these rights are, and especially what people must understand and know about them. Do they know about their rights? So: practicability. What I really would like to do is to invite you as an audience to actively participate, and especially maybe to tell us one personal experience that you have had as a user. Many of you are on the Internet every day; many of you will be, for example, on Facebook. And the daily business, I think, is to Google something. You’ll be happy to hear we have a representative of Google here. It’s like: “What did you do? I was eating, I was sleeping, I was Googling, maybe?” Do you sometimes have the feeling, “Well, is this really good, if I search for this question?” and so on. We’ll hear what the representative of Google will say. Very briefly, I also want to introduce the panel to you; for an in-depth look, please see the programme, so we don’t lose too much time – just the main activity of each. First, Wolfgang Benedek, professor of international law at the University of Graz, and very involved with Internet questions.

And Meryem Marzouki, who is representing, more or less, the needs of the users in this international context.

Then we have Johan Hallenborg, an international lawyer, if I may say so, working on international human rights at Sweden’s Ministry for Foreign Affairs.

And then Olivier Crépin-Leblond, chair of ICANN’s At-Large Advisory Committee. Finally, we have Marco Pancini, leader of the Google team in Brussels.

We have an international law professor here, but please don’t act too much as a professor – we do not need a lecture now. But maybe you can explain to us: when we speak about user protection, we think of rights. What kind of rights are there, or exist already, just to remind us of the framework? Do they exist somewhere already? Yes, they exist. Maybe a trivial question, but just to give us a short introduction.

>> Thank you very much for that question.

>> WOLFGANG BENEDEK: There have been guides for the Internet and rights and principles, for example in Scandinavia. And indeed there are already several efforts to put together the main rights and principles for the Internet – Human Rights – like, for example, the Internet Rights Charter of the Association for Progressive Communications, already in 2006, or the Charter of Human Rights and Principles for the Internet, which was drafted in 2010-2011 by the Dynamic Coalition on Internet Rights and Principles in the context of the IGF. And this latter document, which at the moment may be the more comprehensive one, follows the Universal Declaration of Human Rights, simply to see how the rights of the Universal Declaration can be used, or can be adjusted and interpreted, for the purposes of the Internet. And this is something you can all check on the Internet, obviously, and also contribute to.

Now, there are two perspectives you can take. One is the public perspective, the Human Rights perspective, so to say; the other could be the private law perspective, the consumer law perspective. And Human Rights is something that can meet both perspectives. So the challenge is how to break the rights of the users down into practical rights. This can be in the field of civil or political rights – for example, the right to online protest, obviously part of the rights to freedom of expression and association, and so on. Or this can be in the context of economic, social and cultural rights – the right to education, digital education, quite obviously. So when you go through the rights, you can actually find in each and every right relevant connections to the Internet. And let’s take the right to (inaudible). Obviously that is nothing new, but it has a different meaning in the context of the Internet. Another issue: the right to an effective remedy. How can we address violations, or how can we act when we feel that our Human Rights are adversely affected? There is a human right to a remedy. There are also user rights and consumer rights, and now you have to bring these together. Actually, there will be some work done in the Council of Europe on a compendium of Human Rights of Internet users, which will try to do this job of connecting these different levels.

>> MATTHIAS TRAIMER: Thank you very much. We will come back to the Council of Europe initiative in this connection; we will hear from the commissioner, but also from others. Even the Obama Administration, as you might know, is dealing a lot with the question of the right to be forgotten and so on. But Meryem – thank you, that was the legal side – to break it down a little bit now:

You are, if I may say so, the representative of the users. What should they know about their rights, and where are they left alone?

>> MERYEM MARZOUKI: First of all, I’m not very comfortable with this framing in terms of user rights. I believe in citizens’ rights, and I will speak about citizens’ rights. Since we are here at EuroDIG, the dialogue on Internet governance: governance is much wider than only legislation, and Human Rights legislation in particular, especially since the Internet space is not only regulated by public legislation and regulation from states and intergovernmental organizations; for a big part of it, it is a privately ordered space.

On the Internet, we have new and older gatekeepers. We have the Internet service providers: the access providers, the hosting providers, the application or service providers. These are private entities. And since they are the gatekeepers, they can in some cases – for their own interest, or because they are obliged to do so by some legislation – infringe our rights. For instance, they can infringe our right to freedom of expression when they block or filter some content.

>> MATTHIAS TRAIMER: Can you think of a concrete case where this is happening?

>> MERYEM MARZOUKI: Yes. For instance, in the European Union, we have a directive under current revision, or at least there is a discussion about this. And this directive has some provisions on access and hosting providers. In some cases, in case of allegedly illegal content – I say allegedly, because we have not come to a court judgment yet – the service provider is sometimes obliged to take down some content, to remove your own content. As the author of this content, your freedom of expression is violated. As an Internet user, you cannot see or read this content, or you cannot watch it or listen to it, and then your fundamental freedom of information is violated, too. And this is because a private company, an Internet service provider, is obliged to do so by the directive; otherwise it will be held liable itself.

But we also have another kind of gatekeeper, which is the technical artifact as gatekeeper. We have algorithms. For instance, when we Google something, we make a Google request, and with the Google Suggest feature, Google can suggest another keyword to you, and this may well lead to the courts, as we have seen recently in France and in other countries. That is for the algorithms. They also select, or at least prioritize, content as an answer to your request on Google, and this could be seen as a kind of editorial process, like in the media; it is not neutral.

>> MATTHIAS TRAIMER: Maybe we will go directly to Mr. Google here on the panel – sorry for saying so, but I hope you also feel a bit honored by that; of course I hope so. Google is a corporate entity, but there is not only corporate identity; there is also the perspective of corporate social responsibility, as one always reads – not only for Google but also for Facebook. I have the feeling, for example, that when Facebook now started a kind of new system where the users can also vote on whether they want a rule, they didn’t do it voluntarily, but because there was some pressure on them. And there is much criticism of the big companies, for example, although we use them.

We use them all voluntarily, more or less. What do you see, as the Google company, as the rights of the users, and what do you provide and offer to them?

>> MARCO PANCINI: Yes. It is already a big task to talk about Google, so I will refrain from defending the choices of Facebook, making a step back but also giving some concrete examples. I would say that first of all comes the rule of law, which in a certain sense is the contract that citizens and all the industry have with the local governments or with the institutions in order to live all together in this society. So we believe in the rule of law, and we take action according to the rule of law, which for us is the right balance between the different rights and interests at stake – security versus freedom of expression, for example.

But then again, we believe that the rule of law sometimes cannot be everything; it cannot help the user to have a clear view on rights – why, for example, a specific piece of content was taken down. For this reason, some time ago we started the Transparency Report – which goes to answer your question about corporate social responsibility. We believe that part of our responsibility to the citizens using our services is transparency. So we want to be very, very clear about the requests that we are receiving from law enforcement and from institutions, the different jurisdictions from which these requests are coming, the actions that we are taking in relation to them, and even the abuses, even the things that are not working. Because if we believe that security online is a big, important issue, and specifically child safety, then it sounds quite strange that in some jurisdictions we are receiving many more requests for defamation –

>> MATTHIAS TRAIMER: May I interrupt you. Looking at the past several years of Google: what would you say Google has learned? Did it make mistakes, or did it do everything perfectly? What would you say was one of the real improvements you had to make – and were also forced to make by the users?

>> MARCO PANCINI: When we made mistakes, we were always the first to admit that we were making mistakes.

>> MATTHIAS TRAIMER: What would be an example, you’d say?

>> MARCO PANCINI: There is the famous case about Google Buzz and the whole discussion around that, and the fact that in less than one week we took action and fixed the issue. I mention that going in the direction of where we improved. For example, we improved our thinking in the last years in order to give more control to the user for what concerns personal data, in answer to your question. Because we believe in the bigger responsibility that we have in hosting user information, in being the steward of the user data, we want to provide full control over this data, and we are doing this through the dashboard. At the same time, we received a lot of criticism when Google+ was launched because of the lack of anonymity. We understood there were specific cases in which the right to anonymity was very important for users, to express their opinion and to protect themselves from possible censorship. And then we came with a policy of sustaining this in those specific cases.

>> MATTHIAS TRAIMER: Before I come back to the panel – because I know Jan Kleijssen has to leave us – we should ask the Council of Europe: what is the Council of Europe planning to do on this? Jan?

>> JAN KLEIJSSEN: Yes, thank you. Good morning, everyone. The compendium on the rights of Internet users that the Council of Europe will be preparing has already been mentioned. What is the added value? The professor already noted that a number of documents exist that bring these rights together; however, the documents he mentioned are not as such approved by governments. They are not legally binding. What we hope to achieve at the Council of Europe is to bring together in a compendium – that is to say, in an easily readable and accessible document for users – the rights that already exist within the Council of Europe context. I’m thinking here of the rights that the European Court of Human Rights has found to exist as regards the Internet under Article 10 of the European Convention on Human Rights, under Article 8, and under other articles of the convention, without going into a list. There are specific cases the court has already decided, where it has established that certain things can and other things cannot be done on the Internet. And it is important, I think, that users know that.

There are other treaties of the Council of Europe: there is the Cybercrime Convention, there is the data protection convention, there is a convention against the sexual abuse of children, another against trafficking in human beings. There are rights that flow from these conventions. But even for myself, who is supposed to be responsible for this sector, it would probably take me a couple of days, surfing our intranet, to find them all and bring them together. It is therefore unlikely that users would be able to find them very easily. So the aim is – and our governments have agreed to this, very importantly; in March, our ministers, 47 governments, agreed that we should set up such a compendium, so there is now a clear mandate – to have a special intergovernmental group, but also, I should add, a multi-stakeholder one, with representatives of business and civil society. Business is a bit newer; I’m addressing myself to Google. We have more experience working with civil society, but we would be happy to have business with us as well. The aim is to bring all this together so we have this easily accessible document, and then we hope that our governments will formally endorse it. And secondly – again here I turn to industry – it would be absolutely great if a link to this document could then be made available together with all the terms of users’ contracts. We heard from the young people this morning that they don’t understand the legal language. They don’t know what their rights are. They certainly don’t know what their remedies are. This is very important: if your rights are violated, or you think they are, you must have a remedy – what are we going to do about it? So what we hope is that we’ll find companies prepared to link to this little document – relatively little, a set of rights that people can invoke – so that when you actually use a service on the Internet, you are immediately told, in easily understandable language, what your rights and remedies are.

>> MATTHIAS TRAIMER: Maybe, Jan, we can provide you from this workshop with some ideas for the content of this compendium. We have heard that the youth would like to do that – but that is just to give one other idea.

>> JAN KLEIJSSEN: We’d love to get it. Thank you very much in advance.

>> MATTHIAS TRAIMER: Olivier, I introduced you as being linked to ICANN, but you also announced that you would speak from your own personal ideas, so not as a real official representative in this case. Coming to your personal ideas: we had a little talk yesterday over coffee. There is so much talk about rights and so on – so much theory. You have a bit of a problem with the practice. What does it mean in practice? Could you tell us?

>> OLIVIER CRÉPIN-LEBLOND: Yes, there is a lot of theory about Internet user rights, but the first problem is that in a multi-stakeholder scenario, you actually need to be able to involve the largest group in the world; that is the actual users, the 2.1 billion people that use the Internet. And the Internet would be nothing without those 2.1, probably 2.2 billion by the end of this conference.

And as you see, we don’t have 2.1 billion people here; we are far from that. In most conferences out there, it is very difficult to get so many users to come to face-to-face meetings. Teleconferences are one thing, but they face a couple of physical problems. The first is that the earth is round, and we can’t solve that: it will always be the wrong time for someone, somewhere in the world, to hold a conference call.

The second problem is that the telecommunications and electrical infrastructure is not the same everywhere around the world. In some countries, people will spend more time redialing into the conference call than actually bringing their points of view into the debates.

The other problem, then, if you start having to bring volunteers to face-to-face meetings, is that they are actually volunteers. They have to take time off work. They have to convince their families. It is something that is very onerous on volunteers. And unfortunately, that means the voice of the individual Internet user is often forgotten in those debates, where you have organizations, where you have the private sector, where you have governments that are funded to go to these meetings. The individual Internet users often have to self-fund, or try to fight for some kind of sponsorship that they might find somewhere. I have heard of volunteers having to share rooms, three or four people to a room, because they could barely put the money together to actually get to a place – and then constantly having to cut down on costs.

The other problem with volunteers is that they often don’t have the actual knowledge required. They are not versed enough on a legal basis, on a political basis, on a technical basis. They are therefore quite lightweight when you have a debate that involves the private sector, which has experts and lobbyists, and where governments bring seasoned politicians. It is therefore very hard to get a balanced multi-stakeholder model if you don’t invest in the volunteers, invest in the users – in teaching them what this is all about, teaching them the importance of this dialogue that is taking place, and also enabling them to come to the table and talk to us. That is the gist of the problem. We cannot really seriously talk about user rights if we don’t have the users at the table.

>> MATTHIAS TRAIMER: One question comes to my mind that I would also like to discuss with everyone here in the room. You always speak about “the users.” But who are the users? Is there an average user, of whom you can say the average user knows this much? Or is there such a big variety of users? Maybe we can clarify that at a little later stage.

Last but not least, the Ministry for Foreign Affairs. You are a government representative, more or less, but you are also a Human Rights lawyer. When you hear all these various ideas, what would you say, finally, is the role of the state? It was just said that people have to learn about Human Rights. Is it also your mission, maybe, to teach people what their Human Rights are? What role does the state have to play in this context?

>> JOHAN HALLENBORG: I think first and most important is, of course, that we have obligations as laid down in international law. My country is a state party to all of the most important Human Rights conventions. And these are rights that exist in reality; it is not just something on paper. So our responsibility is to make them real. That includes awareness raising, of course it does. But more important for a government is to secure the structures – the rule of law – and to make certain that it works, so that people can get an effective remedy, which we have been talking about.

In our work here, we seldom speak of Internet rights. We use the terminology that exists in international law, Human Rights. We do not believe that there is a need to create new rights. We strongly believe that the existing Human Rights are sufficient. And they apply online as well as they do offline, which is sort of a political catch phrase, maybe, but true.

The challenge we face is how to apply them online in certain specific conditions. And this is why we welcome the work that is done in the Council of Europe. We’re engaging in the work. We will try to be active in this Working Group if we can. I think it’s a good idea to pull together the existing standards that are already out there and to make these available to people.

I think you shouldn’t underestimate the power of the Council of Europe. This is 47 states which are very, very different in their nature, in their political systems, in how they relate to openness on the Internet. And to get those 47 to agree on embarking on this project is a feat in itself. So I think this is important to remember.

Otherwise, we will continue to work also outside of the Council of Europe, from Sweden. We have been engaging very much in the UN, in the Human Rights Council. We believe that the Human Rights Council, as the UN’s most important policy-making body on Human Rights, needs to engage on freedom on the Internet.

We’ve been addressing this issue from a freedom of expression angle. We have been doing cross-regional statements, together with other countries in the Council. We have managed to get a panel discussion in the Human Rights Council for the first time ever. It’s been on the agenda of the Council. And in this current session, which starts on Monday, we together with a core group of countries from all over the world will present a draft resolution for the first time on freedom of expression on the Internet.

So we believe the Human Rights Council needs to play a more active role. But we will also continue to engage in the broader discussions on freedom of the Internet. This year, in April, we arranged an international conference, the Stockholm Internet Forum on Internet freedom and global development. This is our contribution to the broader debate on the Internet and freedom online.

Our point is that it is also important to bring the developing world into these discussions – something that Olivier has talked about. So what we wanted to highlight at this forum is the importance of linking Internet freedom to development in its many facets: political participation and political development, but also so many other aspects of development – fighting poverty, creating a good climate for innovation, trade, et cetera. That is what we wanted to highlight with this international conference. We had people from 80 countries. So that was an important thing that I wanted to mention. There are several facets to the work, but we approach this issue from a Human Rights perspective, with the obligations flowing from international law.

>> MATTHIAS TRAIMER: Okay, thank you very much. We will now come directly to your questions and remarks – I already see this gentleman over there. Now that we have tried to build a little understanding of the rights that exist, we really come to this question: the efficiency and transparency of rights. And as I said, I would highly welcome it if you brought in a personal experience, one or another, that you have had on the Internet. May I ask you to introduce yourself, please?

>> Look at me. I’m a user. And I don’t really know, for a moment, whether I have been insulted, whether my rights have been handled in a wrong way. I don’t really know what my rights are. Where can I, the user, turn? Whom should I ask? What can I do? What are my rights?

>> MATTHIAS TRAIMER: Good. Thank you very much for this core question. Who wants to react directly?

>> OLIVIER CRÉPIN-LEBLOND: I think that is a very good question. You should be told what your rights are. You should be empowered with your rights. There should be a system for you to know that you have the right to speak and that you have a place in the debate, in the multi-stakeholder dialogue that is taking place. But following on from the discussion in the bigger hall in the previous session: we saw that the press, which at the moment has the biggest power – it is the third power out there to really spread information to the wider population – sees the Internet and the new multi-stakeholder model as somehow a threat. Which means there is no help from them to let you know what your rights are. So a conference like this one might be very successful in its own sense, but it is often the same faces that participate. And “users,” with an s – there is no typical Internet user. The wider users do not even know that this is taking place.

>> MATTHIAS TRAIMER: I don’t think you really got the answer, because you wanted to ask: where can I find my rights? But it was a kind of attempt. Maybe we can also ask Wolfgang. Could you answer this gentleman’s concrete question? He wants to know: what risks are there for me when I enter the Internet, or when I use it? What advice would you give?

>> WOLFGANG BENEDEK: The compendium is not yet ready, so we have a variety of possibilities and sources. But I would like to give an example, because these brochures from the Council of Europe are out there; everybody can look at them. There are two recommendations, very recently adopted, with regard to Human Rights and social networking services and with regard to search engines.

>> MATTHIAS TRAIMER: And you can find them on the Internet, right?

>> WOLFGANG BENEDEK: You can find them on the Internet. And when you look into these documents, they also contain a catalog of rights of Internet users. For example, for search engine providers – we heard about the transparency report, so there are governments who ask for restrictive measures, yes? – according to the guidelines, service providers like Google should only discard search engine results in conformity with the rules of Article 10 on freedom of expression of the European Convention on Human Rights. Which means we have very clear criteria on restrictions which such search engine providers should, at least in Europe, take care of.

Now, I would be interested to know if the requests you get from the government, if they themselves are actually taking proper note or respecting these obligations which we can find in the guideline.

Or, to give another example, with regard to social networking service providers: it says that users should be enabled to control their information. They should always be able to withdraw the consent which they give when they sign the terms of service – a kind of contract – with regard to the processing of their personal data. This is an outcome of the right of informational self-determination, which has been broken down in these ways. So these are two areas which can explain what user rights mean in practice.

>> MATTHIAS TRAIMER: Briefly, please, because I actually have two further questions.

>> MERYEM MARZOUKI: I think what the gentleman was asking for was some very easy-to-understand tool or document – like at an airport, where you can read what your rights as a traveler are.

And I think this is the very objective of the compendium project of the Council of Europe.

Also, because I teach at the university, I would like to mention that raising awareness in general is very good, but having real human rights education as part of the curricula at school and at the university – according, of course, to each level of understanding of the students – is very much needed. And we should do this for the Internet.

>> MATTHIAS TRAIMER: Thank you. Okay.

>> Well, I’m very grateful. I think I could find my rights with a little help from the things you suggest. But when I know my rights and see that they have not been respected, whom do I turn to next to actually obtain my rights? I mean, should we have something like a users’ ombudsman?

>> MATTHIAS TRAIMER: Okay. We’ll come back to this, because it is exactly the core of our discussion. One second.

>> Thank you, Matthias. Users should have these rights – but do they actually have them?


>> WOLF LUDWIG: I think the point Meryem raised just now is one of the basic considerations. I appreciate everything that the Council of Europe has done over the last years, that APC has done, et cetera. But please be honest, and please be aware: we are a small elite of people who know about these rights. Having rights and being aware of your rights are very, very different things. As Meryem pinpointed exactly, it would be – from my point of view, my vision, my dream – that one day, like the basic rules for passengers you have at every airport, such basic rules for Internet users, like the 10 principles approved last year, would be on every computer: when you start the computer, you have your 10 basic rights before your eyes, day by day. And then there must be somebody behind it to whom one can complain and who can enforce the users’ rights. Thanks.

>> MATTHIAS TRAIMER: Okay. We’ll see how practical this is or not, because yesterday there was also a discussion that as soon as transparency or something like it appears, the user clicks on it and says, “Well, I don’t want to read about that, I have no time.” I think the Swedish post and telecom regulatory authority, or agency, let’s say, is also dealing with transparency; a little later you can tell us about it. Thomas Hendi.

>> Thank you, Matthias. I am also from Austria, which is why we know each other well. My question is for Mr. Pancini from Google. Our friend has explained what the compendium is about. Could you imagine Google actively participating in the deliberations? And then, of course, the million-dollar question: could you imagine Google putting the result on its website in a way that you can easily retrieve it? Thank you.

>> MATTHIAS TRAIMER: I’m sure you won’t say no to this question. But please, tell us: what could you concretely imagine?

>> MARCO PANCINI: Yes. Without saying yes or no, I can say that Google is committed – as the panel can testify – to all the processes of the Council of Europe. We are actually looking at these initiatives as a source for us to understand better the framework in which we need to move, looking at citizens’ rights online, and for users to gain awareness on this topic. So the answer to the first question is yes, for sure, we are interested in working with you on that.

I think it is very important, when you talk about technology, to come with a specific ask. But on the simple ask to be prominent and relevant in the search, I see it the other way around: I think content that is relevant will automatically be on top of Google search when you look for it. So I am expecting that the outcome of the work will be so good that it will come out on top of Google search. What we can do, without involving the search, is work together to raise awareness once this work is done. And we are more than happy to discuss that.

>> But can it be, for example, just a direct question, that Google knows more about me than maybe I know about myself?

>> MARCO PANCINI: No. I think it’s the other way around. I think it’s about you, as every citizen using our services, deciding independently whether you want from Google a more detailed and personal answer to your question, or whether you want to use Google in a more anonymous way, without sharing information with Google. So we want to provide the user with the opportunity to make this choice.


>> MERYEM MARZOUKI: One quick addition about this point, and I will target not only Google but also Facebook, if this can help.

>> MATTHIAS TRAIMER: And there are also other companies, let’s say.

>> MERYEM MARZOUKI: Of course, of course, but they are the big ones. I’m sitting as an observer in the Council of Europe group on new media, the group which has prepared some of these recommendations. We worked internally on a recommendation on what could be a new notion of media, and also on protecting Human Rights in search engines and on social networks. And we would have liked very much to receive real information from these kinds of companies. You say the user is free to decide. But you are free when you are very well informed, when you understand what is at stake, what is going on. And even though I have been in these circles for more than 15 years, I’m not sure I know exactly what information Google has on me, or what Facebook has; not much, because I’m not giving much to Facebook. But still, what we want as citizens is that you, as a company, as an intermediary, implement privacy by default, for instance. I don’t necessarily want you to use my history of Google requests to provide me a better service. Maybe I want the minimal service. And this is my right. So it applies to Google, it applies to Facebook and to many other companies.

>> MARCO PANCINI: Just two things quickly. First, in relation to this process, this specific process, I spent, I would say, two weeks working on our contribution for this paper. And I was extremely happy to do it, because I believed that this was a very important process.

So, again, I agree on the importance and on the need for involving energies and the resources on this.

On user information: user information is the key. We are working on that. It is not perfect yet, but we are going in this direction. And again, on raising user awareness, we launched a campaign called “Good to Know” in this connection. User awareness and user choice are the key.

>> MATTHIAS TRAIMER: For example, when the user says to the company, I want my data, I want my rights, there is also the question: is this just up to the company itself, or is it maybe a question for the state, that the state has to intervene, so the transparency topic as such? May I just ask the authority; I know you’re very much aware of this. Can we have a mic over there for this gentleman? What you say is important: what should authorities do in this context? And please feel free to raise your hands if you want to come directly on board. Please.

>> Thank you. Well, we’ve been hearing in the discussion here that there seems to be some bewilderment concerning, first of all, what are my rights? And, secondly, what are the remedies when my rights are breached? What I’m doing at the moment, and I’m up to my elbows in working with transparency, is trying to approach the problem from a little bit of a different angle. How do users, even when they are aware of their rights, actually know if their rights are breached? And this is especially true when we’re talking about such a variable service as Internet access. If a certain service or a certain web page can’t be reached at a specific moment, how am I to know if it’s due to some temporary network congestion or active blocking by my ISP? And that’s what we are trying to achieve now in working with the transparency issue concerning one of these gatekeepers, which is the ISP: trying to make the quality parameters of this best-effort service more apparent and more clear for consumers. And it’s a challenge to fix these parameters in a contract, such as download and upload speeds, traffic management policies and so on.

>> MATTHIAS TRAIMER: Okay. Let’s come back to the question of the gentleman who said, I’m a user. Yes, we all are users, but you took the part of really playing the character of the user, right? And you were talking about the legal remedies. So where can I go? Can maybe one of the law experts, first of all, say what you would say to him? Because if you say, go to the Council of Europe’s court, maybe, well, it sounds a little bit abstract. So could we just go into this practical area. For example, I see pictures of myself completely drunk. It never happened in my life, just theory. Completely drunk, but this was not posted by myself; maybe one of my really close, close friends who I love so much said, ha-ha, this is drunk Matthias at the party. Take this as one example among others. Do you want to start?

>> WOLFGANG BENEDEK: A more general answer, and maybe we will hear more from the companies on their mechanisms. In general, the first step is that we need to know our rights and to establish those rights. And I think that’s a process which is happening. You mentioned the principles by the Internet Rights and Principles Coalition. That helps to create awareness: people have rights. And then obviously they are asking, what are my mechanisms? How can I get hold of my rights? And here it’s much more complex, because the range of mechanisms goes from the European Court of Human Rights to some hotline, whatever. There are hotlines out there where you can complain. It’s a business of its own how you deal with the work of such hotlines. But at least you have somebody on the other end. And often you don’t have somebody on the other end.

And I think there are big deficiencies when it comes to the issue of remedies. And why should companies provide such remedies anyway? We have the rules; for example, there’s a right to delete. That is a rule which will be more and more established now. But how can I enforce that right to delete certain data? I think that we have indeed a lack of reliable mechanisms in that field. Companies will, however, do something, actually in their own interest, because what they need is the trust of the consumers. And if they lose that trust, then they lose business. And this makes them go forward.

So if the consumers, if the users ask for such mechanisms, I think they have a chance also that companies will develop in that way. But how this is done, let’s ask our friends.

>> MATTHIAS TRAIMER: We’ll ask our friends, and I may just add something to my question, to keep it practical. We also have here, I think, Thomas from the European Commission. Don’t worry, I’m so bad at names. The concrete question that I also want to bring up here is, for example, this tracking question, the question of the right to be more or less forgotten. This is often discussed, also by Viviane Reding and so on, within the EU context. When I put things on the Internet, and we all know these stories, this might be some kind of harm to me because I don’t want to be reminded of it. Are we talking here really about a kind of rights that have to be introduced in a new way? Or is it more an interpretation of rights that already exist, like the right to privacy in the European Convention on Human Rights? So is it more a kind of making them more precise, or is it something we have to add? And if we have to do so, what are my remedies? Meryem?

>> MERYEM MARZOUKI: Yes. Regarding the case of privacy, or more exactly the protection of one’s personal data, we have had laws in Europe since the 70s, for the older ones. First of all, I have to mention that we have administrative bodies, the data protection authorities at national level, that have been established by law in all these countries to help citizens exercise their rights and to have some remedies. It’s quicker, easier and cheaper for the citizen than going to court. So we have this. And specifically the right to access one’s information, the right to modify this information or to have it deleted: this is in the law currently, in national law, in the European directive that is being revised. And it’s also in the European convention.

Now, regarding this right to be forgotten, there is an issue here. On the one hand, it’s easy to frame it, it’s easy to understand it. If I want to be forgotten, just leave me alone and delete all my data. So it’s easy for the user. But this right is already provided. And this is the approach of the Council of Europe, for instance: in the modernization of Convention 108 for personal data protection, they deliberately made the choice not to introduce a new right to be forgotten.

The European Commission in its proposal for the new regulation has decided to put forward this right to be forgotten. And I’m sure there are many countries, including mine, France, that are pushing for this.

>> MATTHIAS TRAIMER: Don’t you have it in the national law already in France? Something like the right –

>> MERYEM MARZOUKI: Yes, but not as a right to be forgotten. There have been charters and things like that, but not in the law. But there could be a danger here with respect to freedom of expression. Let’s be clear: if the right to be forgotten means that I’m asking a company to delete data that I myself have put online, then it is fine. I want to forget about what I did when I was young, et cetera.

If the right to be forgotten is about having the right to tell others to remove some data, then we could have an issue. It’s fine, of course, in the case that you mentioned: someone, my friend, put up a photo of me that I don’t really like and, more importantly, that my next employer won’t like. So this is an issue.

But if it’s about telling someone, you cannot criticize me on the Internet; let’s say that I’m a politician, I’m a public person, and I don’t appreciate some criticisms; then I shouldn’t use the right to be forgotten to actually violate freedom of expression and freedom of information. This is the danger. And the danger is real. This is not the idea of the European Commission as it formulated it, but this is the idea behind the discussion as framed by some countries.

>> MATTHIAS TRAIMER: Okay. We’ll give the floor in a minute to the European Commission, but I have a question from a remote participant, please. Can we have a mic here, please? I’m very sorry. Yes, and two more. I noted you. So, first of all, remote participation.

>> AMIR MUSAYEV: My name is Amir, I’m a researcher at university. I’m helping out with remote participation today. We have a question from Alexander.

>> MATTHIAS TRAIMER: Is the mic working? I think it’s not working.

>> AMIR MUSAYEV: I’ll speak louder. We have a question from Alexander from Europe. He’s asking a question about what you were talking about now, and I guess we can get more specific on that with this question: “In case a person sees offending information about himself, can this person actually go to the police? Or court? Or any other place where he can ask to remove this information? I mean really offending.” You just mentioned pictures.

>> MATTHIAS TRAIMER: Okay. And we’ll collect two further questions. This lady who has waited so long; sorry about that. Unfortunately we have only one mic? We have two? Okay, that one isn’t working. But this guy looks very sporty, so I completely trust in him. Yes. Is it on?

>> My name is Ashley Winter. I’m here on behalf of the NGO Alliance on Child Safety.

Well, I have maybe some provocative questions or comments. First of all, our main concern, of course, is the children. And when we speak about user protection, and when I hear a statement like “we can’t talk about user rights without having them at the table,” so to speak, then I simply think this is an ideal goal we are never going to achieve, because we have vulnerable groups; not only children, but of course our issue is children. We have other groups who are definitely not able, or not yet capable, to participate in such a process. And we are not going to educate 2.1 billion Internet users; that is also not going to work. So I clearly see a responsibility on the side of industry to follow the principles which are out there on ethical principles for international business, the Ruggie Guidelines, for example, to implement Human Rights, children’s rights, the rights of those who are vulnerable and not able to really participate like we are doing here, so that these rights are secured and guaranteed. And of course there is also a responsibility on the side of governments to check whether voluntary instruments are enough or not. Maybe this is not yet decided. And basically the Council of Europe compendium, I think, is a good step forward.

>> MATTHIAS TRAIMER: Okay, thank you. The gentleman over there, please? Can you just hand over the mic to him? And then we’ll see.

>> Thank you. I’m from the Swedish national police board. Let’s get down to practice here for a while.


>> We heard very many nice words from the –

>> MATTHIAS TRAIMER: Are you acting like a TV policeman?

>> I’m a very concrete policeman.


>> MATTHIAS TRAIMER: So tell us about it.

>> We have heard a lot from the stage here about regulations and all the good things they can bring to consumers. But I’ll tell you two very concrete examples.

Number one: a person gets abused over the Internet. And the big search engine, no names of course, indexes it. We manage to help this victim get rid of the material from the actual website, but the big search engine still indexes it. And everyone who wants to find the material can find it again and again.

When Swedish law enforcement tried to get in touch with that big search engine, no reply.

Example two: identity theft. Someone steals your identity and then uses it on social media, a big one.

>> MATTHIAS TRAIMER: What does he do, for example, when he steals your identity? He takes your profile?

>> Yeah, right. He creates a new profile in your name and puts the most interesting points of view in there. If you, as a Swedish policeman, try to get in touch with that big social media company, then you get a standard answer: “Just provide us with a subpoena and we’ll help you.”

Just a note on the question: did the European Parliament, the European Union or someone else ever think about talking to the big companies on the other side of the Atlantic? Because that’s our main problem today. The problem is not a lack of legislation in Europe; it’s the relation to the U.S. companies. Thank you.

>> MARCO PANCINI: I need to answer this.

>> MATTHIAS TRAIMER: You are an American company.

>> MARCO PANCINI: This shows me that we have two big tasks. First, to increase awareness around our tools, because we offer a tool on our site that in one single click allows you to arrive on a page where, if the content was taken down from the source, you can tell us which link you want removed from the index. So there is no need to go through all that. I will show you just after the session.

But, again, it is our fault, because we need to inform you that we have this tool at the disposal of our users so they can take action.

Secondly, there is huge work to do in order to coordinate communication between law enforcement and the industry. I fully understand you want a contact with us, but there are thousands; it’s not your fault, for sure. It’s the fault of the lack of coordination at the European level in the fight against cybercrime, in putting the industry in contact with law enforcement when you need it. You are perfectly right. Identity theft is the same thing. I’m sure that if there were coordination, cases like the one you have just told us about would be solved very easily.

>> MATTHIAS TRAIMER: Okay. I just want to give the floor directly to the European Commission, because it was also asked: what does the Commission do?

>> Yes, I was just going to clarify this issue with the right to be forgotten, because this is actually the issue that we’ve been strongly discussing at the Commission for a very long time. And we actually do not really agree that users’ rights are that strong in relation to the possibility of being forgotten, of the right to withdraw our data from the Internet, for instance. And that is why we thought that a particular provision on this would be useful.

We had a long discussion about how to frame it. We decided to make it a very light right, with lots of exceptions. So we fully recognize that there is a danger in balancing the right to be forgotten and freedom of expression. But still, we decided to do it in a very light way, with a lot of exceptions, and to take this opportunity, this experiment, and see how it will work.

>> MATTHIAS TRAIMER: Thank you. I take two more questions, then, and then get back to the panel that they can answer.

>> It’s not a question, it’s just a comment. I’m using my chance. So, as a 19-year-old active online user and as a law student, I can express my opinion about users’ rights. I think that if we say that there is no separate digital world, that the world is one big world, then I believe that there are no Internet users’ rights. There are only human rights, which are universal. And they work for the Internet, I think. Being protected is a general human right, and the methods of protection, I think, have to be the same offline as well as online. An example: being bullied on Facebook. I should be protected in the same way as if I was bullied at school.

>> MATTHIAS TRAIMER: On Facebook you go voluntarily. To school you were forced to go.

>> It’s another story, I think.

>> MATTHIAS TRAIMER: Sorry, I didn’t want to interrupt you. Also for consideration.

>> I say that there should be the same methods. I think that there are no Internet users’ rights. There are only Human Rights. Thank you.

>> MATTHIAS TRAIMER: The gentleman next to you, did you want the floor? No, you were just waving at me. I wave back. Good. Any more contributions? Then I come back here. Please, Meryem.

>> MERYEM MARZOUKI: First of all, regarding the question that came from remote participation: I’m not a lawyer, but for me this is typically a case of public insult or defamation. And we have a very proven system to deal with this, which is the court. If this is at the national level, if the person who is defaming me is from the same country, then I can file a complaint with the court. But if we are in two different jurisdictions, then we come to the comments by the gentleman from the police: one of the major problems of the Internet is how to enforce the law when we have a case spanning two or more jurisdictions. And this is the main problem.

Regarding law enforcement, I agree that the better the coordination we have between Internet service providers or companies and law enforcement authorities, the more we can protect rights. But I would also like to remind everyone that there is something called due process, and the rule of law as we call it here in Europe; I’m translating from the French. Due process means that it’s not only up to law enforcement authorities, to the police, to decide that they should get some information, or that some content should be removed or de-indexed in the case of search engines. It should also be a court decision. Let’s not forget about this.

>> MATTHIAS TRAIMER: Okay. Anyone else?

>> OLIVIER CRÉPIN-LEBLOND: Yes, I wanted to address the lady’s point about reaching 2.1 billion users and the dialogue. Of course I’m not advocating bringing all those people to the table. What we should do is remove the barriers that stop those people who are interested from taking part in this discussion. And in order to do that, we have to remove the financial barrier and really make it easier for them to participate.

Yes, many users are not going to be able or capable to participate; but for those who want to participate, and I think that for each one of us sitting in this room, there are several of them out there who either don’t know this debate is taking place, or have not managed to secure funding, or have not managed to take the time off to come to this debate.

>> MATTHIAS TRAIMER: Who are not in the privileged club as we are.

>> OLIVIER CRÉPIN-LEBLOND: Yeah, the ivory tower. Anyway, the whole point is to try to break those barriers down, or at least to ease the effort that those people have to make to come here and work with us.


>> One last point. The point here is that, with the help of our development cooperation agency in Sweden, Sida, we actually brought several people from the developing world to Sweden to discuss these issues. So this is really something that we take seriously.

And to come back to the issue of users: these represent the users that we want to work with from an international law perspective. These are people who have Human Rights. And I agree with my friend over here that digital rights can also be a human right. But what we know for sure is that we have Human Rights that apply already.

>> MATTHIAS TRAIMER: Okay. Coming back to this: we have the rights. Nonetheless, because there were concrete questions, there seems to be a need for a more focused document, compendium, whatever it is. And I would like to take the chance now, both for the audience and because of some experts on the panel, to ask: what should really be integrated in such a compendium so that it is easily understandable for users? We mentioned just one right as an example, the right to be forgotten, whatever we call it. There are also other rights, about the question of access, easy access and so on. Could we have in this round now concrete advice, let’s say, from the Council of Europe? I’m sure our people are already interested in it. There are already documents, but now we have the chance to reframe it a little bit. Okay.

>> From our perspective, it is important to use the existing Human Rights standards as the baseline for these discussions.

We are quite concerned if we talk about creating new Human Rights, because that means that we open up a framework that we already have. And this framework that we have already gives us, users and people, rights. And if you open this up for discussion, we know that there are many governments around the world that would love to infringe on these basic rights and create something much less forceful. So that’s why it’s important for us, when we work on this compendium, to work from existing rights.

Another thing is to understand how these rights work in an Internet context; that’s something different. We have some guidance from civil society. We have some guidance from the UN: the UN Special Rapporteur on freedom of expression has produced two reports on freedom of expression on the Internet. This gives us a baseline for discussion, an existing floor. And this floor, this framework, is not to be renegotiated.

>> MATTHIAS TRAIMER: Okay. What should be part of this compendium?

>> WOLFGANG BENEDEK: It’s not so easy to say because you rightly mentioned the issue of vulnerable groups. So when we work or when we want to communicate the rights to the kids, we have to use a different language. We have to address their particular situation.

If we do this for handicapped people, we also have to have easy-to-read language and so on. So it might mean that we need specific tools for specific groups. Nonetheless, there can also be general tools. But there is a question of awareness: how to make people aware that these are your rights. You also read these terms of service and so on, but they are so long that you hardly have a chance to realize what your rights are. But then the issue is, what are the mechanisms for making this work? And again, on the child protection issue, a mechanism has been built up over the last years in the context of Insafe. You have these hotlines; in Austria, for example, it’s called Stopline. You can complain about indecent, inappropriate content. You can also complain in Austria about racism and extremism, for example, incitement to violence and so on. Then somebody, and this is different in different countries, will look into these complaints and may react to them. But what you can see from that is that this is in addition to existing mechanisms, like court mechanisms and so on.

>> MATTHIAS TRAIMER: I think, to add: sometimes we need an ombudsman. In Austria, for example, we have the Internet Ombudsman.

>> WOLFGANG BENEDEK: I was just going to say that. Sometimes we need an institution like an ombudsman that you can address in order to realize your rights, also vis-à-vis, let’s say, industry and not only government. But this is something which does not exist in a general way. So I think there needs to be some innovation and some new ideas if we want these remedies really to be more effective and also accessible to users.

>> MATTHIAS TRAIMER: Meryem, and then this lady over there. Yes?

>> MERYEM MARZOUKI: First of all, I very much agree with what was said: what we should actually translate to the Internet, and make understandable and enforceable, are the current conventions on Human Rights.

All of them, not only freedom of expression, as some countries or some visions want to frame it. All of them, because they are universal, they are indivisible. But not more than that, because there is some danger here if we reopen the debate.

Secondly, how do I see this compendium of rights? I see something very practical. I’m a user, I run a blog. Or I’m a user, I write comments on the blog of someone else. And I find myself in a given situation where I think my rights have been infringed. What can I do?

For instance, I’m a Facebook user and I find out that my page, or some information I have put up, has been removed by Facebook, by the company. What should I do? What are my recourses? Is there an ombudsman? This could be an idea. Can I just file a complaint with the court? What can I do? What I expect from this compendium of rights, at least for part of it, is to give some concrete answers, some direction, some guidelines to all users or citizens.


>> Thank you. Is it working? Yeah.


>> Okay. My name is Susie Hargreaves, I’m from the UK hotline for removing criminal content. I think we work very well with industry and law enforcement because we’re a self-regulatory body and we’re a trusted force. So one of the messages we try to get out is that we actually can remove content in the UK, and it comes down in less than 20 minutes, because industry works excellently with us and so does law enforcement. It’s very fleet of foot, very fast. It’s to do with the commitment to remove the content. Industry does work very, very effectively in this regard. It is a very good model. It’s not tied down in bureaucracy and legislation. It’s a very quick, easy solution. But it is around child abuse material.

>> MATTHIAS TRAIMER: But may I, maybe not to you but to all of us, suggest an experiment. When you put into YouTube, for example, and I did it once in a special seminar when I was trying to explain this to young people, key words like “drunk teenager,” “drunk girl,” “drunk boy,” you get I don’t know how many hundreds of videos of kids, done by, say, other kids filming them and putting them online. The same as regards teachers: when you put in “choleric teacher” or “teacher is shouting,” just put it into YouTube, you will have all these questions of dignity.

So beyond looking at these legal remedies, there is also the aspect of how we can solve these problems as a question of education, of making people aware of this point. Should this also be part of the compendium?

>> Super quick. On the compendium and this exercise: we already have something in place called the Global Network Initiative. There is also the opportunity, in relation to Human Rights, for the Global Network Initiative to have someone looking into these violations and also assessing if this is the case. So it’s an industry initiative, but it could be an example of something that already exists where users can go for violations of their Human Rights.

Answering your question about what we are doing: we are working on two levels. On the first level, every time a user posts user-generated content online, there should be the opportunity for a user who thinks that this content is harmful, or in any case hurts their sensitivity, to let us know and let us take action in relation to this content. In terms of education, there is huge work to do and there is not enough focus on that. There are curricula that could be developed, both in schools and with civil society, to teach children.

>> MATTHIAS TRAIMER: Shouldn’t this be in the compendium, in the right to education?

>> MARCO PANCINI: I think education is exactly that. It is a multi-stakeholder exercise that needs to involve law enforcement, educational institutions, industry and civil society, coming together in different situations with the best solution for the different audiences. So I don’t know if the compendium is the right solution for that.

Otherwise, I think that local initiatives, going straight to solving the issues that you can have in a certain local area, could be the solution.


>> OLIVIER CRÉPIN-LEBLOND: I’m of the view that our own age group is probably already a lost cause. And I really believe that these rights should be taught in schools in the same manner as math, physics, chemistry, the same thing. The next generation are the people who will suffer even more from these things if they don’t get taught about them. So whatever consensus we reach with regards to the compendium and what rights we have, it needs to be taught in schools.

>> MATTHIAS TRAIMER: Okay. Thank you. I have a question or remark.

>> Yes, it’s a remark. Janice Richardson, coordinator of the Insafe network. You probably know Safer Internet Day, where we reach out to many countries. In 2013, the theme is online rights and responsibilities. So I think that this is really a perfect time to work on this. We have the structure in place to work with young people on their online rights and responsibilities.

So first of all, I’ve worked very often in classes and realized that teachers don’t know their Human Rights. So children certainly don’t know their Human Rights.

We did a book for the Council of Europe, a teachers’ guide on Human Rights, two or three years ago, but there’s been no great take-up.

What can we do to use the Safer Internet structure that we have in place to really have concrete outcomes from the compendium, from the discussion here?

>> MATTHIAS TRAIMER: Okay. Thank you. Did you want to take the floor? And then I have this lady. But maybe you have the mic, actually. Could you? No, you don’t have the mic. Okay. Can we give this gentleman over here the mic?

>> Just a brief comment on the question that you asked, and also the question that was asked before about specific Human Rights on the Internet. From our perspective, I would also agree that we don’t need any specific Human Rights on the Internet. But there is one specificity of the Internet: the involvement of the private sector is extremely important in the functioning of the Internet. And that is why our position is that we need a specific instrument, a specific discussion of some type on how to ensure that Human Rights are protected, because on the Internet private entities are actually, I think we can directly say, responsible for ensuring that Human Rights are protected.

So for instance, you’ve probably heard that we are working on the No Disconnect Strategy, which aims at combating the arbitrary disruption of access to the Internet. And in the framework of this action, we are now preparing guidelines for companies involved in trade in ICT products.

>> MATTHIAS TRAIMER: Thank you very much. Could you hand over the mic? Maybe could you help? Just the lady behind you sitting down. Thank you very much. This is self-regulation. Thank you.


>> I need to stand up, which feels nice after sitting for so long. My name is Inga Gundham and I come from the ranks of users, primarily. I’m a native of Sweden, but I’m definitely not a digital native, as you can tell from my countenance.

I have been teaching, during various parts of my life, IT or technology and getting onto the Internet, to all kinds of groups of people, citizens from different countries. They seem not to realize something very simple. I mean, you speak with very big words about fantastic things like Human Rights, which I appreciate. But even people who have been on the Internet for quite some time are not quite aware that each character you enter is there forever. Good users, so to speak, have blogs, they have websites, they have Facebook. They go out and tell the world that they’re going to Thailand on vacation for three weeks and are amazed when they come back and find an empty house. This is part of pragmatic user-level education and needs to be taken on board. On many sites, Skype and many of the other services we use, the privacy settings are semi-hidden to most users. There are ways of protecting yourself there, just by changing the preferences, but most people don’t realize that, because the default settings are public.

Another thing is that I don’t quite understand this right to be forgotten, because from this perspective, I don’t see how you can ever be forgotten. I’ve heard people say, well, you know, Inga, I can always go and delete my web page. But it’s been out there for three years; it may be anywhere in the world. And what about the big databases that Google and other search engines and service companies keep? Thank you.

>> MATTHIAS TRAIMER: Thank you very much, also for reminding us that when you enter a particular website like Facebook, people often don’t care how much information they give away. Thomas, did you want to comment?

>> Yes, thank you, Matthias. I want to come back to the issue that we discussed a few minutes ago about a curriculum, so that in schools not only the technical abilities would be taught, but also what I would call the ethical way of using them. And indeed, it might be of interest here that, also as part of the implementation of the Internet governance strategy of the Council of Europe, a process has started to elaborate such a curriculum. It is for training the trainers and teachers, and there is of course a lot of NGO involvement in this undertaking. And I think, as our discussion has shown, it is really complementary, because we need both the compendium and more awareness among the young generation of what to do on the Internet. Thank you.

>> MATTHIAS TRAIMER: I just looked at the time. I would say we do not need formal closing words about Human Rights in the abstract, although I am a Human Rights lawyer myself, so I always praise Human Rights. But nonetheless, if you want to take the floor again, please do so with real, practical hints for putting together this compendium. Maybe you can also name one thing, not a new right, but something that should be included by way of interpretation of existing Human Rights. Because I think that is one of the very important outcomes of our meeting here: do not invent new rules or so-called new Internet Human Rights, because it might, to the contrary, be used by states and governments as an argument that we need a special new right, which then justifies restrictions.

So in real practical terms, what would you say should be in there?

>> MARCO PANCINI: Super quickly. I would take into consideration the work done by Frank La Rue, and the Charter of Human Rights and Principles for the Internet that was prepared during the discussions at the IGF, which could represent, again, a great point of reference for this work. And involving all the stakeholders in the discussion is again a great plus, which in these days, when we are discussing Internet governance models, is still the best way to address issues relating to the online environment.


>> OLIVIER CRÉPIN-LEBLOND: I’ll be very quick. Enabling participation and enabling education are the two biggest things as far as I’m concerned. If you do have those two really pushed forward, you’re going to see some change in the wider public.


>> JAN KLEIJSSEN: The main objective is to secure the free flow of information in line with Human Rights. That is absolutely the main goal. But I think a subset of goals is to decrease the gap between the more traditional Human Rights activists, lawyers and proponents, and the digital rights proponents and activists.

Now, I’ve seen this gap closing, but it is closing slowly. I think discussions like this help to close it, and that is a subset of goals that we should continue to work on. Because governments are part of the solution and not always the main problem.


>> MERYEM MARZOUKI: Yeah, I think what we need in practice is to translate the current Human Rights and see how they apply at the different levels of the Internet, including when using a certain Internet protocol or certain algorithms.

And keep to the basics: Human Rights, democracy and the rule of law. Again, let’s not forget about due process. I’ve heard about hotlines and the Internet Watch Foundation and so on. There are huge problems with these hotlines, because they involve private decisions by private companies, sometimes together with the police, but without really respecting the rule of law and due process. So let’s keep to the basics.

>> MATTHIAS TRAIMER: Okay. Last but not least Wolfgang?

>> WOLFGANG BENEDEK: Thank you very much. I’m personally very involved in Human Rights education, and I can say that there is not much of a difference: we also have the problem that many people are not aware of their Human Rights, and we need Human Rights education everywhere in order to remedy that. My main concerns are therefore, first of all, that we have to make people aware of their rights. Here the compendium should make a contribution.

Second, they have to know what their remedies are, and whom to address in order to make use of those remedies. And, looking at the complexity of the Human Rights and users’ rights that are out there, the real challenge will be to break these down in a digestible, easy-to-use way. How this will be done, we shall see.

>> MATTHIAS TRAIMER: We’ll see. We will check it again, maybe at the next EuroDIG, and see whether some of the ideas that were brought forward today are integrated. But let’s think positive. We won’t save the world with such a compendium; problems will still exist. For me, one of the key issues is also the default settings, as you said. It is still one of the big issues that you should not have to search in a complicated way for where you can change them. Education, of course, access, and then of course this question of privacy. These are not new things, but it is absolutely important to break them down and to see what they mean for users.

So I wish you all happy use of the Internet. And, well, be careful, and also complain. You have, for example, those people from Google who say: we are ready to listen to you and we will react. Wolf, is this now a statement or is this housekeeping?

>> Housekeeping.

>> MATTHIAS TRAIMER: Then you are free to speak otherwise I would have censored you because I want to finish the discussions, of course.

>> I would not dare, knowing you would censor me for a final statement. Just some very simple housekeeping. We have the lunch break now. And as you can see from the programme flier and on the website, there are two parallel plenaries after the lunch break. At the same time, two Flashes are announced. Flash 9, on cyber security, will be canceled because the focal point is ill and couldn’t come to Stockholm. Instead of Flash 9, we will shift Flash 12, on GLAM: galleries, libraries, archives and museums. It is a demo case of how GLAM and commons models work in various countries. So we will move it from the last slot into the place of Flash 9. I wish you a good lunch break, and see you for the second part.

>> MATTHIAS TRAIMER: I say so as well. Thank you for your attendance. Thank you.