Transcript: Intermediaries and human rights - between co-opted law enforcement and human rights protection


FINISHED FILE

EuroDIG 2016 BRUSSELS, BELGIUM 10 JUNE 2016 2:30 PM LOCAL TIME PL 3b: INTERMEDIARIES AND HUMAN RIGHTS - BETWEEN CO-OPTED LAW ENFORCEMENT AND HUMAN RIGHTS PROTECTION GOLD HALL

SERVICES provided by: Caption First, Inc. P.O. Box 3066 Monument, CO 80132 1-877-825-5234 +001-719-481-9835 www.captionfirst.com

This text is being provided in a rough draft format. Communication Access Realtime Translation (CART) is provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings.

>> MATTHIAS KETTEMANN: Ladies and gentlemen, welcome to Plenary Session 3b: Intermediaries and human rights, and their role between co-opted law enforcement and human rights protection. I'm Matthias Kettemann from the University of Frankfurt in Germany, and I would like to welcome my panel, my Remote Moderator, Matei Vasile, and my reporter, Ana Gascon Marcen. Thank you very much. Thank you for joining us here after our fantastic lunch. Have you heard of the pregnancy phenomenon? Once you or your loved one is pregnant, you start seeing pregnant people everywhere. I've had the same experience with intermediaries. Once you start working on intermediaries, you see intermediaries everywhere, and you also see legal approaches to regulating intermediaries everywhere. And just like a pregnancy, sometimes that creates problems. It's a challenging time for intermediaries. The Council of Europe just installed a Committee of Experts on Intermediaries. On the 1st of June, it published a 47-state study on blocking, filtering, and takedown of Internet content. Just last week, the European Commission and IT companies announced a Code of Conduct on hate speech. And just earlier this week, the Special Rapporteur on freedom of expression, David Kaye, published his report on freedom of expression and the private sector in the digital age. During this session, we will try to look at the role and responsibility that intermediaries have online. They offer, organise, and police the spaces in which communication takes place. But we're also in a position where they have progressively taken on a role in ensuring human rights, and they can also endanger human rights. 
David Kaye in his 2016 report wrote: The contemporary exercise of freedom of opinion and expression, and of course of other human rights as well, owes much of its strength to private industry, which wields enormous power over digital space, acting as a gateway for information and as an intermediary for expression. Should we limit this strength through norms, or can we rather somehow harness the power of intermediaries for the common good and for the protection of human rights? And if yes, who should harness that power, and under which conditions? To discuss this we've assembled a podium of renowned experts on many of the issues touched upon by intermediaries and by intermediary liability. But we also count on you, our audience here in Brussels, and the remote participants at their computers and mobile devices worldwide, joining us through different kinds of access intermediaries. So let's have a little meta-discussion, using intermediaries to talk about intermediaries: share your thoughts on social media about social media, and use your access to contribute ideas. Let me now turn to our first participant, Allon Bar. He will talk about some of the key challenges facing intermediaries and their accountability. Allon is from Ranking Digital Rights, joining us from Germany. Allon, you have the floor.

>> ALLON BAR: Thanks very much, Matthias, and hello everyone. It's very nice to be here and talk about this topic that is on all our minds every day, I'm sure: intermediaries and human rights. It's an important topic because we've all recognized that the decisions intermediaries are taking really have a big impact on how we enjoy our right to privacy, our right to freedom of expression, as well as other rights. And rather than talk only in vague terms of intermediaries, maybe it's more concrete to talk about mail providers, search engines, social networks, et cetera. With Ranking Digital Rights, we're trying to get more insight into what intermediaries are actually doing with respect to human rights. There's often this discussion: okay, human rights may or may not be affected by intermediaries, but what does it actually look like? We've tried to set standards to make it very clear what we expect from intermediaries: how should they hold themselves accountable to the outside world, to the users, to other stakeholders, in terms of how they treat the users' right to privacy and right to freedom of expression. So we've set these standards and measured how globally operating technology companies deal with privacy and expression issues. We released our Corporate Accountability Index, looking at how 16 companies around the world deal with these issues. I want to share some highlights to give context to this discussion about what it looks like when we're thinking about accountability and intermediaries, and how intermediaries actually deal with these human rights issues. The main headline from our Corporate Accountability Index has been that there are no winners. 
Basically, across the board, the industry is not performing very well when it comes to holding itself accountable on the rights of users, the right to privacy, and the right to freedom of expression. To delineate the bad news further: companies are not performing well on privacy and free expression, and users are usually left in the dark as to how companies deal with their information, what data is collected about them, how they share that data, and how long they retain that data. These pieces of information, which are crucial for a user to understand how their rights are being treated, are either missing or unclear. Further bad news is that inasmuch as companies are releasing transparency reports, these reports may deal with Government requests, but they don't deal with private requests for user information or content restriction, and they definitely don't deal with terms of service, which is another critical area of how companies, for example social networks, restrict content that's transpiring online. I hope in the discussion we can touch on the Code of Conduct that was instituted a few days ago by Internet companies together with the European Commission, which has a lot to do with terms of service and the lack of transparency around them. There's also some good news. The good news is that all companies are doing something right; there's no company that gets zeros across the board. There is an increasing number of transparency reports being released, even though, as mentioned, mostly those concern Government requests. Another piece of good news is that for all companies there are easy ways in which they could improve. So what we're doing is not just saying: "Okay, here's your score. You're not doing well enough." We want to show companies a clear pathway to help them understand what they can do to do better on these issues, so that in the next Index we release, they'll be able to explain better to users how they deal with the human rights issues. 
Lastly, I want to touch on key recommendations we have from our Index. One is that we think companies should conduct risk assessments, meaning they should do human rights impact assessments as they are releasing products, but also as they're reviewing their own terms of service, to better understand how their products and services may affect users' rights, or, for example, when they're entering a new market, what that could do for the rights of the users they want to serve. Other recommendations we have are about institutionalizing commitments. Is there Board-level oversight on human rights issues? Sometimes there is. Often there isn't. We want companies to show they're serious about these matters and that their Boards actually have responsibility to oversee how the company deals with privacy and freedom of expression. Another key recommendation is that companies should not only be transparent about Government requests, but, as mentioned, also about terms of service enforcement and private requests that affect content restriction and user data. They should communicate much more clearly with users about what happens to their data. They should also establish effective grievance and remedy mechanisms. This is something that was also highlighted in the excellent report by David Kaye that Matthias just mentioned, which was released just a few days ago. Companies, since they have such a crucial impact on how users enjoy their rights, should also give users the ability to appeal decisions the company has made that affect their rights. Lastly, another recommendation I want to mention is that companies should show evidence of strong security practices. That is really key if we want to make sure that users' data is secure. Our full results and recommendations are available, and I'll leave it at this for now. Thank you.

>> MATTHIAS KETTEMANN: Thank you very much, Allon. We'll now look at what the different stakeholders really want from intermediary regulation, and whether we actually have a problem with the intermediary liability regime in the first place. For our first view on that question, I would like to give the floor to Maud Sacquet, Public Policy Manager at the Computer and Communications Industry Association here in Brussels. Maud, you have the floor.

>> MAUD SACQUET: Is it working? Thank you, Matthias, and hi, everyone. I think it's fair to say that the digital economy has provided a lot of opportunities to communicate and share knowledge on a scale that's never been seen before, and that many Internet companies and intermediaries have committed to protecting the freedom of expression of their users. But we are of course all aware that ICT companies face a lot of pressure from Governments to comply with regulations and policies that may conflict with the freedom of expression and the privacy of their users, and intermediaries are kind of stuck in the middle, which is not a very comfortable position, to be honest. So against this background, many Internet companies have done a lot of work to protect the freedom of expression of their users. To give an example, they are working today with Governments to promote freedom of expression. They're also calling for greater transparency from Governments regarding all the requests they receive for user data and for online restrictions. And they publish transparency reports, disclosing, as much as they legally can, the numbers and the nature of the requests they receive. We basically do whatever we can, but we also firmly believe that intermediaries should not be the police of the Web. It is not their job to lead on this issue. Their job is to create products and services for their users, but in practice they're often stuck in the middle and asked for a solution. That should not be their role. Their role should be to listen and to help find the solution if asked. Of course, ideally, what we would like is to see all stakeholders, and I'm quite aware that's not easy to do, but to see all stakeholders, Civil Society organisations, citizens, representatives, public servants, come together and discuss what kind of society they want, so that we can follow in the wake of this discussion and act according to the will of all the people. 
One thing I would like to add is that we'll have debates, but one thing should remain key in the discussion regarding the protection of freedom of expression, and that is the limited liability regime for intermediaries, especially, as we are in Brussels today, the e-Commerce Directive in European law. In my opinion, this regime is one of the most important legal instruments that we have in the Internet era, and it is key for freedom of expression. This regime protects intermediaries from legal liability for the content that is posted online by their users, and so it ensures that they're actually willing to host it. We are all aware as well that any kind of greater responsibility for these companies would ultimately lead to restrictions on access to platforms, for example. So we might actually limit the amount of illegal content on the Web, or pirated content, but at a very great cost which we're not willing to pay. And this is why we were worried last year when the European Commission announced it was going to look at whether platforms should be regulated, and whether a duty of care should be introduced at the European level. A duty of care has never, to my knowledge, been defined, but we understand it as some kind of monitoring obligation that would apply to platforms. So when the Commission published its communication on platforms last month, we were relieved to see that this wasn't mentioned anymore, and that the Commission decided not to regulate platforms on a horizontal basis, that is, with one law that would apply to all platforms. But we need to stay vigilant, because the Commission will now look at sectoral regulation, and this regulation could also harm or negatively impact this regime; I'm thinking of the reform that will come after this summer. So if anything, we think that for the Internet to continue to grow, liability limitations should be expanded. 
For example, currently intermediaries cannot take any kind of additional measure they would like on their websites to fight against illegal content or illegal activities without being at risk of losing their limited liability protection, because if they take any kind of initiative on their own platforms, a court could decide they are no longer neutral, passive, and merely technical, and so actually have knowledge of the content. We don't think that should be the case, and this is why we have urged the Commission to look at this question as well. To conclude and sum up, I would say that we strongly believe that intermediaries should not be the police of the Internet, but that they're kind of stuck in the middle, and what we would prefer is to follow in the wake of a discussion among all relevant stakeholders. If Europe wants to have a vibrant and connected digital economy, then it should ensure strong online intermediary protection across the EU. Thank you.

>> MATTHIAS KETTEMANN: Thank you very much, Maud. You've impressively shown that sometimes the law incentivizes behavior that doesn't actually lead to human rights-sensitive solutions. Our next speaker is Professor Michael Rotert. He's the person who sent Germany's first email, in 1985, so we're happy to have him here. If you ever have problems receiving emails, he's the one to call. He's now the Chair of eco, the Association of the Internet Industry, and he's done a lot of work for Governments, especially the German Government. I'd like to invite you to share your view.


>> MICHAEL ROTERT: Hello, everyone. The session after lunch is normally called the "graveyard session," and I think this fits the intermediary theme perfectly. We'll see what comes out. Intermediaries are targeted by many stakeholders and in many ways, and I have to admit, I'm an intermediary myself; I will come back to that later. Governments go to intermediaries when they need data or when they want to introduce mechanisms for controlling, blocking, or filtering the Internet, and a special case is the intelligence services. We have seen them at intermediaries' offices as well. They go directly, without asking anyone and without telling anyone, and the intermediaries are not even allowed to talk about these visits and whatever happens. But pressure also comes from other industries: they try to go to Governments first, and if they don't succeed, they try to go the other way around. I'm speaking about the rights-holder industry. They once called the intermediaries the gatekeepers. They were the ones bringing up this gatekeeper issue, because they tried to protect their outdated business models with the support of intermediaries. So intermediaries are not only supposed to be police, but should also support the business models of others. The problem for the intermediaries, and I think that's quite clear to many Governments, is that in many cases intermediaries can't help by providing data, as you mentioned before, without violating laws and norms, at least in many countries. And this brings me to the before-mentioned Code of Conduct. I looked at this Code of Conduct and said, well, it's nice if Governments come along this way. It's a nice try. But based on the information I have so far, the Code of Conduct is nothing I would recommend to my members at home in Germany, as long as I don't have more information. 
Some of the issues in the Code of Conduct are very questionable, and I think it doesn't work the way the European Commission thinks, at least not from what I know so far. The deputy role was already mentioned. When we discuss, between Governments and the Internet industry, the deputy role of intermediaries, I usually claim that the Internet industry has to pay a lot for regulations introduced by Governments: for data retention or blocking or filtering, for instance, industry has to invest in devices and other things, and the Government should invest at least the same amount to get the most out of these regulations. Otherwise it doesn't work. Just having the intermediaries as chief police or chief deputies doesn't work. And as a Government, I wouldn't rely on the data, or even on the requests, coming from a certain police force. There was a discussion I followed some time ago where it was questioned whether police from other European countries can go directly to intermediaries or service providers in other countries in order to speed up the search for evidence. This is something I think will not work unless there is a common understanding of the legal basis, or a common legal basis, in all the countries. Otherwise this will become a mess, because who knows whether it is really the police calling in from another country? No one. And it's hard to prove, although they say, well, we can have someone in between who can verify this. But this is one request that comes up from time to time from Governments to intermediaries, and there is discussion about it. I mentioned before that I'm an intermediary myself. My Association runs a peering point in Germany called DE-CIX, where around 400 service providers peer nationally. The reason I'm mentioning this is that the throughput on the switch is currently about 5 terabits per second, and I know that 5 terabits per second may not say much to some of you. 
It's about 600 gigabytes per second, and 600 gigabytes is roughly half the disk of a newer notebook model: a full disk, every second. And what we found out when discussing with Governments is that, for the largest intermediaries and those switches especially, the speed and the technology are mostly too expensive for Governments to get the data directly. What we have done, between Governments and intermediaries, is human rights guidelines for Internet service providers, which we are reviewing. There was a meeting yesterday where I explained why it needs a review. This is what we're working on, to give intermediaries and Internet service providers something real in hand, and this time we're trying to translate it into most of the languages of the 47 countries of the Council of Europe, and then we'll see. With the first version, at least, we had very big success: ISPs and intermediaries loved it. It was just the opposite of what Governments are requesting from intermediaries. Thank you.

>> MATTHIAS KETTEMANN: Thank you very much. I guess it's often a problem that something intermediaries love, Governments might not love too much, and Civil Society might not love either. But let's now turn to someone whose life sounds like it came from The Matrix: Karmen Turk. She works for the law firm Triniti, is a lecturer in IT law, and is known to intermediary people because she was the lawyer in the Delphi case, Delphi being the Oracle, which gives us the connection. In terms of platforms, for special intermediaries, what is their biggest fear? What do they want?

>> KARMEN TURK: Hi, everyone. I'm sure it's hard to listen to us, but bear with us a little bit longer, because we do actually have a point in general: you can't use the Internet by yourself; you need something in between. So it's a crucial subject that needs to be resolved, and it can never be resolved at the level of one state or a group of states, so it needs to be discussed thoroughly in forums like this one. Matthias wrote me that he was going to ask me what news platforms want. It's very hard to reply, because I'm not a news platform, but it's true that I have a few friends working at news platforms. So what do they want? In my opinion, there is only one thing they want, being also intermediaries: they have their news part, which is their own content, and then many of them, at least, have a comments platform, a user-generated content platform. So they have different roles, and this is why they are in a very tight spot, because people don't always want to see that they might have different roles, and this is where the Court of Human Rights has come into play. So my point is that the only thing they want is clarity. They want to know what playing field they are playing on, and today I am here to say that the clarity is gone, and the playing field is getting muddier every single day. To explain why I say that: before 2015, at least in Europe, I think we had started to have a standard. I think everyone had started to share the idea that intermediaries are there; they shouldn't control; they shouldn't police. There are certain rules: they have to take down content if they get a notification about it, or they will be liable for the content, and so forth. 
And I think the standard was kind of there, and the understanding was already shared, because we had the e-Commerce Directive for the EU, and from the Council of Europe we have had a Declaration with the same kind of principles of not policing already from 2003. So there was a shared standard. But then 2015 came, and during the summer the Court of Human Rights, so 17 of the highest judges in Europe, ultimate protectors of human rights, decided that, well, if you're a news portal and you have a comments platform, then all may be good and nice; however, if you have some readers who tend to resort to clearly unlawful speech, then it is up to you, on your own initiative, to police your user section, and you will be liable for the hate speech or clearly unlawful information unless you delete it on your own initiative. So this was clearly a different approach, at least in parts. The 47 Member States of the Council of Europe were in a tight spot. To be even more precise, the 28 of them who are also members of the EU were basically left in limbo, because they had the e-Commerce Directive saying you're not supposed to police, and a Court of Human Rights decision saying you have to police. So what were the intermediaries in those 28 Member States supposed to do from a legal perspective? There was no lawyer in Europe, I think, who could have given them advice and been 100% sure that it was correct advice. Then 2016 kicked in, and at the very, very start the Court of Human Rights kind of got a chance for a do-over: in February they published a decision on exactly, or at least very, very similar, circumstances. You have a news portal; they have a commenting platform; there are readers who resort to, let's say, very vulgar speech; the same kind of factual circumstances; the news portal got a notification; they took it down the same day; and so forth. And what the Court of Human Rights did was try to control the damage, at least in my very naive hopes. 
What they did was say that the decision from the summer is an exception and we should see it as such. So in general, under normal circumstances, notice and takedown, or whatever we call this kind of procedure, should be sufficient to make sure that a person is diligent enough and avoids liability. So everything started to go in a very nice direction, at least in my opinion. And now, as Matthias said at the start, the EU has begun to change its position, at least in my opinion. During May we had the Code of Conduct. At the end of May we had the communication on online platforms, which has language in it that every legal mind reading it, especially legal minds who actually dream about intermediaries and keep seeing intermediaries everywhere, will notice: language saying that platforms are mandated to be responsible for behavior on online platforms and to protect core values. How can an intermediary protect core values? I have no idea. And to make things even worse, they have also started to propose an amendment to the Audiovisual Media Services Directive, and in there the language is that video-sharing portals or platforms are mandated to make sure that all citizens are protected from illegal content. So, to go back to the question I was asked: I think the answer is that platforms want clarity, and this is definitely not something they are getting. But even more, even though they want clarity, they don't want clarity at any cost, especially if the cost of that clarity is that they become effective gatekeepers of all information online, and protectors of our society and of every citizen using the Internet. So this is my point of view. Thank you.

>> MATTHIAS KETTEMANN: Thank you very much, and thank you for mentioning citizens. Before we continue with our panelists, I would like to give the audience a chance to participate. There are two microphones here, if anyone feels the urgent need to chip in or has a question to ask. Yes, please?

>> FARZANEH BADII: Yeah, my name is Farzaneh Badii. I just wanted to comment on something that we kind of ignore and don't talk about much, and that's the role of intermediaries in dispute resolution. In many ways, intermediaries can help and provide dispute resolution and access to justice, because they are right in the middle of the disputes and they can handle them. And the dispute resolution mechanisms they provide are not inherently unfair. I think we have this premise that private dispute resolution provided by private corporations and intermediaries is always unfair; that is not true. What we need to look at is what sort of regulation, as you also mentioned, deters these intermediaries from, first, providing dispute resolution and, second, upholding justice. The other issue is access to courts, and which mechanism is better: whether parties should go to court when they have a dispute, or whether they should go to the intermediary. The point of going to court is to provide them with some sort of due process, but I think intermediaries are also able to provide some sort of due process if they're incentivized. There are also external mechanisms other than public courts, such as private arbitration, which is quite well tested, and there is quite a lot of practice on how we can make them just. So I think access to justice and the role of intermediaries in dispute resolution is very important. Thank you.

>> MATTHIAS KETTEMANN: Thank you very much. I think afterwards Frank La Rue will also be able to shed light on the relationship between justice and intermediaries, and on whether or not you need special regimes to govern their behavior. A second participant?

>> STEFAN HERWIG: Hello, I'm Stefan Herwig, and I prepared a little statement; I'm going to try to be a little bit of an antagonist here on the issue of intermediary liability. First of all, obviously there are several types of intermediaries, right? Are we talking about service providers or host providers? And obviously not all intermediaries should share the same liability, I think. So that's something we need to talk about. But I think the intermediary liability issue is one of the central issues of the last years of the Internet Governance debate, and it's gaining more traction every year. It's my position that the current intermediary liability rules and regulation regimes are inconsistent and imperfect, and that they can be improved. At the moment, there's an increasing amount of fallout due to these deficits: fallout for users, for third parties, and for the Internet ecosystem as a whole, which I think is slowly drifting towards an out-of-bounds scenario. I think the problems that we face, and that have started to pile up, will not go away by themselves. And like everyone in the process, the diverse intermediaries and platforms are stakeholders, and I'm afraid they have done little more than try to fortify their stakeholder interest position. That is understandable on the one hand, but it's short-sighted as well, because, like I said, the problems will not go away. One of my research topics is the tension between anonymity and accountability on the Internet, compared with how we balance anonymity and accountability in the offline world. In this research I concluded that the balance of anonymity and accountability we've currently installed online is significantly different from the balance and logic that has proven effective and reasonable in our carbon world, and this has created friction, and will continue to create friction. Why is this topic an intermediary issue? 
Because intermediaries currently set the standards and rules on the anonymity and accountability not only of their users but also of other platforms, more or less by themselves, and mostly based on their own interests. This is as if a car manufacturer decided whether or not a certain car needs a license plate. Platform users, third parties, and others then often have to deal with the fallout. I think we will struggle and fail to properly implement bodies of rights, be it privacy rights, copyrights, or any other information-based body of rights, before we have tackled this central accountability issue. I think this is a political issue, and I'm very skeptical that it can be solved by self-regulatory measures alone.

>> MATTHIAS KETTEMANN: Okay, thank you very much. I think we can all agree that intermediaries will not be able to solve those issues by themselves, but the question is how we can nevertheless encourage responsible intermediary behavior. As we have no comments currently from the remote participants, I would like to give the floor to Martin. Martin Husovec is an Assistant Professor at Tilburg University and also runs a technology blog.

>> MARTIN HUSOVEC: Thank you for the advertising. I'm here on rather short notice because a colleague of mine from Tilburg Law School, Nicolo Zingales, could not make it. At the end of my brief statement I'd like to say a little bit of what he wanted to say, but first I'll start with what I want to say. I think we all recognize that intermediaries respond to incentives, and those incentives are set by the law: the body of norms that we call intermediary liability. If the result of those incentives is overcompliance, so that individual players behave differently than we think would be legitimate, then that is a problem. And that this is a problem is recognized in the case law of the European Court of Human Rights and in the case law of the Court of Justice, where the court, whenever it issues some measure, always tries to think about the measure as requiring some targeting, so that it pursues a legitimate aim and is proportionate and necessary. The problem is that many of these intermediary liability laws, and this goes across the board of different types of intermediaries, do encourage overcompliance, and do not necessarily bring safeguards to curb that overcompliance. So what can you do in that situation? You can either try to change the intermediary liability law by changing the incentives it sets, as some people would argue, or you can strengthen the safeguards. Now, safeguards: let me illustrate this, to be less abstract, with an example that is familiar to me because I come from the field of intellectual property law. As some of you may know, in intellectual property law we are facing an increasing number of enforcement attempts against all kinds of intermediaries, especially those further from the content supply chain: Internet access providers, but going even beyond. 
Now, one of the measures that you see these days is website blocking. And what is happening in these website blocking cases, in the better scenario, is that you go to the court and you ask: Dear Court, there is a website that's infringing on my rights. I'm suing this Internet access provider. Could you force him to block it? And the court, since it is an agent of the Government, directly bound by human rights, will do the proportionality exercise and say you have to target the blocking and only block maybe these specific websites. These are the safeguards and everything. But the problem is what happens afterwards. The decision is granted, and it goes back to the intermediary, who complies with it. And now, if that intermediary undercomplies with the decision, there is a specific remedy, meaning a fine, contempt of court, many different things that can happen to that particular intermediary. If he overcomplies with it, there is very little that will happen, despite the fact that the same court issues the decision on the condition that overcompliance is a problem. So the Court of Justice, when it heard one of the cases on website blocking, thought that the solution would be that a user should have the possibility to challenge overblocking. Well, that's nice, and the court thought that maybe the procedural rules of the particular country could be the solution. But the problem was that in exactly the country from which the referral was coming, there was no such procedural remedy or safeguard. So the question was: if you think the citizens are those who are in charge of policing overcompliance, how do you make sure they actually have tools for going to the Internet access provider to prevent that overcompliance? And this is a difficult issue, because it's a horizontal relationship, where the freedom of expression that is at stake does not necessarily give positive tools to enforce directly against another individual.
So individuals have to look for some means, such as tort law, contract law, or something else, to be able to achieve the aim, which is to bring the overcompliance back down to the legitimate aim that was pursued in the very first place. So this is a big problem, and since we face it, and unless of course we cut back on privatized enforcement, whether in IP or outside it, we will increasingly need tools to curb overcompliance and to safeguard that the overcompliance on which we condition the measures in the very first place does not overgrow the entire landscape. To conclude, a very brief remark on behalf of Nicolo. He would like me to promote one safeguard which he thinks is a good one, his work within the Dynamic Coalition on Platform Responsibility, which is to look at the terms of service of different online services and try to see to what extent they respect different fundamental rights, maybe try to benchmark them, try to make them somehow accountable, bring the different issues to the spotlight, and hence mitigate possible overcompliance, of course provided that the practice is somehow formulated in the terms of service. So I'm sure he would encourage you to look at the platform (platformresponsibility.info, I think, is the website) and maybe look at the work of this Dynamic Coalition. Thank you.

>> MATTHIAS KETTEMANN: Thank you, Martin, so we got two academics for the price of one, fantastic. We will now turn to Maryant. You mentioned terms of service, Martin. Terms of service are also covered in two other sessions during the EuroDIG, but are there also other positive and negative influences cloaked in voluntary mechanisms? On that I would like your input. Maryant Fernández is with EDRi, a coalition of European organisations fighting for freedom of expression on the Internet. Maryant?

>> MARYANT FERNÁNDEZ: Thank you very much. First I would like to make three points, and I'll take a step back before coming to your question. The first point I would like to address is that we live in an era of dichotomies. We have a lot of political issues that public authorities, companies and citizens would like to resolve, such as hate speech, terrorism, copyright infringement and many other policy issues. Often we see that the way these issues are being solved is that we put security versus rights. We put crime versus freedoms. We put copyright infringement versus freedom of expression. And the first thing I would like to tell you is that I hope you will not take that approach and say we need security or rights: we need both, and we need to achieve both goals, and the same goes for fighting against crime and fighting for your human rights and fundamental freedoms. Since we face different problems, we also cannot find one solution that would solve all of the problems. A "one size fits all" solution is not the answer. That's the first point. The second point is something that the other speakers already addressed, and it's the trend we see of Governments and political authorities pushing companies to be the police of the Internet, to be the controllers, to apply a certain undefined duty of care, and so on. But the problem, as we already discussed, is the avoidance of accountability and responsibility, not only by Governments, but also by the different actors involved, including intermediaries. As an example, and since I've already been asked to comment on the Code of Conduct, I will express our views on it. So, for context: I don't know if you know what the Code of Conduct on hate speech is. Do you all know about it? Or not? Just to see whether I would need to explain what it is.

>> MATTHIAS KETTEMANN: Perhaps in one or two words?

>> MARYANT FERNÁNDEZ: So this Code of Conduct was put forward in the context of a dialogue that the European Commission set up. That dialogue is called the IT Forum or the EU Internet Forum, and it is directed at fighting terrorism and hate speech. As part of this dialogue, the Commission thought it would be nice for companies to actually take the lead and put forward a Code of Conduct, meaning a non-legally binding text that is not necessarily enforceable, to address the issue of hate speech. If you look at this Code of Conduct, we see a lot of issues. First of all, it is expressly stated that the companies should take the lead in the fight against hate speech. It also says, however, that the most efficient way to combat hate speech is to also include Member States. However, Member States have not been included in the Code of Conduct, and by the way, Civil Society was not included either. The signatories of this Code of Conduct are only four big companies, and you may ask: are these companies European companies? At least one of them? None of them. They're all big U.S. companies. So we would also ask how other companies, like SMEs or European companies, would address these issues. Would they be able to commit to what is agreed in the Code of Conduct without even having a seat at the table? We don't think so. Also, a big problem we see in this Code of Conduct is that companies would have to assess different content on their platforms not on the basis of the law but on the basis of their terms of service. And only where necessary would they need to look at the law.
That means the terms of service would be absorbing the law, and then the law would never need to be actually assessed, and we come to the problem that my colleague already addressed, the problem of overcompliance, which means there is some legal content that would be removed or blocked or taken down. Perhaps, but we don't know, because another problem is that there is no single mention of any judicial oversight whatsoever. There is no mention of any commitment to conduct a serious process; there is no transparency in place. There is the involvement of Civil Society, but it is only said that the companies will train them to know how the terms of service of these companies work, so that these Civil Society organisations would be able to tell them whether there is a breach of the terms of service. So again: we have the law. We have a European Framework Decision on hate speech. We have different laws in Member States. And then we have these voluntary mechanisms that do not take account of them. For us it's a problem in the sense of: who is accountable for any infringement of your rights, including for example freedom of expression? Is it the Governments? No, because they are not signatories of this. What about investigators and prosecutors? Would companies be obliged to refer any hate speech they find on their platforms? There is no such obligation in the Code of Conduct. Would it mean that Civil Society is really aware of all the laws, all the details of the different jurisdictions and competencies? No. So we see a trend of putting forward many different voluntary mechanisms that bypass the rule of law and, more importantly, that bypass Article 52 of the Charter, which says restrictions must be provided for by law and be necessary and proportionate, and these assessments are not being carried out. In that sense I agree with Karmen that there is a trend in the EU to change the rules.
I think if we want to come to a resolution, we need to first identify the problems and assess them, because not all problems need the same solutions. Second, we need to make sure that whatever measure you are going to implement is effective and transparent. There need to be assessments with regard to proportionality and necessity, and assessment of the key performance indicators. There needs to be an assessment of whether the evidence that was gathered at the time is still relevant: does the problem still exist? Are these companies not doing enough? If not, then what is the problem? How do we assess this? And then, what is the role of the courts? Because we live in a society, and it is actually very interesting in the case of the European Union, because the European Union was set up to bring peace. The foundations of the European Union are respect for human rights and fundamental freedoms, and suddenly, when it comes to the Internet, we forget this. We reduce the debate to dichotomies: if I have to choose between security and respect for human rights, I will choose security. I hope that after this panel you would say no. And if somebody asked you about that, you would say we need both. We need to protect citizens' security and we need to protect citizens' human rights and fundamental freedoms.

>> MATTHIAS KETTEMANN: So if they ask, say you're asking the wrong question. [ Applause ] Thank you very much. Before we turn to the audience and the remote participation, of which we have two contributions, I would like to ask our final panelist to contribute. Frank La Rue is the current Assistant Director-General for Communication and Information at UNESCO, but he is best known among Internet scholars for his groundbreaking work during his long and successful tenure as Special Rapporteur, where he wrote ten landmark reports on issues of freedom of expression, in particular as they relate to the Internet. You have always fought for the legal system, and against the privatization of state functions through intermediaries. Would you care to elaborate a bit?

>> FRANK LA RUE: Well, thank you very much, a real pleasure to be here, and let me begin with your last statement and the last speaker. There are several ways to talk about intermediaries in the arena of the Internet, but there are several important levels. We can talk about the legal level, in particular the human rights level. We can talk about the ethical level and the voluntary codes of conduct. We can talk about the cultural level and what is respectful of cultural diversity and linguistic diversity around the world. All three are different, but all of them are very important. But I always begin with the first one, which is the legal level, from a human rights perspective. And let me clarify here, because the one common law we have around the world, and I keep on saying this, is human rights law. And this is very important. Human rights law has even become criminal law, to the extent that the International Criminal Court was established for the worst violations of human rights, but it was all based on human rights principles. Why? Because we needed a legal system that unifies our mission around the world. Across different regions, different perspectives and different cultures, we wanted one common standard. This is very important to say because, although we are talking here at EuroDIG, and this is a European Forum to talk about the Internet, if we are referring to the Internet from a human rights perspective, it can be no different than in the rest of the continents and regions of the world, and in the debate we will be having at the IGF in Guadalajara, Mexico, in December. So this is very important because the level of human rights coherence is very clear, and the standards are there. We have the international conventions, we have the Declarations, and we have doctrine and jurisprudence in some cases, depending on the different courts, that actually sustain this.
So I say this because freedom of expression, like all the other rights, is something that initially corresponds to the states to protect. So you were right in saying that one of the issues I have criticized is that states want to transfer this responsibility to other actors, in this case to intermediaries. And I know there is a technological problem. The massiveness of communication puts states in difficulty; the courts in the United States went back to the legislature saying, look, there is more work than we can surveil or watch or monitor or follow up on in our own decisions. So clearly there is a technological problem, but the fact that there is a technological problem does not mean the State can then wash its hands and resign its responsibility. This would be unacceptable. Ultimately, State representatives are elected for the protection of their citizens, and their paramount obligation is to maintain peace and respect for human rights, and that cannot be delegated to anyone else. Secondly, we do believe in professional standards of ethics, and we do believe in professional responsibility in the UN, especially among the Rapporteurs. The principles of corporate social responsibility were a very important accomplishment, so we are not denying the role of big platforms or big communication enterprises. All of them have a very important role that we value very strongly, but never in substitution of the State. Again, it should be in complement to State decisions, State policies, and State regulations. Now, oftentimes intermediaries say, well, we are caught in the middle. This is yes and no, because oftentimes intermediaries are caught in their own business orientation as well: they are trying to benefit from the number of users and stay in a certain market, although states may demand things of them that they know are not legitimate in the realm of human rights. And here is where legality becomes a problem.
Why I insist on the human rights legal standard is because the laws of countries can change. Yes, here we are in Europe, and there is a sort of general understanding of basic rights; legislation in every European country is different, but sustained by the same basic principles and standards. But if you go around the world, you will find many other regimes, and some of them are very dictatorial regimes. They are actually legislating to control power and to entrench their abuse of freedom of expression and of the means of communication. So one cannot say that legality is the solution everywhere. Legality is the solution if we are talking about human rights law and international law, but not by the standards established individually and arbitrarily in any country. In any case, yes, legislation is valid in as much as it is sustained by the same international human rights standards. This is very important because oftentimes legality is used as an excuse: I'm just following the law and what I was ordered by the authorities, and if I don't follow it I become illegal. Then they should not be in this particular area, this particular country, even if they lose that market, which is what we are seeing many times. Thirdly, I think there has to be cultural sensitivity. Yes, we defend diversity; there has to be cultural diversity, linguistic diversity, a respect for traditions and all that, but without letting this become cultural relativism on human rights, which is often the temptation. This would be valid for gender equity, for non-violence at home and domestic violence against children and women, or for any violence or incitement to violence against ethnic minorities or different racial groups. And here is where I think, on prevention, it is true that the world today is suffering a serious threat of extremist abuse and incitement to violence, oftentimes through the Internet, but it would be a mistake to see the Internet as the main source of this violence.
I think the Internet is being used. Yes, we cannot deny it. We have all seen the horrible images and pictures, but I think the real reasons lie in other areas. We have to look at structural reasons, at the geopolitical situations in many countries, at the situation of young people there. The fact that the Internet is misused means that we should pay attention and try to find the positive uses of the Internet, and in our own countries have preventive policies, but not blame the Internet for the issue. We should also look at our own policies of integration of young people, giving them equal opportunities in health services, in education and especially in jobs. This is crucial. And to end on the Code of Conduct: I think this is very important. Yes, codes of conduct can exist. We have always encouraged the media and the press to have self-regulation, but not as a substitute for state responsibility. This is what I find with the recent Code of Conduct: it can easily be raised as "we're following the Code of Conduct" when in reality this is not a decision that should rest only with the intermediaries. This is a multistakeholder decision, which is why we have been assisting the multistakeholder dialogue for so many years, having the IGF and the EuroDIG and all the Forums. We have to make the multistakeholder dialogue really more relevant in determining policies and determining regulation.

>> MATTHIAS KETTEMANN: Thank you. Thank you very much. I would now like to turn to our two remote moderators. Okay, I'm sorry? [ Off Microphone ] Absolutely.

>> We have two, well, one observation and one question. First, the observation, from Chris Oldknow: All are asked to do more than focus on customers. It implies the arrival of further responsibilities for digital services; ultimately the e-Commerce Directive will eventually need to narrow from its current status. As Karmen highlighted, eventually there will be a human rights ruling that the safe harbors are overbroad and incompatible with the human rights framework of the Convention and/or the Charter. And the question from Sophie SB, a question for Maud: Do you think that we need a good Samaritan exemption for intermediaries?

>> MATTHIAS KETTEMANN: Okay. If I understand correctly the first one was for Karmen and the second one was for Maud? Do you want to start perhaps?

>> MAUD SACQUET: This is what I was talking about earlier: right now intermediaries cannot take additional measures on their platforms, ideally to fight against illegal content, without being at risk of losing their current limited liability protection regime. So this is something we are actually urging the Commission to look at, to see whether it would make sense; in our mind it would. Because it could potentially help the discussion we are having today, and we could come up with new solutions through multistakeholder dialogue that would actually help fight illegal content in a way that everybody would agree with.

>> MATTHIAS KETTEMANN: Thank you very much. Karmen?

>> KARMEN TURK: I will try to reply to the question of whether, when we take human rights standards and look at the safe harbors we currently have in the e-Commerce Directive, those safe harbors are too wide and should be narrower. I think the best answer steals the words of one of the panelists: Frank La Rue once said in one of his reports that no one should be considered liable for words that he or she has not said. So I firmly believe that the safe harbors are not too large, not too foggy. They are just the way they should be. However, there is of course something that could and should be discussed from the human rights perspective as well, because now we have it that an intermediary is not liable until he gains knowledge and then has to remove the content. We can talk about what this knowledge is: whether it is knowledge after getting a notice, or whether we would apply a doctrine of willful blindness, for example, meaning in very simple words that an intermediary cannot close its eyes when it is evident that something is going on. So this is what we could discuss, but the safe harbors as such, I think they are there and they should be there, and we shouldn't even really seriously debate losing them, because then we will not have walkways and driveways on the Internet anymore, because there is not a single intermediary who would be stupid enough to have users going around his platform doing whatever they want.

>> MATTHIAS KETTEMANN: Thank you very much. I would like now to offer the audience the chance to participate. There's a lady on the right side.

>> Hi, good afternoon. My name is Corinne Cath. I work for Article 19; we're an NGO that protects the right to freedom of expression online. I had the following question. For us, it's been really interesting to be here for the last couple of days, also to hear Frank speak; he has really paved the way in terms of anchoring this debate on Internet Governance in human rights. On that same note, I also wanted to mention the report by the most recent UN Special Rapporteur, David Kaye, that came out this Monday, which focuses specifically on the role of private actors in the digital age. Now, one of the things he mentions as being very important is doing human rights impact assessments, not only for the big companies that we all speak of but also for standards developing organisations like the IETF or technical infrastructure providers like ICANN. This is something Article 19 has been engaged in for a long time, working both at the IETF and at ICANN, where we've been reasonably well received and where the door is open for that. But something I've been seeing in the debate over the past couple of days is that there is a lot of talk of more public-private partnerships when it comes to solving the kinds of issues we're faced with, a good example being the recent Code of Conduct from the European Commission, which is problematic to us. So something I would be interested in hearing from the panel is: yes, we need to have multistakeholder dialogue, and, yes, we need to do it in a way that protects the rule of law, but how do you do that when, as a Civil Society actor, you're interested in having a seat at the table, but the door is closed in your face?

>> MATTHIAS KETTEMANN: Perhaps Maryant as being a representative from Civil Society? Was the door closed in your face? Or was it rather you refused to participate in a process which you see as problematic?

>> MARYANT FERNÁNDEZ: Yeah, it was very problematic. First of all, in the context of the EU Internet Forum and the Code of Conduct, the Commission had said several times in official communications that it would involve Civil Society and companies, and that it would actually bring together Member States, the European Commission, companies and Civil Society. In practice, however, this did not happen. We got acknowledgment from the Commission that this was happening, so what we did was make freedom of information requests, which were constantly delayed, and we got a lot of refusals of access to documents. So we consequently filed a complaint before the European Ombudsman, who recently decided to open an inquiry into how the European Commission is handling this. Another problem is that when it comes to specific actions like the Code of Conduct, we were only aware of it because of an official statement. We asked for it and made a freedom of information request. We had meetings, and they said they were not able to share the text with us because it was not a stable text. And it was like this for the whole process. The Commission announced the communication of the code even before it had an agreement with these companies, and for us that is also bad negotiation practice. I fully agree with Frank La Rue: if Codes of Conduct need to exist to complement the action of States, they should come with certain safeguards, and the first one, in my opinion, and here I would second Article 19's comments, is that they should be based on a multistakeholder process and a transparent process. That has not happened in this case, so that's a lesson to be learned.

>> MATTHIAS KETTEMANN: Thank you very much. Would you like to comment?

>> FRANK LA RUE: Just very briefly, I agree with that, but the one element I would like to add, which I forgot to mention myself, is that in all of this, whether it is State regulation being applied or a Code of Conduct, the most important element is transparency. I know it has been repeated, but the fact of the matter is people want to understand the logic, whether of legislation or of a Code of Conduct, what the objectives and goals being pursued are, and how it is being implemented and applied. This is not necessarily the case, even in the terms of service. The terms of service of many corporations can be changed without notification to the user. So I think we have to insist that the terms of service be much more understandable, shorter, briefer, clearer, accessible to everyone; people should know if they are going to be modified. And if there is going to be a Code of Conduct, open it to discussion and debate for comments.

>> MATTHIAS KETTEMANN: Thank you very much. The particular added value of any panel is that we bring together different views, and for the first time we have also decided to bring in short, brief inputs from academics, so you don't have to read whole books and whole publications. We will now hear from three selected academics, organised by our colleague Nicolo: Jef, Cristina and Felipe, who will briefly tell you about their recent research results on the key human rights engaged by intermediary behavior. Then we'll open up the floor to audience participation.

>> FELIPE ROMERO: I'm Felipe Romero from a university. This is based on my doctoral thesis and is one of my findings. I don't know if you remember the draft of the Act. In one of his special reports on the promotion and protection of the right to freedom of expression, Mr. La Rue mentioned that the monitoring being carried out by the private sector, in particular monitoring companies, could constitute a violation of Article 8 of the European Convention on Human Rights. There is no doubt that, for example, intellectual property, in particular copyright and related rights, must be protected. We also know that this has been expressly recognized in Article 1 of the First Protocol to the European Convention on Human Rights, as well as Article 17.2 of the Charter of Fundamental Rights of the European Union. However, we also know that copyright is not an absolute right and needs to be balanced against other fundamental rights, in particular the right to privacy, as well as the protection of personal data; remember Article 8 of the European Convention and Articles 7 and 8 of the Charter. Either way, that there is an incompatibility in the way monitoring companies monitor, contravening Article 8 of the Convention, is not up for debate. Basically, the solution to the privacy problem is not about converting Internet users into paying customers. It's about offering customers what they want and need, but most importantly striking a fair balance between all the fundamental rights at stake. For this reason, in order for this specific type of monitoring to be compliant with Article 8, what I argue in my doctoral thesis is that this monitoring needs to be subject to a number of safeguards. In particular, the monitoring needs to be limited in scope, specifically to existing court proceedings; it needs to be limited in time, which means it should be conducted only at certain days or times; and it needs to be directed at specific people.
That is, commercial copyright infringers. This is in order for the monitoring to be compliant with the principle of foreseeability and the first prong of the Court's test, that it be "in accordance with the law". The second finding of my dissertation is that the data that these investigative agents gather needs to be secure and eventually destroyed, and the individuals having access to it need to be singled out. Very important, and this is key: the monitoring should require the involvement of state authorities, in this particular case courts as well as data protection authorities; in the U.K. that would be the Information Commissioner's Office. And my last finding in this particular chapter of my thesis is that monitoring should be targeted, which means initially trying a less invasive form of monitoring, for example by targeting copyright offenders. With that, I thank you.

>> MATTHIAS KETTEMANN: Thank you very much.

>> CRISTINA ANGELOPULOS: Hello. As Matthias already mentioned, my name is Cristina Angelopulos. I'd like to focus on the... is it not loud enough?

>> MATTHIAS KETTEMANN: It is loud enough but a bit louder.

>> CRISTINA ANGELOPULOS: Okay. So what I'd like to focus on is the question of how to solve human rights conflicts, and of course, particularly with regard to this session, how to solve human rights conflicts pertaining to intermediaries. My own research is on the question of intermediary liability for copyright infringements committed by third parties. As many of you will already know, and as has already been mentioned on the panel earlier, the answer that has been given to this question by the highest European courts, by the CJEU in Luxembourg and the ECtHR in Strasbourg, has centered around the notion of fair balance that was mentioned. The question of intermediary liability has been reinterpreted as a question of a conflict between opposing fundamental rights. Those fundamental rights have been decided to be of equal normative value, so neither can trump the other by definition, and as a result a compromise solution must instead be found. From this perspective, the pertinent question for solving questions of intermediary liability, and human rights conflicts in general, is that of locating the fair balance: how to locate a fair balance. So far the case law that has come out of the CJEU has not been very helpful in this regard. What the CJEU has done instead is simply examine the facts of each individual case and then declare that a fair balance has or has not been struck, without giving us insight into how it reached that conclusion. Sometimes it doesn't even go that far, but simply declares that a fair balance must be struck and lets the national courts figure it out by themselves. The ECtHR, on the other hand, has been much more proactive. This is helpful and has to be acknowledged: what the ECtHR has done is manage to identify factors intended to guide us through the balancing process. Problems do remain in this regard, however, because those factors tend to be very mutable.
They change from case to case so far, and also the mode of their application is entirely obscure so the result is arbitrary and the Delphi case, as was mentioned earlier is an excellent example of that. Nevertheless I think that the approach is the way to go. There are lessons to be learned, and other courts, and especially the legislators should pay attention to this. What we need and again this was mentioned earlier, what we need is clear fixed standards that can guide us through this balancing, if balancing is going to be our answer and delivered legal certainty for all stakeholders, for all involved parties, that we need to start developing benchmarks for balancing in order to be able ultimately to give concrete content to this notion of a fair balance and our solution to intermediary liability. Thank you.

>> MATTHIAS KETTEMANN: Thank you very much. After the last contribution we will open up to the audience again for participation, and then we'll have some panel reactions.

>> JEF AUSLOOS: Thank you. So I'm Jef Ausloos from the University of Leuven. I come from a data protection point of view; I've published a couple of papers with colleagues on the interaction between data protection, freedom of expression and intermediary liability, and, unsurprisingly, the right to be forgotten. The main message I'd like to convey is that, definitely from a data protection perspective at least, granularity is key. What I mean by that is that when we assess intermediaries and their liability, or platform responsibility, we need to dissect their different operations, because they are increasingly performing a wider and wider variety of operations on information, and it makes less and less sense to try to fit them into one legal category or another. Some categories of operations can indeed be considered traditional intermediary activities, like storing or transferring information. But in other activities they proactively direct flows of information; think of a video hosting platform, for example, performing analytics on videos for facial recognition, or actively directing and recommending videos to its users. With regard to these last activities, and only with regard to these last activities, they can and should be held responsible, definitely from a data protection perspective. The fact that this happens on a huge and automated scale should not be an excuse to evade this responsibility. The Google Spain case, of course, is a perfect example of this point. In this case, as many of you know, and I'm not going to repeat the facts here, many different activities actually affected the Spanish citizen and his fundamental rights.
These included the digitization and subsequent making available of old newspaper articles, as well as Google's indexation and its subsequent linking of that specific web page to his name. In the end, as we all know, the Court focused only on the very specific activity of linking the name, the search term, with the search result, which is arguably a fairly proportionate approach, and one that underlines that granularity is key. So, in sum, I think granularity is key when looking at platform responsibility, because without it you end up in an all-or-nothing debate that only suits extreme positions at both ends, whereas reality is much more complex and nuanced.
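The granularity argument can be loosely illustrated in code: responsibility attaches per operation, not per platform. This is a purely illustrative sketch; the operation categories and names below are my own assumptions, not legal definitions from the case law.

```python
# Illustrative sketch: assess platform "responsibility" operation by
# operation, following the granularity argument above. The categories
# and names are hypothetical, not legal classifications.

PASSIVE_OPERATIONS = {"store", "transmit", "cache"}            # traditional intermediary activities
ACTIVE_OPERATIONS = {"index", "rank", "recommend", "analyze"}  # proactively directing information flows

def operations_attracting_responsibility(operations):
    """Return only those operations that, on this rough model,
    could attract data protection responsibility."""
    return [op for op in operations if op in ACTIVE_OPERATIONS]

# A video platform that stores and transmits uploads, but also runs
# facial recognition analytics and recommends videos to users:
video_platform_ops = ["store", "transmit", "analyze", "recommend"]
print(operations_attracting_responsibility(video_platform_ops))
# -> ['analyze', 'recommend']
```

On this model the same platform is a traditional intermediary for some operations and a responsible actor for others, which is exactly the point made about Google Spain above.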

>> MATTHIAS KETTEMANN: Okay, thank you very much. That case shows that sometimes intermediaries are given roles they have neither wanted nor can fulfill in a human rights sensitive way. We did approach the European Commission to ask whether they wanted to be present on our panel, and they demurred, saying that the legislation is in flux and the norms are in development, which is why it might be difficult to present a clear picture. Are there any other questions from the audience? I see there's a remote participant who wants to come in. Matei, would you mind giving us the question?

>> MATEI-EUGEN VASILE: There's another question, from Sophie, for the panel: if you had to rewrite Article 14 of the e-Commerce Directive, what would you change?

>> MATTHIAS KETTEMANN: We're at the heart of wishing and dreaming, which is good, because that's actually my favorite part of being a lawyer: dreaming of how the law could be better. I think we can fold that question elegantly into the final round, in which I would like to ask the panelists to quickly envisage a fantastic intermediary liability world. What would that world look like? How should the stakeholders relate to each other? What should the norms look like? It's difficult to say in a couple of words, but in our final round, please shed some light on how you would want the liability regimes of the future to look. Let's start with Maryant and come over to my side.

>> MARYANT FERNÁNDEZ: Okay, my dream situation. I have not reflected on the dream situation so much, but I will make a short statement. Basically, the dream situation would be that all actors involved and affected are at the table; that transparency mechanisms are in place; that the rule of law is respected; and also, as to the laws themselves, I fully agree that it's not only important to have laws, but that these laws actually respect human rights standards. I would also like the ideal system to involve review mechanisms, an assessment of the available evidence, clear problem identification, and impact assessments. The list is definitely non-exhaustive, and I would like the others to complete it.

>> MATTHIAS KETTEMANN: Thank you. I do realize it gets harder and harder as we go along, but luckily we have an experienced diplomat at the end.

>> MARTIN HUSOVEC: On the questions about Article 14, I have proposed several solutions in my recent paper "Accountable, Not Liable." I would say: abandon the passive versus active discussion completely. Equip Article 14 specifically with safeguards, for which, as we know, there is growing support across stakeholder groups, so I don't see any problem there. And not only introduce a counter-notice, but also introduce a system where the intermediary is incentivized to respect the notice, by implementing an inverted ADR system: basically, you file a counter-notice, the other side contests it, and you can easily go to some cheap ADR; if the intermediary respects the ADR outcome, it faces no liability. In that case it is a completely rational choice to comply, and to me that seems like a sufficient safeguard for Article 14.
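The inverted ADR flow described here can be read as a simple sequence of procedural steps. The following is a hedged, purely illustrative model of that spoken description; the function name, parameters, and liability rules are my own rendering, not the actual proposal in "Accountable, Not Liable."

```python
# Illustrative model of a notice / counter-notice / ADR flow.
# The rules below are a hypothetical rendering of the spoken
# description, not a legal specification.

def intermediary_liable(notice_filed: bool,
                        counter_notice_filed: bool,
                        adr_invoked: bool,
                        adr_outcome_respected: bool) -> bool:
    """Rough model: an intermediary that respects the ADR outcome
    faces no liability; otherwise liability may remain on the table."""
    if not notice_filed:
        return False  # nothing to act on, no exposure
    if counter_notice_filed and adr_invoked:
        # Complying with the ADR outcome is the safe harbour here,
        # which is what makes compliance the rational choice.
        return not adr_outcome_respected
    # No counter-notice or no ADR: liability under the ordinary
    # rules, modelled here (crudely) as "potentially liable".
    return True

# An intermediary that follows the ADR outcome faces no liability:
print(intermediary_liable(True, True, True, True))   # -> False
# One that ignores the ADR outcome remains exposed:
print(intermediary_liable(True, True, True, False))  # -> True
```

The sketch is only meant to show why, in this design, complying with the ADR outcome is the dominant strategy for the intermediary.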

>> MATTHIAS KETTEMANN: Thank you.

>> KARMEN TURK: Such a nice last question from Matthias, and it's nice to be only third in line; I'm not going to preempt what the others will say. To be short: as a dream situation, since we seem to be taking steps back and back and back, I would actually like to get back to where we were in 2004. That would be good enough, actually, and I think that is also partly my reply to the Article 14 question. But the dream situation, from the perspective from which I started, the perspective of news platforms, is that I would go back to clarity, in the sense that if there were just enough clarity, a person who is an intermediary would be able to find at least one lawyer left standing who could give him an assessment of his legal risks. As long as there is at least one lawyer who can say, "I know the answer, I know what your legal risks are," then I think they are good to go.

>> I have my own mic. Well, intermediary liability existed long before the Internet, with postal services, so firstly, I don't see any need to change the world now that we have the Internet. Secondly, in the ideal world I would not open the e-Commerce Directive, nor have it opened yet.

>> MATTHIAS KETTEMANN: Thank you.

>> MAUD SACQUET: It's starting to be difficult to find new ideas. I'd like to echo the point on passive versus active: that's really something to be abandoned, and we should instead focus on whether the intermediary has actual knowledge of the content or not; I think that should be the only point that matters in this debate. Regarding Article 14, it's difficult to comment, because I've been arguing for months that the e-Commerce Directive should not be reopened. So, more practically, I would look for more harmonization in the way it is actually implemented across the EU today, because that is currently lacking, and that can be difficult to deal with for intermediaries operating across borders. And the last point I want to recall is that the safe harbors are perhaps not broad enough. We had this debate; I think there's no need to make them narrower than they are today, and they should actually be extended. Thank you.

>> I think that, as a general principle, intermediaries should not be held liable for content produced by others. We should expect from them that they adopt policies and practices in a transparent manner and communicate those to their users and the wider stakeholders they deal with. They should conduct human rights assessments of the services they offer, and offer a path to remedy, to make sure that users who have been affected have some chance of getting a change in the company's practice on a particular issue. Thank you.

>> FRANK LA RUE: Well, I basically agree with all my fellow panelists. Just to summarize: I think there should be no intermediary liability for content that is not originated by them. Secondly, there should be an important practice of notice and takedown when it is serious material, after a notice has been received. This should be discussed with the State, and there should be state regulation. Again, the State should not resign its responsibility. It can share it, it can discuss it with intermediaries, but not transfer it to them. And this is very important to tell the courts, by the way, because the court decisions in the Delfi case and in the other case both transferred the responsibility entirely to the intermediaries. And finally, we should make effective a total transparency policy and an active multistakeholder dialogue and decision-making. As I said before, the multistakeholder dialogue has to be an effective dialogue, not just sitting everyone at the table for a simple consultation. There has to be a policy debate that can actually affect policies and determine the future. And I would encourage everyone, as ARTICLE 19 said, to read the latest report of David Kaye, the current Rapporteur on freedom of expression, which is precisely on the responsibility of intermediaries.

>> MATTHIAS KETTEMANN: Thank you very much. It is exactly that kind of multistakeholder panel that we wanted to provide to you. Let me now come to the brief conclusions of our panel, which have been provided by Ana Gascon Marcen from the Council of Europe. She is responsible, among other things, for the Committee of Experts on Internet Intermediaries, which is currently working on a recommendation to be finalized by 2017. Among the key outputs of this session we can count the following. Intermediaries have a crucial impact on how we exercise our human rights. Freedom of expression is the key enabling right for all other human rights in the online environment, and we have to thank Frank La Rue for underlining that already in his early reports. At the same time, it is through intermediaries that we can exercise our human rights. They provide the space, but they cannot be, as has been said, the chief police of the Internet. It should not be their responsibility to be co-opted into law enforcement roles by the State. They cannot substitute for the responsibility of states to respect, to protect and to implement the human rights of all people within their jurisdiction. It is not companies that should have the primary responsibility to protect human rights; it is the primary duty of states to ensure that. Independent of that, there is of course a responsibility to protect human rights that is also incumbent upon private actors. It is also necessary to assess the human rights impact of laws on intermediaries, and to make sure these human rights impact assessments follow a clearly defined process. And finally, all stakeholders should be heard in the process of finding solutions to the key issues that intermediaries are grappling with today. I would finally like to thank our panel very much for their interventions. I would like to extend my thanks to the organisers of EuroDIG and of this session, and to our Remote Moderator, Matei Vasile, and our reporter.
Thank you, audience, for your questions and for your support, and I wish you a wonderful conference and a great last session. Thank you very much. [ Applause ] [ End of session ]
