Online content policies in Europe – where are we going? – PL 01 2010


29 April 2010 | 16:45-18:30
Programme overview 2010

Session teaser

Key issues that could be discussed: Is it ever ‘right’ to block content? What procedural and other safeguards exist in European states to prevent disproportionate blocking of content? How are these safeguards applied in practice? Are the reasons for blocking content always transparent and justifiable? Which duties do the different actors involved in creating and publishing content online have? Which rights and duties should they have? What internet governance infrastructure is necessary to ensure that content blocked in one country is not also made unavailable in neighbouring countries where it might be considered legal?

People

Key Participants

  • Franziska Klopfer, Council of Europe
  • Nicholas Lansman, EuroISPA, European ISP Association
  • Giacomo Mazzone, European Broadcasting Union (EBU), Head of Strategic Audit
  • Meryem Marzouki, European Digital Rights (EDRI) & CNRS
  • Ženet Mujic (OSCE)
  • Vladimir Radunovic, DiploFoundation, Coordinator, Internet Governance Programme
  • Maja Rakovic, Ministry of Culture of Serbia, Adviser
  • Jeroen Schokkenbroek, Head of the Human Rights Development Department, Directorate General of Human Rights and Legal Affairs, Council of Europe
  • Chris Sherwood, Director, Public Policy, Yahoo
  • Andrei Soldatov, Agentura.Ru, Journalist
  • Avniye Tansug, Lawyer, Senior Member of Cyber-Rights.Org.TR

Co-moderators

  • Nicholas Lansman, EuroISPA
  • Maja Rakovic, Ministry of Culture of Serbia

Rapporteur

  • Michael Truppe, Federal Chancellery Austria, Council of Europe

Remote participation moderator

  • Franziska Klopfer, Council of Europe

Key messages

There is no clear common and holistic strategy regarding the issues of liability for and blocking of Internet content.

It is increasingly unclear what “actual knowledge of illegal activity or information” means with regard to the liability of service providers. The overly cautious behaviour of these providers can conflict with users’ freedom of expression. Users themselves are also increasingly being held liable for their online activities, particularly because of the criminalization of copyright infringement. Concern was raised regarding the proportionality of the (legal) measures being introduced to deal with Internet content. The proportionality of any blocking measure vis-à-vis human rights was highlighted: on the one hand, blocking requires a specific (legal) basis that makes it foreseeable (rule of law); on the other hand, procedural safeguards should be in place that allow users to question and challenge blocking measures.

Messages (extended)

The plenary was divided into two major parts: the first part dealt with the question of liability, namely who is responsible for what on the Internet. The second part covered the issue of blocking internet content by the internet industry (without informing the user), comprising both self-regulatory regimes and mandatory regulation.

The following questions were asked: What direction is European content policy heading in? Is there a common direction? Is it the right direction, and if not, what should be changed and how?

What direction is European content policy heading in?

The discussion showed that with regard to the liability of (not solely but in particular) internet service providers, the legal framework itself appears to be relatively stable. As a general tendency, the role of service providers is becoming somewhat more complicated with regard to determining whether, in a given case, “qualified” actual knowledge of illegal content exists that could result in liability for the service provider. Some interventions also pointed out that users themselves are increasingly being held liable for their online activities, particularly through the criminalization of copyright infringement. New sanctioning mechanisms are even being introduced, such as the possibility of being cut off from internet access for a certain period of time.

With regard to blocking, reference was made to current legislative initiatives to block child pornography websites, while at the same time alternatives do exist and significant effort is being put into combating the problem at the source, namely by taking down websites. Standard-setting is also being conducted with regard to procedural safeguards and minimum requirements when applying blocking mechanisms.

Is there a common direction?

The discussion showed that to some extent a common policy direction exists at European level. With regard to the liability issue, the EU has had Directive 2000/31/EC on electronic commerce in place since 2000, setting out detailed rules for the liability of providers of information society services. This legal framework seems to be quite stable and forms a broadly acknowledged basis for a balance of responsibilities that is, in principle, still valid today. With regard to the increasing tendency to hold users themselves liable for their online activity at the national level, no common strategy is yet apparent. In fact, the measures being introduced at national level vary greatly from country to country, particularly with regard to the idea of cutting off internet access as a sanction for illegal online behaviour.

The same holds for the question of blocking: some contributors referred to recent plans announced by the EU Commission to block online child pornography, which suggests that the issue will probably remain on the European political agenda for quite some time. Representatives from Eastern European countries gave similar and even further-reaching examples of blocking practices in their countries. Standard-setting in this field has been conducted by the Council of Europe, which presented its Recommendation CM/Rec(2008)6 on measures to promote the respect for freedom of expression and information with regard to Internet filters, which sets out minimum rules for blocking measures conducted by either state or private actors. At the national level, however, practices appear rather heterogeneous, ranging from a “no blocking at all” policy to quite extensive models.

Is it the right direction, and if not, what should be changed and how?

With regard to liability, it was criticised that it is becoming increasingly unclear what “actual knowledge of illegal activity or information” (which would lead to liability of the service provider) means, in particular with regard to interactive user-generated content (Web 2.0). Some participants claimed that it should not be the responsibility of the service provider to decide upon the legality of content, but that independent courts need to be further involved. Some fear that increasing the liability of service providers could lead to “over-cautious” behaviour conflicting with users’ freedom of expression. Several interventions also questioned the proportionality of sanctions against users for illegal online activities, particularly with regard to criminalizing copyright infringement or cutting off perpetrators’ internet access.

As far as blocking is concerned, a number of participants questioned the approach in general, referring to other methods of combating illegal activities at the source of the problem, namely the host provider level. Some argued that in the vast majority of cases a takedown of the content could be achieved within hours, even in cross-border cases. In addition, very practical problems (such as the efficiency of blocking and the probability of “over-blocking”) need to be taken into consideration. Some in favour of blocking mechanisms referred to it as a “second best solution”: while taking down the content itself and hunting down the criminals should be the priority, blocking has proven to have a very measurable effect. There was general agreement that the key question to be solved is in any case how to ensure the proportionality of any blocking measure in relation to human rights, namely the freedom to receive and impart information. Reference was made to formal requirements that need to be observed: on the one hand, blocking requires a specific (legal) basis that makes it foreseeable (rule of law); on the other hand, procedural safeguards should be in place that allow users to question and challenge blocking measures, typically by way of a court decision. Several interventions also stressed the need to work further on removing administrative and practical barriers in cross-border cases both inside and outside Europe.

So where is Europe going with its online content policy? In these two specific fields of liability and blocking, one could conclude from the plenary that while a number of developments are taking place at the moment, no clear common and holistic strategy can be identified. It was emphasized that in order to combat criminal activities at the source, there is significant demand for improvements in international cooperation, particularly by creating efficient procedures and thus speeding up content takedown processes.

Transcript

Provided by: Caption First, Inc., P.O. Box 3066, Monument, CO 80132, Phone: +001-719-481-9835, www.captionfirst.com


This text is being provided in a rough draft format. Communication Access Realtime Translation (CART) is provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings.


>> Can we start with plenary 1, please. This is a plenary about online content policies.

>> NICHOLAS LANSMAN: I’m Nicholas Lansman and I’m from EuroISPA, the Internet trade body.

>> MAJA RAKOVIC: Before we start, I want to tell you about the structure of this. We don’t have any panelists, because we would like you to be our panelists. We have contributors, though, and I’d like to thank them for helping us with this discussion. That would be Meryem Marzouki, Dmitry Dubrovsky, Chris Sherwood from Yahoo, and Andrei Soldatov. The idea is to have discussion from the audience as much as possible.

There are two issues. There are issues of liability for online content. We need to talk about what content we are talking about.

>> NICHOLAS LANSMAN: Maja has outlined the areas that we will cover. Content is a complicated area, and we will look at the types of content in question. Are we looking at criminal content, or illicit content that is unlawful but not illegal, and what about content that is harmful? These are areas of content. But also for this discussion, and we will have to look at defining things, there is a lot of terminology used interchangeably, which doesn’t help. So, we will have to devise a way of defining what blocking is. For this discussion, blocking is going to be where governments force Internet Service Providers, at the network level, to stop content being seen, and that content might be hosted in one country or anywhere else in the world. So that is what blocking is.

For this discussion, filtering will be about choice. Filtering is going to be when individuals decide to either use a service from their ISP or use software that they put on their own computers to choose what they do and don’t want to see. The third category is notice and takedown: content that is posted can be identified to an ISP, and once the ISP has been put on notice, that content can be removed. Those are the three definitions. So blocking is at the network level, when it’s forced by government or law or self-regulation, and filtering is what users have control over.
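To make the distinction concrete, a minimal sketch of how ISP-side, network-level blocking differs from a user-side filter might look as follows. This is purely illustrative: the blocklist, domain names, and IP addresses are invented examples, not any real ISP’s implementation.

```python
# Illustrative sketch only: one way an ISP-side resolver could enforce
# network-level blocking, as opposed to a user-side filter. The blocklist,
# domain names, and IP addresses below are hypothetical examples.

BLOCKLIST = {"blocked.example.org"}   # hypothetical state-mandated list
STOP_PAGE_IP = "203.0.113.1"          # hypothetical ISP "stop page" address


def lookup_upstream(domain: str) -> str:
    """Stand-in for a real DNS lookup; a fixed answer keeps the sketch runnable."""
    return "198.51.100.7"


def resolve(domain: str) -> str:
    """Answer a DNS query, diverting listed domains to the stop page.

    The user asked for the site; the network answers with something else.
    That is what makes this blocking rather than filtering: the decision
    is made in the ISP's infrastructure, not on the user's machine.
    """
    if domain in BLOCKLIST:
        return STOP_PAGE_IP
    return lookup_upstream(domain)


if __name__ == "__main__":
    print(resolve("blocked.example.org"))  # -> 203.0.113.1 (stop page)
    print(resolve("allowed.example.net"))  # -> 198.51.100.7 (normal answer)
```

A user-side filter, by contrast, would run the equivalent check in software on the individual’s own computer, under the individual’s control.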

Okay, so, without further ado, we will show you – what we’re trying to do is stimulate the discussions. We have a couple of short films.

>> MAJA RAKOVIC: One small remark. We have a remote participation moderator, Franziska Klopfer. We have 7 hubs. So it would be good if hub participants could send questions and comments on the discussion that we are going to hear.

We also have a Rapporteur for this plenary session: Michael Truppe.

>> NICHOLAS LANSMAN: Michael, stand up and say hello.

>> MICHAEL TRUPPE: I will be summing up this discussion, and the expectation is to try to map the European content policy and to see whether there is a common direction. And if this direction exists, whether it’s the right direction. If not, what should be done? And we can start now with liability.

>> NICHOLAS LANSMAN: Before I was so rudely interrupted! Franziska will push a button. These films are just to stimulate thoughts and ideas. You may agree or disagree with them. And that is to create the debate.

(Video)

>> Three Google executives were convicted of privacy violations in Italy because they didn’t act quickly enough to pull down a YouTube video showing bullies abusing an autistic boy. It’s the first trial of its kind and it’s watched around the world due to the implications for Internet freedoms. Google called the decision “astonishing” and said it would appeal.

>> None of the employees had anything to do with this video. They didn’t upload it, review it, and yet they have been found guilty. So if this is left to stand, we believe that it would threaten the very freedom that the Internet has brought about.

>> The charges were sought by an advocacy group for people with Down’s Syndrome. People alerted prosecutors to the 2006 video showing an autistic student being beaten in school. Google took down the video. Mark Hamric, the Associated Press.

>> MAJA RAKOVIC: Thank you. Now, this is just an intro. We will not be discussing this particular case, but this is just to illustrate what has been going on. And for this particular part of the session we will be speaking about liability. So what kind of duty do we have in terms of reporting or taking down content? Why are certain intermediaries considered liable but not others? Do they just have to do the police’s job? And how can we prevent the private sector from having to act as a state? Do safeguards exist? What kinds of censorship decisions are being made by the private sector?

>> NICHOLAS LANSMAN: For this session we have prepared speakers and we will call on the audience and the remote participation. We will have people from industry, from charities, from sort of civil entities, NGOs. But I’d like to call on Chris Sherwood, who is here representing Yahoo. And without further ado, Chris, if you would like to –

>> NICHOLAS LANSMAN: Good point, Chris, we need microphones. This was all perfectly prepared earlier.

>> CHRIS SHERWOOD: Hello, yes. So my name is Chris Sherwood. I’m the public policy director for Yahoo in Brussels, which means I’m the Brussels lobbyist for Yahoo. I want to take you briefly through some short background about Yahoo and then speak with reference to the EU legal framework. I’m conscious that there are a lot of people in the room and outside the room who are not in countries which are part of the EU, but those are the countries where we are established. So I’ll deal with the EU framework.

Some of the things that we think are positive about the EU framework, for Web services, and some of the things which we think are challenging.

So, first of all, Yahoo is one of the world’s most popular Web destinations. Most of you will have heard of it, I hope. And we have operations in five countries in Europe: the UK, France, Germany, Italy and Spain.

In those countries, we have big Web sites with editorial content and a whole range of services in local languages. Across the rest of Europe, we do have what we call light sites, which are much lighter. There is not necessarily content in local language. And so those are less impressive.

We provide a range of services, most of them cloud-based: information services such as news and shopping; communication services like e-mail and Messenger, instant messaging; social interaction services, like Flickr, which is the photo sharing site, or Yahoo Groups; and entertainment, such as video or games or music.

The EU framework for the Internet, for commercial activity on the Internet, is of course the e-commerce directive. And the intention at the time when the e-commerce directive was introduced was to set down an appropriate balance of rights and responsibilities between users, rights owners, creators and intermediaries.

And obviously fundamental rights are very important. We have a new Lisbon treaty in the EU and we think that that balance is still appropriate today. And one of the key aspects of the balance of rights, which was decided on in the adoption of the e-commerce directive, was to do this across the range of different services and across the range of different types of content and different types of liabilities that might exist.

So, there is a single regime for liabilities with very few exceptions across the EU and across the services. So that’s a key enabler, because it means that we have the ability, and in fact more so for small companies, we have the ability to create Web services for European users without having to deal with a vast range of different legal regimes for different types of services.

But we do have difficulty interpreting laws when it comes particularly to actual knowledge. So now we are dealing with content, which we might need to restrict access to or remove, and we need to do this. Our liability limitation no longer exists if we have actual knowledge of this content being on our sites, because of course much of the content on our sites is user generated.

And if we don’t know about these things, obviously we can’t be liable, but if we know, if we have actual knowledge, then we are liable. But how do you define actual knowledge? And, in fact, the legal framework today doesn’t define it sufficiently clearly.

The problem that that creates is that as a Web service provider, you are not incentivized to pick the low hanging fruit, to help out with voluntary measures, such as maybe random monitoring of user generated content, which we might theoretically be able to do. But if you do that, you then open yourself up to liability for a whole range of different activities.

So if there were clarity about what actual knowledge meant, and about giving us a safe harbor, i.e., if we act in good faith based upon the knowledge that we have, it would make it much easier for us to carry out some of the voluntary measures to help avoid harmful content or illegal content online.

And of course the other fundamental issue is, and I’ll come back to this briefly, is that if you begin to ask ISPs or Web service providers to make decisions about what is illegal or to make any interpretations, you then go down a road in terms of democracy and fundamental rights which you might not want to go down. We tend to think that courts and judicial authorities are the ones who should be making judgments about these things.

>> NICHOLAS LANSMAN: Can I interrupt you just for a second to clarify the point? The e-commerce directive and regulations that you’re talking about have been in place in the EU for almost ten years – and I appreciate there are many people here who are not from the EU but from other European countries. Are they working, or do we need to do something different to protect businesses so they can get on with providing their services?

>> CHRIS SHERWOOD: It’s essential in order to provide the legal certainty which companies need to give a range of services across the Web. And to do so in, you know, in safety from being prosecuted from all sides for all sorts of things which are not done by them. So that is essential.

The problem is the way it is implemented in national law, and the way it’s interpreted by courts and by rights holders or by NGOs or whoever is attacking us for starting to –

>> NICHOLAS LANSMAN: It sounds like we need a bit more clarity. The law exists, the structure is there, but it’s the interpretation by courts and NGOs and so forth that might still be the problem.

>> CHRIS SHERWOOD: The fundamentals are right, but we are not helping by some of the details and how it’s implemented.

>> NICHOLAS LANSMAN: Thank you very much.

>> MAJA RAKOVIC: Do you have any concrete examples that you could point to or perhaps someone from the audience regarding national practices and the way it’s being implemented?

>> CHRIS SHERWOOD: Sure. So, we have had, for example, some court injunctions in the UK for defamation, which we think were clearly in breach of Article 15 of the e-commerce directive, which is among the liability provisions. So we had to go back to the court at our own expense to ask for those decisions to be changed.

And in the meantime, we’re in contempt of court if we don’t act on the court’s injunction. So, we’re in a very tricky legal position.

>> NICHOLAS LANSMAN: Thank you very much, Chris.

Are there any comments from the audience? Does everyone agree that the current structure, legally, works? Does anyone have a radical view? I think the microphone is coming down to you now. If you could just introduce yourselves. Say who you are and where you’re from.

>> AUDIENCE: Well, I’m Michael Rotert from EuroISPA as well.

>> NICHOLAS LANSMAN: I’ve never seen you before in my life!

>> MICHAEL ROTERT: But in this case, from the German ISP Association. There are elements which point in a totally different direction from what you were just mentioning. There are guidelines from the Netherlands saying the parties involved are also free to decide for themselves which information is considered undesirable, irrespective of the question of it being in conflict with the law, and so on and so forth. And there is also a new recommendation for public/private cooperation – currently a draft from the European Commission – to counter the dissemination of illegal content within the European Union.

And it covers notification by law enforcement authorities, complaint hotlines or other bodies duly authorized under national law to monitor Internet content. It also covers notification by citizens, which means you have knowledge no matter what you’re doing if these recommendations come through in this format. Thank you.

>> NICHOLAS LANSMAN: Thank you, Michael.

Any more comments from the audience, or does anyone want to ask a question?

Maja, have we got any remote text coming in?

>> MAJA RAKOVIC: Nothing specifically so far.

>> NICHOLAS LANSMAN: We have to be more provocative to encourage people from around the world to chip in. Question in the front?

>> JEROEN SCHOKKENBROEK: Just a question. In European law generally, the case you mentioned is not unique: the terms used in directives are not totally clear. And there is disparity in the implementation of directives across the EU 27 Member States.

The traditional, the normal route for lawyers, in order to establish more clarity, is to go to court and if necessary obtain a preliminary ruling from the Court of Justice in Luxembourg. Is there any attempt? Do ISPs go to court when faced with such restrictions? And is there any case pending at the moment? That’s my question.

>> NICHOLAS LANSMAN: Chris, have you been to court recently?

>> CHRIS SHERWOOD: Not personally!

I think obviously going to court is an expensive and time-consuming business. It involves teams of lawyers, both in-house lawyers and outside counsel. So, you know, what we prefer is a clear legal situation.

Another example I thought would be interesting is of course that in the US, with the – I think it’s the DMCA, the Digital Millennium Copyright Act – my understanding is, and my apologies to those more expert than I in the room, that people who come to ISPs or Web service providers with a request to take down allegedly illegal content actually then carry the liability themselves for a false claim; a claim which turns out to be incorrect. You asked for a picture to be removed because it’s infringing copyright, and actually your claim that it’s infringing copyright is incorrect, and you carry that liability rather than Yahoo. And that makes it much easier for us to help rights holders do their work. And it’s analogous to situations you could have in other areas outside of intellectual property rights.

And so if we could have a safe harbor for acting in good faith and if the people making requests of us could take their share of the liability, things would move much more smoothly.

>> NICHOLAS LANSMAN: Very good. Thank you.

>> AUDIENCE: I’m from Iceland. I would like to comment on something.

>> AUDIENCE: Thank you very much. Yes, my name is Elfa Yr Gylfadottir, coming from the Ministry of Culture in Iceland. I think what is being said here is quite provocative. I’m speaking from the Council of Europe perspective, where I’m also representing Iceland. What is being stated here is that we are facing an enormous chilling effect all over Europe. Which means that we are threatened with going to court in every case that can come up – and if Yahoo thinks it’s very expensive, what about everyone else? It has often been said that justice is like a five-star hotel: it’s open for everyone, but very few can afford it.

So the question here is what direction are we going?

>> NICHOLAS LANSMAN: So, what direction are we going? Chris Wilkinson is formerly from the European Commission, now from ISOC. Give us your view. What direction are we going in?

>> CHRISTOPHER WILKINSON: Thank you, Nick. I’m speaking personally because the Internet Society has not yet gotten into the details of this whole debate, but I’ve been following it for more than ten years.

I must say that I’m not sympathetic to the direction of increasing ISP liability for information that transits, for several reasons, some of which have already been mentioned. I think it results in the privatizing of police functions. It encourages the creation of administrative penalties, rather than judicial penalties. And it, in effect, bypasses citizens’ rights to legal protection.

Needless to say, as a result, I’m not very sympathetic to the potential consequences of the recent UK digital economy law and the HADOPI approach.

I think there is a qualitative difference with criminal content in terms of terrorism, financial fraud, pedophilia and so on; I think providers do cooperate with law enforcement whenever necessary. Copyright infringement ought to remain a civil offense. The attempts in Europe and internationally to criminalize copyright, first of all, seriously distort the results and seriously distort actual police priorities. The police have far more important things to do than to pursue minor infringements of copyright.

We have known for more than ten years that this problem was going to come up. Digital media have existed; we have been expecting it. And nevertheless, a number of measures to circumscribe the problems have not yet been taken, particularly by the industry. The industry does need an agreed encryption standard for protecting digital material: too much digital material which the industry claims is protected is in fact cast upon the waters of the Internet without any protection whatsoever.

We need new business models. And we have just had a discussion in the previous working group as to what kind of business models might emerge, in general, and particularly in the publishing industry.

And we must have a legal form of digital downloading, which is sufficiently well advertised and available that the general public and particularly the younger generations can use it and can afford to use it.

This, in my experience, though my experience is limited, I don’t think I ever succeeded in downloading a music track, but in my experience –

>> NICHOLAS LANSMAN: But you tried, Chris?

>> CHRISTOPHER WILKINSON: It was out of copyright. So that’s where I come out at this stage of the debate.

>> NICHOLAS LANSMAN: Thank you, Chris. Now we have a question that has come from afar.

>> MAJA RAKOVIC: Strasbourg. Not too far, but –

>> NICHOLAS LANSMAN: If you just want to read out the question.

>> MAJA RAKOVIC: There was a short discussion about the HADOPI law. Someone asked what the people here think of this law, and especially whether they think it’s proportionate, given that people can have their Internet connections cut.

>> So the question is to anybody, not to specific people. Ženet Mujic from the OSCE.

>> I’d like to answer the question with a question. Would it be right to end someone’s newspaper subscription because they have nicked three books in a store, or to deny them access to a library? Everyone should have the choice of how he or she receives information. And disconnecting someone’s Internet access for illegally downloading files would really be like stealing DVDs in a store and then having your TV set taken away.

>> NICHOLAS LANSMAN: Can I ask? I was also in workshop 3, and I know there were passionate rights holders there who are really at the end of their tether about protecting their content. Are there any rights holders who would like to comment? Just introduce yourself. Giacomo Mazzone.

>> GIACOMO MAZZONE: Our approach is different from that of the simple copyright owner. The mission for public broadcasting, for instance, is to bring content that we believe is of relevance to our audience, the citizens of the country we want to communicate with.

So, for us, there is a point where we have to balance the interest of giving the content maximum circulation with natural respect for the copyright owners.

So I don’t take so strong an approach that everyone has to be punished as if they had stolen a book in a bookstore. It’s quite different. Of course, it could be different if I had written a major novel; that I could understand.

>> NICHOLAS LANSMAN: The reason –

>> It’s a different point of view, even in the copyright domain.

>> NICHOLAS LANSMAN: Yes, there are different points of view. And that’s the problem with the definition of what is illegal and what is just unlawful. Copyright infringement is unlawful; it’s not criminal unless you take content and then resell it yourself, and then it is criminal.

But I think we have talked, and we’re moving now into what else do you do? We talked about liability. Now I’ll hand it over to Maja to move to the next section.

>> MAJA RAKOVIC: I think we should look at the video; it presents one point of view on blocking. This is just to discuss whether this approach is okay, or you think it’s completely wrong, or you think it’s actually very good. So, can we see the video, please.

(Video)

>> If you go online nowadays, everywhere you look there are Web sites full of child pornography. Pedophiles don’t use secret channels. No, they use easily accessible public Web sites. But we have a solution. We are going to put a big warning sign on every bad Web site. With this, we’re following a tactic that has been tried and tested in the real world for years: just look away. Of course, some critics say we should actually delete the sites, just like banks all over the world remove fraudulent phishing sites within a few hours. But most child porn Web sites are hosted in very poor countries. They just don’t have the resources. Requesting deletion here would be very difficult.

And blocking sites has another advantage. The organization behind the site gets an immediate warning.

This will help save the resources of police forces and the courts, because the criminals themselves will help out by removing their illegal images.

And there is another advantage to creating our own list of links to illegal Web sites. Eventually, even the pedophiles who disable our blocking system will find out which sites are definitely not allowed.

Of course, it will cost a lot of money to set up our system. Money that some people say could be invested in prevention or in therapy to help the victims of child pornography. But remember our multifunctional blocking system will do much more. It will be extended to help people around the world who are victims of gambling, terrorism, copyright infringement, criticism, political opposition, freedom of speech and democracy.

Even China and North Korea and Iran are using this successful Web site blocking system to protect their freedom and democracy. Let’s follow them into the future for a cleaner and safer Internet. Cleanternet for a cleaner and safer Internet, Cleanternet.org.

>> NICHOLAS LANSMAN: We thought you would like that. We have to add the warning that neither EuroDIG nor Maja herself condones that approach. It’s there to stimulate discussion.

What we will cover next is that there are certain countries that are looking at blocking, and some countries that have implemented it. And what we’re going to look at in this section is, if you do go ahead with blocking, how you do it in a way that is proportionate, that has safeguards and that is transparent, so that people are aware of what is being done.

>> MAJA RAKOVIC: Well, obviously, we might put these arguments about blocking, pro and con, in some historical perspective and background. Because there are a lot of arguments here: the technology is new, but the arguments are not new, they are old. Maybe Meryem Marzouki from European Digital Rights will tell us more about that.

>> MERYEM MARZOUKI: Thank you. Yes, I’m getting old; that’s probably the reason why the moderators are asking me to give some kind of historical background.

I’m representing European Digital Rights, which is a European umbrella organization with almost 30 member organizations from 20 different European countries. And our main objective is the defense of all human rights in the digital environment.

This video has said it all. What I find really good about the IGF, the European IGF, the global IGF, is that it brings new people into the discussion. But we are discussing very, very old issues.

The European Commission has proposed exactly one month ago, on March 29th, a new draft directive to fight against child abuse, including child pornography on the Internet.

And this proposal has a provision on blocking content, which means forbidding access to this content, wherever it is hosted, as defined by the moderator at the beginning of this session.

So, this is yet another proposal or instrument to fight child pornography, while we already have different instruments: the Council of Europe Convention, and cooperation – which is really working – among different law enforcement authorities; Europol is taking care of this. We have various instruments.

But just to give you an idea of the background, the first time we heard in Europe about blocking, or attempts at blocking, was in 1995. I don’t know if anyone remembers an ISP called CompuServe, and if anyone remembers the representative of CompuServe, whose head refused to give access to child pornography. This was in 1995. From 1996 onwards, in France, two representatives of the ISPs were under criminal prosecution for the same thing, for allegedly giving access to child pornography through Internet newsgroups. So this is not news.

And since then, since the mid ’90s, we have built up quite a big corpus of legislation and regulatory instruments at the European level, and I cannot agree with the Yahoo representative who says that the directive provides legal security for ISPs. This is not true, even for you, the ISPs.

And, in addition, this is what we – the digital rights NGOs – call privatized censorship. Because when you go to court, you have a lot of safeguards: you have a right to remedies and you have due process. When a notification is made to a European ISP – which is in general a private company – the ISP will simply evaluate what would be less costly for the company: to take down the Web site, or to say no, this is not illegal content and I will leave this content up.

So, this is not legal security for ISPs, for intermediaries in general, and this is not protection of human rights, especially the right to freedom of expression, freedom of information, also the right to privacy sometimes for citizens.

Now, we have this proposal for a directive at the European Union level. It is the hottest issue on the table. And we have put together a list of questions that we want to ask with regard to this initiative. I’m not going to list all of these questions; they will be on our Web site. But, first of all, the main question is: what is this new instrument for? Because we have other instruments, and we know about the countries where this kind of blocking has already been implemented – and this is not China, this is not Iran, this is not Tunisia. This is the most democratic part of Europe: Scandinavia. Blocking has been implemented in Denmark, in Sweden, in Norway. And in Denmark in 2005 there was a court case in which a Web site called Desire Detokue was blocked.

And the decision was made by the police and the action was taken by the ISP and the site was not illegal at all.

So this is one of the many problems with blocking. And I hope we will discuss more.

>> NICHOLAS LANSMAN: You mentioned collateral damage. I’ll call on John Carr, here for the European NGO Alliance for Child Safety Online, and ask you to tell us what is happening in the UK.

>> Well, in the UK, of course, if illegal child abuse images are detected as being present on any server within the UK, a notice will be served upon the hosting entity, the ISP or whoever it might be, and there has not been a case yet that I’m aware of where the image has not been deleted and removed from the server within 24 hours. And broadly speaking, within the UK at any rate, 24 hours is considered a reasonable amount of notice to give to an ISP company to remove content which is determined to be illegal. The problem, however, is that the vast majority of illegal child abuse images – 99.5 or 6 or 7 percent – are not hosted within the UK. And over many years, and I’ve been involved in this area now for 15 years, since 1995, despite notices being served on the corresponding ISPs overseas via the police service in those countries, the images have stayed up for months, sometimes even years, even after the relevant authorities of those countries have been notified of their existence.

And that’s why, in 2004, BT originally developed a system for blocking access to sites, wherever they happen to be in the world, that contained images which were identified as being illegal.

Now, from our point of view as a child protection agency, this is very definitely a second best. Any way you look at it, what we want is for the images to be deleted, to be removed, for the people who have published them to be investigated, caught, prosecuted, for the children in the images to be identified and rescued. We want all of those things as part of what should be done in every single case.

>> NICHOLAS LANSMAN: Where are the images mainly? You talked about them being in other countries. Do you have any idea, can you tell the audience where the images are?

>> AUDIENCE: Roughly half – the latest figures are coming out tomorrow from the British agency that I’m speaking about, the Internet Watch Foundation – roughly 50 percent are hosted in the USA and the other 50 percent come from other countries around the world, including different parts of Europe.

The IWF is a private agency; however, it has been held that every action and decision which it takes is subject to judicial scrutiny. So if anybody disagrees with a decision which the IWF might take, that decision is capable of being legally reviewed by a court. In that sense, they’re a quasi-judicial body. They also have an independent panel of experts, which includes at least one person from the free speech, civil liberties community, that scrutinizes the policy, the procedures, the practices.

This is an entirely independent agency, an independent group, rather, that looks at how the IWF works, so as to reassure people that the list and the way the list is produced is not being skewed to include anything which ought not to be there.

And if I can just take up the point that was made about that case in Denmark: nobody that I know is in favor of innocent sites being blocked. Absolutely not. It discredits and undermines the good things that these types of policies can do. But from a child protection point of view – my last three points – think of it this way: think of it first and foremost in relation to the children being depicted in the images. Think of it as a violation of the child’s right to privacy. If you don’t think of it in any other way, think of it as a gross violation of that child’s right to privacy. The child being depicted in the image is being raped or sexually abused without their consent. They cannot consent to the act, nor can they consent to the image. If you don’t think of it in any other way, think of it as a protective measure for that child’s right to privacy.

Think of it also as a way of prevention: there are statistics and data, which I can give you ad nauseam, showing that of the people who get involved in collecting and downloading these images, a higher proportion will go on to commit offenses themselves. That is an extra reason why it’s very important to try to stop the images continuing to be displayed.

These are the reasons why we support filtering and blocking. We would much rather there were no need for it. But after 15 years of being involved in this area of work, nothing else is working. Nothing else is having an effect. This is having an effect. That’s why we support it.

>> NICHOLAS LANSMAN: Thank you, John.

>> MAJA RAKOVIC: Just one quick point to respond.

>> AUDIENCE: Thank you. I’m sorry, just one quick word in response. For 15 years now, digital rights advocates have more or less been treated as people who don’t really believe that child abuse is a crime. I think, and we all think in this room, that child abuse is a crime. Not only a crime: a horrible crime.

The only difference between us is over the means by which we fight this. And I can assure you that associations of victims – children who were abused – are against blocking. So they are the victims, and they are against blocking. So let’s talk about the different means. Let’s talk about the efficiency. Let’s talk about the collateral damage. But let’s not talk about whether child abuse is a crime or not. It is a crime for all of us.

>> NICHOLAS LANSMAN: Thank you very much. I’m just going to say we have a lot of people who want to comment. Can you keep your comments brief? Name, what your job is, and then quick comments. So over there.

>> AUDIENCE: Hello? My name is Katitza Rodriguez. I’m the international rights director of the Electronic Frontier Foundation, and I’ll make a specific comment about something that happened to my own organization.

In the UK, there is some support for low-income families through the home access programme.

In the UK, the laptop is supposed to be used for education and creativity. Well, recently, we have received comments from a few families where our Web site, the Electronic Frontier Foundation’s, was labelled and blocked under “hackers” and other such labels, which I thought was funny, but actually it was not. And this is the kind of concern, because what we talk about on our Web site is creation and innovation, various rights, and how you can exercise your rights. So I just want to make this comment.

>> NICHOLAS LANSMAN: Thank you very much. We have a comment in the back. One there. And then one there. We have lots of hands going on up. So make your point quickly.

>> AUDIENCE: This is Nadine from the European Youth Forum. I would like to go back to an assumption we heard a couple of minutes ago, linking illegal downloading mainly to young people. I think illegal downloading is a decision every individual takes on his or her own. So it’s not linked to young people, not linked to the elderly; it’s not linked to any age.

It’s the same kind of decision as what shoes you wear in the morning or what food you eat for lunch. So, it’s a decision for an individual, not something linked to youth.

And what I want to say is that as for the few young people who do these illegal downloads, I think one day, when they earn the same money as you folks, they will pay for it, so this is not the main concern in that regard. Thank you.

>> NICHOLAS LANSMAN: Thank you. Say your name.

>> AUDIENCE: Andreas from European Digital Rights. I’d like to come back to the argument that it takes a lot of time to take down child pornography images overseas. I had the opportunity last year to talk with a representative of the Austrian police who is in charge of fighting child pornography. And he explained that the problem they want to solve with Internet blocking is that it takes three weeks to communicate with the US authorities and to get something – illegal content – taken off the net there. So the bureaucracy between Austria and the United States of America takes three weeks to actually get action.

On the other hand, if NGOs send e-mails to organizations overseas, experience shows that when the content is illegal, it is removed within a few hours.

So I think the problem that we need to solve is not necessarily how to block images or how to block access, but how to make efficient procedures of communication between law enforcement authorities and how to enable the police forces to communicate with each other and to have quick conversations.

And so I think this would be the more effective thing than blocking images. Thank you.

>> NICHOLAS LANSMAN: Thank you.

>> MAJA RAKOVIC: Thank you.

>> AUDIENCE: I’m Wolfgang. I’m teaching at the University of Aarhus, and my argument goes in the same direction. This is a big challenge for the improvement of intergovernmental action, because it is mainly a challenge for governments to speed up their processes. You know, if there is so much illegal content hosted in democratic countries, why is this continuing? I can report a case in Germany where law enforcement wanted to find out where the money was going. They identified certain services for which you had to pay 29.99 Euros, and they worked together with Visa and MasterCard, because you had to pay by credit card. Over one or two months they ended up with 150,000 transactions, and around 20,000 were related to one single bank account, where the money for this service went. And they figured out this was a bank account in the Philippines. And there is no agreement between the government of Germany and the government of the Philippines to deliver the data. So they got the first steps right – they found where the money was – but they were unable to find out who was behind the bank account, because there is a low level of intergovernmental cooperation. And if we talk about the role of governments, I think in this field there is huge room for improvement: governments should do their homework and work together, and not try to push problems onto other, nongovernmental stakeholders. It’s to the benefit of governments to respect the law.

>> MAJA RAKOVIC: What is the Council of Europe’s view on this?

>> Thank you. I’d like to make a comment. The question of what content is illegal is related to the substance of our discussion. There are, of course, European consensuses on a whole range of issues. We’re not only talking about child pornography; we’re talking about, for example, incitement to terrorism. And we will soon be talking about the sale of counterfeit medicine on the Internet: we are finalizing a Council of Europe convention which will criminalize offering counterfeit medicine through the Internet. The interest there is public health. With terrorism it is to prevent disorder and to protect the rights of others, and with child pornography it is to protect the rights and dignity of children.

These are all legitimate motives for restricting freedom of expression in general, and that is not different on the Internet.

Now, coming to the specific question of blocking, again, there is the human rights framework and freedom of expression. Of course such measures affect freedom of expression and the freedom to receive information; they affect both freedoms. Any such measures are restrictive, and therefore they should be exceptional. They should be justified by one of the legitimate aims, such as the ones that I mentioned, which are listed in the European Convention. And they must be prescribed by law.

I think this is a very important requirement. And many of the problems that we have been hearing about this afternoon are about law issues. The European Court has said that a law providing for such restrictions has to be precise. It has to be foreseeable. It has to be specific. And part of the problem, I think, that ISPs, for example, face is that laws may not be very precise. And if the laws are not precise, they can be absolutely killing in terms of freedom of expression, because the risk of abuse is huge.

We have tried to set out these requirements more specifically in a recommendation of 2008 to the governments of the Council of Europe member states, actually dealing with blocking; saying that any such forbidding of access to specific Internet content will restrict freedom of expression and information, requiring that such measures must respect the European Convention requirements, and that such action by states should only be taken if the filtering or blocking concerns specific and clearly identifiable content, where a competent national authority has taken the decision on its illegality, and provided the decision can be reviewed by an independent and impartial tribunal or regulatory body in accordance with Article 6 of the European Convention. The safeguards are absolutely key to this.

These are guidelines addressed to Member States, where public authorities block. What about the whole private sector that is obviously a partner in crime, in this?

I would draw attention to – we have no clear case law in the Council of Europe, but there is an interesting Swedish case about satellite reception. Someone wanted to mount a satellite dish, and the landlord said it was not acceptable because of the tenancy agreement and the local regulations.

There was a dispute before the civil courts in Sweden, and the couple was unsuccessful. They went to the Strasbourg court. The Strasbourg court said freedom of expression and the right to receive information is not only a negative duty for the state not to interfere; it contains a positive obligation for the state to actively protect these freedoms. Sweden had said this had nothing to do with the Swedish authorities; it was a private dispute between the landlord and the tenant. But the Court said no: it was Swedish law that made it lawful for the landlord to act the way he did, and it was the Swedish civil courts that upheld the tenancy agreement that didn’t allow the satellite dish. Therefore, Sweden was responsible under the European Convention.

Now, if you apply that logic to measures taken by private parties restricting access to content, it’s absolutely clear that this places obligations on states to make sure that their national law is not disproportionate, that there is no room for overbroad application, and that the circumstances for filtering and blocking are clearly defined.

>> NICHOLAS LANSMAN: Well, those were some safeguards from the Council of Europe. Michael, from ISPA, does that give you comfort there?

>> MICHAEL ROTERT: A little bit. But again, we have seen in the movie that banks are able to delete phishing sites within hours. We tried the same thing from the German association for notice and takedown, by talking directly to ISPs everywhere in the world, and we had the same experience: sites can be deleted within hours.

And I still believe that deleting these pictures is a much better protection for the victims than blocking, which just means not seeing them.

And this goes to the case of child pornography. But as the Council of Europe said, there are lots of other things – and who is the one putting terrorism on a list to block? Who will decide what terrorism is, for me or for someone else in any other country?

>> NICHOLAS LANSMAN: Well, it won’t be the ISPs deciding what is terrorism.

>> MICHAEL ROTERT: I mean, techies deciding what is terrorism? Never, ever.

>> NICHOLAS LANSMAN: There are so many people that want to speak. Keep the interventions as brief as you can.

>> AUDIENCE: I’m with the United Services Union. I just wanted to support that idea. I don’t understand: if you know the URL, why would you not delete? I just cannot accept this. We’re talking about democratic states. I cannot understand blocking instead of deleting.

And then there is one other issue: you said national authorities should define the list. I don’t understand why you want to bypass the courts. I don’t understand it.

So you open the door to saying that the police, some national authority, can make a list – and I don’t think that this is possible in a democratic state.

>> MAJA RAKOVIC: Thank you. Now, what are the problems that need further regulation or that are being taken into consideration? For instance, could you tell us something about the Nordic countries – what content are you actually considering for blocking and filtering?

>> AUDIENCE: Yes. Thank you. I think I will take the floor to explain what is actually going on, and the reasoning, in the Nordic countries, since this was made public here. Those of you who have been to the Scandinavian countries may know that public health is a very big point there: alcohol advertising is forbidden, as is marketing towards children, and so forth. There is also what is called responsible gambling and gaming, which means that there are national monopolies, and part of the money that is gained goes into the health system, because obviously there are gambling problems involved here, and addicts who have issues.

And as governments, we are faced with demands all the time: cyber-bullying Web sites, child pornography, copyright content, and these gambling sites. And what has happened, for example, in Iceland – and this is the Ministry of Justice – is that they are now copying a model taken from Norway. To counter mostly UK-based Web sites, where there is a completely different view on public health issues where gambling and gaming are concerned, there is actually blocking – not of the sites themselves, but of the credit cards that are used on these Web sites.

So this is a law that is now in parliament and, as I said, it has already been done in Norway. The idea is that it becomes a little bit more difficult for those who have difficulties with gambling, since they would have to get a credit card elsewhere, not in their own country.

So these are some of the initiatives in blocking, dealing with public health.

>> NICHOLAS LANSMAN: Thank you very much for that. So we have got some interesting developments in Scandinavia. Now, we have two people here who came from a distance: Andrei Soldatov, a journalist in Russia, and Avniye Tansug, who has come a long way. So we will take those interventions and then comments from the audience.

>> ANDREI SOLDATOV: I’m the editor of a Web site. It’s mostly about the secret services and terrorism issues. Seven years ago, it was difficult to see that there was a system of Internet blocking, but now this system is fully operational. The process began in July 2007, when local ISPs in Siberia were made to close down access to four Web sites operated by (inaudible), and this decision was made operational only because of new counter-extremism provisions approved in 2000. For all four Web sites, they produced a list, and this list was approved by Moscow.

But the problem is not the Web sites as such; it is specific materials published on these Web sites. So there was a decision to close down access to these materials, which might be illegal, but technically it’s impossible to block only these stories or articles or video clips.

So you have to block the Web site as a whole, and that is far more controversial.

And I have to say that this is more about censorship than about protecting the population from terrorist propaganda.

I should give you some context to explain this issue. For most of my career I have covered terrorist attacks, and I should say that for the last six or eight years the Kremlin has done its best to prevent journalists from getting information. For example, under the law on counter-terrorism adopted in 2006, any village might be declared a counter-terrorist operation zone, and it becomes impossible for journalists to get there. And I’m not talking about terrorists; I’m talking about victims. So for us as journalists, the Web sites of terrorists can sometimes be a good source of information. This is exactly what happened after the recent attack on the Moscow metro, in which two trains were bombed almost simultaneously in March. From the beginning, some correspondents started to circulate claims that these attacks might have been carried out by the Russian secret services.

These stories circulated until a statement claiming responsibility was posted on one of those Web sites, and then the picture became clearer.

I have to say that this practice has expanded over the years, and Russian ISPs are now encouraged by the authorities to close down access to Web sites on their own. On April 18, the Russian state communications agency sent a special letter to some of the ISPs, asking them to close down access to certain Web sites on their own initiative. In this situation, Internet service providers who are keen to cooperate with the government have started to shut down Web sites, including, for example, some opposition Web sites. This has become a very big problem for freedom of the press.

Another problem concerns forums. Most of the biggest opposition Web sites in Russia have their own forums, which are only lightly moderated. The problem is that under the new law, anyone can post sensitive material on a forum, and this can then be used as grounds to close down the Web site as a whole. This has become a reason for opposition Web sites to shut down their forums.

>> NICHOLAS LANSMAN: I’ll hand over now to Avniye. That speaker was Andrei Soldatov, and now we go to Avniye Tansug.

>> AVNIYE TANSUG: Good afternoon. I’m representing Cyber-Rights.Org.TR, the Turkish branch, a non-profit organization established to protect the interests of all honest, law-abiding Internet users.

Last January, the OSCE Representative on Freedom of the Media asked the Turkish authorities to bring the current Internet law into line with OSCE commitments and other international standards on freedom of expression.

An OSCE report on Turkey and Internet censorship, which you can already find on the EuroDIG Web site, documented that 3,700 Web sites are blocked from Turkey under the current Internet law.

The law enables the courts to issue judicial blocking orders, and also enables the Telecommunications Presidency, TIB for short, to issue administrative blocking orders ex officio for a catalogue of crimes. A significant number of foreign Web sites, as well as Web sites in Turkish or addressing Turkey-related issues, have been subject to such orders since the Internet law came into force. This mainly affects news sites dealing with south-eastern Turkey, and it includes two Web sites which combined form the largest online gay community in Turkey.

Access to popular Web 2.0-based services has also been blocked on the basis of intellectual property law. So the deficiencies affect not only the freedom to impart and receive information, but also the right to a fair trial.

We have also witnessed a lack of judicial and administrative transparency regarding blocking orders issued by the courts and TIB.

Furthermore, TIB has not published blocking statistics since May 2009, which is a step backward. At least 200 Web site blocking decisions fall outside the scope of the current Internet law. The reason for this extension is unknown, as the statistics have not been published for almost a year.

We constantly argue that this could be a breach of Article 10 of the European Convention on Human Rights if blocking or filtering is used at state level to silence politically motivated speech on the Internet, if the criteria for blocking or filtering are secret, or if the decisions of the administrative bodies are not made publicly available for legal challenge.

Based on the ongoing censorship of the YouTube Web site for almost two years, since May 2008, an appeal was launched with the European Court of Human Rights. The director of Cyber-Rights.Org did the same regarding the blocking of Last.fm in Turkey by a public prosecutor’s office under the Law on Intellectual and Artistic Works.

>> MAJA RAKOVIC: Speaking about statistics, Google released details about how often countries around the world ask it to hand over user data and to censor information, with figures for each country showing how many times they do it. Brazil tops the list with 3,663 data requests, then comes the US, and in third place is the UK – so a European country as well.

Thank you very much.

Now we are almost coming to an end. Can we hear more comments? Are there any other comments before we sum up this session? Yes, please.

>> AUDIENCE: Yes. I work for the Organization for Security and Co-operation in Europe, for the Representative on Freedom of the Media who was just mentioned – he is, in fact, now a she. It is an intergovernmental media watchdog, and we monitor media freedom in 56 participating states. I want to bring up two issues that were not mentioned much. The first is related to blocking: we do not encourage states to block; we favour the take-down approach. Blocking disguises the symptoms of a problem and hides them from the public sphere; it distorts reality. So we promote education and awareness-raising, and I haven’t heard much of this – examples of how to educate or how to raise awareness, starting with children, maybe.

The second thing is related to self-regulation. We do encourage self-regulatory systems on the Internet. However, I have to add a note of caution: self-regulatory systems are not always transparent, and they are only effective if they are independent and make their policies very transparent. I personally haven’t seen any statistics from ISPs, for example, on how they decide to take down certain content and when not to. Ninety percent of the take-down requests they receive might be straightforward – the content is illegal – but the remaining 10 percent is what makes the procedure very problematic.

>> NICHOLAS LANSMAN: Thank you. A couple of quick comments up there, and then we have a comment coming from somewhere or other.

>> AUDIENCE: I’m Roelof Meijer. I’m with the Dutch ccTLD registry. I’ve been working in this sector for five years, which is probably far shorter than many of you, but I’m still often amazed at the way this sector, on certain subjects, seems detached from the real world. We do not stop arresting suspected criminals just because there are countries where people are arrested and imprisoned without any good reason.

Of course, locating and deleting the criminal content, and arresting and prosecuting the people who produce it, is the best option. But we all know that it doesn’t work in all cases.

In the Netherlands we introduced two different schemes; I’ll give you examples. One is what we call a code of conduct for notice and take-down. It is specifically aimed at things that have to be deleted quickly, like phishing sites. The gentleman from EuroISPA – he is gone now – said that banks are able to delete Web sites in a few hours. They cannot delete Web sites unless the sites are their own, and for the registry there is no way to get a court order within 24 hours, so the sector has to take its responsibility there. I think it’s the same with, for instance, the images that come from countries with which the Netherlands has no legal treaty: our police forces cannot do anything then. Of course it’s better that somebody deletes the content and prosecutes the producer, but if that is not possible, I suppose you have a good excuse to start educating the users, or maybe the producers, and hope that the problem disappears by itself.

>> NICHOLAS LANSMAN: We will now hear a quick comment about Norway – we are covering a wide stretch here.

>> Thank you. I’m from the Norwegian Post and Telecommunications Authority, the Norwegian regulator. We had meetings with the police and law enforcement in 2004, and the cooperation took off in 2005. We encouraged cooperation between the ISPs and law enforcement on this blocking activity with regard to child pornography, and we consider this a self-regulatory activity. The Ministry of Justice even stated, in connection with amendments to the criminal law a couple of years back, that if this mechanism did not work, they would have to investigate the possibility of legislation. Just a quick comment. Thanks.

>> MAJA RAKOVIC: Before that, can we hear from the remote hub about their questions?

>> AUDIENCE: Yes. There has been a discussion about pornography, and there was a comment asking what is actually being done in society to prevent child pornography, especially, from being created in the first place. They are saying society should do more before this material is even put online.

And then there is another comment, from the Armenian hub, which just wanted to raise the problem of access in general. They know it’s not part of this session, but they wanted to say there is also a huge problem with alphabets: a lot of information is not available in the alphabets of some countries. They wanted to hear people’s views on that, and to remind everybody that this is still a problem.

>> MAJA RAKOVIC: Okay, thank you. That is certainly a very important question, but it falls slightly outside the scope of this session’s focus on content policy. Perhaps in further discussions about content policy we could consider focusing on that question. Thank you.

>> NICHOLAS LANSMAN: If I can –

>> AUDIENCE: If I can comment again, just to respond to a point made by someone in the audience, addressed to the Council of Europe, about what I said earlier.

I did not suggest – I think maybe it was a misunderstanding – I did not suggest that police authorities should be able to block access arbitrarily. That is certainly not the line of the Council of Europe. When I spoke of a competent national authority, that means an authority prescribed by law: there is a law that designates the competent authority and its power to take such measures. And the law should be very clear, to avoid arbitrariness; it should define the specifics.

>> NICHOLAS LANSMAN: Thank you.

>> MAJA RAKOVIC: Before that, we have another comment and then we are summing up.

>> AUDIENCE: To respond to the gentleman from the Netherlands: I didn’t mean that. If you can’t take down child pornographic content, then you should be educating children and adults in how to deal with the Internet. But apart from child pornography, which is a very clear-cut issue, there are so many issues that are not clear cut, like hate speech, which we treat very differently across Europe. Very often western European approaches are used in eastern Europe: you have Russia and Kazakhstan, which came up with extremism laws without defining extremism, and you have opposition media being closed down on that basis. We think the EU countries should come forward with positive examples of how to deal with the Internet, so that their approaches are not used as a pretext by, let’s say, emerging democracies.

>> NICHOLAS LANSMAN: Thank you.

>> MAJA RAKOVIC: Okay, just a short intervention, because we will miss the reception and people are starting to leave. So, one word, and then we need to sum up.

>> AUDIENCE: I’ll just reiterate my point. The fact that other people, other countries, other regimes are wrongly following an example shouldn’t stop you from behaving in the right way. As I said, the fact that in certain countries people are arrested and put in prison without any cause shouldn’t stop us from arresting those whom our own legal system considers criminals.

But I’m not arguing with that: the content has to be criminal, or at least illegal. And that’s why we have laws.

>> MAJA RAKOVIC: Thank you. Now, coming back to our Rapporteur, Michael Truppe: what do you think, having heard the discussions? Have we answered the question in the title of this plenary, “Online content policies in Europe – where are we going?”

>> This is tough.

>> MICHAEL TRUPPE: I think I have one minute, but if you will allow me, I’ll take three. The first question is what direction European content policies are heading in. I would like to divide it into the liability issue and the blocking issue, if you allow me.

As regards liability, it appears that the legal framework is stable at the moment. What we see, and what we have been told, is that the role of the service providers is becoming more difficult, and there is a tendency to criminalize infringements, particularly copyright infringements. New sanctions are being introduced, and one question is whether or not they are proportionate.

As regards blocking, we heard that at the EU level there is a readiness to block access to child pornography, and other instruments are being updated in this respect. The debate is not new, and at the Council of Europe level we have been told that a special recommendation and standard-setting work are on their way. And a number of countries have introduced legislation on, or even public/private partnerships for, the blocking of content.

So, is there a common direction in Europe? Well, that’s not so easy to say. I would say yes, to some extent.

As regards liability, we already mentioned that the e-Commerce Directive is still the relevant legal act, and it should provide a sort of balance for the service providers. As for the rest – the sanctions – I think we are not in a position to identify a real common European approach; it varies largely from country to country.

The same applies to blocking. Standard-setting has been done to some extent at the pan-European level; the Council of Europe was mentioned twice. At the national level, I think the practices are still very heterogeneous: we heard examples from western, eastern and Scandinavian countries where the practices vary considerably in their details.

So now to the more controversial part: is it the right direction, and if not, what should be changed and how? On liability, I think it is not yet clear what would really make it easier for service providers to determine what position they are in.

As was said, the responsibility should not be left to the ISPs themselves, and the courts should have an important role in the whole debate in deciding on the legality of content.

The open questions are the liability of service providers and the proportionality of the sanctions provided for infringements. Now, blocking was the core of the debate we had over the last hour and a half. The direction of blocking access itself can be questioned, particularly because other instruments exist; I think that was one of the core messages to come out. There was quite extensive reference to practical problems with blacklists, and their efficiency has been questioned. But it should also be mentioned that some interventions suggested blocking could be the second-best solution, and that it does have an effect compared with other measures.

The core question for me remains proportionality; that would be the key word for the whole issue.

Bureaucracy seems to be a problem at the level of cooperation between governments, and I think there was general agreement – at least I noticed no objections – that enhancing cooperation on taking down illegal content, as the preferred way of combating illegal content online, could be one of the most important fields of work for the future.

A particular issue seems to be Web 2.0 content, where it is more difficult to assess illegality and to take appropriate measures. And I think it is no exaggeration to say that many interventions demanded improvements in the blocking measures being taken.

So my conclusion would be that there is no clear path yet, but it appears that international cooperation on taking down content, in particular, could be strengthened in the future. Thank you very much.

>> NICHOLAS LANSMAN: Thank you, Michael. Very well done.

(Applause)

I think you’ll agree it’s been a lively debate. We heard from government, NGOs, industry and others; we had a wide cross-section of views; and we heard from countries as far afield as Russia, Turkey and Armenia. It’s been good.

I have an update on the reception tonight. There was a note saying it would be first come, first served, but they have laid on more room and more drinks, and we’re all invited. That’s the good news. The bad news is that it takes about 50 minutes to get there by metro; we need to get to the metro station Sol. You should have in this morning’s pack – or there are more copies outside – all the instructions on how to get to City Hall, where the reception is taking place. It will be fun.

With that, I want to thank all the people who contributed questions, all the people who participated remotely, the people who did the transcript, which was superb, and all of you for attending. We look forward to the next session.

(Applause)

And over to Lee.

>> LEE HIBBARD: Just one point: any of you who are interested in the future of EuroDIG, there is going to be a formal donors meeting from 8:00 to 8:45 in the restaurant area where you had lunch today. If you’re interested, come along. Thank you.

>> NICHOLAS LANSMAN: Thank you. You are all free to go and get drunk. Bye-bye.