Your Freedom of Expression vs. mine? Who is in control? – WS 10 2018
6 June 2018 | 14:00-15:30 | CENTRAL ROOM | [[image:Icons_live_20px.png | YouTube video | link=https://youtu.be/_6rM_zNamac]]
Working title: <big>'''Your Freedom of Expression vs. mine? Gatekeepers are gone, is regulation the answer?'''</big><br /><br />
<br />
== <span class="dateline">Get involved!</span> ==
[[Consolidated programme 2018| '''Consolidated programme 2018 overview''']]
You are invited to become a member of the session Org Team by subscribing to the [https://list.eurodig.org/mailman/listinfo/ws10 '''mailing list'''].
If you would just like to leave a comment feel free to use the [[{{TALKPAGENAME}} | discussion]]-page here at the wiki. Please contact [mailto:wiki@eurodig.org '''wiki@eurodig.org'''] to get access to the wiki.
 
== Session teaser ==
This workshop looks at the various issues discussed at EuroDIG 2018 through the prism of freedom of expression.
Freedom of expression remains a central tenet of every democratic society. Obviously, there are limits to that freedom, and they may vary from country to country.
But overall, are the limits getting wider or narrower? And what specifically has been the impact of the huge changes brought about by technical development? Have they been beneficial or detrimental - or both?
We also need to ask, who are those setting limits and controlling our ability to express ourselves freely online?
Are governments trying to curtail freedom of speech more or less than they used to do?
Are internet intermediaries, trying to keep their platforms clean from fake news and hate speech, adopting practices that restrict freedom of expression?  
Are attempts at copyright reform restricting freedom of expression – or, to the contrary, enabling original content creators to enjoy it?


== Keywords ==
Freedom of expression, regulation, intermediary liability, censorship and privacy, take downs, human rights.


== Session description ==


== Format ==
The format will loosely be the following:
* a.) Introduction from the moderator
* b.) Reporting from Plenary Session 2
* c.) Debate
* d.) Q&A with key participants
* e.) Open the discussion to the room
* f.) Wrap-up


== Further reading ==

== People ==
'''''Please provide name and institution for all people you list here.'''''


'''Focal Point:'''
*Esmeralda Moscatelli, Policy and Research Officer (IFLA)


'''Subject Matter Expert (SME):'''
*Yrjö Länsipuro – ISOC Finland

'''Organising Team (Org Team)'''
* Hamlet Simonyan, Yerevan State University Student Council
* Eduardo Santos, Defesa dos Direitos Digitais
* Ucha Seturi, Small and Medium Telecom Operator's Association of Georgia
* Natalia Filina, EURALO
* Ketevan Kochladze, Youth and Environment Europe (YEE)
* Anna Romandash, Digital Communications
* Elisabeth Schauermann


'''Key Participants'''
* Giacome Mazzone - European Broadcasting Union - Reporting from Plenary 2
* Irina Drexler - No Hate Speech Movement - Freedom of Expression in Romania
* Professor Wolfgang Benedek - University of Graz - Challenges for intermediaries respecting freedom of expression
* Natalia Mileszyk - Communia Association - Copyright as a tool to restrict FoE in the light of ongoing EU reform?
* Pearse O'Donohue - European Commission - How to tackle illegal content while ensuring freedom of expression and information
* Giorgi Gvimradze - Georgia Broadcasting Corporation - Religious Feelings VS Freedom of Expression


'''Moderator'''
* Cristian Urse - Head of Council of Europe Office in Georgia


'''Remote Moderator'''


'''Reporter'''
*Aida Mahmutovic


== Current discussion, conference calls, schedules and minutes ==


== Messages ==
*There should be greater transparency in how algorithms are developed, and a public debate on the approaches taken by private companies.
*When discussing ways to tackle disinformation, we need to assess the implications for both democracy and freedom of expression.
*Quality journalism is essential for maintaining democracy.
*Finding the right balance between disinformation and freedom of expression is important, and educational awareness raising is needed in order to achieve this balance and to make people understand the information they receive.
 
Find an independent report of the session from the Geneva Internet Platform Digital Watch Observatory at https://dig.watch/resources/your-freedom-expression-vs-mine-who-control


== Video record ==
https://youtu.be/_6rM_zNamac


== Transcript ==
Provided by: Caption First, Inc. P.O Box 3066. Monument, CO 80132, Phone: +001-877-825-5234, +001-719-481-9835, www.captionfirst.com
 
 
''This text is based on live transcription. Communication Access Realtime Translation (CART), captioning, and/or live transcription are provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings. This text is not to be distributed or used in any way that may violate copyright law.''
 
 
>> CRISTIAN URSE: Good afternoon, everyone. Welcome to this session, Your freedom of expression vs mine? Who is in control? My name is Cristian, and I'm going to be the Moderator of this working group this afternoon. I hope the technical issues are settled. I'm very pleased to introduce to you our speakers today.
 
At my right, Professor Wolfgang Benedek from the University of Graz. He's a former Director of the Institute of International Law and International Relations at the University of Graz, also a lecturer at the Diplomatic Academy of Vienna and the European Master Programme on Human Rights and Democracy in Venice. And I'm also glad to mention that Professor Benedek is a member of the Council of Europe expert group that drafted the Council's Guide on human rights for Internet users.
 
We also have Mr. Pearse O'Donohue from the European Commission, Director for the Future Networks Directorate of DG CONNECT, dealing with policy development and research supporting the Digital Single Market.
 
Also with us today is Natalia Mileszyk, a lawyer and public policy expert working on copyright reform and openness.
 
Irina Drexler, the National Coordinator of the No Hate Speech campaign of Romania, will speak about a recent report on freedom of expression in Romania.
 
And last but not least, it's Giorgi Gvimradze from the Public Broadcaster of Georgia, the Director of News and Current Affairs and also an Advisor to the Director-General of the public broadcaster. And, I believe also relevant for the topic he'll tackle, the one of religious feelings versus freedom of expression, he's a graduate of the Theological Academy of Tbilisi. So I think that adds to the ingredients we need for this session.
 
But before going to the speakers, we will hear from -- sorry -- from Giacome -- I'm a bit lost in my papers.
 
>> Can I introduce myself?
 
>> CRISTIAN URSE: That's relying too much on papers, so Giacome Mazzone, he worked in public service broadcasting and media for 20 years. I can pick out and mention here international companies like EuroNews, EuroSport. Giacome will brief us about the Plenary Session of yesterday as an introduction to the debate that we will have. Please.
 
>> GIACOME MAZZONE: Thank you very much, especially for the discount you gave me, because I have worked for 30 years, but 10 years less makes me younger, so good.
 
Not true, fake news.
 
And I have to report about yesterday's session about fake news, so I want to debunk the first fake news you gave. The session yesterday was, I think, very interesting, and gave some hints that are important to hear before going into the discussion of today.
 
I think that what was important yesterday is that Claudia Lucian, in the keynote speech at the very beginning of the session, said that fake news and misinformation have a big impact on democracy and on citizens' lives, so they need to be tackled because of that.
 
Claudia said that for the Commission itself, finding effective remedies without hurting respect for human rights is crucial for European values.
 
There was another consideration about the debunking activity, the source checking, that needs to be financed. This is an activity that has to be financed. He said that this could be something that could become in itself a source of revenue. For the moment it's only a source of cost for the traditional media.
 
He said this is an activity that needs to be done hand in hand with the platforms because if not, it is not effective. He raised the question: who controls the controllers? And he also gave an answer, saying that peer to peer could be the best solution.
 
And he said that the work the Commission has started will be implemented between the end of this year and the beginning of next year, and the European elections of 2019 will be the perfect test of all the activity the Commission has put in place.
 
Ana Kakalashvili said that we need to differentiate remedies, because in the same information disorder box we have fake news and propaganda, and the two probably need to be curated in different ways and need different answers.
 
Patrick Penninckx, from the Council of Europe, provoked the audience a little bit, saying that in 22 out of 28 countries media are the least trusted institutions, and so he said that this leaves room for improvement. He also said that only 1 out of 4 trust social media as a source of information. Probably because I was in the room he mentioned that the public service media are doing better, and are even improving in the barometer; that was probably just for me, I don't know.
 
And he said that we as a society need to face the news skimmers and the news avoiders, which are relevant phenomena: the people that have a very small view of the news and don't go at all in depth, and the avoiders, those that don't go to the news at all because they don't find the news interesting or relevant for their life. Both are very worrying phenomena.
 
And he said that the answer for many of these people that are skimming and avoiding is cocooning, so that they look into media or into messages that reassure them in their own positions. And this is of course a disaster for society, because it means that you will not go for the best option, but for the option that you like more, even if it's not the right one.
 
Google was present in the debate, and they gave us a long list of possible technical solutions to the problems. It seems that they look only into algorithmic solutions or technical implementations that could solve the issue. Not everybody in the room was in agreement with that.
 
And then finally, I think the final conclusion of the reporter of the session was to remind us of the reason for all the debate: democracy is only about citizens making informed choices. And if I can add something on my side, when I started to study journalism 20 years ago as you said, but let's say more, the most important lesson that I learned is that you never have to tell your viewers or your listeners or your readers only what they want to hear. You have to report to them especially the things that they don't want to hear, and I think that this could --
 
[ Off microphone ]
 
Exactly. Thank you.
 
>> CRISTIAN URSE: Thank you very much for that summary, and of course I apologize for mixing up the numbers.
 
[Laughter]
 
Let us get into the debate, and before giving the floor to the first speaker, let me also mention that we will have as an informed speaker from the floor someone from the Council of Europe's No Hate Speech campaign, and then a word about the organisation of this work.
 
We will give the speakers somewhere between 5 and 7 minutes to make their intervention, after which we will open the floor for questions. We'll try to get as many interventions as possible. Please bear in mind -- let's bear in mind all together -- the size of the group and the time that we have, approximately 90 minutes -- well, 80 from now -- altogether, in order to allow as much participation as we can, and also to be able in the end to arrive at a couple of conclusions.
 
So I will go first to Irina Drexler on the country report on freedom of expression in Romania, please.
 
>> IRINA DREXLER: Thank you, yes. Thank you. Hello everyone. So yes, my name is Irina Drexler. Among other things I'm an independent researcher on human rights with a focus on pragmatics, rhetoric and discourse analysis, and also Internet Governance, but my role here is as the National Coordinator of the No Hate Speech Movement campaign of the Council of Europe in Romania. This campaign was also implemented in Romania, so I'll tell a bit about the context of the report I'm going to briefly present to you. The No Hate Speech Movement campaign in Romania was initiated by the Ministry of Youth and Sports with 9 NGOs in Romania active on human rights.
 
Among these 9 NGOs there is one called Active Watch, which is monitoring the press in Romania. The organisation basically advocates for human rights and for freedom of expression, anti-discrimination and media education. These are the three pillars and in their work, they are developing each year two reports, one on freedom of expression with special attention to the freedom of expression of the media in Romania and another one which is connected is a yearly report on the State of hate speech and intolerant speech in Romania.
 
So briefly, the report is 200 pages long. I'm not going to present all the cases they have come up with regarding the state of freedom of expression in Romania in the previous year, 2017, but it's a newly launched report, published on the occasion of World Press Freedom Day, so at the beginning of May.
 
If I were to summarize, the main idea of this report is that the media in Romania has greatly contributed to the polarization of society: on the one side we have the politicians and the State institutions, and on the other side we have the citizens. There is a tension that the authors from the Active Watch organisation notice and state in this report, which arose immediately after the new elections in Romania in December 2016. After these elections in December, as you may remember from the international press, we had huge protests in Romania, which amounted to some 600,000 people in the streets protesting, among other things, against corruption and new developments in the political sphere in Romania, but also against the silencing of journalists who were investigating corruption under the new government.
 
And the polarization is not just between the institutions of the State and the citizens, but also within the press, between the good press and the bad press so to say, or the press of the Government and the press of the opposition, and the authors remarked that there is no real press for the citizens and for the public interest of the country and of its citizens.
 
One of the first initiatives of the State institutions that is remarked upon in the report is that public gatherings of people trying to protest against this information and against the initiatives of the Government have been discouraged, including by sanctions given by the State-controlled police, and the protests have been described in the press as manipulation using sources from outside. It was a conspiracy theory, which is a common theme in other countries as well.
 
And also with regard to the social networks, the State, or the police, and some courts have ruled that the press needs to completely take down all articles that criticize the ruling parties in the Government too harshly, so it was a case of the freedom of expression of journalists being attacked.
 
Also, the report notices that, just like the national television and the national radio in Romania, which faced a leadership change once the new elections took place, or once the new Government was installed, the National Press Agency was also targeted by a similar initiative trying to have its leadership changed to a more favorable one for the new coalition governing the country.
 
But on the other side, on the plus side, there is also a more active role that the National Council on audio-visual has been taking in the period that the report is looking into, in 2017. However, despite this institution being more active, there is no predictability in the sanctions that it gives to radio stations and TV stations. The sanctions are not predictable and are also disproportionate, depending on who is doing or saying what.
 
The report also mentions pressure on journalists, in the sense that, for instance, investigative journalists have been attacked, or their parents have been attacked, or cities have been sprayed with graffiti against journalists who are doing investigative work on both the opposition and the Government, but especially against the ones who are criticizing the decisions and initiatives of the Coalition.
 
And also there have been more controls, financial controls, of the publications that are being critical. And in terms of the public discourse, because of the trends in mass media, there has been a radicalization that the authors of the report have noticed, with sanctions also at the level of Civil Society actors who are trying to be active. Because I have only one minute I will not go into the other findings of the report, but just to mention that this report is connected to the report on hate speech in Romania, and I will be available to speak more about this and the No Hate Speech initiative with regard to freedom of expression in society in general. Thank you.
 
>> CRISTIAN URSE: Thank you very much, Irina, for that. I think it's a good opening also to start from a specific case rather than having it at the end. I'll pass the floor to Natalia Mileszyk on the copyright of freedom of expression.
 
>> NATALIA MILESZYK: There was a question mark after the statement, of course. Hello everyone. It is good to be here. Since I was so nicely introduced I will skip this part and go directly to the question that probably interests you: why are we talking about copyright during this session on freedom of expression? Of course, for many of us in this room copyright might be perceived as only a property right, an ownership right that protects creativity and those who should benefit and earn money from their creativity. But on the other side you can also look at copyright as a way of restricting and limiting somebody else's ability and possibility to publish things online, for example, so restricting in a way freedom of expression. The question is whether that is legitimate or not.
 
And the European Court of Human Rights has stated many times, in many decisions, that copyright presents a risk to freedom of expression. I was thinking about an example to share with you, and I didn't want to become too political, so I decided to go for a non-European example. If you Google or Bing, or whichever search engine you use, "notice and take down" and "the President of Ecuador", you will find plenty of articles showing how notice and take down mechanisms, the mechanism of enforcement of copyright online, were used for several years to take down commentaries against the President of Ecuador. So it's happening.
 
Coming back to the ongoing European Union copyright reform, I want to underline that it's ongoing, and that's why I will focus on the general concepts that were introduced. I won't go into very detailed legislative issues. I have to apologize to those of you who are really experts in copyright. I'm looking at two gentlemen to my left and to my right.
 
We simply don't have time for this, but I just wanted to show you how freedom of expression is visible in this copyright discussion, where the tensions are and which things we should care about and be very sensitive about.
 
So first of all, lawyers love to operate with numbers, so there is the so-called Article 13, where at the very beginning the idea was to implement a filtering obligation for platforms that host user-generated content. And I must ask you: who of you has ever posted something online that you created, generated, wrote, anything?
 
Only three people? Four people? No comments, no articles, like academic articles? No photos on Facebook? Guys, you don't share photos on Facebook? You do.
 
So imagine the situation where everything you post online must be scrutinized, must be checked before being made available online by any platform that hosts this kind of photos, articles, comments and so on, in order to avoid copyright infringement. Right now, we agree in the European Union that it should be something like notice and takedown: if there is a copyright infringement, you should let the platform know that something like this is happening, and then the platform has to take certain steps.
 
But with this proposal, of course, in the worst case scenario, and I want to underline I'm talking about the worst case scenario, platforms will be obliged to proactively monitor the content published online, so you can imagine what the outcome might be. And also the idea of automatic filters will be introduced, so it will usually not be humans behind it but artificial intelligence, probably, especially in the case of big companies.
 
So that's the first thing, and I want to let you know that at the end of November, 80 organisations from across the European Union sent a letter to officials in the European Union naming 29 other letters on this issue, so 29 other public letters were published before, saying how harmful it might be for freedom of expression if there is a private entity that has to monitor everything that you do online.
 
And the second very controversial Article is Article 11, which introduces new rights for press publishers. So besides copyright on the content, which is usually in the hands of creators or sometimes acquired by publishers, publishers will be granted an additional layer of protection, and they will be able to regulate who uses, and how you use, hyperlinks and snippets and short parts of the things published in their newspapers online. And of course, I'm saying again, I'm talking about the most radical proposals we've seen during this discussion.
 
And of course, we all agree that press publishers in the European Union right now are struggling, or are sinking, as somebody already said at a previous panel on copyright. But at the end of April, 169 copyright academics also wrote to European Union institutions saying that this solution, in this very radical form, can be very restrictive of freedom of expression.
 
And since I was already given notice that I have to finish in roughly 30 seconds, I just wanted to underline that this is an example of how a regulation that is seemingly not human rights related (copyright is placed in the European Union under the Digital Single Market agenda, so it's mostly a market-oriented programme to enforce and create a common digital market for the whole European Union) might affect various rights, in this case freedom of expression. And it shows that we should always be aware and very conscious about human rights whenever we proceed. So thanks all for this.
 
>> CRISTIAN URSE: Thank you very much, Natalia, and we turn now to Professor Wolfgang Benedek on challenges to intermediaries in regard to respecting freedom of expression.
 
>> WOLFGANG BENEDEK: Thank you very much. I would like to address some new challenges, not so new for you maybe, but I have to say that when I published, together with Matthias Kettemann, this book on Freedom of Expression and the Internet with the Council of Europe Publishing House five years ago, we had a different focus and different challenges in mind than what we see today. In the past, the Internet, or the Internet companies, were mainly seen as facilitators of free speech, of dissemination and so on, and today we discuss much more the problem of gatekeepers and of the responsibility or accountability of such intermediaries.
 
And so I would like to take two examples. The one would be the right to be forgotten, and Google in particular; the other one would be content management by Facebook. And the concern is that there are standards and procedures in place which lack transparency. There are decision makers where we know neither their qualifications, nor where they work from, how they do it and so on. And there are problems also with regard to the right to a remedy. This would not be so important if these platforms had not become of public interest, because they provide a public space in which we all operate.
 
Now, regarding the right to be forgotten or the right to erasure, there was the famous Google Spain case, but in the meantime we have the GDPR, which has also already been discussed here, and in this new regulation we find Article 17, which says that these restrictions shall not apply to the extent that the processing is necessary for exercising the right to freedom of expression and information. So this exception for the right to freedom of expression and information is very important.
 
What is unclear is its interpretation, and the GDPR clearly says that in the public interest, when it is about illegal claims and so on, then there has to be this emphasis on freedom of expression. It goes even further. In Article 85, there are certain exemptions seen for academic, artistic, or literary expression and so on, so the question is: How do platforms, social platforms like Google for example, implement that now in practice? If anybody can help me in understanding this better, I would be very grateful, because the public information existing is very limited.
 
Certainly we could have some ideas how this is being done, and I do not envy Google for this task, by the way. It is kind of the position of the EU and others that they should not interfere with their work, so it is kind of a self-regulation issue, but on the other hand, there are standards, in particular the ones I have mentioned.
 
There is also the issue of whether Google is delisting only in the EU or beyond, and as you may be aware, the Canadian Supreme Court ruled last year that certain search results have to be blocked worldwide, so here we also have different standards.
 
Then I'm going to say a few words on Facebook, and here the transparency report is very interesting -- but actually I wanted to say something on the transparency report of Google first. Google has published that 686,000 delisting requests have reached it, referring to 2.5 million URLs, of which 44% have been delisted. That should give us an idea that there is a very high delisting rate, but on what basis this delisting finally happened, and whether human rights have been taken into account, is uncertain. As we just heard about copyright, by the way, there have been delisting requests for 3.5 billion URLs on copyright grounds and 90% were delisted; again, this shows the different dimension which exists in that field.
 
Now on Facebook, and here I would like to pose three questions to you, which you can see here already, and I would ask you for your opinion, yes or no.
 
So should there be -- should information of this type be deleted or not? Poor black people should still sit at the back of the bus? Is this hate speech? Yes or no? Who says yes? Please raise your hand. Who says no?
 
Hardly anybody. That's interesting.
 
Actually, Facebook considers that this is not to be deleted, because race is a protected category but social class is not. 92% of people who were asked by the New York Times had a different opinion, and actually also my students, whom I asked about that.
 
Second example: White men are all assholes. Yes or no?
 
[Off microphone]
 
[Laughter]
 
It's not whether you agree with the statement. But whether it should be --
 
[Laughter]
 
So who says yes?
 
Who says no?
 
Okay. You are a bit split, I see. There are many undecided. In that case it was "yes," because race and sex are at stake.
 
And finally the third one: I never trust a Muslim immigrant. They are all thieves and robbers. Should that be taken out? Yes or no? Who says yes?
 
Many say yes.
 
Okay. You're not aware of the Google -- of the Facebook policy, because it is "no." Immigrants are protected only in the context of violence but not in the context of exclusion.
 
So what do I want to show with this? That these criteria are often applied in a counter-intuitive way. I do not want to say that we should always have what the people want, but I want to say that it's counter-intuitive, and also some other such counter-intuitive decisions have been corrected, like for example targeting black children, who are now better protected because the category of age has been included.
 
Now, this brings me to two more points, and this is, when you look at how this is done, then you realize that it is often done through algorithms and in a semi-automatic way. So the question now is: how can you develop the algorithm to take human rights, and freedom of expression anyway, better into account? And here I cannot but emphasize also the study of the Council of Europe on Algorithms and Human Rights, which you can find over there and which deals with this in a very good way, and also we just had a flash session on the other side of the road which also dealt with this very well.
 
So I cannot go into details, but the issue is the challenge of how to organise the self-regulation, or a semi-automatic decision-making procedure anyway, in a way that does not have a chilling effect on freedom of expression, and so that the process in general is foreseeable and hopefully also legal, in the sense that it takes human rights into account and maybe also provides an effective remedy.
 
So if you're put in the so-called Facebook jail because your account is blocked, you would like to know why, and you would like to know which remedies you can take, and this is not sufficiently the case.
 
I would like to end with another document which you can find over there, and this is the new Guidelines for Internet Intermediaries of the Council of Europe. The basis is actually the UN Guiding Principles on Business and Human Rights, and they apply this to gatekeepers, to intermediaries, and they come to certain conclusions and recommendations which are quite important when it comes to these companies. Because they say standards and practices of content moderation should be publicly available, and the user should be informed. Restrictions should be based on principles like the least restrictive approach and so on, and we have no information whether this is happening or not.
 
Now, I'm not against algorithms as such. I only recently found out that the European Court of Human Rights itself is using algorithms in filtering applications in order to find out whether there is a human rights violation or not, so if that is the case, then why not others as well?
 
And if we have in mind the large number of cases with which such companies are confronted, it is, I think, understandable that they use that. But the question indeed is how these algorithms are being developed, and to what extent there can be what we heard about in the other panel this morning, an efficient transparency, which does not mean that we have to know all the details of the algorithm, but that we have to be given a basic understanding, and maybe also the possibility to participate in the discussion of how it should be shaped. Thank you very much.
 
>> CRISTIAN URSE: Thank you very much for that insightful presentation. I think it linked very well with the previous one. Now we move to Pearse O'Donohue on how to tackle illegal content while ensuring freedom of expression and information.
 
>> PEARSE O'DONOHUE: Thank you very much, and good afternoon. It's very interesting to hear the other speakers. I would just like to first of all just do a quick inventory as to what we do in the European Union with regard to both illegal content but also of course in relation to disinformation which are two different categories of issue.
 
For years, our rules have been set by the e-Commerce Directive, which had the objective of helping the Internet to develop. We're talking about rules developed between 2000 and 2002, while promoting the freedom of speech, so it did put obligations on intermediaries, but it made sure that the intermediaries were not liable for content. The only situation in which they were liable for content was where they were clearly notified, and even then there were certain protections.
 
More recently, very recently in fact, and this is where in a way we now diverge between illegal content and other issues, the Commission issued a recommendation on measures to tackle illegal content online. That was driven by a number of different controversies, but would of course in the first instance be targeted, for example, at online material promoting terrorism or indeed hate, and/or child pornography or other issues. Now, it does of course in some cases touch on copyright, which I'll have to come back to because of the examples that were given to us.
 
And then very soon after that the Commission issued a policy communication on tackling online disinformation. That was in fact one of the subjects of the Plenary Session yesterday. I won't go into detail on that, but it was based on the report of the high level expert group which did help us to focus our thoughts a lot.
 
The last element that I would mention is actually the Charter of Fundamental Rights, one of the fundamental founding documents of the European Union, which is now, and has been for some years, an integral part of the treaties and is operative as well as being a statement of a key set of issues. What do I mean by that? For example, in the very famous Court of Justice ruling, the first Facebook case, the court based most of its ruling not on specific legislation or even on the Treaty but on elements of the Charter, so this is a dynamic and effective way of protecting different rights.
 
So we have that set of policies, but I must stress that while the e-Commerce Directive, the original piece, was a piece of legislation, it did not impose many obligations. We do now set out a set of overarching principles and objectives which are in line with the Charter, but the approach we then take is soft law and a voluntary approach, so self-regulatory initiatives, whether it's in how the e-Commerce intermediaries organise themselves or with regard to monitoring. There are no general monitoring obligations, and in fact our Vice President made it very clear there would not be general monitoring obligations.
 
Anecdotally, it's interesting to note that, as a former Prime Minister of a country that was under the Soviet Bloc until the early '90s, when we first discussed this issue with him a year ago, he said: this is very important, but there will be no Ministry of Truth. And this is also looking back to the Soviet era, in which we could not put ourselves in the position, or allow any Member State to put themselves in a position, in which the law decided what was disinformation, what was fake news, what was appropriate or not. We had to have an independent, transparent, and self-auditing process if we were to try and tackle online disinformation, because of the implications, the potentially serious implications, for freedom of expression.
 
So everything I've said confirms what we said we were talking about at the start: Your freedom of expression versus mine. It is that this is all a balancing act, that there is a very real tension between different Public Policy objectives and human rights, and even between those different Public Policy objectives.
 
So we can be criticized and hopefully will constantly be scrutinized for what we do to make sure that we get it right, and we have to be in a position to rapidly adjust if we do get it wrong. But we do have to do the impossible. We have to consider and then somehow get a balance between the implications of online disinformation for democracy versus the freedom of expression.
 
So in the summary report that we had of the Plenary Session, it was clear that we cannot legislate for or against that unfortunately large group of people who are not interested, who are not consumers of news or who actively avoid news, which leads to confirmation bias and to a lack of exposure to anything but the most sensational and probably manipulated news. But we can create the conditions so that we can maintain strong journalism and maintain a strong freedom of expression for those who are prepared to take the time to listen.
 
And then finally, as Natalia did mention it: Natalia, no, I am not an expert in copyright law. I was relieved when you said that you weren't going to talk about the EU example, but then you talked in detail about the EU example, and I'm not really the expert to respond to you. But clearly this is another example of this tension, where there are of course going to be problems between an outright ability to express oneself and the rights of holders of copyright, and also of those who work hard and invest in the creation of news and content, because we want to see a situation in which what I would call independent -- I know I heard a laugh when the phrase was used earlier on -- quality journalism is maintained and preserved.
 
That quality journalism is essential for maintaining democracy, and so this is where the inherent contradiction, or rather tension, arises: we need well resourced journalism so that we can actually have sustainability and independence in that journalism, whether it is printed or online or broadcast. And therefore their material needs to be protected. But the provisions, which I know you're an expert in, which the Commission proposed and which are still under negotiation -- so the Commission has slightly lost control of its text now -- did make provisions for the use of hyperlinks and snippets, and did recognize that there are legitimate uses which are exceptions to copyright. Parody, for example, is one of them, and that's very important, because in regimes where, as we've heard in an example, a head of Government is not happy with the way that he or she or the Government is being portrayed, parody of course is a very useful Civil Society way of agitating against undemocratic regimes. I'm just thinking back to IGF 2014, which was in Istanbul. I don't know if anyone else was there, but that's when the Turkish regime had shut down Twitter, because there were Twitter accounts which were exposing allegations of corruption among members of the Government and members of the families of members of the Government, and that didn't please the regime, so they shut it down.
 
And that's the sort of thing which we all want to avoid, but we do know that when it comes to legitimate journalism, we have to preserve that. It is essential in other words for Democratic society and accountability so even if it means that some people cannot reuse material, it's a justifiable cost in the view of the policymakers who decided on these proposals.
 
So that's really the point I wanted to make specifically in relation to copyright. Thank you.
 
>> CRISTIAN URSE: Thank you very much. Thank you. And now we are going to the last speaker, Giorgi Gvimradze, on religious feelings versus freedom of expression. My personal feeling is that this is a debate that is gaining weight in Georgia. Nowadays I think it's very timely to discuss it. Please.
 
>> GIORGI GVIMRADZE: I'm Giorgi Gvimradze, representing the Georgia Broadcasting Corporation, and it's my honor and pleasure to represent this organisation at this very interesting event. First of all, I would like to express my appreciation to the organisers of this event for giving me an opportunity to share with you my concerns and questions, which are not easy to answer, regarding the topic of freedom of expression versus religious feelings, even realizing that maybe it is the wrong approach to put religious feelings versus freedom of expression.
 
But anyway, I'm happy to say that this discussion has finally come to my country, even though we have been talking about it for a long time now. First of all, as a media representative, I should admit that there is no public space yet for a qualitative and academic discussion of this issue or of issues of similar importance, and that's why I was glad to come here and have you share your experience with us.
 
Believe me, there are several issues regarding the topic which should be defined correctly. While we are aware of and can more or less define what freedom of expression is, there is no clear understanding of what religious feelings are, and even more, it is hard to define the meaning of hurt feelings. To understand someone's hurt feelings is a matter of our own feelings; it is difficult to identify the level of hurt feelings.
 
To understand someone's feelings is a matter of our disposition, of our disposition toward the person or group of people. How much someone's feelings are hurt is also very difficult to tell. That's why I'm putting these questions to discuss them with you. As I mentioned before, we are talking about this a lot; not discussing, but talking too much.
 
Recently, the issue was an amendment to the law on culture, which was also discussed in the frame of the controversy around my topic. For a long time, we used to read in the law that a work could be restricted from publication in case it violates other people's rights, et cetera, but without any reference to who should judge that.
 
Now the amendment says that this should be judged within the frame of private law. Actually, okay, who else could judge it? But the question was: is the judge properly prepared for that? How could he or she judge the case? Of course, some could say that this is not a legitimate question, because judges are there to judge regardless of the difficulty of the issue, the difficulty of the cases, but we are not talking about who is guilty, which is usually the case.
 
The question is: what is the violation itself? Where is it, and who is violating what? What is a work of art, and could it violate someone's rights? Can we define the caricatures as a piece of art? Can we say they hurt the feelings of Muslim people? Of course, there is no excuse for violence, especially for a terrorist attack, but if the reaction were different, if Muslim people had reacted with a peaceful protest, could we say that it would be rightful? Could we say that the Charlie Hebdo activity was a violation? I suppose we'll never achieve a consensus between the parties. In that case, what could be the judgment? Does it depend on which party the judge represents?
 
Another issue I would like to touch on is political correctness, which is a kind of restriction of freedom of expression, especially in modern practice regarding religions. In societies with religious majorities in Europe, and first of all I can speak about my country, we accept criticism of the religious beliefs, practices and even theology of the majority, while at the same time criticizing the same issues in the religions of minorities could be defined as Islamophobia or anti-Semitism or something like that. And this is not a problem of Islam or Judaism or other religions; this is a problem of the practice of political correctness. This is a problem of our disposition toward this value.
 
All religions can be criticized, and all of them give reasons for it, especially when we're speaking about their public practices and their behavior in society, while at the same time they all have to be respected equally by the secular society.
 
Finally, because of the difficulties with the topic, a lot is about our own responsibility. We should be responsible individually. We should be responsible as a part of society. Media should be responsible, and, yes, while the traditional media as usual has more resources than irresponsible individuals, it should take advantage of the digital platforms as well, and bring that responsibility onto them.
 
Finally, media should bring the knowledge and understanding of the difficulties of the contradiction. It should bring the platform for better knowledge of each other and better understanding of each other. Thank you.
 
>> CRISTIAN URSE: Thank you very much, Giorgi, for that. It also adds I think to the list of topics that we put on the table, and we will indeed open now the floor for questions and comments.
 
Just to give you an idea, we intend to conclude this at about 3:15, 3:20, when we'll go to Esmeralda Moscatelli. We also have the Remote Moderator, Virginija, so we'll be able to add remote questions to the questions from the floor whenever need be. So we are opening it. Just a kind request to everyone: please shortly introduce yourself and state your affiliation, so we and the speakers, the panelists, know who they interact with.
 
Floor is open.
 
It takes a while, probably. Maybe I can ask Menno whether he wants to break the ice.
 
>> MENNO ETTEMA: I can try to break the ice and try to bring in also new perspectives. I've appreciated the various inputs, because I think they really put the finger on the sore spots and on where the challenges lie. I mean, I think, as was mentioned by Pearse, the balance is where the challenge lies in all these discussions. Where's the balance, and how do we work on that?
 
I think if this session is also about seeking solutions and challenging the solutions and the possibilities, then finding this balance really needs, and this is where I want to bring in a new point, the educational and awareness-raising approach, bringing broader society into the debate. We really also need to think of the awareness-raising elements of the discussion, to capacitate people to be critical consumers of news, of the Internet, of these tools. Especially when we look into self-regulation and more transparency, the question is: do we, and the people that use the platforms, actually understand the information that they are provided with? I think there's still a challenge in educating people to understand the risks and opportunities when it comes to privacy in these platform settings, the opportunities and the risks of consuming media, checking the sources, and also to understand, if there are redress mechanisms, okay, when do I use these redress mechanisms? And what are my expectations and how do I put my case forward?
 
I think there's a lot of need to look into educational elements. In that respect, I think the Council of Europe is trying to strengthen this comprehensive approach, for example with ECRI, the European Commission against Racism and Intolerance, which had a policy recommendation on combating hate speech. It seeks to have a very comprehensive definition of hate speech, for example, but it also calls for regulation and self-regulation, and it also really calls for promoting human rights narratives or alternative counter narratives, more education and more victim support. I think that's also a perspective we need to bring in.
 
In that sense, for me, part of the discussion on freedom of expression is also: the freedom of expression of whom? I think there's also a consideration to ensure that national minorities, for example speaking of the LGBT community, are part and parcel of this discussion and regulation, that they also can fully take part in designing regulations if needed, but also in designing self-regulatory methodologies, and that they are involved in this process. Civil Society should also be involved. So I think these are some of the struggles we're looking at, and if I look at the Georgian situation, we just had a session on the situation in Georgia where we got the comment that there is quite good legislation in Georgia on freedom of expression, for example, protecting it.
 
But when it comes to implementation, the capacity is lacking with judges to understand the issues at hand and how to implement it. And the police are not able to follow up on complaints. Civil Society is also not organized to actually bring forward points, so there's a need to get all partners involved, and the question is: how do you do that? And how do you get everybody involved?
 
>> CRISTIAN URSE: Thank you very much, Menno, for that, that's a good addition I think, bringing in education and awareness raising elements. As you said. Again, the floor is open for comments and interventions.
 
[Off microphone]
 
Please.
 
>> WOLFGANG BENEDEK: If there's nobody having a question or comment at the moment, I'd like to say something. I'm certainly also of the opinion that it is not an issue for State or European Union or whatever interference. In Zimbabwe they created a Ministry for communication which was also supposed to control the Internet, and the Minister very quickly got the nickname Minister WhatsApp, yeah? So this is not what people really need.
 
But on the other hand, we also have to see that the situation is different from what our assumptions usually are. Our assumptions usually are that there is a remedy, the diversity of media: if you find certain information in one medium, then you can check it against information in other media. But if people are living in their echo chambers, in the filter bubbles, in big platforms like Facebook, you are not confronted with this diversity by the content moderators of Facebook, so to say, because they pursue a certain line.
 
And on the national level, we also have institutions which are not necessarily interfering in the performance of the media, and these are independent regulators. So if we need independent regulators on the national level, why don't we think about some form of independent regulation also on the international level? Isn't there a need for something like this, where one could get together and, in cooperation with the platforms, set up some mechanism which has due process, which is transparent and so on, and which might deal, let's say, with issues or give similar guidance?
 
I'm all for the balancing. For example, the European Court of Human Rights is requesting the national level to do this balancing, and if the balancing comes to a certain conclusion, it will accept it, because balancing has taken place. But what do we know about balancing in Internet platforms, by Internet intermediaries? Do they have the time? Is this balancing built into the algorithm? We do not know, and therefore I think there needs to be a public debate. These platforms always say that, yes, we are in favor of the multistakeholder approach, but in practice they have expert committees, self-selected, to advise them.
 
They are shying away from the public debate of their approaches, and this gives us the feeling that something is maybe not perfect, because otherwise, they could discuss it in public. Thank you.
 
>> CRISTIAN URSE: Thank you for that.
 
Please.
 
>> Thank you all very much for your presentations. My name is Nadia Tjahja, and I represent the Youth Coalition on Internet Governance, and I would like to direct my question to Professor Benedek, and perhaps if I have made a false connection, please correct me. But you mentioned, regarding the European Court of Human Rights, the different algorithms that need to be discussed, and specifically that there needs to be a basic understanding and a way in which we can participate in this. I vaguely recall there's a series of trade negotiations at the WTO and in FTAs, and I think also RCEP, where they're discussing bans and restricted access to particular algorithms to protect e-Commerce and trade secrets.
 
So when you mention participation, and a basic understanding, do you mean access to source codes? In what aspect do you perceive the participation of the technical and Civil Society community? And in terms of the discussions, I don't know if this is within your realm, but would it be possible for the technical community and Civil Society to participate in organisations like the WTO and RCEP, for example? Thank you.
 
>> WOLFGANG BENEDEK: Yes. Thank you very much. When I mentioned the European Court of Human Rights, you know that they receive around 60,000 applications every year, and in the filtering process it seems they are also using algorithms which help them to identify whether an application is a genuine complaint based on violations of rights in the European Convention on Human Rights or not, yeah?
 
What you mentioned regarding the WTO debate, e-Commerce, that's a totally different area. I mean, here we are dealing with trade law, and part of these intellectual property rights is indeed trade secrets, so you can have protection of trade secrets; it's part of this system. But what I have suggested was not that you have to reveal the algorithm in every detail.
 
What I have suggested is that you inform your users about the principles, about the basic configuration so to say, so that they understand which elements are there and what relevance they have in the algorithm. And I say, if it is possible to use algorithms in a human rights context, then maybe it is also possible to have this kind of human rights by design built in, like privacy by design, so why not freedom of expression by design, and certainly that needs to be discussed.
 
And if there is no willingness to do so, then I think self-regulation is becoming risky for the users, and that attracts then proposals for regulation from the outside, which is not what would be my preference.
 
>> GIORGI GVIMRADZE: Thank you. I want to complement what Wolfgang was saying. As has been correctly recalled, for many years we have dealt with human rights through commerce law. We talk about taking down content as if it were something in our tradition. We never had this in our tradition. If I publish on my website as a broadcaster something wrong, then even if I take it down one second after I publish it, I am fully responsible for it. I go to jail if it defames somebody, in some countries. I can pay huge fines. My journalistic licence can be suspended, et cetera, et cetera. If I do it on Facebook, or if somebody publishes it on Facebook, then the only obligation Facebook has is to take it down within a certain delay. So there is something wrong there, and I think that we need to come back to the essential point that when it comes to human rights, human rights cannot be bartered against trade or against anything. They come first as a value for a society.
 
The second point I want to make clear is that the examples we have of self-regulation have not worked. The right to be forgotten is self-censorship, self-absolution, self-indulgence by the same company that provoked the problem. I give you an example I mentioned briefly yesterday: the BBC keeps a record of all the items published on its website that have been taken down by Google because of the right to be forgotten.
 
There were 66 of them some months ago, and they believe that a certain number of these items were taken down for arbitrary reasons. But because nobody knows which criteria were applied, or who applied them, and because the expert panel that makes the ex post analysis of what happened is not open, it is a black box. Can we as a society today live with a black box when it comes to human rights? I don't think so.
 
Third point, and sorry to be long: it is about algorithms. There is a very commendable initiative of the Council of Europe, the artificial intelligence in media Working Group. I'm part of this group and proud to be part of it, because as a journalist I understand nothing of all this, but I listen and I try to learn, and in the first meeting of this group two months ago I learned something very important: that an algorithm is not a single body. The algorithm has a first part, a second part and a third part.
 
The first part is what you want to get through the algorithm, okay? The second part is, once we have agreed what you want to get through the algorithm, then you put in all the equations that are needed to reach the objective that has been stated. And then the third part is a self-assessment: whether there is a match between the work done by the algorithm and the original scope. I don't care what is in the second part. I would care very much to know what there is in the first part, because most of the problems that we are discussing now are exactly there: in the first part, the first thing is, how can I make more money out of it?
 
That's the real reason why, and this is something that probably needs to be discussed in Civil Society; as media we have a right to argue that society is not only based on making money, that there are limits to the money you can make. When I discuss with my Facebook colleagues coming from the media sector, because now there are some working for them, I say: for every newsroom journalist that I hired in my life, the first thing I asked of him was to fact check everything twice. Now that you are doing the same job as me, why are you not subject to the same rules? Because practically we're doing the same job.
 
So there is something there that we need to stand up for, and the solution for me is a cooperative solution, because all of these topics that have been addressed, like the one mentioned by our Georgian colleague, are things that journalists assess and solve every day in their job. Why do we have to go looking for experts somewhere else?
 
Work together and try to assess what the right solutions are. Yes, of course, it costs less to take 1,000 people, put them in Dublin or in Germany and give them algorithms to do the job. Unfortunately, society is complex and this cannot solve everything. Sorry for being long.
 
>> CRISTIAN URSE: Thank you. I have first the gentleman on the left, and then Pearse.
 
>> Hi, I'm Eduardo from the digital rights organisation from Portugal. I have some questions or comments for Mr. Pearse O'Donohue. Sorry for the pronunciation.
 
You said that it was so important not to have general monitoring obligations when tackling illegal content online, but in the copyright case there is general monitoring: Article 13 requires general monitoring, because it requires that 100% of the content that is uploaded should be tested against a filter. So why is copyright different from the other subjects?
 
And we know that filters don't work only on illegal content. We know that they will also have problems with parody, or as you said the free uses and the exceptions, or what the Americans call the fair uses, which are rights of the citizens to express themselves.
 
So I want to ask if you really think that it is a justifiable cost, as you said -- that it is fair or justified to restrict the freedom of expression of citizens just to give funding to journalists and rights-holders? Is the only option to do that to restrict the freedom of expression of citizens with filters, or are there any impact assessments on the consequences of this legislation? Thank you.
 
>> PEARSE O'DONOHUE: Thank you for the question, and perhaps I can give some precision, but first of all just to clarify. You quoted me back, but what I said about no general monitoring obligations was in relation to the e-Commerce Directive and what we've done to supplement that in relation to illegal content, which by the way is not law -- it is a recommendation. What we have in relation to illegal content, before I move on to your question, which I know was focused on copyright, is that there should be an ability to systematically monitor and to use automated means. But because of the use of automated means, we then have to address even more seriously the weakness which Giacome spoke about, and which was even in the example you gave about Facebook, which is that clearly we have to have absolute transparency upstream and downstream of this filtering process, as you call it -- I would not call it a filtering process -- but upstream and downstream. In other words, what we have now recommended is that the intermediaries, the entities, need to have a transparency policy, which means ex ante they have to explain their decisions, and ex post of any decision to take down they actually have to explain that. But what is just as important, again to really fill in the weaknesses that have been recognized, is that we need to have an independent means of challenging any such decision.
 
And that is very important to making this system work. Now, in the specific context of your question, which I know was in relation to copyright: of course, if you ask me the question, is it a legitimate cost to restrict the freedom of expression of citizens just to support journalists, and if I answer yes to that, it can be well quoted and misused. But I did in my presentation try to make a more reflective analysis of why this balance was required. It was, first of all, to protect rights-holders, because there is a right there of people who have created material, created content, and they do have the right to have it protected. But there are a series of exceptions, which you obviously are aware of and some of which I listed, with regard to snippets, with regard to hyperlinks, with regard to parody and so on, where those rights are waived, and the copyright holder's right is not an absolute right. That has been made clear in our proposal and in previous case law. So it is acceptable to have some trade-off to protect the future viability of journalism, of quality journalism, in the European Union, which is itself fundamental to supporting democracy, to pluralism, and therefore to a democratic framework which allows the freedom of expression. Because if we just look to our partners across the Atlantic, although they are, hopefully, not on a very slippery slope, one has to ask oneself, as the journalism becomes more partisan and as the ability to attack others without any redress becomes stronger, how is that supporting plurality and democratic discussion in that country?
 
So of course -- we would prefer not to have to make these trade-offs. We certainly don't think we're in a position to make these judgments on our own. We have made proposals which are with the legislators, and each of the mechanisms I described will in turn rely on regulators -- we haven't talked about independent regulators -- on transparency, and on making sure that those who audit and check for compliance are totally independent of the Government.
 
>> CRISTIAN URSE: Thank you very much. For just one minute I'll go to Natalia and Giorgi who have announced their intention and then we'll go to Esmeralda as we approach the time limit of this session. Please.
 
>> NATALIA MILESZYK: For me, the biggest question of this session and of EuroDIG is the role of the platforms, and I see very visibly that we are still not capable of answering this question, because platforms have appeared in this human rights ecosystem. They are very powerful players right now, and it is tempting for states, for example, to hand over protection and some responsibilities in this field to the platforms, because platforms are effective. That's really true. So they're the ones capable of, for example, monitoring, filtering, taking down and regulating what we see and what we cannot see.
 
But on the other hand, platforms are not newsrooms. They're not newspapers. And I believe that we all don't want them to be newspapers. Because, you know, if a platform becomes a newspaper and I'm not a quality journalist and I don't have the proper education, I don't have any other place to go to express myself. So for me -- and I don't know the answer.
 
The question is: How shall we really embed platforms in the human rights protection ecosystem, and how do we make the system sustainable? I don't know the answer, but for me, from this session and from the previous sessions as well, that's the biggest question we are facing, and it's really a pity we don't have any platforms in the room right now.
 
>> CRISTIAN URSE: Thank you, Giorgi?
 
>> GIORGI GVIMRADZE: I will go on from Natalia: we really need a platform for discussion of these issues. I believe that the discussion about my topic especially -- I can speak about that -- is somehow coming to our country, because this is a really important issue right now. When the majority in a country belongs to one religion, and they are very active and really proactive in expressing their religious beliefs and religious practices in public, this is always very important, especially when, on the contrary, there is a part of society expressing their freedom to, how to say, set themselves against this kind of expression of religious beliefs and religious practices.
 
So this is, let's say, not actually a new challenge for us. This challenge has been with us during all these 27 years of being independent from the Soviet Union. But anyway, I do believe, and I would like to believe, that we are ready right now to somehow discuss this issue academically and qualitatively, and what I would like to ask you is to somehow share your experience with us -- I mean, not only on the floor of this event, but also at the next meetings and on the next occasions, to really share your experience.
 
When I was preparing for this, I found out that this discussion is really active in Israel, for instance, because the Israeli community is also discussing where religious feeling starts -- where we should start speaking about religious feeling and stop talking about freedom of expression, and vice versa. So I do suppose that we need bigger platforms for these kinds of issues, and these issues are really very live right now, maybe even in Europe as I can see, because the rise of Islamic communities inside Europe is really very obvious. So thank you very much for this kind of opportunity to speak about these issues.
 
And, yes, we do think about that, and inside our public broadcaster we are thinking of finally putting in place a platform for really qualitative and really academic discussions of this kind.
 
>> CRISTIAN URSE: Thank you, Giorgi. Very briefly please, and then Esmeralda --
 
>> I will try. Many things have been covered. I just want to say, again, that transparency is not enough if the information provided is not made accessible. And again, the multistakeholder approach fails when it is not representative, because I think we have a huge problem of representation: it has been said that if decision-making is not really representative of all the parties involved, it will fail. Professor Strickland also said that many times, and I think we still have a huge problem of representation when it comes to decision-making bodies. And I was really happy that I heard the word "feeling" just now from -- I don't remember the name -- Giorgi, because we're talking about algorithms, and I'm happy we talk about education, because I think that if we talk about human rights and accessibility we shouldn't forget the ability also to push for empathy and critical thinking, which is totally missing from all these arguments. And I think that when it comes to hate speech and to protecting the rights of people, it shouldn't be missing, and it should come down to education and real peer-to-peer dialogue, which is now missing. So I just wanted to point out these things.
 
>> CRISTIAN URSE: Thank you. And we go now to Esmeralda, right?
 
>> ESMERALDA MOSCATELLI: So hi, everybody. This is Esmeralda Moscatelli from the International Federation of Library Associations. I'm here because I helped organize this workshop, but I'm not the Internet platform person -- that comes after me. I was asked to briefly touch upon the remedies that were discussed here, but I'm going to be very general. It seems to me that there is a consensus from everybody that when our human rights and fundamental freedoms are either restricted or violated, we must have the right to a remedy.
 
But the ways for us to seek this remedy should be available to us all, accessible and fully representative. We often talk and discuss with this idea of the multistakeholder approach in mind, which is very nice and fine, but some of the actors are shying away from public discourse regarding this matter, so this is something we have to think about. And we also have to think about solutions and remedies that are capable of providing a real, practical way to tackle the problem.
 
So an effective remedy should, or could, be obtained from the Internet service providers, and we heard, I think, in general about a lack of credibility and about transparency issues. Why don't we have an international regulator that, in cooperation with these platforms, could set out certain due process? And the issue of balancing, reiterated by many people, is important, but do we have it these days? Yes? No? There's ample room for discussion there.
 
And also, when we talk about Internet service providers, why is their credibility at times not called into question vis-a-vis the more established fact-checking tradition of the media we had in the past -- and I'm talking about analog media?
 
Also, effective remedies should be obtainable from public authorities and any other human rights or Government institution involved, and Mr. O'Donohue talked about auditing and monitoring too, which we don't really have, and the fast pace of all this is sort of biting us in the tail.
 
So is self-regulation a remedy? For users it is becoming very risky and at times ineffective, and the right to be forgotten was called into question. I think in general an effective remedy, as you have highlighted, should tackle the real matter, clarify issues, and be accessible to us all.
 
>> CRISTIAN URSE: Thank you. Then we move to Aida.
 
>> My name is Aida, and I am here on behalf of the Geneva Internet Platform. You can find full summaries of all of these sessions there, as well as on the EuroDIG wiki page. And just to say, Esmeralda, we will also include those remedies that you just told us about.
 
Now I have here the session messages, and what we are trying to do is to get a rough consensus in the room, and we can do that. We're not inviting discussion, but you can just show thumbs up or thumbs down, and I will note whether we have consensus or not. It's okay to disagree.
 
One, there should be more transparency on how algorithms are developed and a public debate on the approaches private companies take.
 
Right.
 
Two, when discussing ways to tackle disinformation we need to assess the implications for both democracy and freedom of expression.
 
Cool.
 
Three, quality journalism is essential for maintaining democracy.
 
Okay, I see smiles. That's also good.
 
Four, since traditional media has more resources, it should take advantage of digital platforms and bring responsibility to them. Should I read it again?
 
And five. If you need to change a word -- to repeat, okay.
 
Since traditional media has more resources, it should take advantage of digital platforms and bring responsibility to them.
 
[ Off microphone ]
 
This is what one of the speakers -- 
 
[Off microphone]
 
Yes.
 
[Laughter]
 
>> GIORGI GVIMRADZE: And actually, in my words, it was that traditional media -- the big broadcasters and media producers -- have much more resources than some irresponsible individuals who are using social media for fake news or disinformation and things like that. In that case, I do believe the traditional media should invest more in the digital platforms and bring this kind of responsibility, or quality of journalism, to the digital platforms as well, to social media and so on.
 
[Off microphone]
 
It has better equipment, it has a huge amount of human resources, quality journalists and so on and so on, which should exceed, let's say, in amount all the information which is put on the digital platforms.
 
>> AIDA: Would you guys agree to rephrase it a little bit, and you will help me right after this session? Yes? Okay, okay.
 
And the last one: finding the right balance is important, and education and awareness raising are important in order to achieve that balance and to make people understand the information they are provided.
 
[Off microphone]
 
I see a little bit -- okay. I feel like we have four we agreed on. Thank you everyone for holding in there.
 
>> CRISTIAN URSE: Thank you very much indeed everyone. I want to thank again the speaker, Giacome. Please, thank you.
 
>> GIACOME MAZZONE: Natalia reminded me that, for our purposes, we have an open consultation on our recommendation on tackling illegal content online. It's the second stage of the consultation, where primary stakeholders have already made recommendations about the next steps, but we'd really like to have the views of the wider community on what might be the next steps. The consultation is open until the 25th of June. I apologize.
 
>> CRISTIAN URSE: I just wanted to say that we'll have the chance to exercise our right to a coffee break, and then you are welcome to pursue and continue the discussions with the panelists or among yourselves. Thank you very much once again for your participation.
 
[Applause]
 
[End of session]
 
 
''This text is based on live transcription. Communication Access Realtime Translation (CART), captioning, and/or live transcription are provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings. This text is not to be distributed or used in any way that may violate copyright law.''


[[Category:2018]][[Category:Sessions 2018]][[Category:Sessions]][[Category:Media and content 2018]]

Revision as of 16:53, 2 July 2018

6 June 2018 | 14:00-15:30 | CENTRAL ROOM | YouTube video
Consolidated programme 2018 overview

Session teaser

This workshop looks at the various issues discussed at EuroDIG 2018 through the prism of freedom of expression. Freedom of expression remains a central tenet of every democratic society. Obviously, there are limits to that freedom, and they may vary from country to country. But overall, are the limits getting wider or narrower? And what specifically has been the impact of the huge changes brought about by technical development? Have they been beneficial or detrimental - or both? We also need to ask, who are those setting limits and controlling our ability to express ourselves freely online? Are governments trying to curtail freedom of speech more or less than they used to do? Are internet intermediaries, trying to keep their platforms clean from fake news and hate speech, adopting practices that restrict freedom of expression? Are attempts at copyright reform restricting freedom of expression – or, to the contrary, enabling original content creators to enjoy it?

Keywords

Freedom of expression, regulation, intermediaries liability, censorship and privacy, take downs, human rights.

Session description

Always use your own words to describe the session. If you decide to quote the words of an external source, give them the due respect and acknowledgement by specifying the source.

Format

The format will loosely be the following:

  • a.) Introduction from the moderator
  • b.) Reporting from Plenary Session 2
  • c.) Debate
  • d.) Q&A with key participants
  • e.) Open the discussion to the room
  • f.) Wrap-up

Further reading

Links to relevant websites, declarations, books, documents. Please note we cannot offer web space, so only links to external resources are possible. Example for an external link: Website of EuroDIG

People

Please provide name and institution for all people you list here.

Focal Point:

  • Esmeralda Moscatelli, Policy and Research Officer (IFLA)

Subject Matter Expert (SME):

  • Yrjö Länsipuro – ISOC Finland

Organising Team (Org Team)

  • Eduardo Santos, Defesa dos Direitos Digitais
  • Ucha Seturi, Small and Medium Telecom Operator's Association of Georgia
  • Natalia Filina, EURALO
  • Ketevan Kochladze, Youth and Environment Europe YEE
  • Anna Romandash, Digital Communications
  • Elisabeth Schauermann

Key Participants

  • Giacome Mazzone - European Broadcasting Union - Reporting from Plenary 2
  • Irina Drexler - No Hate Speech Movement - Freedom of Expression in Romania
  • Professor Wolfgang Benedek - University of Graz - Challenges for intermediaries respecting freedom of expression
  • Natalia Mileszyk - Communia Association - Copyright as a tool to restrict FoE in the light of ongoing EU reform?
  • Pearse O'Donohue - European Commission - How to tackle illegal content while ensuring freedom of expression and information
  • Giorgi Gvimradze - Georgia Broadcasting Corporation - Religious Feelings VS Freedom of Expression

Moderator

  • Cristian Urse - Head of Council of Europe Office in Georgia

Remote Moderator

The Remote Moderator is in charge of facilitating participation via digital channels such as WebEx and social media (Twitter, Facebook). Remote Moderators monitor and moderate the social media channels and the participants via WebEx and forward questions to the session moderator. Please contact the EuroDIG secretariat if you need help to find a Remote Moderator.

Reporter

  • Aida Mahmutovic

Current discussion, conference calls, schedules and minutes

See the discussion tab on the upper left side of this page. Please use this page to publish:

  • dates for virtual meetings or coordination calls
  • short summary of calls or email exchange

Please be as open and transparent as possible in order to allow others to get involved and contact you. Use the wiki not only as the place to publish results but also to summarize the discussion process.

Messages

  • There should be greater transparency in how algorithms are developed, and a public debate on the approaches taken by private companies.
  • When discussing ways to tackle disinformation, we need to assess the implications for both democracy and freedom of expression.
  • Quality journalism is essential for maintaining democracy.
  • Finding the right balance between disinformation and freedom of expression is important, and educational awareness raising is needed in order to achieve this balance and to make people understand the information they receive.

Find an independent report of the session from the Geneva Internet Platform Digital Watch Observatory at https://dig.watch/resources/your-freedom-expression-vs-mine-who-control

Video record

https://youtu.be/_6rM_zNamac

Transcript

Provided by: Caption First, Inc. P.O Box 3066. Monument, CO 80132, Phone: +001-877-825-5234, +001-719-481-9835, www.captionfirst.com


This text is based on live transcription. Communication Access Realtime Translation (CART), captioning, and/or live transcription are provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings. This text is not to be distributed or used in any way that may violate copyright law.


>> CRISTIAN URSE: Good afternoon, everyone. Welcome to this session: Your freedom of expression vs mine? Who is in control? My name is Cristian, and I'm going to be the Moderator of this working group this afternoon. I hope the technical issues are sorted. I'm very pleased to introduce to you our speakers today.

At my right, Professor Wolfgang Benedek from the University of Graz. He's a former Director of the Institute of International Law and International Relations at the University of Graz, and also a lecturer at the Diplomatic Academy of Vienna and the European Master's Programme on Human Rights and Democracy in Venice. And I'm also glad to mention that Professor Benedek is a member of the Council of Europe expert group that drafted the guide on human rights for Internet users.

We also have Mr. Pearse O'Donohue from the European Commission, Director for Future Networks at DG CONNECT, dealing with policy development and research supporting the Digital Single Market.

Also with us today is Natalia Mileszyk, a lawyer and public policy expert working on copyright, reform and openness.

Irina Drexler, the National Coordinator of the No Hate Speech campaign of Romania, will speak about a recent report on freedom of expression in Romania.

And last but not least, Giorgi Gvimradze from the Public Broadcaster of Georgia, the Director of News and Current Affairs and also an Advisor to the Director-General of the public broadcaster. Also relevant for the topic he'll tackle, that of religious feelings versus freedom of expression, he's a graduate of the Theological Academy of Tbilisi. So I think that adds to the ingredients we need for this session.

But before going to the speakers, we will hear from -- sorry -- from Giacome -- I'm a bit lost in my papers.

>> Can I introduce myself?

>> CRISTIAN URSE: That's what comes of relying too much on papers. So, Giacome Mazzone: he has worked in public service broadcasting and media for 20 years. I can pick out and mention here international companies like EuroNews and EuroSport. Giacome will brief us about the Plenary Session of yesterday as an introduction to the debate that we will have. Please.

>> GIACOME MAZZONE: Thank you very much, especially for the discount you gave me, because I have been working for 30 years, but 10 years less makes me younger, so that's good.

Not true, fake news.

And I have to report about yesterday's session about fake news, so I want to debunk the first fake news you gave. The session yesterday was, I think, very interesting, and gave some hints that are important to hear before going into the discussion of today.

I think that what was important yesterday is that Claudia Lucian, in the keynote speech at the very beginning of the session, said that fake news and misinformation have a big impact on democracy and on citizens' lives, so they need to be tackled because of that.

Claudia said that for the Commission itself, the balance between finding effective remedies and not hurting respect for human rights is crucial for European values.

There was another consideration: the debunking activity, the source checking, needs to be financed. This is an activity that has to be financed. He said that this could be something that becomes in itself a source of revenue; for the moment it's only a source of cost for the traditional media.

He said this is an activity that needs to be done hand in hand with the platforms, because if not, it is not effective. He raised the question: who controls the controllers? And he also gave an answer, saying that peer to peer could be the best solution.

And he said that the work the Commission has started will be implemented between the end of the year and the beginning of next year, and that the European election of 2019 will be the perfect test of all the activity the Commission has put in place.

Ana Kakalashvili said that we need to differentiate remedies, because in the same information disorder box we have fake news and propaganda, and probably the two need to be treated in different ways and need different answers.

Patrick Penninckx, from the Council of Europe, provoked the audience a little bit, saying that in 22 out of 28 countries media are the least trusted institutions, so he said this leaves room for improvement; he said that only 1 out of 4 trust social media as a source of information. Probably because I was in the room, he mentioned that the public service media are doing better and are even improving in the barometer -- that was probably just for me, I don't know.

And he said that as a society we need to face the news skimmers and the news avoiders, which are relevant phenomena: the people who take only a very small view of the news and don't go into it in depth at all, and the avoiders, those who don't go to the news at all because they don't find the news interesting or relevant for their life. Both are very worrying phenomena.

And he said that the answer for many of these people who are skimming and avoiding is cocooning, so that they look to media or to messages that reassure them in their own positions. And this is of course a disaster for society, because it means that you will not go for the best option, but for the option that you like more, even if it's not the right one.

Google was present in the debate, and they gave us a long list of possible technical solutions to the problems. It seems that they look only into algorithmic solutions or technical implementations that could solve the issue. Not everybody in the room was in agreement with that.

And then finally, I think the final conclusion of the reporter of the session was to remind us of the reason for all this debate: democracy is about citizens being able to make informed choices. And if I can add something on my side: when I started to study journalism 20 years ago, as you said -- but let's say more -- the most important lesson that I learned is that you never have to tell your viewers or your listeners or your readers what they want to hear. You have to report to them especially the things that they don't want to hear, and I think that this could --

[ Off microphone ]

Exactly. Thank you.

>> CRISTIAN URSE: Thank you very much for that summary, and of course I apologize for mixing up the numbers.

[Laughter]

Let us get into the debate, and before giving the floor to the first speaker, let me also mention that we will have as an informed speaker from the floor someone from the Council of Europe's No Hate Speech campaign, and then about the organisation of this work.

We will give the speakers somewhere between 5 and 7 minutes to make their interventions, after which we will open the floor for questions. We'll try to get as many interventions as possible. Please bear in mind -- let's all bear in mind -- the size of the group and the time that we have, approximately 90 minutes, well, 80 from now, in order to allow as much participation as we can, and also to be able in the end to arrive at a couple of conclusions.

So I will go first to Irina Drexler on the country report on freedom of expression in Romania, please.

>> IRINA DREXLER: Thank you, yes. Thank you. Hello everyone. So yes, my name is Irina Drexler. Among other things, I'm an independent researcher on human rights with a focus on pragmatics, rhetoric and discourse analysis, and also Internet Governance, but my role here is as the National Coordinator of the No Hate Speech Movement campaign of the Council of Europe in Romania. I'll tell you a bit about the context of the report I'm going to briefly present to you. The No Hate Speech Movement campaign in Romania was initiated by the Ministry of Youth and Sport with 9 NGOs in Romania active on human rights.

Among these 9 NGOs there is one called Active Watch, which monitors the press in Romania. The organisation basically advocates for human rights, freedom of expression, anti-discrimination and media education. These are its three pillars, and in their work they develop two reports each year: one on freedom of expression, with special attention to the freedom of expression of the media in Romania, and another, connected one, a yearly report on the state of hate speech and intolerant speech in Romania.

So briefly, the report is 200 pages long. I'm not going to present all the cases it covers regarding the state of freedom of expression in Romania in the previous year, 2017, but it's a newly launched report, published on the occasion of World Press Freedom Day, so at the beginning of May.

If I were to summarize the main idea of this report, it is that media in Romania have greatly contributed to the polarization of society: on the one side we have the politicians and the State institutions, and on the other side we have the citizens. There is a tension that the authors from the Active Watch organisation notice and state in this report, which arose immediately after the new elections in Romania in December 2016. After those elections in December, as you may remember from the international press, we had huge protests, with some 600,000 people in the streets protesting, among other things, against corruption and new developments in the political sphere in Romania, but also against the silencing of journalists who were investigating these corruption initiatives by the new government.

And the polarization is not just between the institutions of the State and the citizens, but also within the press -- between the good press and the bad press, so to say, or the press of the Government and the press of the opposition -- and the authors remark that there is no real press for the citizens and for the public interest of the country and of its citizens.

One of the first initiatives of the State institutions that is remarked upon in the report is that public gatherings of people trying to protest against this and against the initiatives of the Government have been discouraged, including by sanctions given by State-controlled police, and the protests have been described in the press as manipulation using sources from outside. It was a conspiracy theory, which is a common theme in other countries as well.

And also with regards to the social networks, the State has, or the police has, and some courts have, ruled that the press needs to take down completely all articles that are criticizing too harshly the ruling parties in the Government so it was a case of freedom of expression of the journalists being attacked.

The report also notices that, just as the national television and the national radio in Romania faced a leadership change once the new Government was installed, the National Press Agency was also targeted by a similar initiative, trying to have the leadership changed to one more favorable to the new coalition governing the country.

But on the other side, on the plus side, there is also a more active role that the National Audiovisual Council has been taking in the period that the report looks into, 2017. However, despite this institution being more active, there is in fact no predictability in the sanctions that it gives to radio stations and TV stations, and the sanctions are also disproportionate depending on who is doing or saying what.

The report also mentions pressure on journalists, in the sense that, for instance, investigative journalists have been attacked, or their parents have been attacked, or cities have been covered with graffiti against journalists who are doing investigative work on both the opposition and the Government, but especially against the ones who are criticizing the decisions or initiatives of the Coalition.

And there have also been more financial controls of the publications that are being critical. In terms of public discourse, because of the trends in mass media, there has been a radicalization that the authors of the report have noticed, with sanctions also at the level of Civil Society actors who are trying to be active. Because I have only one minute, I will not go into the other findings of the report, but just to mention that this report is connected to the report on hate speech in Romania, and I will be available to speak more about this and the No Hate Speech initiative in regard to the freedom of expression of society in general. Thank you.

>> CRISTIAN URSE: Thank you very much, Irina, for that. I think it's a good opening to start from a specific case rather than having it at the end. I'll pass the floor to Natalia Mileszyk on copyright and freedom of expression.

>> NATALIA MILESZYK: There was a question mark, of course, after that statement. Hello everyone. It is good to be here. Since I was so nicely introduced, I will skip this part and go directly to the question that probably interests you: why are we talking about copyright during this session on freedom of expression? Of course, for many of us in this room copyright might be perceived only as a property right, an ownership right that protects creativity and those who should benefit and earn money from that creativity. But on the other side, you can also look at copyright as a way of restricting and limiting somebody else's ability and possibilities to publish things online, for example -- so restricting, in a way, freedom of expression. The question is whether that is legitimate or not.

And the European Court of Human Rights has stated many times, in many decisions, that copyright presents a risk to freedom of expression. I was thinking about an example to share with you, and I didn't want to become too political, so I decided to go for a non-European example. If you Google or Bing, or whichever search engine you use, notice and take down and the President of Ecuador, you will find plenty of articles showing how notice and take down mechanisms -- the mechanism of implementation of copyright online -- were used for several years to take down commentaries against the President of Ecuador. So it's happening.

Coming back to on going European Union copyright reform, I want to underline it's ongoing and that's why I will focus on general concepts that were introduced. I won't go into very legislative details. I have to apologize to those of you who are really experts in copyright. I'm looking at two gentlemen to my left and to my right.

Because we simply don't have time for this, I just wanted to show you how freedom of expression is visible in this copyright discussion, where the tensions are, and what we should care about and be very sensitive about.

So first of all -- lawyers love to operate with numbers -- there is the so-called, not so-called, it is Article 13, where at the very beginning the idea was to implement a filtering obligation for platforms that host user-generated content. And I must ask you: who here has ever posted something online that you created, generated, wrote -- anything?

Only three people? Four people? No comments, no articles, like academic articles? No photos on Facebook? Guys, you don't share photos on Facebook? You do.

So imagine the situation where everything you post online must be scrutinized, must be checked before being made available online by any platform that hosts this kind of photos, articles, comments, and everything, in order to avoid copyright infringement. Right now, we agree in the European Union that it should be something like notice and takedown: if there is copyright infringement, you should let the platform know that something like this is happening, and then the platform has to take certain steps.

But with this proposal -- of course, in the worst case scenario, and I want to underline that I'm talking about the worst case scenario -- platforms will be obliged to proactively monitor the content published online, so you can imagine what the outcome might be. And the idea of automatic filters was also introduced, so it will usually not be humans behind it but artificial intelligence, probably, especially in the case of big companies.

So that's the first thing, and I want to let you know that at the end of November 2018, 80 organisations from across the European Union sent a letter to officials in the European Union naming 29 other letters on this issue -- so 29 other public letters were published before -- saying how harmful it might be for freedom of expression if there is a private entity that has to monitor everything that you do online.

And the second very controversial Article is Article 11, which introduces a new right for press publishers. So besides copyright on the content, which is usually in the hands of creators or sometimes acquired by publishers, publishers will be granted an additional layer of protection, and they will be able to regulate who uses, and how, hyperlinks and snippets and short parts of the things published in their newspapers online. And of course, I'm saying again, I'm talking about the most radical proposals we've seen during this discussion.

And of course, we all agree that press publishers in the European Union right now are struggling, or are sinking, as somebody already said in the previous panel on copyright. But at the end of April, 169 copyright academics also wrote to the European Union institutions saying that this solution, in this very radical form, can be very restrictive to freedom of expression.

And since I was already given notice that I have to finish in roughly 30 seconds, I just wanted to underline that this is an example of how seemingly non human rights related regulation -- because copyright falls in the European Union under the Digital Single Market agenda, so it's a mostly market-oriented programme to enforce and create a common digital market for the whole European Union -- might affect various rights, in this case freedom of expression. And it shows that we should always be aware and very conscious of human rights whenever we proceed. So thanks all for this.

>> CRISTIAN URSE: Thank you very much, Natalia, and we turn now to Professor Wolfgang Benedek on challenges to intermediaries in regard to respecting freedom of expression.

>> WOLFGANG BENEDEK: Thank you very much. I would like to address some new challenges -- not so new for you, maybe, but I have to say that when I published, together with Matthias Kettemann, this book on Freedom of Expression and the Internet with the Council of Europe Publishing House five years ago, we had a different focus and different challenges in mind than what we see today. In the past, the Internet, or the Internet companies, were mainly seen as facilitators of free speech, of dissemination and so on, and today we discuss much more the problem of gatekeepers and of the responsibility or accountability of such intermediaries.

And so I would like to take two examples. One would be the right to be forgotten, and Google in particular; the other would be content management by Facebook. And the concern is that there are standards and procedures in place which lack transparency. There are decision makers of whom we know neither their qualifications, nor where they work from, nor how they do it, and so on. And there are problems also with regard to the right to a remedy. This would not be so important if these platforms had not become of public interest, because they provide a public space in which we all operate.

Now, regarding the right to be forgotten, or the right to erasure, there was the famous Google Spain case, but in the meantime we have the GDPR, which has also already been discussed here, and in this new regulation we find Article 17, which says that these restrictions shall not apply to the extent that processing is necessary for exercising the right to freedom of expression and information. So this exception for the right to freedom of expression and information is very important.

What is unclear is its interpretation. The GDPR clearly says that in the public interest, when it is about legal claims and so on, there has to be this emphasis on freedom of expression. It goes even further: in Article 85, certain exemptions are foreseen for academic, artistic, or literary expression and so on. So the question is: how do platforms, social platforms, like Google for example, implement that now in practice? If anybody can help me in understanding this better, I would be very grateful, because the public information that exists is very limited.

Certainly we could have some ideas how this is being done, and I do not envy Google for this task, by the way. It is kind of the position of the EU and others that they should not interfere with their work, so it is kind of self-regulation issue, but on the other hand, there are standards, in particular the ones I have mentioned.

There is also the issue of whether Google is delisting only in the EU or beyond, and as you may be aware, the Canadian Supreme Court ruled last year that certain search results have to be blocked worldwide, so here we also have different standards.

Then I'm going to say a few words on Facebook, and here the transparency report is very interesting -- but I wanted to say something on the transparency report of Google first. Google has published that 686,000 delisting requests have reached it, referring to 2.5 million URLs, of which 44% have been delisted. That should give us an idea that there is a very high delisting rate, but on what basis this delisting finally happened, and whether human rights have been taken into account, is uncertain. As we just heard about copyright, by the way, delisting has been requested for 3.5 billion URLs on copyright grounds, and 90% were delisted; again, this shows the different dimension which exists in that field.

Now I come to Facebook, and here I would like to pose three questions to you, which you can see here already, and I would ask you for your opinion, yes or no.

So should there be -- should information of this type be deleted or not? Poor black people should still sit at the back of the bus? Is this hate speech? Yes or no? Who says yes? Please raise your hand. Who says no?

Hardly anybody. That's interesting.

Actually, Facebook considers that this is not to be deleted, because race is a protected category but social class is not. 92% of the people who were asked by the New York Times had a different opinion -- as did my students when I asked them about it.

Second example: White men are all assholes. Yes or no?

[Off microphone]

[Laughter]

It's not whether you agree with the statement. But whether it should be --

[Laughter]

So who says yes?

Who says no?

Okay. You are a bit split, I see. There are many undecided. In that case it was "yes," because race and sex are at stake.

And finally the third one: I never trust a Muslim immigrant. They are all thieves and robbers. Should that be taken out? Yes or no? Who says yes?

Many say yes.

Okay. You're not aware of the Google -- of the Facebook policy, because it is "no." Immigrants are protected only in the context of violence but not in the context of exclusion.

So what do I want to show with this? That these criteria are often used in a counter-intuitive way. I do not want to say that we should always have what the people want, but I want to say that it's counter-intuitive, and also that some other such counter-intuitive decisions have been corrected -- for example, targeting black children, who are now better protected because the category of age has been included.

Now, this brings me to two more points. When you look at how this is done, you realize that it is often done through algorithms and in a semi-automatic way. So the question now is: how can you develop the algorithm to take human rights, and freedom of expression in particular, better into account? And here I cannot but emphasize the study of the Council of Europe on Algorithms and Human Rights, which you can find over there and which deals with this in a very good way; we also just had a flash session on the other side of the road which dealt with this very well.

So I cannot go into details, but the issue is the challenge of how to organise the self-regulation of such a semi-automatic decision-making procedure in a way that it does not have a chilling effect on freedom of expression, and so that the process in general is foreseeable and hopefully also legal, in the sense that it does take human rights into account, and maybe also provides an effective remedy.

So if you're put in the so-called Facebook jail because your account is blocked, you would like to know why, and you would like to know which remedies you can take, and this is not sufficiently the case.

I would like to end with another document which you can find over there, and these are the new Guidelines for Internet Intermediaries of the Council of Europe. The basis is actually the UN Guiding Principles on Business and Human Rights, which they apply to gatekeepers, to intermediaries, and they come to certain conclusions and recommendations which are quite important when it comes to these companies. Because they say that standards and practices of content moderation should be publicly available, and the user should be informed. Restrictions should be based on principles, like the least restrictive approach and so on, and we have no information whether this is happening or not.

Now, I'm not against algorithms as such. I only recently found out that the European Court of Human Rights itself is using algorithms in filtering applications in order to find out whether there is a human rights violation or not, so if that is the case, then why not others as well?

And if we have in mind the large number of cases with which such companies are confronted, it is, I think, understandable that they use that. But the question indeed is how these algorithms are being developed, and to what extent there can be what we heard about in the other panel this morning: an efficient transparency, which does not mean that we have to know all the details of the algorithm, but that we have to be given a basic understanding, and maybe also the possibility to participate in the discussion of how it should be shaped. Thank you very much.

>> CRISTIAN URSE: Thank you very much for that insightful presentation. I think it linked very well with the previous one. Now we move to Pearse O'Donohue on how to tackle illegal content while ensuring freedom of expression and information.

>> PEARSE O'DONOHUE: Thank you very much, and good afternoon. It's very interesting to hear the other speakers. I would just like to first of all just do a quick inventory as to what we do in the European Union with regard to both illegal content but also of course in relation to disinformation which are two different categories of issue.

For years, our rules have been set by the e-Commerce Directive, which had the objective of helping the Internet to develop -- we're talking about rules developed between 2000 and 2002 -- while promoting the freedom of speech. So it did put obligations on intermediaries, but it made sure that the intermediaries were not liable for content. The only situation in which they were liable for content was if they were clearly notified, and even then there were certain protections.

More recently, very recently in fact, and this is where in a way we diverge now between illegal content and other issues, is that the Commission issued a recommendation on measures to tackle illegal content online, and that was driven by a number of different controversies but of course, would in the first instance for example be targeted at online material which was promoting terrorism or indeed hate, and/or child pornography or other issues. Now, it does of course in some cases touch on copyright which I'll have to come back to because of the examples that were given to us.

And then very soon after that the Commission issued a policy communication on tackling online disinformation. That was in fact one of the subjects of the Plenary Session yesterday. I won't go into detail on that, but it was based on the report of the high level expert group which did help us to focus our thoughts a lot.

The last element that I would mention is actually the Charter of Fundamental Rights, one of the fundamental founding documents of the European Union, which is now, and has been for some years, an integral part of the treaties and is operative, as well as being a statement of a key set of issues. What do I mean by that? For example, in the very famous Court of Justice ruling, the first Facebook case, the court based most of its ruling not on specific legislation or even on the Treaty, but on elements of the Charter. So this is a dynamic and effective way of protecting different rights.

So we have that set of policies, but I must stress that while the e-Commerce Directive, the original piece, was a piece of legislation, it did not impose many obligations. We do set out now a set of overarching principles and objectives which are in line with the Charter but the approach we then take is soft law and a voluntary approach so self regulatory initiatives whether it's in how the e-Commerce intermediaries organise themselves or with regard to monitoring. There are no general monitoring obligations, and in fact, our Vice President made it very clear there would not be general monitoring obligations.

Anecdotally, it's interesting to note that he is a former Prime Minister of a country that was in the Soviet Bloc until the early '90s, and when we first discussed this issue with him a year ago, he said: this is very important, but there will be no Ministry of Truth. And this is also looking back to the Soviet era: we could not put ourselves in the position, or allow any Member State to put themselves in a position, in which the law decided what was disinformation, what was fake news, what was appropriate or not. We had to have an independent, transparent, and self-auditing process if we were to try and tackle online disinformation, because of the implications, the potentially serious implications, for freedom of expression.

So everything I've said confirms what we said we were talking about at the start: Your freedom of expression versus mine. It is that this is all a balancing act, that there is a very real tension between different Public Policy objectives and human rights, and even between those different Public Policy objectives.

So we can be criticized and hopefully will constantly be scrutinized for what we do to make sure that we get it right, and we have to be in a position to rapidly adjust if we do get it wrong. But we do have to do the impossible. We have to consider and then somehow get a balance between the implications of online disinformation for democracy versus the freedom of expression.

So in the summary report that we had of the Plenary Session, it was clear that we cannot legislate for or against that unfortunately large group of people who are not interested, who are not consumers of news or who actively avoid news, which leads to confirmation bias and to a lack of exposure to anything but the most sensational and probably manipulated news. But we can create the conditions in which we maintain strong journalism and strong freedom of expression for those who are prepared to take the time to listen.

And then finally, as Natalia did mention it: Natalia, no, I am not an expert in copyright law. I was relieved when you said that you weren't going to talk about the EU example, but then you talked in detail about the EU example, and I'm not really the expert to respond to you. But clearly this is another example of this tension, between the outright ability to express oneself and the rights of the holders of copyright, and also of those who work hard and invest in the creation of news and content, because we want to see a situation in which what I would call independent, and I know I heard a laugh when the phrase was used earlier on, quality journalism is maintained and preserved.

That quality journalism is essential for maintaining democracy, and this is where the inherent contradiction, or rather tension, arises: we need well-resourced journalism so that we can have sustainability and independence in that journalism, whether it is printed, online or broadcast, and therefore its material needs to be protected. But the provisions, which I know you're an expert in, which the Commission proposed and which are still under negotiation, so the Commission has slightly lost control of its text now, did make provision for the use of hyperlinks and snippets, and did recognize that there are legitimate uses which are exceptions to the right of copyright. Parody, for example, is one of them, and that is very important, because in regimes where, as we heard in an example, a head of government is not happy with the way that he or she, or the government, is being portrayed, parody is a very useful civil society way of agitating against undemocratic regimes. I'm just thinking back to IGF 2014, which was in Istanbul. I don't know if anyone else was there, but that's when the Turkish regime had shut down Twitter, because there were Twitter accounts exposing allegations of corruption among members of the Government and members of their families, and that didn't please the regime, so they shut it down.

And that's the sort of thing which we all want to avoid, but we do know that when it comes to legitimate journalism, we have to preserve it. It is essential, in other words, for democratic society and accountability, so even if it means that some people cannot reuse material, it is a justifiable cost in the view of the policymakers who decided on these proposals.

So that's really the point I wanted to make specifically in relation to copyright. Thank you.

>> CRISTIAN URSE: Thank you very much. Thank you. And now we are going to the last speaker, Giorgi Gvimradze, on religious feelings versus freedom of expression. My personal feeling is that this is a debate that is gaining weight in Georgia. Nowadays I think it's very timely to discuss it. Please.

>> GIORGI GVIMRADZE: I'm Giorgi Gvimradze, representing the Georgian Broadcasting Corporation, and it's my honor and pleasure to represent this organisation at this very interesting event. First of all, I would like to express my appreciation to the organisers of this event for giving me an opportunity to share with you my concerns and questions, which are not easy to answer, regarding the topic of freedom of expression versus religious feelings, even realizing that it may be the wrong framing to put religious feelings versus freedom of expression.

But anyway, I'm happy to say that this discussion has finally come to my country, even though we have been talking about it for a long time now. First of all, as a media representative, I should admit that there is no public space yet for a qualitative and academic discussion of this issue or of issues of similar importance, and that's why I was glad to come here and ask you to share your experience with us.

Believe me, there are several issues regarding the topic which should be defined correctly. While we are aware of, and can more or less define, what freedom of expression is, there is no clear understanding of what religious feelings are, and even more, it is hard to define the meaning of hurt feelings. To understand someone's hurt feelings is a matter of our own feelings; it is difficult to identify the level of hurt.

To understand someone's feelings is a matter of our disposition toward the person or group of people. How much someone's feelings are hurt is also very difficult to establish. That's why I'm putting these questions forward to discuss with you. As I mentioned before, we talk about this a lot, not really discussing it but talking too much.

Recently, the issue arose with an amendment to the law on culture, which was also discussed in the frame of the controversy around my topic. For a long time, the law said that a work could be withheld from publication if it violated other people's rights, et cetera, but without any reference to who should judge that.

Now the amendment says that this should be judged within the frame of private law. Actually, okay, who else could judge it? But the question was: is the judge properly prepared for that? How could he or she judge such a case? Of course, some could say that this is not a legitimate question, because judges are there to judge regardless of the difficulty of the issue or of the cases, but here we are not talking about who is guilty, which is usually the question.

The question is: what is the violation itself? Where is it, and who is violating what? What is a work of art, and could it violate someone's rights? Can we define the caricatures as a piece of art? Can we say they hurt the feelings of Muslim people? Of course, there is no excuse for violence, especially for a terrorist attack, but if the reaction had been different, if Muslim people had reacted with a peaceful protest, could we say it would have been rightful? Could we say that the Charlie Hebdo activity was a violation? I suppose we will never achieve a consensus between the parties. In that case, what could the judgment be? Does it depend on which party the judge represents?

Another issue I would like to touch on is political correctness, which is a kind of restriction on freedom of expression, especially in modern practice regarding religions. In societies with religious majorities in Europe, and first of all I can speak about my country, we accept criticism of the majority religion's beliefs, practices and even theology, while at the same time criticizing the same things in the religions of minorities can be labelled as Islamophobia or anti-Semitism or something like that. And this is not a problem with Islam or Judaism or other religions; this is a problem with the practice of political correctness, a problem of our disposition toward this value.

All religions can be criticized, and all of them give grounds for it, especially when we're speaking about their public practices and their behavior in society, while at the same time they all have to be respected equally by secular society.

Finally, because of the difficulties of this topic, a lot comes down to our own responsibility. We should be responsible individually, we should be responsible as part of society, and the media should be responsible. And yes, while traditional media as usual has more resources than irresponsible individuals, it should take advantage of the digital platforms as well and bring that responsibility to them.

Finally, the media should bring knowledge and understanding of the difficulties of this contradiction. It should provide a platform for better knowledge and better understanding of each other. Thank you.

>> CRISTIAN URSE: Thank you very much, Giorgi, for that. It also adds I think to the list of topics that we put on the table, and we will indeed open now the floor for questions and comments.

Just to give you an idea, we intend to conclude this at about 3:15 or 3:20, when we'll go to Esmeralda Moscatelli. We also have the remote moderator, Virginija, so we'll be able to add remote questions to the questions from the floor whenever the need arises. So we are opening it up. Just a kind request to everyone: please briefly introduce yourself and state your affiliation, so that we and the panelists know who they are interacting with.

Floor is open.

It takes a while, probably. Maybe I can ask Menno whether he wants to break the ice.

>> MENNO ETTEMA: I can try to break the ice and try to bring in some new perspectives as well. I've appreciated the various inputs, because I think they really put the finger on the sore spots and on where the challenges lie. As was mentioned by Pearse, the balance is where the challenge lies in all these discussions. Where is the balance, and how do we work on that?

If this session is also about seeking solutions and challenging the solutions and the possibilities, then finding this balance really needs, and this is where I want to bring in a new point, an educational and awareness-raising approach that brings broader society into the debate. We really also need to think about the awareness-raising elements of the discussion, to capacitate people to be critical consumers of news, of the Internet, of these tools. Especially when we look into self-regulation and more transparency, the question is: do the people that use the platforms actually understand the information that they are provided with? I think there is still a challenge in educating people to understand the risks and opportunities when it comes to privacy settings on these platforms, the opportunities and risks of consuming media, checking the sources, and also to understand, if there are redress mechanisms, when do I use these redress mechanisms? What are my expectations, and how do I put my case forward?

I think there is a real need to look into the educational elements. The Council of Europe is trying to strengthen this comprehensive approach; for example, ECRI, the European Commission against Racism and Intolerance, adopted a policy recommendation on combating hate speech which seeks a very comprehensive definition of hate speech, but also calls for regulation and self-regulation, and really calls for promoting human rights narratives or alternative and counter narratives, more education and more victim support. I think that's also a perspective we need to bring in.

For me, part of the discussion on freedom of expression is also: freedom of expression for whom? There is also a consideration that national minorities, for example, or the LGBT community, are part and parcel of this discussion and of regulation, and that they should be able to take full part in designing regulations if needed, but also in designing the self-regulatory methodologies that come out of this process. Civil society should also be involved. So I think these are some of the struggles we're looking at. And if I look at the Georgian situation, we just had a session on the situation in Georgia where we got the comment that there is quite good legislation in Georgia protecting freedom of expression, for example.

But when it comes to implementation, the capacity is lacking: judges do not fully understand the issues presented and how to apply the law, the police are not able to follow up on complaints, and civil society is not organized to actually bring points forward. So there is a need to get all the partners involved, and the question is: how do you do that? How do you get everybody involved?

>> CRISTIAN URSE: Thank you very much, Menno, for that, that's a good addition I think, bringing in education and awareness raising elements. As you said. Again, the floor is open for comments and interventions.

[Off microphone]

Please.

>> WOLFGANG BENEDEK: If nobody has a question or comment at the moment, I'd like to say something. I'm certainly also of the opinion that this is not an issue for State or European Union or whatever interference. In Zimbabwe they created a Ministry for communication which was also meant to control the Internet, and the Minister very quickly got the nickname Minister WhatsApp. So this is not what people really need.

But on the other hand, we also have to see that the situation is different from what our assumptions usually are. Our usual assumption is that there is a remedy in the diversity of media: if you find certain information in one medium, you can check it against information in other media. But if people are living in their echo chambers, in filter bubbles on big platforms like Facebook, you are not confronted with this diversity by the content moderators of Facebook, so to say, because they pursue a certain line.

And on the national level, we also have institutions which are not necessarily interfering with the performance of the media, and these are independent regulators. So if we need independent regulators on the national level, why don't we think about some form of independent regulation on the international level as well? Isn't there a need for something like this, where one could get together and, in cooperation with the platforms, set up some mechanism which has due process, which is transparent, and so on, and which might deal with issues or give similar guidance?

I'm all for the balancing. For example, the European Court of Human Rights requests the national level to do this balancing, and if the balancing comes to a certain conclusion, it will accept it, because balancing has taken place. But what do we know about balancing on Internet platforms, by Internet intermediaries? Do they have the time? Is this balancing built into the algorithm? We do not know, and therefore I think there needs to be a public debate. These platforms always say that yes, we are in favor of a multistakeholder approach, but in practice they have self-selected expert committees to advise them.

They are shying away from the public debate of their approaches, and this gives us the feeling that something is maybe not perfect, because otherwise, they could discuss it in public. Thank you.

>> CRISTIAN URSE: Thank you for that.

Please.

>> Thank you all very much for your presentations. My name is Nadia Tjahja, and I represent the Youth Coalition on Internet Governance. I would like to direct my question to Professor Benedek, and perhaps if I have made a false connection, please correct me. You mentioned, regarding the European Court of Human Rights, the different algorithms that need to be discussed, and specifically that there needs to be a basic understanding of, and participation in, this. I vaguely recall there is a series of trade negotiations at the WTO and FTA, and I think also ARCEP, where they are discussing bans and restricted access to particular algorithms to protect e-Commerce and trade secrets.

So when you mention participate, and a basic understanding, do you mean access to source codes? Do you mean -- so in what aspect do you perceive the participation of the technical and Civil Society community? And in terms of the discussions, I don't know if this is within your realm of discussion, but would it be possible for tech and Civil Society to participate in organisations like the WTO and ARCEP, for example? Thank you.

>> WOLFGANG BENEDEK: Yes, thank you very much. When I mentioned the European Court of Human Rights, you know that they receive around 60,000 applications every year, and in the filtering process it seems they are also using algorithms which help them to identify whether a complaint is a genuine one based on violations of rights in the European Convention on Human Rights or not.

What you mentioned regarding the WTO debate and e-Commerce is a totally different area. There we are dealing with trade law, and part of intellectual property rights is indeed trade secrets, so you can have protection of trade secrets; it is part of that system. But what I have suggested was not that you have to reveal the algorithm in exact detail.

What I have suggested is that you inform your users about the principles, about the basic configuration so to say, so that they understand which elements are there and what relevance they have in the algorithm. And I say: if it is possible to use algorithms in a human rights context, then maybe it is also possible to have this kind of human rights by design built in, like privacy by design. Why not freedom of expression by design, built in? And certainly that needs to be discussed.

And if there is no willingness to do so, then I think self-regulation becomes risky for the users, and that then attracts proposals for regulation from the outside, which would not be my preference.
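To make the idea of informing users about the principles and basic configuration of an algorithm more concrete, here is a minimal sketch under invented assumptions: the factor names, weights and wording are illustrative only and do not describe any real platform's disclosure.

<pre>
# Hypothetical sketch of "freedom of expression by design" through disclosure:
# the platform publishes which elements feed its ranking and their relative
# weight, without revealing the detailed algorithm. All values are invented.

import json
from dataclasses import dataclass, asdict
from typing import List

@dataclass
class RankingFactor:
    name: str
    weight: float        # relative importance; the weights below sum to 1.0
    plain_language: str  # explanation a user can understand

DISCLOSURE: List[RankingFactor] = [
    RankingFactor("recency", 0.3, "Newer posts are shown before older ones."),
    RankingFactor("engagement", 0.5, "Posts that similar users interacted with rank higher."),
    RankingFactor("source_reliability", 0.2, "Posts from verified sources get a boost."),
]

def publish_disclosure(factors: List[RankingFactor]) -> str:
    """Serialise the principles so users (and auditors) can inspect them."""
    return json.dumps([asdict(f) for f in factors], indent=2)

if __name__ == "__main__":
    print(publish_disclosure(DISCLOSURE))
</pre>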

>> GIACOMO MAZZONE: Thank you. I want to complement what Wolfgang was saying. As has been correctly recalled, we have for many years dealt with human rights through commerce law. We talk about take-downs as if they were something in our tradition; we never had this in our tradition. If I, as a broadcaster, publish something wrong on my website, I am fully responsible for it, even if I take it down one second after I publish it. In some countries I go to jail if it defames somebody; I can pay huge fines; my journalistic licence can be suspended, et cetera. If I do it on Facebook, or if somebody publishes it on Facebook, then Facebook's only obligation is to take it down within a certain delay. So there is something wrong there, and I think that we need to come back to the essential point that when it comes to human rights, human rights cannot be bartered against trade or against anything else. They come first as a value for a society.

The second point I want to make clear is that the examples that we have of self-regulation have not worked. The right to be forgotten is self-censorship, self-absolved and self-indulgent, carried out by the same company that provoked the problem. I'll give you an example I mentioned briefly yesterday: the BBC keeps a record of all the items published on its website that have been delisted by Google because of the right to be forgotten.

There were 66 of them some months ago, and they believe that a certain number of these items were taken down for arbitrary reasons. But because nobody knows which criteria were applied or who applied them, and because the expert panel that makes the ex post analysis of what happened is not open, it is a black box. Can we as a society today live in a black box when it comes to human rights? I don't think so.

The third point, and sorry to be long, is about algorithms. There is a very commendable initiative of the Council of Europe, the working group on artificial intelligence in the media. I am part of this group and proud to be part of it, because as a journalist I understand nothing of nothing, but I listen and I try to learn. And at the first meeting of this group, two months ago, I learned something very important: that an algorithm is not a single body. An algorithm has a first part, a second part and a third part.

The first part is what you want to get from the algorithm, okay? The second part, once we have agreed what you want to get from the algorithm, is all the equations that are needed to reach the objective that has been stated. And the third part is a self-assessment: whether there is a match between the work done by the algorithm and the original scope. I don't care what is in the second part. I would care very much to know what is in the first part, because most of the problems that we are discussing now are exactly there: in the first part, the first thing is, how can I make more money out of it?

That's the real reason why, and this is something that probably needs to be discussed in civil society; as media we have a right to argue that society is not only based on making money, that there are limits to the money you can make. When I discuss with my colleagues at Facebook who come from the media sector, because now there are some working for them, I say: you forget that for every newsroom journalist I hired in my life, the first thing I asked of them was to fact-check everything twice. Now that you are doing the same job as me, why are you not subject to the same rules? Because practically we are doing the same job.

So there is something we need to stand up for, and the solution for me is a cooperative one, because all of these topics that have been raised, like the one mentioned by our Georgian colleague, are things that journalists assess and solve every day in their job. Why do we have to go looking for experts somewhere else?

Let's work together and try to assess what the right solutions are. Yes, of course, it costs less to take 1,000 people, put them in Dublin or in Germany and give them algorithms to do the job. Unfortunately, society is complex, and that cannot solve everything. Sorry for being long.
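As an illustration of the three-part decomposition just described (a stated objective, the machinery pursuing it, and a self-assessment against that objective), here is a minimal sketch; the posts, scores and the "maximise engagement" objective are invented assumptions, not any platform's real ranking system.

<pre>
# Toy sketch of the three parts of an "algorithm" as described above:
# (1) the declared objective, (2) the implementation that pursues it,
# (3) a self-assessment comparing the outcome with the stated objective.
# Every name and number here is illustrative only.

from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Post:
    text: str
    engagement_score: float    # e.g. predicted clicks/shares
    source_reliability: float  # 0.0 (unknown) .. 1.0 (verified)

# Part 1: the declared objective, the policy choice worth debating in public.
OBJECTIVE = "maximise engagement"   # could instead be "maximise source reliability"

# Part 2: the machinery that pursues the objective.
def rank(posts: List[Post], key: Callable[[Post], float]) -> List[Post]:
    return sorted(posts, key=key, reverse=True)

def engagement_key(p: Post) -> float:
    return p.engagement_score

# Part 3: self-assessment, here checking what the chosen objective does to
# another value (the reliability of what ends up at the top of the feed).
def assess(ranked: List[Post], top_n: int = 2) -> float:
    top = ranked[:top_n]
    return sum(p.source_reliability for p in top) / len(top)

if __name__ == "__main__":
    feed = [
        Post("sensational rumour", engagement_score=0.9, source_reliability=0.2),
        Post("verified report", engagement_score=0.4, source_reliability=0.9),
        Post("opinion piece", engagement_score=0.6, source_reliability=0.6),
    ]
    ranked = rank(feed, engagement_key)
    print(OBJECTIVE, "-> average reliability of top items:", round(assess(ranked), 2))
</pre>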

>> CRISTIAN URSE: Thank you. I have first the gentleman on the left, and then Pearse.

>> Hi, I'm Edward from the Digital Rights Organisation from Portugal. I have some questions or comments for Mr. Pearse O'Donohue. Sorry for the pronunciation.

You said that it was so important not to have general monitoring obligations when tackling illegal content online, but in the copyright case there is general monitoring: Article 13 requires general monitoring, because it requires that 100% of the content that is uploaded be tested against a filter. So why is copyright different from the other subjects?

And we know that filters don't act only on illegal content. We know that they will also have problems with parody, as you said, with the free uses and the exceptions, or what the Americans call fair use, which are rights of citizens to express themselves.

So I want to ask whether you really think it is a justifiable cost, as you said. Is it fair or justified to restrict the freedom of expression of citizens just to give funding to journalists and rights-holders? Is the only option to restrict the freedom of expression of citizens with filters, or are there any impact assessments on the consequences of this legislation? Thank you.
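For readers less familiar with how an upload filter of the kind mentioned in the question operates, here is a deliberately naive sketch under invented assumptions; the protected text, threshold and matching method are illustrative only, and the point is simply that a content-matching filter has no way to recognise exceptions such as parody or quotation.

<pre>
# Toy upload filter: every upload is compared against a fingerprint of a
# protected work and blocked above a similarity threshold. The word-shingle
# matching is deliberately crude; it cannot tell an infringing copy from a
# parody that reuses the same wording. All inputs below are invented.

from typing import Set

def shingles(text: str, n: int = 3) -> Set[str]:
    """Overlapping n-word fragments used as a crude content fingerprint."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(max(len(words) - n + 1, 1))}

PROTECTED_WORK = "the minister said the new law will protect citizens from harm"
PROTECTED_FP = shingles(PROTECTED_WORK)

def upload_allowed(upload: str, threshold: float = 0.5) -> bool:
    # Blocks any upload sharing more than `threshold` of the protected
    # fingerprint, with no way to assess context, purpose, or exceptions.
    overlap = len(shingles(upload) & PROTECTED_FP) / len(PROTECTED_FP)
    return overlap <= threshold

if __name__ == "__main__":
    print(upload_allowed("an unrelated holiday photo caption"))  # True: no overlap
    # A parody repeating the original wording in order to mock it is blocked too:
    parody = ("the minister said the new law will protect citizens from harm, "
              "apparently by banning harm")
    print(upload_allowed(parody))                                # False: blocked
</pre>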

>> PEARSE O'DONOHUE: Thank you for the question, and perhaps I can give some precision, but first of all just to clarify. You quoted me back, but what I said about no general monitoring obligations was in relation to the e-Commerce Directive and to what we have done to supplement it in relation to illegal content, which, by the way, is not law; it is a recommendation. What we have in relation to illegal content, before I move on to your question, which I know was focused on copyright, is that there should be an ability to monitor systematically and to use automated means. But because of the use of automated means, we then have to address even more seriously the weaknesses which Giacomo spoke about, and which were even in the example you gave about Facebook: clearly we have to have absolute transparency upstream and downstream of this filtering process, as you call it (I would not call it a filtering process). In other words, what we have now recommended is that the intermediaries, the entities, need to have a transparency policy, which means that ex ante they have to explain their policy and, ex post, they actually have to explain any decision to take content down. But just as important, again to fill in the weaknesses that have been recognized, is that we need an independent means of challenging any such decision.

And that is very important to making this system work. Now, in the specific context of your question, which I know was in relation to copyright: of course, if you ask me whether it is a legitimate cost to restrict the freedom of expression of citizens just to support journalists, and I answer yes, that answer can be quoted and misused. But in my presentation I did try to give a more reflective analysis of why this balance is required. It is required, first of all, to protect rights-holders, because there is a right there: people who have created material, created content, do have the right to have it protected. But there is a series of exceptions, which you are obviously aware of and some of which I listed, with regard to snippets, to hyperlinks, to parody and so on, where those rights are waived, and the copyright holder's right is not an absolute right. That has been made clear in our proposal and in previous case law. So it is acceptable to have some trade-off to protect the future viability of quality journalism in the European Union, which is itself fundamental to supporting democracy and pluralism, and therefore to a democratic framework which allows freedom of expression. Because if we just look at our partners across the Atlantic, who are hopefully not on a very slippery slope, one has to ask oneself, as journalism becomes more partisan and as the ability to attack others without any redress becomes stronger, how is that supporting plurality and democratic discussion in that country?

So of course we would prefer not to have to make these trade-offs, and we certainly don't think we are in a position to make these judgments on our own. We have made proposals to the legislators, and each of the mechanisms I described will in turn rely on regulators (we haven't talked about independent regulators), on transparency, and on making sure that those who audit and check for compliance are totally independent of government.
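As a purely illustrative aside, the ex ante policy, ex post explanation and independent challenge described above can be pictured as a simple record-keeping workflow; the class and field names below are invented assumptions, not the Commission's scheme or any platform's actual system.

<pre>
# Minimal sketch of a take-down record carrying the three transparency steps
# discussed above: the published rule applied (ex ante), the explanation of the
# specific decision (ex post), and an independent channel to challenge it.
# All names are hypothetical.

from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List, Optional

@dataclass
class TakedownDecision:
    content_id: str
    policy_clause: str                  # ex ante: which published rule was applied
    explanation: str                    # ex post: why this content matched that rule
    decided_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    appeal_outcome: Optional[str] = None  # filled in by an independent reviewer

class IndependentReviewBoard:
    """Stands outside the platform; can overturn decisions and logs every case."""
    def __init__(self) -> None:
        self.case_log: List[TakedownDecision] = []

    def review(self, decision: TakedownDecision, uphold: bool, reasons: str) -> None:
        decision.appeal_outcome = ("upheld: " if uphold else "overturned: ") + reasons
        self.case_log.append(decision)

if __name__ == "__main__":
    decision = TakedownDecision(
        content_id="post-42",
        policy_clause="Community rule 3.1: incitement to violence",
        explanation="Automated match plus human confirmation of threatening language.",
    )
    board = IndependentReviewBoard()
    board.review(decision, uphold=False, reasons="Satirical context; no credible threat.")
    print(decision.appeal_outcome)
</pre>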

>> CRISTIAN URSE: Thank you very much. For just one minute I'll go to Natalia and Giorgi, who have announced their intention to speak, and then we'll go to Esmeralda as we approach the time limit of this session. Please.

>> NATALIA MILESZYK: For me, the biggest question of this session and of EuroDIG is the role of the platforms, and I see very visibly that we are still not capable of answering it, because platforms have appeared in this human rights ecosystem. They are very powerful players right now, and it is tempting for states, for example, to hand over protection and some responsibilities in this field to the platforms, because platforms are effective. That's really true. They are the ones capable of, for example, monitoring, filtering, taking down, and regulating what we see and what we cannot see.

But on the other hand, platforms are not newsrooms. They are not newspapers. And I believe that none of us wants them to be newspapers, because, you know, if a platform becomes a newspaper and I am not a quality journalist and do not have the proper education, I have no other place to go to express myself. So for me, and I don't know the answer.

The question is: how should we really embed platforms in the human rights protection ecosystem, and how do we make that system sustainable? I don't know the answer, but for me, from this session and from the previous sessions as well, that is the biggest question we are facing, and it is really a pity we don't have any platforms in the room right now.

>> CRISTIAN URSE: Thank you. Giorgi?

>> GIORGI GVIMRADZE: I will follow on from Natalia: we really need a platform for discussing these issues. I believe that the discussion about my topic especially, and I can speak about that, is somehow coming to our country, because this is a really important issue right now. When one religion has the majority in a country, and its followers are very active and really proactive in expressing their religious beliefs and practices in public, this is always very important, especially when, on the contrary, there is a part of society expressing its freedom, how to say it, to set itself against this kind of expression of religious beliefs and practices.

So this is, let's say, not actually a new challenge for us. This challenge has been with us during all these 27 years of independence from the Soviet Union. But anyway, I do believe, and I would like to believe, that we are now ready to discuss this issue academically and qualitatively, and what I would like to ask you is to somehow share your experience with us, I mean not only on the floor of this event but also at the next meetings and on the next occasions.

When I was preparing for this session, I found out that this discussion is really active in Israel, for instance, because the Israeli community is also discussing where religious feelings begin, where we should start speaking about religious feelings and stop talking about freedom of expression, and vice versa. So I do suppose that we need bigger platforms for these kinds of issues, and these issues are really very live right now, maybe even in Europe as I can see, because the growth of Islamic communities inside Europe is really very obvious. So thank you very much for this opportunity to speak about these issues.

And, yes, we do think about that; within our public broadcaster we are thinking about finally putting in place a platform for really qualitative and academic discussions of this kind.

>> CRISTIAN URSE: Thank you, Giorgi. Very briefly please, and then Esmeralda --

>> I will try. Many things have been covered. I just want to say, again, that transparency is not enough if the information provided is not made accessible. And again, the multistakeholder approach fails when it is not representative, because I think we have a huge problem of representation: it has been said that if the body that makes the decisions is not really representative of all the parties involved, it will fail. Professor Strickland also said that many times, and I think we still have a huge problem of representation when it comes to decision-making bodies. And I was really happy to hear the word "feeling" from, I don't remember the name, Giorgi, because we are talking about algorithms, and I am happy we talk about education, because I think that if we talk about human rights and accessibility we should not forget the ability to push for empathy and critical thinking, which is totally missing from all these arguments. When it comes to hate speech and protecting people's rights, it should not be missing, and it should come through education and real peer-to-peer dialogue, which is now missing. So I just wanted to point out these things.

>> CRISTIAN URSE: Thank you. And we go now to Esmeralda, right?

>> ESMERALDA MOSCATELLI: So hi, everybody. This is Esmeralda Moscatelli from the International Federation of Library Associations. I'm here because I helped organize this workshop, but I'm not the Internet platform person; that comes after me. I was asked to briefly touch upon the remedies that were discussed here, but I'm going to be very general. It seems to me that there is a consensus from everybody that when our human rights and fundamental freedoms are either restricted or violated, we must have the right to a remedy.

But the ways for us to seek this remedy should be available to us all, accessible and fully representative. We often talk and discuss around the idea of the multistakeholder approach, which is all very nice and fine, but some of the actors are shying away from public discourse on this matter, so that is something we have to think about. We also have to think about solutions and remedies that are capable of providing a real, practical answer to the problem.

So an effective remedy should, or could, be obtained from the Internet service providers, and we heard, I think, in general about a lack of credibility and about transparency issues. Why don't we have an international regulator that, in cooperation with these platforms, could set out certain due process? The issue of balancing, reiterated by many people, is important, but do we have it these days? Yes? No? There is ample room for discussion there.

And also, when we talk about Internet service providers, why is their credibility at times not called into question vis-a-vis the more established fact-checking tradition of the media we had in the past, and I'm talking about analog media?

Effective remedies should also be obtainable from public authorities and from any other human rights or government institution involved. Mr. O'Donohue talked about auditing and monitoring too, which we don't really have, and the fast pace of all this is sort of biting our tail.

So is self-regulation a remedy? For users it is becoming very risky and at times ineffective, and the right to be forgotten was called into question. I think in general an effective remedy, as you have highlighted, should tackle the real matter, clarify the issues, and be accessible to us all.

>> CRISTIAN URSE: Thank you. Then we move to Aida.

>> My name is Aida, and I am here on behalf of the Geneva Internet Platform. You can find full summaries of all of these sessions on the EuroDIG wiki page. Now, what we have here, and just to say to Esmeralda, we will also include those remedies that you just told us about.

Now I have here the session messages, and what we are trying to do is get a rough consensus in the room, and we can do that. We are not inviting discussion, but you can just show thumbs up or thumbs down and I will note whether we have consensus or not. It's okay to disagree.

One, there should be more transparency on how algorithms are developed and a public debate on the approaches private companies take.

Right.

Two, when discussing ways to tackle disinformation we need to assess the implications for both democracy and freedom of expression.

Cool.

Quality journalism is essential for maintaining democracy.

Okay, I see smiles. That's also good.

Four, since traditional media has more resources, it should take advantage of a digital platform and bring responsibility to it. Should I read again?

And five. If you need me to change a word, or to repeat, okay.

Since traditional media has more resources, it should take advantage of a digital platform and bring responsibility to it.

[ Off microphone ]

This is what one of the speakers -- 

[Off microphone]

Yes.

[Laughter]

>> GIORGI GVIMRADZE: And actually, in my words it was that traditional media, the big broadcasters and media producers, have much more resources than some irresponsible individuals who are using social media for fake news or disinformation and the like. In that case, I do believe traditional media should invest more in the digital platforms and bring this kind of responsibility, or quality of journalism, onto the digital platforms as well, onto social media and so on.

[Off microphone]

It has better equipment, it has a huge amount of human resources, quality journalists and so on, which, let's say, exceed in amount all the information that is put on the digital platforms.

>> AIDA: Would you guys agree to rephrase it a little bit, and you will help me right after this session? Yes? Okay, okay.

And the last one: education is important and awareness raising is important in order to achieve that balance and to make people understand the information they are provided with.

[Off microphone]

I see a little bit -- okay. I feel like we have four we agreed on. Thank you everyone for holding in there.

>> CRISTIAN URSE: Thank you very much indeed, everyone. I want to thank again the speakers. Giacomo, please, thank you.

>> GIACOMO MAZZONE: Natalia reminded me: for our purposes, we have an open consultation on our recommendation on tackling illegal content online. It is the second stage of the consultation, where primary stakeholders have already made recommendations about the next steps, but we would really like to have the views of the wider community on what the next steps might be. The consultation is open until the 25th of June. I apologize.

>> CRISTIAN URSE: I just wanted to say that we will now have the chance to exercise our right to a coffee break, and then you are welcome to pursue and continue the discussions with the panelists or among yourselves. Thank you very much once again for your participation.

[Applause]

[End of session]


This text is based on live transcription. Communication Access Realtime Translation (CART), captioning, and/or live transcription are provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings. This text is not to be distributed or used in any way that may violate copyright law.