Information disorder: causes, risks and remedies – PL 02 2018

From EuroDIG Wiki
5 June 2018 | 16:45-18:15 | GARDEN HALL | [[image:Icons_live_20px.png | YouTube video | link=https://youtu.be/nSibifXPPYg]]
Working title: <big>'''Remedies to social media problems: How to cope with fake news and other ills.'''</big><br /><br />
== <span class="dateline">Get involved!</span> ==
[[Consolidated programme 2018| '''Consolidated programme 2018 overview''']]
You are invited to become a member of the session Org Team by subscribing to the [https://list.eurodig.org/mailman/listinfo/pl2 '''mailing list'''].
If you would just like to leave a comment feel free to use the [[{{TALKPAGENAME}} | discussion]]-page here at the wiki. Please contact [mailto:wiki@eurodig.org '''wiki@eurodig.org'''] to get access to the wiki.
 
== Session teaser ==
The session will deal with problems arising from certain types of use of social media channels and possible remedies to address them. The purpose of the session is to shed light on the effects misinformation, disinformation, malinformation, and false and fake news can have on democracy and the climate of society. Participants in the session will be invited to give input on the most pressing concerns from their point of view. Key participants will give interventions to explore possible remedies and their effects.


== Keywords ==
*Information disorder
*Misinformation
*Disinformation
*Malinformation
*Fake news
*Remedies
*Community/Alternative Narratives
*Media literacy
*Legal Regulation
*Monitoring and Deletion


== Format ==
The session will start with a 5-minute introduction presenting the state of the issue, followed by a 15-minute open-mic discussion in which EuroDIG participants can raise their concerns. There will then be 5-minute interventions by the key participants, interspersed with 15 minutes of audience engagement, and the session will be concluded by the key participants and the moderator.


== Further reading ==
* [https://www.intgovforum.org/multilingual/content/igf-2017-day-3-room-xxiii-ws134-fake-news-and-possible-solutions-to-access-information IGF 2017 Fake News and possible solutions to access information]
* [https://www.intgovforum.org/multilingual/content/igf-2017-day-4-room-xxii-ws301-fake-news-content-regulation-and-platformization-of-the-web-a IGF 2017 Fake News, Content Regulation and Platformization of the web]
* [https://ec.europa.eu/education/sites/education/files/digital-education-action-plan.pdf EU Digital Education Action Plan]


Recommendations by Yrjö Länsipuro

* [https://rm.coe.int/information-disorder-report-2017/1680766412 Council of Europe Information Disorder report]
* [https://rm.coe.int/dgi-2018-01-spaces-of-inclusion/168078c4b4 Spaces of Inclusion]


== People ==
*Stéphanie Matt, Media Convergence & Social Media, DG Connect, European Commission
*Claudia Scandol, Young European Leadership, YouthDIG Fellow
*Virginija Balciunaite, Young European Leadership, YouthDIG Fellow
*Aamir Ullah Khan
*Anna Keshelashvili, Board Member ISOC - Georgia, Professor of media and communications at Georgian Institute of Public Affairs
*Menno Ettema, Council of Europe
*Ketevan Kochladze
*Giacomo Mazzone
*Anastasiia Korotun
*Ucha Seturi


'''Moderator & Key Participants'''

[https://www.dropbox.com/s/q71jb1ehyvugd6a/Final%20EuroDIG%20PL2%20Key%20Participants%20%26%20Outline.pdf?dl=0 Please find a PDF here with the Moderator & Key Participants' biographies]

*Paolo Cesarini, DG Connect, European Commission
*[[Patrick Penninckx]], Head of the Information Society Department, Council of Europe
*Ana Kakalashvili, Institute for the Development of Freedom of Information
*[https://www.intgovforum.org/multilingual/content/croll-jutta Jutta Croll], Stiftung Digitale Chancen
*Tamar Kintsurashvili, Head of Media Development Foundation, Mythbuster
*Clara Sommier, Google


'''Remote Moderator'''

Ketevan Kochladze


'''Reporter'''

*Adriana Minovic

Reporters are assigned by the EuroDIG secretariat in cooperation with the [https://www.giplatform.org/ Geneva Internet Platform]. The Reporter takes notes during the session and formulates 3 (max. 5) bullet points at the end of each session that:
*are summarised on a slide and presented to the audience at the end of each session
*relate to the particular session and to European Internet governance policy


== Current discussion, conference calls, schedules and minutes ==
See the [[{{TALKPAGENAME}} | discussion]] tab on the upper left side of this page.

*2 May 2018, 3pm CEST - Coordination Meeting on the Title & Scope of the session
*17 May 2018, 9am CEST - Coordination Meeting on Key Participants


== Messages ==
A short summary of the session will be provided by the Reporter.

Find an independent report of the session from the Geneva Internet Platform Digital Watch Observatory at https://dig.watch/resources/information-disorder-causes-risks-and-remedies


== Video record ==
https://youtu.be/nSibifXPPYg


== Transcript ==
Provided by: Caption First, Inc., P.O. Box 3066, Monument, CO 80132, Phone: +001-877-825-5234, +001-719-481-9835, www.captionfirst.com
 
 
''This text is based on live transcription. Communication Access Realtime Translation (CART), captioning, and/or live transcription are provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings. This text is not to be distributed or used in any way that may violate copyright law.''
 
 
>> NADIA TJAHJA: Hello. Good afternoon, ladies and gentlemen. Welcome to the plenary session on information disorder: causes, risks and remedies. Before I start I would like to invite my key participants to take their seats here in the front. Thank you very much. For those coming in, I want to once again welcome you to the plenary session on information disorder: the causes, risks and remedies. I'm Nadia and I'm on the Steering Committee of the Youth Coalition on Internet Governance. It is my pleasure to host this discussion looking at the different remedies being developed on the different stakeholder fronts. I'm excited to see and hear your views during this session. I encourage you to actively contribute to this session and to question what is being said here today.
 
But to put this session into perspective: information disorder is distorting people's ability to make sense of the world around them and threatens democratic processes around the world. While this is not a new phenomenon, the problem is compounded by both the speed at which information travels in this networked world and the technological and cultural filter bubbles in which we live our lives. This is a problem that impacts us all.
 
Today we will be focusing on the remedies for information disorder. Just to clarify, we would like to move away from the term fake news. This term is just too small to describe the complexity that we face surrounding this topic, and it has been sensationalized in our society. The Council of Europe published a study on information disorder and proposed a framework which divides the information pollution phenomenon into three types. So we talk about misinformation: that is when false information is shared, but no harm is meant. We look at disinformation, where false information is shared with the intent to harm people. And also malinformation, when genuine information is shared to cause harm, often by moving information designed to stay private into the public sphere. This leads me to the question for today: How can we address information disorder? We will have key participant interventions by the European Commission, IDFI, the Council of Europe, Google, and the Digital Opportunities Foundation, and it will be my pleasure to introduce them in a moment.
 
But first I would like to open the floor and hear your concerns, which we then can have a look at throughout the course of our session and giving our key participants an indication of the points that you would like them to address during their interventions.
 
We have an open mic here in the middle, so please do come up and share your concerns, and state your name and affiliation. Don't forget that you can also add your concerns to the word cloud through (indiscernible).com, which is anonymous. Come to the microphones and share with us what your concerns are. I would also like to ask the moderator from the last session to provide a brief overview of the major concerns that were addressed during that session and their conclusions.
 
>> AUDIENCE: I'm here together with my colleague Virginija. We moderated the session earlier this afternoon. The main take-aways: there is an abundance of challenges that have to be addressed on three levels: government, internet platforms and businesses, and the users.
 
But with threats also come opportunities: with AI and algorithms we can create better designs and also educate citizens to be more informed about how they use the internet.
 
There is a large gap between technology experts and policymakers, and it is important to try to close that gap. We also need to be more active and transparent about disinformation, for example in the political sphere: who is funding campaigns, and the overall importance of the advertising platforms. This has to be done at a multistakeholder level. Thank you.
 
>> NADIA TJAHJA: Okay? Thank you very much. Are there any other concerns from the audience that anybody would like to raise?
 
Ooo, no one has concerns about the information disorder?
 
While we wait for incoming participation from the audience and from our online participants, I would like to ask for our first intervention, and to introduce Mr. Paolo Cesarini, the head of the unit responsible for media convergence within the Directorate-General for Communications Networks, Content and Technology (DG Connect) at the European Commission. He has also taught as a visiting professor and lecturer at the universities of Turin and Montpellier. I would like to ask you to stick to the time limit of five minutes. I give you the floor.
 
>> PAOLO CESARINI: Thank you, Nadia. Well, I can see the concerns popping up on the screen in the meantime: propaganda, governance, surveillance, lack of ethical data handling, and so on. All these concerns have been widely discussed in the last months at the European Commission, which has organised a number of consultations and also created a high-level group to address a phenomenon which is far from easy to deal with. And indeed, the first issue that we had to deal with is: what is the phenomenon we want to address? There were several definitions; the one developed at the Council of Europe is indeed an important one, and the information disorder report is groundbreaking work. The approach taken by the Commission is a bit narrower in scope. We are not dealing with misinformation or malinformation but with disinformation. This is, if I remember correctly, verifiably false or misleading information which is created, promoted, or amplified and disseminated online for profit, for economic gain, or to intentionally deceive the public and to cause public harm.
 
So those are the elements in this definition. The most important, I think, is the harm that can be caused to democracies. The public harm is strictly linked to protecting the functioning of our democracies, in particular as regards electoral processes, linked to propaganda which can be orchestrated by state or non-state actors, from third countries or from domestic groups. The phenomenon is far from linear and not easy to capture in a single formula.
 
Another element of this is the harm that can be done to public policies. That is very important because the sustainability and credibility of an institutional system rely on the good communication that policymakers can have with the public at large. If there are messages which blur the actual intentions of policymakers on issues which are very important, such as immigration, health, climate change, finance, you name it, those policies can in fact be distorted in the perception of the public and, therefore, be difficult to implement in a correct way.
 
The definition, by the way, excludes pure journalistic errors, et cetera; it is important to keep that in mind. One of the most important problems is how to avoid remedies that would encroach on the fundamental principle of freedom of expression.
 
How do we avoid the pollution of the information space while at the same time giving people the possibility to use the internet and other means of communication to freely express their views and opinions? So it is far from an easy topic. The answer to that, and this comes out very strongly from the public consultation and the survey carried out in the last months, is that the response cannot only be at the national level; there is a need for answers at the EU level. First of all because disinformation knows no borders. Many Member States are affected by disinformation campaigns, orchestrations of public opinion in different areas, sometimes by the same actors. And because the internet has no borders, a piece of information can come from another country and affect populations across the board.
 
So the answer, to avoid fragmentation of responses, has to be coordinated at the higher level, while keeping in mind that there is a wide margin for Member States, due to the subsidiarity principle, to remain active and do what is needed at the national level.
 
This is a phenomenon in constant evolution, as the technologies that support it, and which have been harnessed by malicious actors, are constantly evolving. The response cannot simply be done once and for all. It has to be constantly evaluated and adjusted to the changing needs that are, or could be, coming up as the phenomenon evolves.
 
At this point in time, there are certain main lines of action that are going to be implemented in the next weeks and months, and they revolve around three main concepts. The first is transparency. Transparency means, in essence, starting with a self-regulatory process, while reserving the possibility for stronger action if that process fails to deliver the expected results by the end of this year.
 
When we talk about transparency, we talk about something very concrete. We need more transparency about the flows of money that finance purveyors of disinformation. We need to cut those flows of money to demonetize the websites that systematically purvey false information. We need more transparency about political advertising, which came up quite prominently during the Congressional debates of last fall in the United States.
 
We need more transparency about human versus automated interactions by bots, and about fake accounts which do not have any real individual behind them.
 
We also need more transparency about the sources of information. That requires elaborating systems that would enable users of such platforms to better identify where information comes from and whether the sources are trustworthy or not, by elaborating indicators that are useful for that purpose. That would also require close cooperation with the media sector, of course, so as to dilute bad content with good content without falling into censorship or surveillance. The idea is to give more transparency, to give information that is objective in nature and reflects the interests and practices of good journalism.
 
And finally, we need transparency as regards data. We need to have access to platform data for research purposes, to better map disinformation campaigns and their spread online. We need this data in order to switch from a reactive approach to a preemptive one. To act faster means being able to understand where possible targets of disinformation are and which topics are going to develop.
 
The second point is about the credibility of information. Credibility requires that our societies be better equipped to sort falsehood from truth. And this cannot be done by governments, because no Ministry of Truth, of course, can be the solution to this problem. This needs to be done by investigative journalists, the real experts, who probably will not have a position in the newsrooms, which are unfortunately cutting down on staff, but who could be usefully employed in emerging sectors such as fact-checking.
 
We are creating, or trying to create, a network of fact-checkers across Europe who will develop best practices and share information, to have a common approach to disinformation. For that, the Commission will provide a digital service infrastructure which will also be used to link the work of fact-checkers with researchers. When I say researchers, I think about computer scientists, network analysts, and of course experts in communication and political scientists, because all these competencies are necessary in order to have real-time mapping of a phenomenon that can deeply affect our democracies. Therefore we need to have the means to combat it.
 
Finally, the third point is about diversity. This is a tricky one. The diversity of information is a tricky one: how to ensure in the long run that citizens can have access to a diverse information landscape when we know that the business model of traditional journalism is in deep crisis. So we need to activate tools, financial tools, to support the training of journalists, the modernization of newsrooms, the uptake of new technologies that enable faster identification of information and content verification, but also new technologies that are used for the public good, the social good, like blockchain, cognitive algorithms, and artificial intelligence, which can eventually educate our audiences to be more demanding when it comes to the quality of information they get online.
 
These are on the supply side. We also need to act on the demand side, because in order for audiences to be demanding, they need to be critically alert and able to read critically the information they get, whether offline or online. So, yes, those are the last words.
 
So the last words are about the need to deploy resources at a scale that must not be allowed to fall too low, to promote media literacy actions and support grassroots organisations in all Member States, with better organisation, again as for fact-checkers at the EU level, to develop best practices not only for schools and the young generations but also for the older generations, with tools adapted to each of the two groups of citizens.
 
(Applause.)
 
>> NADIA TJAHJA: Thank you very much, Mr. Cesarini. If I can ask that the other microphone be moved down, so that if we go slightly over time I have the opportunity to intervene. But thank you very much.
 
We did go over slightly, but I will give a little bit of additional time to accommodate this.
 
So before we go into our second intervention I just wanted to have a look at the word cloud, and I have invited a YouthDIG Fellow to give a short outline of the major concerns that were raised online and by participants here in the room.
 
>> VIRGINIJA BALCIUNAITE: Thank you very much, everybody, for giving us a lot of various issues that we can focus on. People were addressing their concerns about trust issues, which can result, for example, in people decreasing their participation in political affairs, such as voting.
 
People were also concerned about spamming; we saw a lot of spam on the screens. And as well, people were not sure how to distinguish between fake advertising and fake news, how to verify it, who can verify it, and how people can trust the information they find online.
 
Another issue expressed was national security, for example in regional areas such as the Baltic states, which are affected by Russian propaganda and Russian trolls, as well as in Ukraine. Also the bubble society: people these days are living in their own bubbles, not exposed to different types of information.
 
Also media literacy and critical thinking, which is discouraged, and also data breaches and corporate surveillance.
 
>> NADIA TJAHJA: Great. Thank you very much, Virginija.
 
(Applause.)
 
>> NADIA TJAHJA: So having looked at this, I wanted to ask, are there any remote participants who had any questions? No? Not at this time.
 
I would like to move into the next intervention and invite Ana Kakalashvili, an analyst at the Institute for the Development of Freedom of Information, who works on a European Commission project (indiscernible). She has been involved in internet governance for over six years, is a member of the Internet Society, ISOC, in Georgia, and is founder of the ISOC blockchain interest group. Please take the floor for your intervention. Is there a microphone over there? Thank you.
 
>> ANA KAKALASHVILI: Thank you. Yes, it's working. Hello, everyone. It is a pleasure to be here. Let me start with a small introduction. I will try to be brief.
 
Fake news and propaganda in social media are driving the content of the important political conversations among all of us: citizens, stakeholders, governments, everyone. We are so taken by it. News and information, we see, have become weaponized against the very institutions and values that freedom of speech was supposed to protect.
 
Trust me, I have been around here and we have been talking about it for a long time: information, freedom of speech. But the situation, and I'm sure this is not just my personal view, has really gotten worse. We have observed with our own eyes the U.S. election and the French election, and how fake news has been influencing the process and the free will of the voters. We still don't know what is happening. And this scenario has somehow caused a complete collapse of public trust in democratic institutions. Governments are so fearful right now of this populist nationalism that they are scrambling and trying to find policies aimed at combating fake news.
 
It is not that they are trying to somehow censor anyone; they are trying their best to find a solution in regulatory terms. And this is where we sometimes come to a confusing situation. For example, Germany adopted legislation requiring companies to pay fines of up to 50 million Euros if they fail to delete illegal content within 24 hours. I'm sure we know that.
 
In Italy, they have introduced a bill aimed at preventing the manipulation of online information, with a fine for non-journalistic websites or social media that publish or disseminate fake news, exaggerated news regarding facts that are manifestly false or unproven. They may get fined, let me look at my stats, 5,000 Euros. Sorry.
 
We see that governments are really trying. It is not that they want something bad. What I've learned is that there is mostly nowhere in this scenario where you have good and bad on the scales. It's always one good value against another good value. It depends very much on the situation, the circumstances, and everything, which one will go up and which one will go down.
 
And I am very confused when it comes to these fake news topics, but let me tell you: solving the problem with regulation will certainly create more problems if done wrongly and not very carefully. Government regulation of fake news would be a cure far worse than the disease. Especially for me, coming from Georgia, from this region, you have to be so careful. In Georgia and my neighboring countries, and other countries that are just establishing these democratic values, the principles of good governance and the rule of law, it is a very thin line to walk when it comes to regulating the most crucial thing, which is freedom of speech and freedom of information.
 
Not that we should wash our hands and not do anything. Of course not. But we must keep the balance, seeing what in a given situation is more important to defend, while at the same time tackling the issue we are so critically facing, which is fake news and what it does to, as I said, the free expression of the voters. It is very crucial to find the balance and keep to it.
 
And let me tell you about some stats I looked at. For example, The Economist and YouGov polled people, asking whether courts should be able to shut down media outlets for publishing or broadcasting stories. 28 percent were in favor, 29 percent were opposed, and 49 percent were undecided. And of course, those who are more Republican do want regulation. Stats are not always in favor of what I'm saying, that we don't need harsh regulation. But you need to understand that stats are not something we should always rely upon, and there are bigger values we have to keep in mind when it comes to regulation.
 
So let me tell you that we in Georgia, and I'm sure my neighboring countries too, all look up to the European Union and its standards. We have so many standards to implement and so many things to keep up with Europe on. You have to be our example in defending the most crucial values: freedom of speech and freedom of information. I'm sure there are ways to tackle fake news in other ways; for example, the previous speaker already spoke about transparency of media outlets, and there are many other ways to avoid these kinds of harsh, authoritarian regulations. Of course, and this will be more or less my last point, to avoid questions later: we have to make a very clear distinction between propaganda and other kinds of fake news. Propaganda, an information war, is something far, far more dangerous than anything else. So propaganda has to be tackled and treated, not regulated but solved, in one way: for example, the government first of all has to officially recognize that there is another state waging an information war against my country. It has to be officially recognised. There have to be anti-propaganda strategies and transparency for many different organisations, whether non-governmental or media. Then we have fake news of other types: someone, I don't know, sending some information, whatever the aim is: parody, social, political, financial or other aims. That has to be tackled separately, in some other way. But what I always say, and I will keep saying: do not make it easy for yourself. Do not go for very strict, harsh regulation. This is very, very dangerous, not only because it might create censorship, but because it might also create self-censorship.
 
>> NADIA TJAHJA: Thank you. Sorry to have to intervene, but I would like to move on to the next intervention. First of all, thank you very much. I would like to use this opportunity to open the floor and ask if there are any projects or remedies that you have been engaging with or that you know about that you would like to share. Therefore, we will first open a word cloud. Yes!
 
And please, do submit your projects, remedies. What have you been working on? What have you been contributing to? What have you seen online? Also our remote participants, please do send in what you are working on. We are really interested to see what is out there locally, nationally, regionally, internationally. Very happy to have you join and participate this way.
 
So while we collect all this information of the different types of projects and remedies there are I would like to invite Jutta Croll on the project involving hate speech.
 
>> JUTTA CROLL: Ana Kakalashvili has already explained the law. Are there any questions about the law? I can explain a little more in detail how it works.
 
It is called the action to improve the law in social networks. From the title it becomes clear that it is a law that tries to make the law that we already have in the German Penal Code to come into force also on the networks.
 
So the new law didn't make anything illegal that had not been illegal before in Germany. It also applies, only applies to the dissemination of illegal content via networks.
 
And then it is also important to know that it only applies to social networks that have more than 2 million people registered to the network. And that is important to understand because it is not that we have a different law for networks that have more registrants and we have another law which is less questioning for smaller social network laws. So it makes the law applicable to only those big social networks and that is because the government recognised that it is more necessary to address the networks where it can be spread, the false information, the fake news can be spread to a broader audience. Let me think, it is necessary to add that the law differentiates between the content that is obviously illegal. Where you can recognize from within a really short time that it is illegal and then you need to take down this content within 24 hours. For any other content where it is questionable, whether it is legal or not, there is time span of seven days where it needs to be taken into due consideration whether this content needs to be blocked or taken down. So I am ready to answer any other question about the law. I am not defending the law. I appreciate that we had a critical debate in advance of the law and the critical debate is going on.
 
There is set a time frame for reporting of those social networks that the law applies to. And I think the law came into force in October last year and the first reports are due at the end of July this year. So then we will know also in a very transparent manner how it was handled by the social network platforms and how it really could come into effect. I think we also need to continue this debate whether such laws can work or they do work when we have the first reports from the social networks. Thank you.
 
>> NADIA TJAHJA: Thank you very much. Are there any other members of the audience who would like to come up to the microphone and talk about any projects you are working on? Or anything you've seen? You have any concerns you would like to address? We have an open mic session. There's an open mic. Please, go that way. Yes, please.
 
>> AUDIENCE: Hello. Dominick Share from the Council of Europe, but I'm speaking as a private person. I would like to mention this. So I do not have my own project that I would like to present, but I have been wondering if we just blocked the content that is online and that we consider is not suitable or is not on our standards, are we actually helping this problem to be solved or are we merely fighting against the symptoms? Because if we block content, it just goes away very unnoticed and most of the citizens won't realize. So I kind of doubt that it will actually help to the question to find where this borderline is to also make a broad society aware where this borderline is and what is acceptable and what is not. Thank you.
 
>> NADIA TJAHJA: Thank you very much. I also believe that Mark Carville is attending the session, the head of the international online policy at the Department for Digital Culture Media and Sports of the U.K. government. Perhaps you could share some comments?
 
>> AUDIENCE: Yes, thank you, Nadia. Thanks very much for inviting me to take the mic to say a few words about the U.K. government policy development in this area and at this very timely session on disinformation.
 
We welcome the fact that there is now a lot of international interest in developing responses to this emerging, fast emerging risk of disinformation, as here at EuroDIG.
 
And the U.K. government supports a coordinated international response on this issue, in particular how to deal with maliciously intended interference in sovereign democratic processes. That is a key challenge for governments. We must address those challenges in concert with the stakeholder community and the private sector -- thank you -- the private sector, the technical community, Civil Society.
 
And I have to say at the moment there is no evidence of that happening in the U.K. in terms of direct threats to our democratic processes, but we do recognize that this is a risk. There are threats here that we need to be prepared for. We have a team looking at this in my Ministry, the Department for digital culture, media and sport. We are working on this under our digital charter, which is our approach to developing norms of behavior in the internet sphere and dealing with online harms and threats to the welfare of our citizens on this issue of disinformation. Disinformation is a key element of that policy development. We don't have the laws yet but we are doing a lot of work and we are working with the technical community, the social media platforms and Civil Society, as I say, and also with European and international partners on this.
 
We are focusing on five areas. Firstly, we do need to get our facts. We need to undertake a lot more research into understanding this problem. And that is a point that was well made earlier on here in this session. Secondly, on media literacy, we are looking at the opportunities for developing education and guidance for our citizens to enable them to have the skill to differentiate between fact and fiction. So we are looking at that. We are working with our colleagues at our education ministry on that.
 
>> NADIA TJAHJA: We ask you to come to your concluding remarks?
 
>> AUDIENCE: I'll finish on the final elements of our policy which I think is important to make clear. We are working with the technical sector to develop really a framework of voluntary practices that we in government can support and we are checking if we have the right regulation in place. We are well aware of the risks that Ana in particular articulated about an approach which is going to be heavy handed and impact on freedom of expression and looking across our strategic couples in the whole of government. I hope that's a helpful intervention. I'm sorry I ran over a little bit.
 
>> NADIA TJAHJA: Thank you for your contribution. I'm pleased there are so many projects coming up. I wonder if Claudia would like to come up and give an overview of the projects coming up online.
 
>> AUDIENCE: Hello again, everybody. A lot of initiatives are focusing on education and online resources. I believe the first one that was submitted was anonymous. I caught that correctly, which I guess is one option. There's a lot of initiatives about journalists, trust initiatives for journalists, cyber hygiene training. There was one (non-English phrase.) It is the day of, the secure internet day? Stop fake news initiatives, research being performed by think tanks and other organisations, the no hate speech movement and lots of studies being performed. Thank you.
 
>> NADIA TJAHJA: Thank you very much. So I just wanted to have a look at the remote moderator. Is there a contribution from the online community that they want to share with us?
 
There are currently no questions. I'm reaching out to the audience at this time. Do you have any interventions? I'm next going to Mr. Patrick Penninckx from the head of Information Society department at the Council of Europe. He contributed to the transformation processes of the organisation and developing partnerships with national and international institutions. Heading the Information Society Department as Director General of Human Rights and International Law, he coordinates standard setting and cooperation activities in et fields of immediate, cyber crime.
 
>> PATRICK PENNINCKX: How many of you have read a paper news paper today? This says more about your age than about your ability to read.
 
(Laughter.)
 
>> PATRICK PENNINCKX: Second, who read an online newspaper today? Yeah? A couple more. Also very informed community.
 
Who read an article on a newspaper of which you have no subscription?
 
Yeah? New York Times? The Guardian, yeah? I see. How many paid for your reading? Of those ... so you paid because you read an article on them? Well, that's quite exceptional, quite frankly. I will present to you, I brought for you especially from Strasbourg, the most important broadcaster device you have ever seen. It takes me a second.
 
You are all puzzled but you all have it in your pockets. This has become the most potent broadcasting device that we are talking about. I will talk due about -- talk to you about five different things. First I will speak about trust. When we speak about false information, basically we are talking about trust and trusted information.
 
Second, I will speak about news avoiders and news skimmers. I will speak about cocooning and actors. Trust.
 
The Edelman Trust Barometer? Anyone heard about it? Edelman Trust Barometer? Media is the least trusted institution in all or almost all of the countries surveyed by the Edelman Trust Barometer.
 
Distrusted in 22 out of 28 countries. The ones in which it is least distrusted are China, Malaysia and some other countries which play in the same category.
 
Six in ten people do not know how to tell good journalism from rumors and falsehoods. One in four trust social media for news and information.
 
It is very little. I'm not surprised, Mark, that in fact we need to do a lot more study on what is really the impact of false news and false information before we draw some quick conclusions.
 
And I think the first lesson is about false news or disinformation, first lesson is make sure that you get your facts right before distributing more information about disinformation. Because it is a hike and the reason why the Council of Europe did this study and called it disinformation disorder is because everything was heaped up on one single file which was called fake news and which was used by any actor that did not like the news that was being produced.
 
Second, I will speak about news avoiders and news skimmers. 33 percent of us -- I'm not going to ask you because it may be a little bit more difficult to raise your hands, but 33 percent of the interviewed read less news compared to the year before. 33 percent. 19 percent avoid news all together. 40 percent find news too depressing.
 
So what is the answer? My next point, cocooning. That is people start to live in trusted cells. Positive note, Giacomo, is that the traditional media are gaining in trust in a number of countries. And Mark, for your relief, in the United Kingdom is one of the highest scores for trust in the public media. So that is also quite important to mention.
 
Now, just to also reveal a little bit to you that we are not alone in this mettle or fight in this campaign. I attended a conference entitled: "The role of governments and internet platforms."
 
Any idea who the organiser of this conference was? Microsoft. You were going to say Facebook. Not far wrong, because Facebook and Google participated.
 
The key thing that we are all in this together. There is a common goal obviously. There are some issues that need to be settled separately. But when you listed the key actors that had to be involved in tackling the issue of fake news, I think you left out a couple. It is not only governments. It is not only internet companies. It is not only the consumers, but it is also the technical community. We speak about very often when we speak about data protection, for example, we speak about data protection by design, to which extent can we ensure that in the designing of algorithms, of self learning algorithms, of artificial intelligence which create filter bubbles, which create distribution of information, rapid distribution of information and disinformation. It is not that this has never existed before. But it is a question of how, with which speed is this currently being distributed? It is also what can companies, what can technical communities do in order to create a number of speed bumps that may allow us to counter the spreading of fake news, whether that is being done through governmental agencies or through individuals seeking quick gain.
 
I think it is important that we also do not forget about the role of Civil Society in this. In this room there is, of course, all the different multistakeholder actors that are involved in internet governance and dialogue, but the key participation is definitely also on Civil Society. There is quite a number of things to be done. All those actors which I mentioned have their specific role to play. The media actors have their specific role to play with regard to quality journalism, with regard to ensuring that journalists receive the correct education, but that also in there, there is a lot of work to be done on debunking disinformation. So when we are speaking about this, look at what each of those parts of our society can do to debunk disinformation because we are all in this together. That's why a multistakeholder approach to this also applies. Thank you.
 
(Applause.)
 
>> NADIA TJAHJA: Thank you very much, Mr. Penninckx for your contribution. So after this intervention I would now like to invite Tamar Kintsurashvili, she served as Deputy Secretary of in Georgia and was first elected Director General of the Georgian, Deputy Director of CSO Liberty Institute. Tamar is Associate Professor at the university teaching propaganda research methods.
 
>> TAMAR KINTSURASHVILI: Good afternoon and good day. First of all, I would like totally to agree with you that the problem is much more complex rather than the term fake news, but I would rather say that it is more complex than even information disorder because propaganda is part of hybrid warfare. Not only in countries like Georgia and the Ukraine suffering from this problem because our countries are occupied by a neighboring country, but traditional democracies as well.
 
We can see an intervention in the election in traditional democracy. And the major challenge of the moderate world is how undemocratic countries use democracy and democratic institutions like media against democracy and open society.
 
As regards to solutions, of course we need collaborative approach when it comes to the fight against hybrid warfare, not only fact connectors or professional media outlets can address this problem, but we need collaboration with government educators because it is about strategic communication as well. It is about transparency, not only media outlets but other actors, political parties. It is related to money laundering and dirty money coming in our country and used against democracy itself.
 
Major problem you mentioned, this trusting media is creating some kind of cause in the country not to believe in anything. It is very sensitive especially in post-Soviet societies, coming from the regime when we distrusted everything. But democracy is about making informed choices. We need to somehow preserve trust in quality media. It is easy to say, but difficult to achieve because tab Lloyd media inciting hate speech is more actually fractive unfortunately for our societies and often this hate against different groups of our societies or triggering historic trauma against our other neighbors is part of this propaganda methods. For instance, our organisation is working on media content analysis, tracking the very fine different kind of fake mu ins -- news which tries to provide society with information about propaganda methods as well.
 
The true phobia is part of the problem in Georgia. There are different threats Georgia is facing to historical ones, saying that if Russia is occupied, why not Turkey? It is especially dangerous when we have Muslim minorities in our countries and it is related to inciting not only hate speech against these groups but in some cases provoking or mobilising radical groups against the minorities.
 
Legal solutions are actually, we have very little legislation. Referring to European experiences of criminalizing hate speech is not a solution in countries with lack of democracy and rule of law because, for instance, our alliance party recently initiated a blasphemy law, actually aimed at protecting majority from minority rather than vice versa, which is more European experience rather than remedy to solve the problem of hate speech, which might be really translated in hate crimes.
 
Speaking on our own experience, we are engaging media literacy programmes as well because society and social media users are also responsible for not misleading others by sharing information they become disseminators of information and this is a positive development on the one hand because anybody can be journalists and we have more pluralistic digital world, but on another hand it is about responsibilities and we try to teach youths how to distinguish quality media programme for unverified fake news or other types of content.
 
This is in short and we can, I can answer that questions later. Thank you.
 
(Applause.)
 
>> NADIA TJAHJA: Thank you very much for your contribution.
 
>> AUDIENCE: Well, I am representative of the country you mentioned, representative of Civil Society of Russia, academia of Russia. I would like to note that Civil Society in Russia is also important and a huge threat due to the military propaganda against neighboring countries and this is also a huge problem for Russian Civil Society who may be against some political interventions and that is why it is also important, it is killing Civil Society. It is killing another positions which could be inconsistent with. So we have to make, initiate some projects, maybe to analyze what the military propaganda is, how the internet and other media sources are used to widespread of the military propaganda. How the internet could contradict the military propaganda, how it could spread other relevant opinions. How it could be one of the possible media to distribute alternative opinions, to make friendly connections between. And I think as well to have an opportunity to communicate formally and informally with representatives of Ukraine, of Georgia, other countries to have communications to make different context because it is maybe the only source to communication between societies to maybe combat the situation. Thank you very much.
 
(Applause.)
 
>> NADIA TJAHJA: Thank you for your contribution. It really excites me that there seems to be a lot of questions now from the floor. However, before I would like to open the floor I would actually like to invite Clara Sommier to give a short intervention and show the perspective from Google regarding. She works within the EU public policy and government relations team at Google in Brussels. She focuses on how to strengthen the positive impact of the web and make sure that web remains a safe place and follows such actively the work of the European institutions. Before joining Google she worked at the European parliament for three years focusing on security and fundamental rights. Thank you very much for being here.
 
>> CLARA SOMMIER: Thank you. Thank you very much. Indeed I will try to explain very shortly how we address and how we try to address disinformation at Google and on YouTube. First I'm sure you are aware our Google mission is to make the word information universally accessible and useful. This is why from the very start we've tried to address the issue that we are seeing on the platform from spam to content forums to malware and the work we are doing on fake news we see as a continuation of this first work.
 
What has to be said, at this stage, the situation as rightly pointed out by other panelists take many forms. This is why there isn't a one size fits approach which doesn't mean that it doesn't take it seriously. Let me be very clear on that. For us if we show misinformation on a platform, it means that we haven't fulfilled our mission and we haven't served our users. It is a mistake. It is not something we want to do. This is why we are trying to work on different tracts because we know that only one angle won't be enough to tackle this information. We have developed five different tracks for working on to have this 360-degree approach on fake news and disinformation.
 
First what are we doing? The first pillar of our work is trying to promote good content. We have done a couple of changes in our products to make sure that we are going to promote qualitative and authoritative sources when you are looking for information on Google search, as well in Google news. We are trying to demote low quality content as well to make sure you find the right and trustworthy information when you are looking for information.
 
The next pillar, of course, is to fight back against bad actors. We have heard much about that. Money is one of the incentives as well. One way against fighting back against the bad actors is cutting the money flows. For that we have made changes to advertising policy to make clear that platforms that are misleading our users can not use our advertising service if those bad actors can't have ads, they won't make money and it will cut one part of the problem.
 
The next is about empowering the users. We have seen the power of fact check tags. We have employed them on Google news and Google search O it is fact checked when it is checked by a newspaper community or a publisher. Another point about empowering the commune, part of the first pal lar is media literacy. It is also a huge part where we see that we have a role to play together with the Civil Society. This is why we partnered with different Civil Society organisations across the world. I can tell you maybe about one concrete initiative that we launched recently which is called be InterTelligent. We have been working with leading child safety organisation to put together a curricula of activities to teach children how to be safe online. It comes on online format with lots of games, very interactive and powerful as well as an offline book that you can download or get for cheap. Now we only launched in the U.K. but we plan to launch into more European countries and internationally. One part of the curricula focused on the information, on learning to recognize what is real and what isn't. It is one big part of this curriculum. The fourth dimension that I want to mention today is the support to the news industry. We have seen that if we want to help a user find the right information, the right information should be out there. This is why it is part of our mission as well to support the news industry in that regard. We have put, developed the digital innovation fund as well as the Google news initiative recently. We try to support innovative way for the press to display content as well as finance research and project that can make a difference. One example is U.K. organisation fact meta that is using machine learning to see if that is a way to tackle disinformation. It is one of the projects we are supporting to see if that is a solution.
 
Another way we try to protect and support the press is on the subscription. Maybe some of you have had then experience recently while reading a news Article for free, after some time being asked the option to either directly subscribe or to pay to be able to see content of the Article. That is a feature that has been asked by the news industry and that we've incorporated lately and helped to put into place to try to support them. And my last point before we can open up for discussion and question is the collaboration that we are putting into place with other actors. It has been said again and I can only reinforce this point. It is only if we work all together that there can be a solution. This is why we try to support research as well as other projects. I will only mention two projects that I find particularly interesting. First is first draft. It is a community that really is working on the fact checking side and on supporting the newsroom. They have been working a lot on the election as well. The second one is the trust project. It is a project where they try to establish credibility indicators for media outlets, a very interesting one when we have, are faced with this difficult question on what is trustful or not, what is authoritative content or not.
 
Those are only a few ways that we are trying to prevent the spread of disinformation on this platform. We know that the process isn't over and we look forward to be engaging with all the stakeholders.
 
(Applause.)
 
>> NADIA TJAHJA: Thank you very much for your contribution, Ms. Sommier. I'm pleased to have had a diverse perspective from the different stakeholders that are present today, from Civil Society, business, and government.
 
So I would like to open the floor for you to ask any questions or make any comments or raise concerns. Please walk up to the mic.
 
>> AUDIENCE: Hello. I'm from the Ukraine, European Media Platform. I have several questions. Does EuroDIG website have any mechanism to remove fake information or disinformation? Thank you.
 
(Applause.)
 
>> NADIA TJAHJA: Thank you very much for your question. Unfortunately, I am not a representative of the EuroDIG Secretariat. I don't feel comfortable making any comment on that. However, this session is recorded. I will make note of your question. So I will ensure that, I will get your question to the EuroDIG Secretariat. Thank you for your question.
 
Are there any other questions or comments that reflect on the comments or questions of the speakers? Please.
 
>> AUDIENCE: Hello, everyone. Hello. My name is Ahmed, a EuroDIG Fellow. My question is, it has been mentioned that the end user should differentiate between propaganda and fake news, but the question is how can an end user know that this is fake news, propaganda? As end user we normally believe what we see and we check the number of shares that the news has. We believe what we see. How can we know this is propaganda and or fake news and how can users be encouraged towards critical thinking rather than believing what they are seeing? Thank you.
 
(Applause.)
 
>> NADIA TJAHJA: Is there anyone who would like to take this question? Mr. Cesarini.
 
>> PAOLO CESARINI: Yes, valid point. I think the answer lies with the tool, one side may have mentioned it, we need to develop critical spirit and reading by the users. That is a long-term action.
 
The other solution is to create a community of fact checkers and researchers. I mean, the contribute of information requires that there is professional data to debunk fake news. I don't think there is a way to do it otherwise. Somebody else?
 
>> PATRICK PENNINCKX: Propaganda is, of course, is as old as the existence of political interaction. I think it is a very specific form of disinformation which has a specific objective which has a lot of news behind it. And it is organised to destabilize states, communities, ensure that discrepancies are being promoted in societies. There are a number of states that are willingly embarking in this type of propaganda war, you could say. But that is not the sole problem. If you look at, for example, the spread of information, disinformation on Facebook in Myanmar which led to ethnic cleansing or Kenya where the fake opinion polls made a completely different political spectrum in society, these are the things that we have to look at. Of course, political propaganda is certainly part of that because it is organised to disorient and destabilize other countries or communities.
 
>> TAMAR KINTSURASHVILI: There are many tools to verify Articles or photos, to look at the news is shared with a large number of people is no reason to believe it is true. This is a method, they use very sensitive topic. In some cases nobody reads the content itself. The headline and the picture, and visual effects. There are some criticisms regarding fact checking, whether it is necessary or not because some people think that it is like belief. People believe in certain things and there is no reason to debunk.
 
But I think it is about labeling, naming and shaming. If you show, based on facts not on alternative propaganda messages, but based on facts that somebody is lying and you show the profile of this media outlet, who is behind this media, because it is very easy to launch online media platforms in modern worlds and they hope to have, we have folks disseminating fake content. They don't use guardian, USA radio or et cetera. This all helps us to critically assess the content we are sharing on Facebook. And acknowledge our own responsibility. This is in short.
 
>> CLARA SOMMIER: To add to that, one thing that we are doing at Google, also when you are searching for a news outlet, we try to display in the first category what we call the knowledge box. That is what appears before the search results. What we know about the news publishers, what type of topics are covered, if the publisher won any prize. To help you understand a bit better what is behind any news outlet that you can see out there.
 
>> NADIA TJAHJA: Thank you.
 
>> AUDIENCE: I'm from the European Workers Union. Some questions out of the debate. First, we have seen from some of the projects that are mentioned there and which we are working currently that the debunking and fact-checking are a waste of time if they are not done hand-in-hand with the social media. And the search engines. For instance, we made the research with the EPF, part of the media project which we see that if you don't access in realtime to the information that the platform owns, for instance on the impact on votes or on the speed on which some news are propagating, then you are fact checking something that is completely irrelevant and you are missing the real ones. So without tight cooperation and transparency there is no possibility to do anything.
 
There is another exposed problem that we have as traditional media. The platform, they take out a lot of material from the web, not only the news but all the categories of material that can be removed. And they self analyze, self judge and self absolve about the work that has been done. Typical example, the right to be forgotten. Analysis has been made on data that are not shared by a board of experts that has been decided by the direct interested company. And you cannot verify what has been taken out, et cetera.
 
We know only because part of the material that were mentioned on the website of the public service broadcaster are not any more accessible. So for instance, BBC taking account what has been removed. The main things that has been removed, we don't see any reason for it being removed unless it was somebody who was bordered by relevant public comments that does not want to be shared with other people. There is a problem that cannot be judged and a party at the same time.
 
The same thing, we have the risk also in the sounding board and the group that has been put in place because there also, who is the judge and the party? Who is responsible for that? I think this is something that we need to be very careful in mind in the next months.
 
Last point is I think that the exercise of the platform, the group that, high level Expert Group has recommended on trade news -- fake news need to be fact checked through some concrete experience. We have some elections in the period in which you will improve this. Let's work together and let's see if the platform and the traditional media can cooperate together and work together and which tools we need because on the real experience we can get the real lessons about what works and what doesn't work. Thank you.
 
>> NADIA TJAHJA: Thank you for your contribution. I believe Mr. Penninckx would like to reply to this?
 
>> PATRICK PENNINCKX: Yes, gladly.
 
Well, on debunking, I fully agree that it has to be done together: media, social media, the technical community as well. But speed bumps, the way to deal with all of it, have to be handled in tight cooperation and transparency.
 
Now, transparency leads me to your second question, that is: to what extent can we as governments, as communities, leave the sole responsibility to the internet intermediaries? That is a big question. To what extent is the work being done transparent?
 
Some call the social media and service providers enigmatic regulators, based on community standards, which is all very fine, but the issue is really to what extent our governments and states hand over a task which is really a governmental task: to ensure that the standards we are applying are balanced and foreseeable, and that the outcomes are foreseeable.
 
In an earlier session the question was raised: who has read the terms of service of this or that application that you may be using? Obviously, as always, there are very few who say they have read them, except Giacomo, of course.
 
But very few have read them. That is almost logical. There are also a number of recommendations we can make with regard to the terms of service, but the terms of service of internet service providers are not value-free. They are not value-free; that we have to realize. If those values correspond to the values of the community in which you live, they are called community standards, but we don't actually know who has defined those standards, nor whether they correspond to the legislation in a given country. So they are enigmatic, enigmatic regulators. We need more transparency on that.
 
I have a question back to Giacomo, actually. There has been a tradition of media regulation, and the question I would have is: is there a space for social media regulators, as we have had in the traditional media for the past decades? That is a question to the European Broadcasting Union.
 
>> NADIA TJAHJA: Thank you very much. I believe Mr. Cesarini had a brief comment.
 
>> PAOLO CESARINI: I could not agree more about the work between fact-checkers and broadcasters, and fact-checking needs to evolve into something else. Fact-checkers need to become much more robust when it comes to the outcome of their work: they need to check their facts and their sources of information, and also the transparency of who provides the money to the sources that have been debunked. So there should be transparency built into the activity of the fact-checkers, who would be source checkers and researchers at the same time. You mentioned elections. Yes, that hints at one important point, but checking can be done on everything, from the flat earth myth to the European election that is coming up next year.
 
Fact-checking needs to be organised by teams, and the teams that count need to be at the forefront of a priority-setting mechanism that is shared. Priority setting. Yes, we started the work on the 31st; actually, on the 29th we started the Forum activity to progress with the formation of the rules for the code of practice, and two days later we started the discussion amongst fact-checking organisations and technologists.
 
Those are two aspects that go hand-in-hand. And my vision for the future of fact-checking is that it is really a complex set of activities. It starts with the identification of workflows and systems that streamline the methods the different fact-checking organisations across Europe apply, to try to get the best out of them, and also to enhance the possibility for them to communicate and to exercise something you have been hinting at: who controls the controllers? Once fake news is debunked, who tells us that the news is reliable?
 
Peer-to-peer controls can be exercised by the different organisations, independently from any governmental influence; I need to stress this point. The trustworthiness of the work to be carried out within this network relies very much on two things. First, the conditions and terms under which a fact-checking organisation would be allowed into the network: there must be ethical standards and professional standards clearly set out.
 
Secondly, a readiness to submit to peer review, to avoid distortions in the materials that will emerge out of this. Another important point is that fact-checking is not only about checking the news and then putting it in a repository or database as a shared resource, which indeed is useful because it avoids duplication of work and speeds things up. There are three other things that need to be kept in mind when we talk about the fact-checking and source checking of the near future. The first is tools: these organisations do not have much money to buy the tools. There are projects that Google and Facebook promote for them, but we have to have fact-checking that is independent of the platforms, while being in cooperation with them.
 
>> NADIA TJAHJA: We have to come to the last point.
 
>> PAOLO CESARINI: I think these are points that need to be stressed. The second point that needs to be stressed is cooperation with the platforms. For what? Well, use of the data: the fact-checkers, the source checkers and the researchers need data, and that is an area of cooperation which is indispensable. The third point: what do we do with the media? This organisation, this network, is not an end in itself, and not only about naming and shaming. It is about providing information to the media so that they do not fall into the trap of contributing to the amplification of disinformation, because it is important for them to be informed at the right moment about an attempt to weaponize information for bad purposes. So the fourth element is whether or not fact-checking can become a business in itself. That is my final point: whether fact-checking and this type of network can be a business in itself, one that cooperates not only with the platforms but also with the media sector in order to develop a comprehensive and collaborative media landscape across Europe.
 
>> NADIA TJAHJA: Thank you very much. So, to summarize the session and present the final concluding remarks, I invite our next speaker to take the stage. She does research on national, European and international projects concerning social media, data security, child online safety and digital literacy. Thank you very much for providing the final summary before we conclude the session.
 
>> Thank you for inviting me to summarize the session. Thank you to all the panelists for the huge input you have given to the discussion.
 
Let me first recall what Claudia said: all human rights are impacted by the phenomena we are facing now. I think this could be the guiding principle for everything we have discussed, keeping in mind the human rights that are affected by the phenomena we now face on the internet, with fake news, false news, or call it disinformation or propaganda.
 
Coming to the points that have been discussed, it is important that we see that there is an intention behind why disinformation is created, promoted and amplified, and this intention is to cause harm to our democracies. So when we talk about remedies, we need to keep in mind that most of these phenomena are made with intent to cause harm to democracies. And also, as Paolo mentioned, we need to avoid remedies that encroach on the fundamental principle of freedom of expression. We are always caught between these two demands: on the one hand we need to ensure the right to freedom of expression, which is a human right, and on the other hand find effective remedies to the phenomenon.
 
I think an important point from Ana Kakalashvili is that we need to differentiate between propaganda and fake news. Both need to be addressed, probably with different remedies. That was also part of the point that was made: it cannot be left to the users alone to differentiate between propaganda and fake news, false news or disinformation. We need to find remedies that address these phenomena in different ways.
 
So let me have a look at what else was mentioned. Yes, I do think Patrick Penninckx made an important point when he said the technical community has to play a role here and that there need to be refined designs of artificial intelligence that can help to counteract the phenomena. And I do think that Clara Sommier is also looking for those technical approaches, I would say. I don't say it is a solution, but it can be a technical approach to the phenomenon. Again I would like to quote Tamar Kintsurashvili: you said that this is all about making good choices. This brings me to something I would like to present from the work our foundation is doing in Germany. You might have seen the leaflets on your chairs. One of the concerns mentioned about information disorder was that while propaganda creators are very strong, positive, critical communicators lack visibility, and I do think something can be done about that. We have done a research study in Germany on 620 social engagement projects, where we looked at how social engagement projects from civil society are using social media for their work. It is known that civic engagement fundamentally promotes democratic values, and what we found out is that this can be reinforced when social media are used for civil society engagement.
 
There is an impact that goes beyond what these projects do for society when they use social media. And we have seen that very often these positive long-term effects are not the first priority on the agenda of the people who are engaged for society, but when they see that they can achieve these long-term effects with social media, they strengthen their social media engagement.
 
The study also substantiates that the use of social media for civic engagement has the potential to preventively counteract tendencies towards radicalization. We have heard a lot about fact-checking and what we can do against false information; with these social engagement projects we see that they produce counter narratives, or community narratives, against false information through the work they are doing, once they spread information about their work via social media. Therefore they tend to promote democracy rather implicitly through the work they are doing. I would just like to refer you to the website printed on the leaflet; there you will find the study results, I think about a month after this conference.
 
Thank you so much for listening and have a nice evening.
 
(Applause.)
 
 
''This text is based on live transcription. Communication Access Realtime Translation (CART), captioning, and/or live transcription are provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings. This text is not to be distributed or used in any way that may violate copyright law.''
 


[[Category:2018]][[Category:Sessions 2018]][[Category:Sessions]][[Category:Media and content 2018]]

Revision as of 16:28, 2 July 2018

5 June 2018 | 16:45-18:15 | GARDEN HALL | YouTube video
Consolidated programme 2018 overview

Session teaser

The session will deal with problems arising from certain types of use of social media channels and possible remedies to address them. The purpose of the session is to shed light on the effects misinformation, disinformation, malinformation, false and fake news can have on democracy and the climate of society. Participants to the session will be invited to give input on the most pressing concerns from their point of view. Key participants will host interventions to explore possible remedies and their effects.

Keywords

  • Information disorder
  • Misinformation
  • Disinformation
  • Malinformation
  • Fake news
  • Remedies
  • Community/Alternative Narratives
  • Media literacy
  • Legal Regulation
  • Monitoring and Deletion

Format

The session will start with a 5-minute introduction presenting the state of the issue, followed by a 15-minute discussion with EuroDIG participants, who can raise their concerns in an open-mic style. Afterwards, there will be key participant interventions of 5 minutes each, interspersed with 15 minutes of audience engagement, concluded by the key participants and the moderator.

Further reading

Recommendations by Nadia Tjahja:

Recommendations by Yrjö Länsipuro

Many of the issues that will be subject of PL 2 were aired last night at a revealing and pretty much unprecedented hearing in the U.S. Senate where 44 Senators from the Judiciary and Commerce Committees grilled Mark Zuckerberg for 4 hours, with more to come today from the House side. One of the questions that came to mind: are we now in for more government regulation of the platforms, also in the U.S.? I also noted that GDPR got favorable mentions from some Senators...

Recommendations by Livia Walpen:

Recommendations by Narine Khachatryan:

Recommendations by Charlotte Altenhoener-Dion:

People

Focal Point

  • Nadia Tjahja, Steering Committee Member (WEOG & EEG), Youth Coalition on Internet Governance

Organising Team (Org Team)

The Org Team is a group of people shaping the session. Org Teams are open and every interested individual can become a member by subscribing to the mailing list.

  • Rim Hayat Chaif, Journalist, Blogger and Human rights activist
  • Jutta Croll, Stiftung Digitale Chancen
  • Charlotte Altenhöner-Dion, Council of Europe
  • Kristina Olausson, Policy Officer, ETNO
  • Narine Khachatryan, Safer Internet Armenia
  • Livia Walpen, International Relations, Federal Department of the Environment, Transport, Energy and Communications DETEC, Federal Office of Communications OFCOM, Switzerland 
  • Julie Mozer, Communications Officer, A Jewish Contribution to an Inclusive Europe
  • Sandro Karumidze, Director Telecom Business Development at Georgian IT Integrator UGT and Chairman of Internet Society - Georgia chapter
  • Stéphanie Matt, Media Convergence & Social Media, DG Connect, European Commission
  • Claudia Scandol, Young European Leadership, YouthDIG Fellow
  • Virginija Balciunaite, Young European Leadership, YouthDIG Fellow
  • Aamir Ullah Khan
  • Anna Keshelashvili, Board Member ISOC - Georgia, Professor of media and communications at Georgian Institute of Public Affairs
  • Menno Ettema, Council of Europe
  • Ketevan Kochladze
  • Giacomo Mazzone
  • Anastasiia Korotun
  • Ucha Seturi

Moderator & Key Participants

Please find a PDF here with the Moderator & Key Participants' biographies

  • Paolo Cesarini, DG Connect, European Commission
  • Patrick Penninckx, Head of the Information Society Department, Council of Europe
  • Ana Kakalashvili, Institute for the Development of Freedom of Information
  • Jutta Croll, Stiftung Digitale Chancen
  • Tamar Kintsurashvili, Head of Media Development Foundation, Mythbuster
  • Clara Sommier, Google

Remote Moderator

Ketevan Kochladze

Reporter

  • Adriana Minovic

Reporters are assigned by the EuroDIG secretariat in cooperation with the Geneva Internet Platform. The Reporter takes notes during the session and formulates 3 (max. 5) bullet points at the end of each session that:

  • are summarised on a slide and presented to the audience at the end of each session
  • relate to the particular session and to European Internet governance policy
  • are forward looking and propose goals and activities that can be initiated after EuroDIG (recommendations)
  • are in (rough) consensus with the audience

Current discussion, conference calls, schedules and minutes

See the discussion tab on the upper left side of this page.

  • 2 May 2018, 3pm CEST - Coordination Meeting on the Title & Scope of the session
  • 17 May 2018, 9am CEST - Coordination Meeting on Key Participants

Messages

A short summary of the session will be provided by the Reporter.

Find an independent report of the session from the Geneva Internet Platform Digital Watch Observatory at https://dig.watch/resources/information-disorder-causes-risks-and-remedies

Video record

https://youtu.be/nSibifXPPYg

Transcript

Provided by: Caption First, Inc. P.O. Box 3066, Monument, CO 80132. Phone: +001-877-825-5234, +001-719-481-9835, www.captionfirst.com


This text is based on live transcription. Communication Access Realtime Translation (CART), captioning, and/or live transcription are provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings. This text is not to be distributed or used in any way that may violate copyright law.


>> NADIA TJAHJA: Hello. Good afternoon, ladies and gentlemen. Welcome to the plenary session on information disorder: causes, risks and remedies. Before I start I would love to invite my key participants to take their seats here in the front. Thank you very much. For those coming in, I want to once again welcome you to the plenary session on information disorder: causes, risks and remedies. I'm Nadia and I'm a Steering Committee member of the Youth Coalition on Internet Governance. It is my pleasure to host this discussion looking at the different remedies being developed on the different stakeholder fronts. I'm excited to see and hear your views during this session. I encourage you to actively contribute and to actually question what is being said here today.

But to put this session into perspective: information disorder is distorting people's ability to make sense of the world around them and threatens democratic processes around the world. While this is not a new phenomenon, the problem is compounded both by the speed at which information travels in this networked world and by the technological and cultural filter bubbles in which we live our lives. This is a problem that impacts us all.

Today we will be focusing on remedies to the information disorder. Just to clarify, we would like to move away from the term fake news. This term is simply too small to describe the complexity we face surrounding this topic, and it has been sensationalized in our society. The Council of Europe published a study on information disorder and proposed a framework which divides the information pollution phenomenon into three types. First, misinformation: when false information is shared, but no harm is meant. Then disinformation: when false information is shared with the intent to harm people. And finally malinformation: when genuine information is shared to cause harm, often by moving information designed to stay private into the public sphere. This leads me to the question for today: how can we address the information disorder? We will have key participant interventions by the European Commission, IDFI, the Council of Europe, Google, and the Digital Opportunities Foundation, and it will be my pleasure to introduce them in a moment.

But first I would like to open the floor and hear your concerns, which we then can have a look at throughout the course of our session and giving our key participants an indication of the points that you would like them to address during their interventions.

We have an open mic here in the middle, so please do come up and share your concerns. State your name and affiliation, and don't forget that you can also add your concerns to the word cloud through (indiscernible).com, which is anonymous. Come to the microphones and share with us what your concerns are. I would like to ask the moderator from the last session to provide a brief overview of the major concerns that were addressed during that session and their conclusions.

>> AUDIENCE: I'm here together with my colleague Virginija. We moderated the session earlier this afternoon. The main take-aways: there is an abundance of challenges that have to be addressed on three levels: governments, internet platforms and businesses, and users.

But with threats also come opportunities: with AI and algorithms we can create better designs and also educate citizens to be more informed about how they use the internet.

There is a large gap between technology experts and policymakers, and it is important to try to close that gap. We also need to be more active and transparent about disinformation, for example in the political sphere: who is funding campaigns, and the overall importance of the advertising platforms. This has to be done on a multistakeholder level. Thank you.

>> NADIA TJAHJA: Okay? Thank you very much. Are there any other concerns from the audience that anybody would like to raise?

Ooo, no one has concerns about the information disorder?

While we wait for further participation from the audience and from our online participants, I would like to move to our first intervention and introduce Mr. Paolo Cesarini, head of the unit responsible for media convergence within the Directorate-General for Communications Networks, Content and Technology at the European Commission. He has taught as a visiting professor and lecturer at the universities of Turin and Montpellier. I would like to ask you to stick to the time limit of five minutes. I give you the floor.

>> PAOLO CESARINI: Thank you, Nadia. Well, I can see the concerns popping up on the screen in the meantime: propaganda, governance, surveillance, lack of ethical data cleaning, lazy data; surveillance comes up again. All these concerns have been widely discussed in the last months at the European Commission, which has organised a number of consultations and also created a High Level Expert Group to address a phenomenon which is far from easy to deal with. And indeed, the first issue we had to deal with is: what is the phenomenon we want to address? There were several definitions; the one developed at the Council of Europe is indeed an important one, and the work on information disorder is groundbreaking. The approach taken by the Commission is a bit narrower in scope. We are not dealing with misinformation or malinformation but with disinformation. This is, from memory: verifiably false or misleading information which is created, promoted, amplified or disseminated online for profit, for economic gain, or to intentionally deceive the public and cause public harm.

So those are the elements of this definition. The most important, I think, is the harm that can be caused to democracies. Public harm is strictly linked to the protection of the functioning of our democracies, in particular as regards electoral processes, linked to propaganda which can be orchestrated by state or non-state actors, from third countries or from domestic groups. The phenomenon is far from linear and not easy to capture in a single formula.

Another element of this is the harm that can be caused to public policies. That is very important, because the sustainability and credibility of an institutional system rely on the good communication that policymakers can have with the public at large. Messages which blur the actual intentions of policymakers on very important issues, such as immigration, health, climate change or finance, you name it, can distort the perception of the public and therefore make policies difficult to implement correctly.

The definition, by the way, excludes pure journalistic errors, et cetera; it is important to keep that in mind. One of the most important problems is how to avoid remedies that would encroach on the fundamental principle of freedom of expression.

How do we avoid the information space being polluted, while at the same time giving people the possibility to use the internet and other means of communication to freely express their views and opinions? It is far from an easy topic. The answer, and this comes out very strongly from the public consultation and the survey carried out in the last month, cannot be found only at the national level; there is a need for answers at the EU level. First of all, because disinformation knows no borders. Many Member States are affected by these disinformation campaigns, the orchestration of public opinion in different areas, sometimes by the same actors. And because the internet has no borders, a piece of information can originate in another country and affect populations across the board.

So the answer, to avoid fragmentation of responses, has to be coordinated at the higher level, while keeping in mind that, due to the principle of subsidiarity, there is a wide margin for Member States to remain active and do what is needed at the national level.

This is a phenomenon in constant evolution, as the technologies that support it, and which have been harnessed by malicious actors, are constantly evolving. The response cannot simply be designed once and for all; it has to be constantly evaluated and adjusted to the changing needs that are, or could be, coming up as the phenomenon evolves.

At this point in time, there are certain main lines of action that are going to be implemented in the coming weeks and months. They revolve around three main concepts. The first is transparency. Transparency means, in essence, starting with a self-regulatory process, while reserving the possibility of stronger action if the process fails to deliver the expected results by the end of this year.

When we talk about transparency, we talk about something very concrete. We need more transparency about the flows of money that finance purveyors of disinformation. We need to cut those flows of money, to demonetize the websites that systematically purvey false information. We need more transparency about political advertising, which came up quite prominently during the Congressional debates of last fall in the United States.

We need more transparency about human versus automated interactions by bots, and about fake accounts that do not have any real individual behind them.

We also need more transparency about the sources of information. That requires elaborating systems that would enable users of such platforms to better identify where information comes from and whether the sources are trustworthy, by developing indicators useful for that purpose. That would also require strict cooperation with the media sector, of course, so as to dilute the bad content with good content without falling into censorship or surveillance. The idea is to provide more transparency, to give information that is objective in nature and reflects the interests and practices of good journalism.

And finally, we need transparency as regards data. We need access to platform data for research purposes, to better map disinformation campaigns and their spread online. We need this data in order to switch from a reactive approach to a preemptive one. To act faster means being able to understand where the possible targets of disinformation are and which topics are going to develop.

The second point is about the credibility of information. Credibility requires that our societies be better equipped to sort falsehood from truth. And this cannot be done by government, because no Ministry of Truth, of course, can be the sole solution to this problem. This needs to be done by investigative journalists, the real experts, who probably will not have a position in the newsrooms, which unfortunately are cutting down on in-house staff, but who could be usefully employed in emerging sectors such as fact-checking.

We are trying to create a network of fact-checkers across Europe who will develop best practices and share information so as to have a common approach to disinformation, and for that the Commission will provide a digital service infrastructure which will also be used to link the work of fact-checkers with researchers. When I say researchers, I think about computer scientists, network analysts, and of course experts in communication and political scientists, because all these competencies are necessary in order to have real-time mapping of phenomena that can deeply affect our democracies, and therefore we need to have the means to combat them.

Finally, the third point is about diversity. The diversity of information is a tricky one: how do we ensure in the long run that citizens have access to a diverse information landscape, when we know that the business model of traditional journalism is in deep crisis? We need to activate tools, financial tools, to support the training of journalists, the modernization of newsrooms, and the uptake of new technologies that enable faster identification of information and content verification, but also new technologies used for the public and social good, like blockchain, cognitive algorithms and artificial intelligence, that can eventually educate our audiences to be more demanding when it comes to the quality of information they get online.

Those measures are on the supply side. We also need to act on the demand side, because in order for audiences to be demanding, they need to be critically alert and able to read critically the information they get, whether offline or online. So, yes, those are my last words.

So the last words are about the need to deploy resources at a scale that must not be allowed to fall too low, to promote media literacy actions and support grassroots organisations in all Member States, with better organisation, again, as with the fact-checkers, at the EU level, to develop best practices not only for schools and the young generations but also for older generations, with tools adapted to each of the two groups of citizens.

(Applause.)

>> NADIA TJAHJA: Thank you very much, Mr. Cesarini. May I ask that the other microphone be moved down, so that if we go slightly over time I have the opportunity to intervene? But thank you very much.

We did go slightly over, but I will give a little bit of additional time to accommodate this.

Before we go into our second intervention I just wanted to have a look at the word cloud, and I have invited a YouthDIG Fellow to give a short outline of the major concerns that were raised online and by participants here in the room.

>> VIRGINIJA BALCIUNAITE: Thank you very much, everybody, for giving us a lot of various issues to focus on. People were raising concerns about trust issues, which can result, for example, in people decreasing their participation in political affairs, such as voting.

People were also concerned about spamming; we saw a lot about spam on the screens. As well, people were unsure how to distinguish between fake advertising and fake news, how it can be verified, who can verify it, and how people can trust the information they find online.

Another issue expressed was national security, for example in regional areas such as the Baltic states, which are affected by Russian propaganda and Russian trolls, as well as Ukraine. Also the bubble society: people these days are living in their own bubbles, not exposed to different types of information.

Also mentioned were media literacy and critical thinking, which are discouraged, as well as data breaches and corporate surveillance.

>> NADIA TJAHJA: Great. Thank you very much, Virginija.

(Applause.)

>> NADIA TJAHJA: So having looked at this, I wanted to ask, are there any remote participants who had any questions? No? Not at this time.

I would like to move into the next intervention and invite Ana Kakalashvili, an analyst at the Institute for Development of Freedom of Information and at a European Commission project. She has been involved in internet governance for over six years, is a member of the Internet Society (ISOC) in Georgia and the founder of the ISOC Blockchain interest group. Please take the floor for your intervention. Is there a microphone over there? Thank you.

>> ANA KAKALASHVILI: Thank you. Yes, it's working. Hello, everyone. It is a pleasure to be here. Let me start with a small introduction. I will try to be brief.

Fake news is misinformation, propaganda, in social media, and it is driving the content of the important political conversations among all of us: citizens, stakeholders, governments, everyone. We are so taken by it. News and information, we see, have become weaponized against the very institutions and values that freedom of speech was supposed to protect.

Trust me, I have been around here and we have been talking about it for a long time: information, freedom of speech. But the situation, and I'm sure this is not only my personal view, has really gotten worse. We have observed with our own eyes the U.S. election and the French election, and how much fake news has been influencing the process and the free will of the voters. We still don't know what is happening. And the scenario has somehow caused a complete collapse of public trust in democratic institutions. Governments are so fearful right now of this populist nationalism that they are scrambling and trying to find policies aimed at combating fake news.

It is not that they are trying to somehow censor anyone; they are trying their best to find a solution in regulatory terms. And this is where we sometimes come to a confusing situation. For example, Germany adopted legislation requiring companies to pay fines of up to 50 million Euros if they fail to delete illegal content within 24 hours. I'm sure we know that.

In Italy, they have introduced a bill aimed at preventing the manipulation of online information, with a fine for non-journalistic websites or social media that publish or disseminate fake or exaggerated news regarding facts that are manifestly false or unproven. They may be fined, let me look at my notes, 5,000 Euros.

We see that governments are really trying; it is not that they want something bad. What I've learned is that there is mostly nowhere in this scenario where you have good and bad on the scale. It is always one good value against another good value, and it depends very much on the situation and on the circumstances which one will go up and which one will go down.

And I am very confused when it comes to this fake news topic, but let me tell you: solving the problem with regulation will certainly create more problems if done wrongly and not very carefully. Government regulation of fake news would be a cure far worse than the disease. Especially for me, coming from Georgia, from this region, you have to be so careful. In Georgia and my neighboring countries, and other countries that are just establishing these democratic values, the principles of good governance and the rule of law, it is a very tiny and very thin line to walk when it comes to regulating the most crucial things, which are freedom of speech and freedom of information.

Not that we should wash our hands and not do anything, of course not. But it means keeping the balance and assessing what, in that situation, is more important to defend, while at the same time tackling the issue that we are so critically facing, which is fake news and what it does, as I said, to the free expression of the voters. It is very crucial to find the balance and keep to it.

And let me tell you about stats; I looked at some stats. For example, an Economist/YouGov poll asked people whether courts should be able to shut down media outlets for the stories they publish or broadcast: 28 percent were in favor, 29 percent were opposed, and 49 percent were undecided. And of course, those who are Republicans are more in favor of regulation. Stats are not always in favor of what I'm saying, that we don't need harsh regulation. But you need to understand that stats are not something we should always rely upon, and there are bigger values we always have to keep in mind when it comes to regulation.

So let me tell you that we in Georgia, and I'm sure my neighboring countries too, all look up to the European Union and its standards. We have so many standards to implement and so many things we have to keep up with in Europe. You have to be our example of defending the most crucial values, which are freedom of speech and freedom of information. I'm sure there are other ways to tackle fake news; for example, the previous speaker has already spoken about transparency of media outlets, and there are many other ways to avoid this kind of harsh, authoritarian regulation. Of course, and this will be more or less my last point, to avoid the questions later: we have to make a very clear distinction between what is propaganda and what is other fake news. Propaganda, an information war, is something far, far more dangerous than anything else. So propaganda has to be tackled and treated, not regulated but solved, in one way: for example, the government first of all has to officially recognize that there is another state fighting my country with information. It has to be officially recognised, and there have to be anti-propaganda strategies and transparency for many different organisations, whether nongovernmental or media. Then we have fake news of other types: someone sending some information with whatever aim, parody, social, political, financial or other aims. This has to be tackled separately, in some other way. But what I always say, and I will keep saying: do not make it easy for yourself. Do not go for a very strict, harsh regulation. This is very, very dangerous, not only because it might create censorship, but because it might also create self-censorship.

>> NADIA TJAHJA: Thank you. Sorry to have to intervene, but I would like to move on to the next item. First of all, thank you very much. I would like to use this opportunity to open the floor and ask whether there are any projects or remedies that you have been engaging with, or that you know about, that you would like to share. Therefore, we will first open a word cloud. Yes!

And please, do submit your projects, remedies. What have you been working on? What have you been contributing to? What have you seen online? Also our remote participants, please do send in what you are working on. We are really interested to see what is out there locally, nationally, regionally, internationally. Very happy to have you join and participate this way.

So while we collect all this information of the different types of projects and remedies there are I would like to invite Jutta Croll on the project involving hate speech.

>> JUTTA CROLL: Ana Kakalashvili has already explained the law. Are there any questions about the law? I can explain a little more in detail how it works.

It is called the Act to Improve Enforcement of the Law in Social Networks, the Network Enforcement Act. From the title it becomes clear that it is a law that tries to make the law we already have in the German Penal Code enforceable also on social networks.

So the new law didn't make anything illegal that had not been illegal before in Germany. It also applies only to the dissemination of illegal content via social networks.

And then it is also important to know that it only applies to social networks that have more than 2 million people registered. That is important to understand: it is not that we have one law for larger networks and another, less demanding law for smaller social networks; the law is simply applicable only to those big social networks. That is because the government recognised that it is more necessary to address the networks where false information, fake news, can be spread to a broader audience. It is also necessary to add that the law differentiates between content that is obviously illegal, where you can recognize within a really short time that it is illegal and which then needs to be taken down within 24 hours, and any other content where it is questionable whether it is legal or not, for which there is a time span of seven days in which due consideration must be given to whether this content needs to be blocked or taken down. So I am ready to answer any other question about the law. I am not defending the law; I appreciate that we had a critical debate in advance of the law, and the critical debate is going on.

The law also sets a time frame for reporting by those social networks that it applies to. The law came into force in October last year, and the first reports are due at the end of July this year. Then we will also know, in a very transparent manner, how it was handled by the social network platforms and how it really could come into effect. I think we need to continue this debate on whether such laws can work, or do work, once we have the first reports from the social networks. Thank you.

>> NADIA TJAHJA: Thank you very much. Are there any other members of the audience who would like to come up to the microphone and talk about any projects you are working on? Or anything you've seen? You have any concerns you would like to address? We have an open mic session. There's an open mic. Please, go that way. Yes, please.

>> AUDIENCE: Hello. Dominick Share from the Council of Europe, but I'm speaking as a private person. I do not have my own project that I would like to present, but I have been wondering: if we just block the content that is online and that we consider not suitable or not up to our standards, are we actually helping to solve this problem, or are we merely fighting the symptoms? Because if we block content, it just goes away very unnoticed and most citizens won't realize. So I rather doubt that it will actually help. The question is to find where this borderline is, and also to make broader society aware of where this borderline is and of what is acceptable and what is not. Thank you.

>> NADIA TJAHJA: Thank you very much. I also believe that Mark Carvell is attending the session, the head of international online policy at the Department for Digital, Culture, Media and Sport of the U.K. government. Perhaps you could share some comments?

>> AUDIENCE: Yes, thank you, Nadia. Thanks very much for inviting me to take the mic to say a few words about the U.K. government policy development in this area and at this very timely session on disinformation.

We welcome the fact that there is now a lot of international interest in developing responses to this fast-emerging risk of disinformation, as here at EuroDIG.

And the U.K. government supports a coordinated international response on this issue, in particular on how to deal with maliciously intended interference in sovereign democratic processes. That is a key challenge for governments. We must address those challenges in concert with the stakeholder community: the private sector, the technical community, Civil Society.

And I have to say that at the moment there is no evidence of that happening in the U.K. in terms of direct threats to our democratic processes, but we do recognize that this is a risk; there are threats here that we need to be prepared for. We have a team looking at this in my Ministry, the Department for Digital, Culture, Media and Sport. We are working on this under our digital charter, which is our approach to developing norms of behavior in the internet sphere and dealing with online harms and threats to the welfare of our citizens. Disinformation is a key element of that policy development. We don't have the laws yet, but we are doing a lot of work, and we are working with the technical community, the social media platforms and Civil Society, as I say, and also with European and international partners on this.

We are focusing on five areas. Firstly, we need to get our facts right: we need to undertake a lot more research into understanding this problem, a point that was well made earlier in this session. Secondly, on media literacy, we are looking at the opportunities for developing education and guidance for our citizens to give them the skills to differentiate between fact and fiction. We are working with our colleagues at our education ministry on that.

>> NADIA TJAHJA: May I ask you to come to your concluding remarks?

>> AUDIENCE: I'll finish on the final elements of our policy, which I think are important to make clear. We are working with the technical sector to develop a framework of voluntary practices that we in government can support, and we are checking whether we have the right regulation in place. We are well aware of the risks that Ana in particular articulated about an approach which would be heavy handed and impact freedom of expression, and we are looking across our strategic capabilities in the whole of government. I hope that's a helpful intervention. I'm sorry I ran over a little bit.

>> NADIA TJAHJA: Thank you for your contribution. I'm pleased there are so many projects coming up. I wonder if Claudia would like to come up and give an overview of the projects coming up online.

>> AUDIENCE: Hello again, everybody. A lot of initiatives are focusing on education and online resources. I believe the first one that was submitted was anonymous, if I caught that correctly, which I guess is one option. There are a lot of initiatives about journalists: trust initiatives for journalists, cyber hygiene training. There was one in another language; I believe it was Safer Internet Day? There are stop fake news initiatives, research being performed by think tanks and other organisations, the No Hate Speech Movement, and lots of studies being performed. Thank you.

>> NADIA TJAHJA: Thank you very much. So I just wanted to have a look at the remote moderator. Is there a contribution from the online community that they want to share with us?

There are currently no questions. I'm reaching out to the audience at this time. Do you have any interventions? I'm next going to Mr. Patrick Penninckx, head of the Information Society Department at the Council of Europe. He contributed to the transformation processes of the organisation and to developing partnerships with national and international institutions. Heading the Information Society Department within the Directorate General of Human Rights and Rule of Law, he coordinates standard setting and cooperation activities in the fields of media and cybercrime.

>> PATRICK PENNINCKX: How many of you have read a paper newspaper today? This says more about your age than about your ability to read.

(Laughter.)

>> PATRICK PENNINCKX: Second, who read an online newspaper today? Yeah? A couple more. Also very informed community.

Who read an article in a newspaper to which you have no subscription?

Yeah? The New York Times? The Guardian, yeah? I see. How many of those paid for your reading? So you paid because you read an article on them? Well, that's quite exceptional, quite frankly. I will present to you, I brought it for you especially from Strasbourg, the most important broadcasting device you have ever seen. It takes me a second.

You are all puzzled, but you all have it in your pockets. This has become the most potent broadcasting device that we are talking about. I will talk to you about five different things. First I will speak about trust: when we speak about false information, basically we are talking about trust and trusted information.

Second, I will speak about news avoiders and news skimmers. Then I will speak about cocooning and about actors. First, trust.

The Edelman Trust Barometer, has anyone heard about it? Media is the least trusted institution in all or almost all of the countries surveyed by the Edelman Trust Barometer.

Distrusted in 22 out of 28 countries. The ones in which it is least distrusted are China, Malaysia and some other countries which play in the same category.

Six in ten people do not know how to tell good journalism from rumors and falsehoods. One in four trust social media for news and information.

That is very little. I'm not surprised, Mark, that in fact we need to do a lot more study on what the impact of false news and false information really is before we draw quick conclusions.

And I think the first lesson about false news or disinformation is: make sure that you get your facts right before distributing more information about disinformation. The reason why the Council of Europe did this study and called it "information disorder" is that everything was heaped onto one single pile called fake news, which was used by any actor that did not like the news that was being produced.

Second, news avoiders and news skimmers. 33 percent of us, I'm not going to ask you because it may be a little bit more difficult to raise your hands, but 33 percent of those interviewed read less news compared to the year before. 33 percent. 19 percent avoid news altogether. 40 percent find news too depressing.

So what is the answer? My next point: cocooning. That is, people start to live in trusted cells. A positive note, Giacomo, is that the traditional media are gaining trust in a number of countries. And Mark, to your relief, the United Kingdom has one of the highest scores for trust in the public media. That is also quite important to mention.

Now, just to reveal to you that we are not alone in this battle, in this fight, in this campaign: I attended a conference entitled "The role of governments and internet platforms."

Any idea who the organiser of this conference was? Microsoft. You were going to say Facebook. Not far wrong, because Facebook and Google participated.

The key thing is that we are all in this together. There is a common goal, obviously, and there are some issues that need to be settled separately. But when you listed the key actors that have to be involved in tackling the issue of fake news, I think you left out a couple. It is not only governments, not only internet companies, not only the consumers; it is also the technical community. When we speak about data protection, for example, we speak about data protection by design. To what extent can we ensure the same in the design of algorithms, of self-learning algorithms, of artificial intelligence, which create filter bubbles and which create the rapid distribution of information and disinformation? It is not that this has never existed before, but it is a question of the speed with which this is currently being distributed. It is also a question of what companies and the technical community can do to create a number of speed bumps that may allow us to counter the spreading of fake news, whether that is being done by governmental agencies or by individuals seeking quick gain.

I think it is important that we also do not forget about the role of Civil Society in this. In this room there are, of course, all the different multistakeholder actors that are involved in internet governance and dialogue, but the key participation is definitely also that of Civil Society. There is quite a number of things to be done, and all the actors I mentioned have their specific role to play. The media actors have their specific role to play with regard to quality journalism and to ensuring that journalists receive the correct education, but there is also a lot of work to be done on debunking disinformation. So when we are speaking about this, look at what each of these parts of our society can do to debunk disinformation, because we are all in this together. That is why a multistakeholder approach also applies here. Thank you.

(Applause.)

>> NADIA TJAHJA: Thank you very much, Mr. Penninckx, for your contribution. After this intervention I would now like to invite Tamar Kintsurashvili. She served as Deputy Secretary of the National Security Council of Georgia, was Director General of the Georgian Public Broadcaster, and was Deputy Director of the CSO Liberty Institute. Tamar is an Associate Professor at university, teaching propaganda research methods.

>> TAMAR KINTSURASHVILI: Good afternoon and good day. First of all, I totally agree with you that the problem is much more complex than the term fake news suggests; I would even say it is more complex than information disorder, because propaganda is part of hybrid warfare. It is not only countries like Georgia and Ukraine that are suffering from this problem because our countries are occupied by a neighboring country, but traditional democracies as well.

We can see intervention in elections in traditional democracies. And the major challenge of the modern world is how undemocratic countries use democracy and democratic institutions, like the media, against democracy and the open society.

As regards solutions, of course we need a collaborative approach when it comes to the fight against hybrid warfare. Not only fact checkers or professional media outlets can address this problem; we need collaboration with governments and educators, because it is about strategic communication as well. It is about transparency, not only of media outlets but of other actors, such as political parties. It is related to money laundering and dirty money coming into our country and being used against democracy itself.

A major problem, as you mentioned, is that this distrust of media is creating a kind of chaos in the country, a tendency not to believe in anything. It is very sensitive, especially in post-Soviet societies coming from a regime under which we distrusted everything. But democracy is about making informed choices, so we need to somehow preserve trust in quality media. That is easy to say but difficult to achieve, because tabloid media inciting hate speech is unfortunately more attractive for our societies, and often this hate against different groups in our societies, or the triggering of historic trauma against our neighbors, is part of these propaganda methods. For instance, our organisation is working on media content analysis, identifying the very different kinds of fake news, which tries to provide society with information about propaganda methods as well.

Turkophobia is part of the problem in Georgia. There are different threats Georgia is facing tied to historical ones, saying that if Russia has occupied us, why not Turkey? It is especially dangerous when we have Muslim minorities in our countries, and it is related not only to inciting hate speech against these groups but in some cases to provoking or mobilising radical groups against the minorities.

As for legal solutions, we actually have very little legislation. Referring to European experiences of criminalizing hate speech is not a solution in countries with a lack of democracy and rule of law because, for instance, our Alliance party recently initiated a blasphemy law actually aimed at protecting the majority from minorities rather than vice versa, which is more the European experience. So it is not a remedy to solve the problem of hate speech, which might really be translated into hate crimes.

Speaking from our own experience, we are engaging in media literacy programmes as well, because society and social media users are also responsible for not misleading others: by sharing information they become disseminators of information. On the one hand this is a positive development, because anybody can be a journalist and we have a more pluralistic digital world; but on the other hand it is about responsibilities, and we try to teach young people how to distinguish quality media content from unverified fake news or other types of content.

That is it in short, and I can answer questions later. Thank you.

(Applause.)

>> NADIA TJAHJA: Thank you very much for your contribution.

>> AUDIENCE: Well, I am a representative of the country you mentioned, a representative of Civil Society and academia in Russia. I would like to note that Civil Society in Russia is also important and under a huge threat due to the military propaganda against neighboring countries. This is also a huge problem for the Russian Civil Society that may be against some political interventions, and that is why it is so important: it is killing Civil Society, it is killing other positions that may be inconsistent with the official one. So we have to initiate some projects, maybe to analyze what military propaganda is, how the internet and other media sources are used to spread military propaganda, how the internet could contradict military propaganda, how it could spread other relevant opinions, how it could be one of the possible media to distribute alternative opinions and to make friendly connections between societies. And I think we also need an opportunity to communicate formally and informally with representatives of Ukraine, of Georgia, of other countries, to have communications in different contexts, because it is maybe the only channel of communication between societies that could help combat the situation. Thank you very much.

(Applause.)

>> NADIA TJAHJA: Thank you for your contribution. It really excites me that there seem to be a lot of questions now from the floor. However, before I open the floor I would like to invite Clara Sommier to give a short intervention and show the perspective from Google on this topic. She works within the EU public policy and government relations team at Google in Brussels. She focuses on how to strengthen the positive impact of the web and make sure that the web remains a safe place, and she actively follows the work of the European institutions. Before joining Google she worked at the European Parliament for three years, focusing on security and fundamental rights. Thank you very much for being here.

>> CLARA SOMMIER: Thank you. Thank you very much. Indeed, I will try to explain very briefly how we address, and how we try to address, disinformation at Google and on YouTube. First, I'm sure you are aware that Google's mission is to make the world's information universally accessible and useful. This is why from the very start we have tried to address the issues that we are seeing on the platform, from spam to content farms to malware, and we see the work we are doing on fake news as a continuation of this first work.

What has to be said at this stage is that the situation, as rightly pointed out by other panelists, takes many forms. This is why there isn't a one-size-fits-all approach, which doesn't mean that we don't take it seriously. Let me be very clear on that. For us, if we show misinformation on a platform, it means that we haven't fulfilled our mission and we haven't served our users. It is a mistake; it is not something we want to do. This is why we are trying to work on different tracks, because we know that tackling it from only one angle won't be enough. We have developed five different tracks to work on, to have a 360-degree approach to fake news and disinformation.

First, what are we doing? The first pillar of our work is trying to promote good content. We have made a couple of changes in our products to make sure that we promote qualitative and authoritative sources when you are looking for information on Google Search, as well as in Google News. We are also trying to demote low-quality content, to make sure you find the right and trustworthy information when you are looking for it.

The next pillar, of course, is to fight back against bad actors. We have heard much about that; money is one of the incentives. One way of fighting back against bad actors is cutting the money flows. For that we have made changes to our advertising policy to make clear that platforms that mislead our users cannot use our advertising services. If those bad actors can't have ads, they won't make money, and that will cut off one part of the problem.

The next pillar is about empowering the users. We have seen the power of fact check tags; we have deployed them on Google News and Google Search, so an item is labelled as fact checked when it has been checked by a newsroom or a publisher. Another point about empowering the community, part of this pillar, is media literacy. It is also a huge part where we see that we have a role to play together with Civil Society. This is why we have partnered with different Civil Society organisations across the world. I can tell you about one concrete initiative that we launched recently, called Be Internet Legends. We have been working with leading child safety organisations to put together a curriculum of activities to teach children how to be safe online. It comes in an online format, with lots of games, very interactive and powerful, as well as an offline book that you can download or get cheaply. For now we have only launched it in the U.K., but we plan to launch it in more European countries and internationally. One part of the curriculum focuses on information, on learning to recognize what is real and what isn't; it is one big part of this curriculum.

The fourth dimension that I want to mention today is support for the news industry. We have seen that if we want to help users find the right information, the right information has to be out there. This is why it is part of our mission to support the news industry in that regard. We have developed the Digital News Innovation Fund as well as, recently, the Google News Initiative. We try to support innovative ways for the press to display content, as well as to finance research and projects that can make a difference. One example is the U.K. organisation Factmata, which is using machine learning to see if that is a way to tackle disinformation. It is one of the projects we are supporting to see if that is a solution.

Another way we try to protect and support the press is on subscriptions. Maybe some of you have had the experience recently, while reading a news article for free, of being offered after some time the option either to subscribe directly or to pay to be able to see the content of the article. That is a feature that was asked for by the news industry and that we have lately incorporated and helped to put into place to support them. My last point, before we open up for discussion and questions, is the collaboration that we are putting into place with other actors. It has been said again and again, and I can only reinforce this point: there can only be a solution if we all work together. This is why we try to support research as well as other projects. I will mention only two projects that I find particularly interesting. The first is First Draft: it is a community that is really working on the fact checking side and on supporting newsrooms, and they have also been working a lot on elections. The second one is The Trust Project, which tries to establish credibility indicators for media outlets, a very interesting one when we are faced with the difficult question of what is trustworthy or not, what is authoritative content or not.

Those are only a few of the ways in which we are trying to prevent the spread of disinformation on our platforms. We know that the process isn't over, and we look forward to engaging with all the stakeholders.

(Applause.)

>> NADIA TJAHJA: Thank you very much for your contribution, Ms. Sommier. I'm pleased to have had a diverse perspective from the different stakeholders that are present today, from Civil Society, business, and government.

So I would like to open the floor for you to ask any questions or make any comments or raise concerns. Please walk up to the mic.

>> AUDIENCE: Hello. I'm from the Ukraine, European Media Platform. I have several questions. Does EuroDIG website have any mechanism to remove fake information or disinformation? Thank you.

(Applause.)

>> NADIA TJAHJA: Thank you very much for your question. Unfortunately, I am not a representative of the EuroDIG Secretariat. I don't feel comfortable making any comment on that. However, this session is recorded. I will make note of your question. So I will ensure that, I will get your question to the EuroDIG Secretariat. Thank you for your question.

Are there any other questions or comments that reflect on the comments or questions of the speakers? Please.

>> AUDIENCE: Hello, everyone. Hello. My name is Ahmed, a EuroDIG Fellow. My question is, it has been mentioned that the end user should differentiate between propaganda and fake news, but the question is how can an end user know that this is fake news, propaganda? As end user we normally believe what we see and we check the number of shares that the news has. We believe what we see. How can we know this is propaganda and or fake news and how can users be encouraged towards critical thinking rather than believing what they are seeing? Thank you.

(Applause.)

>> NADIA TJAHJA: Is there anyone who would like to take this question? Mr. Cesarini.

>> PAOLO CESARINI: Yes, valid point. I think the answer lies with the tool, one side may have mentioned it, we need to develop critical spirit and reading by the users. That is a long-term action.

The other solution is to create a community of fact checkers and researchers. I mean, the contribute of information requires that there is professional data to debunk fake news. I don't think there is a way to do it otherwise. Somebody else?

>> PATRICK PENNINCKX: Propaganda is, of course, is as old as the existence of political interaction. I think it is a very specific form of disinformation which has a specific objective which has a lot of news behind it. And it is organised to destabilize states, communities, ensure that discrepancies are being promoted in societies. There are a number of states that are willingly embarking in this type of propaganda war, you could say. But that is not the sole problem. If you look at, for example, the spread of information, disinformation on Facebook in Myanmar which led to ethnic cleansing or Kenya where the fake opinion polls made a completely different political spectrum in society, these are the things that we have to look at. Of course, political propaganda is certainly part of that because it is organised to disorient and destabilize other countries or communities.

>> TAMAR KINTSURASHVILI: There are many tools to verify Articles or photos, to look at the news is shared with a large number of people is no reason to believe it is true. This is a method, they use very sensitive topic. In some cases nobody reads the content itself. The headline and the picture, and visual effects. There are some criticisms regarding fact checking, whether it is necessary or not because some people think that it is like belief. People believe in certain things and there is no reason to debunk.

But I think it is about labeling, naming and shaming. If you show, based on facts not on alternative propaganda messages, but based on facts that somebody is lying and you show the profile of this media outlet, who is behind this media, because it is very easy to launch online media platforms in modern worlds and they hope to have, we have folks disseminating fake content. They don't use guardian, USA radio or et cetera. This all helps us to critically assess the content we are sharing on Facebook. And acknowledge our own responsibility. This is in short.

>> CLARA SOMMIER: To add to that, one thing that we are doing at Google, also when you are searching for a news outlet, we try to display in the first category what we call the knowledge box. That is what appears before the search results. What we know about the news publishers, what type of topics are covered, if the publisher won any prize. To help you understand a bit better what is behind any news outlet that you can see out there.

>> NADIA TJAHJA: Thank you.

>> AUDIENCE: I'm from the European Workers Union. Some questions out of the debate. First, we have seen from some of the projects that are mentioned there and which we are working currently that the debunking and fact-checking are a waste of time if they are not done hand-in-hand with the social media. And the search engines. For instance, we made the research with the EPF, part of the media project which we see that if you don't access in realtime to the information that the platform owns, for instance on the impact on votes or on the speed on which some news are propagating, then you are fact checking something that is completely irrelevant and you are missing the real ones. So without tight cooperation and transparency there is no possibility to do anything.

There is another exposed problem that we have as traditional media. The platform, they take out a lot of material from the web, not only the news but all the categories of material that can be removed. And they self analyze, self judge and self absolve about the work that has been done. Typical example, the right to be forgotten. Analysis has been made on data that are not shared by a board of experts that has been decided by the direct interested company. And you cannot verify what has been taken out, et cetera.

We know only because part of the material that were mentioned on the website of the public service broadcaster are not any more accessible. So for instance, BBC taking account what has been removed. The main things that has been removed, we don't see any reason for it being removed unless it was somebody who was bordered by relevant public comments that does not want to be shared with other people. There is a problem that cannot be judged and a party at the same time.

The same thing, we have the risk also in the sounding board and the group that has been put in place because there also, who is the judge and the party? Who is responsible for that? I think this is something that we need to be very careful in mind in the next months.

Last point is I think that the exercise of the platform, the group that, high level Expert Group has recommended on trade news -- fake news need to be fact checked through some concrete experience. We have some elections in the period in which you will improve this. Let's work together and let's see if the platform and the traditional media can cooperate together and work together and which tools we need because on the real experience we can get the real lessons about what works and what doesn't work. Thank you.

>> NADIA TJAHJA: Thank you for your contribution. I believe Mr. Penninckx would like to reply to this?

>> PATRICK PENNINCKX: Yes, gladly.

Well, on debunking, I fully agree that it has to be done together, media, social media, technical community as well. But speed bumps, the way to deal with all of it has to be done in tight cooperation and transparency.

Now, transparency leads me to your second question. That is to which extent can we as governments, as communities leave the sole responsibility to the internet intermediaries. That is a big question. To which extent is the work that is being done transparent?

Some call the social media and service providers as enigmatic regulators based on community standards which is all very fine but the issue is really to which extent do our governments, states give out of hand a task which is really a governmental task? That is to ensure that the standards that we are applying are balanced, are foreseeable and the outcomes are foreseeable.

In an earlier session the question was raised, who read the terms of service of this or that application that you may be using. Obviously, as always, there is very few that say that they have read it except Giacomo, of course.

But very few have read it. That is almost logical. There is also a number of recommendations which we can make with regard to the terms of service, but the terms of service of internet service providers are not value-free. They are not value-free. That we have to realize. If those values correspond to the values of the community in which you live, they are called community standards but we don't know actually who has defined those standards necessarily and whether they correspond to the legislation in a given country. So they are enigmatic, enigmatic regulators. We need more transparency on that.

I have a question back to Giacomo, actually. That is, there has been a tradition for media regulation and the question I would have is, is there a space for social media regulators as we have in the traditional media for the past decades? That is a question to the European Broadcasting Union.

>> NADIA TJAHJA: Thank you very much. I believe Mr. Cesarini had a brief comment.

>> PAOLO CESARINI: I could not agree more about the work between fact-checkers and broadcasters and fact checking needs to evolve to something else. They need to become much more robust when it comes to the outcome of the -- they need to check their facts and their resources and say sources of information and also the transparency, who will provide the money to the source that have been debunked. So it should be transparency built into the activity of the fact-checkers which would be source checked, source checkers at the same time and researchers at the same time. So you mentioned elections. Yes. That hints to one important point, but checking can be done on everything from the flat earth myth to the European actions that -- election that is coming up the next year.

Fact checking needs to be organised by teams. And the teams which count need to be at the fore front of the priority setting mechanism that is shared. Priority setting. Yes, we started the work on the 31st, actually on the 29th we started the Forum activity to progress with the formation of the rules for the code of practice and two days later we started the discussion with amongst fact checking organisations and technologists.

Those are two aspects that goes hand-in-hand. And my vision for the future fact checking, it is really a complex set of activities that starts, it starts with the identification of workflows and systems that streamline the methods, the different fact checking organisation across Europe applies to try to get the best out of it and also to announce the possibility for them to communicate and to exercise something that you have been hinting at. So who controls the controllers? Once fake news is debunked, who tells us that the news is reliable.

The peer to peer controls that can be exercised from the different organisations, the independently from any governmental influence, I need to stress this point. The trustworthiness of the work going to be carried out within this network very much rely on two things. The condition, its terms and under which the fact-checking organisation would be allowed into the network. So there must be ethical standards and professional standards clearly set out.

Secondly, the fact that there is a readiness for the subject to appear to peer review to avoid des portions in the materials that will emerge out of this. Another important point is fact-checking is not only about checking the news and then put it in a repository in a database which is a shared resources, which indeed is useful because it avoids duplication of work and speed ups other things, but there are two other, three other things that needs to be kept in mind when we talk about fact-checking, source checking of the future, of the next near future which is tools, these organises do not have much money to buy the tools. The projects that Google and Facebook promote for them, but we have to have fact-checking that is independent of platforms, while being in cooperation with you.

>> NADIA TJAHJA: We have to come to the last point.

>> PAOLO CESARINI: I think these are points that needs to be stressed. The second point that needs to be stressed is the cooperation with the platform for what? Well, use of the data, the fact-checkers and the source checkers and researchers need data and that is an area of cooperation which is indispensable. Third point, what do we do with the media? This organisation, this network is not an end in itself, not about shaming, naming and shaming only. It is about providing the information to the media in order for them not to fall in the trap of contribution to clarification of this information because it is not time for them to be informed at the right moment about an attempt to weaponize the information for the bad. So the fourth element is whether or not fact-checking can become a business in it sell. That is my final point. Whether fact-checking and this type of network can be a business in it sell. That cooperates not only the platforms but also with the media sector in order to develop a comprehensive and collaborative media landscape across Europe.

>> NADIA TJAHJA: Thank you very much. So to summarize the session and present the final concluding remarks I invite you to come and take to the stage, she does research national European and international projects concerning social media, data security, child online safety and digital literacy. Thank you very much to provide the final summary before we conclude the session.

>> Thank you for inviting me to the summary of the session. Thank you to all the panelists for the huge input you've given to the discussion.

Let me first, Claudia said all human rights are impacted by the phenomena that we are facing now. I think this could be guiding principle for all that we have discussed, keeping in mind human rights that are affected by the phenomena we face on the internet now with fake news, with false news or call it disinformation or propaganda.

Coming to the point that has been discussed, it is important that we see that there is an intention why disinformation is created, promoted and amplified. And this intention is to cause harm to our democracies. And so when we talk about the remedies, we need to have in mind that most of these phenomena are, have made with intent to cause harm to democracies. And also policies are -- Paolo mentioned we need to avoid remedies that encroach on the fundamental principle of freedom of expression. We are always in these two thoughts that we on the one hand need to ensure the right to freedom of expression on the one hand which is human right but also effective remedies to the phenomenon.

I think it's important from Ana Kakalashvili that we need to differentiate between propaganda and fake news. Both need to be addressed with probably different remedies. That was also part of the point that was made. It cannot be left to the users only to be able to differentiate between propaganda and between fake news, false news, disinformation. We need to find remedies that address these phenomena in a different way.

So let me have a look. What I also mentioned, yes, I do think Patrick Penninckx made an important point when he said the technical community has to play a role here and that there need to be fine designs of artificial intelligence that can help to counteract the phenomena. And I do think that also Clara Sommier is looking for those technical approaches, I would say. I don't say it is a solution, it can be a technical approach to the phenomenon. Again I like to quote TAM kin. You say that this is all about making good choices. This goes to something that I would like to present from the work our foundation is doing in Germany. You might have seen the leaflets on your Chairs. It was one of the concerns that was mentioned about information disorder saying while propaganda creators are very strong, positive, critical communicators lack the visibility and I do think there can be something done about that. We have done a research study in Germany on 620 social engagement projects where we had a look how the social engagement projects from Civil Society are using social media for their work what we found out, it is known that the civil engagement has a fundamental promotion of democratic values. This can be reinforced when social media are used for Civil Society engagement.

There is an impact that goes beyond what they do for society when these projects use social media. And we have seen that very often these positive long-term effects are not on the first priority of the Agenda of those people who are engaged for society but when they see that they can achieve these long-term effects with social media, they strengthen their social media engagement.

Also the study substantiates that the use of social media for civic engagement has the potential to preventively counteract tendencies to radicalization. What we can see, we have heard a lot about fact-checking now, about what we can do against and with these social engagement projects we see that they produce counter narratives or community narratives against false information by the work they are doing once they spread the information about their work via social media. So therefore, they tend to rather implicitly the promotion of democracy by the work they are doing and I just would like to refer you to the website that is propagated on the leave let. There you will find the study results, I think about a month later after this conversation.

Thank you so much for listening and have a nice evening.

(Applause.)


This text is based on live transcription. Communication Access Realtime Translation (CART), captioning, and/or live transcription are provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings. This text is not to be distributed or used in any way that may violate copyright law.