Internet in the “post-truth” era? – Pl 02 2017

6 June 2017 | 17:00 - 18:30 | Grand Ballroom, Swissotel, Tallinn, Estonia | video record
Programme overview 2017

Session teaser

The internet has given a voice to the many – great! But how, and by whom, can ‘facts’, ‘opinions’ and ‘feelings’ now be distinguished?

Keywords

Post-truth, fake news, manipulation, critical thinking, emotional appeal, post-truth politics, accountability, legitimacy, freedom of expression, access to information, filter bubble, algorithms, profit, business model, content creation, narratives, ethics, quality journalism, media literacy education, response to hate speech, take down, education, alternative narratives, trusted flaggers, multi-stakeholder approach, regulation, code of conduct, democracy, gatekeepers, fact checking

Session description

Deployment of Internet infrastructure is insufficient without media literacy education aimed at equipping (young) people with the competences to reflect critically and to deal with ambiguity and a diversity of opinions, with respect for freedom of expression and all other human rights.

Propaganda, misleading information and factually wrong news have always existed, irrespective of the spread of the Internet. One question therefore arises: why has fake news become such a phenomenon now? Is it because the polarization of society (a social aspect) pushes people into their comfort zones, leading to the creation of ‘filter bubbles’, or do ‘filter bubbles’ lead to polarization – which comes first? Is the business model of the dominant IT companies making fake news more appealing than real news? Why do some narratives (e.g. counter- and alternative narratives to hate speech) and news (i.e. real news) receive less visibility than others? At the same time, the proliferation of Internet access has given rise to the mass dissemination of fake news online, adding a new dimension: fake news is rapidly reproduced on social media, mostly without editorial control and with limited ability for readers to verify content. With this in mind, the need for journalists to engage in fact checking before publishing news, so as to advance quality media, is all the more pressing.

Moreover, personalized algorithms that optimize for instant gratification by prioritizing information matching the user’s world view have reinforced the creation of ‘filter bubbles’, allowing fake news to spread with ease. Are such algorithms a symptom of a society segregated into groups that no longer communicate with each other? Is ‘bubble-piercing’ – using similar algorithms to confront users with ‘the viewpoint of the other side’ – an approach that is feasible, desirable and/or likely to be effective? Is flagging or removing fake news enough? As ‘filter bubbles’ become the norm, readers’ exposure to news that can challenge fake news is limited. Creating fake social media accounts compromises freedom of expression, while publishing fake news as advertisements undermines journalistic ethics. It would thus not be an exaggeration to say that fake news, while not yet clearly defined and not always easy to spot, is a potential threat to democracy.
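To make the contrast concrete, here is a purely illustrative sketch in Python – not any platform's actual ranking code, and with made-up one-dimensional "stance" scores as a stand-in for user profiling – showing a naive ranker that reinforces the user's existing world view next to a 'bubble-piercing' variant that reserves part of the feed for the other side.

```python
# Illustrative sketch only: hypothetical items and user "stance" scores,
# not a real recommendation system.
from dataclasses import dataclass

@dataclass
class Item:
    title: str
    stance: float  # -1.0 .. 1.0, a crude one-dimensional "viewpoint" score (hypothetical)

def engagement_rank(items, user_stance):
    """Rank items by closeness to the user's own stance: this reinforces the filter bubble."""
    return sorted(items, key=lambda it: abs(it.stance - user_stance))

def bubble_piercing_rank(items, user_stance, mix=0.25):
    """Keep mostly agreeable items, but reserve a share of top slots for the 'other side'."""
    agree = engagement_rank(items, user_stance)
    oppose = sorted(items, key=lambda it: -abs(it.stance - user_stance))
    reserved = oppose[: max(1, int(len(items) * mix))]
    rest = [it for it in agree if it not in reserved]
    return reserved + rest

if __name__ == "__main__":
    feed = [Item("Story A", -0.9), Item("Story B", -0.2),
            Item("Story C", 0.4), Item("Story D", 0.8)]
    print([it.title for it in engagement_rank(feed, user_stance=-0.8)])       # bubble: A, B, C, D
    print([it.title for it in bubble_piercing_rank(feed, user_stance=-0.8)])  # D surfaces first
```

Whether such bubble-piercing is technically feasible at scale, and whether users would accept it, is precisely the open question raised above.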

What is more, increased Internet access, mainly via mobile devices, does not by itself provide complete access to the breadth of information on the Internet. As fake news often contains elements of hate speech and justifies or reinforces hate narratives and conspiracy theories, media literacy education, critical thinking and human rights education are urgently needed, together with a coherent joint effort by all stakeholders to play their part in safeguarding and fostering democracy.

Format

The plenary session explores the issue ‘Internet in the Post Truth Era’ in three phases:

a) Descriptive: How does ‘fake news’ express itself across Europe today, what are the consequences and why is this a concern for pluralistic democracy and human rights?

Two conjectures (added by Prof. Osman Coskunoglu, PhD): (i) Demand – that is, human cognitive limitations, needs and biases – lies at the core of the problem. (ii) Fake news is a very old problem, but it is getting so much attention nowadays mainly because of a sharp escalation in polarization within societies in a number of countries.

b) Normative: Can ‘fake news’ be defined, who is responsible to address it and on what grounds? Are there parallels to be drawn with the search for an effective approach to addressing hate speech in Europe?

c) Prescriptive: What approaches to combating fake news and hate speech should be considered good practices that need to be strengthened? In particular, what role is there for education, journalists and internet businesses to play, for example by strengthening human rights and Internet media literacy education, improving the diversity and credibility of media distribution online, and adapting algorithms to broaden the filter bubbles?

Each phase will be introduced with a three-minute statement by a panellist. The introduction is followed by a discussion with the audience and the other panellists. Interaction between members of the audience will be encouraged.

To strengthen audience engagement during each phase, a quick check of the audience's opinions on the issue will be conducted through a statement and an (online) 'voting' tool.

Further reading

Links to relevant websites, declarations, books, documents. Please note we cannot offer web space, so only links to external resources are possible.

→ From Yrjö Länsipuro:

1. Reuters Institute (Nic Newman) A report on key developments in journalism, media and technology in 2016 focuses on fears about how changing technology is affecting the quality of information and the state of our democracy. http://reutersinstitute.politics.ox.ac.uk/sites/default/files/Journalism%2C%20Media%20and%20Technology%20Trends%20and%20Predictions%202017.pdf

2. Pew Center: Experts worry algorithms can also put too much control in the hands of corporations and governments, perpetuate bias, create filter bubbles. http://www.pewinternet.org/2017/02/08/code-dependent-pros-and-cons-of-the-algorithm-age/

3. Meedia (in German): So far there is no evidence that fake news even poses a serious problem in Germany. Springer CEO Mathias Döpfner even argues that fake news has always existed. http://meedia.de/2017/01/17/fake-news-debatte-jetzt-widersprechen-die-nachrichten-praktiker-der-politik/

4. Deutsche Welle: Countering fake news while safeguarding free speech. As fighting fake news becomes a global priority, questions remain about what exactly fake news is, what can be done to stop it and how to do so without curbing free speech. http://www.dw.com/en/countering-fake-news-while-safeguarding-free-speech/a-37924278

5. University of Washington. Examining the Alternative Media Ecosystems through the Production of Alternative Narratives of Mass Shooting Events on Twitter http://faculty.washington.edu/kstarbi/Alt_Narratives_ICWSM17-CameraReady.pdf

6. Seattle Times. A story about the above http://www.seattletimes.com/seattle-news/politics/uw-professor-the-information-war-is-real-and-were-losing-it/

7. Vanity Fair. Fake News Is About to Get Even Scarier than You Ever Dreamed. What we saw in the 2016 election is nothing compared to what we need to prepare for in 2020. http://www.vanityfair.com/news/2017/01/fake-news-technology

8. New York Times. Europe Combats a New Foe of Political Stability: Fake News https://www.nytimes.com/2017/02/20/world/europe/europe-combats-a-new-foe-of-political-stability-fake-news.html?_r=0

9. Tow Center. The influence of social media platforms and technology companies is having a greater effect on American journalism than even the shift from print to digital. There is a rapid takeover of traditional publishers' roles by companies including Facebook, Snapchat, Google, and Twitter that shows no sign of slowing. http://towcenter.org/wp-content/uploads/2017/04/The_Platform_Press_Tow_Report_2017.pdf

10. Tow Center. Journalists today have an imperative—and an opportunity—to sift through the mass of content being created and shared in order to separate true from false, and to help the truth to spread. Unfortunately, as this paper details, that isn't the current reality of how news organizations cover unverified claims, online rumors, and viral content. Lies spread much farther than the truth, and news organizations play a powerful role in making this happen. http://towcenter.org/wp-content/uploads/2015/02/LiesDamnLies_Silverman_TowCenter.pdf

11. New York Times Magazine (Farhad Manjoo) Mark Zuckerberg now acknowledges the dangerous side of the social revolution he helped start. But is the most powerful tool for connection in human history capable of adapting to the world it created? https://www.nytimes.com/2017/04/25/magazine/can-facebook-fix-its-own-worst-bug.html?_r=0

12. New York Times (Jonathan Taplin) Google has an 88 percent market share in search advertising, Facebook (and its subsidiaries Instagram, WhatsApp and Messenger) owns 77 percent of mobile social traffic and Amazon has a 74 percent share in the e-book market. In classic economic terms, all three are monopolies... Could it be that these companies — and Google in particular — have become natural monopolies by supplying an entire market’s demand for a service, at a price lower than what would be offered by two competing firms? And if so, is it time to regulate them like public utilities? https://www.nytimes.com/2017/04/22/opinion/sunday/is-it-time-to-break-up-google.html?_r=0

13. The Guardian (Evgeny Morozov) Moral panic over fake news hides the real enemy – the digital giants https://www.theguardian.com/commentisfree/2017/jan/08/blaming-fake-news-not-the-answer-democracy-crisis

14. Digiday: How Sweden is fighting fake news http://digiday.com/uk/fake-news-in-sweden/

15. Washington Post: How do you stop fake news? In Germany, with a law. https://www.washingtonpost.com/world/europe/how-do-you-stop-fake-news-in-germany-with-a-law/2017/04/05/e6834ad6-1a08-11e7-bcc2-7d1a0973e7b2_story.html?tid=ss_fb&utm_term=.0a3b50c700c3

16. Nieman Lab. Wikipedia founder Jimmy Wales launches Wikitribune, a large-scale attempt to combat fake news. The crowd-funded news platform aims to combat fake news by combining professional journalism with volunteer fact checking: “news by the people and for the people.” http://www.niemanlab.org/2017/04/wikipedia-founder-jimmy-wales-launches-wikitribune-news-by-the-people-and-for-the-people/ and: https://www.wikitribune.com/

17. NewCo (Krishna Bharat) How to Detect Fake News in Real-Time https://shift.newco.co/how-to-detect-fake-news-in-real-time-9fdae0197bfd

18. New York Times: A European front on fake news. https://www.nytimes.com/2017/05/01/business/europe-election-fake-news.html?_r=0


→ From Rachel Pollack:

19. UN, OSCE, OAS and ACHPR Special Rapporteurs on freedom of expression: Joint Declaration on Freedom of Expression and 'Fake News', Disinformation and Propaganda http://www.osce.org/fom/302796?download=true

20. UNESCO Colloquium: Journalism Under Fire http://en.unesco.org/journalism-under-fire

21. UNESCO: Campaign for World Press Freedom Day (video) https://www.youtube.com/watch?v=2fPm7d_RxeU


→ From Professor Osman Coşkunoğlu, PhD:

A scientifically appropriate and functionally useful approach to the fake news issue has to deal with the three aspects, namely, descriptive (describing the problem and its source), normative (setting norms and definitions), and prescriptive (approaches and techniques to solve the problem). The references below are categorized along these three aspects.


DESCRIPTIVE: Three sources of the problem are human fallibilities, polarized society, and poor or biased algorithms

Human

Herbert Simon received the Nobel Memorial Prize in Economic Sciences (1978) for his bounded rationality theory, but the seminal studies showing human biases and reasoning fallibilities were done collaboratively by the psychology professors Daniel Kahneman and Amos Tversky. Their initial work was summarized in two books: Judgment Under Uncertainty: Heuristics and Biases (1982) and Choices, Values, and Frames (2000). After receiving the Nobel Memorial Prize in Economic Sciences (2002), Kahneman (Tversky had died in 1996) wrote the international bestseller Thinking, Fast and Slow (2011). The book explains, for the general reader, his and Tversky's findings on human cognitive biases and limitations.

As fake news has recently become center of attention, articles about human fallibility have been appearing in reputable media:

22. The New Yorker: Why Facts Don't Change Our Minds http://www.newyorker.com/magazine/2017/02/27/why-facts-dont-change-our-minds (27 February 2017)

23. The New York Times: Why People Continue to Believe Objectively False Things: https://www.nytimes.com/2017/03/22/upshot/why-objectively-false-things-continue-to-be-believed.html?_r=1 (22 March 2017)

24. Slate: Our Puny Human Brains Are Terrible at Thinking About the Future http://www.slate.com/articles/technology/future_tense/2017/04/why_people_are_so_bad_at_thinking_about_the_future.html (13 April 2017)

There seems to be a consensus that lies which play to individuals' prejudices and beliefs are more easily believed, and are passed on thoughtlessly or for a feel-good effect.

25. Lotto, Beau: "Deviate" (2017) Professor of Neuroscience at University College London, Beau Lotto, in his very recent book (25 April 2017), entitled "Deviate", argues that all experience occurs in the brain – and that the brain is deluded. A review of the book: https://www.theguardian.com/books/2017/apr/22/deviate-the-science-of-seeing-differently-by-beau-lotto-review


Society

Eli Pariser's bestselling 2012 book "The Filter Bubble: How the New Personalized Web is Changing What We Read and How We Think" is being hailed anew as an explanation of polarization in society. See, for example, this 2016 Wired magazine article:

26. Wired: Your Filter Bubble is Destroying Democracy (2016) https://www.wired.com/2016/11/filter-bubble-destroying-democracy/

However, economists at Brown University and Stanford University recently (March 2017) reported that “…the growth in polarization in recent years is largest for the demographic groups least likely to use the internet and social media…[Our findings] argue against the hypothesis that the internet is a primary driver of rising political polarization.”

27. Brown University and Stanford University: Is the Internet Causing Political Polarization? Evidence From Demographics (2017) https://www.brown.edu/Research/Shapiro/pdfs/age-polars.pdf


Algorithms

28. Domingos, Pedro: "The Master Algorithm" (2016) https://lareviewofbooks.org/article/algorithms-future-already-happened/ The 2016 book by Pedro Domingos – a researcher on machine learning – on how algorithms can warp reality. What is to be done? Informed, active citizens. The link is an excellent review of the book and a criticism of the solution suggested by Domingos.

29. Pew Research Center: “Code-Dependent: Pros and Cons of the Algorithm Age” (February 2017): http://www.pewinternet.org/2017/02/08/code-dependent-pros-and-cons-of-the-algorithm-age A very comprehensive – though not random-sample – expert survey by Pew.


NORMATIVE

30. The Guardian: Defining Fake News Will Help Us Expose It (12 May 2017) https://www.theguardian.com/media/commentisfree/2017/may/12/defining-fake-news-will-help-us-expose-it The Guardian's Readers' editor Paul Chadwick differentiates fake news from flawed journalism and proposes a draft definition of it.

31. Slate: Stop Calling Everything 'Fake News' (6 December 2016) http://www.slate.com/articles/technology/technology/2016/12/stop_calling_everything_fake_news.html Towards a better characterization of fake news.


PRESCRIPTIVE

Criticized for its news feed during the US presidential election, Facebook had already taken several steps to fight fake news by the end of 2016:

32. Slate: Facebook's Cautious, Sensible Plan to Fight Fake News http://www.slate.com/blogs/future_tense/2016/12/15/facebook_s_plan_to_fight_fake_news_is_sensibly_cautious.html In addition to some changes in the news feed algorithm boosting authentic news, Facebook relies on user participation to flag fake news, which is then reviewed by independent fact-checkers.

33. Slate: Only You Can Stop the Spread of Fake News http://www.slate.com/articles/technology/technology/2016/12/introducing_this_is_fake_slate_s_tool_for_stopping_fake_news_on_facebook.html Digital magazine Slate also relies on user participation (by a Chrome extension) in a similar fashion.

However, Facebook's approach (a user flags an article; if two or more fact-checkers concur, a "disputed" tag is placed on it) may not work:

34. The Guardian: Facebook Promised to Tackle Fake News. But the Evidence Shows It's Not Working (16 May 2017) https://www.theguardian.com/technology/2017/may/16/facebook-fake-news-tools-not-working

35. The New York Times: Fact-Checking Can Change Views? We Rate That as Mostly True (5 November 2016) https://www.nytimes.com/2016/11/06/upshot/fact-checking-can-change-views-we-rate-that-as-mostly-true.html Dartmouth professor Brendan Nyhan reports that his research shows that correcting people's false beliefs can be ineffective or, worse, make them cling to their views even more strongly; however, he notes that other research suggests fact-checking can be effective.

36. The Guardian: How To Solve Facebook's Fake News Problem: Experts Pitch Their Ideas (29 November 2016) https://www.theguardian.com/technology/2016/nov/29/facebook-fake-news-problem-experts-pitch-ideas-algorithms A cadre of technologists, academics and media experts are thinking up solutions, from hiring human editors, to crowdsourcing or creating algorithms.

37. Bloomberg View: Closing the Safe Harbor for Libelous Fake News (16 December 2016) https://www.bloomberg.com/view/articles/2016-12-16/free-speech-libel-and-the-truth-after-pizzagate According to Harvard professor Noah Feldman, a legal remedy for victims of fake news that would be consistent with the First Amendment is complicated but possible.

38. Wadhwa, Vivek: What Google and Facebook Must Do About One of Their Biggest Problems (12 April 2017) http://wadhwa.com/2017/04/12/google-facebook-must-one-biggest-problems/ and http://digitalethics.net/what-google-and-facebook-must-do-about-one-of-their-biggest-problems-vivek-wadhwa Recognizing that there is no quick fix, distinguished scholar Vivek Wadhwa makes some structural recommendations, not only to Google and Facebook but also to individuals.

39. Data & Society: Points: Did Media Literacy Backfire? (5 January 2017) https://points.datasociety.net/did-media-literacy-backfire-7418c084d88d danah boyd argues that neither educational approaches nor self-regulation by information intermediaries will work. She concludes that “…if we want to address issues like propaganda, hate speech, fake news, and biased content, we need to focus on the underlying issues at play. No simple band-aid will work.”

40. Slate: It's Time to Give Up on Facts (8 February 2017) http://www.slate.com/articles/health_and_science/science/2017/02/counter_lies_with_emotions_not_facts.html In the meantime, a very different prescription: “It's Time to Give Up on Facts - or at least to temporarily lay them down in favor of a more useful weapon: emotions.” (1,600 comments and 22,000 shares on social media)


GENERAL AND RELATED

41. Carr, Nicholas: “The Manipulators: Facebook's Social Engineering Project” (2014) https://lareviewofbooks.org/article/manipulators-facebooks-social-engineering-project Nicholas Carr explains why 2014 marked “…a new stage in the net's history — one in which the public will be called upon to guide the technology, rather than the other way around. We may look back on 2014 as the year the internet began to grow up.”

42. The Guardian: How Technology Disrupted the Truth https://www.theguardian.com/media/2016/jul/12/how-technology-disrupted-the-truth A fairly comprehensive analysis of the issue – with 1,584 comments – by Katharine Viner, editor-in-chief of the Guardian, concluding with a call for civic participation: “engaging with people as civic actors, citizens, equals. It is about holding power to account, fighting for a public space, and taking responsibility for creating the kind of world we want to live in.”

People

Please provide name and institution for all people you list here.

  • Focal Point

Focal Points take over the responsibility and lead of the session organisation. Focal Points are kindly requested to observe EuroDIG's session principles. Focal Points work in close cooperation with the respective Subject Matter Expert (SME) and the EuroDIG Secretariat.

1. Menno Ettema and Irina Drexler, Council of Europe – No Hate Speech Movement

  • Subject Matter Expert (SME):

SMEs are responsible for the clustering of submissions into a thematic category they have an expertise in. They define subtopics and identify submissions which fall under this subtopic. The aim is to verify submissions which can be merged in one session. In the course of the session organising process SMEs will serve as a mentor for the respective category by supporting all Focal Points.

1. Yrjö Länsipuro – ISOC Finland

  • Key Participants (for workshop) or Panellists (for plenary)

Key Participants (workshop) are experts willing to provide their knowledge during a session – not necessarily on stage. Key Participants should contribute to the session planning process and keep statements short and punchy during the session. Panellist (plenary) will be selected and assigned by the org team, ensuring a stakeholder balanced dialogue also considering gender and geographical balance. Panellists should contribute to the session planning process and keep statements short and punchy during the session.

1. Sven Mikser – Minister of Foreign Affairs, Republic of Estonia [1]

2. Divina Frau-Meigs – President and professor at Université Sorbonne Nouvelle (Paris III), media sociologist, researcher on media and information education, expert at UNESCO, the European Union and the Council of Europe

3. Jessikka Aro – award-winning investigative reporter with the Finnish Broadcasting Company's social media project Yle Kioski. Aro specializes in Russia, extremism and information warfare. In 2014 Aro became the target of a serious and still ongoing international propaganda and hate speech campaign after starting a series of articles about the influence of pro-Kremlin social media trolls. Aro is currently writing an investigative book about the Kremlin's information warfare and trains reporters and the general public to recognize and counter online disinformation. Aro also lobbies for better legislation to counter hybrid threats and protect citizens from state-sponsored online security threats. https://link.springer.com/article/10.1007/s12290-016-0395-5

4. Ambassador Corina Calugaru – Thematic Co-ordinator on Information Policy of the Committee of Ministers of the Council of Europe, Permanent Representative of the Republic of Moldova to the Council of Europe

  • Moderator

The moderator is the facilitator of the session at the event. Moderators are responsible for including the audience and encouraging a lively interaction among all session attendants. Please make sure the moderator takes a neutral role and can balance between all speakers. Please provide short CV of the moderator of your session at the Wiki or link to another source.

1. Menno Ettema – Council of Europe (No Hate Speech Movement coordinator)

Mr. Menno ETTEMA is the European coordinator of the No Hate Speech Movement, a youth campaign of the Council of Europe for mobilizing young people for human rights online, freedom of speech and against hate speech. He works closely with the national campaign committees, European partners and online activists on the implementation of the campaign, through the campaign website nohatespeechmovement.org, Twitter and Facebook, European trainings, seminars and the various awareness-raising and training activities at national level. Mr. Ettema has previously worked as an educational advisor for the Youth Department of the Council of Europe, supporting European Youth Organizations to develop their human rights education programmes. He has been coordinating intercultural dialogue and peace building projects, and projects promoting access to human rights for young refugees and youth with disabilities. Before starting at the Council of Europe he worked with youth peace organisations in the Middle East and coordinated the international secretariat of the United Network of Young Peacebuilders. Mr. Ettema studied Psychology of Culture and Religion at the University of Nijmegen, the Netherlands.

  • Remote Moderator

The Remote Moderator is in charge of facilitating participation via digital channels such as WebEx and social media (Twitter, Facebook). Remote Moderators monitor and moderate the social media channels and the participants via WebEx and forward questions to the session moderator.

1. Alessandra Coppola – President of APICE

  • Organising Team (Org Team)

The Org Team is a group of people shaping the session. Org Teams are open and every interested individual can become a member.

1. Rachel Pollack – UNESCO, France

2. Osman Coskunoglu, PhD – Alternative Informatics Association, Turkey

3. Ansgar Koene, PhD – Horizon Digital Economy Research Institute, University of Nottingham, United Kingdom

4. Narine Khachatryan – Safer Internet, Armenia

5. Xianhong Xu, PhD – UNESCO, France

6. Daniel Waugh – Union of Students in Ireland

7. Giacomo Mazzone – European Broadcasting Union

  • Reporter

The Reporter takes notes during the session and formulates 3 (max. 5) bullet points at the end of each session that:

  • are summarised on a slide and presented to the audience at the end of each session
  • relate to the particular session and to European Internet governance policy
  • are forward looking and propose goals and activities that can be initiated after EuroDIG (recommendations)
  • are in (rough) consensus with the audience
  • are to be submitted to the secretariat within 48 hours after the session took place

Please provide short CV of the reporter of your session at the Wiki or link to another source.

1. Yrjö Länsipuro – ISOC Finland https://www.linkedin.com/in/yrj%C3%B6-l%C3%A4nsipuro-91517718/

2. Irina Drexler, PhD – No Hate Speech Movement in Romania https://www.linkedin.com/in/irinadrexler/

Video record

https://www.youtube.com/watch?v=ZvBsoSSCjx4

Messages

  • “Fake news” has become a fashionable term, and it is important to define what it is, and what it is not. Fake news is not just bad journalism, or reporting that one doesn't agree with. Fake news, in fact, is not news, just as plastic rice is not rice. Fake news does not just happen; it is intentionally and knowingly put out to spread confusion in society and to discredit democracy, or to produce economic gain.
  • Fake news feeds the polarization of societies, and polarization creates more markets for fake news. It disrupts the status of truth and undermines the value of objectivity and the principles of professional journalism. People who have felt constrained by objectivity, who don't like or trust it, are emancipating themselves now that they have the tools. The information landscape is no longer vertical. When old media try to check and correct user-generated stories, they get accused of spreading fake news.
  • New norms are not needed; rather, existing ones should be implemented. Government regulation is not the way to solve the fake news problem, and could lead to the suppression of legitimate voices. Self-regulation and co-regulation are already in use and should be given a larger role, and new approaches should be discussed, such as giving platforms public trustee media status. Platforms, in their own best interest, should become more accountable and transparent, e.g. about the standards they apply in different countries.
  • Enhancing media literacy education is the most effective way of combating fake news and should be taken to a new level. Media literacy education should be seen as a political survival project for society, and a right for the individual, especially children. The skills and resources of journalists should be enhanced.

Transcript

Provided by: Caption First, Inc., P.O. Box 3066, Monument, CO 80132, Phone 1-877-825-5234, +001-719-481-9835, www.captionfirst.com


This text is being provided in a rough draft format. Communication Access Realtime Translation (CART) is provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings.


>> GERT AUVAART: And with this we will now move to the next segment in our agenda, which is the second Grand Ballroom panel of the day, and perhaps, as I mentioned in the morning, the most political of the panels at EuroDIG. The issue to be addressed is Internet in the post-truth era. It will be moderated by Menno Ettema, from the Council of Europe, "Mr. No hate speech." And he will introduce all of the speakers. I just have to say my boss is among them. So there you go.

(Applause)

>> MENNO ETTEMA: Good afternoon. Welcome to Plenary 2, which is indeed as mentioned on the Internet in the post-truth era. And this debate, I'm introduced as working for the Council of Europe as the European campaign coordinator for the "No hate speech" movement. And I among many was on an organising team that took up this topic, because we realised it's a sensitive and developing concern that we need to address.

To explore this debate, I invite everyone in the room to participate. And we will have panelists introducing the various phases in our thinking. Therefore I'd like to welcome forward Sven Mikser, the Minister of Foreign Affairs of Estonia; please take a seat.

We are also joined by Ambassador Corina Calugaru, Thematic Coordinator on Information Policy of the Committee of Ministers of the Council of Europe. I'd like to welcome you forward.

And we are joined by Divina Frau-Meigs, Professor and President at the Université Sorbonne Nouvelle. She is a media sociologist and researcher on media and information education.

Thank you.

And Jessikka Aro. She is an award-winning investigative reporter with the Finnish broadcasting company, on the project called Yle Kioski.

This panel is also made possible -- I just wanted to acknowledge all the work that has gone in here -- by two rapporteurs who will be reporting back at the end of the plenary. And we have an online moderator who is following the online debate, and she will also take the floor now and again to give feedback from the online discussions that are taking place, because everything is broadcast live on the Internet.

Before I give the floor to the first speaker, maybe to highlight why this plenary. As I mentioned, when EuroDIG launched a call for proposals, there was a group of people that came up with this issue, this topic of fake news and hate speech and how to address it. And we formed an organising team. And in our reflections, we were struck and concerned about the spread of fake news. And especially its potential and impact on the Internet and European societies in a broader sense.

We raised various questions, such as: why did fake news become such a phenomenon now? I mean, propaganda and misleading information and factually wrong news have always existed. Why now this term "fake news," and what does it signal? Maybe there is a correlation with the rise of populism and sensationalism in society, or not. Is there a correlation? Or is it just the role of Internet bubbles and algorithms -- do they enhance the spread of fake news, or are there other things at play and, if so, what?

Whatever the course, the question is what is the response? Do we need regulation? Do we need Education? Do we need to strengthen the media, the independent media and other things? And who should take that responsibility?

It's a lot of points that are raised in our preparation discussions. And we also see this plenary as an explorative debate for the process to go.

We therefore decided to structure the debate in three phases. First, we will look at the descriptive phase: how does fake news express itself in Europe today? After which we will look more into the normative aspects: can fake news be defined, who is responsible to address it, and on what grounds? But most of the debate will be in the prescriptive area: what approaches can we take to combat fake news, and hate speech also in that sense, and what are the good practices that we could consider and should strengthen?

Each phase will be introduced by a panelist, and I want to thank you already for taking that role. But we want to involve the audience, that is all of you here sitting opposite me. So we will have a small interactive tool, a polling, checking the temperature in the room in the beginning of each phase, and then we will start the discussion based on the feedback that we have.

So I invite you to get ready your Internet devices. We will show you shortly use a link to a polling, and we can get your feedback even if you don't take the word in this panel. That's what we're going to do.

And I think it's important to first describe the situation of fake news, and for that, to see how does it express itself in Europe? What are the consequences of fake news? And what is -- why is that concern for a pluralistic Democracy and human rights in Europe today?

And for that I'd like to give the floor to Minister Sven Mikser, the Minister of Foreign Affairs of Estonia, to guide us into this discussion.

>> SVEN MIKSER: A very good afternoon to everyone. And, well, we have very many important questions raised by the Moderator. Now I'll give a quick conclusion to those and then you can go home.

I wish it was like that, obviously. I don't have very good conclusive answers to all of those questions. But I think that just to start off this, this panel discussion, I'd say that it's, first of all, very important to try to understand or to come to a consensus when it comes to the concept of fake news. I remember 15 years or so ago, after the terror attacks in New York City and Washington, D.C., the concept of asymmetric threats became fashionable. And then for a time all threats in the world were willingly described as being asymmetrical, irrespective of whether they had anything to do with symmetry at all or not.

Now more recently the concept of hybrid threats became fashionable. Suddenly everything came to be hybrid or at least came to be described as hybrid. And I'm afraid that, well, since the Oxford dictionaries dubbed the word "post-truth" the word of the year last year, the concept of a post-truth environment and fake news became enormously fashionable. And there is now a tendency to describe everything that we dislike, everything that we see as problematic in today's information environment, as fake news.

I really don't think this is the case. I think that there is a considerable overlap between hate speech and fake news, but they are not one and the same thing, as much as I dislike hate speech, obviously.

I also think that to start off it's important to understand the limitations of the Internet as a platform, as a medium. Obviously, it's the thing of today and it's obviously going to be the thing of tomorrow. But it's -- it's a bit like, well, when the big International military operations in the Middle East were still ongoing in Iraq and in Syria, then there was an analyst that saw that, in the United States of America, since they have this terrific hammer of the US military, there is a strong tendency of thinking of every problem as a nail.

I think it's a bit similar with the Internet. Obviously the opportunities created by the Internet are enormous. I don't think -- I think much of that is potentially still untapped. It's an enormous and helpful tool to boost economic growth, to boost competitiveness, to give access to information to people and groups and societies that had never in the past had such access.

But there are obviously also limitations. I think some of those limitations have become pretty apparent recently. When the Arab Spring happened, we were all very fascinated with and obviously impressed by how the military dictatorship in Egypt was brought down by social media, Twitter and Facebook. But quite soon it became apparent that the tool that was so effective for bringing down a Government was not equally suited -- well suited for putting together a new Government.

So obviously there are things, there are platforms, the technologies that are useful for a certain purpose may not be equally well fitted for other and different purposes.

And I think that I read very carefully, tried to read very carefully, the material that was put out on the website of the Conference, the food for thought, on this panel. One of the issues that was given considerable consideration was the sort of customized information, customized media, media that is designed not to provide sort of objective information, but rather information that is sort of designed to please or to be sort of acceptable to a particular recipient.

And, indeed, I think this is not -- this sort of customization of information is not unique to the Internet. We know that there are newspapers, there have been newspapers for well more than a century that had certain ideological bias that are left leaning or conservative leaning. There are TV channels, I mean, when you watch Fox News or when you read Huffington Post, you get a different idea of how the world is turning. But neither of those can be described as fake news, per se. I think -- although there is a certain ideological bias in each.

So I think that it's important that we -- that we define "Fake news" as news that is or the information that is being put out, sort of knowingly, false information that is put out there knowingly, in order to manipulate a certain audience, their understanding of the world, or produce a certain reaction from a particular audience. And we have seen that.

I think why it's become such a fashionable thing recently is the Internet has been discovered by this sort of malevolent forces in the world as a medium, as a platform that can be used for that particular purpose. And predominant among those purposes is the discrediting the sort of liberal Democracy which is, to a large extent, based on the objective accessible information. And the best way to actually sort of create confusion in the Democratic societies as to the viability of the Democratic structures is to undermine the trust of people in the information that is provided to them.

And I think that what is unique of the Internet here is the fact that the number of media channels and the amount of information that is available to people has exploded over a very short period of time. There has never been so much media, so much information available as there is today. And the absorptive capacity of the human mind has not expanded correspondingly, obviously. And the amount of information that we can take in during the day, I mean by the number of pages we can read or the number of TV channels that we can follow simultaneously, that has not expanded. That is pretty much the same as 500 years ago, 100 years ago, 50 years ago.

And obviously it's becoming increasingly more difficult for people to critically analyze and critically assess the quality of the information that is out there, analyze or sort of proofread or approve the sources and the objectivity of the information. And that is what has created a sort of post-truth environment, this post-truth era, if you wish.

And, obviously, if there are regimes, if there are movements, social movements, who see as the greatest enemy and the greatest adversary, liberal democracies and Democratic structures and procedures, free elections, free media, then those forces are obviously of critical interest in trying to utilize or use those possibilities offered by this new environment of the plurality of different information sources.

I have to stop at this point and move on to the next.

>> MENNO ETTEMA: Thank you for that, Minister. So you're raising various points I think in your description of the situation. "Fake news" is a term that may be fashionable and we should be worried about what do we actually talk about when we are talking about fake news. But you also mentioned, yes, there are issues of criticism about the information that we have and the filter bubbles that we often call that, so algorithms for filtering of information that we prefer to see and not the diversity of opinion. Plus you mentioned that there is lots of information out there that leads to a lot of questions on how to manage all of this information and which choices do we do.

I'd like to make this as interactive as possible. So I want to ask, first of all, because we had a feeling that fake news is leading to polarization or not. And we want to know, and this is a challenge that you were putting forward, Sven, so I'd like to ask the first question to the audience just to check how the tool works and open the discussion for the panelists and the audience to add.

You'll see here the link to -- in a second, yes. The link, HTTP://Etc.ch/zbA4. If we go to the first question: Fake news feeds polarization in European societies. That is one of the fears that we had when we started this. And somebody said well, it's a populist question. So now maybe you can share your reflections: Yes, no, don't know, it's a populist question. And leave it open for a few minutes.

Nine votes. So it's working. Good. Slowly, it's only ten votes, 13, 15. It's slowly moving towards the yes direction. I'll leave it open for a minute. But meanwhile, let's open the floor.

I see -- yes, you want to take, say something? Okay.

>> SVEN MIKSER: I think it's a bit of a chicken and egg question whether the fake news actually leads to polarization or not. I think -- I think when you go back in the history of parliamentary Democracy, then let's say a hundred years or so ago, political parties, head of elections, there were manifestos and they expected people to follow them, and sometimes they did or didn't. But then with the development of sociology, the situation basically, most of it was reversed.

And today, politicians, political parties, generally try to get a good sense of the public mood and then promise things that -- or campaign on things that they expect a particular segment of the society to care about or like or dislike. And that, I think, that is -- this is the main factor that leads to polarization of politics and has created very antagonistic political landscapes in many societies, in many States. And obviously those who use fake news to put out fake news in order to promote their own agenda, they make use of this polarized political landscape. I don't think that that actually sort of fake news or the emergence of fake news caused the polarization in the first place. But definitely there is a link between the two.

>> MENNO ETTEMA: Thank you.

Yes. Divina.

>> DIVINA FRAU-MEIGS: I would say that polarization existed before and that fake news is coming to break this. Because in fact if you look at the history of media and what the Internet is doing to it, we are not in front of something new with fake news. What we are in front of and that is new is how we evaluate truth and information. And what is changing is the circuits for us to check information, its validity and its authenticity.

And the fake news issue first has to be taken down in terms of category. So in my opinion there are three types of fake news, and they are not the same and they don't come from the same kind of publics and the same kind of entities with the same kind of agendas. We put them all in the same basket; we should not, because they don't have the same purpose.

One kind of fake news is propaganda. And that is state, state directed at the mass public that it wants to catch, captive.

And then there is plot theory. Plot theory is the answer of those who are trying to fight propaganda and bring up from different parts of the spectrum different ways of debunking propaganda.

And then in the middle there is, I would say, the people in the social media, you and I and et cetera, who are trying to find the truth in these two extremes of propaganda and plot. And we do it with gossip. We do it with rumor. We preach the false to get to the truth. So there are three types of fake news and they come from very different kinds of people. And we should keep this in mind.

But fake news are very interesting for us, and I speak from the perspective of the sociology of media, of course. Because they tell us three things at least that are changing and what the Internet is doing.

One of the things is that the status of truth is being disrupted. And it's the end of what journalism had imposed on us, which is the norm of objectivity. And the norm of objectivity normally sounds like a good thing, except that in the last 20 or 30 years, the norm of objectivity has polarized public opinion. Because objectivity was turned into, okay, I'm taking one extreme position and I'm taking the other extreme position and I'm being objective. So one bit to the right, one bit to the left. The canoe technique. That's not truth. And that certainly is not how lots of people who are not believing in a polarized world perceive it.

So what we see with the fake news issue is the debunking of the norm of objectivity as captured, as turned obsolete by reference media. And they are part of the problem. And hopefully part of the solution. But anyways, that's the first point.

The second point is that to capture the public with propaganda or plot theory, you need to make it into a public opinion. Well, guess what? The public is emancipating itself. People are running away. They are providing their own information. They are creating their own influences, people they trust. And because they don't trust objectivity anymore, they go for authenticity. So if you are authentic, you are more credible than you are if you are trying to say something around information today, which is what social media reveals to us.

The last point, the public is no longer a public that you can consider vertically. People at the top, politicians at the top, they are at the bottom. This is much more multi-layer. It doesn't believe in one form of causality. It knows there is complexity out there and it's trying to get it, and that's something that the Internet is bringing.

So it compels us also to tell ourselves how do we look at information? Information is no longer news only. It's no longer the documents you find online. Information is driven by data. And what does data do to media? Well, it profiles your opinion. It creates profiles. And so we should not look for the sources of information when we try to find the fake news. We should look for the profiles of those who create them. We should have a network analysis of the communities online, and from there we can start a discussion, rather than say it's false, it's fake, et cetera. It should be much more deliberative.

So there is no polarization. On the contrary, there is a lot of different ways of understanding and building and co-constructing information and news today. And that's the big news about fake news.

>> MENNO ETTEMA: Okay.

I would like to ask, also, from the audience if there is any experiences with fake news being confronted with, or how does that look and how does it impact you and your realities in Europe today, if there's any contributions. Please.

>> AUDIENCE: Hello. Hello. My name is Tody and I'm with the "No hate speech" movement.

I have sort of a question for you. And it was talked about at the YOUthDIG. In this era where we have the Government giving fake news to boost political agenda, we have several print medias who were reliable before giving fake news, also, siding with the Government, and especially during political elections.

Who do you think should be responsible for debunking our fake news or regulating news in general?

Thank you.

>> MENNO ETTEMA: The question is when it comes to normative, where do we set the rules and standards? But your experience is that it's actually political parties or politicians who are using or spreading fake news for a political agenda. You said a place, which region is this that your experience comes from?

>> AUDIENCE: I was talking in regards to the election, the American election.

>> MENNO ETTEMA: Ah. American elections. Okay. Thank you.

Any other experiences with fake news and the consequences it has to your work or your organisation that you would like to share?

Please.

>> AUDIENCE: So my name is Gercon. I come from Turkey. And I'm running a news organisation. It's a citizen journalism news centre. And what we do is we fact check and verify news before publishing them. We rely on our hundreds of reporters across Turkey.

But sensational news is something that is quite catchy. And the conventional media in more than one cases has been doing this kind of sensational news, which happens to be something that is not true, that is not based on facts or that is not verifiable.

Instead, we rely on verifying, rather than being fast. We try to be as accurate and reliable as possible. And by doing that, we also get accused of propagating fake news, distorting the facts.

So it seems to be a discussion that is going both ways. The people who actually generate news that is quite sensational but not true also accuse the ones who verify that it was not actually true, to be nonfactual. It is quite complicated when I speak like this, of course. But it is a tough situation when it comes to fact checking and verification.

>> MENNO ETTEMA: Yes. That is actually being challenged itself for being fake.

>> AUDIENCE: Yes. And I always remember Bernard Goldberg's book Bias whenever the issue of fake news is being uttered.

That's it.

>> MENNO ETTEMA: Thank you very much.

>> AUDIENCE: Hi. My name is Rachel Pollack from UNESCO. It's not a concrete example, sorry for that, but I just wanted to share some of the outcomes of a discussion we had in March at a colloquium called "Journalism under fire," where one of the central questions was about fake news. And that we thought that the term "fake news" is quite problematic, because actually it's not news. It's false information. And as Divina Frau-Meigs said, there are different types of "fake news." And the term itself is being used to de-legitimize the media, to criticize media who are critical of Government.

So I think we need to be real careful in the way that we talk about these issues, and also moving forward and finding solutions to working with journalists and journalism literacy.

>> MENNO ETTEMA: Did they come up with a new term to replace it that would be a bit more functional?

>> RACHEL POLLACK: Because that is what is still in common usage, we put it in quotation marks. At least as a start. But I think -- for example, there was a researcher from the University of Oxford who has done a lot of study about the spread of fake news, and they call it "Junk" as a kind of overall category.

So I think it's also just as important that we distinguish between different types of material, whether they are being produced and spread for political reasons or for commercial reasons, and that also affects then how we address the problem.

>> MENNO ETTEMA: Thank you very much.

So we are getting a bit of a feel -- yes, please.

>> AUDIENCE: My name is (Inaudible) I work for the European Commission. I'd like to bring to the floor an example of fake news that all of us experienced during the Brexit campaign in the UK. And here the fake news, they came from the traditional media, from the traditional newspapers, so nothing to do with the Internet. And the reaching of the fake news was politicians who were deliberately lying about running of kettles (?) Running of societies, and things like this. So this is a question that is not linked to the Internet.

>> MENNO ETTEMA: Thank you very much.

I'd like to move to the next phase, if that's okay.

Very short? Thank you.

>> AUDIENCE: My name is Osman Coskunoglu from Turkey, one of the organizers of this session. And this question I insisted to be voted for.

I thank you to the previous speaker here, because it has nothing to do with the Internet. Often it is discussed in the Internet realm. There is yet another recent research showing that polarization exists in older people. But social networks and Internet is mostly used by younger people. So let's separate those two, Internet and fake news and polarization from each other. But, fake news, all of a sudden why it became a hot topic? Because, I know in Turkey, I know in the United States, I know in some European countries, polarization in different countries is increasing in the recent years, hence it becomes a good market, so to speak, for fake news.

Thank you.

>> MENNO ETTEMA: Thank you very much.

Okay. We are getting various experiences of fake news. It has different faces and looks differently. And it's certainly not something only related to the Internet. And as some people mentioned, yes, but how do we define it? What are we actually talking about and what basis do we have to take action?

So can we actually define it? There were a few proposals made already of differentiating the fake news in two different categories. But then the question is, are this existing rules and regulations or standards that we can use to actually start looking for a way of addressing this fake news? So if it's propaganda, what can we fall back on? If it's gossip, what can we do? What standards do we have? What regulations do we have to maybe take action?

I'd like to ask Ambassador Corina Calugaru to take the floor. She is the Thematic Coordinator on Information Policy with the Committee of Ministers of the Council of Europe and also the Ambassador of Moldova to the Council of Europe.

>> AMBASSADOR CORINA CALUGARU: Thank you. Perhaps the previous interlocutors mentioned that the digital era in fact brings to all of us not just rights, but responsibilities. And propaganda, fake news, it's fundamental that in fact could be, may be preventable. But at the same time, we have it today. And I think the Secretary General's annual report on the state of democracy, rule of law, and human rights emphasizes the scale of propaganda and fake news. And, in fact, the possible further actions should be implemented by the Government, but at the same time by all the national and regional international stakeholders.

I'm talking here not just about the Governments, national agencies, private enterprises, Civil Society. And usually we, the Government, in the Council of Europe or the Member States, are used to elaborate and to negotiate different recommendations. As you mentioned, regulations.

But I think with the Council of Europe Internet Governance strategy, we are bringing a new platform for the business and the Council of Europe where we should listen to each other, and not just to listen but to act. Because I think until now we were discussing too much and we are emphasizing the issues that we have towards nowadays.

Concrete actions, but I think concrete actions should be very simple and focused. These actions are coming from the education. I think through "no hate speech" campaign of the Council of Europe as well. We need to raise awareness to the population, and at the same time freedom of expression that we have already, it's impossible to limit some access to the information. So the question is: How to delimitate the wrong and the true information?

So this parody, in fact, this is about doing your job in a professional manner with true information. But, unfortunately, it's quite impossible because there are different interests. So it's about the final end for the people's choice. What they are choosing to do -- to believe in some fake news or true news.

So it's nowadays a question of being true citizens of our countries and of our world. But in these delimitations, in fact, it's about taking from every individual and with all the national stakeholders.

>> MENNO ETTEMA: Thank you very much.

It's an interesting development or discussion here. So okay, we need to ask Governments, as the Council of Europe setting standards how to regulate these kinds of issues. And we have to see with businesses through platforms how to make this workable. But ultimately it's the huge responsibility with the citizens as consumers to find their way in this. Okay. Especially because of the freedom of expression, and this is sometimes difficult to balance, where do you find the line or draw the line.

>> AMBASSADOR CORINA CALUGARU: To come back to this idea, as I mentioned, on the possible platform between the Council of Europe and business: at this stage, everybody agrees that we need to listen to each other. My point is that we are listening to each other, but at the same time we need to take responsibility.

In general, in recent times, we have been used to talking about rights but not responsibilities. So I think it is about being responsible, about assuming the responsibility to promote true information.

It's quite difficult, because we need to select the platforms. But, once again, it's about being professional.

>> MENNO ETTEMA: Okay. This raises two questions for me, which maybe also we could have a look at what the audience thinks and the other panelists.

The next question in the poll is also related to that: do we need more regulation? Next one, please.

The statement is: "Regulation to stop the production and dissemination of fake news is what is actually needed." But listening to the Ambassador, the question is also: do we need more regulation, or do we need more people taking responsibility to address it? So maybe some thoughts on that.

Of course there are only 14 votes now, but more will come in. It's moving in the "no" direction: no, we don't need more regulation. So maybe it's true that what we actually need is people taking responsibility for implementation.

Does anybody have feedback on this? Points of view to add to this point?

Yes, please.

>> SVEN MIKSER: I think that we should definitely not equate bad journalism and fake news. And I don't think that propaganda and fake news are exactly the same thing either.

And I would like to limit the term "fake news." Again, there was this question of whether we should call it fake news at all, because it's not news. There were news stories, I don't know whether they were fake or not, about fake rice being sold in China. Plastic rice. Well, we might talk about it, but let's not call it rice. It's not rice, it's plastic. That's why it's called fake rice. In the same way, fake news, by definition, is not news.

But also, I would say that we should limit the term only to those instances where the news, or untruthful information, is put out there knowingly, in order to achieve a desired effect. Otherwise, we are basically saying that all information we do not like or do not agree with is fake news. That is extremely confusing, and that way we would actually encourage polarization. You may read in one newspaper that the taxation system of Estonia is brilliant, and in another that, well, it's crap. Obviously, both cannot be true at the same time. And it is tempting to call the one you do not like "fake news," but it's not. There may be a certain ideological bias, but that is not fake news.

And I think that regulation is probably not the way to resolve it. Unfortunately, the bad news is that we cannot deal with the situation just by putting out more truthful news, because we cannot preempt fake news.

Take stories like: an immigrant from country Z raped a pregnant woman. It's a mixture of hate speech and fake news, maybe. But you cannot counter that by reporting all the instances where people were not raped by immigrants.

You can debunk the story once it's out there, but that already puts you, or the providers of truthful information, on the defensive.

So I think the answer lies more with improving the critical capacity of audiences, of the individual recipients of information, to make the distinction between what is truthful and what is not. I think this is the key.

But even that is not easy. If I tell you that it's raining in Paris, you are inclined to believe me. Why should I lie about it? If I do, it's very difficult for you to check. Well, you can go to a website, weather.com or something like that, and check, and then you'll probably believe what that website says. But you can also create a website, climate.eu, a fake news website, and try to get it referenced in more serious media. And that's how this industry operates.

So it's actually an enormously difficult issue to deal with. But obviously we can't resort to a Luddite answer and try to destroy the machine in order to get rid of its potential misuse.

>> MENNO ETTEMA: Thank you.

Divina?

>> DIVINA FRAU-MEIGS: Thank you. In fact, there is a lot of regulation going on, except that it is of very different kinds. And, again, we have to be clear about who enacts it, who is responsible for it, and who should be accountable. The GAFAs, the big media platforms themselves, are regulating right now. Facebook is taking down websites. Who controls Facebook? Facebook itself is now saying there might be a need for some kind of media status for the GAFAs.

Right now we don't know what that status would be. Because it's American, it looks very much like the model of a common carrier -- telephone, telecom -- or the "information provider," as it is called. But there are other models for media that we might want to consider for them, especially if the issue is trust. There is a public trustee media model, which comes with a few public service obligations -- and isn't the Internet a public service, in a way?

Shouldn't we be able to give politicians, or whoever says something erroneous, the right to answer in the same outlet, which is what characterizes, for instance, the public trustee model? We really have to go back and think through these models that have existed and hybridized over time, and probably come up with a new one for the news media. Because they are media, and they are starting to recognize it.

So if they are media, then there are other ways of thinking about regulation. It doesn't have to be state regulation. There is regulation by the journalistic profession itself, which is happening. Are you all familiar, for instance, with the Décodex of Le Monde, which is trying to keep track of all the websites that are suspicious, and calls on media users and Internet people to help, to signal, to identify, to be aware of these things and report them? This is becoming interesting in terms of co-regulation: the journalistic profession and the users getting together. Why not? This is certainly one of the options that the Internet favors, a bottom-up approach to debunking fake news.

The last one would be -- well, there's two more. One, you know me, I did say that the best regulation is education. Ha-ha. It takes a long time.

But another one could be to rethink the algorithms. Instead of pushing fake news because it is sensational, because of its proximity to other hot keywords, how about devising an algorithm that thinks like a journalist, an ethical one? That's a challenge for the technical community that is here. Where is this journalistically minded algorithm? This is where we should be going, among other things.
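
As an illustration of what such an algorithm might look like, here is a minimal, hypothetical sketch in Python. Every element of it -- the field names, the weights, the keyword list -- is an assumption made for illustration, not any platform's real ranking system; the only point is that a feed can be ordered by source credibility and fact-checking signals rather than by raw engagement.

# A minimal, hypothetical sketch of a "journalist-minded" ranking heuristic.
# All fields, weights and keywords are illustrative assumptions, not any
# platform's actual algorithm.
from dataclasses import dataclass

SENSATIONAL_WORDS = {"shocking", "exposed", "you won't believe", "secret plot"}

@dataclass
class Item:
    text: str
    source_credibility: float  # 0.0 (unknown/suspicious) .. 1.0 (accountable newsroom)
    engagement: float          # shares/clicks, normalised to 0..1
    fact_checked: bool         # an independent fact-check is attached
    disputed: bool             # flagged as disputed by fact checkers

def journalistic_score(item: Item) -> float:
    sensationalism = sum(w in item.text.lower() for w in SENSATIONAL_WORDS)
    score = 0.6 * item.source_credibility + 0.2 * item.engagement
    score += 0.3 if item.fact_checked else 0.0  # reward verified items
    score -= 0.5 if item.disputed else 0.0      # demote disputed items
    score -= 0.1 * sensationalism               # dampen clickbait signals
    return score

def rank_feed(items: list[Item]) -> list[Item]:
    # Highest journalistic score first, instead of highest engagement first.
    return sorted(items, key=journalistic_score, reverse=True)

In practice such weights would have to be set, audited and contested publicly, which is exactly the accountability question raised above.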

And the true answer is that we have to have all of these options at hand if we are a real Democracy. All of these options.

>> MENNO ETTEMA: I do want to go back, though. It's interesting to see the various approaches we can choose from to answer the question: how do you start implementing the regulations or norms that we set ourselves?

But the question here comes a little before that: how do we actually establish the norms, where do we set them, and what are the starting points? Working for the Council of Europe, I would point to the human rights conventions of the Council of Europe and similar instruments. But I think, as Minister Mikser was mentioning, the intentionality of fake news is an interesting element here. Is that a starting point to say: intentionality means that a certain norm has been crossed, and that's why we need to take action?

So where do we distinguish between what is gossip, what are other means of trying to influence the public, and where it becomes actually questionable and we need to take serious action? Because even if a platform starts regulating and taking content down, what is the basis for doing so? That's a question for which I think we are still looking for an answer.

I see you wanted to add to that. If also people from the audience want to add, you're welcome.

>> AMBASSADOR CORINA CALUGARU: In fact, at the level of the Council of Europe, and of the UN as well, we have enough instruments: the Human Rights Convention, the Cybercrime Convention, (inaudible); we have the mechanisms of the Human Rights Convention. But I think that on the Internet in particular we have so many scenarios and so many cases that it is quite impossible to regulate them all.

So the real question is how we put to work what we already have. Inventing new norms only creates fatigue around drafting and negotiating them. It is easy to draft and adopt; the most difficult part is implementation. So I think that for the Internet in particular, it's about how to put to work all the instruments we already have.

>> MENNO ETTEMA: So the norms are clear. It's about how to implement them, how to make them workable in practice.

I see two questions.

>> AUDIENCE: Hello. My name is Kurran (inaudible) from the Russian chapter. In Russia we actually have a couple of regulations on online platforms that basically put the same liability on them as on the media, in terms of reporting things like slander, et cetera.

And what these regulations actually achieve is suppressing the voice of the opposition. It's not working as planned. So when I hear from supposedly democratic countries that we need more regulation of online platforms, I fear that with a change of politicians it can be used to suppress voices.

As for regulating algorithms: yes, obviously Facebook feeds aid the spread of fake news. But, then again, people have confirmation biases. We click on the things that we want to be true, not on what actually is true. You can arrange search engine results however you want; people will still click on what they like to see and not on what they actually face.

So I think that we should invest in good journalism, and that is probably the only way we can accomplish this. And as far as algorithms go, I think it's in the best interest of the platforms themselves to create better algorithms that are not easily swayed by people's biases. Because right now, at least on the social networks, that is how it is: we train the algorithms to do what they do; it is not the algorithms that make us read what we want to read.

>> MENNO ETTEMA: Thank you.

Yes.

>> AUDIENCE: Yes, I will follow up on that, because we were talking about education, and I wanted to first say that we do need education. But my point is that we should also act. We can influence the algorithms, and I'll give an example. I worked in the UK, and after the terror attack in London -- and the same after Manchester -- one of the first things that appeared on social media were tweets, mostly by teenagers, about missing people, or pictures of unconfirmed victims. The main reason for doing that was not hate speech; it was just to gain visibility, playing on people's feelings in the aftermath of such awful events.

But the good thing is that afterwards there were people, journalists from different media outlets, who pointed out the fake news. They cross-posted the fake posts and the actual ones, tracing where the pictures came from and why people were doing it, and they did this right in the aftermath.

And I'm also talking about a country where, as was mentioned before, fake news and propaganda are part of the mainstream media. We have newspapers that are free, available to everybody, that spread a lot of hate speech and fake news, especially during the referendum. But because there were people working on deconstructing and debunking what they were doing, they are now much more careful. People were monitoring what was going on after these attacks. And now that there is a general election campaign going on, there are people who go after every post and every comment and say: look, this is not true. And so on.

So it works. Of course, it doesn't always work, and it isn't hugely popular, because, as the person before me said, people click on what they want to see. But it has an effect. And if we're talking about education, we should educate ourselves on how to recognize these things. There are tools available for tracing images, there are reverse image search engines, there are journalism forums where we can learn how to do these things. And I think it's our duty to do more about it. Because if we just leave these things out there, waiting for education to come, we let too much of the bad stuff stay out there. So I believe that debunking, and circulating good news and good information, can do something.

So I just wanted to give an example.

>> MENNO ETTEMA: And you are --

>> AUDIENCE: I'm Debra from the No Hate Speech Movement. Sorry about that.

>> MENNO ETTEMA: From the UK. Yes. Good. Thank you very much.

Yes. So we looked into the normative questions, and it's becoming very clear that we don't need more norms, but we need more -- ah, there is one more comment. Yes, please. Sorry. Quickly.

>> AUDIENCE: I would like to underline the point made by Minister Sven Mikser: it's important to differentiate between propaganda and fake news. These are two different problems and two different issues.

Propaganda is produced by a country, and it's not easy, but you know the sources come with a very political intention. So, for instance, in the European Union, within the EU External Action Service, we have a task force dealing with propaganda. They are trained to understand how it works, how it is disseminated, and how it threatens our processes.

The other issue is fake news, which is really many different things. I would add one category to what was said already: fake news produced for economic interests. People who just want to attract advertising, so they make the fake news bigger and bigger to get more traffic and more advertising revenue. The big intermediaries are dealing with this category through a "follow the money" approach.

And then we have the general category of fake news which is not illegal. And I think this is important to say: fake news is not, per se, illegal. Some fake news is illegal -- for instance, when it amounts to defamation, or to statements inciting hatred or violence; then you have specific laws to deal with it. But outside what is illegal, fake news is not illegal. And here I would just quote our Estonian commissioner, who said: we don't want to have a Ministry of Truth. And this means we cannot regulate something which is not illegal, because freedom of speech is an overarching value of our democracies.

Thank you.

>> MENNO ETTEMA: Did you mention who you were?

>> AUDIENCE: Who I am? So my name is Marisol (inaudible) Divanna. I work for the European Commission.

>> MENNO ETTEMA: Thank you. You said that, of course, before.

I would like to move on. It's interesting what is being mentioned here. On the normative level we have rules and regulations and they are functional and we have to be clear on what is legal or illegal, which I think is a good, interesting point.

It makes me reflect on previous discussions we had at EuroDIG about hate speech: how we deal with hate speech, both illegal and legal, and the various approaches.

So if the norms are clear and the rules are clear, and it's about implementation, we move to the last phase: the prescriptive one. What can we do? Many points have already been raised.

So I'd like to ask Jessikka to comment.

>> JESSIKKA ARO: I'd like to present true stories, not fake news. I'd like to present what happened to me after I started to --

>> MENNO ETTEMA: That doesn't work.

>> So -- and also what I found out in my investigations.

So this will answer, I promise, this will answer so many of your questions and so many of your comments.

Let me just guide you through it. This will give you pretty good insight and also show you how Finland has been countering this hate speech, propaganda and online threats. I'm quite proud of that.

But let's see what happened to me. In September 2014 I was, and I still am, an investigative reporter investigating extremism, Russia and information warfare. In 2014 I found out about a then-new phenomenon of Russian information warfare: social media and online trolls spreading propaganda and aggressive messages to people all over the Internet. So I asked Finnish people to help me investigate, and I started an online crowdsourcing project. I asked people to share their experiences: where they had seen these trolls, how they act and what kind of attacks they carry out.

Within two or three days I started to receive hateful messages in Russian on my phone from all around -- a Kazakhstan number, a Ukrainian number. I received a phone call in which someone was firing a gun, or playing the sound of gunfire. In these text messages people were accusing me of persecuting people for their opinions, accusing me of crimes, hoping that I would go to jail. And these were basically coming from all over the Russian-speaking world.

Then I found out that fake news had been spread about me all over Russian-language Internet troll news sites. What it said was that I'm not actually a journalist at all; I am in fact a well-known assistant of American and Baltic special forces and security services. This is a propaganda trick used in Russia: framing a person as an agent.

I was also accused of gathering some kind of illegal registry of Putin supporters in Finland -- so of committing crimes -- and of handing this list of people to American services.

My contact information was also published there.

Why I want to tell you this is to show how fake news can work and how it can be used to manipulate and agitate people -- not just their ideas, but also their actions. People were reacting to this fake news about me by calling me and sending hateful messages, because many of them believed it.

At the same time there was an orchestrated complaints campaign, about which fake news was then also made, claiming that I and the Finnish Broadcasting Company were breaking many laws with my article -- in which I certainly wasn't looking for information about any actual people, only about aggressive anonymous online trolls.

Also, while I was making my story, a protest was staged outside Yle. This protest was agitated for on social media, and people were lied to about Yle's journalistic purposes.

Then some troll activists -- so now we are talking about a time span of two and a half years; this was about one year ago, already after my investigations had been published in English, Finnish and Russian, online, on TV and everywhere -- it seems the troll activists hired an actress to play me in a defamatory YouTube video, framing me again as some kind of NATO agent and pro-American agent. And it even looks like this was recorded in the Yle newsroom.

Another thing these fake news predators and social media propagandists do, which I would like your comments on, is buy visibility for this content from Facebook, to show it to even bigger audiences. This happened to me, too.

And then, a little over a year ago, came the most horrible campaign I have ever experienced. This was a month after the video and the music videos spread about me. The troll activists went through old court files in Finland and found that in 2004 I had received a 30-euro fine for drug use, because I had a very bad drug problem back then. They took this and made horrible fake news out of it, in Finnish and English, even calling for my resignation. So they were no longer just propaganda and fake news predators; they had also become political pressurers. They even approached my bosses at Yle, demanded that I be sacked, and demanded explanations for why they have me as their reporter.

Here is a short list of the fake news produced about me over those two and a half years, starting from the day I began to investigate Russian troll propaganda. It was not originally in English, but I translated it for you. It is actually a stalking campaign.

And the very disturbing thing -- this is a really interesting technique and mechanism in this process of being stalked by possibly state-sponsored and state-controlled actors -- is that there is always a wave of hate speech messages and contacts directed at me after hateful fake news about me appears online.

So there are people and trolls, and even former friends of mine -- people who know me personally -- who want me dead, or want me silenced, or threaten me in other ways.

And here are just a couple more examples of fake news. Even RT Deutsch, the German branch of Russia's most aggressive propaganda outlet, made fake news about me, using material from Finnish propaganda sites, and also tweeted it at me.

And finally I want to show you this, because there is not enough research on how propaganda and fake news actually influence the human mind and behavior. I tried to look at this in my original investigation, and I found that it is very bad news, because so many people have difficulty telling what is true and what is not after they have become subjects and targets of disinformation by the trolls.

Also, many people turned into propagandists themselves. In Finland we have seen academic, well-educated people, such as university teachers, spreading disinformation and fake news as truth on social media.

So that is my take on this. Luckily, there have also been very good, positive reactions in Finnish society. For example, the police have now set up a new special unit to target hate speech and hate crimes, because it is a well-known fact that hate speech can incite hateful actions -- violence, basically.

We are also seeing various big international police operations and investigations, and our government officials and civil servants are being trained to recognize and counter disinformation and propaganda. And civil society has been truly active in this as well.

And so this is the Finnish case. Thank you.

(Applause)

>> MENNO ETTEMA: Thank you very much, Jessikka.

It is a striking personal example of how, while investigating propaganda and Internet information wars, you ended up a victim of fake news yourself, with all the consequences that has for you as a journalist and for journalistic practice.

Which comes to the question what can we do, which we will address in a few minutes.

Can I give the floor to Divina, who also works in this area and is a professor at the university and a sociologist of media. You are also an expert for UNESCO, the Council of Europe and the European Union -- and now for us -- on media literacy.

>> DIVINA FRAU-MEIGS: Thank you. It's true, and it's interesting, because I've been with EuroDIG since the beginning, and we are celebrating a special longevity today. But it has been very hard to bring the issue of literacy, of critical Internet literacy, to EuroDIG and to the Internet governance community at large. So I'm very pleased that this is at last on the agenda.

And, as usual, media literacy arrives on the agenda when there is a mess. The mess here is fake news and the scandal of fake news. But, to begin to explain what media and information literacy is about: it is about developing a critical mind. It is a very specific set of competences that is necessary today and that should come first and foremost in education, including the search for information.

Because when you start having a pedagogy of it, and we start teaching young people but also adults -- and the big news today is that we have to be media and information literate throughout our lives; it's a programme -- what it allows us to see more clearly is what I have tried to do before: there are several types of fake news. They come with different agendas, and we can figure out the agenda. It can be political, it can be economic, it can be malevolent or benevolent or whatever. There is even some good fake news. It's done in Paris, to answer your question.

But when you become critical, then you start having tools and you start putting words on what is happening. And once you start putting words, you can ask for action. You can petition your politicians. You can try to be an activist of a different kind.

So, for instance, what I would say from the perspective of media and information literacy to what is happening is that fake news is a scandal. A scandal in the Greek sense. It makes us stumble on what we thought was the acceptable.

For instance, one of the things we have accepted is the mantra that we are moving towards more democratic countries, that this is progress and that it is on a constant rise. Well, wrong. What we are seeing -- and this is what is so difficult to explain, particularly to young people, but let's face it, to ourselves -- is that this kind of propaganda is extremely disruptive. And in the digital world we have been taught to think disruption is good. Except that the disruption at the moment is directed against democracy, and some people think that is good, and everybody has their own truths. So we have to think about it. But just thinking about it is starting to think of an answer and to provide responses. What is the scandal? The scandal is that propaganda that used to be internal to a country is now affecting the politics, the decisions and the opinions of other countries. And that is scandalous, because the integrity of one culture, one nation, et cetera, is being attacked. So yes, we have to think about it.

And then there is what Jessikka was mentioning: the blurring of borders, where we move from fake to hate. Liberating the language of fakeness leads us to a very different thing: hate speech. And hate speech is regulated, at least in democracies, at least in European democracies -- not in the U.S., where a lot of things can happen that way.

So getting a critical handle on these issues is part of the role of a pedagogy of media and information literacy. But once you have said that, you haven't advanced much. And I'm really bored with presenting just this aspect of media literacy; that is just wrapping, putting a nice wrapper around a very bitter pill. So we have to take media and information literacy to a different level if we are responsible -- which, I'm glad to say, the Council of Europe is doing, UNESCO is doing, the EU is doing, with Marisol at DG Connect taking media and information literacy into account.

Europe has been wise in deciding that media and information literacy is the banner under which digital literacy should sit. Which is to say, it's not so much about the technology as about the values that you want to give to the technology. And that is a good lesson from Europe.

So Europe has a directive in which we speak about media and information literacy, and there is DigComp, the digital competences framework, in which information is a major component. So if it's not to be just a pedagogy done in the classroom, which will take about 25 years to be effective, how should we go about it, if we want it to be effective? Well, I'll say two things, quickly, to start you thinking. We should construct media and information literacy as a right: a right that gives us access to education, to participation, to citizenship. It is not yet conceived that way.

And we should recognize media and information literacy, again, for what it has been from the start -- which is also what has made it have such a hard time piercing into national agendas and national curricula: media and information literacy is a political project. A political project. Not about propaganda or about debunking propaganda; it is about creating a kind of governance, about giving us back the enchantment of what the media were about when we started with media. They were about democracy. They were about ethics. They were about helping us form our opinions. We have lost that. We haven't transmitted it to young people. We haven't always transmitted it even to journalists.

So it has to be a political project. And I advocate not only the governance of the Internet, but the governance of media and information literacy. When you look at how the politics of media and information literacy work in Europe -- and I have just published a book on that -- you see that there is in fact a lot of dysfunctional governance. Sometimes a Ministry of Education takes charge of it, sometimes the Ministry of Culture, sometimes civil society, sometimes Microsoft.

We need to think this through and tell ourselves that media and information literacy is an essential 21st-century skill, so let's move on it at a large scale, all together.

>> MENNO ETTEMA: Thank you, Divina.

I see the first person wanting to take the floor, which is great. Various points were raised here on the role of journalism and on taking up that role. In previous discussions we also talked about fact checking and the role to be taken up there.

I just wanted to check with the audience. Where do you think we should start? And then I'll give the floor to you. What is your approach: how do we address fake news?

The question is, if we move forward, no, one previous. Next.

"To counter the risks of fake news we must work on..." -- so, after hearing the various discussion points, what are the feelings in the room? One person has already voted. To counter the risks of fake news, we must work on: media literacy, strengthening fact checkers, rewriting algorithms, strengthening the work of journalists, something else, or nothing at all. This is multiple choice, so you can pick several points.

All of the above? That's not an option as such, but it's multiple choice, so you can click all of them.

Good. We will keep this open while it's going. I'll give the floor to you. Please introduce yourself.

>> AUDIENCE: Thanks. My name is Val Plummer (sp). I work for Electronic Frontier Finland.

And I have some kind of a solution. I'm really glad to see that most people in the room have voted against regulating the issue, which I think is good: regulation is a heavy measure, and we could destroy a lot of freedom of speech along the way if we go that route.

But I think what we do need are better tools for identifying fake news. And I think the key would be for Facebook and Twitter, the biggest social media platforms, to give us a tool so that we can flag disinformation not just to Facebook, but for all the other people to see. Because at the moment there is nothing that actually flags to you the fake news that is circulating on Facebook.

And I think there could be a solution where Facebook gives you an extra button on a news item, which would show you how many people, or what percentage of people, have determined it to be fake news. And then, if you were interested in seeing the conclusions or the debate about that specific news item, Facebook or Twitter would take you to a separate page where the legitimacy of that item is discussed, and people could verify for themselves, and even take part in the discussion about whether it is real or not. And I think that is something that can be done, crowdsourced.
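
As a minimal sketch, this is roughly the data structure such a crowd-sourced flag counter would need. The class and method names here are hypothetical, not any existing platform API; the point is only how a per-item flag percentage and a linked discussion could be kept.

# A minimal sketch of the proposed crowd-sourced "flagged as fake" counter.
# Class and method names are hypothetical, not a real platform API.
from collections import defaultdict

class FlagRegistry:
    def __init__(self):
        self.views = defaultdict(set)        # item_id -> user_ids who saw the item
        self.flags = defaultdict(set)        # item_id -> user_ids who flagged it
        self.discussion = defaultdict(list)  # item_id -> (user_id, comment) pairs

    def record_view(self, item_id: str, user_id: str) -> None:
        self.views[item_id].add(user_id)

    def flag(self, item_id: str, user_id: str, comment: str = "") -> None:
        # Each user can flag an item only once; repeat calls are idempotent.
        self.flags[item_id].add(user_id)
        if comment:
            self.discussion[item_id].append((user_id, comment))

    def flagged_percentage(self, item_id: str) -> float:
        # What the extra button next to the news item would display.
        viewers = len(self.views[item_id])
        return 100.0 * len(self.flags[item_id]) / viewers if viewers else 0.0

registry = FlagRegistry()
registry.record_view("story-42", "alice")
registry.record_view("story-42", "bob")
registry.flag("story-42", "alice", "Photo is from 2015; see reverse image search.")
print(f"{registry.flagged_percentage('story-42'):.0f}% of viewers flagged this")  # 50%

The discussion list is what the separate page mentioned above would show, so readers can judge the evidence rather than just the vote count.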

And then there are other things. In fact, I have a friend who is starting a service, a web page where he is going to list all the ridiculous fake news circulating from different sources. That could give us some statistical analysis of which websites produce this kind of fake news all the time.

So we need tools. We don't need regulation.

>> MENNO ETTEMA: Yes. And tools to actually open up fact checking to a wide audience. That's a great suggestion.

Jessikka, you can take the floor.

>> JESSIKKA ARO: I'd like to comment on that proposal and at the same time remind everyone that Facebook, for example, already has many tools to counter hate speech -- very simple tools -- but it is just not using them at the moment.

For example, in Finland we see all the time that people are making user reports to Facebook, saying: here is hate speech, here is graphic violence, here is harassment. And Facebook, all the time -- it's just unbelievable -- keeps replying that these do not break Facebook's community standards. And it's really amazing, because that content really does break the law, not just in Finland but in many other countries. So how is it possible that Facebook's community standards differ from the hate speech laws of most countries?

So we should really make these big Internet giants accountable for what they are doing. They are basically doing business with us: they take your information and mine and use it everywhere. But they are not carrying their responsibility as enablers of even state-sponsored propaganda and hate speech, which Facebook, Twitter, and even Google and YouTube currently are.

And the big problem is that you never see these giants fulfil their promises. Already one year ago they promised the European Commission that they would remove hate speech within 24 hours of it being reported. Well, did that happen? No.

>> MENNO ETTEMA: Maybe it's worth mentioning that the new report was published just a few days ago, the one-year evaluation, which seems to indicate that more action is being taken, and more quickly, within 24 hours.

But what is interesting, adding to your point, is that when content is not taken down, it is not clearly communicated why, or why not. And there is a call for transparency at that level, which is one of the report's conclusions. That's interesting. Thank you.

>> DIVINA FRAU-MEIGS: Maybe to add: we need research. Really, we are talking without being sure about the consequences or the pathways of fake news, or how it works. What little research we have gives us one or two pointers. One of them is that the biggest influencers, the ones who are most followed online, are the old media. And, in fact, they are the ones who usually activate the most fake news.

So we have to go back to the ethics of journalism, and we have to retrain the journalists who were born in the digital age, who click first and check afterwards, and then reclick something else. They think they can treat news the way they treat other information. That has to stop.

So there is a real need for the profession to have a grip on itself. And I know that some journalists are doing that, but it's not all. And we really have to be careful about that.

The other thing research shows us, about these influencers, is the speed of rumor. A rumor, when it is fake, takes about 12 to 14 hours to be debunked; when it is real, it takes two hours. So fake news -- real fake news, if you want -- circulates much, much longer and has a much more damaging capacity than real news. And, unfortunately, the buzz that the correction makes is much smaller than the buzz made by the fake news itself, so the correction does not get the same speed of diffusion.

All this explains what happened to you. So if we have a better understanding of speeds of networks, of rhythms and flows, et cetera, we may be able to take action. Again, this is where research and education come to the rescue in order to make decisions that are not necessarily regulatory or political, but that can have an influence on our behavior online. And we are all concerned here.

>> MENNO ETTEMA: Yes. Yes.

>> SVEN MIKSER: I think it's probably a bit easier to both sort of identify and also regulate hate speech than it is to identify and regulate fake news.

And when it comes to the distinction between propaganda and fake news, between what is and what is not fake news, I think that, for example, politicians making unrealistic promises during election campaigns is not necessarily fake news. If you say during a campaign that you have a perfectly simple solution to provide affordable healthcare to everyone, and it's going to be so easy, that's not fake news. If you imply that your opponent may have had some role in a plot to assassinate a former President, then that is clearly fake news. It is a subtle distinction, but there is a distinction.

And I think that fake news -- claiming that something that is not true is true, or that something that did not happen, happened -- is not something that can be voted on. I think the objective of those spreading fake news is to create the impression that there is a relativity of truth, and this should not be allowed to happen.

So the suggestion that we should vote on whether a news item, a factual claim, is true or not -- that would be a ridiculous vote, I'm afraid.

>> MENNO ETTEMA: That's a challenge.

I want to go back to the poll, where it said we need to work on algorithms, on strengthening fact checkers, and on strengthening the work of journalists.

This comes together in a concrete example that I was asked to show. Maybe you can introduce it; it's the broadcasters' clip that we can play. But perhaps you first want to tell us what we will be watching, because you said this is a concrete example of the work of fact checkers and journalists, who are also under threat from the spread of fake news and hate, which is a concern.

>> AUDIENCE: Yes, Giacomo Mazzone from the European Broadcasting Union and a journalist for 40 years, at least.

In what has been said, there is a main problem: traditional media are still, today, the source of reliable information. But there are entities and interests that try to short-circuit this traditional spread of information. They want to influence the audience directly, so they try to kill the mediator, as we have seen in Jessikka's case, or to bypass the mediator. Because the mediator used to be the guarantee, since under media regulation there is a legal responsibility for what is published.

So today we need to continue to do our job, especially as public service broadcasters: searching for the truth and providing the evidence. But we also need to counter fake news concretely, and to be rapid and fast. Because, as Divina said, if you react to fake news the day after, the battle is lost; the damage is already done.

So there is more and more work being done in all the newsrooms to tackle these issues.

There is a video that I want to show. I want to add one thing before --

>> MENNO ETTEMA: Can a technician switch the screens to the video, please?

>> GIACOMO MAZZONE: The thing I want to say is that there are two things it would be very important for others to do, not only the media companies. The first is that I would very much like politicians not to use this as an instrument for their own campaigns, because if they do, all the effort that you make as media is lost.

The second is the algorithms. There is a possibility -- and we have done some testing during the election campaign in France, and now in preparation for the German campaign -- that once the fact-checking work is done, the algorithms can take it into account, so that the spread of fake news can be countered and eventually flagged, as the Electronic Frontier colleague suggested.

Have a look.

>> MENNO ETTEMA: Let's have a look at the explanatory video.

(Video)

Fake news is everywhere these days. (inaudible) supports Trump: fake. A Green party member invites all refugees to Germany: another fake. But these stories are shared many times on Facebook or Twitter. That's why ARD News decided a month ago to start an anti-fake-news unit. We call it the fact finders. Every day, four people on the team look at viral stories shared on social media. Christoph, here on the left, is part of that team. He and his colleagues run a website where they share the results of their research: is the story really true, or is it false, fake?

>> Well, we monitor the web every day and try to determine which fake news stories might be relevant in Germany on that day. And then, of course, we discuss whether a story is important enough for us to cover. Because there is also the danger that, if you cover a fake news story which hasn't had much impact yet, you actually make the story big by reporting on it.

>> The fact finders do not only look at German stories. When French presidential candidate Le Pen copied a speech by the former conservative candidate Fillon, the unit produced a short video showing exactly which parts of the speech had been taken by Le Pen.

Sometimes even satire can go wrong. For instance, when German Chancellor Merkel went to Saudi Arabia three weeks ago: in public, women in that country normally wear a headscarf, and of course Chancellor Merkel did not. But did Saudi Arabian media really blur her photo so as not to offend viewers? Of course not. It was a joke, a satire, as these pictures show.

On Saudi Arabian TV, too, Merkel was shown in the original, not blurred. But many people took it for real.

>> We cannot prevent fake news from appearing, but we want to provide information on fake news and we want to provide guidance to our users and help them to navigate within the vast amount of information which is spread in social media and which is spread on social websites. We want to help people to sort out which information is trustworthy and which is not.

>> So the team also teaches people how to tell fake news from real news. First: always check the origin. Is the source really trustworthy? Is the social media profile very new? Does the source have only a few friends? Then you should be very careful.

When you see videos or photos, always use reverse image search to see whether the material is really from today or was taken at a different time.

Last but not least, always check the web address. Is it trustworthy, and who is behind it? Take a look at the details of the uploader; that can give you valuable hints as to whether the source is authentic. And if you are not sure, you can always come to the fact finders at tagesschau.
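
The checklist in the clip can also be read as a simple heuristic. The following is a toy sketch only: the thresholds and the small domain list are assumptions made for illustration, not the fact finders' actual tooling, and the reverse image search step would still be done by hand.

# A toy scorer based on the verification tips above: check the origin,
# the age and connectedness of the account, and the web address.
# Thresholds and the domain allow-list are illustrative assumptions.
from datetime import date
from urllib.parse import urlparse

KNOWN_DOMAINS = {"tagesschau.de", "bbc.co.uk", "yle.fi"}  # example allow-list only

def source_warnings(profile_created: date, friend_count: int, url: str,
                    today: date = date(2017, 6, 6)) -> list[str]:
    warnings = []
    if (today - profile_created).days < 90:
        warnings.append("Profile is only a few weeks old.")
    if friend_count < 20:
        warnings.append("Source has very few friends or followers.")
    domain = urlparse(url).netloc.removeprefix("www.")
    if domain not in KNOWN_DOMAINS:
        warnings.append(f"Unfamiliar web address: {domain} -- check who is behind it.")
    return warnings

print(source_warnings(date(2017, 5, 20), 7, "http://www.climate-news.example/story"))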

>> MENNO ETTEMA: So there is a concrete example where different elements come together: journalism based on education and training.

Divina, you have a comment on this?

>> DIVINA FRAU-MEIGS: I think it is an excellent example of where we are now. We are now in the resistance stage to fake news; we are getting over it, in my opinion. The media have found some tools; they are responding in the way the social web works, which is to say through relational proximity. Does he have friends? Is he alone? Et cetera. That's how young people check. They don't know what the source is, or what real information looks like compared to fake; they will know whether it has a relational proximity to their own group, et cetera. And we have to make the effort to go towards that community and think in the same terms, and then bring them to values that do not change: the need for real information and for truth. Because, like many of us, I believe there is truth out there.

And so I think the good news is that there are now more and more means. The resistance is coming. I have examples of teachers in France, good practices around what they call hoax busting, which is something children like, because of course it is game-oriented. If you present it as a game, they will go into it.

There are social web groups that create things like "account hunting." All of these active and proactive attitudes are, I think, emerging now, and we will probably see an age in which we move more towards deliberative and critical approaches. And I desperately hope that media and information literacy will be part of that picture, as a political project.

>> MENNO ETTEMA: Thank you very much.

I have one more comment from the audience, and then I'll look to wrap it up.

>> AUDIENCE: Hello. My name is Asca Kuna from the University of Nottingham.

I wanted to briefly pick up on the point made earlier about the need for more research into the actual impact and spread of fake news, and to point out that one of the problems with doing more research is that the social media companies are the only ones with access to much of the data about what is actually happening. If you are trying to do research on this and you don't have a direct collaboration with people from Facebook, it's difficult to get the data. We have tried to do research projects where we built a tool using one of their APIs, but Facebook has a knack for changing its APIs, so you can't actually get data sets big enough to help.

>> MENNO ETTEMA: Thank you. I appreciate that. It calls for cooperation. Thank you very much.

I want to thank everyone for your contributions. As I mentioned at the beginning, this is an explorative debate. In the discussion it became clear that there are many practices out there that we can build on -- not necessarily more regulation, but more searching for how to implement what we already have.

So I'm curious what the audience thought of the meeting. We have a small closing question, a short evaluation, while the Rapporteurs prepare brief feedback from the session.

After this plenary on Internet in the post-truth era, I feel:

I learned a lot.

I'm more confused than ever before.

I'm committed to addressing the issue of fake news.

I feel fake news is a nonissue.

I wish to thank the organizers and the speakers.

And I look forward to the dinner party tonight.

Again, this is multiple choice, so you can select several answers. Feel free to add.

I'll give the floor to the Rapporteur to quickly summarize our discussion.

>> I think the last point is fake news. There is no dinner party tonight.

>> MENNO ETTEMA: We will need to clarify that in a minute with the Moderator.

>> MODERATOR: Thank you very much. It's not easy to summarize such a long discussion, but I'll try.

It helps, actually, that the Moderator structured the debate so well.

Anyway, starting from the description: it's important to define what we mean by fake news and what we don't.

There are fashionable terms, like asymmetric or hybrid warfare, and we should define some characteristics that make fake news really fake. One is that it is spread intentionally, in order to create confusion and discredit democratic societies, or to produce economic gain. But it was pointed out that bad journalism as such is not fake news, and that fake news is actually not news -- because plastic rice is not rice; it's plastic.

The second point was polarization: the relationship between polarization and fake news in European societies. The majority agreed that there is a relationship, that fake news feeds polarization. But at the same time the relationship is not a straightforward one; it is actually very complicated, and there is no direct causation either way. In any case, polarization creates a good market for fake news.

A categorization of fake news was also produced. There were roughly three types: state propaganda, the plots designed to counter it, and finally the people who try to make sense of it all and do so through rumors and gossip.

One of the potentially really bad results of all of this is that the status of truth is disrupted. It also undermines the concept of objectivity and journalism as a whole. People are emancipating themselves, because now they have the tools; and if they don't trust or like objectivity, they create their own narratives.

The paradox, as was pointed out, is that when people generate news which is false, and the media try to check it, the media then get accused of fake news.

It's certainly true that the information landscape is not vertical anymore. It's multi-layer and horizontal.

Now, to the normative aspects: fake news is not illegal as such. And the question on regulation brought a clear answer: 28 percent said yes, but 60 percent said no. In the audience there was a clear tendency against government regulation; it is not the way to solve this.

At the same time, it was pointed out that regulation can also happen as self regulation or co-regulation. And also, that the best regulation is education.

The big players were in focus here, in that there were calls to make them more accountable. It was pointed out that the platforms are already regulating, but perhaps more is needed: for instance, a media status for Facebook and other platforms. Instead of being just platforms without any responsibility, should they actually be more like media, or follow some sort of public trustee media model?

Coming to the prescriptive part: what should be done? The classic question, (speaking Russian) in Russian. I think the answer, both in the discussion and backed up by the figures here, was education for media literacy.

But instead of being just something you always throw out in a difficult situation, when there is a mess -- a call for more media education -- this time the whole thing should be brought to a new level, and perhaps really recognized as a right of citizens: as a political project that would, in a way, bring the media back to where they began, with fascination and political purpose.

Now, of course, we heard and saw an example of fake news as a weapon to silence and intimidate journalists and others who dare to oppose fake news.

Finally, the results of the vote here. Again, media literacy education was clearly the most favored method. The second was strengthening the work of journalism, or of journalists -- and, furthermore, training digital journalists, those who are pretty much digital natives, in the background of what has happened.

So I will try to formulate these into the messages of this session, if I have some more time.

Thanks.

>> MENNO ETTEMA: Thank you very much.

Because of time, I would simply like to thank you for the report. I want to highlight that a fuller report will go online on the wiki; you can check the report there. The recordings of the plenary are also available.

So I want to conclude by thanking everyone in the audience for your contributions. As I said, this was an explorative debate, and you added a lot to the discussion -- new insights that we had not even covered with the organising team.

And I would also like to thank the panelists: Minister Sven Mikser, Ambassador Corina Calugaru, Divina Frau-Meigs, and Jessikka Aro. Thank you very much for your contributions and putting your thoughts to the audience.

Thank you Rapporteurs and colleagues for making this happen. And I wish you good evening, and a good party tonight.

(Applause)


This text is being provided in a rough draft format. Communication Access Realtime Translation (CART) is provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings.