Internet in the “post-truth” era? – Pl 02 2017

Revision as of 14:29, 31 May 2017

6 June 2017 | 17:00 - 18:30 | Grand Ballroom, Swissotel, Tallinn, Estonia | remote participation | live streaming
Programme overview wiki | Programme overview EuroDIG web site

Session teaser

The internet has given voice to the many – great! But how, and by whom, can ‘facts’, ‘opinions’ and ‘feelings’ now be distinguished?

Keywords

Post-truth, fake news, manipulation, emotional appeal, post-truth politics, accountability, legitimacy, freedom of expression, access to information, filter bubble, algorithms, profit, business model, content creation, narratives, ethics, media literacy education, response to hate speech, take down, education, alternative narratives, trusted flaggers, multi-stakeholder approach, regulation, code of conduct, democracy, gatekeepers

Session description

Deployment of Internet infrastructure is insufficient without media literacy education, aimed at equipping (young) people with the competences to reflect critically and to deal with ambiguity and diversity of opinions, with respect for freedom of expression and all other human rights.

Propaganda, misleading information and factually wrong news have always existed, irrespective of the spread of the Internet. Therefore, one question arises: why has fake news become such a phenomenon now? Is it because the polarization of society (social aspect) pushes people into their comfort zones, leading to the creation of ‘filter bubbles’, or do ‘filter bubbles’ lead to polarization – which comes first? Is the business model of the dominant IT companies making fake news more appealing than real news? Why do some narratives (e.g. counter and alternative narratives to hate speech) and news (i.e. real news) receive less visibility than others? At the same time, the proliferation of Internet access has given rise to the phenomenon of mass dissemination of fake news online, adding a new dimension: fake news is rapidly reproduced on social media, mostly without editorial control and with limited ability for readers to verify content. With this in mind, the need for journalists to get involved in fact checking before publishing news, so as to advance quality media, is all the more crucial.

Moreover, personalized algorithms which optimize instant gratification by prioritizing information that matches the world view of the user have reinforced the creation of ‘filter bubbles’, allowing fake news to spread with ease. Are such algorithms a symptom of a segregated society whose groups no longer communicate with each other? Is ‘bubble-piercing’ through similar algorithms, to confront users with ‘the view point of the other side’, an approach that is feasible, desirable and/or likely to be effective? Is flagging or removing fake news enough? As ‘filter bubbles’ become the norm, the exposure of readers to news that can challenge fake news is limited. Creating fake social media accounts compromises freedom of expression, while publishing fake news as advertisements undermines journalistic ethics. It would thus not be an exaggeration to say that fake news, while not yet being clearly defined and not always easy to spot, is a potential threat to democracy.

What is more, increased Internet access mainly via mobile devices does not provide complete access to the breadth of Internet/information. As fake news also often contains elements of hate speech and justifies or reinforces hate narratives and conspiracy theories, media literacy education, critical thinking and human rights education are urgently needed, ensuring a coherent joint effort of all stakeholders to play their role in safeguarding and fostering democracy.

Format

The plenary session will explore the issue ‘Internet in the Post-Truth Era’ in three phases:

a) Descriptive: How does ‘fake news’ express itself across Europe today, what are the consequences and why is this a concern for pluralistic democracy and human rights?

b) Normative: Can ‘fake news’ be defined, who is responsible to address it and on what grounds? Are there parallels to be drawn with the search for an effective approach to addressing hate speech in Europe?

c) Prescriptive: What approaches to combatting fake news and hate speech should be considered good practices that need to be strengthened? In particular, what role is there for education, journalists and internet businesses to play, for example by strengthening human rights and Internet media literacy education, improving the diversity and credibility of media distribution online, and adapting algorithms to broaden filter bubbles?

Each phase will be introduced with a 3-minute statement by a panellist. The introduction is followed by a discussion with the audience and the other panellists. Interaction between members of the audience will be encouraged.

To strengthen audience engagement during phases b) and c), a quick check of the audience's opinions on the issue will be conducted through a statement and an (online) 'voting' tool.

Further reading

Links to relevant websites, declarations, books, documents. Please note we cannot offer web space, so only links to external resources are possible.

→ From Yrjö Länsipuro:

1. Reuters Institute (Nic Newman) A report on key developments in journalism, media and technology in 2016 focuses on fears about how changing technology is affecting the quality of information and the state of our democracy. http://reutersinstitute.politics.ox.ac.uk/sites/default/files/Journalism%2C%20Media%20and%20Technology%20Trends%20and%20Predictions%202017.pdf

2. Pew Center: Experts worry algorithms can also put too much control in the hands of corporations and governments, perpetuate bias, create filter bubbles. http://www.pewinternet.org/2017/02/08/code-dependent-pros-and-cons-of-the-algorithm-age/

3. Meedia: So far there is no evidence that fake news poses a serious problem in Germany at all. Springer CEO Mathias Döpfner even argues that fake news has always existed. http://meedia.de/2017/01/17/fake-news-debatte-jetzt-widersprechen-die-nachrichten-praktiker-der-politik/

4. Deutsche Welle: Countering fake news while safeguarding free speech. As fighting fake news becomes a global priority, questions remain about what exactly fake news is, what can be done to stop it and how to do so without curbing free speech. http://www.dw.com/en/countering-fake-news-while-safeguarding-free-speech/a-37924278

5. University of Washington. Examining the Alternative Media Ecosystems through the Production of Alternative Narratives of Mass Shooting Events on Twitter http://faculty.washington.edu/kstarbi/Alt_Narratives_ICWSM17-CameraReady.pdf

6. Seattle Times. A story about the above http://www.seattletimes.com/seattle-news/politics/uw-professor-the-information-war-is-real-and-were-losing-it/

7. Vanity Fair. Fake News Is About to Get Even Scarier than You Ever Dreamed. What we saw in the 2016 election is nothing compared to what we need to prepare for in 2020. http://www.vanityfair.com/news/2017/01/fake-news-technology

8. New York Times. Europe Combats a New Foe of Political Stability: Fake News https://www.nytimes.com/2017/02/20/world/europe/europe-combats-a-new-foe-of-political-stability-fake-news.html?_r=0

9. Tow Center. The influence of social media platforms and technology companies is having a greater effect on American journalism than even the shift from print to digital. There is a rapid takeover of traditional publishers' roles by companies including Facebook, Snapchat, Google, and Twitter that shows no sign of slowing. http://towcenter.org/wp-content/uploads/2017/04/The_Platform_Press_Tow_Report_2017.pdf

10. Tow Center. Journalists today have an imperative—and an opportunity—to sift through the mass of content being created and shared in order to separate true from false, and to help the truth to spread. Unfortunately, as this paper details, that isn’t the current reality of how news organizations cover unverified claims, online rumors, and viral content. Lies spread much farther than the truth, and news organizations play a powerful role in making this happen. http://towcenter.org/wp-content/uploads/2015/02/LiesDamnLies_Silverman_TowCenter.pdf

11. New York Times Magazine (Farhad Manjoo) Mark Zuckerberg now acknowledges the dangerous side of the social revolution he helped start. But is the most powerful tool for connection in human history capable of adapting to the world it created? https://www.nytimes.com/2017/04/25/magazine/can-facebook-fix-its-own-worst-bug.html?_r=0

12. New York Times (Jonathan Taplin) Google has an 88 percent market share in search advertising, Facebook (and its subsidiaries Instagram, WhatsApp and Messenger) owns 77 percent of mobile social traffic and Amazon has a 74 percent share in the e-book market. In classic economic terms, all three are monopolies... Could it be that these companies — and Google in particular — have become natural monopolies by supplying an entire market’s demand for a service, at a price lower than what would be offered by two competing firms? And if so, is it time to regulate them like public utilities? https://www.nytimes.com/2017/04/22/opinion/sunday/is-it-time-to-break-up-google.html?_r=0

13. The Guardian (Evgeny Morozov) Moral panic over fake news hides the real enemy – the digital giants https://www.theguardian.com/commentisfree/2017/jan/08/blaming-fake-news-not-the-answer-democracy-crisis

14. Digitoday: How Sweden is fighting fake news http://digiday.com/uk/fake-news-in-sweden/

15. Washington Post: How do you stop fake news? In Germany, with a law. https://www.washingtonpost.com/world/europe/how-do-you-stop-fake-news-in-germany-with-a-law/2017/04/05/e6834ad6-1a08-11e7-bcc2-7d1a0973e7b2_story.html?tid=ss_fb&utm_term=.0a3b50c700c3

16. Nieman Lab. Wikipedia founder Jimmy Wales launches Wikitribune, a large-scale attempt to combat fake news. The crowd-funded news platform aims to combat fake news by combining professional journalism with volunteer fact checking: “news by the people and for the people.” http://www.niemanlab.org/2017/04/wikipedia-founder-jimmy-wales-launches-wikitribune-news-by-the-people-and-for-the-people/ and: https://www.wikitribune.com/

17. NewCo (Krishna Bharat) How to Detect Fake News in Real-Time https://shift.newco.co/how-to-detect-fake-news-in-real-time-9fdae0197bfd

18. New York Times: A European front on fake news. https://www.nytimes.com/2017/05/01/business/europe-election-fake-news.html?_r=0


→ From Rachel Pollack:

19. UN, OSCE, OAS and ACHPR Special Rapporteurs on freedom of expression: Joint Declaration on Freedom of Expression and 'Fake News', Disinformation and Propaganda http://www.osce.org/fom/302796?download=true

20. UNESCO Colloquium: Journalism Under Fire http://en.unesco.org/journalism-under-fire

21. UNESCO: Campaign for World Press Freedom Day (video) https://www.youtube.com/watch?v=2fPm7d_RxeU


→ From Professor Osman Coşkunoğlu, PhD:

A scientifically appropriate and functionally useful approach to the fake news issue has to deal with the three aspects, namely, descriptive (describing the problem and its source), normative (setting norms and definitions), and prescriptive (approaches and techniques to solve the problem). The references below are categorized along these three aspects.


DESCRIPTIVE: Three sources of the problem are human fallibilities, polarized society, and poor or biased algorithms

Human

Herbert Simon received the Nobel Memorial Prize in Economic Sciences (1978) for his bounded-rationality theory, but the seminal studies showing human biases and reasoning fallibilities were done collaboratively by psychology professors Daniel Kahneman and Amos Tversky. Their initial work was summarized in two books: Judgment Under Uncertainty: Heuristics and Biases (1982) and Choices, Values, and Frames (2000). After receiving the Nobel Memorial Prize in Economic Sciences (2002), Kahneman (Tversky had died in 1996) wrote the international bestseller Thinking, Fast and Slow (2011). The book explains, for the general reader, his and Tversky’s findings on human cognitive biases and limitations.

As fake news has recently become center of attention, articles about human fallibility have been appearing in reputable media:

22. The New Yorker: Why Facts Don't Change Our Minds http://www.newyorker.com/magazine/2017/02/27/why-facts-dont-change-our-minds (27 February 2017)

23. The New York Times: Why People Continue to Believe Objectively False Things: https://www.nytimes.com/2017/03/22/upshot/why-objectively-false-things-continue-to-be-believed.html?_r=1 (22 March 2017)

24. Slate: Our Puny Human Brains Are Terrible at Thinking About the Future http://www.slate.com/articles/technology/future_tense/2017/04/why_people_are_so_bad_at_thinking_about_the_future.html (13 April 2017)

There seems to be a consensus that lies which play to individuals’ prejudices and beliefs are more easily believed, and are passed on thoughtlessly or for a feel-good effect.

25. Lotto, Beau: "Deviate" (2017) Professor of Neuroscience at University College London, Beau Lotto, in his very recent book (25 April 2017), entitled "Deviate", argues that all experience occurs in the brain – and that the brain is deluded. A review of the book: https://www.theguardian.com/books/2017/apr/22/deviate-the-science-of-seeing-differently-by-beau-lotto-review


Society

Eli Pariser’s bestselling 2012 book "The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think" is being hailed anew as an explanation of the polarization in society. For example, a 2016 Wired magazine article:

26. Wired: Your Filter Bubble is Destroying Democracy (2016) https://www.wired.com/2016/11/filter-bubble-destroying-democracy/

However, economists at Brown University and Stanford University recently (March 2017) reported that “…the growth in polarization in recent years is largest for the demographic groups least likely to use the internet and social media…[Our findings] argue against the hypothesis that the internet is a primary driver of rising political polarization.”

27. Brown University and Stanford University: Is the Internet Causing Political Polarization? Evidence From Demographics (2017) https://www.brown.edu/Research/Shapiro/pdfs/age-polars.pdf


Algorithms

28. Domingos, Pedro: "The Master Algorithm" (2016) https://lareviewofbooks.org/article/algorithms-future-already-happened/ The 2016 book by Pedro Domingos – a machine learning researcher – on how algorithms can warp reality. What is to be done? An informed, active citizenry. The link is an excellent review of the book and a critique of the solution Domingos suggests.

29. Pew Research Center: “Code-Dependent: Pros and Cons of the Algorithm Age” (February 2017): http://www.pewinternet.org/2017/02/08/code-dependent-pros-and-cons-of-the-algorithm-age A very comprehensive – though non-random – expert survey by Pew.


NORMATIVE

30. The Guardian: Defining Fake News Will Help Us Expose It (12 May 2017) https://www.theguardian.com/media/commentisfree/2017/may/12/defining-fake-news-will-help-us-expose-it The Guardian’s readers’ editor Paul Chadwick differentiates fake news from flawed journalism and proposes a draft definition of it.

31. Slate: Stop Calling Everything 'Fake News' (6 December 2016) http://www.slate.com/articles/technology/technology/2016/12/stop_calling_everything_fake_news.html Towards a better characterization of fake news.


PRESCRIPTIVE

Criticized for its news feed during the 2016 US presidential election, Facebook had already taken several steps to fight fake news by the end of 2016:

32. Slate: Facebook's Cautious, Sensible Plan to Fight Fake News http://www.slate.com/blogs/future_tense/2016/12/15/facebook_s_plan_to_fight_fake_news_is_sensibly_cautious.html In addition to some changes in the news feed algorithm boosting authentic news, Facebook relies on user participation to flag fake news, which is then reviewed by independent fact-checkers.

33. Slate: Only You Can Stop the Spread of Fake News http://www.slate.com/articles/technology/technology/2016/12/introducing_this_is_fake_slate_s_tool_for_stopping_fake_news_on_facebook.html Digital magazine Slate also relies on user participation (by a Chrome extension) in a similar fashion.

However, Facebook’s approach (a user flags an article; if two or more fact-checkers agree, a “disputed” tag is placed) may not work:

34. The Guardian: Facebook Promised to Tackle Fake News. But the Evidence Shows It's Not Working (16 May 2017) https://www.theguardian.com/technology/2017/may/16/facebook-fake-news-tools-not-working

35. The New York Times: Fact-Checking Can Change Views? We Rate That as Mostly True (5 November 2016) https://www.nytimes.com/2016/11/06/upshot/fact-checking-can-change-views-we-rate-that-as-mostly-true.html Dartmouth professor Brendan Nyhan reports that his research shows correcting people’s false beliefs can be ineffective or, worse, make them cling to their views even more strongly; however, other research he has done suggests that fact-checking can be effective.

36. The Guardian: How To Solve Facebook's Fake News Problem: Experts Pitch Their Ideas (29 November 2016) https://www.theguardian.com/technology/2016/nov/29/facebook-fake-news-problem-experts-pitch-ideas-algorithms A cadre of technologists, academics and media experts are thinking up solutions, from hiring human editors, to crowdsourcing or creating algorithms.

37. Bloomberg View: Closing the Safe Harbor for Libelous Fake News (16 December 2016) https://www.bloomberg.com/view/articles/2016-12-16/free-speech-libel-and-the-truth-after-pizzagate According to Harvard professor Noah Feldman, a legal remedy for victims of fake news that would be consistent with the First Amendment is complicated but possible.

38. Wadhwa, Vivek: What Google and Facebook Must Do About One of Their Biggest Problems (12 April 2017) http://wadhwa.com/2017/04/12/google-facebook-must-one-biggest-problems/ and http://digitalethics.net/what-google-and-facebook-must-do-about-one-of-their-biggest-problems-vivek-wadhwa Recognizing that there is no quick fix, distinguished scholar Vivek Wadhwa has some structural recommendations not only to Google and Facebook but also to individuals

39. Data & Society: Points: Did Media Literacy Backfire? (5 January 2017) https://points.datasociety.net/did-media-literacy-backfire-7418c084d88d danah boyd argues that neither educational approaches nor self-regulation of information intermediaries will work. She concludes that “…if we want to address issues like propaganda, hate speech, fake news, and biased content, we need to focus on the underlying issues at play. No simple band-aid will work.”

40. Slate: It's Time to Give Up on Facts (8 February 2017) http://www.slate.com/articles/health_and_science/science/2017/02/counter_lies_with_emotions_not_facts.html In the meantime, a very different prescription: “It’s Time to Give Up on Facts – or at least to temporarily lay them down in favor of a more useful weapon: emotions.” (1,600 comments and 22,000 shares on social media)


GENERAL AND RELATED

41. Carr, Nicholas: “The Manipulators: Facebook’s Social Engineering Project” (2014) https://lareviewofbooks.org/article/manipulators-facebooks-social-engineering-project Nicholas Carr explains why 2014 marked “…a new stage in the net’s history — one in which the public will be called upon to guide the technology, rather than the other way around. We may look back on 2014 as the year the internet began to grow up.”

42. The Guardian: How Technology Disrupted the Truth https://www.theguardian.com/media/2016/jul/12/how-technology-disrupted-the-truth A fairly comprehensive analysis of the issue – with 1,584 comments – by Katharine Viner, editor-in-chief of the Guardian, concluding with a call for civic participation: “engaging with people as civic actors, citizens, equals. It is about holding power to account, fighting for a public space, and taking responsibility for creating the kind of world we want to live in.”

People

Please provide name and institution for all people you list here.

  • Focal Point

Focal Points take over the responsibility and lead of the session organisation. Focal Points are kindly requested to observe EuroDIG's session principles. Focal Points work in close cooperation with the respective Subject Matter Expert (SME) and the EuroDIG Secretariat.

1. Menno Ettema and Irina Drexler, Council of Europe – No Hate Speech Movement

  • Subject Matter Expert (SME):

SMEs are responsible for the clustering of submissions into a thematic category they have an expertise in. They define subtopics and identify submissions which fall under this subtopic. The aim is to verify submissions which can be merged in one session. In the course of the session organising process SMEs will serve as a mentor for the respective category by supporting all Focal Points.

1. Yrjö Länsipuro – ISOC Finland

  • Key Participants (for workshop) or Panellists (for plenary)

Key Participants (workshop) are experts willing to provide their knowledge during a session – not necessarily on stage. Key Participants should contribute to the session planning process and keep statements short and punchy during the session. Panellists (plenary) will be selected and assigned by the org team, ensuring a stakeholder-balanced dialogue that also considers gender and geographical balance. Panellists should contribute to the session planning process and keep statements short and punchy during the session.

1. Sven Mikser – Minister of Foreign Affairs, Republic of Estonia [1]

2. Divina Frau-Meigs – President and professor at Université Sorbonne Nouvelle (Paris III), media sociologist, researcher on media and information education, expert at UNESCO, the European Union and the Council of Europe

3. Jessikka Aro – award-winning investigative reporter with the Finnish Broadcasting Company's social media project Yle Kioski. Aro specializes in Russia, extremism and information warfare. In 2014 Aro became the target of a serious and still-ongoing international propaganda and hate speech campaign after starting a series of articles about the influence of pro-Kremlin social media trolls. Aro is currently writing an investigative book about the Kremlin's information warfare and trains reporters and the general public to recognize and counter online disinformation. Aro is also lobbying for better legislation to counter hybrid threats and protect citizens from state-sponsored online security threats. https://link.springer.com/article/10.1007/s12290-016-0395-5

  • Moderator

The moderator is the facilitator of the session at the event. Moderators are responsible for including the audience and encouraging a lively interaction among all session attendants. Please make sure the moderator takes a neutral role and can balance between all speakers. Please provide short CV of the moderator of your session at the Wiki or link to another source.

1. Menno Ettema – Council of Europe (No Hate Speech Movement coordinator)

Mr. Menno ETTEMA is the European coordinator of the No Hate Speech Movement, a youth campaign of the Council of Europe mobilizing young people for human rights online and freedom of speech, and against hate speech. He works closely with the national campaign committees, European partners and online activists on the implementation of the campaign, through the campaign website nohatespeechmovement.org, Twitter and Facebook, European trainings, seminars and various awareness-raising and training activities at national level. Mr. Ettema previously worked as an educational advisor for the Youth Department of the Council of Europe, supporting European youth organizations in developing their human rights education programmes. He has coordinated intercultural dialogue and peace-building projects, as well as projects promoting access to human rights for young refugees and youth with disabilities. Before joining the Council of Europe he worked with youth peace organisations in the Middle East and coordinated the international secretariat of the United Network of Young Peacebuilders. Mr. Ettema studied Psychology of Culture and Religion at the University of Nijmegen, the Netherlands.

  • Remote Moderator

The Remote Moderator is in charge of facilitating participation via digital channels such as WebEx and social media (Twitter, Facebook). Remote Moderators monitor and moderate the social media channels and the participants via WebEx, and forward questions to the session moderator.

1. Alessandra Coppola – President of APICE

  • Organising Team (Org Team)

The Org Team is a group of people shaping the session. Org Teams are open and every interested individual can become a member.

1. Rachel Pollack – UNESCO, France

2. Osman Coskunoglu, PhD – Alternative Informatics Association, Turkey

3. Ansgar Koene, PhD – Horizon Digital Economy Research Institute, University of Nottingham, United Kingdom

4. Narine Khachatryan – Safer Internet, Armenia

5. Xianhong Xu, PhD – UNESCO, France

6. Daniel Waugh – Union of Students in Ireland

7. Giacomo Mazzone – European Broadcasting Union

  • Reporter

The Reporter takes notes during the session and formulates 3 (max. 5) bullet points at the end of each session that:

  • are summarised on a slide and presented to the audience at the end of each session
  • relate to the particular session and to European Internet governance policy
  • are forward looking and propose goals and activities that can be initiated after EuroDIG (recommendations)
  • are in (rough) consensus with the audience
  • are to be submitted to the secretariat within 48 hours after the session took place

Please provide short CV of the reporter of your session at the Wiki or link to another source.

1. Yrjö Länsipuro – ISOC Finland https://www.linkedin.com/in/yrj%C3%B6-l%C3%A4nsipuro-91517718/

2. Irina Drexler, PhD – No Hate Speech Movement in Romania https://www.linkedin.com/in/irinadrexler/

Current discussion, conference calls, schedules and minutes

See the discussion tab on the upper left side of this page.

Contact

Get in contact with the Org Team by sending an email.

Video record

Will be provided here after the event.

Transcript

Will be provided here after the event.

Messages

Please provide a short summary from the outcome of your session. Bullet points are fine.