Internet in the “post-truth” era? – Pl 02 2017

Programme overview 2017

PL 2: Internet in the “post-truth” era?


To follow the current discussion on this topic, see the discussion tab on the upper left side of this page.


Final title of the session: Please send the final title to wiki@eurodig.org. Do not edit the title of the page on the wiki on your own, as the link to your session may otherwise disappear.

Session teaser

The Internet has given a voice to the many – great! But who can now distinguish between ‘facts’, ‘opinions’ and ‘feelings’, and how?

Keywords

Post-truth, fake news, manipulation, emotional appeal, post-truth politics, accountability, legitimacy, freedom of expression, access to information, algorithms, profit, content creation, narratives, ethics, media literacy, response to hate speech, take down, education, alternative narratives, trusted flaggers, multi-stakeholder approach, code of conduct, democracy, gatekeepers

Session description

Deployment of Internet infrastructure is insufficient without media literacy education aimed at equipping (young) people with the competences to reflect critically and to deal with ambiguity and diversity of opinions, while respecting freedom of expression and all other human rights.

Propaganda, misleading information and factually wrong news have always existed, irrespective of the spread of the Internet. This raises the question: why has fake news become such a phenomenon now? Is it because the polarization of society (social aspect) pushes people into their comfort zones, leading to the creation of ‘filter bubbles’, or do ‘filter bubbles’ lead to polarization – which comes first? Does the business model of the dominant IT companies make fake news more appealing than real news? Why do some narratives (e.g. counter and alternative narratives to hate speech) and news (i.e. real news) receive less visibility than others? At the same time, the proliferation of Internet access has given rise to the mass dissemination of fake news online, adding a new dimension: fake news is rapidly reproduced on social media, mostly without editorial control and with limited ability for readers to verify content. With this in mind, the need for journalists to engage in fact checking before publishing news, so as to advance quality media, is all the more crucial.

Moreover, personalized algorithms that optimize for instant gratification by prioritizing information matching the world view of the user have reinforced the creation of ‘filter bubbles’, allowing fake news to spread with ease. Are such algorithms a symptom of a segregated society whose groups no longer communicate with each other? Is ‘bubble-piercing’, i.e. using similar algorithms to confront users with ‘the viewpoint of the other side’, an approach that is feasible, desirable and/or likely to be effective? Is flagging or removing fake news enough? As ‘filter bubbles’ become the norm, the reader’s exposure to news that could challenge fake news is limited. Creating fake social media accounts compromises freedom of expression, while publishing fake news as advertisements undermines journalistic ethics. It would thus not be an exaggeration to say that fake news, while not yet clearly defined and not always easy to spot, is a potential threat to democracy.
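
To make the mechanism concrete, the minimal Python sketch below illustrates how a ranking rule that favours items matching a user’s known interests produces a ‘filter bubble’ feed, and how a simple ‘bubble-piercing’ variant could deliberately mix in items from outside it. This is an illustrative toy model only; all function names, fields and scoring rules are hypothetical assumptions and do not describe any platform’s actual system.

# Toy model of personalised ranking and 'bubble-piercing'.
# Illustrative only: names, data structures and scoring rules are hypothetical,
# not a description of any real platform's algorithm.

def similarity(user_interests, item_topics):
    """Share of an item's topics that match the user's known interests."""
    if not item_topics:
        return 0.0
    return len(user_interests & item_topics) / len(item_topics)

def rank_for_engagement(user_interests, items):
    """'Filter bubble' ranking: items closest to the user's world view come first."""
    return sorted(items, key=lambda item: similarity(user_interests, item["topics"]), reverse=True)

def rank_with_bubble_piercing(user_interests, items, challenge_ratio=0.3):
    """Reserve a share of the feed for low-similarity items, so the reader
    is also confronted with the viewpoint of 'the other side'."""
    ranked = rank_for_engagement(user_interests, items)
    n_challenge = int(len(ranked) * challenge_ratio)
    if n_challenge == 0:
        return ranked
    comfortable = ranked[:-n_challenge]  # items that fit the user's bubble
    challenging = ranked[-n_challenge:]  # items furthest from it
    feed = []
    while comfortable or challenging:    # interleave the two groups
        if comfortable:
            feed.append(comfortable.pop(0))
        if challenging:
            feed.append(challenging.pop(0))
    return feed

if __name__ == "__main__":
    interests = {"politics:left", "climate"}
    items = [
        {"title": "Article A", "topics": {"politics:left", "climate"}},
        {"title": "Article B", "topics": {"politics:right"}},
        {"title": "Article C", "topics": {"climate", "economy"}},
        {"title": "Article D", "topics": {"politics:right", "economy"}},
    ]
    print([i["title"] for i in rank_for_engagement(interests, items)])        # bubble feed
    print([i["title"] for i in rank_with_bubble_piercing(interests, items)])  # pierced feed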

What is more, increased Internet access, mainly via mobile devices, does not by itself provide access to the full breadth of information on the Internet. As fake news also often contains elements of hate speech and justifies or reinforces hate narratives and conspiracy theories, media literacy education, critical thinking and human rights education are urgently needed, together with a coherent joint effort by all stakeholders to play their role in safeguarding and fostering democracy.

Format

Please try out new interactive formats. EuroDIG is about dialogue, not about statements, presentations and speeches. Workshops should not be organised as a small plenary.

Further reading

Links to relevant websites, declarations, books, documents. Please note we cannot offer web space, so only links to external resources are possible.

From Yrjö Länsipuro:

1. Reuters Institute (Nic Newman). A report on key developments in journalism, media and technology in 2016 focuses on fears about how changing technology is affecting the quality of information and the state of our democracy. http://reutersinstitute.politics.ox.ac.uk/sites/default/files/Journalism%2C%20Media%20and%20Technology%20Trends%20and%20Predictions%202017.pdf

2. Pew Center: Experts worry algorithms can also put too much control in the hands of corporations and governments, perpetuate bias, and create filter bubbles. http://www.pewinternet.org/2017/02/08/code-dependent-pros-and-cons-of-the-algorithm-age/

3. Meedia: So far there is no evidence that fake news poses a serious problem in Germany at all. Springer CEO Mathias Döpfner even argues that fake news has always existed. http://meedia.de/2017/01/17/fake-news-debatte-jetzt-widersprechen-die-nachrichten-praktiker-der-politik/

4. Deutsche Welle: Countering fake news while safeguarding free speech. As fighting fake news becomes a global priority, questions remain about what exactly fake news is, what can be done to stop it and how to do so without curbing free speech. http://www.dw.com/en/countering-fake-news-while-safeguarding-free-speech/a-37924278

5. University of Washington. Examining the Alternative Media Ecosystems through the Production of Alternative Narratives of Mass Shooting Events on Twitter http://faculty.washington.edu/kstarbi/Alt_Narratives_ICWSM17-CameraReady.pdf

6. Seattle Times. A story about the above http://www.seattletimes.com/seattle-news/politics/uw-professor-the-information-war-is-real-and-were-losing-it/

7. Vanity Fair. Fake News Is About to Get Even Scarier than You Ever Dreamed. What we saw in the 2016 election is nothing compared to what we need to prepare for in 2020. http://www.vanityfair.com/news/2017/01/fake-news-technology

8. New York Times. Europe Combats a New Foe of Political Stability: Fake News https://www.nytimes.com/2017/02/20/world/europe/europe-combats-a-new-foe-of-political-stability-fake-news.html?_r=0

9. Tow Center. The influence of social media platforms and technology companies is having a greater effect on American journalism than even the shift from print to digital. There is a rapid takeover of traditional publishers' roles by companies including Facebook, Snapchat, Google, and Twitter that shows no sign of slowing. http://towcenter.org/wp-content/uploads/2017/04/The_Platform_Press_Tow_Report_2017.pdf

10. Tow Center. Journalists today have an imperative—and an opportunity—to sift through the mass of content being created and shared in order to separate true from false, and to help the truth to spread. Unfortunately, as this paper details, that isn’t the current reality of how news organizations cover unverified claims, online rumors, and viral content. Lies spread much farther than the truth, and news organizations play a powerful role in making this happen. http://towcenter.org/wp-content/uploads/2015/02/LiesDamnLies_Silverman_TowCenter.pdf

11. New York Times Magazine (Farhad Manjoo). Mark Zuckerberg now acknowledges the dangerous side of the social revolution he helped start. But is the most powerful tool for connection in human history capable of adapting to the world it created? https://www.nytimes.com/2017/04/25/magazine/can-facebook-fix-its-own-worst-bug.html?_r=0

12. New York Times (Jonathan Taplin) Google has an 88 percent market share in search advertising, Facebook (and its subsidiaries Instagram, WhatsApp and Messenger) owns 77 percent of mobile social traffic and Amazon has a 74 percent share in the e-book market. In classic economic terms, all three are monopolies... Could it be that these companies — and Google in particular — have become natural monopolies by supplying an entire market’s demand for a service, at a price lower than what would be offered by two competing firms? And if so, is it time to regulate them like public utilities? https://www.nytimes.com/2017/04/22/opinion/sunday/is-it-time-to-break-up-google.html?_r=0

13. The Guardian (Evgeny Morozov) Moral panic over fake news hides the real enemy – the digital giants https://www.theguardian.com/commentisfree/2017/jan/08/blaming-fake-news-not-the-answer-democracy-crisis

14. Digiday: How Sweden is fighting fake news. http://digiday.com/uk/fake-news-in-sweden/

15. Washington Post: How do you stop fake news? In Germany, with a law. https://www.washingtonpost.com/world/europe/how-do-you-stop-fake-news-in-germany-with-a-law/2017/04/05/e6834ad6-1a08-11e7-bcc2-7d1a0973e7b2_story.html?tid=ss_fb&utm_term=.0a3b50c700c3

16. Nieman Lab. Wikipedia founder Jimmy Wales launches Wikitribune, a large-scale attempt to combat fake news. The crowd-funded news platform aims to combat fake news by combining professional journalism with volunteer fact checking: “news by the people and for the people.” http://www.niemanlab.org/2017/04/wikipedia-founder-jimmy-wales-launches-wikitribune-news-by-the-people-and-for-the-people/ and: https://www.wikitribune.com/

17. NewCo (Krishna Bharat). How to Detect Fake News in Real-Time. https://shift.newco.co/how-to-detect-fake-news-in-real-time-9fdae0197bfd

18. New York Times. A European front on fake news. https://www.nytimes.com/2017/05/01/business/europe-election-fake-news.html?_r=0


From Rachel Pollack:

19. UN, OSCE, OAS and ACHPR Special Rapporteurs on freedom of expression: Joint Declaration on Freedom of Expression and 'Fake News', Disinformation and Propaganda http://www.osce.org/fom/302796?download=true

20. UNESCO Colloquium: Journalism Under Fire http://en.unesco.org/journalism-under-fire

21. UNESCO: Campaign for World Press Freedom Day (video) https://www.youtube.com/watch?v=2fPm7d_RxeU

People

Please provide name and institution for all people you list here.

  • Focal Point Menno Ettema and Irina Drexler, Council of Europe – No Hate Speech Movement

Focal Points take responsibility for and lead the organisation of the session. Focal Points are kindly requested to observe EuroDIG's session principles. Focal Points work in close cooperation with the respective Subject Matter Expert (SME) and the EuroDIG Secretariat.

  • Subject Matter Expert (SME): Yrjö Länsipuro (ISOC Finland)

SMEs are responsible for clustering submissions into a thematic category they have expertise in. They define subtopics and identify submissions which fall under each subtopic. The aim is to verify which submissions can be merged into one session. In the course of the session organising process, SMEs will serve as a mentor for the respective category by supporting all Focal Points.

  • Key Participants (for workshop) or Panellists (for plenary)

Until 15 May 2017. Key Participants (workshop) are experts willing to provide their knowledge during a session – not necessarily on stage. Key Participants should contribute to the session planning process and keep statements short and punchy during the session. Panellists (plenary) will be selected and assigned by the org team, ensuring a stakeholder-balanced dialogue that also considers gender and geographical balance. Panellists should contribute to the session planning process and keep statements short and punchy during the session.

1. Sven Mikser – Minister of Foreign Affairs, Republic of Estonia [1]

  • Moderator

Until 15 May 2017. The moderator is the facilitator of the session at the event. Moderators are responsible for including the audience and encouraging a lively interaction among all session attendants. Please make sure the moderator takes a neutral role and can strike a balance between all speakers. Please provide a short CV of the moderator of your session on the wiki or a link to another source.

  • Remote Moderator

Until 15 May 2017. The Remote Moderator is in charge of facilitating participation via digital channels such as WebEx and social media (Twitter, Facebook). Remote Moderators monitor and moderate the social media channels and the participants via WebEx, and forward questions to the session moderator. Please contact the EuroDIG secretariat if you need help to find a remote moderator.

  • Organising Team (Org Team)

The Org Team is a group of people shaping the session. Org Teams are open and every interested individual can become a member.

1. Rachel Pollack – UNESCO, France

2. Osman Coskunoglu, PhD – Alternative Informatics Association, Turkey

3. Ansgar Koene, PhD – Horizon Digital Economy Research Institute, University of Nottingham, United Kingdom

4. Narine Khachatryan – Safer Internet, Armenia

5. Daniel Waugh – Vice President for Campaigns, Union of Students in Ireland, Ireland

  • Reporter

Until 15 May 2017. The Reporter takes notes during the session and formulates 3 (max. 5) bullet points at the end of each session that:

  • are summarised on a slide and presented to the audience at the end of each session
  • relate to the particular session and to European Internet governance policy
  • are forward looking and propose goals and activities that can be initiated after EuroDIG (recommendations)
  • are in (rough) consensus with the audience
  • are to be submitted to the secretariat within 48 hours after the session took place

Please provide a short CV of the reporter of your session on the wiki or a link to another source, and contact the EuroDIG secretariat if you need help to find a reporter.

Current discussion, conference calls, schedules and minutes

See the discussion tab on the upper left side of this page. Please use this page to publish:

  • dates for virtual meetings or coordination calls
  • short summary of calls or email exchange

Please be as open and transparent as possible in order to allow others to get involved and contact you. Use the wiki not only as the place to publish results but also to summarize the discussion process.

Contact

Get in contact with the Org Team by sending an email.

Video record

Will be provided here after the event.

Transcript

Will be provided here after the event.

Messages

Please provide a short summary of the outcome of your session. Bullet points are fine.