Talk:Best practices of self- and co-regulation of platforms towards a legal framework – WS 12 2021


draft concept note workshop 12 - version 11-4-2021

WORKSHOP 12, contributing to Focus Session 4:

Draft Concept Note

The advent of new digital media has amplified concerns about the spread of disinformation and hate online. To meet these challenges, the European Union has relied on a number of tools. Primarily, the Commission has introduced self-regulatory instruments such as codes of conduct and guidelines to nudge online platforms into adopting transparency and accountability mechanisms. The code of conduct on hate speech and the code of practice on disinformation are paradigmatic examples of this European strategy. In addition, the Recommendation on measures to effectively tackle illegal content online, adopted in 2018, can be considered the Commission's general approach to the challenges of content moderation.

Two years have passed since the launch of that strategy, and by June 2021 all the pieces are supposed to be operational: EDMO (the European Digital Media Observatory), six national observatories and many others. Alongside these initiatives, online platforms have put in place voluntary mechanisms as suggested by the codes of conduct. However, even if the monitoring of the codes of conduct has shown that the platforms' measures have contributed to mitigating the dissemination of disinformation and hate speech online, the decision on how to address these challenges remains in the hands of private entities, which determine how voluntary mechanisms to limit harmful content online are designed. The COVID-19 pandemic has shown that voluntary mechanisms have not been effective in reducing the spread of disinformation such as conspiracy theories or false information about health treatments.

This phase of self-regulation is likely coming to an end. In recent years, the Union has increasingly intervened in the framework of content moderation and platform governance. The Copyright Directive, the amendments to the Audiovisual Media Services Directive and the proposal for a Regulation on terrorist content online are just some examples that have enriched the Union's approach to online platforms and online speech. The Digital Services Act and the Digital Markets Act (but also the European Democracy Action Plan) will make the entire framework less reliant on self-regulation, moving towards hard regulation characterized by procedural safeguards based on transparency and accountability.

EuroDIG 2021 would be the perfect moment to take stock of these changes in European digital policy. Amid the growing proliferation of instruments addressing online speech and platform governance, this workshop aims to assess the self-regulation of disinformation and hate speech online and to examine how the European approach will evolve in the coming years. In particular, the workshop addresses three primary points:

  • the lessons learnt from self-regulation;
  • the status quo of online platforms' voluntary mechanisms;
  • the new phase of hard regulation and its relationship with existing self-regulation mechanisms.