Social media – opportunities, rights and responsibilities – WS 06 2020
11 June 2020 | 14:30-16:00 | Studio Trieste | [[image:Icons_live_20px.png | Video recording | link=https://youtu.be/7d76NQKVq0c?t=10456]] | [[image:Icon_transcript_20px.png | Transcript | link=Social media – opportunities, rights and responsibilities – WS 06 2020#Transcript]] | [[image:Icons_forum_20px.png | Forum | link=https://www.eurodig.org/?id=821]]<br />
[[Consolidated_programme_2020#day-1|'''Consolidated programme 2020 overview / Day 1''']]<br /><br />
Proposals: [[EuroDIG proposals 2020#prop_20|#20]], [[EuroDIG proposals 2020#prop_27|#27]], [[EuroDIG proposals 2020#prop_56|#56]], [[EuroDIG proposals 2020#prop_77|#77]], [[EuroDIG proposals 2020#prop_90|#90]], [[EuroDIG proposals 2020#prop_99|#99]], [[EuroDIG proposals 2020#prop_108|#108]], [[EuroDIG proposals 2020#prop_112|#112]], [[EuroDIG proposals 2020#prop_125|#125]], [[EuroDIG proposals 2020#prop_139|#139]], [[EuroDIG proposals 2020#prop_167|#167]] ([[EuroDIG proposals 2020#prop_5|#5]], [[EuroDIG proposals 2020#prop_12|#12]], [[EuroDIG proposals 2020#prop_16|#16]], [[EuroDIG proposals 2020#prop_18|#18]], [[EuroDIG proposals 2020#prop_28|#28]], [[EuroDIG proposals 2020#prop_34|#34]], [[EuroDIG proposals 2020#prop_186|#186]])<br /><br />


*https://www.coe.int/en/web/media-freedom/detail-alert?p_p_id=sojdashboard_WAR_coesojportlet&p_p_lifecycle=0&p_p_col_id=column-4&p_p_col_pos=2&p_p_col_count=3&_sojdashboard_WAR_coesojportlet_alertId=50799770
*https://script-ed.org/article/twenty-years-of-intermediary-immunity-the-us-experience/
*https://www.cato.org/blog/trumps-social-media-order-rewrites-internet-law-decree
*https://ec.europa.eu/digital-single-market/en/news/consultation-digital-services-act-package
*https://www.pewresearch.org/internet/2017/10/19/the-future-of-truth-and-misinformation-online/

== People ==
'''Focal Point'''  
*Sabrina Vorbau


*'''Liz Corbin, Deputy Media Director and Head of News for the European Broadcasting Union (EBU)'''
Liz Corbin is the Deputy Director of Media and Head of News at the European Broadcasting Union (EBU). She oversees the Eurovision News and Sports Exchange, the Social Newswire, the News Events broadcast services team and Radio News. Prior to joining the EBU in January 2020, Liz worked at the BBC for 18 years, most recently as Head of News at the international channel BBC World News. Previous roles include Editor of BBC Reality Check, where she oversaw a large expansion of the team, and Singapore Bureau Editor for four years.


*'''Charlotte Altenhöner-Dion, Council of Europe, Head of Internet Governance Unit and Secretary to the Expert Committee on Freedom of Expression and Digital Technologies (MSI-DIG)'''
*'''Paolo Cesarini, Head of Unit Media Convergence & Social Media, DG CONNECT, European Commission (EC)'''
Paolo Cesarini has been head of the unit responsible for media convergence and social media policy at the European Commission, DG Communication Networks, Content and Technology, since 2017. He previously held other management positions in the Commission, including in DG Competition. He also worked as a member of the legal service at the International Labour Organization in Geneva, Switzerland, and as a researcher at the Institute for Public and International Law of Siena University, Italy. He has taught EU competition law as a visiting professor at Siena University and as a lecturer at Montpellier University, France. He obtained a master's degree in international law at Siena University and an LLM in EU law at the College of Europe, Belgium.
*'''Guido Bülow, Head of News Partnerships for Facebook'''
Guido Bülow has been working as Head of News Partnerships for Facebook in Central Europe since September 2015. Since March 2019 he has been responsible for Strategic Initiatives in EMEA which includes the Third-Party Fact-Checking program. Before joining Facebook, he was head of social media at SWR (ARD). Guido started his career with bigFM, where he volunteered after completing his studies and then worked as a marketing manager.


'''Moderator'''


'''Reporter'''
*Katarina Andjelkovic, [https://www.giplatform.org/ Geneva Internet Platform]
The Reporter takes notes during the session and formulates 3 (max. 5) bullet points at the end of each session that:
*are summarised on a slide and presented to the audience at the end of each session
*relate to the particular session and to European Internet governance policy
*are forward looking and propose goals and activities that can be initiated after EuroDIG (recommendations)
*are in (rough) consensus with the audience


== Current discussion, conference calls, schedules and minutes ==


== Messages ==   
*Multistakeholder involvement (i.e. the involvement of those directly concerned and impacted by misinformation) is of utmost importance in fighting misinformation. There is also a need for infrastructure to organise fact checking and research activities that would be available in all EU languages and would, therefore, benefit all EU countries.
*High-quality trusted news is the best antidote to fake news. To achieve that, there is a need for more reliable funding for public service journalism on one hand, and the protection of the freedom of the press by national authorities on the other. Crises such as the COVID-19 pandemic should not be an excuse for governments to restrict freedom of expression.
*Media literacy is crucial in fighting misinformation. It is very important to educate and empower people to spot misinformation and make informed decisions on whom to trust.
*In order to regulate all platforms in a uniform manner, there is a need for a more comprehensive reflection on how to construct a ‘regulatory backstop’ that creates more uniformity, more instruments with appropriate oversight mechanisms and, where necessary, sanctions.
 
 
Find an independent report of the session from the Geneva Internet Platform Digital Watch Observatory at https://dig.watch/resources/social-media-opportunities-rights-and-responsibilities.


== Video record ==
https://youtu.be/7d76NQKVq0c?t=10456


== Transcript ==
Provided by: Caption First, Inc., P.O. Box 3066, Monument, CO 80132, Phone: +001-719-482-9835, www.captionfirst.com
 
 
 
''This text, document, or file is based on live transcription. Communication Access Realtime Translation (CART), captioning, and/or live transcription are provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings. This text, document, or file is not to be distributed or used in any way that may violate copyright law.''
 
 
 
>> ROBERTO GAETANO: Hello, everybody. We’re starting the afternoon session and my name is Roberto Gaetano. The session that we are going to have is the first session this afternoon at Studio Trieste, Social Media ‑ Opportunities, Rights and Responsibilities. Before giving the floor to the moderators, we’ll read the code of conduct and remind everybody what it is. Basically, you need to state your full name when entering the room. Raise hands using the Zoom function to ask a question; when speaking, switch on the video and state your name and affiliation. Do not share the links to the Zoom meetings, not even with your colleagues.
 
Let me now give the floor to Sabrina Vorbau, who is the moderator of this session.
 
>> SABRINA VORBAU: Thank you very much, Roberto.
 
Hello, everyone. Good afternoon from Brussels. Welcome to this EuroDIG session on Social Media ‑ Opportunities, Rights and Responsibilities. I’m Sabrina Vorbau, I’m a Project Manager at European Schoolnet, and I will be moderating the session with Abhilash Nair, who is Senior Lecturer in Internet Law at Aston Law School. Before we dive into the content, some practicalities: we encourage all participants on this call to please use the chat throughout the session as much as possible for your comments and questions. We’ll make sure we leave enough room at the end of the session for your questions and comments and to foster this dialogue. We do have to keep 5 minutes at the end of our workshop because, in the tradition of EuroDIG, we have a colleague from the Geneva Internet Platform with us, Katarina Andjelkovic, who will read out the key messages, the main takeaways of our discussions today, which will later be published on the EuroDIG website.
 
Without further ado, I’m happy to kick off a session that couldn’t come at a better time. Given the pandemic, with many citizens confined at home for weeks and months, the internet, and especially social media platforms, provided a lifeline for many people to stay connected with their communities. While this is obviously a great opportunity, it also brings even greater responsibilities, and not only during COVID‑19: when we look back at past events, social events, economic‑related events, political events, very often in these moments social media is misused, and on certain occasions we see an increase in the disinformation circulating on the platforms. This is why I’m very happy that today I’m joined by a very great panel, a panel of experts who work around the topic of disinformation and work hard to combat disinformation on the internet.
 
Let me briefly introduce you to our panelists. We have with us Nertil Berdufi, and we have Tanja Pavleska. Today we also have Liz Corbin with us, Deputy Media Director and Head of News at the European Broadcasting Union. We have Charlotte Altenhöner‑Dion with us from the Council of Europe, Head of Internet Governance Unit and Secretary to the Expert Committee on Freedom of Expression and Digital Technologies; we have Paolo Cesarini, Head of Unit Media Convergence & Social Media, DG CONNECT, European Commission; and Guido Bülow, Head of News Partnerships for Facebook.
 
Thank you very much to all our panelists for taking the time to be with us this afternoon to discuss this very important and timely issue. I don’t want to lose any more time, so let’s dive into the conversation. I’m happy to hand the floor over to Tanja Pavleska to kick us off with the discussion. As said, she’s coming from the academic field and she has been researching the topic of dis‑ and misinformation for years, and I’m delighted for her to kick us off with some insights from her latest research. Tanja, please, the floor is yours.
 
>> TANJA PAVLESKA: I hope you can hear me well. Just confirm so I can continue.
 
>> SABRINA VORBAU: Yes. Thank you.
 
>> TANJA PAVLESKA: This is actually not only my research; it is part of a larger project, which is a Horizon 2020 project. As part of the project we developed a methodology with which we analyse information governance initiatives in the field of information disorder. Just to give a quick definition, by information disorder I mean disinformation, misinformation and other harmful content, as well as private, protected information that is leaked from the private into the public space. That is what I mean by information disorder.
 
As part of analyzing the initiatives across Europe that deal with combating information disorder, we developed a methodology that approaches the problem from two perspectives. One is the regulatory implications, meaning mainly transparency and accountability. The other is addressing fundamental human rights. We only concentrate on a selected set of fundamental rights, which can be extended as well, but the methodology is the same. We analyzed freedom of expression, equality and non‑discrimination, protection of personal information, rights of children, freedom of assembly and association, and intellectual property rights. These are the two aspects from which we approached the analysis; the actual criteria are within the methodology.
 
As for the criteria, there are five. What is the context of the implementation of these initiatives? How did the conception and the implementation take place, meaning how open are they toward multistakeholder cooperation and so on? What are the mechanisms they use to monitor themselves and track the impact of their activities? How do they handle enforcement and complaint mechanisms? And how much support do they get, how much is the state involved in their activities? This methodology was piloted across 24 European countries. The number of initiatives is 146. It is not an exhaustive set of the initiatives that exist across Europe, but it is statistically sufficient to give some valuable insights. I will go directly to the insights.
 
One of the most important findings, for me personally, was unexpected: most of the initiatives are not open to multistakeholder involvement. This is very unfortunate, because one of the most important things for developing the undeveloped regulatory landscape for social media issues is multistakeholder involvement. Not only is there a lack of such involvement, most of the initiatives, around 70%, are only acting at national level and are not coordinated among themselves. This implies there are a lot of isolated efforts to address the problem of disinformation. These efforts are not coordinated among themselves, which is not as efficient and not as effective as it could be.
 
Another finding is that there is a large underrepresentation of certain types of stakeholders. For me personally, the most striking finding here is that only 4 out of the 146 initiatives include digital rights and civil activists. Only 4 initiatives include that type of stakeholder. One of the main challenges the initiatives report facing is insufficient public awareness of the problem of information disorder. On the other hand, to give a more positive evaluation, most of the effort they put into fighting information disorder goes into increasing public awareness. The objectives are well placed and clearly defined; however, there is insufficient public awareness among the general population, since this is very much an ongoing issue and it is not easy to keep track of all of the latest events and problems that keep appearing.
 
Then we also analyzed employment of technology and technical solutions used as part of fighting information disorder and it is very, very low. Only 28% I think or around 25 use technology as part of their activities. This is mainly the fact checkers. When I say initiatives, maybe I should have mentioned that. When I say initiatives, I mean every type of initiative that is concerned with the problem of information disorder. This is a very broad range of initiatives. We can mention the public service media, the fact checkers, the media outlets in general, so a lot of different types of stakeholders involved here.
 
Another interesting result is that although the initiatives’ objectives include transparency of the players in the field, they themselves are not implementing any safeguards for transparency of their methodology, of their financial schemes and of the points of state inclusion within their activities. It is unfortunate. It basically means that they’re undermining their own activity in a way, by not doing what they’re preaching. Here we can give some concrete recommendations as well.
 
A final finding that I will share, before concentrating on just two points to show how important they are, is that there is a huge mismatch of timelines between the practices of academia and policy, so between theory and practice I would say, and I will expand on that a bit more.
 
One thing I would like to point out as a problem that needs immediate addressing is multistakeholder involvement and the coordination between the initiatives. There was a talk today during the Plenary, I believe by Patrick, about the recommendations and the baseline. Here I presented the actual state, which is quite far off this baseline, mainly regarding multistakeholder inclusion. That, and the underrepresentation of civil rights activists, does not mean that the issues related to the activities are not addressed, but it practically means that those who are directly concerned by the issues are not the ones discussing them with those who are involved in shaping policies on those issues. Here I would also mention the inclusion of children. For me in particular, it is an interesting phenomenon to observe that some information is considered to be misinformation or disinformation, but the younger audience, simply because of the register their generation uses, do not see it as misinformation, because they have adapted to the present tone across the social networks, which may be striking for the older audience. We have to be aligned with the children as well. If I want to be provocative here, I will put forward the following statement: the effect of policy implementation on human rights is much less interesting than how it fits within the companies’ business models. Namely, when shaping the policies, I think a lot of attention is given to how they fit within the business models of the stakeholders, but less to the human rights part. I am playing a bit of devil’s advocate here; you can confront me on that later. The main point is that I cannot overemphasize how important it is to act in the direction of multistakeholderism.
 
The last point is that during the panel it was stated that science is at the forefront of fighting disinformation, especially regarding COVID‑19, and I would not agree completely with that; there is, as I said, a huge mismatch between practice and research. I will give one example. I usually imagine it like this: we have a problem with information disorder and it is causing huge impacts on the world. The world ends because of that, and out of the ashes a researcher rises up and says, this is it, I managed to find the definition of disinformation. This is the mismatch that I’m talking about: the peer review process in academia allows a lot of time to pass before accepting research as valid, while policies, especially regarding disinformation and misinformation, sometimes require an immediate response.
 
Thank you very much. This is what I had to say.
 
>> ABHILASH NAIR: Thank you, that was useful and informative. You set out a few issues there. What stood out to me is how underrepresented digital rights and civil rights are, and the lack of multistakeholder involvement, and what better platform can you think of than EuroDIG to discuss multistakeholder participation. Yes. I’m sure we’ll come back to some of those issues in more detail when we take questions and get to the discussion stage of the meeting.
 
We’ll go straight to the next speaker, Nertil Berdufi, from Beder. Nertil will speak very broadly about how an earthquake in Albania set off a series of events and the government response to that, and I think he’ll pose fundamental questions: whether it is better to have no news at all rather than fake news, and the role of government in media control generally. I hope I summarized that briefly and nicely for you.
 
>> NERTIL BERDUFI: My idea is to explain how this case in Albania could spread over Europe and the world, and how governments can use such things as emergency situations for their own purposes, or, on the other side, how fake news and misinformation can influence people, how it can cause public panic, and how everything can escalate.
 
In November 2019, in Albania, we had a huge earthquake, a 6.4. After that, on social media a lot of information started circulating that something big could happen, and it was a girl, in fact, from a city near Tirana who was giving this information. After that, all the people in Tirana and other cities went out; it was panic, everything was blocked, all the roads. That caused the state to react, and they went back to this girl and arrested her for causing public panic. On the other side, the girl was saying, I was just tweeting, just sharing the post that another portal had made. After that, this fake news spread over the media, and the Prime Minister, that was the key point, went online and said to the public that all these media, all these social media that are posting fake news and misinformation to people, should be blocked, and that’s what happened. This portal was blocked by the government. What wasn’t blocked is the Facebook portal, the Facebook account that they have, because it is not within the government’s power to close this type of portal inside Facebook.
 
After that, the Prime Minister directly initiated a new law on audiovisual and media services, which in fact was called the law on fake news. He said this law should be ready as fast as possible so that all of this fake news and other things could be blocked. On the other side, in all the political debates it was front and centre that this is unconstitutional and that it somehow restricts freedom of expression and media freedom, and that is what really happened because of these developments. The President was in conflict on this issue and sent it to a commission, and two days ago the commission said that this is, of course, unconstitutional within our constitution, but also that it breaks a lot of freedom of expression and media freedom, and that the government should somehow retreat from this part.
 
What I have put down is kind of a question: is it better to have no news at all rather than fake news? That is what is putting all of this pressure on the issue, and as we’re seeing also in the pandemic that we are in now, there is a lot of fake news going around the media and social media. With all the developments in Europe we have had some problems as well; in Hungary, for example, they have laws and other things that somehow affect this part of freedom of expression and the other part, which is media freedom. This is a big question, in fact. I don’t know how the situation will go in Albania and other countries. Can we leave the control of fake news to legislation, or could we find other solutions and arrange a legal framework that somehow regulates this in the face of these developments while respecting freedom of expression and freedom of the media? That’s my point in this part.
 
>> ABHILASH NAIR: Thank you. That was fascinating. Thank you for sticking to the time limits. Part of the idea of a EuroDIG session is to get as much audience participation as possible in the end. Very helpful indeed.
 
I think you made two fundamental points there, which we’ll come back to later on: one, state regulation, and what can over‑regulation do? Also, I think that raises a further conceptual question as to how you differentiate opinion on one hand from facts on the other. I’m sure we’ll come back to that in due course.
 
Well, the next speaker is the Head of News Partnerships at Facebook, Guido Bülow. You’re the only industry representative here on the panel. No pressure. We wanted to ask how you handle this and what social media companies can do to make that multistakeholder approach more workable, as Tanja pointed out in the first presentation.
 
Over to you.
 
>> GUIDO BULOW: Thank you for the invitation, the chance to, yeah, have that discussion with all of you.
 
When I talk about our approach to the integrity of our platforms (and it is not simply just Facebook anymore, it is Instagram and other apps), we have a three‑fold approach. On one hand, we remove content from our platforms that violates our community standards. As a second step, we reduce the distribution of content that does not directly violate our standards but undermines the authenticity of the platform; we reduce its distribution and reach. And an important topic that is usually underrated and less discussed is that we want to empower people to decide for themselves on the content they see, if you want to call that news literacy, or media literacy.
 
I don’t have to focus on the first topic, community standards; hopefully you’re aware of that. As an example, we removed fake accounts from our platform: just last quarter we removed 1.7 billion fake accounts, which is, by the way, also transparent; it’s in our transparency reports, which are publicly available and have just been re‑released. We don’t remove misinformation. Misinformation actually falls in the second bucket, where we’re working together with 60 partners globally in more than 55 languages, and in Europe with 31 partners covering 27 languages, and we’re constantly expanding the network of partners we’re working with.
 
Talking about the fact‑checking programme for a second: we are working with partners that are certified by the International Fact‑Checking Network, so the global standard, and everyone is abiding by the same principles, the same code of conduct and the same transparency of funding sources, things like that. The goal of the programme obviously is to combat misinformation, to identify and address misinformation with false claims, content that can be harmful or that is financially motivated, things like that.
 
There is one exception, which I will put out front right away because it has been discussed a lot in the last few weeks. We have an exception for political speech, so content that has been directly created by politicians is not eligible for fact checking. We don’t do that because we want to help politicians; we feel that people should hear directly from their elected officials and that a private company should not interfere in that political discourse. That being said, especially since we’re still in the coronavirus crisis, in the last few months our fact‑checking partners have submitted 7,500 articles, which resulted in 50 million misinformation overlays that we put on top of identified misinformation, which, by the way, also reduced its distribution on the platform. 95% of the people actually didn’t engage any more with those postings; they didn’t engage with them by commenting, sharing or liking them. That is exactly what we want to achieve.
 
Other than that, what we also do as part of the process is to inform people retroactively. Just assume that you have shared something in the past and it is now getting fact‑checked; we will also inform people by sending notifications. We don’t just stop there with Facebook; we extended the programme to Instagram at the end of last year, so that a single fact check, let’s say of a video or an image where we’re able to detect duplicates, has a much greater impact than just on a single platform. It is quite a bit harder, of course, to do that with articles. We’re making progress on looking at duplicates, especially with the use of technology, so that when our fact‑checking partners see a potentially similar article they can apply ratings, and a rating that is applied to one article can be applied to other articles as well, just to speed up the process of fact checking.
 
We also acknowledge that there are certain actors that share misinformation more often, and we enforce against that: if people are repeatedly sharing misinformation, we demote the page, or technically demote the bad actors, so they don’t get distribution any more on the platform. We also revoke their rights to use our advertising products or other products, so no one is able to earn money or boost that content on our platform anymore. Apart from that, we take the signals we get from the fact‑checking programme into other areas as well. We have started a news publisher index, a registration where publishers can self‑identify as news publishers on the platform, and the signal that we get from the fact‑checking partners plays a role in that as well. There are other products we could talk about a bit longer, but I know we have time constraints here. Quickly, I wanted to jump to the last part, media literacy. I personally feel it is very, very important for people to be able to decide on their own and spot misinformation on their own. That’s why we’re partnering with publishers to come up with initiatives, like school projects where school children in Germany can learn more about what it feels like to be a journalist, what the challenges of a journalist are, things like that, and we’re talking about doing news literacy campaigns on the platforms. Maybe you have seen our coronavirus information on top of the news feed, or that when you search for coronavirus on Instagram you get more information from the WHO and other reliable sources. The same with WhatsApp, where we partnered with local health authorities. Yeah, I think that’s it in a nutshell. Like I said, given that we have only 5 minutes, I want to hand it back over to you.
 
>> ABHILASH NAIR: Thank you. I’m sure there will be questions at the end. I already see a question in the chat window and we’ll pick that up later, but essentially it is on the political speech exception you talked about earlier. I also have a view on that, on what counts as public interest: is it only politicians’ speech, or does other stakeholders’ speech also count in that public interest exception? We’ll come back to that later in the session.
 
Our next speaker is Liz Corbin, Deputy Media Director and Head of News for the European Broadcasting Union. She comes from public service media; on their website, universality, independence, excellence, diversity, accountability and innovation are the core principles that guide them, and obviously you couldn’t think of a more polar opposite world than social media, where no such controls exist. We’ll speak a little bit about those aspects as well.
 
Over to you.
 
>> LIZ CORBIN: Thank you very much. I’m really, really pleased, just as our previous speaker said, to have this conversation today. It is absolutely crucial because the stakes are so very, very high. At the basic level, this is about helping everyone, everywhere understand what is true and what isn’t, and in the current moment, the current situation we’re in, this means understanding a crisis which is having a profound effect on all of our lives. It has become abundantly clear that the public crave information they can trust, that they have had enough of fake news. Why is trusted content so important? Surely more information, more facts, more opinions is what’s needed, no filter; people can see everything that’s out there and make up their own minds about what is true. Of course, that sounds incredibly attractive, and it is understandable that people would like that theory, but, of course, the reality is completely different. People aren’t stupid; they realize there are huge amounts of incorrect information out there. They’re wondering instead whether they should trust nothing they see any more, believe nothing at all, even the established public service broadcasters, maybe they shouldn’t trust them either. The problem is, when people believe nothing at all, as we know, they’ll actually believe anything. That is the future if we don’t act now. If this current crisis is not motivating us to act, frankly, I don’t know what will.
 
COVID‑19 is a global problem and, as Tanja said earlier, we must collaborate more. The virus doesn’t know borders, nor does fake news. There is no one vaccine that will purge this pandemic, and we have to act together in lots of different arenas, in lots of different ways.
 
At the Broadcasting Union, of course, we have been doing just that. All of our members are public service broadcasters and they’re sharing content, advice and experiences of getting through the crisis. We brought fact checkers together to share crucial information with each other, so as not to duplicate, and to take advantage of sharing that information. We have shared world‑class investigative journalism to get as large an audience as possible. What’s important, in a world where there is so much incorrect information and fake news, is that we put a huge amount of effort into the real news content that’s out there.
 
It costs nothing to spread fake news; it costs a small fortune to counter it. Public service media does proper journalism; it is regulated, and there is accountability for the important position it holds in society. That’s absolutely right. Wouldn’t you now say that the social platforms hold an incredibly influential position in society? Where is their accountability to the public, where is the regulation? They’re no longer start‑ups, they’re not new anymore, they’re among the wealthiest companies in the world. Just now I have heard from Guido, from Facebook, about the work they have been doing recently, and the other platforms have been making similar efforts, particularly during this crisis. They told us about the millions of posts and accounts that have been removed and the adverts that have been banned, but the fact is, we still see dangerous fake news online. I’m sure you see it in your feeds, and so do you wonder how effective these actions are? Unfortunately, there is no independent verification of how successful the initiatives are; the companies keep the data very private. You have to ask how much longer this is going to be okay.
 
In a communication that was published yesterday by the European Commission about tackling fake news and disinformation around COVID‑19, it said there needs to be more transparency and accountability from the platforms. Basically, the system we have at the moment, which allows them to mark their own homework, is just not the same as being accountable.
 
We know that fact checking is working, and it is playing an important role in this crisis. Facebook is doing good work in that area; they’re taking down the biggest myths and the biggest lies. But it is only effective if the fact checkers can really reach the people who saw the fake news in the first place. What’s worrying is that in that same European Commission communication yesterday, it said that platforms have not sufficiently empowered fact checkers during the crisis, for example by making more data available or giving prominence to the fact checks. The problem for us as public service broadcasters is that usually we are the ones blamed for not reaching people with information, so is it the platforms or the public service media?
 
I’m a public service journalist, I’m also duty bound to talk to you today about where I think the public service media could be doing more and there is one particular area I would like to highlight.
 
It is what actually made all the platforms so successful. When you open Facebook or Twitter, YouTube, Instagram, you see people like you. You see people who look like you, who think like you, who have similar life experiences to you. You feel comfortable, these are your people. These are people you can trust. These are people who share fake news, but because the guard is down, you share it onwards. Despite laudable progress in this area, a diverse world is not fully reflected in the content produced by our public service media. This is well‑recognized. We must move faster. Our mandate will shrink with every day we fail to represent all of our audiences. To survive, we must be trusted, to be trusted, we have to be authentic and we can only be authentic if we really live and represent the experiences of the public we serve.
 
In summary, I think everybody here today will be in agreement that high‑quality trusted news is the best antidote to fake news. That means we need more reliable funding for public service journalism, and for political leaders to protect it as a core pillar of democracy. It means we need freedom to do our journalism without hindrance; COVID‑19 should not be an excuse for governments bent on restricting freedom of expression, and protests in the U.S. should not mean that journalists are fair game to be targeted by law enforcement. It means that there is a requirement for the major technology platforms to properly prioritize real news content and to be more accountable for the influence that they have.
 
Thank you.
 
>> SABRINA VORBAU: Thank you very much for sharing your thoughts and comments with us. I’m pretty sure we have also the other panelists who would like to make some comments. I also see that people are starting posting some questions in the chat. We’ll definitely come back to them later on.
 
Now I would like to give the floor to Paolo Cesarini from the European Commission. We have definitely seen that the European Commission has been doing a lot of work around this topic, especially during the COVID‑19 crisis. There were also a lot of conversations that took place between the European Commission and social media companies, and these conversations were also accessible to the public. Just last week the Commission also launched the Digital Services Act package consultation, which is actually now open for public contributions. This is to strengthen the protection of fundamental rights such as freedom of expression and to protect citizens against harmful content. I’m delighted to have Paolo on the panel with us to talk more about the work that the Commission is doing in this regard.
 
Over to you.
 
>> PAOLO CESARINI: Thank you. Good afternoon to everybody.
 
Can you hear me? It is okay? Very good.
 
>> ABHILASH NAIR: Yes. Thank you.
 
>> PAOLO CESARINI: Let me start by recalling that the Commission has been working hard on this complex topic for more than two years now. Our approach has been set out in different documents, including the Action Plan against Disinformation from December 2018. It is important to stress that the underlying approach is always the same: it has remained very much anchored in the basic principles of fundamental rights, freedom of expression, and media pluralism and freedom, while at the same time recognizing the need to act upon behaviours and conducts which take place online, particularly on social media platforms, and which in fact undermine these fundamental rights, because disinformation is a means to undermine the right of everybody to receive and impart information, to receive reliable information. So that’s the starting point, which has translated into a number of actions that reflect a whole‑of‑government, whole‑of‑society engagement to tackle this complex phenomenon.
 
The COVID‑19 crisis has just underlined the need for this action to move forward. It is an excellent test case to see to what extent the work done so far has produced results. In my view, and as the Commission noted yesterday in a communication, the work done so far has had some good results. These results are certainly not sufficient.
 
COVID‑19 has also highlighted how complex the phenomenon is. What we have been witnessing during these months is a phenomenal mix: the intentional spread of dangerous conspiracy theories combined with an overload of information that in itself creates confusion in audiences. We have seen the unintentional spread of information which is inaccurate but nevertheless perceived by people as accurate, and the result is behaviours that undermine the containment policies that various governments put in place in order to control the spread. We have seen conspiracy theories shifting from the online to the offline world; an example is the stories claiming that 5G deployment has been one of the causes of the spread of the virus, stories that have been inciting people to take concrete actions in the real world, with attacks on network infrastructures and actions against employees of telecom operators in various Member States. And we have seen other worrying things, such as a resurgence of hate speech, particularly addressing certain ethnic groups as responsible for the spread of the virus. We have seen consumer scams and false products sold using information that was clearly false, and we have seen cybercrime, hacking and phishing using COVID‑19 as a way to spread malware.
 
Most importantly, we have seen foreign actors in certain countries, like Russia and China, exploiting these circumstances to deliberately spread a number of stories aimed at undermining the credibility of the actions taken by European governments to control the crisis and, on the other side, at improving their own image worldwide.
 
So this is to say that when we talk about disinformation, or information disorder, as someone has just recalled that terminology, we probably need to keep in mind that we need to distinguish the different forms of misleading content that have turned into an epidemic during this period, taking into account that some may be illegal and others not, and that in certain cases there is no intention to cause harm, while in other cases the intention is clearly demonstrated by the tools and the means that are put in place when disinformation is spread on digital media and platforms.
 
We need to have a calibrated response that takes into account the harm, the intent, the form of dissemination, the actors involved, and their origin. For instance, if we talk about disinformation, which you could define as a form of intentional dissemination of false, misleading information, (audio issue).
 
The intention of the information, the campaign – (audio issue).
 
>> SABRINA VORBAU: I believe we are – we’re losing you a little bit, Paolo.
 
>> PAOLO CESARINI: I have some connection problems apparently. Is it better now?
 
Is it better now?
 
>> ABHILASH NAIR: Yes. Now better. Before you were breaking up.
 
>> PAOLO CESARINI: Unfortunately, it is quite unstable. I see warnings appearing on my screen and I can’t do much about it, I cannot control the wi‑fi. Hopefully –
 
>> ABHILASH NAIR: We can hear you now.
 
>> PAOLO CESARINI: So this is to say that this complex phenomenon has been reflected upon in the communication that was adopted yesterday, where there are a number of actions that will be carried out in the next weeks and months in parallel. They include the strengthening of strategic communication inside and outside of the EU; they include mechanisms for Member States to better cooperate in terms of exchange of information, exchange of situational analysis and threat analysis between themselves; and they include also better cooperation with international partners like NATO and the G7. And, of course, a very important part of this communication is about the responsibilities of platforms; it is about the importance of ensuring freedom of expression and a pluralistic, democratic debate, it is about raising citizens’ awareness, and it is about reporting, fact checking and research activities. In particular, as the topic today is very much focused on platforms, I would say that during the crisis platforms have reacted, listening to the concerns that the Commission has taken to them. They have taken action, particularly in terms of raising the visibility of authoritative sources, including the WHO, public health authorities and the media, on their own services; they have been demoting content that was fact‑checked as false or misleading; and in extreme cases, where the content was against the terms of service of the platforms, they have been removing content which was clearly harming public health or, in certain cases, public security, an example being the attacks on 5G infrastructure. This is very good. Certainly there are important lessons one can learn: when there are conditions whereby the platforms are faced with their own social responsibility, they take action.
 
They take action when they feel the heat of public scrutiny. (Audio issue). – a monitoring programme very much focused on the ways whereby they promote authoritative sources and services, very much focused on the cooperation that platforms have established with fact checkers and how they fact‑check content, and very much focused on the types of manipulative behaviour detected on their own platforms, which includes, of course, the fight against fake accounts and other things like fake engagements or other inauthentic behaviour, which is one of the methods used to influence our domestic debate in Europe. We also focus on better understanding the revenue flows that come from advertising and sometimes go to the wrong places instead of contributing to the financing of authoritative, professional media, and we will carry out that programme in the next months on essentially the same model that was applied for the European elections.
 
We would also like to extend this dialogue to other platforms. We have seen other platforms, like TikTok for instance, surge in importance and in users over the last months. They should be a part of this conversation, we believe.
 
Finally, a couple of last points. All these efforts will be doomed if, at the same time, we do not put in place an adequate system to support professional media through this period of crisis (the dramatic fall in advertising revenues is one of the causes), and we need to provide structures for the long‑term sustainability of the media sector.
 
Secondly, we need to really watch out for governmental actions that, through the COVID crisis, may take measures that restrict freedom of expression. We have had an example before our own eyes, probably within the internal borders of the EU. Thirdly, we have to have a more serious structure in place to organize fact checking and research activities around disinformation in a way that all the languages of the EU, all the countries within the EU, can benefit from the same type of approach that fact checkers have been implementing, experimenting with and developing during the last months. We need an appropriate structure, and this structure has come to light: on the first of June, the European Digital Media Observatory was officially launched, and in the next months it will start to create a true network of fact checkers and researchers across Europe to carry out dedicated and thematic research that will enable better detection, better analysis and better exposure of disinformation threats. That work will not just be an end in itself; it should feed into the work of the professional media, which could find there a source of information to increase the accuracy of their own reporting. It will also be a source of important educational materials that the media literacy community, the media literacy practitioners, could use in order to – (audio issue).
 
>> SABRINA VORBAU: We lost you again. Sorry to interrupt you; to stay within our time, I see there are lots of questions.
 
>> PAOLO CESARINI: I have just finished.
 
>> SABRINA VORBAU: Perfect. Thank you so much. Thank you so much for your contribution.
 
I see so many questions coming into the chat. We just want to make sure we leave enough time for everyone on the call. Thank you so much. I think you made very, very interesting points, and it is great to see how intensively the Commission is working on this. For example, you mentioned that this is not only about disinformation but also about other issues such as hate speech, which nicely links to our final key participant on the panel, Charlotte Altenhöner‑Dion from the Council of Europe. I’m delighted to hand over to you now to hear a bit more about the work the Council of Europe is doing; in one of the panel sessions this morning it was described as the watchdog of human rights. We’re very delighted to have you with us, to hear where you see the responsibilities of the Council of Europe in this regard.
 
Please, over to you.
 
>> (Audio quality too poor to translate).
 
>> SABRINA VORBAU: Excuse me. I’m not sure if it is only me, but we can’t hear you very clearly. Maybe you can try again. I’m afraid the connection is not great. I don’t know if it is just on my end or other colleagues –
 
>> ABHILASH NAIR: No. I cannot hear either. It may be on her end. You may have to turn off your video and start talking to improve the quality.
 
>> SABRINA VORBAU: I’m really sorry.
 
>> CHARLOTTE ALTENHONER-DION: (Audio quality too poor to transcribe).
 
>> SABRINA VORBAU: This is a negative part of all being online. Because we obviously don’t want to lose much time, maybe you can try to figure out your audio and we can come back to you in a moment. As I said, there have been a lot of conversations and questions.
 
>> CHARLOTTE ALTENHONER-DION: I’m so sorry.
 
>> SABRINA VORBAU: Don’t worry. That’s fine.
 
Maybe we can just have a look in the meantime at the chat; there have indeed been a lot of questions and interactions, and I’m pretty sure some colleagues on the panel would also like to make some comments based on the statements that were given by other colleagues, and maybe in the meantime we can get back to Charlotte Altenhöner‑Dion. To everyone on the call, feel free to use the chat function or raise your hand if you have any questions.
 
There was immediate feedback from a few colleagues on the call towards Guido in regard to the fact checking of the content that politicians are sharing on social media platforms. Maybe you can elaborate on this. People in the chat were wondering whether content that politicians post on social media that is harmful should be fact‑checked. Maybe you can comment on this question. Afterwards, we’ll try with Charlotte Altenhöner‑Dion again.
 
>> GUIDO BULOW: Sure. Thank you for the opportunity to answer that question.
 
What I tried to say is that, in a democracy, we do believe that people should decide what is credible, and not technology companies. We didn’t want to interfere in the political discourse between politicians and the people who elected them. That being said, we also know that political speech is probably highly scrutinized. And that being said, politicians aren’t able to say whatever they want to say. If anything violates our community standards, it will be removed from our platform. If politicians share misinformation that could cause harm to people, and we have seen cases of that in Brazil as an example during the coronavirus crisis, we remove that content from the platform, regardless of whether it comes from a politician or from anyone else; we don’t make any distinction here. These are rules that apply to everyone on the platform: our community standards, and on top of that, content that could incite real‑world harm. The only thing that is not eligible to be fact‑checked is political speech. That’s the only exception on our platform.
 
>> SABRINA VORBAU: Thank you so much. Guido for elaborating on the question.
 
Maybe we can check one more time with Charlotte. Looks like she changed the room slightly in the meantime. We can try again with you. It would be obviously very great to hear your views as well.
 
Maybe she’s trying to reconnect.
 
I also see someone from the audience who raised their hand. I would kindly ask you to unmute Mike Harris to pose your question.
 
Mike, the floor is yours. Please tell us where you’re from and from where you are calling today.
 
>> Mike Harris: I’m the founder of XNMA, a Berlin‑based firm.
 
When talking about this, we rarely get off the subject of what companies should and shouldn’t be doing. While all of these points are valid, they’re distracting from the real problem, which is that social media platforms operate a form of governance: they’re governing details of our lives with the broadest of strokes. Most rules on social media platforms enact civil liberties; they should be entirely owned by society. There are many reasons why the web has shaped itself in the way that it has. The only objective one is the network effect, which is the primary driver of the success of many of the big tech firms. I’m not suggesting the end of Facebook or Twitter, or breaking up big tech; I’m saying that there is no point in doing that. The network effect is a law, it can’t be avoided. Let’s shape the governance of the platforms into something that fits with that and is equitable to us as societies.
 
To solve the problems that have been discussed here today requires us to acknowledge that networks of people become coercive monopolies, if we can do that, we stand a chance of reshaping the web into an advanced, diverse, competitive network. My view is that Facebook just shouldn’t have the right to say what they are or are not going to do about political speech.
 
It is just not their decision.
 
Thank you.
 
>> GUIDO BULOW: You’re absolutely right. Thank you for bringing that up. It is a good segue to something we have set up over the last, I think, one and a half to two years, which is called the Oversight Board. We know we have to make so many important decisions on free expression, and while we have always taken advice from experts, we feel we can do much better, which is the reason why we have set up an oversight board. In the beginning, yes, we initially picked a few people, then the board itself picked people from all over the world, and it is still growing. At the moment we have 20 people from all over the world; there is a former Prime Minister of Denmark, for example, there are people from Civil Society organizations and many more. They operate absolutely independently. They are reviewing certain content decisions, and I wouldn’t say they set the rules, but they will in the end decide, and it is something that we will actually follow; we will apply what they decide. The board is growing to up to 40 people from all over the world, which will hopefully represent the more than 3 billion people that are on our different platforms, and the decisions that at the moment we’re making on our own. Like I said, we feel very, very uncomfortable as a private company making these decisions on our own. One of the things is that Nick Clegg, in charge of communications and policy globally, is repeatedly arguing that we need smart regulation in certain areas. We shouldn’t set the rules for when to label, for example, political advertising on our platform. We came up with that because we wanted to provide transparency, but in the end, it should be governments actually telling us how to do that.
 
I mean, we have certain standards, but ideally that comes from other people. I totally echo your point. We shouldn’t have so much power, and this is one of the reasons why we have created this oversight board.
 
>> SABRINA VORBAU: Thank you so much, Guido, for mentioning the oversight board. I’m pretty sure a lot of people in today’s call have taken note of this.
 
I just received a message from Charlotte, she is back with us. I would like to try one more time with her, to hear her properly and to get her input on behalf of the Council of Europe.
 
>> (Audio quality too poor for transcription).
 
>> SABRINA VORBAU: One more time. It is still a bit shaky I’m afraid.
 
>> CHARLOTTE ALTENHONER-DION: I’m really sorry. This is –
 
>> SABRINA VORBAU: This is better actually.
 
>> CHARLOTTE ALTENHONER-DION: [indiscernible].
 
I just will go slightly [indiscernible]. (Audio quality too poor for transcription).
 
I wanted to make a couple of points – (audio coming in and out) and the point of multistakeholderism and a multistakeholder approach. I think we need to do more of that and need to develop it into a real coordinated network. None of us alone is able to really approach or address the problem that we have with disinformation and the broader issues of content regulation. (Audio quality too poor for transcription).
 
Public interest activity that can help them in taking the right action and the population, while very good, it is problematic (audio quality too poor).
 
>> SABRINA VORBAU: I’m afraid we’re losing you again.
 
>> (Audio quality too poor for transcription).

>> SABRINA VORBAU: Maybe you can use the chat function. I’m very sorry for this. We’ll hand it over to Abhilash Nair now, who has been monitoring the chat, to see if we have any further comments or questions. You are also very warmly invited to raise your hand if you have any questions and we’ll unmute you.
 
>> ABHILASH NAIR: There are quite a few questions here.
 
A question for Liz: how can the government fund and protect authentic journalism without funding and protecting fake journalists camouflaging as authentic?
 
>> LIZ CORBIN: I saw that question. Thank you very much. It is good to have the opportunity to answer it.
 
Look, public service media is structured, well established, it is regulated, and it is very transparent in the way that it is organized, funded and supported.
 
So supporting public service media organizations is the best way to protect and support public service journalism.
 
Of course, as public service journalists we want to become a more diverse group, we want to reach more people, more audiences than we currently reach. We do need to expand, and that should be done under the umbrella of the organizations that already exist.
 
Yes, governments should protect their public service media organizations, and that’s what I meant by that point.
 
>> ABHILASH NAIR: Thank you. It certainly does.
 
A couple more questions. There’s a question which I suppose is best placed for the regulators to respond to, perhaps Paolo if you can: what about dark social platforms? They’re becoming more and more important for shaping public opinion, especially in the current situation, and there’s a lot of misinformation being spread through these platforms. Are there any ideas about what we can do about it? As far as I know from Germany, says the person asking the question, these services are not regulated by law from the German perspective.
 
>> PAOLO CESARINI: Thank you for this question actually.
 
We are very well aware of the problem represented by the currently limited membership of the Code of Practice. We will start to expand it and invite new actors already now. Telegram is one of those platforms that is increasing in terms of user base, and I would say that during COVID the two most used platforms which have shown some problematic features have been WhatsApp and TikTok.
 
I think we need to progress step by step and to start to tackle those platforms which are currently outside of the framework and to engage them into this structure of cooperation one by one.
 
By the way, I should also underline that at the beginning of July the Commission will also publish the evaluation of the Code. Beyond the communication from yesterday, it will not be a communication but a document drawing some conclusions about the work, the functioning and the effectiveness of the Code. And the next steps will be not only the DSA, the Digital Services Act that was mentioned by Sabrina, but I would say also the European Democracy Action Plan, which will be put forward this year. That’s why I think we need to have a serious reflection on how to construct a regulatory backstop that creates more uniformity and more instruments, coupled with appropriate oversight mechanisms and, in case of need, sanctions.
 
That is one point I wanted to make; I could have made it even before the question.
 
The other point, not related to the question, but one I feel the urge to react to, is about the fact checking of political speech. I know, and I take the point made by the Facebook representative, that this is delicate ground. I still fail to understand, if political speech is promoted and paid for as political advertising, why a platform should take money to give visibility and prominence to content without fact checking it, irrespective of the author of the content. Paid-for visibility is a form of overtaking the visibility of other sources of information through the use of money. The use of money should be subject to scrutiny, even if it comes from politicians. That’s a reflection I think we need to carry forward.
 
Thank you.
 
>> ABHILASH NAIR: Thank you. That’s a good question, I don’t know if Guido wants to quickly respond to that in 30 seconds, if you can.
 
Before you do, I would like to raise a related question which is in the chat, so perhaps you could take another 15 seconds, 45 seconds in total, to answer! Why do Facebook and other social media companies prefer to work with fact checkers instead of working directly with professional media that do this work daily as part of their mandate and mission?
 
Over to you.
 
>> GUIDO BULOW: Just have to close my door, the mailman is at my door. One second.
 
>> (Chuckle).
 
>> PAOLO CESARINI: While Guido is coming back, I would say the same fact checkers are often in‑house departments of mainstream media; let’s not think of fact checkers as isolated or detached from mainstream media.
 
>> ABHILASH NAIR: Thank you, Paolo for chiming in already.
 
>> GUIDO BULOW: We have several examples, from France and even from Germany, of mainstream media doing fact checking in our fact checking programme.
 
Nevertheless, it is a good question why we don’t work directly with journalists and other media outlets. The reason is that the IFCN provides a global standard, and we’re a global company trying to operate this programme on a global level and to maintain the same quality across every single country of the world, in order to, yeah, also defend the programme.
 
Of course, there will be parties and people that say, well, this organization is biased, why are they fact checking my content? I don’t want to dwell too much on this. I mean, we see the situation in the U.S. where, on one hand, you see people favoring The New York Times, The Wall Street Journal, CNN, and on the other side you have Fox News and others. If you don’t have one single standard, the perception of bias is definitely not helpful to increase trust in the programme, and with an independent organization like the International Fact-Checking Network, you always have a body that people can reach out to in case they don’t trust any of the fact checkers, and where everything is super transparent. Take an organization like DPA in Germany, the news agency: you can see its application to the IFCN with every single step publicly available, and if anyone has ever dealt with the organization, you can reach out and you will see a public process for the inquiry that you asked for.
 
We’re trying to be as transparent as possible on a global level.
 
While I totally acknowledge that in most countries of Europe we have certain standards, like in Germany and others, there may be some countries where we don’t have that, and in order to have a functioning system with fact checkers we need this global standard. That’s the reason for the standard and why we’re working with a specific set of partners, which by the way is growing steadily over time.
 
I almost forgot about the first question, give me a quick hint.
 
What was the first question again?
 
>> ABHILASH NAIR: I’m actually thinking what was the first question.
 
It was building on from – it was essentially Paolo’s question.
 
>> PAOLO CESARINI: If you want, I can repeat. My question is (poor audio quality) about political speech: how can this be justified when we talk about paid-for content? My point was that I don’t think the scrutiny should stop with ads from politicians. If a statement has been fact checked and is false or misleading, you can still put it on the platforms, but why give it extra visibility and make it an object of money, giving a prize to statements that are not reliable?
 
>> ABHILASH NAIR: 1 minute please.
 
>> GUIDO BULOW: I see your point. It is not an easy answer. You know, we’re heavily discussing that internally and externally.
 
There are certain limitations.
 
Even when politicians or political parties run advertisements which are not political speech per se, we actually take these advertisements down, which we have done in the U.S. with Trump as an example, where he was running advertisements about the census.
 
If it is political speech, it is not eligible to be fact checked, and political parties and politicians are able to use advertising for it. Again, the reason for that is that we don’t want to interfere in the political discourse. We don’t want to be the ones standing between the politicians and the people that elected these officials.
 
>> SABRINA VORBAU: Thank you very much, Guido.
 
I’m afraid we’re coming closer to the end of our session today. I know there are many more questions and comments in the chat. Just for everybody to be aware, you can continue this conversation on the EuroDIG forum, even once the session is over within the next days. I encourage all participants to do so.
 
As mentioned before, I will hand over now to Katarina Andjelkovic from the Geneva Internet Platform to summarize the key messages, the main takeaways from our discussion today.
 
The floor is yours.
 
>> KATARINA ANDJELKOVIC:
 
I hope you can hear me. I’m Katarina Andjelkovic and I am a Rapporteur from the Geneva Internet Platform.
 
I have collected a couple of messages. Now I’ll turn to the first one. I hope I can see it on the slide.
 
I can’t see it on the slide. I can read it anyway.
 
So my first message will be that multistakeholder involvement, meaning the involvement of all those that are directly concerned by the issue of misinformation, is of utmost importance in fighting this misinformation, and there is a need for a more serious structure in place to organize fact checking and research activities that would be available in all EU languages and would therefore benefit all countries. My second message is that high‑quality trusted news is the best antidote to fake news; to achieve that, there is a need for reliable funding for public service journalism on one hand and the protection of the freedom of the press by national authorities on the other. Crises such as COVID‑19 should not be an excuse for governments bent on restricting freedom of expression.
 
The third message will be that media literacy is crucial in fighting misinformation; it is important to educate and empower people to spot misinformation and decide on their own whom to trust.
 
The final message has just been drafted and is therefore not yet polished: in order to regulate all platforms in a uniform manner, there is a need for a more comprehensive reflection on how to construct a regulatory backstop that creates more uniformity and more instruments with appropriate oversight mechanisms and, in case of need, sanctions.
 
I would just like to remind you that all these messages are not final and will be subject to your comments and questions, and EuroDIG will provide more details on that.
 
Thank you very much.
 
>> SABRINA VORBAU: Thank you very much.
 
We have 2 more minutes before we have to close.
 
I would like to give our key participants the opportunity to give, in just 10 seconds, a final statement, a main takeaway. We’ll start from the beginning: we heard first from Tanja. Maybe your main takeaway from today’s discussion, please? In 10 seconds if possible!
 
>> TANJA PAVLESKA: I will just repeat that I see multistakeholder involvement as key, but not only speaking of multistakeholder issues; I mean the involvement of the stakeholders themselves, and showing understanding for the other stakeholders’ problems – not just expressing their own problems, but showing understanding for the other stakeholders within the same issue.
 
>> SABRINA VORBAU: Thank you very much.
 
>> NERTIL BERDUFI: I hope that during this discussion we learned a lot, and also that talking about freedom of expression and freedom of the media is a key point, I think. I need to end with this.
 
>> SABRINA VORBAU: Thank you very much. Thank you.
 
>> GUIDO BULOW: In addition to what’s been said so far, I think the problem of misinformation has been around for as long as people have been living on this planet and will probably be around even longer. We need to collectively look at misinformation and do our best to help people, first of all, get access to credible resources and give them the tools so that they themselves can decide what credible information is. Whatever the technology, whatever we invent, we’ll never get rid of misinformation.
 
>> SABRINA VORBAU: Thank you very much.
 
>> LIZ CORBIN: In order to tackle fake news, we’ll need a lot, lot more collaboration between journalists and platforms, with regulators, governments and other parts of society.
 
Journalists need to know in realtime from the platforms what is happening and the scale of what’s happening. We can’t tackle what we can’t measure.
 
>> SABRINA VORBAU: Thank you very much.
 
Charlotte, if you’re still with us, please write your thoughts in the chat.
 
>> PAOLO CESARINI: Yes. From my side, I would just complete what Liz has said. Regulation will not make fake news disappear. The dialogue between all of us, from civil society to the media, will remain essential in the future, irrespective of the regulatory choices that will be made in the next months.
 
>> SABRINA VORBAU: Thank you.
 
A multistakeholder approach is key, to echo what you have said. This is what we tried with today’s discussion, and I’m delighted that we managed to get such a diverse panel with different experts representing different stakeholder groups: the academic field, the industry sector, and also policymakers. It was a real pleasure to talk with all of you.
 
I hope you enjoyed this.
 
As I said, you can still use the EuroDIG forum over the next days to continue this discussion, which obviously won’t stop here. Thank you very much for your time and for sharing your expertise with us. I do hope that we’ll continue this discussion very soon. Apologies for all the technical difficulties, this is the life we live in! I hope to see many of you soon in person to continue this dialogue.
 
I’ll hand over to Roberto in the studio. Thank you very much once again for everybody joining us today.
 
>> ROBERTO GAETANO: Thank you.
 
The only thing I would like to say is thank you to the participants, all the speakers, the moderators, everybody who participated in making this a great session, and I’ll stop here. We’re already a couple of minutes late.
 
Back to the main studio.
 
Bye, all.  




[[Category:2020]][[Category:Sessions 2020]][[Category:Sessions]][[Category:Media and content 2020]]

Revision as of 15:39, 7 July 2020

11 June 2020 | 14:30-16:00 | Studio Trieste | Video recording | Transcript | Forum
Consolidated programme 2020 overview / Day 1

Proposals: #20, #27, #56, #77, #90, #99, #108, #112, #125, #139, #167 (#5, #12, #16, #18, #28, #34, #186)

You are invited to become a member of the session Org Team! By joining an Org Team you agree that your name and affiliation will be published on the respective wiki page of the session for transparency reasons. Please subscribe to the mailing list to join the Org Team and answer the email that will be sent to you requesting your confirmation of subscription.

Session teaser

Thanks to social media, everybody can now bypass traditional gatekeepers and – trading one’s personal data for access – enjoy seemingly unfettered freedom of expression in the digital public space. But when does personal freedom end and personal responsibility begin? The session will facilitate a discussion on the limitations and pitfalls of freedom of speech on the Internet.

Session description

Especially in the time of COVID-19, people are using the internet and online services more now than ever before. As schools close and people are confined to their homes, being online is providing a lifeline for everyone in society from the young to the old, learners and workers, and the vulnerable, curious and those seeking an escape from boredom.

Social networking services - perhaps now more than ever - play an important part in people’s daily lives. While they serve as a way to connect, discuss and share people’s ideas, the fact that they can be accessed by anyone gives more room for action to malicious actors. Among others, this might result in issues such as hate speech, cyberbullying and disinformation. To counter these actions, governments are utilizing sophisticated technology (e.g. AI) to monitor citizens’ behavior on social media, but this could result in mass surveillance on social media, which represents a threat under both authoritarian and democratic governments. As human rights protected offline (speech, expression, assembly) should also be protected online (Tallinn Manual), the future of our online freedom depends on our ability to ensure that citizens’ online rights are respected.

This session will be organized as a facilitated multi-stakeholder dialogue with representatives from governments, the technical community, civil society and academia to discuss opportunities and challenges as well as rights and responsibilities of social media platforms and their users.

Format

The workshop will be divided into three parts, each lasting approximately 30 minutes. At the beginning of each part a specific case study or the latest research findings will be presented by representatives from the academic and civil society sector. Following each case presentation, representatives from the public sector and tech industry will act as respondents, giving a short statement. This approach will initiate a facilitated dialogue led by the moderator, who will also open the floor to the audience for intervention after each part. In addition, a second moderator will facilitate comments and questions that will be posted in the chat throughout the whole session.

Further reading

Links to relevant websites, declarations, books, documents. Please note we cannot offer web space, so only links to external resources are possible. Example for an external link: Website of EuroDIG

People

Focal Point

  • Sabrina Vorbau
  • Joachim Kind

Organising Team (Org Team) List them here as they sign up.

  • Elena Perotti
  • Narine Khachatryan
  • Amali De Silva-Mitchell
  • Ruth Cookman
  • Nertil Berdufi
  • Oliana Sula
  • Anna Romandash
  • Abhilash Nair
  • Bissera Zankova
  • Federica Casarosa
  • Sofia Badari
  • Carola Croll
  • Debora Cerro Fernandez
  • Zoey Barthelemy
  • Aleksandra Ivanković
  • Jörn Erbguth
  • Giacomo Mazzone
  • João Pedro Martins

Key Participants

  • Nertil Bërdufi, Assistant professor at University College Beder and Founder and Director of Beder Legal Clinic

Mr. Nertil Bërdufi is an experienced lecturer and lawyer with a demonstrated history of working in legal services in Tirana, Albania. He is the founder and Director of the Legal Clinic at University College Bedër. He is also an external evaluation expert at the Quality Assurance Agency of Higher Education and a trainer in different projects in the fields of countering violent extremism (CVE), cybercrime, international law and dispute resolution. Nertil has a PhD in Cybercrime and National Security and an LLM in Rule of Law for Development from Loyola University Chicago, School of Law. He has been a regular participant and Org Team member at EuroDIG since 2016.

  • Tanja Pavleska, Researcher Laboratory for Open Systems and Networks, Jozef Stefan Institute

Dr. Tanja Pavleska is a researcher at Jozef Stefan Institute and Chair of the Slovenian Internet Society Chapter. Her background is in electrical engineering. Her research is focused on trust and cybersecurity management, digital policies, social media regulatory frameworks and information governance. She is currently involved in the enterprise architecture design in the Digital Europe for All (DE4All) project and the work on user-centric cybersecurity in the CONCORDIA project. For the past 3 years, she has been leading the work on social media policies and regulatory frameworks in the COMPACT project.

  • Liz Corbin, Deputy Media Director and Head of News for the European Broadcasting Union (EBU)

Liz Corbin is the Deputy Director of Media and Head of News at the European Broadcasting Union. She oversees the Eurovision News and Sports Exchange, the Social Newswire, the News Events broadcast services team and Radio News. Prior to joining the EBU in January 2020, Liz worked at the BBC for 18 years, most recently as the Head of News at the international channel BBC World News. Previous roles include Editor of BBC Reality Check, where she oversaw a large expansion of the team. She was also the Singapore Bureau Editor for 4 years.

  • Charlotte Altenhöner-Dion, Council of Europe, Head of Internet Governance Unit and Secretary to the Expert Committee on Freedom of Expression and Digital Technologies (MSI-DIG)

Charlotte has a law degree from Hamburg University and an LL.M. in International Law from Georgetown University. She has been with the Council of Europe for the past ten years, having previously worked for the OSCE and the UN, including in Central Asia, the Balkans, and East Africa. Her current professional focus lies on the human rights impacts of digital transformation, intermediary liability, and the institutional challenges of governing fast-moving technologies in a human rights-compliant manner, while adhering to the principles of democracy and the rule of law.

  • Paolo Cesarini, Head of Unit Media Convergence & Social Media, DG CONNECT, European Commission (EC)

Paolo Cesarini has been head of the unit responsible for media convergence and social media policy at the European Commission, DG Communication Networks, Content and Technology, since 2017. He previously held other management positions in the Commission, including in DG Competition. He also worked as a member of the legal service at the International Labour Organization in Geneva, Switzerland, and as a researcher at the Institute for Public and International Law of Siena University, Italy. He has taught EU competition law as a visiting professor at Siena University and as a lecturer at Montpellier University, France. He obtained a master’s degree in international law at Siena University and an LLM in EU law at the College of Europe, Belgium.

  • Guido Bülow, Head of News Partnerships for Facebook

Guido Bülow has been working as Head of News Partnerships for Facebook in Central Europe since September 2015. Since March 2019 he has been responsible for Strategic Initiatives in EMEA which includes the Third-Party Fact-Checking program. Before joining Facebook, he was head of social media at SWR (ARD). Guido started his career with bigFM, where he volunteered after completing his studies and then worked as a marketing manager.

Moderator

  • Sabrina Vorbau, Project Manager at European Schoolnet

Sabrina Vorbau is Project Manager at European Schoolnet and has been a member of its Digital Citizenship Team since 2014. She is involved in a variety of online safety projects such as Better Internet for Kids (BIK), the eSafety Label and SELMA (Social and Emotional Learning for Mutual Awareness). As part of BIK she is coordinating the BIK Youth activities and the annual Pan-European Youth Panel. In addition, she recently started managing the H2020 project CO:RE – Children Online: Research and Evidence - developing a knowledge base on children and youth in the digital world. Originally from Germany, Sabrina holds a Master’s degree in Applied Economic Sciences: Business Administration (MBA) from the University of Antwerp. Prior to joining EUN, she worked at the European Agency for Fundamental Rights in Vienna.

  • Abhilash Nair, Senior Lecturer in Internet Law at Aston Law School

Abhilash Nair is based at Aston University Law School, Birmingham, where he is Director of the Internet Law and Emerging Technologies (ILET) research unit. He is an internationally recognised expert on child online safety laws and has advised various international and national bodies on regulating illegal and harmful content, online child sexual abuse material, and content-related cybercrime. He has published widely on aspects of internet law and regulation, and has been a regular speaker at international conferences and panels focusing on internet safety. His recent book titled 'The Regulation of Internet Pornography: Issues and Challenges' (Routledge) examines how the internet has necessitated a fundamental change in the regulation of pornography. Abhilash is a member of the Evidence Working Group of the UK Council for Internet Safety (UKCIS) and has previously served as a member of the Technology Law and Practice Committee of the Law Society of Scotland. He has significant experience of contributing to both broadcast and print outlets including the BBC News channel, Victoria Derbyshire (BBC2/BBC News), BBC Radio 5 Live, BBC Asian Network, various BBC Local Radio stations, the Economic Times and the Conversation.

Remote Moderator

Trained remote moderators will be assigned on the spot by the EuroDIG secretariat to each session.

Reporter

Current discussion, conference calls, schedules and minutes

See the discussion tab on the upper left side of this page. Please use this page to publish:

  • dates for virtual meetings or coordination calls
  • short summary of calls or email exchange

Please be as open and transparent as possible in order to allow others to get involved and contact you. Use the wiki not only as the place to publish results but also to summarize the discussion process.

Messages

  • Multistakeholder involvement (i.e. the involvement of those directly concerned and impacted by misinformation) is of utmost importance in fighting misinformation. There is also a need for the infrastructure to organise fact checking and research activities that would be available in all EU languages and would, therefore, benefit all EU countries.
  • High-quality trusted news is the best antidote to fake news. To achieve that, there is a need for more reliable funding for public service journalism on one hand, and the protection of the freedom of the press by national authorities on the other. Crises such as the COVID-19 pandemic should not be an excuse for governments to restrict freedom of expression.
  • Media literacy is crucial in fighting misinformation. It is very important to educate and empower people to spot misinformation and make informed decisions on whom to trust.
  • In order to regulate all platforms in a uniform manner, there is a need for a more comprehensive reflection on how to construct a ‘regulatory backstop’ that creates more uniformity, more instruments with appropriate oversight mechanisms, and in cases of need, sanctions.


Find an independent report of the session from the Geneva Internet Platform Digital Watch Observatory at https://dig.watch/resources/social-media-opportunities-rights-and-responsibilities.

Video record

https://youtu.be/7d76NQKVq0c?t=10456

Transcript

Provided by: Caption First, Inc., P.O. Box 3066, Monument, CO 80132, Phone: +001-719-482-9835, www.captionfirst.com


This text, document, or file is based on live transcription. Communication Access Realtime Translation (CART), captioning, and/or live transcription are provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings. This text, document, or file is not to be distributed or used in any way that may violate copyright law.


>> ROBERTO GAETANO: Hello, everybody. We’re starting the afternoon session and my name is Roberto Gaetano. The session that we are going to have is the first session this afternoon at Studio Trieste, Social Media ‑ Opportunities, Rights and Responsibilities. Before giving the floor to the moderators, we’ll read the code of conduct and remind everybody what the code of conduct is. Basically, you need to state your full name when entering the room. Raise your hand using the Zoom function to ask questions; when speaking, switch on the video and state your name and affiliation. Do not share the links to the Zoom meetings, not even with your colleagues.

Let me give the floor now to Sabrina Vorbau, who is the moderator of this session.

>> SABRINA VORBAU: Thank you very much, Roberto.

Hello, everyone. Good afternoon from Brussels. Welcome to this EuroDIG session on Social Media ‑ Opportunities, Rights and Responsibilities. I’m Sabrina Vorbau, I’m a Project Manager at European Schoolnet and I will be moderating the session with Abhilash Nair, who is Senior Lecturer in Internet Law at Aston Law School. Before we dive into the content, some practicalities: we encourage all participants on this call to please use the chat throughout the session as much as possible for your comments and questions. We’ll make sure we leave enough room at the end of the session for your questions and comments and to foster this dialogue. We do have to keep 5 minutes at the end of our workshop because, in the tradition of EuroDIG, we have a colleague from the Geneva Internet Platform with us, Katarina Andjelkovic, who will read out the key messages, the main takeaways of our discussion today, which will later be published on the EuroDIG website.

Without further ado, I’m happy to kick off a session that couldn’t come at a better time. Given the pandemic, with many citizens confined at home for weeks and months, the internet and especially social media platforms have provided a lifeline for many people to stay connected with their communities. While this is obviously a great opportunity, it also brings even greater responsibilities, and not only during COVID‑19: when we look back at past events, social, economic and political, very often in these moments social media is misused, and on certain occasions we see an increase in disinformation circulating on the platforms. This is why I’m very happy that today I’m joined by a very great panel, a panel of experts that work around this topic of disinformation and work hard to combat disinformation on the internet.

Let me briefly introduce you to our panelists. We have with us Nertil Berdufi, we have Tanja Pavleska, and today we also have Liz Corbin with us from the European Broadcasting Union. We have Charlotte Altenhoner‑Dion with us from the Council of Europe, Head of Internet Governance Unit and Secretary to the Expert Committee on Freedom of Expression and Digital Technologies; we have Paolo Cesarini, Head of Unit Media Convergence & Social Media, DG CONNECT, European Commission; and Guido Bulow, Head of News Partnerships for Facebook.

Thank you very much to all our panelists for taking the time to be with us this afternoon to discuss this very important and timely issue. I don’t want to lose any more time, so let’s dive into the conversation. I’m happy to hand the floor over to Tanja Pavleska to kick us off with the discussion. As said, she’s coming from the academic field, she has been researching the topic of disinformation and misinformation for years, and I’m delighted for her to kick us off with some insights from her latest research. Tanja, please, the floor is yours.

>> TANJA PAVLESKA: I hope you can hear me well. Just confirm so I can continue.

>> SABRINA VORBAU: Yes. Thank you.

>> TANJA PAVLESKA: This is actually not only my research; it is part of the COMPACT project, which is a Horizon 2020 project. As part of the project we developed a methodology with which we analyze information governance initiatives in the field of information disorder. Just to give a quick definition, by information disorder I mean disinformation, misinformation and other harmful content, as well as private, protected information that is leaked from the private into the public space. That is what I mean by information disorder.

As part of analyzing the initiatives across Europe that deal with combating information disorder, we developed a methodology that approaches the problem from two perspectives: one is the regulatory implications, meaning transparency and accountability; the other is addressing fundamental human rights. We only concentrate on a selected set of fundamental rights, which can be extended as well, but the methodology is the same. We analyzed freedom of expression, equality and non‑discrimination, protection of personal information, rights of children, freedom of assembly and association, and intellectual property rights. These are the two aspects from which we approach the actual criteria within the methodology.

There are five criteria. What is the context of the implementation of these initiatives? How did the conception and the implementation take place, meaning how open they are toward multistakeholder cooperation and so on? What are the mechanisms used to self-monitor and track the impact of their activities? How do they perform enforcement and handle complaint mechanisms? And how much support do they get, how much is the state involved in their activities? This methodology was piloted across 24 European countries. The number of initiatives is 146. It is not an exhaustive set of the initiatives that exist across Europe, but it is statistically sufficient to give some valuable insights. I will go directly to the insights.

One of the most important findings, for me personally unexpected, is that most of the initiatives are not open to multistakeholder involvement. This is very unfortunate. One of the most important things for developing the still undeveloped regulatory landscape for social media issues is multistakeholder involvement. Not only is there a lack of such involvement; most of the initiatives – around 70% – are only acting at national level and are not coordinated among themselves. This implies that there are a lot of isolated efforts to address the problem of disinformation. These efforts are not coordinated among themselves, which is neither as efficient nor as effective as it could be.

Another finding is that there is a large underrepresentation of certain types of stakeholders. For me personally, the most striking finding here is that only 4 out of the 146 initiatives include digital rights and civil activists. Only 4 initiatives include that type of stakeholder. One of the main challenges the initiatives report facing is insufficient public awareness of the problem of information disorder. On the other hand, to give a more positive evaluation, most of the effort they put into tackling information disorder goes into increasing public awareness. The objectives are well placed and clearly defined; however, there is insufficient public awareness among the general population, since this is very much an ongoing issue and it is not easy to keep track of all the latest events and problems that keep appearing.

We also analyzed the employment of technology and technical solutions used as part of fighting information disorder, and it is very, very low. Only around 25 to 28% use technology as part of their activities, and these are mainly the fact checkers. When I say initiatives – maybe I should have mentioned that – I mean every type of initiative that is concerned with the problem of information disorder. This is a very broad range of initiatives: we can mention public service media, fact checkers, media outlets in general, so a lot of different types of stakeholders are involved here.

Another interesting result is that although the initiatives’ objectives include transparency of the players in the field, they themselves are not implementing any safeguards for transparency of their own methodology, their financial schemes, or the extent of state involvement in their activities. It is unfortunate. It basically means that they’re undermining their own activity in a way by not practising what they preach. Here we can give some concrete recommendations as well.

A final finding that I will share, before concentrating on only two points to show how important they are, is that there is a huge mismatch of timelines between the practices of academia and of policy, so between theory and practice I would say, and I will expand on that a bit more.

One thing I would like to point out as a problem that needs immediate addressing is multistakeholder involvement and coordination between the initiatives. There was a talk today during the Plenary, I believe by Patrick, about the recommendations and the baseline. Here I presented the actual state, which is quite far off this baseline, mainly regarding multistakeholder inclusion. That, and the underrepresentation of civil rights activists, does not mean that the issues related to the activities are not addressed, but it practically means that those who are directly concerned by the issues are not discussing them with those that are involved in shaping policies regarding those issues. Here I would also mention the inclusion of children. For me in particular, it is an interesting phenomenon to observe that some information is considered to be misinformation or disinformation, but the younger audience does not see it as misinformation because they have adapted to the present tone across the social networks, which may be striking for the older audience. We have to be aligned also with the children. If I want to be provocative here, I will put forward the following statement: the effect of policy implementation on human rights is treated as much less interesting than how policies fit within companies’ business models. Namely, when shaping the policies, I think a lot of attention is given to how they fit within the business models of the stakeholders, but less to the human rights part. A bit of devil’s advocate here, you can confront me on that later. The main point: I cannot overemphasize how important it is to act in the direction of multistakeholderism.

The last point is that during the panel it was stated that science is at the forefront of fighting disinformation, especially regarding COVID‑19, and I would not agree completely with that; there is, as I said, a huge mismatch between practice and research. I will give one example. I usually imagine it like this: we have a problem with information disorder and it is causing huge, huge impacts on the world, the world ends because of it, and out of the ashes a researcher rises up and says, this is it, I managed to find the definition of disinformation. This is the mismatch that I’m talking about: the peer review process in academia allows a lot of time to pass before accepting research as valid, while policy sometimes requires, especially regarding disinformation and misinformation, an immediate response.

Thank you very much. This is what I had to say.

>> ABHILASH NAIR: Thank you, that was useful and informative. You set out a few issues there. What stood out to me is how underrepresented digital rights and civil rights groups are, and the lack of multistakeholder involvement – and what better platform can you think of than EuroDIG to discuss multistakeholder involvement. Yes. I’m sure we’ll come back to some of those issues in more detail when we take questions and get to the discussion stage of the meeting.

We’ll go straight to the next speaker, Nertil Berdufi, from Beder. Nertil will speak very broadly about how an earthquake in Albania set off a series of events and the government response to that, and I think he’ll pose fundamental questions: whether it is better to have no news at all rather than fake news, and the role of government in media control generally. I hope I summarized that briefly and nicely for you.

>> NERTIL BERDUFI: My idea is to explain how this case in Albania could be repeated across Europe and the world, how governments can use such things as an emergency situation for their own purposes, or, on the other side, how fake news and misinformation can influence people, cause public panic, and how everything can unfold.

In November 2019, in Albania, we had a huge earthquake, magnitude 6.4. After that, a lot of information started circulating on social media that something big could happen, and it was a girl, in fact, from a city near Tirana who was spreading this information. After that, all the people in Tirana and other cities went out; there was panic, everything was blocked, all the roads. That prompted the state to act: they traced it back to this girl and arrested her for causing public panic. On the other side, the girl was saying, I was just tweeting, just sharing a post that another portal had made. After that, this fake news spread all over the media, and the key point was when the Prime Minister went online and said to the public that all these media, all these social media portals that are posting fake news and misinformation to people, should be blocked, and that’s what happened. This portal was blocked by the government. What wasn’t blocked was its Facebook page, the Facebook account it has, because it is not within the government’s power to close this type of portal on Facebook.

After that, what followed was that the Prime Minister directly initiated a new law on audiovisual and media services, and in fact it was called the law on fake news. He said this law should be ready as fast as possible so that we can block all of this fake news and other things. On the other side, in all the political debates this was front and centre: that the law is unconstitutional and that it would somehow restrict freedom of expression and media freedom. That is what really happened with this development: the President was in conflict over the issue and sent it to a commission, and two days ago the commission said that this is, of course, unconstitutional within our constitution, but also that it breaches freedom of expression and media freedom, and that the government should somehow retreat from this part.

What I have put down is kind of a question: is it better to have no news at all rather than fake news? That is what is putting all this pressure on the issue, and as we’re also seeing in the pandemic we are in now, there is a lot of fake news going around the media and social media. Elsewhere in Europe we have had some problems as well, for example in Hungary, where laws and other measures have somehow affected freedom of expression on one side and media freedom on the other. This is a big question, in fact. I don’t know how the situation will develop in Albania and other countries. Should we leave the control of fake news to legislation, or could we find other solutions and somehow regulate this while respecting freedom of expression and freedom of the media? That’s my point in this part.

>> ABHILASH NAIR: Thank you. That was fascinating. Thank you for sticking to the time limits. Part of the idea of a EuroDIG session is to get as much audience participation as possible in the end. Very helpful indeed.

I think you made two fundamental points there, which we’ll come back to later on. One, on regulation, state regulation: what can overregulation do? Also, I think that raises a further conceptual question as to how you differentiate opinion on one hand from facts on the other. I’m sure we’ll come back to that in due course.

Well, the next speaker is the Head of News Partnerships at Facebook, Guido Bulow. You’re the only industry representative here on the panel – no pressure. We wanted to ask how you handle this and what social media companies can do to make the multistakeholder approach more workable, as Tanja pointed out in the first presentation.

Over to you.

>> GUIDO BULOW: Thank you for the invitation, the chance to, yeah, have that discussion with all of you.

When I talk about our approach to the integrity of our platforms – yeah, our platforms, it is not simply just Facebook anymore, it is Instagram and other apps – we have a three‑fold approach. On one hand, we remove content from our platform that violates our community standards. As a second step, we reduce the distribution of content that does not directly violate our standards but that offends the authenticity of the platform; we reduce its distribution and reach. And an important topic that is usually underrated and less discussed is that we want to empower people to decide for themselves on the content they see. If you want, call that news literacy, media literacy.

I don’t have to focus on the first topic, community standards; hopefully you’re aware of that. An example: we remove fake accounts from our platform; just last quarter we removed 1.7 billion fake accounts, which is by the way also transparent, it is in our transparency reports, which are publicly available and have just been re-released. We don’t remove misinformation. Misinformation actually falls into the second bucket, where we’re working together with 60 partners globally, in more than 555 languages – in Europe, 31 partners with 27 languages – and we’re constantly expanding the network of partners we work with.

When I talk about the fact checking programme for a second: we work with partners that are certified by the International Fact-Checking Network, so there is a global standard and everyone abides by the same principles and the same code of conduct, including transparency of funding sources and things like that. The goal of the programme obviously is to combat misinformation: to identify and address misinformation with false claims, content that can be harmful or that is financially motivated, things like that.

There is one exception which I put out front right away because it has been discussed a lot in the last few weeks. We have an exception for political speech, so content that has been directly created by politicians is not eligible for fact checking. We don’t do that because we want to help politicians; we feel that people should hear directly from their elected officials and that a private company should not interfere in that political discourse. That being said, especially since we’re still in the coronavirus crisis, in the last few months our fact checking partners have submitted 7,500 articles, which resulted in 50 million misinformation overlays that we put on top of identified misinformation, for which, by the way, we also reduced the distribution on the platform. 95% of the people then didn’t engage any more with those postings; they didn’t engage with them by commenting, sharing or liking them. That is exactly what we want to achieve.

Other than that, what we also do as part of the process is inform people retroactively. Just assume that you have shared something in the past and it is now getting fact checked: we will also inform people by sending notifications. We don’t just stop there with Facebook; we extended the programme to Instagram at the end of last year, so that a single fact check, let’s say of a video or an image where we’re able to detect duplicates, has a much greater impact than on a single platform alone. It is much harder, of course, to do that with articles. We’re making progress on detecting duplicates, especially with the use of technology, and our fact checking partners can see potentially similar articles and apply the rating given to one article to other articles as well, just to speed up the process of fact checking.

We also acknowledge that there are certain actors that share misinformation more often, and we enforce against that: if people are repeatedly sharing misinformation, we demote the page or technically demote the bad actors, so they don’t get distribution any more on the platform. We also revoke the right to use our advertising products or other products, so no one is able to earn money with or boost that content on our platform anymore. Apart from that, we take the signals we get from the fact checking programme into other areas as well. We have started the news publisher index, which is a registration where publishers can self-identify as news publishers on the platform, and the signal we get from the fact checking partners plays a role in that too. There are other products we could talk about at more length, but I know we have time constraints here. Quickly, I wanted to jump to the last part, media literacy. I personally feel it is very, very important for people to be able to decide on their own and spot misinformation on their own. That’s why we’re partnering with publishers to come up with initiatives, school projects where school children in Germany can learn more about what it feels like to be a journalist, what the challenges of being a journalist are, things like that, and we’re talking about doing news literacy campaigns on the platforms. Maybe you have seen our coronavirus information on top of News Feed, or when you search for coronavirus on Instagram you get more information from the WHO and other reliable sources. The same with WhatsApp, where we partnered with local health authorities. Yeah, I think that’s it in a nutshell. Like I said, given that we have only 5 minutes’ time, I want to hand it back over to you.

>> ABHILASH NAIR: Thank you. I’m sure there will be questions at the end. I already see a question in the chat window and we’ll pick that up later, essentially on the political speech exception you talked about earlier. I also have a view on that: what counts as being in the public interest – is it only politicians’ speech, or does other stakeholders’ speech also count under that public interest exception? We’ll come back to that later in the session.

Our next speaker is Liz Corbin, Deputy Media Director and Head of News for the European Broadcasting Union. She comes from public service media; on its website, universality, independence, excellence, diversity, accountability and innovation are listed as the core principles that guide it, and obviously you couldn’t think of a more polar opposite world than social media, where no such controls exist. She’ll speak a little bit about those aspects as well.

Over to you.

>> LIZ CORBIN: Thank you very much. I’m really, really pleased, just as our previous speaker said, to have this conversation today. It is absolutely crucial because the stakes are so very, very high. At the most basic level, this is about helping everyone, everywhere understand what is true and what isn’t, and in the current moment, the current situation we’re in, this means understanding a crisis which is having a profound effect on all of our lives. It has become abundantly clear that the public crave information they can trust, that they have had enough of fake news. Why is trusted content so important? Surely more information, more facts, more opinions is what’s needed, no filter; people can see everything that’s out there and make up their own minds about what is true. Of course that sounds incredibly attractive, and it is understandable that people would like that theory, but of course the reality is completely different. People aren’t stupid; they realize there are huge amounts of incorrect information out there. They’re wondering instead whether they should trust nothing they see any more, believe nothing at all – even the established public service broadcasters, maybe they shouldn’t trust them either. The problem is, when people believe nothing at all, as we know, they’ll actually believe anything. That is the future if we don’t act now. If this current crisis is not motivating us to act, frankly, I don’t know what will.

COVID‑19 is a global problem and, as Tanja said earlier, we must collaborate more. The virus doesn’t know borders, nor does fake news. There is no one vaccine that will purge this pandemic, and we have to act together in lots of different arenas, in lots of different ways.

At the European Broadcasting Union, of course, we have been doing just that. All of our members are public service broadcasters and they’re sharing content, advice and experiences of getting through the crisis. We brought fact checkers together to share crucial information with each other, so as not to duplicate work and to take advantage of sharing that information. We have shared world-class investigative journalism to reach as large an audience as possible. What’s important, in a world where there is so much incorrect information and fake news, is that we put a huge amount of effort into the real news content that’s out there.

It costs nothing to spread fake news; it costs a small fortune to counter it. What public service media does is proper journalism: it is regulated, and there is accountability attached to the important positions it holds in society. That’s absolutely right. Wouldn’t you now say that the social platforms hold an incredibly influential position in society? So where is their accountability to the public, where is the regulation? They’re no longer start‑ups, they’re not new anymore, they’re among the wealthiest companies in the world. Just now I heard from Guido, from Facebook, about the work they have been doing recently, and the other platforms have been making similar efforts, particularly during this crisis, and they told us about the millions of posts and accounts that have been removed and the efforts that have been made. But the fact is, we still see dangerous fake news online. I’m sure you see it in your feeds, and so you do wonder how effective these actions are. Unfortunately, there is no independent verification of how successful the initiatives are. The companies keep the data very private. You have to ask how much longer this is going to be okay.

In a communication that was published yesterday by the European Commission about tackling fake news and disinformation around COVID‑19, it said there needs to be more transparency and accountability from the platforms. Basically, the system we have at the moment allows them to mark their own homework, and that is just not the same as being accountable.

We know that fact checking is working, and it is playing an important role in this crisis. Facebook is doing good work in that area; they’re bringing down the biggest myths and the biggest lies. But it is only effective if the fact checkers can really reach the people that saw the fake news in the first place. What’s worrying is that, in that same European Commission communication yesterday, it said that platforms have not sufficiently empowered fact checkers during the crisis, for example by making more data available or giving prominence to the fact checks. The problem for us public service broadcasters is that we are usually the ones blamed for not reaching people with information – is it the platforms or the public service media?

As a public service journalist, I’m also duty bound to talk to you today about where I think public service media could be doing more, and there is one particular area I would like to highlight.

It is what actually made all the platforms so successful. When you open Facebook or Twitter, YouTube or Instagram, you see people like you. You see people who look like you, who think like you, who have similar life experiences to you. You feel comfortable: these are your people. These are people you can trust. These are also people who share fake news, but because your guard is down, you share it onwards. Despite laudable progress in this area, a diverse world is not fully reflected in the content produced by our public service media. This is well recognized. We must move faster. Our mandate will shrink with every day we fail to represent all of our audiences. To survive, we must be trusted; to be trusted, we have to be authentic; and we can only be authentic if we really live and represent the experiences of the public we serve.

In summary, I think everybody here today will agree that high-quality trusted news is the best antidote to fake news. That means we need more reliable funding for public service journalism and for political leaders to protect it as a core pillar of democracy. It means we need freedom to do our journalism without hindrance – COVID-19 should not be an excuse for governments bent on restricting freedom of expression, and protests in the U.S. should not mean that journalists are fair game to be targeted by law enforcement. And it means there is a requirement for major technology platforms to properly prioritize real news content and to be more accountable for the influence that they have.

Thank you.

>> SABRINA VORBAU: Thank you very much for sharing your thoughts and comments with us. I’m pretty sure the other panelists would also like to make some comments. I also see that people are starting to post some questions in the chat. We’ll definitely come back to them later on.

Now I would like to give the floor to Paolo Cesarini from the European Commission. We have definitely seen that the European Commission has been doing a lot of work around this topic, especially during the COVID-19 crisis. There were also a lot of conversations that took place between the European Commission and social media companies, and these conversations were also accessible to the public. Just last week the Commission also launched the public consultation on the Digital Services Act package. This is to strengthen the protection of fundamental rights such as freedom of expression while protecting citizens against harmful content. I’m delighted to have Paolo on the panel with us to talk more about the work that the Commission is doing in this regard.

Over to you.

>> PAOLO CESARINI: Thank you. Good afternoon to everybody.

Can you hear me? Is it okay? Very good.

>> ABHILASH NAIR: Yes. Thank you.

>> PAOLO CESARINI: Let me start by recalling that the Commission has been working hard on this complex topic for more than two years now. Our approach has been set out in different documents, including the Action Plan against Disinformation from December 2018. It is important to stress that the underlying approach is always the same: it has remained very much anchored in the basic principles of fundamental rights – freedom of expression, media pluralism and media freedom – while at the same time recognizing the need to act upon behaviors and conducts which take place online, particularly on social media platforms, and which in fact undermine these fundamental rights, because disinformation is a means to undermine the right of everybody to receive and impart information, to receive reliable information. So that’s the starting point, which has translated into a number of actions that reflect a whole-of-government, whole-of-society engagement to tackle this complex phenomenon.

The COVID-19 crisis has just underlined the need for this action to move forward. It is an excellent test case to see to what extent the work done so far has produced results. In my view, and as the Commission noted yesterday in its communication, the work done so far has had some good results, but these results are certainly not sufficient.

COVID-19 has also highlighted how complex the phenomenon is. What we have been witnessing during these months is a phenomenal mix: the intentional spread of dangerous conspiracy theories alongside an overload of information that in itself creates confusion among audiences. We have seen the unintentional spread of information which is inaccurate but nevertheless perceived by people as accurate, and the result is behaviors that undermine the containment policies that various governments put in place in order to control the spread. We have seen conspiracy theories shifting from the online to the offline world – one example being the stories claiming that the 5G deployment has been one of the causes of the spread of the virus, stories that have incited people to take concrete actions in the real world, with attacks on network infrastructure and actions against telecom employees in various Member States. We have seen other worrying things, such as a resurgence of hate speech, particularly targeting certain ethnic groups as supposedly responsible for the spread of the virus. We have seen consumer scams, with false products sold using information that was clearly false, and we have seen cybercrime – hacking and phishing using COVID-19 as a way to spread malware.

Most importantly, we have seen foreign actors in certain countries, like Russia and China, exploiting these circumstances to deliberately spread a number of stories aimed at undermining the credibility of the actions taken by European governments to control the crisis and, on the other side, at improving their own image worldwide.

So this is to say that when we talk about disinformation – or information disorder, as someone has just recalled this terminology – we probably need to keep in mind that we need to distinguish between the different forms of misleading content that have turned into an epidemic during this period, taking into account that some may be illegal and others not, and that in certain cases there is no intention to cause harm, while in other cases the intention is clearly demonstrated by the tools and means that are put in place when this information is spread on digital media and platforms.

We need to have a calibrated response that takes into account the harm, the intent, the form of dissemination, the actors involved, and their origin. For instance, if we talk about disinformation, which you could define as a form of intentional dissemination of false or misleading information – (audio issue).

The intention of the information, the campaign – (audio issue).

>> SABRINA VORBAU: I believe we’re losing you a little bit, Paolo.

>> PAOLO CESARINI: I have some connection problems apparently. Is it better now?

Is it better now?

>> ABHILASH NAIR: Yes. Now better. Before you were breaking up.

>> PAOLO CESARINI: Unfortunately, it is quite unstable. I see warnings appearing on my screen and I can’t do much about it; I cannot control the wi-fi. Hopefully –

>> ABHILASH NAIR: We can hear you now.

>> PAOLO CESARINI: So this is to say that this complex phenomenon has been reflected upon in the communication that was adopted yesterday, where there are a number of actions that will be carried out in parallel over the next weeks and months. They include the strengthening of strategic communication inside and outside of the E.U.; they include mechanisms for Member States to better cooperate in terms of exchange of information, of situational analysis and of threat analysis between themselves; and they include also better cooperation with international partners like NATO and the G7. And, of course, a very important part of this communication is about the responsibilities of platforms, about the importance of ensuring freedom of expression and a pluralistic, democratic debate, about raising citizens’ awareness, and about reporting, fact checking and research activities. In particular, as the topic today is very much focused on platforms, I would say that during the crisis platforms have reacted, listening to the concerns that the Commission has taken to them. They have taken action particularly in terms of raising the visibility of authoritative sources, including the WHO, public health authorities and the media, on their own services; they have been demoting content that was fact checked as false or misleading; and in extreme cases, where the content would be against the terms of service of the platforms, they have been removing content which was clearly harming public health or, in certain cases, public security – the attacks on the 5G infrastructure being an example. This is very good. Certainly, an important lesson one can learn is that, when there are conditions whereby the platforms are faced with their own social responsibility, they take action.

They take action when they feel the heat of public scrutiny. (Audio issue) – a monitoring programme very much focused on the ways in which authoritative sources and services are given visibility, very much focused on the cooperation that platforms have established with fact checkers and on how they fact check content, very much focused on the types of manipulative behavior detected on their own platforms – which includes, of course, the fight against fake accounts and other things like fake engagement or inauthentic behavior, one of the methods used to influence our domestic debate in Europe. We also focus on better understanding the revenue flows that come from advertising and sometimes go to the wrong places instead of contributing to the financing of authoritative, professional media. We will carry out that programme in the next months on essentially the same model that was applied during the European elections.

We would also like to extend this dialogue to other platforms. We have seen other platforms, like TikTok for instance, surge in importance and user base in the last months. We believe they should be a part of this conversation.

Finally, a final couple of points. All these efforts will be doomed if at the same time we do not put in place an adequate system to support professional media through this period of crisis – the dramatic fall in advertising revenues is one of the causes – and we need to provide structures for the long-term sustainability of the media sector.

Secondly, we really need to watch out for governmental actions that, using the COVID crisis, take measures to restrict freedom of expression. We have seen an example before our own eyes, probably within the internal borders of the E.U. Thirdly, we have to have a more serious structure in place to organize fact checking and research activities around disinformation, in a way that all the languages of the E.U. and all the countries within the E.U. can benefit from the same type of approach that fact checkers have been implementing, experimenting with and developing during the last months. We need an appropriate structure, and this structure has now come to light: on the first of June, the European Digital Media Observatory was officially launched, and in the next months it will start to create a true network of fact checkers and researchers across Europe to carry out dedicated and thematic research that will enable better detection, better analysis and better exposure of disinformation threats and trends. That work will not be an end in itself; it should feed into the work of the professional media, which could find there a source of information to increase the accuracy of their own reporting. It will also be a source of important educational materials that the media literacy community – the media literacy practitioners – could use in order to – (audio issue).

>> SABRINA VORBAU: We lost you again. Sorry to interrupt you – to stay within our time, I see there are lots of questions.

>> PAOLO CESARINI: I have just finished.

>> SABRINA VORBAU: Perfect. Thank you so much. Thank you so much for your contribution.

I see so many questions coming into the chat. We just want to make sure there is enough time for everyone on the call. Thank you so much. I think you made very, very interesting points, and it is great to see how intensively the Commission is working on this. For example, you mentioned that this is not only about disinformation but also about other issues such as hate speech, which nicely links to our final key participant on the panel, Charlotte Altenhoner-Dion from the Council of Europe. I’m delighted to hand over to you now to hear a bit more about the work the Council of Europe is doing – in one of the panel sessions this morning it was described as the watchdog of human rights. We’re very delighted to have you with us and to hear where you see the responsibilities of the Council of Europe in this regard.

Please, over to you.

>> (Audio quality too poor to transcribe).

>> SABRINA VORBAU: Excuse me. I’m not sure if it is only me, but we can’t hear you very clearly. Maybe you can try again. I’m afraid the connection is not great. I don’t know if it is just on my end or for other colleagues –

>> ABHILASH NAIR: No, I cannot hear either. It may be on her end. You may have to turn off your video and start talking to improve the quality.

>> SABRINA VORBAU: I’m really sorry.

>> CHARLOTTE ALTENHONER-DION: (Audio quality too poor to transcribe).

>> SABRINA VORBAU: This is the downside of everything being online. Because we obviously don’t want to lose much time, maybe you can try to sort out your audio and we can come back to you in a moment. As I said, there have been a lot of conversations and questions.

>> CHARLOTTE ALTENHONER-DION: I’m so sorry.

>> SABRINA VORBAU: Don’t worry. That’s fine.

Maybe in the meantime we can have a look at the chat. There have indeed been a lot of questions and interactions, and I’m pretty sure some colleagues on the panel would also like to make some comments based on the statements that were given by other colleagues; maybe in the meantime we can get back to Charlotte Altenhoner-Dion. To everyone on the call, feel free to use the chat function or raise your hand if you have any questions.

There was immediate feedback from a few colleagues on the call towards Guido in regard to the fact checking of content that politicians share on social media platforms. Maybe you can elaborate on this. People in the chat were wondering whether content that politicians post on social media that is harmful or misleading should be fact checked. Maybe you can comment on this question. Afterwards, we’ll try with Charlotte Altenhoner-Dion again.

>> GUIDO BULOW: Sure. Thank you for the opportunity to answer that question.

What I tried to say is that, in a democracy, we believe that people should decide what is credible, not technology companies. We didn’t want to interfere in the political discourse between politicians and the people who elected them. That being said, we also know that political speech is highly scrutinized. And that being said, politicians aren’t able to say whatever they want to say. If anything violates our Community Standards, it will be removed from our platform. If politicians share misinformation that could cause harm to people – and we have seen cases of that in Brazil, for example, during the coronavirus crisis – we remove the content from the platform, regardless of whether it comes from a politician or, I don’t know, from myself; we don’t make any distinction here. The rules are the same for everyone on the platform: these are our Community Standards. This is on top of content that could incite real-world harm. The only thing that is not eligible to be fact checked is political speech. That’s the only exception on our platform.

>> SABRINA VORBAU: Thank you so much, Guido, for elaborating on the question.

Maybe we can check one more time with Charlotte. It looks like she changed rooms in the meantime. We can try again with you; it would obviously be great to hear your views as well.

Maybe she’s trying to reconnect.

I also see someone from the audience who has raised their hand. I would kindly ask that we unmute Mike Harris so he can pose his question.

Mike, the floor is yours. Please tell us where you’re from and from where you are calling today.

>> MIKE HARRIS: Founder of XNMA, a Berlin-based firm.

When talking about this, we rarely get off the subject of what companies should and shouldn’t be doing. While all of these points are valid, they distract from the real problem, which is that social media platforms, as operators of governance, are governing details of our lives with the broadest of strokes. Most rules on social media platforms enact civil liberties which should be entirely owned by society. There are many reasons why the web has shaped itself the way it has; the only objective one is the network effect, which is the primary driver of the success of many of the big tech firms. I’m not suggesting the end of Facebook or Twitter, or breaking up big tech – I’m saying there is no point in doing that. The network effect is a law; it can’t be avoided. Let’s shape the governance of the platforms into something that fits with that and is equitable to us as societies.

Solving the problems that have been discussed here today requires us to acknowledge that networks of people become coercive monopolies. If we can do that, we stand a chance of reshaping the web into an advanced, diverse, competitive network. My view is that Facebook just shouldn’t have the right to say what they are or are not going to do about political speech.

It is just not their decision.

Thank you.

>> GUIDO BULOW: You’re absolutely right. Thank you for bringing that up. It is a good segue to something we have set up over the last one and a half to two years, called the Oversight Board. We know we have to make so many important decisions on free expression, and while we have always taken advice from experts, we feel we can do much better, which is the reason why we have set up an Oversight Board. In the beginning, yes, we initially picked a few people, and then the board itself picked people from all over the world, and it is still growing. At the moment we have 20 people from all over the world – there is a former Prime Minister of Denmark, for example, there are people from civil society organizations, and many more. They operate absolutely independently. They review certain content decisions, and – I wouldn’t say they set the rules, but they will decide in the end. It is something that we will actually follow: we will apply what they decide. The board is growing to up to 40 people from all over the world, which will hopefully represent the more than 3 billion people that are on our different platforms, and review decisions that at the moment we’re making on our own. As I said at the beginning, we feel very, very uncomfortable as a private company making these decisions on our own. One of the things that Nick Clegg, who is in charge of communications and policy globally, has repeatedly argued is that we need smart regulation in certain areas. We shouldn’t set the rules for when to label, for example, political advertising on our platform. We came up with that because we wanted to provide transparency, but in the end it should be governments that tell us how to do that.

I mean, we have certain standards, but ideally that comes from other people. I totally echo your point: we shouldn’t have so much power, and this is one of the reasons why we have created this Oversight Board.

>> SABRINA VORBAU: Thank you so much, Guido, for mentioning the Oversight Board. I’m pretty sure a lot of people on today’s call have taken note of this.

I just received a message from Charlotte; she is back with us. I would like to try one more time to hear her properly and to get her input on behalf of the Council of Europe.

>> (Audio quality too poor for transcription).

>> SABRINA VORBAU: One more time. It is still a bit shaky I’m afraid.

>> CHARLOTTE ALTENHONER-DION: I’m really sorry. This is –

>> SABRINA VORBAU: This is better actually.

>> CHARLOTTE ALTENHONER-DION: [indiscernible].

I will just go slightly [indiscernible]. (Audio quality too poor for transcription).

I wanted to make a couple of points – (audio coming in and out) – and the point about multistakeholderism and the multistakeholder approach. I think we need to do more of that and need to develop it into a real coordinated network. None of us alone is able to really approach or address the problem that we have with disinformation and the broader issues of content regulation. (Audio quality too poor for transcription).

Public interest activity that can help them in taking the right action and the population, while very good, it is problematic (audio quality too poor).

>> SABRINA VORBAU: I’m afraid we’re losing you again.

>> CHARLOTTE ALTENHONER-DION: (Audio quality too poor for transcription).

>> SABRINA VORBAU: Maybe you can use the chat function. I’m very sorry for this. We’ll hand over to Abhilash Nair now, who has been monitoring the chat, to see if we have any further comments or questions. You are also very warmly invited to raise your hand if you have any questions and we’ll unmute you.

>> ABHILASH NAIR: There are quite a few questions here.

A question for Liz: how can governments fund and protect authentic journalism without funding and protecting fake journalists camouflaging as authentic?

>> LIZ CORBIN: I saw that question. Thank you very much. It is good to have the opportunity to answer it.

Look, public service media is structured, well established and regulated, and it is very transparent in the way that it is organized, funded and supported.

So supporting public service media organizations is the best way to protect and support public service journalism.

Of course, as public service journalists we want to become a more diverse group; we want to reach more people, more audiences than we currently reach. We do need to expand, and that should be done under the umbrella of the organizations that already exist.

Yes, governments should protect their public service media organizations, and that’s what I meant by that point.

>> ABHILASH NAIR: Thank you. It certainly does.

A couple more questions. There’s a question that I suppose is best placed for the regulators to respond to – perhaps Paolo, if you can: what about dark social platforms? They’re becoming more and more important for the shaping of public opinion, especially in the current situation, and there’s a lot of misinformation being spread through these platforms. Are there any ideas about what we can do about it? The person asking the question adds that, as far as they know, from the German perspective these services are not regulated by law.

>> PAOLO CESARINI: Thank you for this question actually.

We are very well aware of the problem represented by the currently limited membership of the Code of Practice. We’ll start to expand it, we’ll start to invite new actors already now – Telegram is one of those platforms that is increasing in terms of user base – and I would say that during the COVID crisis the two most used platforms which have shown some problematic features have been WhatsApp and TikTok.

I think we need to progress step by step and start to tackle those platforms which are currently outside of the framework, and to engage them in this structure of cooperation one by one.

By the way, I would also like to underline that at the beginning of July the Commission will also publish its evaluation of the Code. Beyond the communication from yesterday, there will be a document – not a communication – drawing some conclusions about the work, the functioning and the effectiveness of the Code, and the next steps will be not only the DSA, the Digital Services Act that was mentioned by Sabrina, but I would say also the European Democracy Action Plan, which will be put forward this year. That’s why I think we need a serious reflection on how to construct a regulatory backstop that creates more uniformity and more mechanisms, coupled with appropriate oversight mechanisms and, in case of need, sanctions.

That is one point I wanted to make – I could have made it even before the question.

The other point is not in response to the question, but I feel the urge to react to it: it is about the fact checking of political speech. I know, and I take the point made by the Facebook representative, that this is delicate ground. I still fail to understand why, if political speech is promoted and paid for as political advertising, a platform should take money to give visibility and prominence to content that has been fact checked as false, irrespective of the author of the content. Paid-for content is a way of overtaking the visibility of other sources of information through the use of money, and the use of money should be subject to scrutiny, even if it comes from politicians. That’s a reflection I think we need to carry forward.

Thank you.

>> ABHILASH NAIR: Thank you. That’s a good question. I don’t know if Guido wants to quickly respond to that in 30 seconds, if you can.

Before you do, I would like to raise a related question from the chat, so perhaps you could take another 15 seconds – 45 seconds in total – to answer: why do Facebook and other social media companies prefer to work with fact checkers instead of working directly with professional media that do this work daily as part of their mandate and mission?

Over to you.

>> GUIDO BULOW: I just have to close my door, the mailman is at my door. One second.

>> (Chuckle).

>> PAOLO CESARINI: While Guido is coming back, I would say that the same fact checkers are often in-house departments of mainstream media; let’s not think of fact checkers as isolated or detached from mainstream media.

>> ABHILASH NAIR: Thank you, Paolo for chiming in already.

>> GUIDO BULOW: We have several examples from France, and others even in Germany, of media organizations doing fact checking in our fact checking programme.

Nevertheless, it is a good question: why don’t we work directly with journalists and other media outlets? The reason for that is that the IFCN, the International Fact-Checking Network, provides a global standard. We’re a global company, we’re trying to operate this programme on a global level and to maintain the same quality across every single country of the world, in order to defend – yeah, also the programme.

Of course, there will be parties and people that say, well, this organization is biased, why are they fact checking my content? I don’t want to dwell too much on this. I mean, we see the situation in the U.S. where, on one hand, you see people favoring the New York Times, the Wall Street Journal, CNN, and on the other side you have Fox News and others. If you don’t have one single standard, the perception of bias is definitely not helpful for increasing trust in the programme, and with an independent organization like the International Fact-Checking Network, you always have a body that people can reach out to in case they don’t trust any of the fact checkers, and where everything is super transparent. Take an organization like DPA, the German news agency: you see that the application to the IFCN, with every single step, is publicly available, and if anyone has ever dealt with the organization, you can reach out and you will see a public process for the inquiry that you asked for.

We’re trying to be as transparent as possible on a global level.

While I totally acknowledge that in most countries of Europe we have certain standards, like in Germany and others, there may be some countries where we don’t have that, and in order to have a functioning system with fact checkers, we need to have this global standard. That’s the reason for the standard and why we’re working just with a specific set of partners, which by the way is growing steadily over time.

I almost forgot about the first question – give me a quick hint.

What was the first question again?

>> ABHILASH NAIR: I’m actually trying to remember what the first question was.

It was building on from – it was essentially Paolo’s question.

>> PAOLO CESARINI: If you want, I can repeat. My question is (poor audio quality) – about political speech: how can this be justified when we talk about paid-for content? My point was that I don’t think scrutiny should stop with ads from politicians. If a statement has been fact checked and is false or misleading, you can still put it on the platforms, but why give it extra visibility and make it an object of money – why give a prize to statements that are not reliable?

>> ABHILASH NAIR: 1 minute please.

>> GUIDO BULOW: I see your point. It is not an easy answer. You know, we’re discussing that heavily, both internally and externally.

There are certain limitations.

Even when politicians or political parties run advertisements – which are not political speech per se – we actually take those advertisements down, as we have done in the U.S. with Trump, for example, where he was running advertisements about the census.

That said, if it is political speech it is not eligible to be fact checked, and political parties and politicians are able to use advertising for it. Again, the reason is that we don’t want to interfere in political discourse. We don’t want to be the ones that stand between the politicians and the people that elected these officials.

>> SABRINA VORBAU: Thank you very much, Guido.

I’m afraid we’re coming closer to the end of our session today. I know there are many more questions and comments in the chat. Just so everybody is aware, you can continue this conversation on the EuroDIG forum over the next days, even once the session is over. I encourage all participants to do so.

As mentioned before, I will now hand over to Katarina Andjelkovic from the Geneva Internet Platform to summarize the key messages and main takeaways from our discussion today.

The floor is yours.

>> KATARINA ANDJELKOVIC:

I hope you can hear me. I’m Katarina Andjelkovic and I am a rapporteur from the Geneva Internet Platform.

I have collected a couple of messages. Now I’ll turn to the first one. I hope I can see it on the slide.

I can’t see it on the slide. I can read it anyway.

So my first message is that multistakeholder involvement – meaning the involvement of all those directly concerned by the issue of misinformation – is of utmost importance in fighting misinformation, and there is a need for a more serious structure in place to organize fact checking and research activities that will be available in all E.U. languages and would therefore benefit all countries. My second message is that high-quality trusted news is the best antidote to fake news; to achieve that, there is a need for reliable funding for public service journalism on the one hand and the protection of the freedom of the press by national authorities on the other. Crises such as COVID-19 should not be an excuse for governments bent on restricting freedom of expression.

The third message is that media literacy is crucial in fighting disinformation; it is important to educate and empower people to spot misinformation and decide on their own whom to trust.

The final message has only just been drafted and is therefore not yet polished: in order to regulate all platforms in a uniform manner, there is a need for a more serious reflection on how to construct a regulatory backstop that creates more uniformity and more instruments, with appropriate oversight mechanisms and, in case of need, sanctions.

I would just like to remind you that all these messages are not final; they will be subject to your comments and questions, and EuroDIG will provide more detail on that.

Thank you very much.

>> SABRINA VORBAU: Thank you very much.

We have 2 more minutes before we have to close.

I would like to give our key participants the opportunity to give a final statement, a main takeaway, in just 10 seconds each. We’ll start from the beginning: we heard first from Tanja – maybe your main takeaway from today’s discussion, please? In 10 seconds if possible!

>> TANJA PAVLESKA: I will just repeat that I see multistakeholder involvement as key – not only speaking of multistakeholder issues, but the involvement of the stakeholders themselves, and showing understanding for the other stakeholders’ problems: not just expressing their own problems, but showing understanding for the other stakeholders within the same issue.

>> SABRINA VORBAU: Thank you very much.

>> NERTIL BERDUFFI: I hope that during this discussion we learned a lot, and I would also point to freedom of expression and freedom of the media as a key point, I think. I will end with this.

>> SABRINA VORBAU: Thank you very much. Thank you.

>> GUIDO BULOW: In addition to what’s been said so far, I think the problem of misinformation has been around for as long as people have lived on this planet and will probably live on even longer. We need to collectively look at misinformation and make our best efforts to help people get, first of all, access to credible resources, and to give them the tools so that they themselves can decide what is credible information. Whatever the technology, whatever we invent, we’ll never get rid of misinformation.

>> SABRINA VORBAU: Thank you very much.

>> LIZ CORBIN: In order to tackle fake news, we’ll need to collaborate a lot, lot more between journalists and platforms, with regulators, governments, and other parts of society.

Journalists need to know in realtime from the platforms what is happening and the scale of what’s happening. We can’t tackle what we can’t measure.

>> SABRINA VORBAU: Thank you very much.

Charlotte, if you’re still with us, please write your thoughts in the chat.

>> PAOLO CESARINI: Yes. For my side, I would complete what Liz has just said. Regulation will not make fake news disappear. The dialogue between all actors, from civil society to the media, will remain essential in the future, irrespective of the regulatory choices that will be made in the next months.

>> SABRINA VORBAU: Thank you.

A multistakeholder approach is key, to echo what you have said, and this is what we tried to do with today’s discussion. I’m delighted that we managed to get such a diverse panel, with experts representing different stakeholder groups: academia, industry, and also policymakers. It was a real pleasure to talk with all of you.

I hope you enjoyed this.

As I said, you can still use the EuroDIG forum over the next days to continue this discussion, which obviously won’t stop here. Thank you very much for your time and for sharing your expertise with us. I do hope that we’ll continue this discussion very soon. Apologies for all the technical difficulties – this is the life we live in! I hope to see many of you soon in person to continue this dialogue.

I’ll hand over to Roberto in the studio. Thank you very much once again for everybody joining us today.

>> ROBERTO GAETENO: Thank you.

The only thing I would like to say is thank you to the participants, all the speakers, the moderators, everybody who participated in making this a great session, and I’ll stop here. We’re already a couple of minutes late.

Back to the main studio.

Bye, all.