Online political advertising and disinformation – gathering evidence, shaping regulation – BigStage 2020


12 June 2020 | 13:00-14:30 | Studio The Hague | Video recording | Transcript | Forum
BigStage 2020 overview

Session teaser

In this session, Dr Meike Isenberg from the German Media Authorities will talk about important research she led on how online platforms have been handling disinformation, political advertising and election communications. She’ll tell us how regulators have gone about this work: the methodology used for the analysis, the skills required, collaboration with platforms and with other regulators, and potential challenges along the road; and how platforms have responded to the self-regulatory challenge of the EU’s Code on Disinformation. And perhaps most importantly, we’ll hear from Dr Isenberg what conclusions can be drawn about the current self-regulatory approach, and discuss how these can be translated into policy actions that avoid threats to freedom of speech.

Further reading

ERGA reports on Disinformation Code:

Independent consultants (VVA) report on Disinformation Code for the European Commission:

Statements from the European Commission on 10 June 2020 on Disinformation:

People

Presenter/Key participants:

  • Maria Donde (interviewer)
    Maria works for the UK Office of Communications (Ofcom) as the Head of its International Content Policy, covering media, content and broadcasting issues. She leads on Ofcom’s engagement with other European media regulators, most particularly through the European Platform of Regulatory Authorities (EPRA), where she is currently a Vice-Chair, as well as with international bodies, and represents Ofcom on the full range of media policy questions, in particular media plurality and media literacy.
  • Dr. Meike Isenberg (main speaker)
    Meike works for the Media Authority of North Rhine-Westphalia, which is responsible for the protection of human dignity, minors, media users and private media plurality. In the Media Policy and Media Economics Group, she heads the research activities and deals mainly with questions of disinformation, online political advertising, hate speech and international law enforcement. Recently she coordinated the German contribution to the monitoring carried out by the European Regulators Group for Audiovisual Media Services (ERGA) to assist the European Commission in evaluating the effectiveness of the Code of Practice on Disinformation, signed by the major online platforms Facebook, Google and Twitter.

Video record

https://youtu.be/XvCciO9lYX0?t=13493

Transcript

Provided by: Caption First, Inc., P.O. Box 3066, Monument, CO 80132, Phone: +001-719-481-9835, www.captionfirst.com


This text, document, or file is based on live transcription. Communication Access Realtime Translation (CART), captioning, and/or live transcription are provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings. This text, document, or file is not to be distributed or used in any way that may violate copyright law.


>> SANDRA HOFERICHTER: We will start immediately with the first BigStage, from the United Kingdom, submitted by Claire Local. We’ll have two presenters: the first is Maria Donde and the second is Dr. Meike Isenberg.

>> Hello to everyone attending the EuroDIG conference and this session on online political advertising and disinformation: gathering evidence, shaping regulation. Our topic is disinformation, which is at the top of the public agenda around COVID‑19. What we want to give you is insight into the research on online political advertising and disinformation, and the efforts made to draw key lessons for public policy.

I’m joined by Dr. Meike Isenberg, who works for the Media Authority of North Rhine-Westphalia in Germany. In the Media Policy and Media Economics Group she heads the research activities and deals mainly with questions of disinformation, online political advertising, hate speech and international law enforcement, which brings us to a key development we’ll discuss today. Recently she was responsible for coordinating the German contribution to the monitoring carried out by the European Regulators Group for Audiovisual Media Services (ERGA) to assist the European Commission in the evaluation of the effectiveness of the voluntary code on disinformation. This code, for those not familiar with it, is a voluntary commitment made by online platforms including Google, Facebook and Twitter, in the context of ensuring the integrity of elections, set up in 2018 ahead of the 2019 European elections. The code covers five areas: scrutiny of ad placements, aimed at reducing revenue streams for disinformation; political advertising and issue‑based advertising, covering transparency, sponsor identity and amounts spent; integrity of services, covering fake accounts and bots; empowering consumers, including supporting critical media literacy; and empowering the research community.

We published two reports assessing the application of the code, one in July of last year and one in May of this year. Our work both motivated and coincided with national-level inquiries by regulatory authorities. The work of the German media authorities is a leading example of this type of national inquiry; it should be seen independently of the work on the E.U. disinformation code, and it is one of the most visible efforts in this area. It gives us an example of how research is conducted to provide critical evidence for developing recommendations and solutions. That’s why we have invited her today to give a flavor of the German media authorities’ work on these topics, both at the E.U. level and in the independent German context.

Doctor, welcome.

Turning first to the publication of the latest report in Germany, which I believe happened this week: my first question is, how did you come to work on this topic and to conduct the research for this report?

>> Yes. First of all, thank you very much for your kind introduction. It is good to see you here. Coming back to the question: actually, we were already thinking about these questions ourselves. Setting aside for one moment the work on the E.U.-level code, in the run‑up to the European elections we conducted, in cooperation with three other national media authorities, a study on political micro-targeting. There we found that German political parties are only just beginning to experiment with targeted advertising, and that there is currently no need for further regulation in the field of political communication in the form of advertising that is labeled and clearly presented as such.

We also found that there is a discrepancy between the advertising reach on the one hand and the organic reach on the other. It is possible for parties and other actors to communicate their messages without paying for advertising; organic political content has the widest reach, but it eludes the transparency measures established by the platforms.

>> Can you give me an example?

>> Of course. It is evident that although the AfD, a German right-wing party, is the party with the least advertising reach, it is the party with the furthest organic reach of all. According to the calculations of our researchers, only a maximum of 39,000 people interacted with their advertisements, while there were 2.6 million interactions on the AfD’s Facebook pages, ten times more than for any other party. The same applies to YouTube, by the way.

Since other studies also pointed to similar results and indicated that such a reach may be based in part on non‑authentic user behavior, for example by using fake accounts and social bots, which distorts political online discourse, we decided to get a picture of this phenomenon in order to clarify whether or not there was a need for action by media regulation.

>> I think this is where the work with the other media regulators comes in?

>> Right. This is where the ERGA monitoring became concrete. You already mentioned the action plan, including an assessment of the Code of Practice on Disinformation. It was proposed that this monitoring be done by the European Regulators Group for Audiovisual Media Services, and therefore in practice by the regulatory authorities at national level.

The following Directors Conference of the German State Media Authorities strongly supported this approach of the European Commission. Based on our previous experience, it was decided to contribute to the monitoring activities in the second phase in autumn 2019, and the operational implementation was approved under the overall coordination of our State Media Authority.

In summary, we had two goals with our monitoring. On the one hand, we wanted to make a significant contribution to the decision-making basis of the European Commission for the assessment of further regulatory needs in the fight against disinformation. On the other hand, the cross-platform and cross-party view is of high relevance: for further action in this complex field of disinformation, we need an overview of what actually constitutes a problem in order to develop adequate approaches to action.

>> What was the research process like? Can you tell us about what data you collected, who you worked with, and so on?

>> Yeah. We decided to do it as a joint project between a few media authorities, and we designed the work programme so that the contributions could stand alone among the media authorities and be merged into an overall report. The concrete work programme was derived, as you can imagine, from the Code of Practice. We decided to carry out a two‑stage monitoring process, levels A and B, across the five pillars mentioned in the intro, to check whether the platforms comply with their commitments.

In the level A monitoring, four media authorities evaluated the assessment reports of the online platforms. Interviews with fact checkers and researchers at national level were also included.

In the level B monitoring, professors from the George Washington University in Washington DC worked on our behalf to concretize the results by analyzing data and information they had collected themselves. This is an important point: the inclusion of data beyond the archives presents a valuable extension of the database. It is of particular importance for the German media authorities, as independent institutions, to obtain insights that go beyond the curated data previously made available only by the platforms.

>> Let me just get this right. What you’re saying is that without the additional research you carried out with the external experts, you don’t think you could have conducted the right analysis?

>> Thank you. This brings us to the key result, which is not very pleasant but is clear. First, the data material we received to answer the questions was totally insufficient. For example, there is no access to the full population of advertising to check whether political advertising is correctly labeled as such, and we need data that can be broken down to Member State level.

Second, we were not able to verify the data, because we have no power to obtain data ourselves. This is increasingly difficult to justify given that the platforms offer their commercial partners much more detailed data than they make available to researchers or regulators. To put it in a nutshell, I think this is where adjustment is needed.

>> Understood. I think many of us regulators would report similar findings in that regard, and we’ll see how platforms and regulators respond to that in the coming months.

Moving on, I think to the central part of your research, what were the key findings, what did you find?

>> Yes. We gained many detailed insights from the analysis. The key findings can be summarized in three statements. First, as already mentioned, the data provided by the platforms is insufficient and its validity cannot be verified. This means that the platforms must significantly improve data access.

Second, the concept of political advertising needs to be defined. The problem does not exist in the case of labeled advertisements, but it does exist where it is not clear and transparent that a party is advertising. We need a definition of political advertising which is appropriate to the phenomenon. This is a task for the legislature, not for industry.

Third, the analysis showed that the shaping of the democratic media sphere cannot be left to industry; it is a task for society, and therefore for the legislature.

>> On the basis of these findings, what conclusions did you reach?

>> To sum it up, the self-regulation established with the Code of Practice on Disinformation neither represents an effective tool against politically motivated disinformation, nor does it provide a valid basis of information or data to measure its effectiveness and compliance. This is, if you ask me, unacceptable, because politically motivated disinformation is a potential danger for a democratic society. In particular, this applies to so‑called coordinated inauthentic behavior aimed at disseminating politically motivated disinformation by using fake accounts and artificially generated reach. At the very least, the existence of this danger is confirmed by the current analysis and the numerous discussions with stakeholders.

So from our point of view, it is essential to take action, and given the need for effective, independent media supervision, we think the responsibility should lie with the national regulators.

>> Fascinating conclusions. In that context, and I’m sure this is interesting for this audience: how do the recommendations deal with the question of freedom of expression?

>> Yeah. You raise a very important point, thank you for that. The right of freedom of expression is a high democratic value, and much of what we discuss under the term of disinformation falls under freedom of expression. For media regulation, the issue of disinformation is highly complex: it is about safeguarding the right of freedom of expression and at the same time preventing the manipulation and misleading of our citizens.

As the monitoring has shown, the voluntary commitments are falling short. Since the platforms are not fulfilling their commitments to a necessary extent, the legislature must develop rules and monitor the compliance.

The first important step here is to distinguish the phenomena grouped under the term of disinformation and to assess them according to their potential danger to a democratic society, to see if we need regulation at all. Unlabeled political advertisement, for example, is certainly to be assessed differently in regulatory terms than content for which an artificial reach is created by using fake accounts.

The aim must be to take appropriate measures that are in proportion to the potential danger in a democratic society.

>> I’m sure that idea of proportionality is a key focus for any new developments in this area, including what’s coming from the E.U. side, I think.

>> What’s next?

>> One thing we have learned is that a cooperative approach with other regulators is good for capturing this phenomenon from different perspectives, and our joint discussion today shows that there is more scope for cooperation. What is the timeframe for the next stages of our work? At the beginning of May we published our report and statement; also in May, the Commission published an independent report, conducted by a public policy company on behalf of the Commission, to assess the effectiveness of the Code of Practice on Disinformation. In the next step, the Commission will issue an official statement taking both reports into account. We are very much looking forward to this and the next steps. You know, we’re ready.

>> Finally, what do you see as the future high‑level developments? I mentioned already that the E.U. is starting to think about this, we have seen an announcement from them this week, and indeed there are developments in Germany. In that context, what key areas do you think we as regulators and policymakers should think about?

>> Currently the debate in Europe is dominated by the question of which data media regulators need for effective monitoring. If you ask me, we can spare ourselves this discussion, because we are running into a complexity trap which is based on a fundamental error in thinking. The basic idea of regulatory policy is actually the reverse: we as a society see a danger or a risk, and we legislate that it must not exist. This legal obligation results in a duty for every company that operates in this field to ensure that such incidents do not happen, and in the final step it is necessary to check whether companies are following these rules.

Right now we are going the complete opposite way: we are arguing about methodology. Policymakers must reclaim the prerogative of the legislature, so that we can say: this is how we want a democratic internet, these companies must follow the rules, and the media authority is there to check that they do; if they don’t, we implement the range of sanctions available to us. I’m quite sure that then, as in all other areas, companies will comply with the rules within a very short time. I suggest that we start with that work, and then we’ll see how far we get.

>> Thank you so much. Thank you for your insights on the research and the work you have been doing, both at the European level and in Germany. I think it has been incredibly useful for me, and hopefully for everyone watching this interview, to reflect on the importance of the growing evidence base that we have and to listen very carefully to the recommendations coming from regulators, where they are grounded in their day‑to‑day experience of doing this type of work.

Really, I am obviously delighted to further make the case for strong evidence‑based regulation. I am very interested, as I think we all are, to see how these themes fit into the coming proposal from the European Commission on the matter. And of course, as ever, I would like to stress the importance of, and agree with you on the need for, collaboration, cooperation and avenues for sharing the data and the evidence base amongst ourselves, because after all, we are dealing with global companies, and much of the work that we all do can be shared and useful to all of us.

Thank you very much for sharing that insight with us. Your report, I believe, was published this week, and I’m sure we can provide links to it; it is available in German in the first instance, but many will be interested in reading it subsequently. Thank you for your time. Thank you for watching and listening to our discussion.

>> SANDRA HOFERICHTER: Thank you. Thank you to the two speakers of the BigStage.