Platforms as critical infrastructure for democratic discourse – TOPIC 03 Sub 03 2023


21 June 2023 | 12:30 - 13:15 EEST | Main auditorium | Video recording | Transcript
Consolidated programme 2023 overview / Main Topic 3

Proposals: #17 #31 #44 #54 #55 #58 #63 #64

You are invited to become a member of the Session Org Team by simply subscribing to the mailing list. By doing so, you agree that your name and affiliation will be published on the relevant session wiki page. Please reply to the email sent to you to confirm your subscription.

Kindly note that it may take a while until the Org Team is formed and starts working.

To follow the current discussion on this topic, see the discussion tab on the upper left side of this page.

Session teaser

In recent years, challenges of platform dominance have attracted new approaches and solutions, ranging from decentralized social networks to independent oversight mechanisms. In this session we will discuss approaches and concrete initiatives which go beyond regulation and can contribute to shaping a digital public sphere which respects human rights and which works according to democratic rules.

Session description

With growing conviction that increased democratic control of Big Tech is necessary for democratic societies, regulation is being introduced at European level and global guidelines are being discussed. There is also growing awareness that visions and initiatives beyond regulation are needed for an Internet for Trust which safeguards freedom of expression and access to information and allows for a democratic discourse. In our session we will discuss approaches and concrete initiatives which can contribute to shaping a digital public sphere which respects human rights and which works according to democratic rules.

In recent years, we have seen how platforms not only enable democratic debate but also impact it negatively. The spread of false information online, whether intentional or not, threatens factual common ground. The use of offensive language prevents some groups in society from joining democratic debates (see the Nordic Think Tank for Tech and Democracy). A small number of private companies decide who can speak, which ideas get heard, and, most importantly, which ideas get traction. As a result they control the digital public sphere.

Moderation: Nicola Frank, EBU


Speakers will briefly outline their approaches, followed by a discussion between panelists and the audience. We will address:

  • The growing power and misuse of social media; the social media ecology that emerged in the 2010s is now crumbling.
  • How to address the situation beyond ‘cleaning up the mess’: creating a vision for a Digital Public Sphere

Further reading

Links to relevant websites, declarations, books, documents. Please note we cannot offer web space, so only links to external resources are possible. Example of an external link: Main page of EuroDIG


Please provide name and institution for all people you list here.

Subject Matter Expert (SME)


  • Hille Ruotsalainen
  • Meri Baghdasaryan

The Subject Matter Experts (SME) support the programme planning process throughout the year and work closely with the Secretariat. They give advice on the topics that correspond to their expertise, cluster the proposals and assist session organisers in their work. They also ensure that session principles are followed and monitor the complete programme to avoid repetition.

Focal Point

  • Nicola Frank

Focal Points take over the responsibility and lead of the session organisation. They work in close cooperation with the respective Subject Matter Expert (SME) and the EuroDIG Secretariat and are kindly requested to follow EuroDIG’s session principles.

Organising Team (Org Team)

List Org Team members here as they sign up.

  • Amali De Silva-Mitchell
  • Vittorio Bertola
  • Romy Mans
  • Emilia Zalewska
  • Dorijn Boogaard
  • Jesse Spector
  • Rajinder Jhol

The Org Team is a group of people shaping the session. Org Teams are open and every interested individual can become a member by subscribing to the mailing list.

Key Participants

Key Participants are experts willing to provide their knowledge during a session – not necessarily on stage. Key Participants should contribute to the session planning process and keep statements short and punchy during the session. They will be selected and assigned by the Org Team, ensuring a stakeholder-balanced dialogue that also considers gender and geographical balance. Please provide short CVs of the Key Participants involved in your session at the Wiki or link to another source.

Moderator


The moderator is the facilitator of the session at the event. Moderators are responsible for including the audience and encouraging a lively interaction among all session attendees. Please make sure the moderator takes a neutral role and can balance between all speakers. Please provide a short CV of the moderator of your session at the Wiki or link to another source.

Remote Moderator

Trained remote moderators will be assigned on the spot by the EuroDIG secretariat to each session.

Reporter


Reporters will be assigned by the EuroDIG secretariat in cooperation with the Geneva Internet Platform. The Reporter takes notes during the session and formulates 3 (max. 5) bullet points at the end of each session that:

  • are summarised on a slide and presented to the audience at the end of each session
  • relate to the particular session and to European Internet governance policy
  • are forward looking and propose goals and activities that can be initiated after EuroDIG (recommendations)
  • are in (rough) consensus with the audience

Current discussion, conference calls, schedules and minutes

See the discussion tab on the upper left side of this page. Please use this page to publish:

  • dates for virtual meetings or coordination calls
  • short summary of calls or email exchange

Please be as open and transparent as possible in order to allow others to get involved and contact you. Use the wiki not only as the place to publish results but also to summarize the discussion process.

Messages


Rapporteur: Katarina Bojović, Geneva Internet Platform

  1. Decentralised platforms have the potential to provide an alternative and overcome some of the concerning features of dominant social platforms, such as surveillance capitalism, the attention economy, and digital colonialism. Yet, many questions and challenges still need to be addressed, such as sustainable financing and the lack of scalable business models.
  2. The surge in large language models such as ChatGPT and other possibilities to create synthetic text creates greater pressure on content filtering and a much bigger need for transparency. Big tech companies must adopt clear and transparent content moderation policies that prioritise accuracy and accountability, with clear procedures for removing harmful content. Companies must also ensure their content moderation systems and rules are fair, transparent, and easily accessible in user languages.

Video record


Provided by: Caption First, Inc., P.O. Box 3066, Monument, CO 80132, Phone: +001-719-481-9835

This text, document, or file is based on live transcription. Communication Access Realtime Translation (CART), captioning, and/or live transcription are provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings. This text, document, or file is not to be distributed or used in any way that may violate copyright law.

>> NADIA TJAHJA: Welcome back from a short break. Moving swiftly and directly into subtopic 3, platforms as critical infrastructure for democratic discourse. Let me introduce you to the moderator. Please give her a warm welcome.

>> Thank you. For this last session, I hope you’re still awake, interested and interactive; that would be very much appreciated.

Now, this is a discussion on platforms as critical infrastructure for the democratic discourse. We have heard a lot about regulation in the previous session in particular, because there is the growing conviction that increased democratic control of big tech is necessary for our democratic societies and this is why regulation is being introduced.

Not only at European level: we have also heard the discussions at global level. At UNESCO there is a discussion to prepare guidelines on the regulation of platforms, and at UN level a code of conduct for information integrity on digital platforms is also being prepared.

These are only a few initiatives and there are many more.

There is also the growing awareness that we need visions and initiatives beyond regulation. It is not only about cleaning up the mess, I would say; it is also about proposing positive approaches if we want to have an Internet for trust. We would also like to have, of course, an Internet that safeguards freedom of expression and access to information and which allows for democratic discourse.

This is why in this session we will discuss approaches and concrete initiatives that can contribute to shaping what we would call a digital public sphere, and this digital public sphere should respect human rights and should work according to democratic rules.

We have four experts here who are our key participants and who will introduce the discussion. They are all online, as you can see. I feel a bit lonely here on stage; I’m sure we will manage with the help of colleagues.

And as I said, I invite everybody to interact with us from the room here and also from online.

We are all required to be brief, this is just a reminder for the speakers, so keep each intervention to 3 minutes and you have the clock ticking here.

So let’s start. Let’s first hear about the Nordic approach to the democratic debate in the age of big tech, we have heard in the morning already from Tobias Bornakke, but here we look at a little bit of a different aspect of the Nordic approach. Please, you have the floor.

>> OUTI PUUKKO: Thank you. I hope you can hear me well.

Good afternoon, everyone. My name is Outi, I’m a doctor of researcher at the University of Helsinki working in the democratic capacities in the age of algorithms project and my own research is focused on digital rights discourses in light of different initiatives by researchers and activists. I will focus on the research in the recent Nordic initiative titled Nordic Approach to Democratic Debate in the Age of Big Tech. However, I have not participated in writing this report myself, I can only provide more General Comments in relation to other initiatives.

So just earlier today we have heard from the Chair Tobias Bornakke that the implementation of the recommendations is now being considered in each country.

The report focuses on how the Nordic countries relate to digital platforms; it is a kind of Nordic companion to the EU’s Digital Services Act, and it presents five visions for the Nordic region and recommendations for concrete actions.

I will highlight some examples that I think are interesting for our discussion and that also characterize this as a Nordic approach.

The first vision considers the Nordics as a united tech-democratic region; as an example, one of the concrete actions is the establishment of a Nordic centre for tech and democracy. This centre would support a shared Nordic voice in the EU and towards digital platforms. The second vision highlights digitally literate citizens, for example by establishing a knowledge hub on digital literacy to bring together different efforts, for example to combat misinformation online.

The third vision highlights access to diverse, credible digital platforms and communities, emphasizing alternatives to large online platforms. The fourth vision is about open and informed public debates, and this includes a recommendation to give public service media a strong digital mandate, including online presence, content creation and development of platforms.

The fifth vision emphasizes public oversight of big tech platforms, including the role of independent researchers and continued monitoring of the state of Nordic digital democracy. It is an interesting initiative in at least three ways. First, it highlights the need to think about solutions beyond EU regulation. Second, its emphasis is on a specific historical and cultural context, in this case building on Nordic traditions, for example the so-called media welfare state model common to the Nordics. Third, I would highlight the actors: importantly, citizens and civil society, who should be able to inform and participate in these ongoing debates at different levels.

So I think it would be interesting to discuss this Nordic approach together with other efforts with my colleagues.

Thanks so much.

>> Thank you. This is really a vision that looks very much at the democratic discourse and at the welfare of society, very important elements in this context and this discussion.

Now let me turn to the Meta Oversight Board. Welcome to this discussion here today. You have just heard about the Nordic report and its recommendations to improve the democratic debate in the age of big tech. What should big tech do to improve the conditions for this democratic discourse, and does the Oversight Board have an impact on reducing, for example, false information on major platforms and the use of offensive language and hate speech? I know that you have been looking at this in particular and have recently issued a report; tell us about it. Thank you.

>> EMI PALMOR: Thank you for inviting me. I’ll try to be quick in 3 minutes.

I’m a founding member of the Oversight Board. For people who don’t know about the board, it is an opportunity to say that we are an independent body comprised of experts from around the world that makes binding decisions on what content should be allowed on Facebook and Instagram, and our decisions are based on whether Meta’s enforcement is consistent with its policies and values. We make policy recommendations rooted in human rights protections and principles which, when fully implemented, reshape Meta’s content guidelines.

I want to talk a little bit about what we’re doing, and it is important to say at the beginning that we are, and should be, part of an ecosystem of stakeholders and actors that want to get us to a fairer, more just world. The board is the one that can publicly identify the gaps and work towards changes at Meta, but without other interested actors pushing Meta towards implementation, our work will not be as meaningful as it can be.

So first of all, I think we should say that Meta and other tech companies need to adopt clear and transparent content moderation policies that prioritize accuracy and accountability and have clear procedures for removing content when there is a potential for real-world harm. The board has reviewed a number of cases on hate speech, from South Africa to Serbia. We have focused on hate speech against marginalized groups as one of our strategic priorities, precisely because it creates an environment of discrimination, is often context-specific and coded, and can manifest in real-world harm.

We found that Meta’s hate speech community standard is among the most frequently cited community standards in user appeals over content removed from the platforms, and we have pushed Meta to publish a clear explanation of how it creates, enforces and audits its market-specific slur lists. The company has since followed through on these recommendations.

Companies also must ensure their content moderation systems and rules are fair, transparent and easily accessible in users’ languages. It is important to tell people as clearly as possible how you make and enforce decisions. This also goes back to what was mentioned about the importance of literacy on those platforms, because when you don’t understand a platform, you don’t know how to use it, you make a lot of mistakes, and that can damage your freedom of speech.

Within three years, our work has resulted in greater transparency in how Meta’s systems work. I don’t have more time, so I will just say that one of the important things we have done concerns crisis protocols, because we understand that democracies are in danger in times of crisis even more than in times of peace, and it is important to verify that Meta and other tech platforms are very transparent, fair and clear in the way they moderate content at the times when voices matter most.

A last word: when we talk about a group of experts, I think that in terms of democracy, our diversity is not just about our professions but also about the countries and regimes we come from. Not all of us come from democracies, and that has a lot of impact on the way we understand the importance of voice and freedom of speech all year long, and specifically during times of crisis.

>> Thank you very much.

This is very interesting. We will all look at it and see whether this Oversight Board will really make a difference; we hope it will, because we know there are a lot of problems.

Now let’s turn from the metaverse – we have heard about the metaverse this morning, quite a lot – to this newer idea of, let’s say, the fediverse. It refers to a collection of interoperable decentralized social networks – I looked it up, so I’m not inventing anything here – and other online services that are independent but able to communicate with each other. The most well-known application is certainly Mastodon, which since the Musk takeover of Twitter has become a popular alternative to traditional social networks, as you know. The question is whether the decentralized approach is more democratic and whether it can support a digital public sphere which respects human rights and works according to democratic rules. This question goes to you; you have looked at this with a number of experts.

>> Charlotte Freihse: It is a pleasure to be here. I’m a former journalist and political scientist, currently working on platform governance and disinformation.

So I’ll be very quick. Our report is very long, but maybe let’s start with the very right and often heard criticism of the digital age: there is just too much power concentrated in the hands of very few individuals, namely platform owners. An infrastructure that is democratic and designed to serve the common good currently conflicts with the business models of large commercial platforms. Here is where decentralized networks like Mastodon come in: by moving away from profit-oriented logics, they do, we believe, hold the potential to serve a more democratic public sphere. This is still very broad, so to narrow down what it actually means to serve a more democratic public sphere, we looked at concerning features of dominant social platforms – namely surveillance capitalism, the attention economy and digital colonialism – to see whether Mastodon is likely to repeat or even amplify them, or not. Mastodon is designed to be decentralized, as was said, open source and non-commercial; the whole idea is to break with the walled gardens of the centralized commercial offerings. In short, at the moment we don’t see surveillance capitalism. Second, the feed is chronological, not algorithmic, and there is no tracking, because at the current moment there are no ads. This means there are no design features that try to keep you on the platform to show you more ads; at the current moment we can observe that it does not feed the attention economy.

Third, and this is crucial, our main concern with Mastodon is inclusiveness. In theory, anyone anywhere can set up an instance; in practice, however, the current lack of scalable business models means that people really need volunteer time and financial resources to run their instance, and both, no surprise, are more likely to be found among people with privilege, mostly white and male-presenting. So without dedicated resources, we really think Mastodon risks repeating patterns of digital colonialism. In short, before my time runs out: we think it has the potential to provide an alternative, but we still have to address a few questions, and the most crucial ones are how it can be financed in a sustainable way, and whether inclusive participation and safe spaces can be scaled effectively in the future. Thank you, everyone.
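The interoperability discussed here rests on simple open protocols: in Mastodon and the wider fediverse, an account handle such as @user@example.social is resolved via the instance's WebFinger endpoint (RFC 7033). As a rough illustrative sketch, not part of the session, the handle and domain below are made up, a handle maps to a lookup URL like this:

```python
def webfinger_url(handle: str) -> str:
    """Build the WebFinger lookup URL for a fediverse handle.

    Handles look like '@user@example.social'. Per RFC 7033, as used by
    Mastodon, the account is resolved by querying the instance's
    /.well-known/webfinger endpoint with an acct: resource parameter.
    """
    # Tolerate both '@user@domain' and 'user@domain' forms.
    user, _, domain = handle.lstrip("@").partition("@")
    if not user or not domain:
        raise ValueError(f"not a valid fediverse handle: {handle!r}")
    return (f"https://{domain}/.well-known/webfinger"
            f"?resource=acct:{user}@{domain}")

# Hypothetical handle, for illustration only:
print(webfinger_url("@alice@example.social"))
# https://example.social/.well-known/webfinger?resource=acct:alice@example.social
```

Because any server implementing this discovery step can federate with any other, no single owner controls the address space, which is the structural contrast with the centralized platforms discussed above.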

>> Thank you. So this could become a real alternative model. One of the keywords for me is probably also transparency: we would know what happens, because we don’t know this with big tech – we don’t know how they organize content moderation, for example, we just see the results. If a decentralized model can deal with this, that would certainly be very much appreciated.

Now, last but not least, we want to show you an example of a decentralized content offering – it is actually offering news content online – and this is a project developed by the EBU and a great number of its members. We first want to show you a little video so that you have an impression of this project.

>> (Video with captions).

>> I hope that gave you a good impression of this project, which has been in development for a while already. We can show something; it is really online, it is there.

I have my colleague Justina here online, and she will briefly explain a bit about it; I will also ask her how far this project contributes to the European public sphere.

>> Thank you for having me. I’m the head of news strategy and also director of this project, the European Perspective.

I’m going to start with a quote from the first Secretary-General of the European Broadcasting Union, from the 1920s; I think it really defines the essence of these efforts.

So he said then that if nations could see what was news for others, and how others lived, it would engender peace and understanding. I think that 100 years later we are doing the same thing, which started when the European Broadcasting Union was created, except that we are doing it online, through sharing content from public service organizations with audiences.

I also wanted to bring a quote that I’m sure many of you have heard before, from Maria Ressa, made recently at the Internet for Trust conference at UNESCO.

I think this quote really explains why it is not either/or: it is not a choice between personalization and mass media journalism.

The quote from Maria is that without facts we cannot have truth; without truth, we cannot have trust; and without those three, we have no shared reality, and therefore we cannot find solutions, and therefore we cannot have democracy. As you said, Nicola, an underlying aim of this project – not the only element, but one of them – is to contribute to securing the European public sphere, reflecting normative principles such as human rights, diversity, inclusion and respect for law. I will end by telling you that I’m speaking to you from Dublin, where we just held a big industry event, News Xchange, where for the first time since COVID almost 500 journalists came together. Yes, we talked about journalism; we looked at different aspects, which we called the different frontlines of journalism. For me it is very important – and I go back to why sharing is important – because one of the subjects broadly discussed was of course the war in Ukraine, and why people keep reporting and risking their lives. Why is it so important? It is very clear: if we stop reporting what is happening in Ukraine, then society will slowly start forgetting about it.

So what I wanted to say is that I believe that as I said, it is not either/or, and there is still a place and a strong place to build a shared reality and democracy with trusted journalism.

Thank you.

>> Thank you.

We are very proud of this project, as you can see.

We have been developing it for a while now.

It’s really a practical application of what we call the European public sphere, where people can, you know, get different perspectives from different countries and make their own informed choices and judgments about what is happening in the world.

Now, at this point, I want to open the debate for questions from the floor, from online, so please come here. Please, go ahead.

>> Hello. I’m from Poland.

I have seen just recently your website of this project.

I’m so amazed that TVP, our Polish national broadcaster, is there. If you can say that it is objective media, that is a joke for me. It is not objective television; it is propaganda television. It is the public national television, but from my perspective it is the only media source publishing on this platform about Poland, and I strongly disagree with its views and attitudes about politics, because this television did not show one of the biggest protests, about two weeks ago in Warsaw; the most important thing it showed instead was a gathering of housewives. There was no information about the protest. So I cannot understand why such a project includes only one public broadcaster per country, because as we can see from the cases of Poland and Hungary, public broadcasting is not free of being in a political war and can be propaganda, and that can happen in any country, not only in Poland. That is just my statement. I don’t agree that such initiatives are worth developing in this form, but if you want to provide various news from various perspectives, then provide a platform for more than just one TV station; provide a platform for independent journalists, because even though we have public broadcasting media, that is not enough to be objective, even if there is no such word as objectivity in journalism.

Thank you.

>> Thank you for this important question.

Yes, I think you can give a good answer to this?

>> Cristiana: Yes. Thank you very much. Thank you for sharing. As you see from my name, I’m Polish; I lived there and worked a long time ago for Polish television, and I’m very aware of the different sensitivities and problems.

One element I want to mention: of course, the European Broadcasting Union has a number of members which are – this is not a secret – under a lot of political pressure, and this political pressure is growing and expanding to countries that have had very stable media and did not have these political pressures before. This is one thing we have to recognize.

Polish public television and Hungarian television are not part of the European Perspective. The project is built on editorial principles, and the members participating in the project decide on new entrants; there is a committee which analyzes the candidacy of any other EBU member that would like to join this public-facing part.

TVP and Hungarian television are not part of the project. I’ll end there, just for clarification.

Thank you for this suggestion. We’re discussing a lot.

You know, about opening up the project: it started as a public service project, and we are cooperating with other organizations – I would just mention two, constructive television in Denmark and another – but this is definitely one of the paths we will be looking at in the future.

Thank you.

>> Thank you.

Just to repeat it: Polish television and Hungarian television are not part of the project. We have these editorial principles against which each prospective member of the project is checked, and the group decides whether they can join or not.

Very clear on this.

The next question, please.

>> VITTORIO BERTOLA: Sorry to be always at the mic. A comment on something the panelists said, and some reflections. First of all, I come from Italy, where everybody knows that the news from the public broadcasting service is politically oriented, and when the government changes, usually the directors of the news services change. People already have that in the back of their minds. I was struck by the comment that we have to continue reporting on the war in Ukraine, otherwise people will stop caring about it. I agree it is important – not just that war but all the wars in the world. Yesterday I quoted a poll that said 53% of Italians think that mainstream news reporting is skewed and manipulated in favor of Ukraine. So what do you think – are these two sides of the same phenomenon? Does the idea people seem to have that you should not trust media reporting on the war, and everything that comes with it, come from the platforms, or from something older and deeper – a long tradition of political manipulation of media, whose price we are now paying in the lack of trust people have in official information?

>> Before I give the floor to Justina again, I would like to know, are the – are the questions on this, something else? We don’t have so much time. We want to also address the other speakers with questions if possible so that’s fine. Thank you. Please, go ahead.

>> I think of course there is no simple answer; each country has a different context, and it is also connected with the moment in time. So yes, in countries where there are political pressures, there is a clear indication of very low trust. I think the best reference for this – a very objective, independent, long-term study – is the Reuters Institute Digital News Report, released every year. The last one was released only last week, on the 14th of June, and it still finds a very high level of trust in public service media, though obviously more so in the Nordic countries.

I think, as I said, it really depends on the country and the situation, because we have seen that in moments of crisis there is much higher trust; whether it is public service media or big media outlets, people turn to news.

Obviously, when times are calmer, there is this huge phenomenon that we all know, which is probably the biggest challenge to journalism now: news avoidance. And news avoidance is not that simple; it is not only connected with the fact that people don’t trust a particular outlet because of political pressures, or that there are many different outlets. It is also connected with a subtle understanding that there are mental and emotional constructs – coping mechanisms – that news managers have to be aware of, for coping with big pressures and with things which in reality bring a lot of worry, such as the climate crisis and war.

So this is a very, very complex subject: how to gain trust.

As I said, if we look at journalism trends – and again I will refer to the Digital News Report – I think that especially in times of crisis there is still a need for trusted news and trusted media outlets, not only social media.

>> Thank you.

Just one word: our project here actually offers content in different languages and across borders, so if people want to be informed they can, and they can also get views on certain issues from different countries. That is really the offer we make. Please, the next question.

>> I actually want to get back to an issue that is really fascinating – not a question, but a bit of a comment. This has actually existed for at least ten years, as have the decentralized systems and protocols that support it; it is only now getting such attention because of the Twitter issue and others, so we have these waves of popularity. A year ago we had a big discussion with stakeholders in the Russian community, trying to understand what this decentralized fediverse is, because it is not just Mastodon as a Twitter alternative – there are alternatives to Instagram, YouTube and so on – and the idea is that you have just one account and can interact across different services. That’s it.

And returning to the problem of content moderation: for some this may seem like a silver bullet – we have these democratic procedures, all this transparency – but no. In reality you have plenty of instances that are supported by volunteers, most of them geeks, so all of these small bubbles are really about certain subcultures and very local issues, and they don’t have a lot of reach to the general public. And the rules for content moderation are pretty authoritarian: it is just one person, or a small group of them, that supports the instance.

Users have just two ways: you either accept the rules or you go away and find another instance, or set up your own, which is very tricky on the technical and of course the financial side. The last point, you know: no money, no honey. How can you expect any big reach from these kinds of platforms? It really works only for a small, geeky, subcultural audience, and if we try to transfer Twitter-level discussions and issues there, it doesn’t work.

So that’s my comment. Thank you.

>> Many thanks for this interesting statement. I think there is also a question in there, maybe to Charlotte: is this approach only something for the geeks, or can it become something for the general public?

>> Charlotte Freihse: Thank you very much. You raised a lot of very important points.

I mean, I have only assessed Mastodon so far. You’re right, there are many, many more. At the current moment there is also Bluesky, and Meta will launch its own decentralized alternative – I think this month, actually.

Yeah. What we see on Mastodon as an example is actually what you said: the community on there is very tech focused, not very diverse. The question is really how do we get a more diverse community to actually access Mastodon? This comes, I think, with the business model, which is also connected to content moderation.

Like you said, at the current moment every instance has to collectively develop its own community guidelines and handle the maintenance of the instance.

This works out for the moment, while instances, like you said, are very small. But how do they scale, and how can we make sure that, as they get larger, there will be more resources to actually maintain these growing instances? Right now people do this on a voluntary basis.

So, this being said – and I do see your critical points; this is actually what also came up in our study, that diversity is an issue and has to be addressed – when you look at the larger platforms and see how content moderation is organized there, without any transparency, by just one company situated in the U.S., then I do see more potential in decentralized networks, where any instance can set up its own guiding rules and has to be transparent about that.

I think this is actually a potential when it comes to content moderation. What we just need to take into account is: who will be, or should be, responsible for financing them in the future? There are different ideas on a business model for decentralized networks. Advertising is probably the most prominent one, though this would feed into the logic of the commercial platforms again. You could also think about a subscription model, where you pay a euro – I don’t know – per month to subscribe, so that this can be used to support instances. But you could also – and I think this goes a little bit to my panelists here – ask: what do you think of public funds? Should maybe public media be more responsible and say, okay, this can be part of the public communicative infrastructure, so that they maybe fund an instance and take more responsibility there? And what do you think of governments? I think, yeah, this raises for me really further questions to be discussed when it comes to the fediverse.

>> Yeah. Very good question indeed. Thank you. Before that, Emi has raised her hand – please, you want to comment on this also?

>> EMI PALMOR: A quick comment coming from Israel. It was prompted by the comments from the woman from Poland about the freedom to show what’s happening during protests. I wanted to mention a case that was taken from Iran, when we noticed that Meta was removing protest slogans that translate to “death to Khamenei”, and we asked for guidance to the reviewers that such statements do not violate the policy in the given context of protest. As a result, Meta began allowing reviewers to identify previously mis-enforced actions and reverse them, which has increased political speech at critical times for a country where press freedom and other forms of expression have routinely been repressed.

This is one of our possibilities: to act relatively quickly as an independent global body and to complement other, regulatory efforts. So we want to, you know, draw your attention to those possibilities and ask you to help us identify those important moments where we can intervene.

>> Thank you for that.

Maybe before we move on with the next question, I have a question for Outi: would this Nordic approach, which has been developed, also be interesting for other regions, maybe specifically regions with smaller countries like the Nordic region?

>> Outi Puukko: Thank you for the question. Definitely I think that this initiative can be interesting also beyond the Nordic region; there can be, for example, similarities to other smaller – (Zoom freeze) – related to content moderation. Of course, as other panelists have also addressed, there is always a need to address these issues more specifically in each context.

But for sure, I think that this Nordic approach has not emerged in isolation either. It is clear that the current models, with their dominant actors, have become increasingly challenged, and I think this again highlights the need for a plurality of voices to be included and for participation on different levels.

Yeah. It will be interesting to continue on that.

>> Thank you.

I have Justina, who wants to react.

>> Thank you. It is not so much a reaction, and I don’t have a solution on whether public service media should invest – as you know, obviously this is really decided in each country.

I just wanted to add one more dimension that we are very concerned about – and, of course, other journalists and I’m sure the platforms as well – and that is the emergence of large language models. With the release of ChatGPT in 2022 and other possibilities to create synthetic text and media, predictive, generative AI technologies are just going to increase the pressures that we have already had. There will be even more need to be able to filter things for our audiences, for citizens, and also a much bigger need for transparency. It doesn’t mean that there is something wrong with synthetic content or generative-AI-created content; the question is how we are going to communicate it. I think all the issues that we are discussing will gain another, even more complex dimension, and as you probably heard, the prediction from scientists is that by 2026, 90% of content could actually be synthetically created. This is a big dimension we should remember.

>> Thank you. I take one last question if it is quick.

No, we cannot take it. I’m sorry about that. Then let me thank you all for this really interesting discussion, with a special thank you of course to the key participants, the speakers at this session. A lot of aspects were covered, and it is very difficult to summarize – I won’t do that at this stage because we don’t have time – but I can see that there is a lot of interest in alternative visions and ideas beyond regulation and beyond Big Tech. That, I think, is an important message to take from this stage.

Thank you very much, everybody, and have a good continuation and I hope we have a good discussion this afternoon on the messages. Thank you.

Thank you for helping me. Thank you.

>> NADIA TJAHJA: A big thank you to you for moderating this wonderful session.

Unfortunately, there wasn’t enough time to summarize what the session was about, but luckily we do have the Geneva Internet Platform, which is drafting the EuroDIG messages. Do stay with us after lunch to see the results of the message drafting for the three main topics.

So now that we are nearing the end, I would like to invite you to come to the last session. We’re now going to have lunch, and at 2:30 we hope to see you back. It will bring together all of the different debates and discussions that we have had. We look forward to seeing how these discussions have developed over the last two days with this intercultural community from diverse stakeholder groups, backgrounds and sectors. Please do come back. I look forward to seeing you there. See you after lunch.