IGF 2024 Youth Track – AI and Threats: new strategies to detect AI-generated political deepfakes – Pre 10 2024: Difference between revisions
== Transcript ==
Disclaimer: This is not an official record of the session but is provided only for convenience. The following transcript was automatically generated by the Zoom recording function. Timestamps were removed; otherwise the text is presented in its original format, as provided by the Zoom system (e.g. including any spelling mistakes). The accuracy of the transcript is not guaranteed.
Host, EuroDIG 2024, WS 1: The room is gonna be packed<br />
Host, EuroDIG 2024, WS 1: just for the remote participants, I will go. Okay. So please enter your Zoom name, and if you want to ask a question, please use the Zoom function.<br />
Host, EuroDIG 2024, WS 1: You will be unmuted. And once the speaker switches on the video, state your name and affiliation. So, having said that, I'm happy to give the floor to our moderators of the session.<br />
Host, EuroDIG 2024, WS 1: Thank you so much. Well, first we're going to have a welcome from the UN IGF Secretariat. Oh, you can't hear me. Can you hear me now?<br />
Host, EuroDIG 2024, WS 1: Stay here.<br />
Host, EuroDIG 2024, WS 1: Just speak louder, like, okay. Just, I'm going to lose my whole voice here today. Okay, so first, I think we have a welcome from the UN IGF Secretariat,<br />
Host, EuroDIG 2024, WS 1: whom I invite to come and share with us.<br />
Host, EuroDIG 2024, WS 1: Okay, if you want to start, if you stand up.<br />
Host, EuroDIG 2024, WS 1: Check. No.<br />
Host, EuroDIG 2024, WS 1: You can use my keyboard from here. Yeah.<br />
Host, EuroDIG 2024, WS 1: You know, we had this. I want to use that as a cheer, and it went really well. So, oh, okay, okay, I can do that too. That's fine.<br />
Host, EuroDIG 2024, WS 1: Like lots of that. Okay, thank you. And<br />
Host, EuroDIG 2024, WS 1: Good morning, everybody. My name is Chengetai Masango, and I am head of the IGF Secretariat, based in Geneva. Some of you may know Anja Gengo; she works with me.<br />
Host, EuroDIG 2024, WS 1: So I've got a speech for you, and I'll read it out. Okay.<br />
Host, EuroDIG 2024, WS 1: It is a pleasure to officially launch the IGF 2024 Youth Track here in beautiful Vilnius. So it was raining yesterday, but that's fine. With stakeholders joining us from around the world, on site and online,<br />
Host, EuroDIG 2024, WS 1: I would like to thank EuroDIG and YOUthDIG for hosting us.<br />
Host, EuroDIG 2024, WS 1: Nearly two decades have passed since Kofi Annan, the UN Secretary-General, convened the IGF back in 2006, and I don't think anybody here was there, right? But I was. In these dynamic times, when changes are on the horizon,<br />
Host, EuroDIG 2024, WS 1: it is good to remind ourselves about the historic journey the IGF community went through, from consistent work which resulted in introducing the multistakeholder model under the auspices of a traditionally<br />
Host, EuroDIG 2024, WS 1: multilateral, intergovernmental mechanism, to implementation of key IGF principles and values at the local, regional<br />
Host, EuroDIG 2024, WS 1: and global levels.<br />
Host, EuroDIG 2024, WS 1: The changes I've mentioned relate to the upcoming Global Digital Compact and the 20-year review of the World Summit on the Information Society.<br />
Host, EuroDIG 2024, WS 1: And these are both great opportunities for this community, including the youth, specifically the youth as well,<br />
Host, EuroDIG 2024, WS 1: to upgrade its already excellent work and help ensure the world has an accessible, affordable, safe, secure, human-rights-respecting, resilient Internet, and its good governance.<br />
Host, EuroDIG 2024, WS 1: The journey of hard work and changes continues. This calls for continuous investment in the next generation of experts and leaders by the current generation.<br />
Host, EuroDIG 2024, WS 1: At the level of the IGF, this was recognized a long time ago through a dedicated Youth Track. Cooperation is developed across silos through the cooperation of youth IGF initiatives existing around the world and the various international youth-focused initiatives.<br />
Host, EuroDIG 2024, WS 1: The IGF Secretariat is proud to be able to work with these leaders and ensure that the track has a firm structure<br />
Host, EuroDIG 2024, WS 1: and rich, globally relevant input coming from communities from all corners of the world.<br />
Host, EuroDIG 2024, WS 1: This year the IGF Youth Track is focusing on governance of artificial intelligence under the motto<br />
Host, EuroDIG 2024, WS 1: "empowering youth leaders for trusted AI".<br />
Host, EuroDIG 2024, WS 1: Four capacity-development workshops are to be hosted in conjunction with the regional IGFs, all leading to the IGF 2024 Global Youth Summit, hosted in Riyadh on the 15th of December:<br />
Host, EuroDIG 2024, WS 1: from exploring AI-generated deepfakes here at EuroDIG, we are continuing our journey towards discussing the use of large language models in schools at the Asia-Pacific Youth IGF, to looking at human rights implications during the Latin American and Caribbean IGF,<br />
Host, EuroDIG 2024, WS 1: to ways of building a strong ecosystem for good AI governance at the African IGF. Working with and for youth is working for a better tomorrow.<br />
Host, EuroDIG 2024, WS 1: I am pleased to confirm that the Youth Track fosters a dialogue between young people and more senior experts.<br />
Host, EuroDIG 2024, WS 1: The workshop today is no exception, and I look forward to learning from all of you about the issues and opportunities that AI brings to us. Thank you very much, and I wish you good deliberations.<br />
Host, EuroDIG 2024, WS 1: Thank you so much, Chengetai, for your keynote. And now<br />
Host, EuroDIG 2024, WS 1: I'll be welcoming all of you to this roundtable discussion on AI threats and new strategies to detect AI-generated political deepfakes, organized by the IGF Youth Track. I'm Pilar. I'm going to be the moderator for today, and I'm also a YOUthDIG organizer.<br />
Host, EuroDIG 2024, WS 1: Here we will discuss one of the most pressing issues at the intersection of technology and security. As AI continues to advance at an unprecedented pace, we must be prepared to combat its misuse,<br />
Host, EuroDIG 2024, WS 1: particularly in the realm of political misinformation. Deepfakes, hyperrealistic videos and audio recordings generated by AI, have emerged as a powerful tool for deception.<br />
Host, EuroDIG 2024, WS 1: These sophisticated forgeries can manipulate public opinion, disrupt elections, and undermine trust in our institutions.<br />
Host, EuroDIG 2024, WS 1: The ability to create convincing deepfakes is no longer confined to well-funded organizations. It's becoming increasingly accessible, raising the stakes for protection and prevention.<br />
Host, EuroDIG 2024, WS 1: In our panel today we will explore the latest strategies and technologies being developed to identify and mitigate these AI-related risks. To this end, we have a group of experts on this matter who are at the forefront of combating this digital threat.<br />
Host, EuroDIG 2024, WS 1: And without further ado, let's begin our conversation by welcoming our distinguished panelists.<br />
Host, EuroDIG 2024, WS 1: So online, I hope she's, yes, I can see her online, we have Meri, Senior Case and Policy Officer at Meta's Oversight Board.<br />
Host, EuroDIG 2024, WS 1: To my left,<br />
Host, EuroDIG 2024, WS 1: we have Thomas. He's a YOUthDIG 2024 fellow, from the UN Student Association of Norway.<br />
Host, EuroDIG 2024, WS 1: Next, Emily, from the Council of Europe Advisory Council on Youth, and a YOUthDIG 2024 participant as well. And finally,<br />
Host, EuroDIG 2024, WS 1: we have Everton, an expert advisor to the board of CGI.br, from Brazil. So, Meri, we will start with you.<br />
Host, EuroDIG 2024, WS 1: Yes, I see you. Okay. So with 2024 being a pivotal election year, what strategies have you implemented, or would you recommend to be introduced, to safeguard the integrity of elections against the threats posed by deepfakes?<br />
Meri Baghdasaryan: Thank you, Pilar. Can you hear me? Just to confirm.<br />
Host, EuroDIG 2024, WS 1: Yes, we can hear you well.<br />
Meri Baghdasaryan: Perfect. Thank you so much for having me. So, as Pilar mentioned, I work at Meta's Oversight Board, and for those that do not know about the Board, I will just give a brief overview. The Board is an independent body, so it is not part of Meta, and it is a body of 22 experts from around the world,<br />
Meri Baghdasaryan: and it was created to answer some of the most difficult questions around free expression, human rights and content moderation globally. The Board applies international human rights standards in its binding decisions, which are issued in response to appeals from Facebook, Threads and Instagram users from around the globe.<br />
Meri Baghdasaryan: The Board also issues non-binding recommendations<br />
Meri Baghdasaryan: that aim to address more systemic content moderation issues on Meta's platforms.<br />
Meri Baghdasaryan: So, you started with a very big question, and I'll give a brief overview, but I'm happy to go into more detail as the discussion goes on.<br />
Meri Baghdasaryan: The protection of elections and civic space is one of the seven strategic priorities for the Board, and earlier, in May, the Board published its<br />
Meri Baghdasaryan: piece on the key lessons learned around elections in this unprecedented year of elections around the world.<br />
Meri Baghdasaryan: Ed.<br />
Meri Baghdasaryan: The.<br />
Meri Baghdasaryan: That's what<br />
Meri Baghdasaryan: friend<br />
Meri Baghdasaryan: you need to defend the specific.<br />
Host, EuroDIG 2024, WS 1: Meri, you're breaking up a little bit.<br />
Host, EuroDIG 2024, WS 1: Is your connection stable?<br />
Meri Baghdasaryan: Can you hear me better now?<br />
Host, EuroDIG 2024, WS 1: Yes, now we can hear you better. Thank you.<br />
Meri Baghdasaryan: So I was saying that earlier<br />
Meri Baghdasaryan: the Board published<br />
Meri Baghdasaryan: key lessons learned from its case law around elections, and one of the main messages is that platforms need to<br />
Meri Baghdasaryan: do everything to defend open civic space, which means that they should provide space for members of the political opposition, human rights defenders and marginalized voices<br />
Meri Baghdasaryan: to express their opinions and participate in democratic processes.<br />
Meri Baghdasaryan: Other main messages from this paper include the fact that companies must set global platform standards for elections everywhere.<br />
Meri Baghdasaryan: They should not let political speech that incites violence go unchecked. In terms of deepfakes, the Board acknowledges that lying has always been part of political campaigns; however, with the development of technologies this has become more problematic, not only in terms of deepfakes but also other AI-generated or manipulated content.<br />
Meri Baghdasaryan: And the most relevant case to our discussion is the decision around the altered video of President Biden, in which the Board reviewed a video that had been altered to make it appear as though the US President was inappropriately touching his adult granddaughter's chest.<br />
Meri Baghdasaryan: The Board emphasized that mere falsehood cannot be the sole basis for restricting freedom of expression under human rights law, and outlined a number of gaps and inconsistencies in Meta's manipulated media policy. So eventually the Board decided that this content should stay on the platform, because it didn't violate the policy as it was in force at the time; however, it recommended a number of measures for Meta to review and<br />
Meri Baghdasaryan: improve its manipulated media policy. One of those was that the policy at the time was<br />
Meri Baghdasaryan: only concerned with<br />
Meri Baghdasaryan: AI-modified content, but it did not, for instance, cover any other way of manipulating<br />
Meri Baghdasaryan: content. It also treated audio and audiovisual media differently. Also, there was a difference between altering what people were doing and altering what they were saying. But, you know,<br />
Meri Baghdasaryan: around the world,<br />
Meri Baghdasaryan: a lot of the time audio-based deepfakes are also quite influential and have adverse<br />
Meri Baghdasaryan: consequences. The main recommendation from this case was around urging Meta to label content that is AI-generated or manipulated, instead of relying on fact-checkers, because this has several differences.<br />
Meri Baghdasaryan: First of all, the Board noted that labeling is more scalable than relying on third-party fact-checkers, because<br />
Meri Baghdasaryan: the fact-checking process works differently depending on language and market, or where the specific content comes from. At the same time, when a piece of content is fact-checked and a specific fact-checking label is applied to it, it gets demoted, which means the content's ranking in the recommender system gets down-ranked. And this actually is a problem in terms of,<br />
Meri Baghdasaryan: you know, the spread of this kind of content, and the Board in general is very concerned about negative impacts of demotion on freedom of expression.<br />
Meri Baghdasaryan: Meta, in fact, committed to implementing this recommendation to start labeling manipulated content instead of defaulting to removal, and since May 2024 Meta has already started labeling organic AI-generated content.<br />
Meri Baghdasaryan: And in addition to labeling, the Board believes that Meta should give researchers access to data to verify this process and also assess the impact of the platform's automated decisions and content moderation. And, thanks to the Board's efforts, Meta's Content Library, their third-party researcher tool, is open to third-party researchers around the world at the moment.<br />
Meri Baghdasaryan: In other cases, the Board is currently also looking at a case bundle that involves AI-generated nude images of female public figures from India and the US. But I will stop here, and we can discuss more as the discussion goes on. Thank you so much.<br />
Host, EuroDIG 2024, WS 1: Thank you so much, Meri. And now, Emily. We've witnessed deepfakes influence and interfere with political discourse and voter perception in recent years. Given the rapid advancements in deepfake technology and its increasing accessibility, what are the most pressing challenges we face in regulating and mitigating the misuse of this technology in the political arena?<br />
Host, EuroDIG 2024, WS 1: Thank you for the question, Pilar. Can everyone hear my voice? I'm trying to speak out loud. Perfect. I'm Emily Hachetan, coming from the Council of Europe's Advisory Council on Youth, and I would like to explain to you first what the structure is and how we function, in order to understand how young people can help with the regulations and with the recommendations to<br />
Host, EuroDIG 2024, WS 1: deal with deepfakes and other AI-related issues that we face. So the Advisory Council on Youth works next to the Committee of Ministers, and we sit on a joint council and discuss political issues that are violating human rights, democracy and the rule of law. And I'm representing the portfolio on digitalization and young<br />
Host, EuroDIG 2024, WS 1: people and artificial intelligence. And this issue, the issue of deepfakes, is a concerning one, because it is important to implement regulations targeting deepfakes and synthetic media. And I really like that Meta has started implementing labelling, because that is something that I actually wanted to propose.<br />
Host, EuroDIG 2024, WS 1: It is an idea that China has been implementing since 2019, according to their Cyberspace Administration of China; they took a step to issue this regulation and make it mandatory for all the social media platforms. So I believe this could be crucial for the European Union as well. We could implement and adapt this regulation to our needs<br />
Host, EuroDIG 2024, WS 1: and regulate all social media, not only Meta, in order to label all the content, because labelling will help people to be more sure whether it's authentic or not. And I'm also a data science student, so, coming from a technical background, this is not hard to implement. It is an AI model that can be easily created through training and testing. And as more deepfake data is being<br />
Host, EuroDIG 2024, WS 1: created right now, these AI models can be trained and used very easily, so it can be done in less than a month. And how young people can actually get involved in this would be having more focus groups and having more young people actually interacting with deepfakes. As we are the generation who use these technologies the most, it's easier for us. I noticed that it's easier for us<br />
Host, EuroDIG 2024, WS 1: to understand which one is a deepfake and which one is not. And maybe if there was a website, a tool, where young people could choose which one is real, authentic content and which one is AI-generated content, this could help Meta and other technical companies to develop these AI tools even more easily.<br />
Host, EuroDIG 2024, WS 1: And, due to the regulation, the companies that do not have these detector tools should be fined in a way, because this is exactly like copyright issues, this is exactly like violent content, and it should be treated the same way, and a solution should be implemented for this, because this is also due to algorithms. And if algorithms can be modified to actually take down<br />
Host, EuroDIG 2024, WS 1: this deepfake content, then many people will be less manipulated and could enter social media with more of a safety net, because we always enter social media and we never know what to trust or what not to trust, which is a big, big issue in our society now.<br />
Host, EuroDIG 2024, WS 1: And yes, why young people? Why would young people be a good help to engage with these fast-moving technologies? It's because it is our future. We are currently building our future together. And with EuroDIG it's great that this co-management system is being developed, and it should be done not only on international levels but also<br />
Host, EuroDIG 2024, WS 1: on national levels in different countries, because young people can help policymakers to think of these regulations even better, and we will present our YOUthDIG messages in these couple of days, I hope. We are also excited for it, because there are many solutions we came up with. And yes, as I am mentioning that young people can help with this regulation,<br />
Host, EuroDIG 2024, WS 1: I also find that it's very important to implement an educational curriculum in the middle school educational system, because we have the informatics class, but what does it entail? It has Excel classes, Microsoft Office classes. But now, as AI is becoming a tool especially, it is important to teach young people how to use these tools<br />
Host, EuroDIG 2024, WS 1: properly, in order to improve our work and become more productive, instead of just losing our critical thinking due to these AI technologies. So I would urge everyone to consider implementing this AI literacy in educational courses and curriculums in many countries, because afterwards these young people are going to spread the message to the older generation. Young people<br />
Host, EuroDIG 2024, WS 1: can raise awareness and also teach older people how to use it and how to understand it better, because AI is viewed as a bad thing in our world, as something that is going, for example, robots in the future are going to take us over. But in good hands it can be used for social impact, and with a strong background knowledge<br />
Host, EuroDIG 2024, WS 1: of it, it can be done very well and eliminate manipulation on social media and other platforms. Thank you.<br />
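The detector Emily describes, a model "created through training and testing" on labelled examples of real and AI-generated content, can be illustrated with a minimal sketch. Everything below is hypothetical and not from the session: the four-dimensional feature vectors stand in for whatever signals a real system would extract from media (e.g. frequency-domain artifacts or face-landmark statistics), and the data is synthetic. A production deepfake detector would use deep networks and large labelled datasets, but the train-then-test loop has the same shape:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in features: a real detector would extract signals from
# images or audio; here each example is just a 4-dimensional vector, with
# "fake" examples shifted away from "real" ones so the classes are separable.
n = 400
real = rng.normal(loc=0.0, scale=1.0, size=(n, 4))
fake = rng.normal(loc=1.5, scale=1.0, size=(n, 4))
X = np.vstack([real, fake])
y = np.concatenate([np.zeros(n), np.ones(n)])  # 0 = real, 1 = fake

# Shuffle, then split into a training set and a held-out test set.
idx = rng.permutation(len(y))
X, y = X[idx], y[idx]
split = int(0.8 * len(y))
X_train, X_test = X[:split], X[split:]
y_train, y_test = y[:split], y[split:]

# Train a minimal logistic-regression classifier by gradient descent.
w = np.zeros(X_train.shape[1])
b = 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X_train @ w + b)))  # predicted P(fake)
    w -= 0.1 * X_train.T @ (p - y_train) / len(y_train)
    b -= 0.1 * float(np.mean(p - y_train))

# Evaluate on examples the model has never seen.
p_test = 1.0 / (1.0 + np.exp(-(X_test @ w + b)))
accuracy = float(np.mean((p_test >= 0.5) == y_test))
print(f"held-out accuracy: {accuracy:.2f}")
```

On separable synthetic data like this the held-out accuracy lands around 0.9; the point is only the workflow, collect labelled examples, train, and evaluate on held-out data, which is also how a crowdsourced "real or AI?" site of the kind Emily imagines could feed labels back into model training.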
Host, EuroDIG 2024, WS 1: Thank you so much. And<br />
Host, EuroDIG 2024, WS 1: thank you, YOUthDIGgers, for your support. And yeah, I'm glad you introduced the YOUthDIG messages, and I think that's great. And I encourage everyone here to come and listen to them, because they're designed by the young people to give you, the senior stakeholders, the chance to listen to their messages.<br />
Host, EuroDIG 2024, WS 1: And I'm so glad that Meri and Meta are implementing the labelling. So, Everton, in light of the growing prevalence of political deepfakes, what measures can be taken to rebuild and maintain trust in democratic processes? How many hours do we have to answer that question? I would love to<br />
Host, EuroDIG 2024, WS 1: make a very comprehensive answer to that, because it's a very complex scenario. But well, I would like, anyway, to thank you for the invitation, and thanks to all of you here and remotely following this debate. I'm from NIC.br; here I'll be speaking on behalf of the Brazilian IGF.<br />
Host, EuroDIG 2024, WS 1: The impact of deepfakes largely depends on the informational environment they find. That environment is composed of many pieces. We all know that in a very healthy environment, with diverse sources, robust journalism and careful fact-checking, deepfakes can quickly be proven fake. So that's the easy part,<br />
Host, EuroDIG 2024, WS 1: whether they come in video, audio or both, because we have both in development. So rebuilding trust, in my opinion, starts with the fostering of a robust information environment. As my background is in communications, I firmly believe that the best way to combat misinformation begins with trustworthy information, and journalism alone<br />
Host, EuroDIG 2024, WS 1: won't solve every problem, but collaborating with media professionals is essential to provide a wider audience with reliable information. And this information exists on the Internet, despite so many challenges, ranging from technology to finances, or to financing journalism everywhere. So<br />
Host, EuroDIG 2024, WS 1: this depends on an information environment which is solid and healthy, that has good journalists working, being well paid, with republican criteria and seeking impartiality,<br />
Host, EuroDIG 2024, WS 1: although the truth is mostly an intention rather than an end in itself.<br />
Host, EuroDIG 2024, WS 1: Deepfakes will prevail when reliable information is living in the shadows of the Internet. So it's not about being by default against the media outlets that are more biased or that have their political positions, speeches and so on, but it's about considering more strongly those who seek this sort of consensual truth, so to say, that are closer to this relationship with reality itself.<br />
Host, EuroDIG 2024, WS 1: All of them have their places, both those more biased and those who are more, so to say, neutral. They have their places in our societies, and they deserve to live independently from the major digital platforms, which is a reality<br />
Host, EuroDIG 2024, WS 1: ever more distant in the present times. And disregarding this discussion and letting the spreading of misinformation go on is a dead end for digital platforms, for their users, who may be personally affected by deepfakes, for trust in the Internet, and for societies as a whole. So this is not a small topic that we are talking about here.<br />
Host, EuroDIG 2024, WS 1: To compose the discussion on this issue: last year CGI.br, the Brazilian multistakeholder Steering Committee (we have a board member from CGI.br here with us),<br />
Host, EuroDIG 2024, WS 1: published a study suggesting ideas. One of the ideas is a public fund, financed by major Internet platforms, as one of the possible ways (we have a lot of ways ahead, but one of them would be this public fund) to improve the Brazilian informational environment. And while this is not the only suggestion on the table today, CGI.br is sure<br />
Host, EuroDIG 2024, WS 1: that this debate should be accepted as legitimate, in the sense that regulation deserves to be at least discussed by policymakers from all branches: the lawmakers, the executive branch, and the judiciary as well.<br />
Host, EuroDIG 2024, WS 1: So let's not forget the role that each stakeholder has in feeding and organizing those discussions, including the very well-articulated youth groups. Because, after all, do deepfakes affect young people in a political environment? Yes, they do.<br />
Host, EuroDIG 2024, WS 1: Are deepfakes becoming simpler as time goes by, or are they becoming more complex? More complex, right? So these challenges give us an opportunity to organize and promote engagement to tackle those issues.<br />
Host, EuroDIG 2024, WS 1: One other<br />
Host, EuroDIG 2024, WS 1: important topic that I would like to highlight is tracking those who finance the creation and spreading of deepfakes. This requires agile coordination and implementation of efforts with law enforcement agencies and other authorities. And additionally, it's very important to recognize that the origin and development of deepfakes depend on the development of artificial intelligence mechanisms,<br />
Host, EuroDIG 2024, WS 1: which require permanent improvement; otherwise we will only be discussing and debating the effects of deepfakes, instead of looking at the origins of deepfakes themselves.<br />
Host, EuroDIG 2024, WS 1: Coming to the end of my intervention: our social connections allow us to achieve great results from consensus-based stakeholder debates which consider the scenarios and challenges in different regions.<br />
Host, EuroDIG 2024, WS 1: So, in that sense, it's necessary that each region convene those discussions to address their challenges regarding deepfakes, for example using the NRI structure for that, but not only the NRI structure.<br />
Host, EuroDIG 2024, WS 1: When governments, tech companies and, for example, citizens work together, we can create an environment<br />
Host, EuroDIG 2024, WS 1: that tackles unhealthy content which compromises our informational environment. And these collective constructions include providing social and technological skills that combat deepfakes inside and outside of<br />
Host, EuroDIG 2024, WS 1: social media channels. Those tools should carefully consider the different languages, for example, that are spoken in a territory. Sometimes our tools can easily recognize<br />
Host, EuroDIG 2024, WS 1: deepfakes which are created in one language, but not in another one, when it comes to audio deepfakes,<br />
Host, EuroDIG 2024, WS 1: or if a targeted group is constantly being harassed, for example, and many other important issues. So the greater the size of a player in this environment, the greater the responsibility it has; and our role as citizens and conveners of debates is not small either.<br />
Host, EuroDIG 2024, WS 1: So, to sum up, protecting trust in our democracy when there are lots of deepfakes means using many different ways. So there's space for everyone. I could stay here talking for hours, but that's it for the moment. Thank you very much, Pilar.<br />
Host, EuroDIG 2024, WS 1: Thank you so much, because it's true that<br />
Host, EuroDIG 2024, WS 1: there are various deepfakes, like Meri mentioned: audio deepfakes, face and video deepfakes, picture deepfakes. It's becoming very hard for us to detect them. So, Thomas, how can we empower youth to combat the rise of disinformation fuelled by deepfake technology?<br />
Host, EuroDIG 2024, WS 1: Thank you for that question. So yeah, my name is Thomas. What I'm I'm from Norway, and I have my background in organization called the UN Student Association.<br /> | |||
Host, EuroDIG 2024, WS 1: I think that question is, it's a very good question. I think?<br /> | |||
Host, EuroDIG 2024, WS 1: it's also followed up with what? What's kind of<br /> | |||
Host, EuroDIG 2024, WS 1: what initiatives have shown promise in in this area. And I think you know, empowering youth is something that has shown promise when it comes to these areas? These these questions are a little bit inter related. But I also think the 1st question we have to then ask is, who is going to empower youth, you know, is it? Are we talking about teachers, decision makers, parents, other youth. And of course the the answer is, at least in my opinion, that it's not just one group<br /> | |||
Host, EuroDIG 2024, WS 1: that we're talking about to to empower you if it has to come from a united force, if you will. And why is this important? Because I think more and more we talk about.<br /> | |||
Host, EuroDIG 2024, WS 1: or, for example, we've talked a little bit already today about, you know, we're being called digital natives, the youth<br /> | |||
Host, EuroDIG 2024, WS 1: and and I'm not sure entirely whether or not that is a very useful, or at least helpful term to to use, because, if anything, it just creates a little bit of a further division between this generation and the previous generation, while in reality we're probably better off trying to stay on the on the same page, and and rather our our great way to empower youth is to be<br /> | |||
Host, EuroDIG 2024, WS 1: is for these other groups to be a safe and trusted place for them to rely on or to rely on to come with their sources. So what do I mean with this?<br /> | |||
Host, EuroDIG 2024, WS 1: For example, picture, you have a scenario of a a teacher sorry, a student, or a child<br /> | |||
Host, EuroDIG 2024, WS 1: that comes to you and shows to you something on the phone that they saw on. Yeah, let's say, Tiktok or Instagram and the teacher or parent, they just kind of dismiss it and say, you know, you shouldn't be on Instagram and Tiktok and get your information from these sources. You should, you know. Read the real news.<br /> | |||
Host, EuroDIG 2024, WS 1: Now, I think my point is that that is not a very useful way of answering and why is that? Well, the reality is that these platforms they're here to stay, and youth are on them, and they will form their attitudes, and they will get their news from these sources, whether or not you like it<br /> | |||
Host, EuroDIG 2024, WS 1: so rather than than dismissing it, I think it's important that the the recipient group has the necessary tools to feel confident, answering in an empowering way, and that this it can be a way of, you know, bridging this generational gap.<br /> | |||
Host, EuroDIG 2024, WS 1: I know, for example, of some initiatives that have shown promise in my home country of Norway. They have been developing something called conversation cards, which can essentially be used between a parent and a child around the kitchen table, or in a classroom with a teacher talking to their students about where they get their information and<br />
Host, EuroDIG 2024, WS 1: what they can do if they're uncertain of what the intention behind this information could be.<br />
Host, EuroDIG 2024, WS 1: And essentially, this provides these educators, parents or teachers, with the right questions to lead the conversation with youth in a constructive way. And I mean, that's one aspect of it. But then I also want to stress that youth also have a responsibility to other youth when it comes to this.<br />
Host, EuroDIG 2024, WS 1: So that means checking our own attitudes and how we respond. If your friend says something or sends something in a group chat that you're not entirely certain about, bother enough to take that conversation and ask them: okay, where did you get this from? Is this just something you saw? Could this be presented in a way that has a certain<br />
Host, EuroDIG 2024, WS 1: narrative behind it? Actually asking these questions is going to be, I think, a really important tool for combating misinformation and disinformation, especially with things like deepfakes that can seem very convincing.<br />
Host, EuroDIG 2024, WS 1: So essentially, to sum up, all generations have to be a part of the solution. And it's so important that we know how to have the conversation, ask the right questions, follow up on them, and have the confidence to do so.<br />
Host, EuroDIG 2024, WS 1: Because even though we're talking about cutting-edge technology that is in rapid development, I think that doesn't necessarily mean we need to have a cutting-edge answer. Sometimes we can also use non-tech answers to talk about things we don't even entirely know how they came about, like deepfakes. Most people probably won't know the technical side of it, but you can still talk about it in a non-technical manner.<br />
Host, EuroDIG 2024, WS 1: So the answer doesn't necessarily have to be something drastically new. And I think my key point is that it's important to keep a little bit of the human touch in all of this, and to know how to talk to each other about these kinds of topics<br />
Host, EuroDIG 2024, WS 1: in a more discursive way. And I'm going to turn to Meri, because, Thomas, you referred to social media platforms in your intervention. Meri, as Thomas mentioned, how are you implementing tools to make people feel confident on your platform?<br />
Meri Baghdasaryan: Thank you so much, and thanks to my fellow speakers for very interesting interventions. To start, I'll clarify that the Board is not<br />
Meri Baghdasaryan: Meta, so I'm not speaking on behalf of Meta here. Overall, as I was saying, based on the Board's case law and overall position, I think there are several things that should<br />
Meri Baghdasaryan: work together to achieve user trust, which is the ultimate goal at this<br />
Meri Baghdasaryan: point in our development. And I think we need to look at this<br />
Meri Baghdasaryan: depending on which situations we are taking into account. So I think this session mostly tried to focus<br />
Meri Baghdasaryan: more generally on deepfakes and AI-generated manipulated content, but also more specifically on the context of elections, in which, in my opinion, this type of manipulated content poses more specific risks.<br />
Meri Baghdasaryan: And I think there should be a combination of several factors. One, as mentioned, is technological solutions. As Emili was saying, AI is not<br />
Meri Baghdasaryan: an inherently bad thing; there are a lot of benefits to using AI technology. But using things such as labeling can be very helpful for users to understand what is human-created content and what is AI-generated, and you can already see that implemented on Meta's platforms.<br />
Meri Baghdasaryan: The second thing, in my opinion, as has been mentioned several times, is the need to continue with more<br />
Meri Baghdasaryan: user empowerment and overall digital literacy development. This is also something that the Board has been pushing for in its decisions, starting from providing more notification to users when a post they made was in violation of the community standards. In the first year of the Board's existence, it already had cases where the Board called on Meta to provide more reasoning for its<br />
Meri Baghdasaryan: content takedown decisions and to provide users with the opportunity to explain<br />
Meri Baghdasaryan: what they were trying to say in their post, which, in the case of, for instance, satire or humor, is quite difficult to understand for moderators or for AI content-moderation systems. So I think, overall, beyond platforms, we need to invest more in digital literacy. For as long as I can remember myself in this space,<br />
Meri Baghdasaryan: this has been a theme, and I think technologies come and go and new ones emerge, but the need for digital literacy stays the same. And the skills, in my opinion, that are useful for everyone regardless of age and profession are actually quite transferable to new types of technologies. And the third factor, in my opinion, is cooperation across different stakeholders.<br />
Meri Baghdasaryan: A lot of groups are thinking about similar issues, not only from a technological perspective, but generally understanding the impacts of different technologies around the world. The impacts are also different<br />
Meri Baghdasaryan: in different regions, and I believe cooperation, multistakeholder cooperation, is the third factor that is very much needed, especially now with the much faster development of technologies. So I don't have one solution to answer your question, but I think several things need to work<br />
Meri Baghdasaryan: together to make this happen. And definitely platforms like this, and especially having youth voices, are very crucial in this process. Thank you.<br />
Host, EuroDIG 2024, WS 1: So, Emili, over to you. Yes, I would actually like to ask a question to Meri, because I'm very curious whether there is a possibility to implement into the algorithm the prioritization of educational content on AI literacy. You mentioned the point that digital literacy, AI literacy, is very important, and I was curious whether Meta has the possibility and the opportunity<br />
Host, EuroDIG 2024, WS 1: to give this small instruction to the algorithms in order to prioritize this kind of content, because if it is shown to more people, then we can actually raise awareness. And from the other side, as we are going to educate young people, we can give them an opportunity to create this kind of content. And if this content is created and Meta is pushing and promoting it, it would be a great cooperation between young people and the private sector.<br />
Host, EuroDIG 2024, WS 1: So, Meri, would you like to take the question?<br />
Meri Baghdasaryan: Sure, that's a great question, Emili. Thank you for that. Again, not speaking on behalf of Meta, but<br />
Meri Baghdasaryan: based on what we know,<br />
Meri Baghdasaryan: I think, given the sheer amount of content on the platform, it may not be that easy to determine which content is educational, in order to recommend it more or less.<br />
Meri Baghdasaryan: I think what may be helpful, in addition to labeling, is to potentially provide more prompts to users. I think, for instance, Twitter was doing a lot around misinformation when people were sharing posts without reading them, with a prompt when you were trying to forward the message. I think Facebook was doing the same, and Meta was doing the same on WhatsApp<br />
Meri Baghdasaryan: during COVID, when people were just forwarding messages to bigger groups and spreading information without verifying it.<br />
Meri Baghdasaryan: And they basically restricted this functionality to a certain number of groups, and then prompts were in development. So I think these kinds of prompts, which will also include<br />
Meri Baghdasaryan: more of this<br />
Meri Baghdasaryan: "take another look", "think about this a bit more", "stop and think, and then act" kind of messaging, will be helpful. And the Board is definitely pushing for this, not only for AI-generated content but more generally, because this is an issue regardless of<br />
Meri Baghdasaryan: whether the content is manipulated or is just a simpler disinformation piece. So if you have other ideas: the Board usually, in all cases, has a call for public comments, and we would love to hear more ideas and your thoughts or expertise on specific things that you think may make sense. The Board is happy<br />
Meri Baghdasaryan: to take them into consideration, because this is also a direction we're trying to make an impact on.<br />
Host, EuroDIG 2024, WS 1: Amazing, thank you so much. Thank you, Meri. And Everton, I would like to turn to you, because we label things, we try to improve trust, but how can we collectively enhance the public's ability to distinguish between fact and fake?<br />
Host, EuroDIG 2024, WS 1: Come on.<br />
Host, EuroDIG 2024, WS 1: The final answer, I think, is still to be discussed, but I think a good start is promoting more debates, collective debates, and sharing ideas on how to build the future that we want ahead. So in the case of the Brazilian IGF,<br />
Host, EuroDIG 2024, WS 1: the Brazilian Internet Steering Committee organizes the events, and then we have this call for proposals. Many of those proposals debate deepfakes and disinformation and other aspects, and this is one good way forward for sure. So I'm quite sure that by collectively<br />
Host, EuroDIG 2024, WS 1: discussing and promoting those debates, not only<br />
Host, EuroDIG 2024, WS 1: from our perspectives, but also allowing other voices, this is a good way forward.<br />
Host, EuroDIG 2024, WS 1: So, Thomas, would you like to add?<br />
Host, EuroDIG 2024, WS 1: Well, I mean, I agree with everything Everton just said. I think<br />
Host, EuroDIG 2024, WS 1: something that I also wanted to bring attention to, which hasn't been discussed yet, and I can ask whether my fellow panelists agree with me or not, is that there is also a certain responsibility in how we talk about these types of things. Disinformation is different, especially with deepfakes; it can seem very, very scary, as if the world is kind of<br />
Host, EuroDIG 2024, WS 1: coming to an end because we can't trust anything online anymore. And,<br />
Host, EuroDIG 2024, WS 1: because we've also talked quite a bit about this back home as well: if the people that use, let's say, social media are<br />
Host, EuroDIG 2024, WS 1: very scared and become quite frustrated, or essentially, if you're scared of something, you become less critical, or your ability to think critically diminishes, right? So there's also a responsibility, maybe more for journalists, but also for social media, in how they talk about these types of issues and<br />
Host, EuroDIG 2024, WS 1: how they might expect the audience to perceive them.<br />
Host, EuroDIG 2024, WS 1: And also, of course,<br />
Host, EuroDIG 2024, WS 1: all the young people here may not know, but there was a time in which there was no Internet<br />
Host, EuroDIG 2024, WS 1: to interact like this. But jokes aside,<br />
Host, EuroDIG 2024, WS 1: real conversations in real spaces need to take place.<br />
Host, EuroDIG 2024, WS 1: Sometimes we tend to think of,<br />
Host, EuroDIG 2024, WS 1: or to only convene, meetings and discussions online, and to think of the effects of deepfakes and fake news or misinformation in<br />
Host, EuroDIG 2024, WS 1: the very restricted space of social media, but they also affect people through gossip and other<br />
Host, EuroDIG 2024, WS 1: analog situations. Of course, deepfakes are not analog; they are digital artifacts. But still, these conversations need to take place. As you were mentioning, one very interesting point<br />
Host, EuroDIG 2024, WS 1: is to talk to your peers and talk to your friends, because sometimes they are the sources of the transmission of the fake content. But not only that: sometimes they are,<br />
Host, EuroDIG 2024, WS 1: maybe by accident, spreading that disinformation, or even generating it. And<br />
Host, EuroDIG 2024, WS 1: talking to each other, knowing the opinions of your friends, staying in contact with them in the real world, is one of the ways that we have on hand to identify whether that content is<br />
Host, EuroDIG 2024, WS 1: true or fake. Yeah, Emili. So yes, I'd like to add, I agree with both of my fellow panelists.<br />
Host, EuroDIG 2024, WS 1: And the thing is that the digital world is becoming bigger and bigger, but it can never take the place of human interaction, because at the end of the day all of us have been living life since childhood, when AI was not there and social media was not there. It will never fully replace it, because, for example, ChatGPT can never replace my teacher, because this human interaction,<br />
Host, EuroDIG 2024, WS 1: emotions, and all this attachment are more important, and all the experiences that we have lived through are more important than any tool can be. I can use GPT as, I don't know, a calculator, in order to do my coding or to help me understand my course content, but it will never replace my teacher or my friends. It can never do that.<br />
Host, EuroDIG 2024, WS 1: But it's important to embrace these tools and learn how to use them productively, because you can use them and lose your critical thinking, or you can use them and become 100 times more productive and finish your work in 10 minutes instead of,<br />
Host, EuroDIG 2024, WS 1: I don't know, 3 hours, which is great. We are evolving, and we are embracing innovation. So instead of just being scared, we should embrace it, and also try not to have this gap between the generations, because that's also very important.<br />
Host, EuroDIG 2024, WS 1: Yeah, I think Emili put that very well. And like you're saying, yes, we're probably going to have to learn a few new things, and how to use it in the most optimal manner. But then, maybe on the other side of that, we also have to make sure that we<br />
Host, EuroDIG 2024, WS 1: don't get so far ahead of ourselves that we kind of forget what it was like before. We don't want to become too reliant on these types of tools, so that we forget what it's like to have a real conversation, or what it's like to be in a real classroom. So maybe we then have to shift our focus a little bit, to have more discussions about how to do things the good old-fashioned way,<br />
Host, EuroDIG 2024, WS 1: and also appreciate that not everything, just because it's new, is necessarily that much better.<br />
Host, EuroDIG 2024, WS 1: So I'm going to leave the floor to someone in the room now.<br />
Host, EuroDIG 2024, WS 1: The real reason is, the microphone is here. So thank you very much. I've been listening to everyone talk about different subjects. There are a few questions here. When you see a video on the Internet, you immediately believe it, because most people will actually believe the video immediately and assume it is true.<br />
Host, EuroDIG 2024, WS 1: Actually, there were some sessions talking about fake news and about detecting these kinds of problems. The issue is that people will believe it because they have no reason not to; by default, humans have a tendency to just believe things. What happens is that if you consider social networks like Facebook, Twitter, or whatever it's called these days,<br />
Host, EuroDIG 2024, WS 1: and anything else, people will just believe things because they are just there. Facebook, a few years ago, created something called fact checkers, or something similar to that name, and this is actually a very good idea: you have someone you should trust who actually validates this. I can tell you that today, in my country, the two that were chosen<br />
Host, EuroDIG 2024, WS 1: are the media, which is the biggest source of fake news that we have, so clearly that didn't work, because most of the answers were wrong. So my suggestion here is that we might need to do fact-checking on AI videos and make sure we know whether they were created by AI or not. But the problem starts much earlier than that. The reality is that what we need is critical thinking.<br />
Host, EuroDIG 2024, WS 1: We need to make sure that people are aware of things. We need to make sure that kids from a very early age, when they are 2 or 3 years old, are going to ask the questions that my niece kept asking. When I talk to her and say something like this, she says: oh, that seems very strange, let me ask someone, let me ask my parents to see if this is true or not. And this is the kind of critical thinking that matters, because that is actually important. So think about this: if you have fake news, if you have<br />
Host, EuroDIG 2024, WS 1: these kinds of things,<br />
Host, EuroDIG 2024, WS 1: have people double-check them. Okay. By the way, the best conspiracy theory is not that the Earth is flat; it's that the Sun is flat. Try to prove that one; that one is much better.<br />
Host, EuroDIG 2024, WS 1: Now we're going to move to the closing remarks, and I'm interested in keeping it very brief, so I'm going to ask each speaker to give one tweet- or X-like headline that you want the people in the room to leave with. And we're going to begin with Meri.<br />
Meri Baghdasaryan: Yes, thank you so much. This has been very interesting; I learned a few things from the audience and from fellow speakers. Thank you so much for having me. My last words would be just reinforcing what we were saying: that there is a need for more cooperation and a need to exchange the thoughts and ideas that we have,<br />
Meri Baghdasaryan: and to invest more not only in technology but also in social skills, human connection, and, most importantly, digital literacy, which is<br />
Meri Baghdasaryan: a theme that is here to stay. And plus one to Andres's<br />
Meri Baghdasaryan: points. I don't want to say the default should be not trusting, but definitely "trust, but verify" should be the model for everyone. And this goes for both online and offline information, because fake news is not that new; the Internet didn't invent fake news. It has been there for a long time; it's just that the spread and reach have changed. And this<br />
Meri Baghdasaryan: will be, and<br />
Meri Baghdasaryan: already is, a big issue with AI-generated, manipulated content. So I really look forward to reading the youth messages, and definitely, if you have any ideas, please feel free to reach out; we'll be happy to discuss this further and also cooperate later. Thank you so much.<br />
Host, EuroDIG 2024, WS 1: Thank you, Meri. Emili? Yes, I would like again to highlight and support the point that Everton brought up: that we need to have a body that will include media, policymakers, tech companies, and young people. With this diverse body we can have an open conversation all the time, because, for example, we came to EuroDIG, and I myself am from Armenia, and<br />
Host, EuroDIG 2024, WS 1: there are also some Armenians here from the IGF in Armenia, and now we are thinking of ways we can collaborate. These kinds of conferences and events help people meet each other, connect, and actually come up with something good, because we can build our future together, no matter which generation it is. And it's just important to<br />
Host, EuroDIG 2024, WS 1: continue having real live talks and continue trying to decrease this gap between our generations, and to include people and multi-stakeholders from different places in order to come up with the best solutions and fight the issues that we're facing, because they are becoming serious and we need to be able to face them<br />
Host, EuroDIG 2024, WS 1: together. And yes, thank you. Thank you, Emili. So, Everton? Oh, great segue! Just to add<br />
Host, EuroDIG 2024, WS 1: to the last point:<br />
Host, EuroDIG 2024, WS 1: don't let others build<br />
Host, EuroDIG 2024, WS 1: your truth, the truth of your group. So if you have an issue to tackle, organize yourselves. Finding funding is hard for basically everyone, but don't let others tell the truth for your group. So if you're feeling<br />
Host, EuroDIG 2024, WS 1: targeted by this kind of misinformation or disinformation, organize yourself, organize your group, make yourselves heard. And as I said before, truth starts with fostering a robust information environment, and that<br />
Host, EuroDIG 2024, WS 1: requires all of you building the reality that we want in the future. Thank you. Thank you. Maybe to sum it up: when it comes to the realm of misinformation and disinformation, real empowerment comes from being able to take on those difficult discussions and not being afraid to be critical, even if it's of your friends or family, or someone who might be in a higher position than you.<br />
Host, EuroDIG 2024, WS 1: Yeah, let me just handle that. Thank you. I want to finish by thanking the speakers, the youth track, all the volunteers, the rapporteurs, and everyone involved in creating this session. And of course you, the public, who have been wonderful. And with this I think we can wrap it up.<br />
Host, EuroDIG 2024, WS 1: So thank you, everyone.<br />
[[Category:2024]][[Category:Sessions 2024]][[Category:Sessions]][[Category:Side events 2024]]
Revision as of 17:30, 26 June 2024
17 June 2024 | 14:30 - 15:30 EEST | WS room 1
Consolidated programme 2024
In the crucial election year of 2024, through a dialogue between the current and next generation of experts and leaders, we will explore the challenges posed by political deepfakes, their direct and indirect effects on elections and public trust, and collaborative strategies to safeguard the integrity of democratic processes.
Session description
The UN IGF Secretariat, in collaboration with the current IGF Host Country and all Youth IGF coordinators, is designing and implementing the IGF Youth Track as a capacity development activity implemented throughout the year-round IGF process including at the annual IGF meeting. This year's IGF Youth Track is focused on unpacking AI governance. Under the motto Empowering youth leaders for trusted AI, the track will be officially launched at EuroDIG with the very first capacity development workshop.
Through a dialogue with senior stakeholders, a network of Youth IGF coordinators will explore the significant influence of AI on democracy through deepfakes impacting the political discourse. In recent years, the quality of deepfake technology has improved dramatically, and with its widespread accessibility, it is now available to a vast audience at minimal cost. It is estimated that the number of deepfakes tripled in 2023, compared to 2022. With 2024 being a pivotal election year, this session explores the impact of deepfakes on the political discourse, the challenges arising from this, and how we can collaboratively strive to safeguard the integrity of elections and democratic processes.
Together with senior experts, the Youth IGF network will discuss both the direct and indirect effects of political deepfakes. While political deepfakes can be used in campaigns to influence voters and elections, the indirect effect of this widespread technology cannot be understated. The pollution of the public information ecosystem and the inability to distinguish between fact and fake can lead to uncertainty and distrust. In this session, we will delve into strategies to enhance election integrity and build trust in democracy amidst the growing prevalence of political deepfakes.
Format
Interactive roundtable exchange between youth from around the world and senior experts. Several policy questions will guide the discussion:
- How have you seen deepfakes influence and interfere with political discourse and voter perception in recent years? Can you share specific examples where deepfakes have significantly impacted election outcomes or political campaigns?
- Given the rapid advancements in deepfake technology and its increasing accessibility, what are the most pressing challenges we face in regulating and mitigating the misuse of this technology in the political arena? In your experience, how can youth help reshape these regulatory frameworks?
- With 2024 being a pivotal election year, what strategies have you implemented, or would you recommend be introduced, to safeguard the integrity of elections against the threats posed by deepfakes? How can we ensure that voters can trust the information they receive? What role do social media platforms have with respect to providing accurate information to people?
- How can we empower youth with digital agency to combat the rise of disinformation fueled by deepfake technology? What educational initiatives or programs have shown promise in helping young people identify and resist deepfake misinformation?
- In light of the growing prevalence of political deepfakes, what measures can be taken to rebuild and maintain trust in democratic processes? How can we collectively enhance the public's ability to distinguish between fact and fake to prevent the erosion of trust in our information ecosystem?
Further reading
About the IGF Youth Track
Former IGF sessions on deepfakes
About deepfakes
- Tackling deepfakes in European Policy, written by the European Parliamentary Research Service
- A look at global deepfake regulation approaches, written by Responsible AI Institute
- Facing reality? Law enforcement and the challenge of deepfakes, written by the Europol Innovation Lab
People
Focal Points:
- Anja Gengo, UN IGF Secretariat
- Nadia Tjahja, YOUthDIG Coordinator
Moderator:
- Pilar Rodriguez, YOUthDIG Organizing Team and coordinator of Youth IGF Spain
Speakers and contributors:
- Chengetai Masango, UN IGF Secretariat, Head of Office
- Emili Khachatryan, YOUthDIG 2024/Council of Europe Advisory Council on Youth
- Thomas Aarheim, YOUthDIG 2024/UN Student Association of Norway
- Everton Rodrigue, CGI.br/Brazil IGF
- Meri Baghdasaryan, Senior Case and Policy Officer, Meta Oversight Board
Rapporteurs:
- Phyo T., Myanmar Youth IGF coordinator
- Saba Tiku, Ethiopia Youth IGF coordinator
Online moderators:
- Levy Syanseke, Zambia Youth IGF
Video record
Transcript
Disclaimer: This is not an official record of the session but is provided only for convenience. The following transcript is automatically generated by the Zoom recording function. Timestamps were removed; otherwise the text is presented in its original format, as provided by the Zoom system (e.g. including any spelling mistakes). The accuracy of the transcript is not guaranteed.
Host, EuroDIG 2024, WS 1: The room is gonna be packed
Host, EuroDIG 2024, WS 1: just for the remote participants, I will go over this. Okay. So please enter your Zoom name, and if you want to ask a question, please use the Zoom function.
Host, EuroDIG 2024, WS 1: You will be unmuted, and once the speaker view switches to your video, state your name and affiliation. So, having said that, I'm happy to hand the floor to our moderators of the session.
Host, EuroDIG 2024, WS 1: Thank you so much. Well, first we're going to have a welcome from the UN IGF Secretariat. Oh, you can't hear me. Can you hear me now?
Host, EuroDIG 2024, WS 1: stay here.
Host, EuroDIG 2024, WS 1: Just speak louder, like, okay. I'm going to lose my whole voice here today. Okay, so first, I think we have a welcome from the UN IGF Secretariat,
Host, EuroDIG 2024, WS 1: whom I invite to come and share with us.
Host, EuroDIG 2024, WS 1: Okay, if you want to start if you stand up.
Host, EuroDIG 2024, WS 1: Check. No.
Host, EuroDIG 2024, WS 1: you can do my keyboard from here. Yeah.
Host, EuroDIG 2024, WS 1: You know, we had this. I want to use that as a cheer. And it went really well. So, oh, okay, okay, I can do that, too. That's fine.
Host, EuroDIG 2024, WS 1: like lots of that. Okay, thank you. And
Host, EuroDIG 2024, WS 1: Greetings, everybody. My name is Chengetai Masango, and I am Head of the IGF Secretariat, based in Geneva. Some of you may know Anja Gengo; she works with me.
Host, EuroDIG 2024, WS 1: So I've got a speech for you, and I'll read it out. Okay.
Host, EuroDIG 2024, WS 1: It is a pleasure to officially launch the IGF 2024 Youth Track here in beautiful Vilnius. It was raining yesterday, but that's fine. With stakeholders joining us from around the world, on site and online,
Host, EuroDIG 2024, WS 1: I would like to thank EuroDIG and YOUthDIG for hosting us.
Host, EuroDIG 2024, WS 1: it's been nearly 2 decades have passed since Kofi Annan, the UN Secretary General convened the idea back in 2,006, and I don't think anybody here was there right? But I was in these dynamic times when changes are on the horizon.
Host, EuroDIG 2024, WS 1: It is good to remind ourselves about the historic journey the Ikef community went through from co consistent work which resulted in introducing the multi stakeholder model under the auspices of a traditionally
Host, EuroDIG 2024, WS 1: multilateral intergovernmental mechanism to implementation of key idea principles and values at the local regional
Host, EuroDIG 2024, WS 1: and global levels.
Host, EuroDIG 2024, WS 1: The changes I've mentioned relate to the upcoming Global Digital Compact and the 20 year review of the World Summit on the Information Society.
Host, EuroDIG 2024, WS 1: And these are both great opportunities for this community, including the youth. Specifically the youth as well
Host, EuroDIG 2024, WS 1: to upgrade. It's already excellent work and help ensure. The world has accessible, affordable, safe and secure human rights respecting resilient Internet. And it's good governance.
Host, EuroDIG 2024, WS 1: The journey of hard work and changes continues. This calls for continuous investment in the next generation of experts and leaders by the current generation
Host, EuroDIG 2024, WS 1: At the level of the IGF, this was recognized a long time ago through a dedicated Youth Track. Cooperation is developed across silos, through the cooperation of youth IGF initiatives existing around the world and the various international youth-focused initiatives.
Host, EuroDIG 2024, WS 1: The IGF Secretariat is proud to be able to work with these leaders and ensure that the track has a firm structure
Host, EuroDIG 2024, WS 1: and rich, globally relevant input coming from communities from all corners of the world.
Host, EuroDIG 2024, WS 1: This year the IGF Youth Track is focusing on the governance of artificial intelligence, under the motto
Host, EuroDIG 2024, WS 1: "Empowering Youth Leaders for Trusted AI".
Host, EuroDIG 2024, WS 1: Four capacity development workshops are to be hosted in conjunction with the regional IGFs, all leading to the IGF 2024 Global Youth Summit, hosted in Riyadh on the 15th of December.
Host, EuroDIG 2024, WS 1: From exploring AI-generated deepfakes here at EuroDIG, we are continuing our journey towards discussing the use of large language models in schools at the Asia-Pacific Youth IGF, to looking at human rights implications during the Latin American and Caribbean IGF,
Host, EuroDIG 2024, WS 1: to ways of building a strong ecosystem for good AI governance at the African IGF. Working with and for youth is working for a better tomorrow.
Host, EuroDIG 2024, WS 1: I am pleased to confirm that the Youth Track fosters a dialogue between young people and more senior experts.
Host, EuroDIG 2024, WS 1: The workshop today is no exception, and I look forward to learning from all of you about the issues and opportunities that AI brings to us. Thank you very much, and I wish you good deliberations.
Host, EuroDIG 2024, WS 1: Thank you so much, Chengetai, for your keynote. And now
Host, EuroDIG 2024, WS 1: I'll be welcoming all of you to this roundtable discussion on AI threats and new strategies to detect AI-generated political deepfakes, organized by the IGF Youth Track. I'm Pilar; I'm going to be the moderator for today, and I'm also a YOUthDIG organizer.
Host, EuroDIG 2024, WS 1: Here we will discuss one of the most pressing issues at the intersection of technology and security: as AI continues to advance at an unprecedented pace, we must be prepared to combat its misuse,
Host, EuroDIG 2024, WS 1: particularly in the realm of political misinformation. Deepfakes, hyperrealistic videos and audio recordings generated by AI, have emerged as a powerful tool for deception.
Host, EuroDIG 2024, WS 1: These sophisticated forgeries can manipulate public opinion, disrupt elections, and undermine trust in our institutions.
Host, EuroDIG 2024, WS 1: The ability to create convincing deepfakes is no longer confined to well-funded organizations. It's becoming increasingly accessible, raising the stakes for protection and prevention.
Host, EuroDIG 2024, WS 1: In our panel today, we will explore the latest strategies and technologies being developed to identify and mitigate these AI-related risks. To this end, we have a group of experts on this matter who are at the forefront of combating this digital menace.
Host, EuroDIG 2024, WS 1: And without further ado, let's begin our conversation by welcoming our distinguished panelists.
Host, EuroDIG 2024, WS 1: So online, I hope she's... yes, I can see her online. We have Meri, Senior Case and Policy Officer at Meta's Oversight Board.
Host, EuroDIG 2024, WS 1: To my left,
Host, EuroDIG 2024, WS 1: we have Thomas. He's a YOUthDIG 2024 fellow, from the UN Student Association of Norway.
Host, EuroDIG 2024, WS 1: Next, Emily, a member of the Council of Europe Advisory Council on Youth and a YOUthDIG 2024 participant as well. And finally,
Host, EuroDIG 2024, WS 1: we have Everton, an expert advisor to the board of CGI.br, from Brazil. So Meri, we will start with you.
Host, EuroDIG 2024, WS 1: Yes, I see you. Okay. So, with 2024 being a pivotal election year, what strategies have you implemented, or would you recommend be introduced, to safeguard the integrity of elections against the threats posed by deepfakes?
Meri Baghdasaryan: Thank you, Pilar. Can you hear me? Just to confirm.
Host, EuroDIG 2024, WS 1: Yes, we can hear you well.
Meri Baghdasaryan: Perfect. Thank you so much for having me. So, as Pilar mentioned, I work at Meta's Oversight Board, and for those who do not know about the Board, I will just give a brief overview. The Board is an independent body, so it is not part of Meta, and it is a body of 22 experts from around the world,
Meri Baghdasaryan: and it was created to answer some of the most difficult questions around free expression, human rights and content moderation globally. The Board is guided by international human rights standards in its binding decisions, which are in response to appeals from Facebook, Threads and Instagram users from around the globe.
Meri Baghdasaryan: The Board also issues non-binding recommendations
Meri Baghdasaryan: that aim to address more systemic content moderation issues on Meta's platforms.
Meri Baghdasaryan: Pilar started with a very big question, and I'll give a brief overview, but I'm happy to go into more detail as the discussion goes.
Meri Baghdasaryan: So, the protection of elections and civic space is one of the seven strategic priorities for the Board, and earlier, in May, the Board published its
Meri Baghdasaryan: piece on the key lessons learned around elections in this unprecedented year of elections around the world.
Meri Baghdasaryan: [audio breaking up, largely inaudible]
Host, EuroDIG 2024, WS 1: Meri, you're breaking up a little bit.
Host, EuroDIG 2024, WS 1: Is your connection stable?
Meri Baghdasaryan: Can you hear me better now?
Host, EuroDIG 2024, WS 1: Yes, now we can hear you better. Thank you.
Meri Baghdasaryan: So I was saying that earlier
Meri Baghdasaryan: the Board published the
Meri Baghdasaryan: key lessons learned from its case law around elections, and one of the main messages is that platforms need to
Meri Baghdasaryan: do everything to defend open civic space, which means they should provide space for members of the political opposition, human rights defenders and marginalized voices
Meri Baghdasaryan: to express their opinions and participate in democratic processes.
Meri Baghdasaryan: Other main messages from this paper include the fact that companies must set global platform standards for elections everywhere.
Meri Baghdasaryan: They should not let political speech that incites violence go unchecked. In terms of deepfakes, the Board acknowledges that lies have always been part of political campaigns; however, with the development of technologies, this has become more problematic, not only in terms of deepfakes and other AI-generated or manipulated content, but also in terms of
Meri Baghdasaryan: cheap fakes. And the most relevant case to our discussion is the decision on the altered video of President Biden, in which the Board reviewed a video that had been altered to make it appear as though the US President was inappropriately touching his granddaughter's chest.
Meri Baghdasaryan: The Board emphasized that mere falsehood cannot be the sole basis for restricting freedom of expression under human rights law, and outlined a number of gaps and inconsistencies in Meta's manipulated media policy. So eventually the Board decided that this content should stay on the platform, because it didn't violate the policy as it was in force at the time; however, it recommended a number of measures for Meta to review and
Meri Baghdasaryan: improve its manipulated media policy. One of those was that the policy at the time was
Meri Baghdasaryan: only concerned with
Meri Baghdasaryan: AI-modified content, but it was not, for instance, covering any other way of manipulating
Meri Baghdasaryan: content. It also treated audio and audiovisual media differently. Also, there was a distinction between altering what people were doing and altering what they were saying. But, you know,
Meri Baghdasaryan: around the world,
Meri Baghdasaryan: a lot of the time, audio-based deepfakes are also quite influential and have adverse
Meri Baghdasaryan: consequences. The main recommendation from this case was urging Meta to label content that is AI-generated or manipulated, instead of relying on fact-checkers, because there are several differences.
Meri Baghdasaryan: First of all, the Board noted that labeling is more scalable than relying on third-party fact-checkers, because
Meri Baghdasaryan: the fact-checking process works differently depending on the language and market where the specific content comes from. At the same time, when a piece of content is fact-checked and a fact-checking label is applied to it, it gets demoted, which means the content's ranking in the recommender system gets down-ranked. And this is actually a problem in terms of,
Meri Baghdasaryan: you know, the spread of this kind of content, and the Board in general is very concerned about the negative impacts of demotion on freedom of expression.
Meri Baghdasaryan: Meta, in fact, committed to implementing this recommendation to start labeling manipulated content instead of defaulting to removal, and since May 2024 Meta has already started labeling organic AI-generated content.
Meri Baghdasaryan: And in addition to labeling, the Board believes that Meta should give researchers access to data to verify this process and also assess the impact of the platform's automated decisions and content moderation. And, thanks to the Board's efforts, Meta's Content Library, their third-party researcher tool, is open to third-party researchers around the world at the moment.
Meri Baghdasaryan: In other cases, the Board is currently also looking at a case bundle that involves AI-generated nude images of male and female public figures from India and the US. But I will stop here and can discuss more as the discussion goes. Thank you so much.
Host, EuroDIG 2024, WS 1: Thank you so much, Meri. And now, Emily: we have witnessed deepfakes influence and interfere with political discourse and voter perception in recent years. Given the rapid advancements in deepfake technology and its increasing accessibility, what are the most pressing challenges we face in regulating and mitigating the misuse of this technology in the political arena?
Host, EuroDIG 2024, WS 1: Thank you for the question, Pilar. Can everyone hear my voice? I'm trying to speak loudly. Perfect. I'm Emily Hachetan, coming from the Council of Europe Advisory Council on Youth, and I would like to explain to you first what the structure is and how we function, in order to understand how young people can help with the regulations and with the recommendations to
Host, EuroDIG 2024, WS 1: deal with deepfakes and other AI-related issues that we face. So the Advisory Council on Youth works next to the Committee of Ministers; we sit on a joint council and discuss political issues that violate human rights, democracy and the rule of law. And I'm representing the portfolio on digitalization, young
Host, EuroDIG 2024, WS 1: people and artificial intelligence. And the issue of deepfakes is a concerning one, because it is important to implement regulations targeting deepfakes and synthetic media. And I really like that Meta has started implementing labeling, because that is something that I actually wanted to propose.
Host, EuroDIG 2024, WS 1: It is an idea that China has been implementing since 2019; according to their Cyberspace Administration of China, they took a step to issue this regulation and made it mandatory for all social media platforms. So I believe this could be crucial for the European Union as well: we could implement and adapt this regulation to our needs
Host, EuroDIG 2024, WS 1: and regulate all social media, not only Meta, in order to label all the content, because labeling will help people be more sure whether it's correct or not. And I'm also a data science student, so, coming from a technical background, this is not hard to implement: it is an AI model that can be created through training and testing. And as more deepfake data is being
Host, EuroDIG 2024, WS 1: created right now, these AI models can be trained and used very easily, so it can be done in less than a month. And how young people can actually get involved in this would be having more focus groups and having more young people actually interacting with deepfakes. As we are the generation using these technologies the most, I've noticed it's easier for
Host, EuroDIG 2024, WS 1: us to understand which one is a deepfake and which one is not. And maybe if there was a website, a tool, where young people could choose which one is real, authentic content and which one is AI-generated content, this could make it even easier for Meta and other technical companies to develop these AI tools.
Host, EuroDIG 2024, WS 1: And, due to the regulation, the companies that do not have these detector tools should be fined in a way, because this is exactly like copyright issues, exactly like violent content, and it should be treated the same way, and a solution should be implemented for it. This is also down to algorithms, and if algorithms can be modified to actually take down
Host, EuroDIG 2024, WS 1: this deepfake content, then many people will be less manipulated and could enter social media with more of a safety net, because we always enter social media never knowing what to trust or not to trust, which is a big issue in our society now.
Host, EuroDIG 2024, WS 1: And yes, why would young people be a good help in engaging with these fast-moving technologies? Because it is our future; we are currently building our future together. And with EuroDIG it's great that this co-management system is being developed, and it should be done not only at the international level, but also
Host, EuroDIG 2024, WS 1: at national levels in different countries, because young people can help policymakers think through these regulations even better. And we will present our YOUthDIG messages in these next couple of days; I hope you are as excited for it as we are, because there are many solutions we came up with. And yes, as I am mentioning that young people can help with this regulation,
Host, EuroDIG 2024, WS 1: I also find it very important to implement an AI curriculum in the middle school educational system, because we have the informatics class, but what does it entail? It has Excel classes, Microsoft Office classes. But now, as AI is becoming a tool, it is especially important to teach young people how to use these tools
Host, EuroDIG 2024, WS 1: properly, in order to improve our work and become more productive, instead of just losing our critical thinking to these AI technologies. So I would urge everyone considering this to implement AI literacy in educational courses and curricula in many countries, because afterwards these young people are going to spread the message to the older generation. Young people
Host, EuroDIG 2024, WS 1: can raise awareness and also teach older people how to use it and how to understand it better, because AI is viewed as a bad thing in our world, as something that, like robots in the future, is going to take us over; but in good hands it can be used for social impact, and with strong background knowledge
Host, EuroDIG 2024, WS 1: of it, this can be done very well and eliminate manipulation on social media and other platforms. Thank you.
Host, EuroDIG 2024, WS 1: Thank you so much. And
Host, EuroDIG 2024, WS 1: thank you, YOUthDIGgers, for supporting her. And yeah, as you mentioned, you introduced the YOUthDIG messages, and I think that's great, and I encourage everyone here to come and listen to them, because they're designed by the young people to give you, the senior stakeholders, the chance to listen to their messages.
Host, EuroDIG 2024, WS 1: And I'm so glad that Meri said Meta is implementing the labeling. So, Everton, in light of the growing prevalence of political deepfakes, what measures can be taken to rebuild and maintain trust in democratic processes? How many hours do we have to answer that question? I would love to
Host, EuroDIG 2024, WS 1: make a very comprehensive answer to that, but because it it's a very complex scenario. But well, I would like, anyway, yes, to thank you for the invitation, and, thanks to all of you here, and remotely following this debate, I'm from Nick Br here. I'll be speaking on behalf of the Brazilian Igf
Host, EuroDIG 2024, WS 1: the impact of Deepfake largely depends on the informational environment. It finds so that environment is composed by many pieces. We all know that in a very healthy environment, with diverse sources, robust robust journalism and careful fact, checking deep fakes can be quickly be proven. Fake. So that's the easy part.
Host, EuroDIG 2024, WS 1: whether they come in video, audio or both, because we have both in development, so rebuilding. Trust, in my opinion, starts with fostering of robust information. Environment. As my background is in communications, I believe, I firmly believe that the best way to combat misinformation begins with trustworth information, and journalism alone
Host, EuroDIG 2024, WS 1: won't solve every problem. But collaborating with media professionals is essential to provide a wider audience with reliable information. And this information exists on the Internet, despite so many challenges ranging from technology to finances or to financing the journalism everywhere. So
Host, EuroDIG 2024, WS 1: this depends on on an information environment which is solid and healthy, that that has good journalists working, being well paid with republican criteria and seeking impartiality.
Host, EuroDIG 2024, WS 1: Although the truth is mostly an intention rather than an end in itself,
Host, EuroDIG 2024, WS 1: deepfakes will prevail when reliable information is living in the shadows of the Internet. So it's not about being by default against the media outlets that are more biased or that have their political positions, and so on, but about considering more strongly those who seek this sort of consensual truth, so to say, that are closer to this relationship with reality itself.
Host, EuroDIG 2024, WS 1: All of them have their places, both those more biased and those who are more, so to say, neutral. They have their places in our societies, and they deserve to live independently from the major digital platforms, which is a reality
Host, EuroDIG 2024, WS 1: ever more distant in the present times. And, regarding this discussion, letting the spread of misinformation go on is a dead end for digital platforms, for their users, who may be personally affected by deepfakes, for trust in the Internet, and for societies as a whole. So this is not a small topic that we are talking about here.
Host, EuroDIG 2024, WS 1: to compose the discussion in this issue. Last year CGI. Br. The Brazilian Mood stakeholder Steering Committee. We have here a board member from CGI with us.
Host, EuroDIG 2024, WS 1: cgi.br. Published a study suggesting the idea. One of the ideas is to suggest a public fund financed by major Internet platforms as one of the possible ways. We have a lot of ways ahead, but one of them would be the idea of this public fund to improve the Brazilian informational environment. And while this is not the only suggestion on the table today, cgi.pr is sure
Host, EuroDIG 2024, WS 1: that this debate should be accepted as legitimate in the sense that regulations deserves to be at least discussed by policymakers from all branches, the lawmakers, the executive branch, and the judiciary as well.
Host, EuroDIG 2024, WS 1: So let's not forget the role that each stakeholder has into feeding and organizing those discussions, including the very well articulated youth groups. And because, after all, do deep fakes affect young people in a political environment? Yes, they do.
Host, EuroDIG 2024, WS 1: Are deep fakes, becoming more C simpler as time go by, goes by? Or are they becoming more complex, more complex, right? So these challenges gives us an opportunity to organize and promote engagement, to tackle those issues.
Host, EuroDIG 2024, WS 1: One another
Host, EuroDIG 2024, WS 1: important topic that I would like to highlight is to track those who finance. The creation and spreading of deep fakes. This requires agile coordination and implementation of efforts with law enforcement, agencies, and other authorities. And additionally, it's very important to recognize that the origin and development of deep fakes depend on the development of artificial intelligence mechanism
Host, EuroDIG 2024, WS 1: whose bodies require permanent improvements, and otherwise we will be only discussing and debating the effects of deep fakes, instead of looking to the origins of them of deep fakes themselves
Host, EuroDIG 2024, WS 1: coming to the end of my intervention. Our social connections, they allow us to achieve great results from consensus based stakeholder debates which consider the scenarios and challenges in different regions.
Host, EuroDIG 2024, WS 1: So in that sense, it's necessary that each region convene those discussions to address their challenges, their challenges regarding deep fakes. For example, using the Nri structure for that. But not only the Nri structure.
Host, EuroDIG 2024, WS 1: When governments, tech companies and, for example, citizens work together, we can create an environment
Host, EuroDIG 2024, WS 1: that tackles and healthy content which compromises our informational environment. And these collective constructions include providing social and technological skills that combat deep fakes inside and out of
Host, EuroDIG 2024, WS 1: social media channels. Those tools should be carefully, should consider carefully the different languages, for example, that are spoken in a major territory that are spoken in a territory. Sometimes we can only easily recognize using our tools.
Host, EuroDIG 2024, WS 1: deep fakes, which are created in one language, but not in another one. When it comes to audio deep fakes.
Host, EuroDIG 2024, WS 1: If a targeted group is constant and is constantly being harassed, for example, and many other important issues, so the greater the size of a player in this environment, the greater the responsibility it has. So in our roles as citizens and conveners of debates is not small as well.
Host, EuroDIG 2024, WS 1: So to sum up, protecting trust in our democracy when there are lots of deep fakes means using many different ways. So there's space for everyone. I could stay here talking for hours. But that's it for the moment. Thank you very much, Gila.
Host, EuroDIG 2024, WS 1: Thank you so much because it's true that
Host, EuroDIG 2024, WS 1: Verizon deep fakes like Mary Mary mentioned audio the the face. Video deepfakes picture Deepfakes is just for us. It's very. It's becoming very hard to detect them. So thumb Thomas. How can we empower youth to come back the rise of this information fuel by deep fake technology.
Host, EuroDIG 2024, WS 1: Thank you for that question. So yeah, my name is Thomas. What I'm I'm from Norway, and I have my background in organization called the UN Student Association.
Host, EuroDIG 2024, WS 1: I think that question is, it's a very good question. I think?
Host, EuroDIG 2024, WS 1: it's also followed up with what? What's kind of
Host, EuroDIG 2024, WS 1: what initiatives have shown promise in in this area. And I think you know, empowering youth is something that has shown promise when it comes to these areas? These these questions are a little bit inter related. But I also think the 1st question we have to then ask is, who is going to empower youth, you know, is it? Are we talking about teachers, decision makers, parents, other youth. And of course the the answer is, at least in my opinion, that it's not just one group
Host, EuroDIG 2024, WS 1: that we're talking about to to empower you if it has to come from a united force, if you will. And why is this important? Because I think more and more we talk about.
Host, EuroDIG 2024, WS 1: or, for example, we've talked a little bit already today about, you know, we're being called digital natives, the youth
Host, EuroDIG 2024, WS 1: and and I'm not sure entirely whether or not that is a very useful, or at least helpful term to to use, because, if anything, it just creates a little bit of a further division between this generation and the previous generation, while in reality we're probably better off trying to stay on the on the same page, and and rather our our great way to empower youth is to be
Host, EuroDIG 2024, WS 1: is for these other groups to be a safe and trusted place for them to rely on or to rely on to come with their sources. So what do I mean with this?
Host, EuroDIG 2024, WS 1: For example, picture, you have a scenario of a a teacher sorry, a student, or a child
Host, EuroDIG 2024, WS 1: that comes to you and shows to you something on the phone that they saw on. Yeah, let's say, Tiktok or Instagram and the teacher or parent, they just kind of dismiss it and say, you know, you shouldn't be on Instagram and Tiktok and get your information from these sources. You should, you know. Read the real news.
Host, EuroDIG 2024, WS 1: Now, I think my point is that that is not a very useful way of answering and why is that? Well, the reality is that these platforms they're here to stay, and youth are on them, and they will form their attitudes, and they will get their news from these sources, whether or not you like it
Host, EuroDIG 2024, WS 1: so rather than than dismissing it, I think it's important that the the recipient group has the necessary tools to feel confident, answering in an empowering way, and that this it can be a way of, you know, bridging this generational gap.
Host, EuroDIG 2024, WS 1: So I know, for example. If we're talking about some initiatives that have shown promise in my home country of Norway, they have been developing something called the conversation cards which essentially can be between a parent and a child that you have around the kitchen table, or it can be in a classroom with a teacher. Talking to their students about where they get their information and
Host, EuroDIG 2024, WS 1: what they can do if they're uncertain of what what the intention say behind this information could be.
Host, EuroDIG 2024, WS 1: And essentially this provides these educators, like parents or teachers, with the right questions to to in a constructive way, lead the conversation with youth. Youth. And I mean, that's 1 aspect of it. But then I also want to stress that youth also has a responsibility to other youth when it comes to this.
Host, EuroDIG 2024, WS 1: So that means that's, you know, that's us checking our own attitudes, and how we treat, you know, if your friend says something or sends something in a group chat that you're not entirely certain about, you know. Bother enough to take that conversation and ask them, okay, where did you get this from? Is this just something you saw on, you know. Could this be, you know, presented in a in a way that has a certain
Host, EuroDIG 2024, WS 1: narrative behind it? And actually asking these questions is, what's going to be, I think, at least a really important tool of combating misinformation. And and this information, especially with with like deep fakes and something that can seem very convincing.
Host, EuroDIG 2024, WS 1: So essentially to sum up, all generations have to be a part of the solution. And it's so important that we know how to have the conversation like we know how to do the proper like, ask the right questions and then follow up on them and have the confidence to do so.
Host, EuroDIG 2024, WS 1: Because even though we are, you know, we're talking about cutting edge technology, that is in rapid development. I think. I think that doesn't necessarily mean we need to have a cutting edge answer. Sometimes we can also use non tech answers to, you know. Talk talk about. You know things we don't even know entirely how it came about. Like defakes. Most people probably won't know the technical stuff behind it, but you can still talk about it in a non technical manner.
Host, EuroDIG 2024, WS 1: So the answer doesn't necessarily have to be something drastically new. And I think my my key point is that it's important to keep a little bit of the the human touch in all of this. And you know, just know how to talk to each other about these kind of topics
Host, EuroDIG 2024, WS 1: like with a more discussion process. And I'm going to refer to Mary. Because, Thomas, you referred to social media platforms in your in your intervention. And Mary like Thomas mentioned, how are you implementing the tools to make the people feel confident in your platform.
Meri Baghdasaryan: Thank you so much, and thanks to my fellow speakers for very interesting interventions. So, to start, I'll clarify that the Board is not
Meri Baghdasaryan: Meta, so I'm not speaking on behalf of Meta here. Overall, as I was saying, based on the Board's case law and overall position, I think there are several things that should
Meri Baghdasaryan: work together to achieve user trust, which is the ultimate goal at this
Meri Baghdasaryan: time of our development. And I think we need to look at this
Meri Baghdasaryan: depending on which situations we are taking into account. So I think this session mostly tried to focus,
Meri Baghdasaryan: more generally, on deepfakes and AI-generated, manipulated content, but also, more specifically, on the context of elections, where, in my opinion, this type of manipulated content poses more specific risks.
Meri Baghdasaryan: And I think there should be a combination of several factors. One is, as mentioned, some technological solutions. As Emily was saying, AI is not
Meri Baghdasaryan: an inherently bad thing; there are a lot of benefits to using AI technology. But using things such as labeling can be very helpful for users to understand what is human-created content, so to say, and what is AI-generated, and you can already see that implemented on Meta's platforms.
Meri Baghdasaryan: The second thing, in my opinion, is, as has been mentioned several times, the need to continue with more
Meri Baghdasaryan: user empowerment and overall digital literacy development. This is also something that the Board has been pushing for in its decisions, starting from providing more notification to users when a post they made was in violation of the community standards. In the first year of the Board's existence, it already had cases where the Board called on Meta to provide more reasoning for its
Meri Baghdasaryan: content takedown decisions, and to provide users with the opportunity to explain what they were
Meri Baghdasaryan: trying to say in their post, which, in cases of, for instance, satire or humor, is quite difficult for moderators, or for AI content moderation systems, to understand. So I think, overall, beyond platforms, we need to invest more in digital literacy. As far back as I remember myself in this space,
Meri Baghdasaryan: this has been a theme, and I think technologies come and go and new ones emerge, but the need for digital literacy stays the same, and the skills, in my opinion, that are useful for everyone regardless of age and profession are actually quite transferable to new types of technologies. And the third factor, in my opinion, is cooperation across different stakeholders.
Meri Baghdasaryan: A lot of groups are thinking about similar issues, not only from technological perspective, but generally understanding the impacts and or different technologies around the world. The impacts are also different
Meri Baghdasaryan: on different in different regions, and I believe a corporation and multi stakeholder corporation is the 3rd factor that is very much needed, especially now with the much faster development of technologies. So I don't have one solution to answer your question. But I think several things need to work
Meri Baghdasaryan: together to to make this happen. And definitely, platforms like this, and especially having youth voices, it's very crucial in this process. Thank you.
Host, EuroDIG 2024, WS 1: So, Emily, just by reference, I'm going to... Yes, I would actually like to ask a question to Meri, because I'm very curious whether it is a possibility to implement into the algorithm prioritizing educational content on AI literacy. You mentioned the point that digital literacy, AI literacy, is very important, and I was curious whether Meta has the possibility and the opportunity
Host, EuroDIG 2024, WS 1: to give this small instruction to the algorithms in order to prioritize this kind of content, because if it is shown to more people, then we can actually raise awareness. And young people, from the other side, as we are going to educate them, we can give them an opportunity to create this kind of content. And if this content is created and Meta is pushing and promoting it, it would be a great cooperation between young people and the private sector.
Host, EuroDIG 2024, WS 1: So, Meri, would you like to take the question?
Meri Baghdasaryan: Sure, that's a great question, Emily. Thank you for that. Again, not speaking on behalf of Meta, but
Meri Baghdasaryan: based on what we know,
Meri Baghdasaryan: I think, given the sheer amount of content on the platform, it may not be that easy to understand which content is the educational content to recommend more or less.
Meri Baghdasaryan: I think what may be helpful, in addition to labeling, is to potentially provide more prompts to users. I think, for instance, that Twitter was doing a lot around misinformation when people were sharing posts without reading them, so there was a prompt when you are trying to forward the message. I think Facebook was doing the same, and Meta was doing the same on WhatsApp
Meri Baghdasaryan: when, during Covid, people were just forwarding messages to bigger groups and spreading information without verifying.
Meri Baghdasaryan: And they basically restricted this functionality to a certain number of groups, and then the prompts were in development. So I think this kind of prompt, which will also include
Meri Baghdasaryan: more of this
Meri Baghdasaryan: "take another look, think about this a bit more", stop and think and then act, kind of messaging will be helpful, and the Board is definitely pushing for this, not only for AI-generated content but more generally, because this is an issue regardless of
Meri Baghdasaryan: whether the content is manipulated or it's just a simpler disinformation piece. So if you have other ideas, the Board usually, in all cases, has a call for public comments, and we would love to hear more ideas and your thoughts or expertise on specific things that you think may make sense. And then the Board is happy
Meri Baghdasaryan: to take it into consideration, because this is also a direction where we are trying to make an impact.
Host, EuroDIG 2024, WS 1: Amazing. Thank you so much, Meri. And Everton, I would like to refer to you, because we label things and we try to improve trust, but how can we collectively enhance the public's ability to distinguish between fact and fake?
Host, EuroDIG 2024, WS 1: Come on
Host, EuroDIG 2024, WS 1: The final answer, I think, is still to be discussed. But I think that a good start is promoting more debates, collective debates, and sharing ideas on how to build the future that we want ahead. So in the case of the Brazilian IGF,
Host, EuroDIG 2024, WS 1: the Brazilian Internet Steering Committee organizes the events, and then we have this call for proposals. Many of those proposals debate deepfakes, disinformation and other aspects. And this is one good way forward for sure. So I'm quite sure that by collectively
Host, EuroDIG 2024, WS 1: discussing and promoting those debates, not only
Host, EuroDIG 2024, WS 1: from our perspectives, but also allowing other voices, this is a good way forward.
Host, EuroDIG 2024, WS 1: So Pouse, would you like to add?
Host, EuroDIG 2024, WS 1: Well, I mean, I agree with everything that was just said. I think
Host, EuroDIG 2024, WS 1: something that I also wanted to bring attention to, which hasn't been discussed yet, and I can ask whether my fellow panelists agree with me or not, is that there is also, you know, a certain responsibility on how we talk about these types of things. Disinformation is different, especially with deepfakes, which can seem very, very scary, as if the world is kind of
Host, EuroDIG 2024, WS 1: coming to an end because we can't trust anything online anymore. And then,
Host, EuroDIG 2024, WS 1: because we've also talked quite a bit about this back home as well, if the people that use, let's say, social media are
Host, EuroDIG 2024, WS 1: very scared and become quite frustrated, or essentially, if you're scared of something, you become less critical, or your ability to think critically also diminishes, right? So there's also a responsibility, well, this may be more for journalists, but also for social media, for how they talk about these types of issues and
Host, EuroDIG 2024, WS 1: how they might expect the audience to perceive it.
Host, EuroDIG 2024, WS 1: And also, of course,
Host, EuroDIG 2024, WS 1: all the young people here may not know, but there was a time in which there was no Internet
Host, EuroDIG 2024, WS 1: to interact like this. But jokes aside,
Host, EuroDIG 2024, WS 1: real conversations in real spaces need to take place.
Host, EuroDIG 2024, WS 1: Sometimes we tend to think
Host, EuroDIG 2024, WS 1: of only convening meetings and discussions online, and to think of the effects of deepfakes and fake news or misinformation in
Host, EuroDIG 2024, WS 1: a very restricted space of social media, but they also affect people through gossip and other
Host, EuroDIG 2024, WS 1: analog situations. So of course deepfakes are not necessarily analog, they are digital artifacts, but still, these conversations need to take place. As you were mentioning, one very interesting topic
Host, EuroDIG 2024, WS 1: is to talk to your peers and talk to your friends, because sometimes they are sources of the transmission of the fake content. But not only that, sometimes they
Host, EuroDIG 2024, WS 1: may, by accident, be spreading that disinformation, or even generating this disinformation. And
Host, EuroDIG 2024, WS 1: talking to each other, knowing the opinions of your friends, staying in contact with them in the real world is one of the ways that we have on hand as well to identify whether that content is
Host, EuroDIG 2024, WS 1: true or a lie. Yeah, Emily. So yes, I'd like to add, I agree with both of my fellow panelists.
Host, EuroDIG 2024, WS 1: And the thing is that the digital world is becoming bigger and bigger, but it can never take the place of human interaction, because at the end of the day all of us have been living life since childhood, when AI was not there, when social media was not there. It will never fully replace it, because, for example, ChatGPT can never replace my teacher, because this human interaction,
Host, EuroDIG 2024, WS 1: emotions and all this attachment are more important, and all the experiences that we lived through are more important than any tool can be. I can use GPT as, I don't know, a calculator, in order to do my coding or to help me understand my course content, but it will never replace my teacher or my friends. It can never do that.
Host, EuroDIG 2024, WS 1: But it's important to embrace these tools and learn how to productively use them, because you can use them and lose your critical thinking, but you can also use them and become 100 times more productive and finish your work in
Host, EuroDIG 2024, WS 1: 10 minutes instead of, I don't know, 3 hours, which is great. We are evolving and embracing innovation. So instead of just being scared, we should embrace it, and as well try not to have this gap between the generations, because that's also very important.
Host, EuroDIG 2024, WS 1: Yeah, I think that was put very well. And like you're saying, yes, we're probably going to have to learn a few new things, and probably how to use it in the most optimal manner. But then, maybe on the other side of that, we also have to make sure that we
Host, EuroDIG 2024, WS 1: don't get so far ahead of ourselves that we kind of forget what it was like before. We don't want to become too reliant on these types of tools, so that we forget what it's like to have, you know, a real conversation, or what it's like to be in a real classroom. So maybe we then have to shift our focus a little bit to, you know, having a few more discussions about how to do things the good old-fashioned way,
Host, EuroDIG 2024, WS 1: and also appreciate that not everything, just because it's new, is necessarily that much better.
Host, EuroDIG 2024, WS 1: So I'm going to leave the floor to someone in the room now.
Host, EuroDIG 2024, WS 1: The real reason is, the microphone is here. So thank you very much. I've been listening to everyone talk about different subjects. There are a few questions here. So when you see a video on the Internet, you immediately believe it, because most people will actually believe the video immediately and assume it is true.
Host, EuroDIG 2024, WS 1: Actually, there were some sessions talking about fake news, about detecting all of these kinds of problems. The issue is that people will believe it because they have no reason not to; by default, humans have a tendency to just believe things. What happens is that if you consider social networks like Facebook, Twitter, or whatever it's called these days,
Host, EuroDIG 2024, WS 1: and anything else, people will just believe things because they are just there. Facebook, a few years ago, created something called the fact checkers, or something similar to that name, and this is actually a very good idea. So you have someone, whom you should trust, that actually validates this. I can tell you that in my country, the two that were chosen
Host, EuroDIG 2024, WS 1: are the media. That's the biggest source of fake news that we have, so clearly that didn't work, because most of the answers were wrong. So my suggestion here is that we might need to do fact-checking on AI videos and make sure whether they are created by AI or not. But the problem starts much before that. So the reality is, what we need is critical thinking.
Host, EuroDIG 2024, WS 1: We need to make sure that people are aware of things. We need to make sure that kids from a very early age, when they are 2 or 3 years old, are going to ask the questions that my niece kept asking. When I talk to her and say something like this, she says: oh, that seems very strange, let me ask someone, let me ask my parents to see if this is true or not. And this is the critical thinking, because that is actually important. So think about this: if you have fake news, if you have
Host, EuroDIG 2024, WS 1: these kinds of things,
Host, EuroDIG 2024, WS 1: have people double-check them. Okay. By the way, the best conspiracy theory is not that the Earth is flat, it's that the sun is flat. Try to prove that one; that one is much better.
Host, EuroDIG 2024, WS 1: Now we're going to move to the closing remarks, and I'm interested in being very brief. So I'm going to ask each speaker to give one tweet, or X post as it's called now, like a headline that you want the people in the room to leave with. And we're going to begin with Meri.
Meri Baghdasaryan: Yes, thank you so much. This has been very interesting; I learned a few things from the audience and from fellow speakers. Thank you so much for having me. I think my last words would be just reinforcing what we were saying, that there is a need for more cooperation and a need to exchange the thoughts and ideas that we have,
Meri Baghdasaryan: and to invest more not only in technology but also in social skills, human connection, and most importantly, digital literacy, which is
Meri Baghdasaryan: a theme that is here to stay. And plus one to Andres's
Meri Baghdasaryan: points, definitely. I don't want to say the default should be not trusting, but definitely "trust, but verify" should be the model for everyone. And this goes for both online and offline information, because fake news is not that new; the Internet didn't invent fake news, it has been there for a long time. It's just the spread and reach that have changed. And this
Meri Baghdasaryan: will be,
Meri Baghdasaryan: and already is, a big issue with AI-generated, manipulated content. So I really look forward to reading the youth messages, and definitely, if you have any ideas, please feel free to reach out. We'll be happy to discuss this further and also cooperate later. Thank you so much.
Host, EuroDIG 2024, WS 1: Thank you, Meri. Emily? Yes, I would like again to highlight and support the point that Everton brought up, that we need to have a body that will include media, policy makers, tech companies, young people. With this diverse body we can have an open conversation all the time, because, for example, we came to EuroDIG, and I myself am from Armenia, and
Host, EuroDIG 2024, WS 1: there are also some Armenians here from the IGF in Armenia, and now we are thinking of some ways we can collaborate. And these kinds of conferences and events help people to meet each other, connect, and actually come up with something good, because we can build our future together, no matter which generation it is. And, I don't know, it's just important to
Host, EuroDIG 2024, WS 1: continue having real live talks and continue trying to decrease this gap between our generations, and to include people and multistakeholders from different places in order to come up with the best solutions and fight the issues that we're facing, because they are becoming serious and we need to be able to face them
Host, EuroDIG 2024, WS 1: together. And yes, thank you. Thank you, Emily. So, Everton. Oh, great segue! Just to add to
Host, EuroDIG 2024, WS 1: to. The last point is that
Host, EuroDIG 2024, WS 1: don't let others build
Host, EuroDIG 2024, WS 1: your truth the the truth of your group. So if you have an issue to tackle, organize yourself, it's not funding is hard for finding. Funding is hard for basically everyone. But don't let others tell the truth, for of your group. So if you're feeling
Host, EuroDIG 2024, WS 1: targeted by by this kind of info, that information, or or with this information, organize yourself. Organize your group make yourself heard. And as I said before, so truth starts with fostering a robust information environment, and that
Host, EuroDIG 2024, WS 1: requires you all of you there building the the reality that we want in the future. Thank you. Thank you. Maybe to sum it up, I think. When it comes to the realm of misinformation and disinformation. Real empowerment comes from being able to take those difficult discussions and not being afraid to being critical, even if it's to your friends or family, or someone, if you might be in like a higher position than you.
Host, EuroDIG 2024, WS 1: Yeah, let me just handle that. Thank you. I want to finish by thanking the speakers, thanking the youth track, all the volunteers, the rapporteurs, and everyone involved in creating this session. And of course you, the public, who have been wonderful. And with this I think we can wrap it up.
Host, EuroDIG 2024, WS 1: So thank everyone.