YOUthDIG 2025
9 – 11 May 2025, YOUthDIG | 12 – 14 May 2025, EuroDIG
YOUthDIG Messages 2025
Offline solutions to digital problems
Digital Literacy
- Problem: Children are becoming increasingly susceptible to the dangers of the digital world, including harmful content, manipulative narratives and echo chambers. We believe that digital literacy is essential in equipping young people with the skills needed to critically engage with online information, protect themselves from these threats and gain positive digital experiences. Many people do not realise how algorithmic manipulation, particularly on social media, can create breeding grounds for polarisation.
- Solution: Education in digital literacy should provide individuals with vital skills for their digital future. This begins with comprehensive education in schools. Curricula should empower students to identify disinformation, understand how digital platforms shape perceptions, and develop resilience against manipulation, with teachers being provided targeted training and resources to guide these discussions. This is where the private sector can contribute meaningfully, through the development and support of training initiatives and community resources.
Other Stakeholders and Considerations
- Parents also play a crucial role in fostering informed and safe digital practices at home. As the primary influence in a child’s early development, they must not only monitor screen time but also foster open dialogue about online experiences and encourage offline socialization.
- Public information and awareness campaigns can further promote vigilance across all age groups. Above all, digital literacy must remain politically neutral and rebuild public trust. At the moment, widespread skepticism, fueled by perceptions that scientists are biased or that politicians act out of self-interest, undermines the credibility of accurate information; this lack of trust impedes the uptake of digital literacy education.
- As online spaces for youth become increasingly controlled and restricted, offline solutions offer a crucial alternative by creating physical environments where young people can freely interact, organize, and express themselves. These offline spaces provide safe havens for youth to develop the skills and confidence needed to navigate digital spaces responsibly and advocate for their rights online. By fostering a strong offline community, we empower youth to reclaim agency in both the digital and physical worlds, ensuring their voices remain heard.
Content moderation from two perspectives
Algorithms and content moderation
- Problem: One of the key factors that makes young people so addicted to digital spaces and online communities is the strength and specificity of recommendation algorithms, which can create polarisation.
- Solution: To address this issue, users must actively engage with algorithms that were once hidden and secretive but have now become central to how we experience content. Instead of passively scrolling, users should be empowered to give feedback on what they want to see more or less of, taking control over the content they consume. Platforms should not only explain why specific content is being shown but also offer alternatives, such as different accounts, channels, or media, so users can actively seek out diverse perspectives and reshape their digital experience.
Free speech vs. content moderation
- Countries have a right to enforce their laws to protect their national sovereignty. Private companies controlling content online can undermine state authority and democratic oversight; at the same time, giving governments unchecked power over the internet risks censorship, which can lead to authoritarian control. The ultimate power over digital spaces should rest with the people, through transparent, democratic governance that protects both freedom and accountability.
- We should have ways to enforce content moderation that align with the right to free speech. To ensure our digital ecosystem does not amplify political bias through content moderation practices such as flagging and shadowbanning, individuals should be able to access all sides of an issue and all relevant information. Individuals should have personal control over the information they consume online.
- Emphasize the importance of promoting user control tools on social media and platforms, such as allowing users to manage their own feeds and engage with democratic content moderation tools like flagging and community notes. This can lead to positive political involvement and enhanced awareness, ensuring users take an active role in shaping their online experience. This requires raising awareness of the tools already in place as well as expanding their application.
Everything AI
Increased Energy Demand and AI
- Increase collaboration for energy- and cost-efficient computing: As the demand for computational power grows, it's essential to develop greener, more energy-efficient GPU systems. Governments, companies, youth, academia, tech communities, and civil society must work together to advance sustainable infrastructure for future computing needs.
- Make data centers smarter and more sustainable: Promote the creation of smart cities that can utilize the waste heat produced by data centers, and move data centers toward operating with environmental sustainability as a key standard.
- Ensure transparent energy usage in AI: Introduce accessible and comprehensible information about energy consumption and associated costs when using AI tools, both for end-users and developers. This transparency will foster more conscious digital behavior.
Making AI Practical, Understandable, and Accountable
- Indicate when AI is used: Digital services and platforms should inform users when an AI is operating, using visible markers or labels.
- Develop ethical guidelines on AI application: Establish clear ethical frameworks regarding how and where AI tools may be used, shaped with input from young people who will live with the consequences of these technologies.
- Educate for AI literacy and critical thinking: Embed AI education in school curricula and lifelong learning initiatives. Training programs should emphasize how AI functions, its limitations, potential errors, and the continuing responsibility of humans to critically assess and oversee AI systems.
- Assess risks before deployment: Ensure that AI systems undergo impact and risk assessments before implementation. Define accountability measures and clarify legal liability in the case of AI-related errors.
- Automate responsibly: Digitalize and automate more administrative tasks through governed AI tools to increase efficiency, while maintaining transparency and oversight.
Innovation and Inclusive AI Development
- Create frameworks for ethical AI innovation: Establish dedicated legal and policy frameworks that facilitate innovation in AI while minimizing economic burdens. These frameworks should ensure inclusivity and sustainability in access to AI development tools.
- Support the next generation of AI pioneers: Provide more funding, mentorship, and sandbox environments for students, researchers, and entrepreneurs to explore and experiment with AI technologies responsibly.
- Strengthen the startup ecosystem: Increase financial support and visibility for startups innovating in the AI field, especially those with social impact and sustainability goals.
- Empower youth to participate in shaping the 'AI Continent': a digitally sovereign future where young people are actively engaged in developing, governing, and ensuring responsible access to AI tools across all sectors. Their voices and leadership must drive ethical, environmental, and economic integration, making youth central to the vision for AI and innovation.
Human Rights and Data Security
- Software development companies, as key stakeholders in digital governance, must be held accountable for ensuring data protection. This responsibility involves deploying robust technical mechanisms, such as developing and maintaining secure IT infrastructure, and ensuring compliance with data privacy legislation. By doing so, they safeguard critical infrastructure and sensitive data, thereby fulfilling their duty to protect public trust and uphold security.
- Governments are responsible for regulating the use of automated decision-making systems and the handling of sensitive data, with a particular focus on ensuring transparency and protecting the rights of vulnerable groups—such as migrants, asylum seekers, and members of the LGBTQ+ community—who risk being disproportionately affected by the unbalanced power held by digital platforms.
- The use of surveillance technologies, including spyware, poses a serious threat to the right to privacy, personal security, and freedom of expression. Journalists, human rights defenders, and activists are particularly vulnerable to these intrusions, which can lead to self-censorship and suppression of dissent. Governments must be held accountable for the deployment of such tools, especially when used to silence opposition or restrict civic space. Without transparency and oversight, the misuse of surveillance erodes democratic institutions and fundamental rights. The internet should not be exploited as a tool of control by those in positions of power to manipulate public opinion or suppress dissent.
- Governments should allocate greater resources to judicial and executive authorities to strengthen the fight against cybercrime. These resources should include advanced technical tools, improved cybersecurity infrastructure, and ongoing training for judges, prosecutors, and law enforcement personnel. Legal consequences must be effectively enforced, especially in cases involving data breaches and unauthorized disclosure of sensitive information. EU-level support and cross-border cooperation are also essential to address the global nature of cyber threats effectively.
Digital ecosystems / Regulation
We call for the creation of a strategic innovation ecosystem that places inclusion and collaboration at the heart of European digital development.
1) Smart regulation
Smart regulation forms the foundation of our vision. We advocate for updated, simplified, and harmonized future regulations that support rather than hinder innovation. Technical experts must be systematically integrated into decision-making processes when creating regulations. Furthermore, emerging innovators should have access to free legal advisory services to navigate the complex regulatory landscape.
2) Strategic innovation ecosystem
The centerpiece of our proposal is a European Innovation Hub that serves as a catalyst for development and innovation in the digital field.
- European Innovation Hub: This hub would incorporate a secure sandbox for testing innovations, interconnected talent centers across Europe, and a shared pool of essential resources, including funding, knowledge databases, legal counsel, and technological solutions. Existing European innovation facilities (e.g. AI factories) would be involved in the European Innovation Hub. Universal digital accessibility must be guaranteed, while youth participation should be actively fostered through scholarships and study visits.
- Governance: We call for pan-European representation with flexible participation models that accommodate the different financial capabilities of participating countries. The Council of Europe should oversee the initiative to ensure adherence to human rights and rule of law principles. A rotating presidency system would ensure equitable leadership, while a multi-stakeholder governance body modeled on EuroDIG would guarantee diverse perspectives. To promote global inclusion, the physical infrastructure should be located in a developing country in Europe.
About YOUthDIG
The Youth Dialogue on Internet Governance (YOUthDIG) is a yearly pre-event to the European Dialogue on Internet Governance (EuroDIG) aimed at fostering active youth participation. Young people (ages 18-30) from the pan-European region with a strong interest in Internet governance, digital policy and cooperation work together to draft and advocate for the youth messages.
Every year the YOUthDIG Organising Team is made up of former YOUthDIG participants who design the programme for the upcoming edition. This is a chance to reflect on and improve the programme from previous editions, build further connections with YOUthDIG alumni and the EuroDIG community, and provide an opportunity for young people throughout Europe to participate in the Internet governance ecosystem, focusing on topics that are important to youth in the field.
Find out more at our website.