Data Sovereignty and Trusted Online Identity – COVID-19 Vaccination Data – WS 03 2021

From EuroDIG Wiki
Revision as of 17:41, 20 September 2022 by ConstanceW (talk | contribs)

29 June 2021 | 12:15-13:15 CEST | Studio Trieste | Video recording | Transcript
Consolidated programme 2021 overview / Day 1

Proposals: #10 #21 #92

You are invited to become a member of the session Org Team! By joining an Org Team, you agree to your name and affiliation being published on the respective wiki page of the session for transparency. Please subscribe to the mailing list to join the Org Team and answer the email that will be sent to you requesting your subscription confirmation.

Session teaser

Data Sovereignty and Trusted Online Identity – COVID-19 Vaccination Data

Online identities are the key to many digital services. Identification is essential for everything from accessing health or government services, to travelling, to participating in social media. But who should control those IDs, and how can we limit the personal data exchanged to the minimum needed for each service?

Using the concrete example of COVID-19 vaccination data, we will discuss possible approaches regarding who should be in control of the data – private companies, government, or citizens – in different scenarios.

Session description


Online identities are the key to many digital services. From accessing health or government services to managing a bank account, participating in social media, paying taxes, or buying goods, identification is essential for end-users and consumers. But who should control those IDs, and how can we limit the personal data exchanged to the minimum needed for each service? The recent discussions about vaccination passports have put this question at the center of the current debate. There are three approaches that we would like to discuss here:

Scenario 1: Private companies lead the effort. Private tech companies already provide us with secure electronic identification, including two-factor security and biometric verification. However, this raises many privacy and data-sovereignty concerns. For example, the Swiss people recently voted against an eID law that would have allowed private companies to control access to government services.
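The two-factor security mentioned above usually means a time-based one-time password (TOTP) as standardised in RFC 6238, the scheme behind most authenticator apps. The following Python sketch shows the core computation; it uses the RFC's published test secret and timestamp, and is a simplified illustration rather than a production implementation.

```python
import base64
import hashlib
import hmac
import struct

def totp(secret_b32: str, at: int, step: int = 30, digits: int = 6) -> str:
    """Time-based one-time password (RFC 6238) over HMAC-SHA1 (RFC 4226)."""
    key = base64.b32decode(secret_b32)
    counter = struct.pack(">Q", at // step)            # moving factor: index of the time window
    mac = hmac.new(key, counter, hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                            # dynamic truncation per RFC 4226
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % 10 ** digits
    return str(code).zfill(digits)

# RFC 6238 test secret ("12345678901234567890", base32-encoded) at T = 59 seconds
secret = base64.b32encode(b"12345678901234567890").decode()
print(totp(secret, at=59))             # "287082" (RFC 6238 test vector, 6 digits)
print(totp(secret, at=59, digits=8))   # "94287082" (same vector, 8 digits)
```

A real deployment would additionally accept codes from adjacent time windows to tolerate clock skew, and rate-limit verification attempts.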

Scenario 2: Government leads the way with a centralized public key infrastructure (e.g., EU eIDAS). The EU eIDAS regulation (as well as the Swiss ZertES law) long ago established electronic identification based on a centralized public key infrastructure (PKI), which has reached very high adoption rates in some countries (e.g., Estonia) and low adoption rates in others (e.g., Germany).
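The trust model behind such a centralized PKI can be pictured as a chain of certificates that a verifier walks back to a single national trust anchor. The Python toy below illustrates only that chain-of-trust logic: HMAC with shared keys stands in for the asymmetric signatures (RSA/ECDSA) a real eIDAS-style PKI would use, and all names and fields are invented for illustration.

```python
import hashlib
import hmac
import json

def sign(issuer_key: bytes, cert: dict) -> str:
    # Toy stand-in: a real PKI signs with the issuer's *private* key and anyone
    # can verify with the public key; here HMAC plays both roles.
    blob = json.dumps({k: v for k, v in cert.items() if k != "sig"}, sort_keys=True)
    return hmac.new(issuer_key, blob.encode(), hashlib.sha256).hexdigest()

def verify_chain(chain: list, trust_anchor: str, keys: dict) -> bool:
    """chain is ordered leaf-first; each certificate must carry a valid signature
    from its issuer, and each issuer must be the subject of the next certificate."""
    for i, cert in enumerate(chain):
        if not hmac.compare_digest(sign(keys[cert["issuer"]], cert), cert["sig"]):
            return False                       # bad signature somewhere in the chain
        if i + 1 < len(chain) and chain[i + 1]["subject"] != cert["issuer"]:
            return False                       # broken issuance link
    return chain[-1]["issuer"] == trust_anchor  # must terminate at the national root

# Illustrative chain: a citizen eID issued by a registration office,
# which is in turn vouched for by the national root CA.
keys = {"National Root CA": b"root-secret", "Registration Office": b"office-secret"}
office = {"subject": "Registration Office", "issuer": "National Root CA"}
office["sig"] = sign(keys["National Root CA"], office)
citizen = {"subject": "alice@example", "issuer": "Registration Office"}
citizen["sig"] = sign(keys["Registration Office"], citizen)

print(verify_chain([citizen, office], "National Root CA", keys))  # True
```

The centralization is visible in the last line: every verification ultimately depends on one state-operated root, which is exactly what gives this model both its high assurance and its single point of control.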

Scenario 3: Hand some control to citizens (e.g., the European Self-Sovereign Identity Framework [ESSIF]). The EU Commission has developed ESSIF, which hands some of the control back to citizens rather than leaving it with a centralized government service or with private tech companies.
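The citizen-controlled model rests on selective disclosure: the holder of a credential reveals only the attributes a verifier actually needs. The Python sketch below shows the idea with salted hash commitments. It is a generic illustration only, not the ESSIF protocol itself, which builds on W3C-style verifiable credentials with real signatures, standard formats, and revocation.

```python
import hashlib
import os

# Selective disclosure sketch: the issuer commits to a salted hash of each
# attribute; the holder later reveals only chosen (value, salt) pairs, and the
# verifier recomputes the commitments. All attribute names are illustrative.

def commit(value: str, salt: bytes) -> str:
    return hashlib.sha256(salt + value.encode()).hexdigest()

# Issuer: commit to every attribute. In a real system this commitment list
# would be signed by the issuer and stored in the citizen's wallet.
attrs = {"name": "Alice", "birth_year": "1990", "vaccinated": "yes"}
salts = {k: os.urandom(16) for k in attrs}
credential = {k: commit(v, salts[k]) for k, v in attrs.items()}

# Holder: disclose only vaccination status, keeping name and birth year private.
disclosure = {"vaccinated": (attrs["vaccinated"], salts["vaccinated"].hex())}

# Verifier: recompute the commitment for each disclosed attribute.
def verify(disclosure: dict, credential: dict) -> bool:
    return all(commit(value, bytes.fromhex(salt)) == credential[key]
               for key, (value, salt) in disclosure.items())

print(verify(disclosure, credential))  # True
```

The data-minimization property is the point: the verifier learns the vaccination status and nothing else, while the random salts prevent guessing the hidden attributes from their hashes.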

Discussants from each of the stakeholder groups will kick off a conversation that will involve everyone, and by the end we will hopefully all have a fuller understanding of the possibilities and limitations of various paths forward.



The session will have four discussants who will have 4-5 minutes each to speak, followed by a discussion of the topic among discussants and attendees. Discussants represent a variety of actors (European Institutions, Companies, Users) with crossover experience in many cases, in the hopes of creating a rich discussion that takes into account the different views and circumstances of each stakeholder.

Further reading

Ethically Aligned Design, First Edition is a comprehensive report that combines a conceptual framework addressing universal human values, data agency, and technical dependability with a set of principles to guide A/IS creators and users through a detailed set of recommendations.

The following chapter on Personal Data and Individual Agency would be of particular interest.



Focal Point

Focal Points take over responsibility for and lead the session organisation. They work in close cooperation with the respective Subject Matter Expert (SME) and the EuroDIG Secretariat, and are kindly requested to follow EuroDIG’s session principles.

  • Kristin Little, IEEE
  • Miguel Pérez Subías, Internet Users Association

Organising Team (Org Team)

List Org Team members here as they sign up.

Subject Matter Experts (SMEs)

  • Polina Malaja
  • Jörn Erbguth

The Org Team is a group of people shaping the session. Org Teams are open and every interested individual can become a member by subscribing to the mailing list.

  • Kristin Little, IEEE
  • Vittorio Bertola, Open-Xchange
  • Concettina Cassa, AgID
  • Amali De Silva-Mitchell, Dynamic Coalition on Data Driven Health Technologies / Futurist
  • Miguel Pérez Subías
  • Lucien Castex
  • Jutta Croll

Key Participants

Key Participants are experts willing to provide their knowledge during a session – not necessarily on stage. Key Participants should contribute to the session planning process and keep statements short and punchy during the session. They will be selected and assigned by the Org Team, ensuring a stakeholder-balanced dialogue that also considers gender and geographical balance. Please provide short CVs of the Key Participants involved in your session on the wiki, or link to another source.


Clara Neppel - IEEE (Confirmed)

Senior Director European Operations

Dr. Clara Neppel is responsible for the growth of IEEE’s operations and presence in Europe, focusing on the needs of industry, academia, and government. She serves as a point of contact for initiatives with regard to technology, engineering, and related public policy issues that help to implement IEEE’s continued global commitment to fostering technological innovation for the benefit of humanity. She contributes to technology policy issues of several international organizations, such as the OECD, the European Commission and Parliament, and the Council of Europe. Dr. Neppel holds a Ph.D. in Computer Science from the Technical University of Munich and a Master in Intellectual Property Law and Management from the University of Strasbourg.


Cecilia Alvarez - Facebook (Confirmed)

EMEA Privacy Policy Director

Cecilia Álvarez Rigaudias has been the EMEA Privacy Policy Director at Facebook since March 2019. From 2015 to 2019, she served as European Privacy Officer Lead of Pfizer, Vice-Chair of the EFPIA Data Protection Group, and Chairwoman of IPPC-Europe. For an interim period, she was also the Legal Lead of the Spanish Pfizer subsidiaries. She previously worked for 18 years at a reputed Spanish law firm, leading the data protection, IT, and e-commerce practice areas as well as the LATAM Data Protection Working Group.

Cecilia was the Chairwoman of APEP (Spanish Privacy Professional Association) until June and is currently in charge of its international affairs. She is also the Spanish member of CEDPO (Confederation of European Data Protection Organisations) and a member of the Leadership Council of The Sedona Conference (W-6).

She is a member of the Spanish Royal Academy of Jurisprudence and Legislation, in the section on the Law of Information and Knowledge Technologies, as well as an Arbitrator of the European Association of Arbitration (ITC section).

She formed part of the Volunteer Group of Privacy Experts of the OECD (Working Party on Information Security and Privacy; WPISP) in charge of the 2013 review of the OECD guidelines governing the protection of privacy and transborder data flows of personal data. She formerly participated in the Group of Experts selected by the Spanish DPA to prepare the Madrid Resolution on International Privacy Standards in 2009.

Cecilia has written numerous publications on data protection and regularly lectures on data protection, IT and e-commerce at different Master’s programmes and seminars.

Nishan Chelvachandran - Iron Lakes (Confirmed)

Founder and CEO, Iron Lakes;

Chair, Trustworthy Technical Implementations of Children’s Online/Offline Experiences Industry Connections Programme, IEEE Standards Association

Co-Chair, AI-Driven Innovations for Cities and People Industry Connections Programme, IEEE Standards Association

Nishan Chelvachandran is the Founder of Iron Lakes (Finland), a cyber impact consultancy specialising in providing expertise at the conflux of technology and humanity, with clients and partners from across the world, ranging from private businesses to NGOs and governments. He is also a Director at Future Memory Inc (Canada), a creative and speculative design consultancy that pressure-tests and anticipates undesirable futures to avoid harmful, unethical, or negative consequences. He is a high-level cybersecurity adviser, strategist, published author, researcher, and former UK Police Officer, with years of experience built on the strong foundations of bespoke operational activity in the UK public sector. Nishan spent 6 years as one of the UK national leads for Diversity in Policing, driving equity throughout the police in the UK.

Nishan specialised in fields such as digital transformation, digital intelligence forensics, cybercrime, cyber-operations and cyber-warfare, surveillance, and intelligence. His research interests include big data keyword and behavioural analytics; jurisdictional and legislative affairs relating to cyber-operations and cyber-warfare; ethical frameworks for mass and automated data surveillance, profiling, and decision-making; IoT; AI and its ethical and responsible use and design; and data use and privacy. He is an advisor to AI Commons and an Ambassador for the XPRIZE Pandemic Alliance. He is actively engaged in the cybersecurity and impact tech space. A thought leader in the cyber sector, he is actively driving the UN’s Sustainable Development Goals agenda and initiatives involving AI for Good.

Nishan is also on the Fellowship Council at the RSA (Royal Society of Arts, Manufactures and Commerce), and a Special Advisor to the British & Commonwealth Chambers of Commerce in Finland.

Pēteris Zilgalvis - European Commission (Confirmed)

Head of Unit, Digital Innovation and Blockchain, Digital Single Market Directorate, DG CONNECT;

Co-Chairman of the European Commission Task Force on Financial Technology

Pēteris Zilgalvis, J.D., is the Head of the Startups and Innovation Unit at the Directorate General Communications Networks, Content and Technology (DG CONNECT). He is also Co-Chairman of the European Commission Task Force on Financial Technology. He was the Visiting EU Fellow at St. Antony’s College, University of Oxford, for 2013-14, where he is a Senior Member and an Associate of the Political Economy of Financial Markets Programme. From 1997 to 2005, he was Deputy Head of the Bioethics Department of the Council of Europe, in its Directorate General of Legal Affairs. In addition, he has held various positions in the Latvian civil service (Ministry of Foreign Affairs, Ministry of Environment).

Previously, he was Senior Environmental Law Advisor to the World Bank/Russian Federation Environmental Management Project and was Regional Environmental Specialist for the Baltic Countries at the World Bank. He has been a member of the California State Bar since 1991, completed his J.D. at the University of Southern California, his B.A. in Political Science cum laude at UCLA, and the High Potentials Leadership Program at Harvard Business School. A recent publication of his is “The Need for an Innovation Principle in Regulatory Impact Assessment: The Case of Finance and Innovation in Europe” in Policy & Internet.

Remote Moderator

Trained remote moderators will be assigned on the spot by the EuroDIG secretariat to each session.


Reporter

Reporters will be assigned by the EuroDIG secretariat in cooperation with the Geneva Internet Platform. The Reporter takes notes during the session and formulates 3 (max. 5) bullet points at the end of each session that:

  • are summarised on a slide and presented to the audience at the end of each session
  • relate to the particular session and to European Internet governance policy
  • are forward looking and propose goals and activities that can be initiated after EuroDIG (recommendations)
  • are in (rough) consensus with the audience

Current discussion, conference calls, schedules and minutes

See the discussion tab on the upper left side of this page. Please use this page to publish:

  • dates for virtual meetings or coordination calls
  • short summary of calls or email exchange

Please be as open and transparent as possible in order to allow others to get involved and contact you. Use the wiki not only as the place to publish results but also to summarize the discussion process.


Next meeting of the working group: Friday 23 April at 18:00 CEST.

Items we will be taking care of leading up to the meeting:

  • Confirm speakers
  • Confirm 100% online
  • Add information to wiki on our invited speakers as we find out who is confirmed.


  • The use of data and authentication methods is proliferating, but legal frameworks for data governance need to rapidly address the concerns of governments, the private sector, and citizens.
  • Privacy, security, and sovereignty concerns are moving to the background of the COVID-19 vaccination certification process.
  • In designing authentication frameworks we should bring to the table all proposals from both the public and private sectors, and from citizens themselves.
  • It is important for citizens to know how their data is used, stored, and secured: what the stages are, and who has access at each point.
  • Citizens should have a choice to control how their data is used by different entities in a centralised or a decentralised manner.
  • Both the public and private sectors should work to develop a better visualisation of authentication frameworks comprehensible by citizens.
  • In developing innovative identification and authentication governance frameworks, we should keep in mind interoperability issues in order to ensure consistency in technology standards for the normalisation of data, while including consented use of such data.

Find an independent report of the session from the Geneva Internet Platform Digital Watch Observatory at

Video record


Provided by: Caption First, Inc., P.O. Box 3066, Monument, CO 80132, Phone: +001-719-482-9835,

This text, document, or file is based on live transcription. Communication Access Realtime Translation (CART), captioning, and/or live transcription are provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings. This text, document, or file is not to be distributed or used in any way that may violate copyright law.

>> CLARA NEPPEL: Just checking that the microphone is working.

>> Yes, it works.

>> CLARA NEPPEL: Okay. Perfect. Thanks.

Hello, everyone.

>> Hi.

>> Hi.

>> ROBERTO GAETANO: So we’ll start in one minute.

>> CLARA NEPPEL: Perfect. Are you going to give the –

>> ROBERTO GAETANO: I do the opening, show a couple of slides.

>> CLARA NEPPEL: To Miguel probably.

>> ROBERTO GAETANO: Not to you? Miguel?

>> CLARA NEPPEL: Miguel will make the introduction.

>> ROBERTO GAETANO: Hello, good afternoon. This is the session on data sovereignty and trusted online identity, and that is workshop 3.

Let me just remind everybody about the session rules. Please enter with your full name to ask a question, raise hand using the Zoom function. You will be unmuted when the floor is given to you. When speaking, switch on the video and give your name and affiliation. Chat will not be stored or published.

Do not share links to the Zoom meetings, not even with your colleagues.

And this said, without further ado let me give the floor to Miguel.

>> MIGUEL PEREZ SUBIAS: Hello. Hi to everyone, my name is Miguel Perez Subias. As you know, personal information management systems are systems that give individuals control over their personal data, allowing them to maintain their personal data and share it with whom they choose. This can enable a human-centric approach. These have been developed so that companies and organizations can manage or do business with individuals with transparency and with respect for their rights.

Personal data is the key to many digital services; identification is essential for everything from accessing health or government services, to traveling, to participating in social media. Data is essential to economic growth, competitiveness, and progress in general. It will ensure Europe’s relative competitiveness. It will ensure that more data can be used in the economy and society, while keeping the individuals who generate the data in control.

The Commission proposed this month a framework for the European digital identity, which will be available to European citizens, residents, and businesses across the European Union. They will be able to share electronic documents from their digital identity wallets by just clicking a button on their phone, and they will be able to access online services with national digital identities recognized throughout Europe.

Any citizen, resident, or business in the Union who would like to make use of the European digital identity will be able to do so. The digital identity wallets will be used as a way to identify users or to prove certain personal attributes. The digital identity will enable people to choose which aspects of their identity, data, and certificates they share with third parties, and users can be sure that only the information that needs to be shared will be shared. But who should control those identities, and how can we minimize the personal data exchanged to what is needed for a service? We have a lot of questions and we need some answers. With that, we have four panelists. I give the floor to Clara Neppel. Clara, welcome. The floor is yours.

>> CLARA NEPPEL: My name is Clara Neppel and I’m the Senior Director of IEEE Europe, based in Vienna. My two focus areas are standards and public policy.

It really looks like digital identity is at the center of these two, so I’m thrilled to moderate this. Online identities are the key to many digital services, but the question is: Who should control these IDs, and how can we minimize the data exchanged to the minimum that is needed for the services?

The recent discussions about vaccination passports, for example have highlighted that this discussion is at the center of the current debate. So basically, as we see there are three approaches that we can discuss here. The first scenario is where private companies lead the effort. We have private companies who already provide us with secure electronic identification, including two factor security and biometric verification.

However, this raises many privacy and data sovereignty concerns. The Swiss people recently voted against an EID law that wanted to allow private companies to control the access to government services.

The second scenario is where the government leads the way, with a centralized public key infrastructure. Examples are the already mentioned EU eIDAS regulation and the Swiss ZertES law, which have established electronic identity. However, the uptake is still a challenge. We have countries that have reached a very high adoption rate, for instance, Estonia, but other countries, such as Germany, still see a low adoption rate.

And then we have the third scenario, where we hand back some control to the citizens. One example is the European Self-Sovereign Identity Framework, which was developed by the EU Commission. Here the control lies neither with a centralized government service nor with private tech companies, but is given back to the citizens. We are in a position to hear from discussants from each of these stakeholder groups, and by the end we will all have a fuller understanding of the possibilities and the limitations of each of the paths forward.

It is my pleasure to introduce the discussants. First, Cecilia Alvarez; she is with Facebook and is currently in charge of international affairs at APEP. She is also the Spanish member of the Confederation of European Data Protection Organisations.

And the second discussant, Peteris Zilgalvis; he is with the Directorate General for Communications Networks, Content and Technology, DG CONNECT for short. He is also Co-Chairman of the European Commission Task Force on Financial Technology. And as a third discussant, we have Nishan Chelvachandran, I hope I pronounced it correctly. He is founder and CEO of Iron Lakes, a cyber impact consultancy specializing in providing expertise at the conflux of technology and humanity. He is also chair of an IEEE Standards Association programme on trustworthy technical implementations of children’s online/offline experiences, as well as co-chair of a programme on AI-driven innovations for cities and people.

So first of all, we know that online terms of service are usually not modifiable and 90% of people don’t read them. So this take-it-or-leave-it approach puts control of these relationships in the hands of the tech provider, and one of the big issues is transparency. What is happening to the data? How is it used and reused? How do we ensure the trustworthiness of private data used by companies, Cecilia?

(No audio).

I think you are still on mute, Cecilia.

>> CECILIA ALVAREZ: You are right. Let me see if it – it is working? I hope so.

Very good. Thank you very much, Clara and thank you very much for EuroDIG to invite me to speak to you.

I would like to frame my contribution on identification, authentication, and digital identity around three main ideas. The first is the state of play: the proliferation of proposals calling for different forms of identification or authentication that are conflated with ID verification, each of which is actually addressing different issues. I think it is important that we identify them in order to choose the right method in each case.

The second idea I would like to share with you is that there are competing equities at play, which become even more apparent when we address these identification issues, in particular with respect to people’s rights to access, well-being, and privacy. These must be balanced against each other, and also against technical and operational constraints.

And finally, there is the approach that needs to be adopted in order to address these competing equities: identifying when we need authentication and whether it is desirable, which methods we are going to use, and whether there are alternatives that take into account the competing equities I mentioned at the beginning.

I would like to develop a bit these three ideas.

With respect to the first one, on the proliferation of proposals that we are seeing globally, not only in the European Union but in the rest of the world: they call for various forms of authentication that include, or are conflated with, ID verification.

If I may use a classification, some of them are trying to address online speech harms, others aim to facilitate government services, others are trying to address the protection of youth, and finally, others address fraudulent online activities; there may be more. These are the four that I have identified in legislation around the world.

And ID verification is something that is addressed, as a general rule, in each of these buckets. With respect to online speech harms, the reasoning behind the proposals seems to be that ID verification, by de-anonymizing users, will prevent harmful speech. However, there does not seem to be a consensus that identity is a way to remedy online speech harms.

When we look at government-sponsored ID systems, such as the European proposal we have now, as a general rule these kinds of systems have specifically addressed how to facilitate access to government services, such as filing taxes or even voting, or other kinds of government-related assistance. We are seeing a trend to invite private businesses to be involved in some manner in the use of these kinds of systems.

When we think about protecting youth, I think there is a conflation between ID verification and other kinds of issues, in particular age detection or even the identification of the guardian; these are probably separate issues. Sometimes we do not see that an ID verification system necessarily provides the solution, in particular because so many young people do not have IDs, at least in the current state of the art, which makes this not necessarily useful. There is also a data-minimization concern: the kind of information you get in addition to the age, which is probably the main goal when we think about protecting youth.

And regarding fraudulent online activities, we have seen identification requirements that are not necessarily only looking at ID verification, with respect to money transfers or B2B and commercial activities; sometimes these are driven by know-your-client regulations that exist in many other parts of the world.

Turning to the second idea, the competing equities at play. I mentioned access: access to connection or access to freedom of expression. These are different issues, but we definitely need to take them into account, because mandatory ID verification requirements may exclude populations without access to IDs, and in particular prevent them from connecting with services or communities.

Freedom of expression is another equity that I think is relevant in many scenarios, and here I might point to the International Covenant on Civil and Political Rights, which addresses when freedom of expression may be restricted, mentioning for instance the prevention of imminent physical harm. So these very important elements may be balanced against freedom of expression, and not others.

And privacy. I think we should be concerned, or at least think, about ID verification in terms of government access to what we are doing, because it could threaten civil rights. The same applies to companies: many have voiced the fear of this type of information being in the hands of private companies. And with respect to other individuals, there is the question of whether people should be forced to reveal their identities in the way they communicate, which in certain scenarios could undercut the ability to enjoy the fundamental right of expression.

Well-being. I think we should also be thinking of how ID verification may impact harmful speech, because, as I mentioned before, we have not yet seen evidence that identifiability is an effective remedy for these kinds of harms. With respect to real-world interactions, alternative verification models could sometimes be more effective for scenarios in which there is a high likelihood of leading to in-person interaction. And then there are the technical constraints.

Since I know that I am very close to the end of the time we have allocated, I will briefly refer to the last part. Because we have these very different equities at hand, and different goals, I think it is sensible to take a risk-based, tailored, and proportionate approach in order to determine the nature of the risks we want to mitigate, whether some form of authentication will be able to successfully mitigate those risks, and, if so, whether ID verification or a different solution would be the most appropriate to adopt.

And with this, I will give the floor to the other participants.

>> CLARA NEPPEL: Thank you for this comprehensive overview. With this I would like to hand over to Peteris. The goal is a trusted and secure eID, which should be available to all citizens, residents, and businesses to prove who they are in order to access public sector and commercial services regardless of where they are in the bloc. The EU has had the eIDAS regulation, which entered into force in 2014, but the Commission’s intention is to expand on that by addressing some limitations, such as the poor uptake and the lack of mobile support.

So Peteris, we would like to hear your views on this new framework and, again, the role of the government in providing electronic identity solutions. Thank you.

>> PETERIS ZILGALVIS: Thank you very much for giving me the floor; it’s a pleasure to be here. I’m head of the unit for digital innovation and blockchain, collaborating with our colleagues in eGovernment, among other things, on this regulation updating eIDAS for the EU eID: moving into the digital age and making the most of decentralized technologies to put Europe at the forefront both of protecting the fundamental rights of its citizens and of developing innovative technologies.

So, on the approach to the EU eID: you have the digital identity wallets. The proposal has been adopted by the European Commission, and now it goes to the Parliament and the Member States in our system for its eventual final adoption.

And what does the regulation do? It establishes a framework for European digital wallets, enabling citizens to link their national digital identities with proof of personal attributes: driving licenses, diplomas, bank accounts. One of the solutions that we definitely see for it, especially from my perspective in this collaboration across the Commission, is DLT-based self-sovereign identity, as one of the technological solutions. It’s a tech-neutral approach, so we don’t say that there is one way to do it, either legacy systems or a decentralized system, but allow both the Member States and the markets to choose the complementary mix of technologies that they would like.

But we see these DLT-based self-sovereign identity solutions as a very viable technology underpinning the new EU eID. The added value of such a technology solution is putting citizens in control of their own digital identity, which fits very nicely with the general ideology, the general approach, of the European Union: focusing on the individual citizen, subsidiarity, staying closer to the citizen, and on individual rights in a societal context of protecting privacy and fundamental rights.

The new trust service for electronic ledgers, which is also found in the Commission proposal, ensures legal recognition of the trustworthiness of electronic ledgers under the eIDAS regime, which is very important. And the European Self-Sovereign Identity Framework we see, again, as a very important element in ensuring implementation, because this is not brainstorming, this is not piloting; this is the idea of really going to full deployment of a solution that European Union citizens can use. The framework is being developed by 29 European countries, including all 27 EU Member States, and it is aiming to deliver EU-wide. There is a ministerial declaration signed by all 27 EU Member States, and we are moving to some of the use cases this year. The European Blockchain Partnership is operating as a regulatory sandbox, because we are breaking new ground: we are building something that, while not prohibited in EU or Member State law, is generally, with very few exceptions, not foreseen in Member State law and has not been the subject of judicial decisions in the past that could give us guidance.

Formally, also within the DIGITAL EUROPE programme, we are moving into a regulatory sandbox, which will also accept use cases and propositions from outside of the supported use cases, to try to give more legal certainty, which is a general principle of EU law, respected at the level of primary law, the very top level in our system in the EU. This will be opening as the DIGITAL EUROPE work programme is adopted, and the first calls will be going out towards the end of this year and early next year.

And with that, perhaps those are my introductory words, and I'm happy to pass the floor to other speakers and be ready to answer questions during the discussion.

>> CLARA NEPPEL: Thank you very much. Thank you for this also very normative and very European human-centered approach.

So to the third scenario. Nishan, do you think that the previous solutions' poor uptake, let's say, was because citizens were not involved in the design process? What are your views on how citizens can be better involved, and will it contradict the efforts of government- or private sector-led initiatives? I'm looking forward to your views on that.

>> NISHAN CHELVACHANRAN: Thank you, Dr. Neppel. And I hope that everyone can hear me okay. I’ve had to augment my audio setup for the call today.

So, yes, I think there’s some really, really good points both from Peteris, of course from the government perspective and Cecila with perspective from, I guess the private sector. I think traditionally we have seen the technological solutions or even these frameworks are bilateral. They are either built by government sector or private sector, and whatever does it, the user, the citizen, the human in the formula, are the people that are – they are the end user. They are not involved in the process. You know, something is delivered to them and then they use it.

So whether that’s for any kind of product or service, or whether it’s actually for a government service or some kind of public sector deliverable. So I think what we are seeing at the minute, of course is the use of digitalization, digital technologies to streamline government services which, of course is a good thing because, of course, you know, any way we can, I guess, bring efficiencies for the taxpayer and to deliver services that fit for purpose for the individual and the user are, of course, great things.

But I think one of the things, from a security perspective – and that's my background, in cybersecurity – is that we have a lot of, I guess, loopholes or gaps in our thinking and in the development of these solutions.

And we have a huge rise in what's being called synthetic fraud, where data breaches, or the data of individuals that's stored in various solutions and platforms, are being utilized by cybercriminals to create separate identities which are then used. It's almost an evolution of regular identity theft, which has been around for a while, but it's a very real and worrying scenario as we think about how people's personal data is stored and used on the various platforms and systems.

Especially if we are talking about a centralized system. I think when we are talking about a citizen-led approach, that's not to dismiss efforts from government or the private sector, but it's really to consider whether the solutions we are developing are actually fit for purpose, and whether they are actually solving the problems that people are facing. A lot of the time perhaps they are. But I think we really are in a situation where we find that, actually, there are a lot of assumptions that are made, or considerations that haven't been taken into account, especially if we are talking about the use of apps, or about multifactor identification and single sign-on and all of this stuff.

We forget that with the digital divide, you know, among the majority of older people in Europe, and people in rural areas, and also in minority communities, the uptake or the use of smartphones, for example, is very low compared to people that are, I guess, better off socioeconomically. So when we have this divide, and we are building services for people that have access to infrastructure and access to these mechanisms, then how do the people that don't have access utilize these solutions?

And so that’s just, you know, what one kind of example or perhaps thousand this can perpetuate that kind of divide. But, of course, the – I guess from a citizen perspective, I would argue that citizens and people that in countries, especially in Estonia and Finland, for example, where use of digital systems for public sector deliverables is very high, but people actually want to know what their data is being used for and I think that is actually key. It’s not so much the fact that government or private entities are going to use data and are going to create these frameworks by which to use this agency and secure this data to deliver their services, but it’s how they are using the date, what – you know, what the data is that they are collecting and what is actually happening with that data.

I think there are some very good examples, especially if we pivot to the COVID vaccination situation that we are all aware of. When we look at the two different approaches between the EU area and the US, for example: within the EU, we have the Digital Green Certificate, which operates under the GDPR. So a lot of principles are baked in, and you actually know the way the data is being treated and controlled in the mechanism. That's a secure process, and it's being augmented, and there's work being done to evolve from that and utilize it elsewhere.

If you look at the US, where their equivalent, the health data certification program, has very little privacy protection, there's a big question mark as to what is actually happening with users' data. Where is it going? Is it being sold on? Who is making money with it? What is actually happening with the data? From a citizen perspective, it boils down to agency over data, and the explainability, or at least the understandability – for people to understand what is happening with their information and their data, and even how it is being secured. Is it centralized or not? And understanding the mechanism behind it.

And then having the mechanism to allow individuals themselves to determine who uses their data or not. Because, as I said, I'm pretty sure that as a private individual, a person might be willing or happy to consent to their data being used by government services, but perhaps not by a private company. Or somebody may be happy to have their data shared altogether, while someone else would want to share it minimally.
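The kind of per-recipient, per-purpose consent described here can be modelled, at its simplest, as a default-deny policy table. This is only an illustrative sketch: the `ConsentPolicy` class and its field names are invented for this example and are not part of any real identity framework.

```python
from dataclasses import dataclass, field


@dataclass
class ConsentPolicy:
    """Illustrative per-citizen consent table: (recipient, purpose) -> allowed."""
    rules: dict = field(default_factory=dict)

    def grant(self, recipient: str, purpose: str) -> None:
        self.rules[(recipient, purpose)] = True

    def revoke(self, recipient: str, purpose: str) -> None:
        self.rules[(recipient, purpose)] = False

    def is_allowed(self, recipient: str, purpose: str) -> bool:
        # Default-deny: data may be used only if consent was explicitly granted.
        return self.rules.get((recipient, purpose), False)


policy = ConsentPolicy()
# Happy to share vaccination status with a government service...
policy.grant("health_ministry", "vaccination_status")
# ...but no rule exists for a private company, so access defaults to denied.
assert policy.is_allowed("health_ministry", "vaccination_status")
assert not policy.is_allowed("advertiser", "vaccination_status")
```

The design choice worth noting is the default-deny behaviour: absence of a rule means no access, which mirrors the opt-in consent model discussed in the session.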

And really having these conversations, both baked into the design process and through to the solution, is key. And I think the only way we can really do that is to involve citizens within the process. But, as you said in your opening question, how do we actually do that, given the current mechanisms and the current processes? Because, as we know, technology and the use of data are proliferating at an increasingly rapid rate, while frameworks and legislation are very slow to adapt to that.

But then, of course, from a citizen perspective, who is to say that the five, ten, or a thousand people who are incorporated within the design process are reflective of the mass population and their needs?

So I think it’s – you know, and I can’t really give a definitive answer, of course. I think, you know, I will be very successful if I was able to create the panacea model, perhaps, but I think what is key is to really understand that we need to create this hybridized approach, to really incorporate the citizens within the design process with government and with private sector.

I think the three elements of this nexus, so to speak, can fuel innovation in an equitable and accessible way. And I think the one positive, if there was such a thing, from the COVID pandemic is that it showed that a lot of the barriers and silos that have traditionally existed between regulators, or between the private and government sectors, have been broken down to expedite processes and to redesign how we actually regulate and do things.

So I think with that, I will kind of put a pin in it and open the floor back to yourself and to questions, of course.

>> CLARA NEPPEL: Thank you, Nishan. That was also a very comprehensive view, where you touched upon issues of security, inclusion, transparency, and the usage of data.

And I would very much like to stay on this topic of transparency. I think it is really essential for making these services trustworthy, and I would ask the other two discussants: what actions do you think should be put in place to ensure that users can know about the acquisition and management of consent and the use of their personal data?

So Cecilia, maybe you can start?

>> CECILIA ALVAREZ: Yes, I think Nishan has pointed out something that is very important. As citizens, it's not only that we understand the authentication method that we are using, but that we understand the reason why this is happening, and also what is happening with the data afterwards. Some of the data may already be linked to certain information that will be useful to the service, but some other information maybe is only going to be useful for this part, for this phase, right, in which we are in this authentication mechanism.

And I think that this is something that needs to be addressed. I think all of us are facing difficulties in how transparency can be delivered in an effective manner. Of course, all organizations are constrained, in a good and a bad way, when you have legislation that says you need to address X, Y, and Z. We have not been very successful. Anyway, I'm a privacy lawyer. I have been drafting privacy policies for more than 20 years, and I'm not necessarily very proud when thinking about whether my privacy policies have been successful in delivering transparency. Of course, this is something I needed to do, because the law requires me to do it.

But in addition to the compulsory thing, I think there is room – but for this we also need regulators and civil society to participate with an open mind. It's not a question of reciting A, B, and C, but of asking what elements are relevant for you to know, and trying to see what is the best way to do this. Not necessarily in a text 20 pages long. I have seen, in the general trends, that contextual transparency is sometimes more useful: you see when the data is requested, and you understand what is going to happen with it. Sometimes it seems that people of other ages do not like to read.

I still like to read, but apparently it is no longer trendy, and images and videos seem to be better addressed not only to young people, I would say, but to people of different generations. So probably we will need to make different attempts, with the participation, at the end of the day, of the citizens, because they are the ones receiving the thing.

It is not an excuse to say I don't read or I am lazy, but still, I think that we need to focus on what the elements are, as Nishan mentioned.

What am I doing? Who will have this data? What is the destination? And, in a very, very broad manner, the security elements – not in such detail that the bad people can circumvent them, but enough for you to have an idea of the kind of things that the organization or the government has actually taken care of – having this human-centric approach which, as a European, I feel very much aligned with.

>> CLARA NEPPEL: Thank you.

So Peteris, the same question to you on transparency, and maybe also how you see the role of blockchain in improving trust in these services.

>> PETERIS ZILGALVIS: Definitely. Definitely. Thank you very much.

And what we have to do is give citizens the tools to realize their autonomy, to realize their rights in this area. On the one hand, this cannot be too technical. We can't be expecting people to be programming smart contracts in Solidity themselves, or going through major steps in the software of a program. And on the other hand – and this is the problem with the terms and conditions for now – it should not be too legally complex and filled with boilerplate, which obviously does not benefit anyone except for the law firm drafting it.

So one of the solutions, again, that we would identify is the decentralized solutions giving very real control to the individual citizen. There are things being developed in ESSIF, with the Member States – closer to the citizens, and the regions even – and the European Commission, the European Union, and then obviously the energy and the creativity of the private sector, giving things again very simply: I want my data managed in this kind of way. I can see it. These things, perhaps, I individually don't care about; these things are very important to me. I want to trust that a smart contract or other tool can manage this for me.

So, again, we have a lot of optimism about, as one of the alternatives, the self-sovereign identity solutions, for instance as being developed in ESSIF, which can create a European e-identity that citizens can trust because they control how and which data are disclosed – and it was actually our President, von der Leyen, who said that. And, based on distributed ledger technologies, ESSIF offers this proposed solution as an option under the new EU eID regulation.
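The selective-disclosure idea behind such self-sovereign identity wallets – the holder reveals only the claims a verifier needs – can be sketched roughly as below. This is a toy model under loud assumptions: real ESSIF/eIDAS wallets use cryptographically signed verifiable credentials, not plain dictionaries, and the field names and the hash-as-signature stand-in here are invented for illustration.

```python
import hashlib
import json

# Toy credential held in the citizen's wallet (field names are illustrative).
credential = {
    "name": "Jane Doe",
    "date_of_birth": "1980-01-01",
    "vaccine": "mRNA-1273",
    "vaccination_date": "2021-05-12",
}


def fingerprint(doc: dict) -> str:
    """Stable hash standing in for the issuer's signature over the credential."""
    return hashlib.sha256(json.dumps(doc, sort_keys=True).encode()).hexdigest()


def present(cred: dict, requested: set) -> dict:
    """Disclose only the claims the verifier asked for, and nothing else."""
    return {k: v for k, v in cred.items() if k in requested}


# A venue checking vaccination status never sees name or date of birth.
presentation = present(credential, {"vaccine", "vaccination_date"})
assert "name" not in presentation and "date_of_birth" not in presentation
```

The point of the sketch is data minimization: the wallet decides which claims leave the device, which is exactly the "citizens control how and which data are disclosed" property described above.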

>> CLARA NEPPEL: Thank you.

So I would also like to draw the attention of our audience to the fact that we are open for questions. So if you have any questions, please indicate so. Until then, I would like to have your opinion on a second question which, at least for me, seems very important: namely, that of interoperability. Of course, these frameworks should work across borders, also across different digital borders, across different services.

How do you see we can guarantee the interoperability of these solutions? And maybe we start with Nishan, because I think that you are already active in some of these initiatives.

>> NISHAN CHELVACHANRAN: Thank you, Dr. Neppel. I mean, to also briefly touch on the last question, I think Peteris and Cecilia spoke about the TL;DR – too long, didn't read – sensibilities that we have now.

But looking at interoperability, I think that's key. We are talking about blockchain and other kinds of decentralization technologies, but, of course, there are also other technologies, and let's not forget as well that technologies change.

So, of course, we are talking about the current, I guess, evolution of blockchain right now, but as time progresses, technologies will change. What I think we really need to avoid is building the framework based on the technology, so that, should technology and the use of data change, the framework is no longer fit for purpose. We need to ensure that the frameworks we are building are – I don't want to go as far as saying technology-agnostic – but independent of the specific security implementations, so that the overall infrastructure and framework supports and promotes a lot of the things that we are talking about here.

But that interoperability, of course, is key, because especially from, I guess, a private sector perspective as well, different companies have different solutions which, of course, have different data models, and how they utilize their data, algorithmically or whatnot, is very, very different as well. To give an example, if we look at the health sector and healthcare services: the data that I generate in terms of my health, and what the health authorities here or in the UK have on me, of course, they utilize that to deliver whatever health deliverables they need to.

But, you know, what’s held in the UK for example, and what is held in Finland is probably quite different data because the different times I have been in these two different countries. And the data that I’m generating perhaps from wearable devices that creates data, that could be usable from a healthcare perspective, but then, of course, there are two different levels of legal frameworks that define what data can be used. So data that is collected for my healthcare provider of a subject for certain HIPAA framework. But then from a consumer framework, it’s not as strict. If you have that data that could be useful there, then how do you connect the two? How do you create the interoperability between them and create the standardization? I think that’s really – I think perhaps you are probably the better person to answer this question than me from an IEEE perspective, because, of course, IEEE, you know, that’s the business of IEEE is to create that standardized process between multi-stakeholders.

It’s really key. I think this is understanding – understanding the problems, understanding, you know, some of the elements about how data is used as Cecila was talking about. But then from that understanding, then figuring out how to put that into whether it’s a legislative process that drives it, whether it’s actually the utilization of certain technologies that can provide a solution that actually progresses that or whether it’s just by virtue of the having the forum, by virtue of having an initiative where the different stakeholders can cross pollenate. In that in itself may allow for certain interoperability of technologies and data.

But I think this is definitely the forefront, and where the work needs to be happening.

I think the historical or legacy approach of having these technologies and efforts in silos has really reached its limit, and we need to figure out a way of actually standardizing and allowing for both the normalization of data and the interoperability of data, while including agency and, I guess, the consented use of such data in these processes.

>> CLARA NEPPEL: Thank you, Nishan. Indeed, we have a lot of initiatives, as you know, regarding digital identity, as well as digital sovereignty, which is also about defining where the digital borders are. We no longer have physical borders, but digital borders across services and so on.

So Peteris, I would be interested: are there initiatives from the EU to support this interoperability that was mentioned before?

>> PETERIS ZILGALVIS: Well, I think the whole process of the European Blockchain Partnership, which comprises 29 European countries, and the upcoming collaboration with the EU eID initiative, which is with the same countries, is one of interoperability between legacy systems and a self-sovereign identity based on distributed ledger technologies and blockchain.

Further, we have a proof of concept with Canada – so going global on initiatives related to self-sovereign identity – and also work at earlier stages with Australia and Japan, at least on discussions and reflections on the way that such an approach can be implemented. The European self-sovereign identity framework is one of the use cases of the European Blockchain Partnership, and it is aiming to develop and deliver cross-border self-sovereign identities, and it has to interact with things like driver's licenses and other issues. It's offering a test environment right now for blockchain and DLT solutions, including self-sovereign identity, in conjunction and coordination with the existing legacy solutions or new centralized solutions that may be developed within the Member States. And then, obviously, there is a perspective of public/private partnership collaboration beyond the collaboration we have now, which is buying and procuring solutions that can be implemented in a public service context; the ministerial declaration on the European Blockchain Partnership covers the possibilities for this, and identity could be one of the areas that could be explored.

>> CLARA NEPPEL: Just a follow-up question, are standard setting organizations involved in this effort?

>> PETERIS ZILGALVIS: Very, very definitely. I mean, we have the work that we are undertaking with ISO technical committee 307, also with the initiatives in ETSI, and we had a whole set of standardization and blockchain roundtables that just closed recently, which were addressing many different subjects. As well, the ESSIF framework is looking both at how to utilize standards that are being developed and at contributing to the standardization process.

I mean, the challenge that you always have, when people are very busy working on an initiative like this – and I'm also in the policy position of saying to the Member States' experts and my own experts, oh, and let's get input into standardization – is that it is extra work. In my mind, from my policy point of view, it is absolutely essential, but it's extra work on top of building the thing. So this is a question we have: how to make it simpler and easier for the practitioners, who have not been allotted the time to do this, especially by their employer, to be able to participate in the standardization process. But it's definitely a policy goal of ours, and hopefully we will be giving the input, and will be giving even more in the future.

>> CLARA NEPPEL: Thank you. I think this is very important.

Cecilia, do they accept the eIDs, or how do you see the interoperability working here?

>> CECILIA ALVAREZ: As I mentioned before, it very much depends on the use cases that each company has or expects. It's not the same thing, I think, to enable combating hate speech as to make sure that age detection is appropriate for certain content. So each kind of goal deserves, we think, the relevant approach – at least that you have a risk-based approach. And with these kinds of systems, as long as we have the choice to pick the right authentication method, so that the goal is met more effectively and in the least privacy-invasive way, I think that is the duty here. So not necessarily to have one single system, which is probably not the best one to be scalable around the world, also for companies like Facebook, which either are already providing global services or would like to grow into global services.

We are thinking of systems, and it's not a black-and-white situation – one system and that's it. I think systems are good as long as they are fit for purpose, and as long as they are able, let's say, to offer the best alternative with respect to the technology and the kind of goal that you want to achieve, which I think is in the interest of everyone.

Think about inclusion, and also the promise of scalability with respect to security, which Nishan was touching upon. These are two – I wouldn't necessarily say constraints, but circumstances – that you definitely need to take into account with respect to how you are building, let's say, your different alternatives. Maybe we should not always be putting all the eggs in one single basket, but think about the baskets that are available and take the best that you can for the goals that you need to serve.

I would just like to share with you one figure that was striking: in 2019, 1 billion people worldwide lacked access to basic identification documents. This is very impressive. Sometimes I think we are not conscious of how lucky we are in Europe and other developed countries – even though the numbers for Europe are not that terrible, we are not the only ones – but we need to think of a system that is scalable. There are many parts of the world, and many kinds of marginalized people, that we also need to think about when we are developing these kinds of eggs in the basket, if I may reuse my former expression.

>> CLARA NEPPEL: Thank you. I think in IEEE, it's digital inclusion and trust, and the issues you mentioned. We have one question from the audience, and I think this will also be our last question, because at the end we have a small session with a rapporteur on the key points.

So, would a transparency control similar to food labels work? Basically, it's about certification, if I understand it correctly. So what are your views on this? And also, maybe, last thoughts – if you can do this in two minutes, that would be great.

So let’s start with Nishan.

>> NISHAN CHELVACHANRAN: Great. Thank you. That's actually a very good idea, and especially when you think of food labeling: of course, there's a lot of detail that could go into it, but the end user doesn't need to know a lot of this stuff. Some places have the traffic light system – green means good and red means too much salt, or whatever. That impacts choice, which is what Cecilia was talking about: being informed enough about what is happening and whether or not you want to use the particular product or solution.

One thing that we need to steer clear of when it comes to building these frameworks is the view of creating a universal system that fits everything, which, I mean, doesn't work. And Peteris was touching upon some of the challenges from the government perspective when it comes to standardization as well, in that it ultimately creates more work and needs more resources, and we venture into the world of a Kafkaesque type of scenario. These are the things that need to be considered: how to do it in a way that's simple and not misleading.

And getting away from the 25 pages of terms and conditions where you agree because you need to use the platform, or whatever it is, and then find out afterwards what was in the terms and conditions that you agreed to, right? I think that's what it boils down to. For a final thought, I think with all of these approaches, from my perspective, there needs to be a hybrid between them. I don't think one can really lead the way as such.

Without sounding too utopian, we should all work together. We can all bring different perspectives, and the different stakeholders have different needs. Of course, for private companies there needs to be the business driver, because it is a business; but at the same time, regulation needs to be there to protect the public and citizens, without being an inhibitor to doing business, and without business being exploitative of people – so that the people who are the end users get the things they want to get, and everybody can be productive. So I think there is the appetite and the impetus. We need less talk and more doing.

>> CECILIA ALVAREZ: If I may participate in this discussion of nutritional labels: this is something that I would like to see and contribute to. I can think of an initiative that already exists in the AI space, regarding model cards, which was started by Google; Facebook is working on this, and many other companies are trying to do it.

The difficult thing is the audience for whom you are drafting.

Are you drafting for the regulator, or the developers, or are you drafting for the general audience? You need to see who you are drafting this for, from the perspective of the labels. There was an initiative with the GDPR that unfortunately failed, as an alternative way of delivering the long privacy policy that we are obliged to produce, and nobody is happy.

I do not think we should conclude that we should not be working towards this. But when we think about the regulators, it's not only about protecting the users, but also about enabling the companies to do this, because if you still have the obligation to address 13 things that are listed in the law, you need to address them. And therefore there is a tension between exhaustive information and transparency. I think being transparent doesn't mean explaining everything, but explaining the things that matter with respect to the goals that you need to achieve.

And I will finish with something that I was actually preaching a bit in my former position, in former companies. We were trying to do a consent form for clinical trials, which is quite important for you, because you are going to receive drugs in the process of that testing. And I was telling them: if, when we are in a plane, we have been able to put things that are so important for our security into a leaflet of two pages with plenty of drawings, so that you know what you need to do in the plane in case there is an accident, maybe we should be able to draft a consent form for patients the same way, so that they know what is going to happen. But for this we also need the legislature to help us with this mindset, because otherwise we are still obliged to do the thing that we are not passionate about, which is to draft a very long privacy policy which, at the end of the day, lets lawyers like me be proud of ticking the boxes, but does not necessarily deliver the thing.

I’m a fan of the nutritional labels, Miguel, to answer your question.

>> CLARA NEPPEL: Thank you. Peteris, you have the last word, and as you see, there are expectations of the government side. So how would you like to respond?

>> PETERIS ZILGALVIS: Again, the last word is a benefit. I would say, obviously – I mean, another of my dossiers is digital standardization – so we support standardization and certification. But I think there's an important point to make here: while supporting user empowerment and individual autonomy, it's not to put responsibility onto the citizen. Okay, here it is. Take care of yourself. Go look at these two different frameworks. Try to figure out what is safe and what is not.

The legal framework has to be simple and protect the fundamental rights of citizens – the things that violate fundamental rights should not be allowed in the EU – and then, between things which might be better or, how to say, not as good, it is really about giving people easy-to-use tools. I mean, this really has to be a button, a choice in an app, and easy to understand.

We think that this section in the new EU eID regulation, for the new trust service for electronic ledgers, will provide this.

I mean, obviously with the input of the private sector and its creativity, providing users with proof and an immutable audit trail for the sequencing of data is important for ensuring trustworthiness, and we think this can be a step in supporting data pooling and data sharing, which is almost central under the Commission's data strategy. Trust in DLT infrastructures will be key in scaling up such solutions, and obviously it almost goes without saying that the infrastructures have to be in line with fundamental rights as a starting point.

>> CLARA NEPPEL: Thank you very much. So my conclusion at least is that we don’t have any preferred solution. We need the involvement of all the stakeholders, the government, the private companies, as well as the citizens in order to handle such important issues as interoperability, inclusion, certification, and so on.

Thank you very much for your valuable contributions and I think we are now at the end of our session.

Please stay for six more minutes. As far as I understand, we have a rapporteur who was assigned to this session and would like to check with you if the key messages are along the right lines.

So here I would like to hear from the moderator, the technical moderator, if this is the case.

>> ILONA STADNIK: Okay. I assume I can take the floor. Can you hear me?

>> ILONA STADNIK: Hello, I'm Ilona Stadnik. I will try to quickly summarize the main points from this discussion. It was quite challenging, I must confess, but anyway, I prepared seven messages. I will show them one by one, and after each statement I will ask you to indicate whether there are strict objections. We are trying to have rough consensus. If there's something that is completely not in line with the discussion, just indicate it to me and I will remove that message from the list. Okay?

So let’s move forward.

The first one. The use of personal data is proliferating, but legal frameworks for data governance are very slow to address the concerns of governments, the private sector and citizens.

>> CECILIA ALVAREZ: So I don’t think – this is me because I know that I have been using “proliferation,” the proliferation was not referring to the use of personal data but to the use of – the proposals that are calling for various forms of authentication.

>> ILONA STADNIK: Okay. So anyway, the use of data – the use of data is also proliferating.

>> CECILIA ALVAREZ: But this is not the point. The point is to talk about the authentication methods.

>> ILONA STADNIK: Okay. If I add authentication methods, it will be better.

>> CECILIA ALVAREZ: Yes, but you should redact personal data. The point is to speak about the authentication measures that exist and proposals that are talking about how to address the authentication methods.

>> NISHAN CHELVACHANRAN: I think I also used proliferation but in the context of the technology itself. Yeah.

>> PETERIS ZILGALVIS: And I don’t know that we can say that legal frameworks are very slow. It’s in the eye of the beholder, what people think of the GDPR and the Data Governance Act, but they are out there, adopted and being implemented. So I think “slow” is perhaps not the correct adjective – “need to address” or “need to better address”, you could say that if you want to be critical.

>> NISHAN CHELVACHANRAN: I think this is sounding like a point I might have made. I think my comparison was between the speed of the advancements of technology versus the development of the legal framework. So it’s not to say that, you know, legal frameworks are slow. It’s that, by comparison, the technology advances at a pace that the bureaucracy and everything that’s required in order to develop the legal framework usually cannot keep up with. Of course, you know, to condense that down into a salient point might be a bit more difficult. But that was my –

>> PETERIS ZILGALVIS: I think we can agree that the technology moves faster, but I would say not necessarily bureaucracy. The thing that takes a lot of our time is the Parliament and the Member States’ debates – which is democracy, and I think it’s very necessary.

>> ILONA STADNIK: Okay. I see that there are a lot of different views on this. I can remove this, and later you can contact the digital observatory with additional comments.

>> PETERIS ZILGALVIS: I think you could say the legal frameworks need to – or, if people want to be critical, need to better or more rapidly address. Being democratic, I think we in the European Union are open to criticism, but I was just saying that the legal frameworks are there.

>> ILONA STADNIK: “Need to rapidly address.” I need to move forward. Privacy, security and sovereignty concerns are getting deeper against the background of COVID vaccination. Any objection to that?

Okay. In designing the data governance frameworks, we should bring to the table all proposals coming from public, private sectors and citizens.

>> CECILIA ALVAREZ: I would say that what we are addressing, again, is not the data – not necessarily the governance framework; we already have legislation for this – but the authentication frameworks.

>> CECILIA ALVAREZ: This is the topic that we have been discussing today.

>> ILONA STADNIK: Yes, I agree with you. It is important for citizens to know how the data is used, to ensure security, and who has access at each particular point.

Citizens should have tools to manage how the data is used by different entities in a decentralized manner.

>> CECILIA ALVAREZ: It’s not something that I have said, but I think there has not necessarily been a push from any of us for a decentralized manner per se, but maybe a focus on the fact that it would be good that they have the choice. Even though, if I can summarize something that Peteris was also saying, we should not put the responsibility on the user, confronted with a very difficult panoply of choices in which it’s difficult for the citizens to decide. But I think there was this idea of choice and of keeping it simple.

>> ILONA STADNIK: Citizen choice.

>> NISHAN CHELVACHANRAN: Yes, I would say for that point that perhaps citizens should have a choice to understand how their data is used. Rather than managing, you know, who is using the data, it is about understanding: this is how my data is being used.

>> ILONA STADNIK: So the key point is that they do not have to really manage how it is used, but just to have the choice to know, right?


>> PETERIS ZILGALVIS: I would say control for those who want to manage and for those who simply want to know. I think we had both variants, because our citizens are very different people and we need to address all of them.

Yeah. So they can either say “fine with me” or “I want to donate it myself to the clinical trials” or whatever.

>> CECILIA ALVAREZ: And decentralized manner. It’s not that it’s a bad idea, per se, but it’s one idea among others that could be addressed.

>> PETERIS ZILGALVIS: Valid. I was talking about decentralized, but there’s other ways too.

>> ILONA STADNIK: Both the private and public sectors should work to develop a better visualization of authentication frameworks comprehensible to citizens.

And then we should keep in mind the interoperability issue, to ensure consistency in technology standards for the normalization of data while including the agency and the consented use of such data.

>> CECILIA ALVAREZ: Authentication is what we have been talking about. Depending on the authentication method, there will be data, but the point behind this is to think about the authentication methods, which could be different – and that’s why it’s so important to understand that some of them could maybe be more useful in certain kinds of high-risk or low-risk situations.

>> CLARA NEPPEL: So we have Jorn who would like to contribute.

>> JÖRN: Why should we not only say authentication but authentication and identification? Because it’s not only about –

>> CECILIA ALVAREZ: Okay. I agree.

>> JÖRN: So we used authentication. And concerning the decentralized manner, I think it’s been mentioned, but not as the only way. So they should have a choice of the entity. They should have an opportunity to choose an entity or choose a decentralized manner to manage their data.

>> ILONA STADNIK: So citizens should have a choice to control how their data is used and by whom.

>> JÖRN: How their data is used and managed by different entities or in a decentralized manner. They should have a choice to do it in a decentralized way; they should not be obliged to do it in a decentralized way.

>> CECILIA ALVAREZ: Maybe if it helps we can say how – so to control how the data is managed in a centralized or decentralized manner. You see the difference between the two.

>> JÖRN: That’s a good way.


>> CECILIA ALVAREZ: In a centralized or decentralized manner.

If I may be a bit picky: since we are European, can we spell “centralised” with an s?

>> ILONA STADNIK: It will be well edited, no worries. I think we are done. Thank you for the help.

As I already said, you will have the opportunity to comment on that. If there are some really important additions or omissions that we made, we can revise it and edit it.

Thank you for the help and for having me.

>> CLARA NEPPEL: Thank you.

>> PETERIS ZILGALVIS: Thank you. Thank you, bye.

>> CLARA NEPPEL: Bye-bye.

>> ROBERTO GAETANO: Thank you all for participating in the session and thanks also to the rapporteur, and the speakers and the audience. We are going to resume after lunch break with the keynote speech of Roberto Viola at quarter past 1:00 if I am not – no, it can’t be. Quarter past 2:00. Thank you.