Global digital governance – can technical solutions respond to policy questions? – PL 01 2019
19 June 2019 | 11:20-12:20 | KING WILLEM-ALEXANDER AUDITORIUM
Consolidated programme 2019 overview
Proposals assigned to this session: ID 93, 140, 142, 148, 159, 162, 171, Paris call – list of all proposals as pdf
You are invited to become a member of the session Org Team! By joining an Org Team you agree that your name and affiliation will be published at the respective wiki page of the session for transparency reasons. Please subscribe to the session mailing list and answer the email that will be sent to you requesting your confirmation of subscription.
The focal point for this session is Maarten Botterman (email). If you would just like to leave a comment, feel free to use the discussion page here at the wiki, or email Maarten directly. Please contact email@example.com to get access to the wiki.
Can we find technical solutions for political issues? And what political action is needed to ensure that new technical solutions are sustainable, supportive of human dignity, and contribute to achieving the Sustainable Development Goals? This, and more, will be discussed by a multistakeholder panel, with participation of all in the room.
In recent years a number of high-level political initiatives have contributed to the further evolution of the Internet governance ecosystem and its political framework. Examples are the "London Process" with its Global Cybersecurity Conference, the Global Commission on the Stability of Cyberspace, the Paris Call for Trust and Security in Cyberspace, the UN High-Level Panel on Digital Cooperation, and industry-led initiatives such as the Tech Accord (Microsoft), the Charter of Trust (Siemens) and the Contract for the Web (Web Foundation).
All of these initiatives support the multistakeholder approach and have developed new forms of interaction among governments, business, civil society and the technical community. However, with the latest developments in fields like artificial intelligence, the Internet of Things, cloud computing and blockchain, there is a growing need to go beyond traditional multistakeholder cooperation and to deepen the interaction between lawmakers and code makers.
Can we find technical solutions for political issues? And what political action is needed to ensure that new technical solutions are sustainable, supportive of human dignity, and contribute to achieving the Sustainable Development Goals? Do we need new forms of policy development and decision making when it comes to issues like cybersecurity and lethal autonomous weapon systems, digital trade and sustainable digital development, as well as the protection of freedom of expression and privacy in the digital age? And how can new intergovernmental projects such as the Global Compact, the G20 Osaka Fast Track or the two new cybersecurity groups under the 1st Committee of the UN General Assembly contribute to a secure, free, open and stable Internet?
The panel of esteemed thought and action leaders in the field is invited to leapfrog to the future in 2030 and describe that future, including a statement of what, most importantly, needs to happen in 2020 to make this possible.
Based on this, the panel will engage with each other with the aim of ending up with a top three of recommendations for action in the coming year, to help ensure we are moving towards a future we want.
Following the "lightning talk" by Jonathan and Jaya, a panel of 4 people from different stakeholder groups will be invited by the two co-moderators to lead a discussion, to which all participants in the room are invited to contribute. By taking the perspective of 2030, and "looking back from the future" at what needs to be done today to work towards the best possible future tomorrow, this session will be of interest to policy makers and all who want to contribute to making a future we want happen!
Panelists (all confirmed):
- Mattia Fantinati (State Secretary for Public Administration, Italy): government perspective;
- Anriette Esterhuysen (former CEO of APC, Member of GCSC): civil society perspective;
- Bill Woodcock (executive director of Packet Clearing House, Member of GCSC): technical community perspective;
- Jaya Baloo (Chief Information Security Officer, KPN): business perspective.
The session will be moderated by Emily Taylor (Oxford Information Labs) and Maarten Botterman (GNKS). Jonathan Cave will join as resource to the panel. Moderators will invite interaction with other EuroDIG participants in the room, including Audience Voting. The questions that will be posed to the panel are:
I. Is this a digital future you would want to live in? If not, what would you change?;
II. What policy, market or tech interventions need to happen now to make this future scenario happen?;
III. What do you see as the main risks or challenges that might prevent realisation of this ideal digital future?;
IV. What do you see as priorities for action, and what positive hope keeps you going as you think about our shared digital future?
Until 30 April 2019.
Links to relevant websites, declarations, books, documents. Please note we cannot offer web space, so only links to external resources are possible.
- Maarten Botterman / GNKS Consult, ICANN Board
Organising Team (Org Team)
- Andrea Beccalli, ICANN
- Ayden Férdeline, Technology Policy Fellow, Mozilla
- Kristina Olausson, ETNO - European Telecommunications Network Operators' Association
- Arnold van Rhijn, Ministry of Economic Affairs and Climate Policy of the Netherlands
- Marjolijn Bonthuis, Deputy director ECP, information society platform & coordinator NLIGF
- Wout de Natris, De Natris Consult
- Adam Peake, ICANN
- Emily Taylor, Oxford Information Labs and Chatham House
- Mattia Fantinati, Under Secretary of State for Public Administration, Italy (government)
Mattia Fantinati is currently an MP and Under Secretary of State for Public Administration. He previously served as MP and was the Chair of both the Industry, Commerce, Energy, Research and Tourism Committee and the Committee of Inquiry on counterfeiting, commercial piracy and abusive trade. He is a management engineer and an entrepreneur who completed Master's programmes at Bocconi School of Management, Manchester Metropolitan University and the London School of Economics. As a lawmaker, he promoted a law nullifying public funding for companies which relocate production to non-EU countries, and a fiscal agreement implying a debt-credit swap between companies and the State.
- Anriette Esterhuysen, Director of Global Policy and Strategy, Association for Progressive Communications, South Africa (civil society)
Anriette Esterhuysen is a human rights defender and computer networking pioneer from South Africa. She has pioneered the use of Information and Communications Technologies (ICTs) to promote social justice in South Africa and throughout the world, focusing on affordable internet access. She was the Executive Director of the Association for Progressive Communications from 2000 until April 2017, when she became APC's Director of Policy and Strategy. She was one of five finalists for IT Personality of the Year in South Africa in 2012. She was inducted into the Internet Hall of Fame in 2013 as a "Global Connector". In 2015, she was a winner of the Electronic Frontier Foundation's Pioneer Award.
- Bill Woodcock, Executive director of Packet Clearing House, USA (technical community)
Bill Woodcock is the executive director of Packet Clearing House, the international non-governmental organization that builds and supports critical Internet infrastructure, including Internet exchange points and the core of the domain name system. Since entering the Internet industry in 1985, Bill has helped establish nearly three hundred Internet exchange points. In 1989, Bill developed the anycast routing technique that now protects the domain name system. In 2007, Bill was one of the two international liaisons deployed by NSP-Sec to the Estonian CERT during the Russian cyber-attack. And in 2011, Bill authored the first survey of Internet interconnection agreements. Bill serves on the Global Commission on the Stability of Cyberspace and the Commission on Caribbean Communications Resilience. He's on the board of directors of the M3AA Foundation, and was on the board of the American Registry for Internet Numbers for fifteen years. Now, Bill’s work focuses principally on the security and economic stability of critical Internet infrastructure.
- Jaya Baloo will join the panel for the business community perspective, and
- Jonathan Cave for the academic community.
- Maarten Botterman, GNKS Consult, Director on the ICANN Board, Chairman IGF DC IoT
Maarten Botterman is Director of GNKS Consult, an independent policy analysis and consultancy think tank, and has been a leader in the use of the Internet and related technologies in support of society for more than 25 years. Next to working as an independent policy analyst, he is a Director on the ICANN Board, Chairman of the IGF Dynamic Coalition on the Internet of Things, Board Member of the Institute for Accountability in the Digital Age, and Chairman of the Supervisory Board of the NLnet Foundation. He is a former Chairman of the Board of the Public Interest Registry, Director of Information Society Policy in the European office of the RAND Corporation, Scientific Officer at the European Commission DG CNECT (formerly INFSO, XIII), and Senior Advisor in the Dutch Ministry of Transport, Public Works and Water Management.
- Emily Taylor, CEO of Oxford Information Labs, Associate Fellow at Chatham House International Security
Emily Taylor is an associate fellow with the International Security Department and is editor of the Journal of Cyber Policy. She is CEO of Oxford Information Labs. She is the author of several research papers, and is a frequent panellist and moderator at conferences and events around the world. Previous roles have included chair of the ICANN WHOIS Review Team, member of the Internet Governance Forum Multistakeholder Advisory Group, member of the Global Commission on Internet Governance research network, and Director of Legal and Policy for Nominet. She has written for the Guardian, Wired, Ars Technica, the New Statesman and Slate, and has appeared on the BBC Now Show and BBC Radio 4's 'Long View'.
Trained remote moderators will be assigned on the spot by the EuroDIG secretariat to each session.
- Stefania Grottola, Geneva Internet Platform
Reporters will be assigned by the EuroDIG secretariat in cooperation with the Geneva Internet Platform. The Reporter takes notes during the session and formulates 3 (max. 5) bullet points at the end of each session that:
- are summarised on a slide and presented to the audience at the end of each session
- relate to the particular session and to European Internet governance policy
- are forward looking and propose goals and activities that can be initiated after EuroDIG (recommendations)
- are in (rough) consensus with the audience
Current discussion, conference calls, schedules and minutes
See the discussion tab on the upper left side of this page. Please use this page to publish:
- dates for virtual meetings or coordination calls
- short summary of calls or email exchange
Please be as open and transparent as possible in order to allow others to get involved and contact you. Use the wiki not only as the place to publish results but also to summarize the discussion process.
- The current status quo is characterised by a variety of legal tools with varying degrees of effectiveness. Approaches should be principle-based and combine statutory rules and restrictions with duties of care.
- Ensuring trust in hardware and software devices has long relied on lengthy and expensive standard-setting processes and lengthy and expensive certifications, which could undermine innovation. A different approach could consist of making users more responsible for their actions, informed by standards, labels, and self-certification that is enforced by third parties. Being in charge of their online privacy and security would give control back to the users.
- The role of governments and policymakers should be a future-oriented one. Policymakers need to develop long-term strategies to address existing and future challenges, such as, but not limited to, inequalities, the digital divide, and the impact of digitalisation on jobs. Such strategies need to have a long-term vision to ensure effectiveness in an exponentially evolving digital and technological landscape.
- Due to the rapid evolution of technologies, regulation struggles to keep up with the pace of change. Therefore, regulation is not enough: there is a need for norms, standards, and safety nets. The approach needs to be human-centric, focused on the protection of individual rights and a global regime where human rights standards are respected.
- As a society, we should be aware of two existential threats: the destruction of the environment we are evolving within, and the automation of the exploitation of human psychological weaknesses at scale. Such industry-created threats need to be tackled within the next ten years. In order to achieve a healthy digital environment by 2030, an effective regulation of digital platforms should be developed. Moreover, the excessive power and influence of digital giants should be tackled.
Provided by: Caption First, Inc., P.O. Box 3066, Monument, CO 80132, Phone: +001-800-825-5234, www.captionfirst.com
This text, document, or file is based on live transcription. Communication Access Realtime Translation (CART), captioning, and/or live transcription are provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings. This text, document, or file is not to be distributed or used in any way that may violate copyright law.
>> MAARTEN BOTTERMAN: Thank you. Thank you very much for that. What if we could spend all of our efforts on doing things in the interest of people, right? Beautiful.
I love it. For the interaction of the audience, we have a little extra tool and that is that we have -- can you click for me?
>> JAYA BALOO: Yes.
>> MAARTEN BOTTERMAN: We have this website where you can go to and answer this question. Take your time. While you are going to the website, and answer this question, I will hand over to Emily, who will introduce the other speakers of the panel and we'll take them to a couple of questions after which we'll come back to this. Emily? The floor is yours.
>> EMILY TAYLOR: Thank you very much, Maarten and thank you, Jonathan and Jaya for really kicking off the discussions with your imagined future in 2030. I'm very happy to be back at EuroDIG this year, and to be introducing such a wonderful panel of speakers, who are going to be reacting to what we just heard. Let me just introduce the speakers.
We have State Secretary Mattia Fantinati, who is the State Secretary of the Italian Ministry of Public Administration.
We have Anriette Esterhuysen, who is formerly the CEO of the Association for Progressive Communications, and is also a member of the Global Commission on the Stability of Cyberspace.
Alongside our panelists, Bill Woodcock, the executive director of Packet Clearing House. So you will see, we have very subtly and cleverly replicated the multi-stakeholder environment with each of the panelists and the speakers today. We had Jonathan, I think you're academia and also tech. Of course, Anriette from civil society, State Secretary Fantinati from the government perspective, and Bill from the technical community, and Jaya, who is technical and also from business.
And so we have a full array of stakeholders here but we also have you as the audience. Maarten is going to be looking after the audience interaction. Please don't sit back. Interact. Keep going to these questionnaires and allow us to react.
So turning to the panel, we heard actually pretty hopeful futures from both of our speakers. We're hearing about a world where international cooperation is solved, where laws are agreed on by all of the relevant stakeholders, where people have control over their data, where there is liability, but also an environment where innovation is still fostered, where AI is done for the benefit of the community.
So state secretary, are these worlds that you would like to live in?
>> MATTIA FANTINATI: Thank you very much. Thank you for the invitation. This is a very good question. I'm Mattia Fantinati, the State Secretary of Public Administration of the Italian government, and my key interest is to follow up the digital revolution that we are having in Italy. And my duty is to help Italy benefit from the digital revolution.
I would like to begin from your question, and from your year 2030. I remember that I used to be -- I am a politician. And we used to be in opposition, and we got prepared to try to be in the government, and we thought about this and our role. If you want to rule the future, the first thing that you have to do is to know it. It's to design a scenario. A couple of years ago, we asked 11 professors: how will our jobs be in 2030? What does it mean? I have to get prepared now. I have to write down the right law today to sort of regulate the work of tomorrow.
And these 11 professors said a lot of things, but they agreed on two things, two important things. The first thing is the domain aspect: the main game, to be competitive in the near future, is to be competent in artificial intelligence and the IT sector. This is the first issue. And I mean, even for our work of today, if we think that in the next ten years our job is the same that we have today, maybe it is not like this.
>> EMILY TAYLOR: So how do you see -- take a seat, please.
How do you see the role of government evolving? I mean, will national governments still be relevant in this world that you heard about from Jonathan Cave, where there's perfect international cooperation, where laws are agreed upon internationally? And do you believe it?
>> MATTIA FANTINATI: Yes, I think the government is relevant, because -- of course, they can write down a law. This is the main aspect. And it's really important. I'm speaking like a legislator. A lot of legislators, I think, of course me as well, we make a big mistake, but in good faith. We try to write down the perfect law in the present, and sometimes while we are agreeing to write down a law, a perfect law, the reality changes. So that's why the important issue is to have a longer vision. And sometimes to have a long-term vision means to create a top-down strategy. Because when the government, when a nation, when Europe designs a strategy for the next ten years, it means we have to cover in this strategy all the visions. Is it good for safety? For health? And for whatever?
>> EMILY TAYLOR: Thank you very much. Anriette Esterhuysen, do you feel warm and fuzzy, listening to these two visions of 2030? And I would particularly like you to address the perspective of coming from a developing environment. Does this ring true to you? Is this achievable for the environment in which you are operating and in which your stakeholders are?
>> ANRIETTE ESTERHUYSEN: No.
>> EMILY TAYLOR: Why not?
>> ANRIETTE ESTERHUYSEN: I would start with no and I would actually respond to the idea of future oriented policy. I think if you want your policy to endure, and if you want to address the future, you need to consider the present and the past.
And I felt that that was a little bit lacking in both scenarios. I saw no real acknowledgment of the inequality, the social inequality, economic inequality, the massive and growing digital divide, particularly as we are moving towards new forms of data control and data management, and I saw a scenario where regulation is being imposed, which could, in fact, close down the status quo rather than open it up and make it diverse.
Yes, I would like to think that parts of it could be achieved, but I was not convinced and I think I also didn't hear enough about a global regime where human rights standards are respected across the board, and with -- where the rights of all people, all over the world, are considered.
So, yes. In fact, it sounded a little bit scary to me, although I like the idea of trust very much, but I think you can only really achieve trust if you confront how little security and trust there is now and the fact that our business models -- that's the other thing I didn't hear. I heard that we'll have better regulation. More regulation. Regulation that protects people's identity and data, but I didn't see any real acknowledgment of the fact that the current business models that we are using as a basis of growing this economy have built-in challenges and built-in inequities, and nothing about jobs. Nothing about work and the impact of increased digitisation on the number of jobs and where the jobs are.
>> EMILY TAYLOR: Thank you very much, Anriette. Before turning to you, Bill, I just want to come back to our speakers. You have been challenged. Good. From Anriette. Let's hear your responses about how the world you sketched is relevant for today's digital have-nots, and let's hear about work and the business models we have today. And did you mean more regulation, or did you mean better regulation?
Jaya, can I come to you first? And Bill, I will come to you.
>> MAARTEN BOTTERMAN: Maybe we look at it from the room.
>> EMILY TAYLOR: I want to come to Bill. Please, if you are mulling a question, please come to the microphones. Be heard. You are part of this conversation. Thank you.
And, yes. Perhaps you can take us through the statistics on what we just heard from our panel? Thank you. Jaya?
>> JAYA BALOO: So, again, a little bit of caveat emptor: when you speak to a person who works in security, like me, we tend to be highly dystopic. So this is a huge stretch for me, to sketch this positive vision of the future. But I absolutely do not refute the inequality that is there, nor do I not acknowledge the digital divide. For all the issues we mentioned, the greatest threat, from quantum or from insecure hardware or software, is to the dumping grounds of the developing world. They will have the biggest degree of impact if we do not fix this now.
When it comes to actually being in a position to verify that trust, because we all give it. When it comes to verifying that trust, they are the least qualified and capable now of being able to do that. So we have created first world problems for countries that are absolutely not, you know, having first world infrastructure to be able to kind of digest all this.
And as for business models, that also is a place where, again, I see this as a dumping ground. Take a look at the technology that's available there. The only opportunity is potentially, if we have the right amount of exchange with those countries, that we allow them to leapfrog, that they don't go through the painful process that we had, but because they don't have anything, are able to kind of jump ahead.
And I think that's where the business discussion will play a huge role.
>> EMILY TAYLOR: Thank you very much. Very brief response from you, Jonathan Cave, and then I come to Bill.
>> JONATHAN CAVE: There are different ways to do scenarios. One is to reason towards a desired place, and the other is a reasoned forecast, and what would be necessary to reach it. Now, certainly I'm not dismissing inequality, and in particular the different forms of inequality. We are not talking about income inequality. We are talking about profound differences. And what we know is that too little inequality and too much inequality are both stultifying and lead to a form of stasis. And too much equality of access in an unmediated social network space, too much connection to more people than we can cognitively deal with, with too many diverse and invisible agendas, will create certain things about which we will have to have doubt.
The second thing is human rights. We are talking about a complex society. It has emergent behaviors. You can't necessarily reason up from the individuals to what the society as a whole does. And that emergence works in both directions. You can have irrational people who behave collectively so that they are rational. And you can have people who are rational and put them in the wrong situation, and they behave in irrational ways. It's impossible to know everything.
And on the future of work, I did want to comment on that, because I don't think that the future of work is work. I think that the idea that we trade some part of ourselves for money, which we use to buy goods, is a model that we can no longer sustain. And I do hope that we have got past that, with some of the social economy and circular economy things. Everybody has something to contribute. It doesn't need to be mediated through a supply of labor that immiserates rather than enriches the person.
There's a strong distinction between more regulation and better regulation, or, as we hear it in some countries, between better regulation and deregulation. Governance is the issue. Governance does not mean that there's a regulator or a government. It's a property of the system, which allows the people in it to understand what they are doing, and to link their choices to the outcomes they hope for.
>> EMILY TAYLOR: Thank you very much.
Bill Woodcock, thank you for your patience. Bill is the executive director of Packet Clearing House and also on the Global Commission on the Stability of Cyberspace. Bill, can I have your responses to the scenarios that you heard and the discussions that you have heard? And particularly, Jaya was a little bit critical of the GDPR. And maybe you can just have a riff on those ideas as well.
>> BILL WOODCOCK: Sure. I think as a species and as a society, we are facing two existential threats that are coming very, very quickly relative to our millions of years of evolution that are not preparing us for them.
The first one is the destruction of the environment that we have evolved to exist within. And the second one is a much more complex one, but I will summarize it by saying: the automation of the exploitation of human psychological weakness at scale, right?
So there are things that our brains do, that are now relatively well understood, that can be exploited by machines in vast quantity very fast, right? And, you know, we as sort of grownups in this space are existing in a window between the very young people who are easily exploited and very old people who are easily exploited, and that window is closing quickly, right?
Three or four years ago, this was not that big of a worry. Right? Now people the ages of our children and our parents are relatively easily exploited, and in three or four more years, it will be difficult for us to escape AI calling us on the phone and telling us that there's a problem, that we are very sympathetic to, that desperately needs our attention and our wallets to be opened. So I feel like, as an industry, we are not particularly helping with the environment. Things like bitcoin are a disaster, right?
You know, the environmental footprint of a Second Life avatar was more than the environmental footprint of a citizen of Brazil. All right?
>> EMILY TAYLOR: And training one AI model is the same as five cars in their lifetime. That's just --
>> BILL WOODCOCK: Yes. So there are some real problems there. And on the other side, this is a problem entirely of our industry's creation. Right? The Facebooks and Twitters and so forth of the world that are monetizing attention, this is a problem that we have created, and we need to solve it. And ten years is a relatively brief time to solve problems of this scale, yet it is the time frame in which they are becoming critical.
>> EMILY TAYLOR: Thank you very much. Let's hear from the audience. Maarten?
>> MAARTEN BOTTERMAN: Yes, I'm trying to get the statistics of the question up. The audience thinks that over-regulation is a bigger threat than under-regulation. However, the highest score is the unchecked power of the corporate platforms, the big corporates, the green bar in the middle. So how that compares to under-regulation, I would love to know more.
Fragmentation of the Internet is relatively modest. It's the red one. The purple one is widening of social digital divide as Anriette raised and the black one, interesting that the system chose black, it's about mass digital surveillance.
So you recognize this. Are there surprises here? I would like maybe one or two comments from the room and then back to the panel. So we have two people there. So please, ask your first question.
And then we'll take the second and then we'll go back to the panel.
>> AUDIENCE MEMBER: To be honest, it wasn't really about this specific graph, is that okay?
>> MAARTEN BOTTERMAN: That's fine, the graph or a question to the panel. Please introduce yourself.
>> AUDIENCE MEMBER: Yes, my name is Ifonig and I work for The Hague. The lady from KPN, I don't remember your name. You talked about giving back data ownership, identity. There are some things we are working on here in the city. It's very basic still. So I have my ideas on how we can do this, but I would like to hear from you: how do you think we should do that?
And the second question for Bill on the end. You said bitcoin is a disaster. From the environmental perspective, I understand your point but we are talking about digital cooperation and if you look at the crypto space, I think that you will actually see this is the first form of global, secure, digital cooperation and bitcoin is just the first form of that. So I would ask you, do you see the crypto space as something positive in its essence with many things to learn or what else?
>> MAARTEN BOTTERMAN: Thank you.
>> AUDIENCE MEMBER: My comment is about the graph. Since we have the Italian government: there's a big argument going on between the Italian government and the European Union on whether the Italian government can issue some bonds that seem to look a bit too much like currency. So the European Union says you cannot do that because you already have the Euro.
And then yesterday Facebook comes and says, yes, we created a new currency for the Internet. And so it's us and Visa and Mastercard, and this will be the new currency, and apparently no one has a problem with that. So I think with this discussion, don't we think the discussion is a bit moot if we can't find a way to bring all of these big foreign companies that are actually shaping the future into a bigger discussion and a check, in a way? This is also what the graph wanted to say.
>> MAARTEN BOTTERMAN: I think I have some interesting material for you and your panel.
>> EMILY TAYLOR: Thank you very much. And for the audience, the next time we come out, I would love some questions from the women in the audience as well as the men. Thank you.
Can I turn to the panel? State Secretary Fantinati, can I start with you on the remarks made by Vittorio there, and also your general reactions. We saw from the audience that they are more worried about unchecked corporate power than they are about over-regulation. So this is your moment as governments to get the regulation right. How are you going to do it?
>> MATTIA FANTINATI: Yes. It's a problem, I know that. We are already too regulated in some ways and under-regulated in others. What I can say is about the situation of the Italian government now. It's slightly different from the other states. The problem is that we are a central government, and we can take some decisions for the center and not the local administration. My point of view is that by 2030 we have to reduce the digital divide, and not only the digital divide for people, but also the digital divide for companies and for the public administration.
So there are several ways to do that. The one we are going to pursue -- and I'm talking about the Italian government -- is a bit different in some aspects. One is due to our constitution, because our local administrations in Italy, the regions and provinces, are independent. So we can say, as a government, we want to digitalize all of Italy's services, but the way we are thinking is not necessarily the way of our provinces. It's not the way of our regions.
The main issue is the problem of interoperability between data centers, and so we come back to the security of the data centers. In Italy, we have more than 11,000 data centers, and the data centers don't talk to each other. And sometimes we have a problem of interoperability.
Sometimes I think that to reduce the digital divide, you have to invest in education and knowledge, and show that what you are doing is for the good of the people. So it's a sort of win/win strategy. It's not just a rule, "you have to do it this way because I am the boss." It's trying to say: we are leading towards a solution for the citizens and for the companies.
>> EMILY TAYLOR: Thank you very much. Bill Woodcock, you had a couple of questions posed to you and then I will come back to the panel for other reactions. I'm glad you mentioned the environment. That was on my tick list of things I wanted to talk about today. So thank you for raising that.
And, of course, that naturally leads to discussions about cryptocurrencies, and your reaction to the announcement yesterday of Facebook and the Libra cryptocurrency that is going to be saving us.
>> BILL WOODCOCK: There's a problem with people using the phrase "crypto". And traditionalists --
>> EMILY TAYLOR: I see Jaya.
>> BILL WOODCOCK: -- know that it means cryptography. Some think it means cryptocurrency. So cryptocurrency: not inherently problematic. What is problematic is that those kids who did not pay attention in high school economics, and have a pre-Keynesian understanding of economies, think that proof of work has something to do with money.
So the notion that digging a hole in the ground, extracting gold from it, digging another hole in the ground and putting the gold in that hole creates value is a medieval understanding of value, right?
Post-Keynes, we understand that value is the sum of the work being done, not the work destroyed. So the idea that you burn GPU cycles to create value is a false understanding, and we have used a vast amount of energy and resources chasing that false goal.
So let's put that aside and look instead at cryptography.
>> EMILY TAYLOR: Very briefly, Bill. Because I want to get to the other panel.
>> BILL WOODCOCK: Yeah. It's a really, really useful tool. It can be used to protect privacy and assure trust and that's where our focus needs to be.
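[Editor's note] Bill's earlier point about burning cycles can be made concrete with a minimal sketch of hash-based proof of work. This is an illustrative toy, not any production protocol: the `difficulty` parameter and the leading-zero target are simplifications chosen for the example.

```python
import hashlib

def proof_of_work(data: bytes, difficulty: int) -> int:
    """Find a nonce whose SHA-256 hash has `difficulty` leading zero
    hex digits. Every failed attempt is discarded computation -- the
    'work' proves nothing except that energy was spent."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(data + str(nonce).encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

nonce = proof_of_work(b"block", 4)
# The expected number of hashes grows 16x per extra zero digit:
# at difficulty 4, roughly 65,000 hashes are computed and thrown away
# just to find one that happens to match.
```

The energy critique follows directly from the loop: the output has no use beyond demonstrating that the loop ran.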
>> EMILY TAYLOR: Jaya, there was a question to you, and particularly, I think, one resonating from the idea of giving back digital ownership. How? These visionary statements about how wonderful the world is going to be are always light on how we get there. How do we get from where we are now, which is probably the opposite, to that vision that you --
>> JAYA BALOO: There's a great saying that the thinking that got us here is not the thinking we need to solve our problems. Fundamentally, money is the motivation. We are selling our data. The social media networks that hold the most identities see that as a cha-ching mechanism, a way to monetize that identity information: we have so many identities, therefore we are the best place to do targeted advertising against those identities, and the best place to sell the data from those identities to anyone else looking for it.
So this is where I think we need to take a stand, and here is a perfect spot for regulation, because this was never the intent of the users. They did not go in with all eyes open, and understand what was being done with their data. It was not made simple for them, what it meant when they gave it away.
So here I see that -- again, I'm not a big fan of regulation. I believe in self-governance to a large extent when it comes to a lot of things around the Internet. But this particular evil needs this particular tool.
>> EMILY TAYLOR: Well, it's like the remarks of Minister Keijzer, it's self-regulation.
>> JONATHAN CAVE: We ceded power. What we didn't understand was the consequences of the choices that we made. That's true for governments as well. When governments try to permit innovation or change, it used to be that they fixed the things that needed fixing and got out of the way of the other stuff. Now they feel a degree of the precautionary principle: unless they can have a degree of assurance and foresight, they won't permit businesses this power to regulate themselves.
The second aspect of that has to do with giving back data ownership. We are talking about data subjects, but that's not the society we are living in. What about the data objects? Your data are used to control my life. And there's no obvious virtue in giving you control that is not only impossible for you to comprehend but burdensome for you to exercise. And one final thing --
>> EMILY TAYLOR: Hold that thought. I want to turn to Anriette first and then back to the audience through Maarten. Thank you.
>> ANRIETTE ESTERHUYSEN: I think that maybe what we should look at to bring all of this together is the user -- people. How can you put people at the center of this? And I think we tend to talk about our current digital reality and the future one assuming it has its own trajectory, as if this online universe is evolving on its own terms. Jonathan, what about the textile industry? The cost of clothing in Europe is cheaper than it's ever been. Why? Because of globalized labor inequalities. This is a reality.
Is there anything in either of your scenarios to suggest that that will not continue to be a reality? I absolutely think the business models need to be addressed, and I don't see in either of your scenarios how that reality is actually going to be changed.
>> JAYA BALOO: I'm going to cut in here. We have nine minutes. Neither one of us is exhaustive. I tried to keep it to five things.
>> ANRIETTE ESTERHUYSEN: Maybe you can address that. I think we need interoperability frameworks. I think we need norms. We need standards. We don't just need regulation. And I think we need safety nets. If there's one thing that any kind of unpredictable future should come with, it's some safety nets, and I do think individual rights protections are a very important one.
>> EMILY TAYLOR: Maarten.
>> MAARTEN BOTTERMAN: Our way to be comprehensive is to get you on the podium, not to give our scenario sketches all the time we would otherwise have.
Now, what to do now for a healthy digital environment in 2030? We have a couple of options that we would like to project. The next question to the audience, please.
My clicker doesn't work. Can we have the second question to the audience, please. So, what to do now for a healthy digital environment in 2030? The first option is innovation first, a new Digital Agenda; the second is effective regulation of digital platforms; and the third one is breaking up the digital giants. We did that with AT&T, I think.
The fourth one is ensuring infrastructure competition, and the fifth one is blocking access for bad actors -- countries, whatever. So please take your time to answer these, while Emily takes the panel to the next phase.
>> EMILY TAYLOR: Thank you.
(Off microphone comments)
>> MAARTEN BOTTERMAN: Oh, can we get the previous slide back that has --
>> It's a new URL. It's not the same URL.
>> EMILY TAYLOR: Oh, I think the code is different.
>> JONATHAN CAVE: That's a disadvantage compared to Slido.
>> EMILY TAYLOR: While we sort that out, I will use our precious moments to come back to our speaker and ask you the same question. What -- I will start with you, Jonathan, because I cut you off.
What -- one thing, if you just had one thing that we could do now that would get us to a better digital future, that would change our trajectory -- and then maybe we could hear from that lady in the audience.
>> JONATHAN CAVE: I don't recall the options that were on the table.
>> EMILY TAYLOR: You don't have to be guided by them.
>> JONATHAN CAVE: We rely on the law because we rely on the law to be slow, and a system which has designated parts that move slowly and other parts that move fast has a kind of coherence. The slow bits have the obligation to take what the fast bits produce into account. We are now living in a world in which the environment is moving on the same time scale as a lot of the things that we are doing.
And what I meant to say about foresight was that governments were content to set the ground rules for technological innovation when they thought it would move faster than law, but law is driven by public opinion, and that moves pretty darn fast.
>> EMILY TAYLOR: Thank you very much. We have got the code. We have a lady wanting to ask a question; with your permission, panel, I would like to go to this audience member first.
>> AUDIENCE MEMBER: Hi. Thank you for the question. My name is Nelly Gawi; I work for the Dutch Ministry of Economic Affairs. I have a question for Jaya, based on your introduction about certification and liability. I agree that we need to use the entire toolbox to make hardware and software more secure over time, and that we need lots of things to do this, including liability. But I am curious: one of the instruments of liability is civil law. The framework that's in place now could generally be used already, because if there are damages you can sue. Granted, you can debate the question of what constitutes damages in an environment of cyber insecurity, but the question is also to what extent businesses would be willing to sue their suppliers to apply liability. Because that's not something that government can do, since it's civil law. So I'm curious to hear your reflections on that.
>> EMILY TAYLOR: Jaya.
>> JAYA BALOO: I am happy to sue. Let's start there. I'm thankfully not in charge of the legal department, but here's the bottom line: whenever I take on any new vendor, before we introduce their product into a service for our customers, or use it internally, we pen test it. And the number of high-severity vulnerabilities that we have found in pretty much every single product that we buy from a vendor shows me one thing: they are not testing it themselves, because they would have found them too. We are not talking about the stuff that takes six months to discover; we're talking about a two-week pen test in which we see a CVSS score ten vulnerability. And zero days as well. And not only in regular products but in security products. We find high-severity vulnerabilities. Zero days! It's not acceptable!
So I want liability. And you are right, the issue is what constitutes damage. If the vulnerability is present, I want the vendor to pay to fix it, or to remove the product from circulation until it's fixed. We have to really rethink this, because, again, what we see is that the biggest vulnerabilities we know about are still left unpatched all over the Internet. So we need to come up with a better mechanism to encourage and motivate patching -- automated fixing, all propagated by the bloody creator of the hardware or software, the thing.
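[Editor's note] The "CVSS score ten" Jaya refers to is the top of the Common Vulnerability Scoring System scale. The CVSS v3.1 specification maps base scores to qualitative severity bands, which a simple triage helper might encode like this (an illustrative sketch, not an official implementation):

```python
def cvss_severity(score: float) -> str:
    """Map a CVSS v3.1 base score to its qualitative severity rating,
    per the bands in the CVSS v3.1 specification."""
    if not 0.0 <= score <= 10.0:
        raise ValueError("CVSS base scores run from 0.0 to 10.0")
    if score == 0.0:
        return "None"
    if score <= 3.9:
        return "Low"
    if score <= 6.9:
        return "Medium"
    if score <= 8.9:
        return "High"
    return "Critical"

# A pen-test finding with a base score of 10.0 -- the worst case
# Jaya describes -- lands in the top band:
print(cvss_severity(10.0))  # Critical
```

Anything from 9.0 upward is rated Critical, which is why a score-ten finding in a two-week pen test is such a strong signal that the vendor never tested the product.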
>> EMILY TAYLOR: I can see you agreeing.
>> ANRIETTE ESTERHUYSEN: We are, in the Global Commission on the Stability of Cyberspace -- some of our norms try precisely to address the vulnerability equities process, back doors and hack-backs. You are absolutely right. I think we need more accountability, and in some cases more accountability requires liability. So I agree with you.
>> EMILY TAYLOR: Bill, from the perspective of the Global Commission on the Stability of Cyberspace, can you elaborate on those norms?
>> BILL WOODCOCK: Yes. As Anriette said, we have norms ranging from our very blandest, the one promoting good cyber hygiene, all the way up to the ones aimed at keeping governments from attacking the private sector. We're looking at this question of vulnerabilities and how they are exploited or fixed, and obviously one of the big questions there is vulnerability equities, where governments assert that they have an equitable right to keep vulnerabilities secret and exploit them rather than fixing them.
>> EMILY TAYLOR: Thank you very much. Now, State Secretary Fantinati, looking at the audience reactions again -- first of all, just my personal observation, these responses would probably have been unthinkable five years ago: the most popular responses from a EuroDIG audience are that what to do now to get to our healthy digital future is all about regulation. It's about effective regulation of the digital platforms, according to 35% of our audience, and breaking up the digital giants.
Now, to pick up the perspective of Vittorio: from the perspective of an Italian state secretary, how do you get that regulation to apply to foreign companies?
>> MATTIA FANTINATI: It's quite difficult, because we know about the problems that a lot of web giants are creating in Italy, in turismo -- I mean tourism, and tourism in Italy is a very big market.
Of course, we need effective regulation of this sort of big platform. We can say that the big five control everything -- yes, you said it, but it's like that.
About the government, I would like to add something, because we wonder how to digitalize the system without becoming like another giant, because for the government it's different. That's why in Italy we decided to digitalize services: the citizen services must be at the center of the stage. That is harder to do.
>> EMILY TAYLOR: Thank you very much. Jonathan Cave and then perhaps we can take more questions from the audience.
>> MAARTEN BOTTERMAN: Yes, please step up to the microphone if you have time for one or two. Jonathan?
>> JONATHAN CAVE: Yes. In relation to the vulnerabilities, something occurred to me. We had a discussion yesterday about IoT and ethical standards in the IoT. One of the points made was that an awful lot of the players on whom we rely as a countervailing force against the foreclosure of IP chains are SMEs in remote jurisdictions that don't care whether they are being sued or not. The large firms do care about these liabilities, and they will bring their bit of the world into compliance. That will magnify their market power by foreclosing those rival market chains.
We also hear these large firms making entirely self-serving pleas for regulation, along the lines of those two middle things and I think we need to think very carefully about whether Mark Zuckerberg is asking for enhanced access to our data so that he can perform this public function on our behalf.
>> EMILY TAYLOR: I know we have burning responses from the panel. Let's take these questions.
>> MAARTEN BOTTERMAN: Okay. We have three questions and time flies, so we will keep it to these. Thank you for coming to the microphone in the back, and then the lady, and then the gentleman in the front.
>> AUDIENCE MEMBER: I'm Peter Russ, with a digital automation provider. There you have a lot of legacy systems, and those legacy systems cannot be patched.
Also, I have been to a Microsoft seminar, and they say the shortfall of security professionals will be 3 million within the next two years.
So if you want to patch, and you have a lot of difficult systems in place, I think the only way out is to make things less complex. What does the panel think of that?
>> EMILY TAYLOR: Thank you very much, Peter.
Can we hear from the lady behind you?
>> MAARTEN BOTTERMAN: The lady? You get to -- we'll take all three questions and then we'll go to the panel. Thank you for your excellent question.
>> AUDIENCE MEMBER: Thank you very much. My name is Jessica Petroski; I'm a professor. I'm hearing this conversation about the environment, of course, but what I want to know is: what about the user side? When we are talking about what we should do now, thinking about 2030, I'm thinking about digital literacy. How do you respond to that? Should we, from a government or public policy perspective, be playing a role in thinking about and supporting digital literacy?
>> EMILY TAYLOR: Thank you.
>> MAARTEN BOTTERMAN: Thank you. I think it leads very well to what the gentleman said about having the right skills. Users and professionals. Please.
>> AUDIENCE MEMBER: This is perfect. I'm Antonio. I come from the southwest of Spain, and I work with rural youth, and I now have the responsibility to send a message to those young people who live in very rural areas and who already have the feeling that they don't have any opportunities for the future. And now we are talking about scenarios and topics that are already familiar to us -- jobs and regulations -- but for them, everything is new and there's no reference to follow.
So for those working in that field, I would like to ask the panel: what is the message we should send to those people who are -- not that far away, but actually quite far?
>> EMILY TAYLOR: Thank you very much for those excellent questions. I saw Bill wanting to respond, and also Jaya.
>> BILL WOODCOCK: Yes. Really quickly. The last three questions strike me as being closely related in a way that is not apparent. The distinction between users and providers is a completely artificial one. We cannot proceed if we continue to honor that false distinction, right? We need users to be empowered, to also provide services and goods in a digital economy. We can't say that there's one class of people who sell and another class of people who buy. That relates to the question of simplicity.
If we assume that you have to have the complexity of a Facebook or Microsoft or Apple to be a provider, then we're reinforcing that distinction. If, on the other hand, we look at open source, at modularity, at small parts and pieces that form building blocks, each one of which is simple enough to be auditable by many people -- that's how you build a secure environment. That's how people can take care of themselves. That's how people can create services to sell to each other.
>> EMILY TAYLOR: Jaya, legacy systems, digital literacy, rural youth?
>> JAYA BALOO: Okay. Again, I'm going to take the last comment first. So, liability. We talked about suing, but I don't think that's the only way to get liability. We still have these products on the market; the issue is to take them out of circulation. This is an economic issue, not a security one. Take consumer advocacy, for example: we are talking about insecure IoT -- the Cayla doll. Why is that still available? We should not have those types of insecure devices even available.
>> EMILY TAYLOR: It's the Bruce Schneier comment: I replaced my smart thermostat approximately never. Once they are out in the wild, how do you get them back?
>> JAYA BALOO: That's a different story. That means some user already has it. I'm talking about the fact that these are still on the market for sale. I would like to take the insecure products out of circulation.
When it comes to patching IoT, I feel your pain. But there's no such thing as "not able to be patched". What there is, is hard-coded credentials embedded in software, or a manufacturing or health care process which requires additional certification to operate as a hospital or a manufacturing plant, where applying a patch means you potentially lose that overriding certification when you update these systems or change them out. So I think even there we need to figure that one out. And the last one is education: look, in The Netherlands there is a free national AI course in Dutch that you can download as an app. We need to look at those areas where we have a severe skills shortage -- AI, quantum, all of these things -- and we need to figure out how to reach those vulnerable communities and educate them in the areas that need educating.
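[Editor's note] The hard-coded credentials Jaya mentions are exactly what firmware and code audits scan for. A naive sketch of such a scan might look like the following; the patterns are deliberately simplistic placeholders, and real audit tools use far richer rule sets plus entropy analysis.

```python
import re

# Toy patterns for embedded secrets -- illustrative only.
PATTERNS = [
    re.compile(r"(password|passwd|pwd)\s*=\s*['\"][^'\"]+['\"]", re.I),
    re.compile(r"(api[_-]?key|secret)\s*=\s*['\"][^'\"]+['\"]", re.I),
]

def find_hardcoded_credentials(text: str) -> list[str]:
    """Return the lines of `text` that look like embedded credentials."""
    hits = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        if any(p.search(line) for p in PATTERNS):
            hits.append(f"line {lineno}: {line.strip()}")
    return hits

sample = 'host = "10.0.0.1"\npassword = "admin123"\n'
print(find_hardcoded_credentials(sample))
```

The point of the panel discussion stands independently of the tooling: a credential baked into firmware like this cannot be rotated by the user, which is why such devices resist patching in practice.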
>> EMILY TAYLOR: Thank you very much. I think that brings us to State Secretary Fantinati. We are in the final couple of minutes, so brief comments from our three remaining panelists, because then we will go to Stefania to wrap up. Rural youth -- this must be a problem that Italy has as well.
>> MATTIA FANTINATI: Yes, we have the same problem.
>> EMILY TAYLOR: What is your message?
>> MATTIA FANTINATI: Yes, if you think about the digitalization of our country, it's difficult. In Italy, we say that the digitization of Italy is like the skin of an onion. There are good examples of digitalization: we have cities where we are testing driverless cars or doing big data analysis, and other cities where they are still using paper.
>> EMILY TAYLOR: Yes. And particularly in the country, for young people, what is your message to the young people, the digital -- the would-be digital youth who can't get online?
>> MATTIA FANTINATI: The message is that I think the state should provide for the education of young people. We know that education is the base. Digitalization and the safety of data should be taught in school. Because think: if I plug my iPhone into a public administration computer just to charge it up, some data can go to the public administration. So security is a challenge at every level, and we have to learn how to face it.
>> EMILY TAYLOR: So security and education. Jonathan Cave, some final responses to these questions and then I will go to you.
>> JONATHAN CAVE: I really resonated with making the component systems less complex, which allows the complexity of the whole system to operate, and that requires a different kind of expertise than patching expertise. I think we know a little bit about how that interoperability works.
Digital literacy: most of the kids that I see are digitally literate in the sense of knowledge. The embedded values that went with that knowledge, and the way in which they acquired it, are much less understood. And we try to fit them into roles that existed in a prior economy. There's an enormous potential to create value, even where it doesn't yet pay money, both among the young and among under-employed people of more mature years, and if you put those two groups, who feel that dissatisfaction, together, they could actually come up with something. In particular, for rural youth, at the moment the only way they can escape is to come to the cities and magnify the unsustainability of congested infrastructures. It should be inherent in what we do digitally that they don't need to do that -- some type of provision, not just broadband provision but empowerment, to allow them to invent new things and do new things. If you look at what happens in Africa, where everyone has a mobile phone, they are doing things that we are only beginning to dream of.
>> EMILY TAYLOR: And Anriette. Everyone has a mobile phone in Africa.
>> ANRIETTE ESTERHUYSEN: Not yet everyone, and particularly not smartphones, and Internet penetration and mobile data penetration are plateauing. But in response to a few questions: complexity, yes, and I think there's complexity at the policy level as well. Content regulation, for example -- as David Kaye has proposed, let's use international human rights standards for platforms and corporations.
Digital literacy -- what does it mean? We need to look at it as literacy in, or understanding of, how the system works and how your data is used and abused, and we need to see digital literacy as more than just building more consumers. We need people who interact with the system and understand its risks and its potential more effectively.
And in terms of the inclusion question, I think we need social policies that can ensure inclusion, not just digital policies, and they need to be integrated. I think that should be the measure of how future-proof these policies actually are.
>> EMILY TAYLOR: Thank you very much. There's so much more to discuss. I can feel that each one of our panelists is burning to make several more comments, which is great but unfortunately, we have totally run out of time. I will pass the baton on to Maarten.
>> MAARTEN BOTTERMAN: Thank you, Emily, panel, and people in the room. In the tradition of EuroDIG, at the end of the session we have a short summary, and Stefania Grottola has a short text -- thanks to Reiner, I saw him walk back with the slide; it's all realtime. A report will come back. The most important topics picked up this time: the current status is characterized by different degrees of existing statements of rights and available legal tools. It's about getting the principles right and understanding that there is a change in how we can make things happen and organize society.
We have a clear warning from the panel, and questions from the room accompanying it, about the digital divide, about how we preserve human rights, but also about the need for skills and for people to be informed.
And clearly, a wide concern about what the balance will be with large corporations and platforms playing major roles in society, whatever drives that. It has a major impact because people want the services and follow them.
So, to the very last remarks on that: the excess of power and influence should be tackled, and effective regulation should be developed. What that looks like is something that people like Mattia and Jonathan will work on directly, but they will need your voice in that. They will need your feedback. More than ever, regulation also needs to develop in a multistakeholder fashion. Thank you for practicing that. Please join me in thanking Emily and the panel for the excellent conversation.
This text, document, or file is based on live transcription. Communication Access Realtime Translation (CART), captioning, and/or live transcription are provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings. This text, document, or file is not to be distributed or used in any way that may violate copyright law.