A new European regulatory regime: What are the Dos and Don’ts? – FA 02 Sub 01 2022


21 June 2022 | 14:45 - 15:30 CEST | SISSA Main Auditorium | Video recording | Transcript
Consolidated programme 2022 overview / Day 1

Proposals: #40 #47 #49 #55

You are invited to become a member of the session Org Team! By joining an Org Team you agree that your name and affiliation will be published on the respective wiki page of the session for transparency reasons. Please subscribe to the mailing list to join the Org Team and answer the email that will be sent to you requesting your confirmation of subscription.

Session teaser

In this session we will address the current regulatory strategies of the EC with regard to content moderation, monitoring and control mechanisms, intermediary liability and transparency obligations. The DSA, the DMA, the Cyber Resilience Act, the proposal for an eID and the draft regulation on CSAM, among others, will be discussed in light of consumer protection, human rights and a level playing field for service providers.

Session description

Always use your own words to describe the session. If you decide to quote the words of an external source, give them the due respect and acknowledgement by specifying the source.

Format

round table debate

Further reading

People

Please provide name and institution for all people you list here.

Focal Point

  • Jutta Croll

Focal Points take over the responsibility and lead of the session organisation. They work in close cooperation with the respective Subject Matter Expert (SME) and the EuroDIG Secretariat and are kindly requested to follow EuroDIG’s session principles

Organising Team (Org Team) List Org Team members here as they sign up.

The Org Team is a group of people shaping the session. Org Teams are open and every interested individual can become a member by subscribing to the mailing list.

  • Vittorio Bertola
  • Luc Steinberg
  • Jutta Croll
  • Charles Martinet

Key Participants

Key Participants are experts willing to provide their knowledge during a session – not necessarily on stage. Key Participants should contribute to the session planning process and keep statements short and punchy during the session. They will be selected and assigned by the Org Team, ensuring a stakeholder-balanced dialogue, also considering gender and geographical balance. Please provide short CVs of the Key Participants involved in your session at the Wiki or link to another source.

Moderator

The moderator is the facilitator of the session at the event. Moderators are responsible for including the audience and encouraging a lively interaction among all session attendants. Please make sure the moderator takes a neutral role and can balance between all speakers. Please provide a short CV of the moderator of your session at the Wiki or link to another source.

Remote Moderator

Trained remote moderators will be assigned on the spot by the EuroDIG secretariat to each session.

Reporter

Reporters will be assigned by the EuroDIG secretariat in cooperation with the Geneva Internet Platform. The Reporter takes notes during the session and formulates 3 (max. 5) bullet points at the end of each session that:

  • are summarised on a slide and presented to the audience at the end of each session
  • relate to the particular session and to European Internet governance policy
  • are forward looking and propose goals and activities that can be initiated after EuroDIG (recommendations)
  • are in (rough) consensus with the audience

Current discussion, conference calls, schedules and minutes

See the discussion tab on the upper left side of this page. Please use this page to publish:

  • dates for virtual meetings or coordination calls
  • short summary of calls or email exchange

Please be as open and transparent as possible in order to allow others to get involved and contact you. Use the wiki not only as the place to publish results but also to summarize the discussion process.

Messages

Video record

https://youtu.be/oQII31f6Bco

Transcript

Provided by: Caption First, Inc., P.O. Box 3066, Monument, CO 80132, Phone: +001-719-482-9835, www.captionfirst.com


This text, document, or file is based on live transcription. Communication Access Realtime Translation (CART), captioning, and/or live transcription are provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings. This text, document, or file is not to be distributed or used in any way that may violate copyright law.


>> NADIA TJAHJA: Now we would like to start with focus area 2. For those of you just joining, this year we introduced a new format at EuroDIG called focus areas. There are four overarching themes that we’re looking at, but today there are only two. This morning, in the auditorium room, we looked at digital sovereignty: is Europe going in the right direction to keep the Internet safe and open? If you are interested in that theme, it continues in the workshop rooms; you are welcome to go to the FabLab, where they will continue the interactions and go into the nitty-gritty, interactive parts of the discussions.

Now we’ll start on focus area 2. Earlier there were discussions in the lab and now we’re coming together to talk about the focus area, a reality check: do we implement effective regulations and set the right standards to solve the problems of the future? Please remember that at EuroDIG we create the EuroDIG messages, and at the end of each focus area you’ll have the opportunity to listen to the Geneva Internet Platform, which will present the messages. Focus area 2 will be presented here, and focus area 1 will be presented in the FabLab online.

Now I would like to invite Vittorio Bertola to come to the stage. The next session was originally organized by focal point Jutta Croll, who is unwell. We hope to open up a community discussion in which you are actively engaged with the topic. We look forward to your contributions.

Vittorio Bertola, thank you so much for coming, for leading this discussion.

>> (Multiple voices).

We have a lot of legislation prepared by the E.U., we have seen – hello?

>> JUTTA CROLL: I’m here. I’m here. Good to see you.

I think we had some fake news in regard to me being there or taking part.

>> VITTORIO BERTOLA: You are here and we’ll do this workshop together.

>> JUTTA CROLL: Thank you.

>> VITTORIO BERTOLA: There are many initiatives for digital regulation, and there are patterns in how these digital regulations should work together.

We’ll start with a brief overview of the Digital Legislation Plan by the E.U. and how this works.

There are some issues connecting the laptop.

Maybe we’ll start with the Council of Europe. Could you tell us about the activities of the Council of Europe regarding regulation, and how the Council of Europe is accompanying regulation, for example on AI? We all know the E.U., of course, covers a lot of Europe, but it is not all of Europe, and the Council of Europe has a broader approach and is, of course, directly oriented towards human rights.

You can join us here if you want.

I can give you the mic, just come here.

>> THOMAS SCHNEIDER: I should just say that I didn’t join the panel, because it doesn’t make sense to have the same people on the panel too many times. So maybe I will sit here and then leave.

Yeah, I have been asked to talk about the Council of Europe. For those that do not know it, the Council of Europe is not an organ of the European Union; it is something like the UN, set up after the Second World War, inspired by people like Winston Churchill, to try to have a rules-based, peace-oriented, democratic setting in Europe. The Council of Europe has a structure that develops a lot of soft law instruments, normally called recommendations or guidelines, and it has 250 or so binding treaties that can be signed up to by Council of Europe Member States, and some can be signed up to by non-Council of Europe Member States. The most famous one in this area is probably the one on cybercrime prevention, the Budapest Convention, with countries from North America, Latin America and other regions that have signed this Convention.

So the Council of Europe is not only broader than the E.U., including countries that have never been E.U. members; it also has an outreach that goes even beyond Europe. That is not unimportant, for instance, now with the work on AI, where the Committee of Ministers has decided that the Council of Europe should, in addition to the specific soft law in several sectors – like the ethical charter on the use of AI in the judicial system, and the recommendation on the impact of algorithmic systems on human rights that we developed in a committee I have chaired – develop, in the so-called Committee on AI, what is for the time being called an appropriate legal instrument on the development, design and use of AI with regard to the Council of Europe standards on human rights, democracy and the rule of law, and I have been tasked to lead those negotiations.

The deadline is the end of November. We will develop this instrument, which is most probably – hopefully it is going to be announced still this month – a legally binding instrument, a treaty, a Convention. The question is: what is the connection to the European Union? As you know, all 27 members of the European Union are members of the Council of Europe and they have all ratified the Human Rights Convention, and we will take care that the two instruments are interoperable. We are working closely with the European Union, with the European Commission and with the Member States. The level will be fairly different: the E.U. is developing a regulation for a market, while we are developing principles derived from the Human Rights Convention and from basic democracy and rule of law principles, targeted to AI, which will then have to be implemented by Member States according to their own traditions.

Thank you.

>> VITTORIO BERTOLA: Thank you, Thomas.

Can we now have the slide.

On the European level, on the E.U. level, we have the Digital Services Act, the Digital Markets Act, we have the AI Act, we have the proposal on the scanning of messages, which is highly disputed currently, and we have lots of other acts that we will have a look at in a moment on the slide.

Since you are here online, Jutta Croll, what is your take on this wave of European legislation and on the activities of the UN and the Council of Europe? Do you think that they will regulate digital society in a good way? Do you think they are missing something? Do you think they may be over-regulating something? What is your take on it?

>> JUTTA CROLL: As you can see from my background, I’m a children’s rights advocate. First of all, I’m advocating that children’s rights, as they are laid down in the U.N. Convention and also in the European Charter on Human Rights, are respected in this regulation.

We have several regulations that directly address matters of child safety, for example, and we have several other regulatory attempts that somehow impinge on children’s rights. For example, the AI regulation needs to take into account that children are more or less a third of all Internet users worldwide, as we know from previous research. Children are a relevant group and therefore their rights need to be respected. And then we need a balance. You addressed the detection of communication, an instrument within the draft regulation on child sexual abuse material, and of course this is debatable because it affects the privacy of users – young users as well as users of all ages – while there is also a need for the protection of children. My intention would be to have a balanced approach and to avoid what I would call the collateral damage that may come along with a regulation, a new regulatory regime.

Sorry for my mispronunciation.

>> VITTORIO BERTOLA: Thank you. This is an important point.

When we talk about this scanning of messages, the scanning will take place on the device or at the provider, and what actually happens if there is a positive scanning result? We all know there may be a hit rate of just 99.9%. There will be false positives, and if the number of problematic messages is low, which is hopefully the case, it means that we will have more false positives than true positives. That is statistics. What does it mean when, for example, there is a hit on somebody’s device? What will be the next steps? Will the police arrive and do a search on the device? How can we imagine that this will work?
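
The statistical point can be made concrete with a small, purely illustrative Python sketch. The detection rates and the share of problematic messages below are assumed example numbers chosen for the sake of the argument, not figures from the session or from the draft regulation.

# Illustrative base-rate calculation with hypothetical numbers (not from the session).
# Even a scanner that is right 99.9% of the time produces mostly false positives
# when the share of genuinely problematic messages is very small.

true_positive_rate = 0.999    # assumed: flags 99.9% of problematic messages
false_positive_rate = 0.001   # assumed: wrongly flags 0.1% of harmless messages
prevalence = 0.0001           # assumed: 1 in 10,000 messages is problematic

total_messages = 1_000_000
problematic = total_messages * prevalence            # 100 messages
harmless = total_messages - problematic              # 999,900 messages

true_positives = problematic * true_positive_rate    # about 100 correct hits
false_positives = harmless * false_positive_rate     # about 1,000 false alarms

share_real = true_positives / (true_positives + false_positives)
print(f"Correct hits:                {true_positives:.0f}")
print(f"False alarms:                {false_positives:.0f}")
print(f"Share of hits that are real: {share_real:.1%}")   # roughly 9%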

>> JUTTA CROLL: Let me explain a little bit what is in that new regulation. It doesn’t start with the detection order for communication; it starts with a risk assessment by the platform provider: whether on the platform at all there is a risk of grooming of children and of the spreading of child sexual abuse material. That’s the first step.

Before we end up with the control mechanism that is called the detection order – this detection order is the last step in a row of three steps – that detection order can only be issued by a court or by a national authority that is authorised to take that step. It is not that we will have a scanning of email in a broad sense, of all email or chat communication. You also need to take into account that only if the circumstances of the checked communications point in the direction that there may be spreading of abuse material or a grooming process, only then are these three steps started. So just take caution: we should not think, “my God, everybody will have a look into my chat communication” – that will not be the case. That is not the result of that new regulation.

>> VITTORIO BERTOLA: Thank you.

I think we now can go to you for an overview on the overall E.U. regulation plans.

>> Sorry for the delay. Everything seems to be working fine.

First of all, who am I? I am co-chair of the Swiss IGF, but at the same time I am also a legal practitioner, focusing in particular on E.U. digital regulation. In this context, I see that the general perception in public, and particularly in the industry I am coming from, is GDPR. Everybody knows GDPR, GDPR, GDPR. Let’s remember that the GDPR came into being some eight, nine years ago. In the meantime, much has developed, but it is really only noticed by certain industries and maybe some journalists; in general, nobody was aware of what is happening, what evolution is taking place.

If I go on from this – I hope this will work now – we have an entire carpet of policies, regulations, rules (technical issues).

>> While the connection is being set up, maybe you have an opinion on the immense amount of regulation coming up. Is there something that you think is still missing, or something that would have been better left out? What is your opinion on this upcoming legislation?

>> VITTORIO BERTOLA: We have seen a change in the way Europe regulates. If you think of it, until three, four, five years ago we had basically lived for 20 years with mostly just the privacy law, and now indeed we have this wave of regulations building on that trend. The question is clearly: is this good, is it bad, or is it just adjusting to a change in the global economic and geopolitical climate?

I would say that we really need to examine the regulations one by one. Actually, some of these regulations, like the Digital Markets Act, which is basically the replacement for the directive, are aimed at preserving companies from liabilities, and the same goes for the Digital Services Act, by the way.

At the same time, they started to have more liabilities. As you have seen, there has been more advocacy for children’s rights by people who noticed how the Internet negatively affected certain groups. In the end, there seems to be a need to intervene. So the discussion now is actually where we draw the line between what is necessary to preserve the rights of everyone – rights that were sometimes harmed by an unlimited freedom of expression – and where we stand in front of rights that can’t be compromised. Are we ready to give up individual rights because of the need to protect other groups?

So, there are different approaches. Some of these regulations will be highly contested because, in the end, they touch on technical issues. Maybe the role for a community which is mostly technical – even if there are many policy people here, we are still technical policy people – maybe what we have to do is try to find a good way to help the policymakers draw the line in the right place.

Avoiding, for example, breaking what is necessary for privacy, while finding ways to deal with the problems that are sometimes hard to deal with – problems that maybe were not seen as problems by the community but that are there online, and those sorts of things.

Other regulations, like the ones on the markets, are basically just the result of the conclusion that the less regulated approach that Europe had to the market regulation of the Internet does not work.

In the end, in terms of growth and economic opportunities, the European Union is missing out on a lot. The products that we use as citizens and companies are not made in Europe. There is nothing bad about using things not made in Europe, but there is a collective interest in ensuring that we have an industry of a certain size that can provide these services no matter what happens at the global geopolitical level. This means the ability of Europe to be self-sufficient in case international trade, even in digital terms, stops. This is actually, I think, a good objective for regulation.

>> Thank you, Vittorio.

We now have Jack available. He can really introduce us to the universe – I have to say, it is a universe – of E.U. regulatory approaches to the digital world.

Please, Jack.

>> JACK: Okay. Next try.

Well, we were at the moment where I was saying that GDPR is known to everybody, but what an incredible development has happened since. If you look now at the slide, see how many policies and proposals have popped up. Actually, if you look closely at what is involved and what is in force, maybe 90% of it is not in force yet. What do we have here? We have a quite interesting attempt by the E.U. to somehow bring in sync what is technically feasible, what industry is actually doing and what could eventually be regulated.

At the same time, everybody knows that the legal track is so slow that it is impossible to have everything regulated in good time.

Basically, what this analysis shows is that the E.U. is building on a rough consensus, maybe as in the IGF, of what could be a policy that is not totally outlandish, and what is then found as rough consensus is cast into some text that is open to the public – quite a democratic approach.

When I look into industry, which I know well, and at what the compliance departments are doing: in fact, they just take any text they find – any text is better than nothing. So what we see as proposals and texts that are rough consensus in the political community is, for the industry, something really real. By industry I do not mean the tech industry, the big ones, who really have the power, the means and the will to challenge, but just average small and medium-sized companies that somehow have to grapple with these regulatory developments.

Not to forget also that much of it is just a further development of what has been there before.

Where does it come from? Just to build on what Thomas was saying before: it is not all invented by the E.U., of course. It builds heavily on earlier work of the OECD and of the Council of Europe. Just think of the GDPR: the GDPR is in a way the E.U. implementation of the Council of Europe data protection Convention 108. As we also heard from Thomas Schneider earlier, in AI it may be similar.

Now let’s have a last close look at which areas are actually affected.

Once again, quite interesting: we were starting with the GDPR, but if we take a close look, what is at stake now and what will be soon? Data, starting over here with the GDPR and personal data, but actually any data shall somehow be regulated – if not by hard law, then also by very efficient soft law. Any data – and there we have the famous Data Act.

Digital services, digital markets. On digital services, everybody thinks of the Digital Services Act, but digital services is of course much more. The eCommerce directive of 2000 was already about digital services; what we see here in the latest proposals on the E.U. side is not something totally new, but a further development in certain areas.

We have digital markets going into competition law, we have digital sovereignty, which was just proposed in February, and then we also have sectoral responses. I am just going to show you one part, I think it is at the other end. We have the E.U. Health Union, it is below – yeah, yeah. Yeah.

So we have the E.U. Health Union. The E.U. Health Union goes with the E.U. health data space, and the data space needs regulation – brand new from May. We see that sectoral approach as well. The GDPR and that kind of regulation is horizontal, but now it goes into depth. What do we take out of all of this? First, the E.U. mainly does further development. Second, it is courageous enough to put it on paper, and then even more courageous to threaten to make a law out of it. This is the point where we step in as Civil Society.

>> Thank you. This is a great overview.

>> MODERATOR: There is an intervention online: there are many requests to have access to this document and to ask whether it could be uploaded to the EuroDIG Wiki. There is great interest in this.

>> Okay. Thank you.

We have – maybe one last –

>> One last one.

One last word. I am glad to share this document. The good thing is, it has links in it, and if you click on one of the links, you are taken directly to the right document.

So it is very useful.

>> CHAIR: We look forward to receiving this document to spread it in the communities.

In the meantime, I invite Andrew to unmute himself; he had an intervention, and there is also another intervention.

>> Have I unmuted successfully?

>> You have.

>> Fantastic. Thank you for the presentations, I found them very interesting and look forward to that document. It looks fantastically useful.

It supports the point that there is a lot of activity on this regulatory front, legislative front, et cetera, and there is a lot of interaction between legislators and, particularly, Civil Society. And I think there was a comment that maybe some of the legislation cuts across standards. What we have, completely in isolation from all of that legislation and activity by Civil Society, is new Internet standards being developed almost exclusively by the tech community, with a complete lack of involvement of policy people, Civil Society, regulators and government. In my view the multistakeholder model is really important, and the standards development would work much better if there were multistakeholder involvement in the standards. Currently, for example, no Member States are represented in the IETF, to pick on one of the standards bodies. None of the governments go, none of the government agencies go. I think that’s a real issue.

While there is a separation between these activities, it is going to lead to problems where new standards break legislation and vice versa.

You can disagree with me, but I think that’s a problem that needs to be addressed.

Thank you.

>> MODERATOR: Thank you. Does anyone want to comment on that? I know that the E.U. is collecting a lot of comments on its planned legislation; still, of course, it is a legislative process and it is not the IGF process.

Maybe you want to comment on it.

>> JACK: Very briefly, repeating what I tried to convey before: what you actually read in the papers is about huge fines based on whatever applicable law.

Actually, the bulk of regulatory work currently is soft law – soft law and standards – and this is exactly where Civil Society can cut in and be even quicker than the E.U. I encourage everybody, in particular when it comes to standards, to develop standards and defend them.

>> MODERATOR: I see we have two questions here.

>> Thank you. It is interesting to hear the thoughts here.

I was thinking about a few of the points; I’ll take the last one there. The point about soft law, and the fact that companies are often responding not to the legislation that has already passed but trying to predict, to understand the landscape, is a really important one. It highlights the fact that with this historic deluge of legislative proposals we see in the E.U., that is all anyone is doing: predicting and understanding. The ability to predict and understand what is going on is so much more difficult and complex than in the past. I think that has some really unpredictable outcomes and side effects, most of them not good, because most of them are chilling effects: they shape what people are likely to do and the risks they are likely to take. That is not good for the industry, for the Internet, for evolution and for how the Internet is used.

I think the point – I don’t need to pick on this point, you mentioned –

>> MODERATOR: Could you please try to be shorter?

>> Sorry. Sorry, sure.

Well, linking back to that: the point that we should take them one by one and understand what each proposal is proposing and how it is doing it makes sense in one way, but in another sense, part of the problem is that we can’t take all of these one by one. They all reference each other – the DSA references another act, and that references another. Even as that agreement process is going on, it is not clear that all the interactions, and the way the different pieces of legislation affect each other and combine to affect the world, are clear to policymakers and to the stakeholders trying to engage.

I think that complexity, it is very dangerous for us at the moment. Thank you.

>> MODERATOR: Thank you. It was a comment.

We have an intervention. Please. You have the mic.

>> JUTTA CROLL: Thank you for giving me the floor again.

Thank you for the wonderful mapping of the landscape, it is very useful.

As you said before – I think it was mentioned by Chris also – we have the links in the map to the regulatory acts, and there are interlinkages between the different acts. For example, the GDPR in Article 8 references the definition of information society services that comes from the eCommerce directive. There are many interlinkages that can be beneficial, but they can also conflict with the intention of a regulatory act.

I think it would also be good if we could see in this landscape of regulatory acts that some have the power to override other regulatory acts, not only at the national level but also within this landscape. We have rules, here and there, that decide which regulation really takes precedence when certain issues are affected.

I think it is really useful to have a look at this, and for Civil Society it is of course a helpful instrument to better understand the regulatory regime at the European level. This also sends a signal around the world: we know from the GDPR that it, or parts of it, has been copied in several other countries, and we know that youth protection acts have been copied on the African continent. It is not only the E.U. we are talking about; we are talking about legislation at the international level as well.

Thank you.

>> MODERATOR: You want to comment on that?

>> JACK: A short comment on this.

Within all that I shared before, it is very difficult to find out what is interacting with what. What we know for sure is that in E.U. legislation there is always the distinction between primary law and secondary law – primary law being whatever is in the E.U. treaties and in the Charter – and in almost all of the acts we see somehow a reference to Article 8 of the Charter of Fundamental Rights of the European Union. Parallel to this, we have the European Convention on Human Rights, which also cuts in. So we really already have some distinction between what is prime over what.

When it is within the realm of secondary law, it really gets complicated. Human rights are really at the basis of almost everything.

>> MODERATOR: Thank you.

You wanted to comment on it as well?

>> VITTORIO BERTOLA: Very briefly, so that we can take the other question. I wanted to comment on several interventions, especially on what Chris and Andrew were saying, and I think also you.

It is true, it is sometimes very hard for Civil Society to interact with regulatory processes; maybe with the institutions it has been easier. On the other hand, there are other stakeholders who think the opposite. For example, some would say it is easier to interact with regulators and politicians than it has been to interact with the Internet community and its institutions, because when they showed up there – even at the IGF, even in the policy places – they would not be listened to, and sometimes they were not treated very well. This is the feeling from parts of other communities.

My advice to the multistakeholder community and the standardisation organisations is that they have to listen more actively to other views of the world; otherwise this will increase, and people will go to the regulators saying that the multistakeholder model is not working, that just a few stakeholders dominate and that they cannot get their voice heard.

>> MODERATOR: Thank you. We have another question from the floor.

>> I have a question for Jutta Croll. For child protection, there is a proposal for the screening of communications, but to safeguard our privacy, it will happen only after the detection order. Could you please elaborate more on how it is going to function? Will the order go to some communication platform to provide this screening of the communications of a person on that platform, or will the detection order be addressed exactly to that person, so that all of their communications will be screened on all platforms? How do you see it working? Thank you very much.

>> MODERATOR: Could you directly answer this question?

>> JUTTA CROLL: I will try to do that very briefly.

>> MODERATOR: Thank you.

>> JUTTA CROLL: So, in the draft regulation – bear in mind that it is only a draft so far and it will take about two years of debate, which is why it is very important that we start this debate right now – we don’t know what will come out of it.

But the process is: first of all, the platform itself is checked with the risk assessment that is done by the platform, and when the risk assessment comes to the result that there is a risk, for example of grooming, then the platform has to take measures against that. Those are the first two steps.

So we always start with the risk assessment and not with the detection order. If, even after taking these precautionary measures, there is still a risk of grooming, for example, then the next step would be a process of seeing whether there are certain communications that tend to be related to a grooming process. For example, when a person is spreading messages to a huge number of unknown accounts – to people the person was not connected to before – then that might be a signal that a grooming process is going on. That would be the next step: it is checked whether this really tends to give a signal of grooming processes.

You need to know that we have several steps before the court or authority can issue the detection order, which then addresses the communication on a certain channel and of a certain profile.

Does that make it clear to you?

>> Thank you very much. It is a bit clearer. Still, in modern reality we have so many communication channels – will it then be addressed to all of the communication channels the person has? I personally have at least ten platforms where I communicate with people. If one platform is screened, it doesn’t mean that I cannot do the same activities on other platforms.

>> JUTTA CROLL: No. No. No.

>> Maybe I misunderstood.

>> JUTTA CROLL: It is not based on the person, it is based first of all on the platform and the channel.

If there is evidence, reason to assume that there is grooming going on, on a certain channel on a certain platform, it is not that a person using this channel is then checked on all of their communication channels. That is not how it is proposed so far.

>> MODERATOR: Thank you.

Two minutes left, time for a closing statement.

Maybe: what is your guess, how many digital acts will have been passed this year and next year, and what will those digital acts be? What is your guess? Maybe you would like to start.

>> JACK: Well, a crystal ball question.

What is interesting is that in certain areas we have a big announcement that there is political agreement, but then it takes quite some time until these so-called political agreements are written out in full. So, bearing in mind that there might be plus or minus a year, we may have something on the digital markets really coming out. Not in force yet – that normally takes two years. On AI, I don’t think that we’ll come even close in the near future; it is just too controversial.

>> MODERATOR: And maybe the second question, do you think this will improve innovation or hinder innovation?

>> JACK: An interesting question, and a response to what was asked before. Well, I think there are two sides. On one side, of course, if one could just tinker away in a research department, that would of course be gorgeous, and that is presumably happening in the U.S. On the other hand, in a society like the European one, where there is still some sense of being pampered, consumers are much happier, more inclined to buy in to services, if they know that they are properly regulated. Through this lens, it will provide a platform for innovation – not that quickly, but probably more solidly, at least on the European continent.

>> MODERATOR: Thank you. Jutta Croll, your take on the two questions? Which acts will pass, and will this be good or bad for innovation?

>> JUTTA CROLL: I don’t think there is a simple answer to these two questions.

Definitely it is good to discuss regulation from the beginning, and not to bring in new issues and things that might not have been taken into account at the last step of the regulation. It is better to take all these issues from the beginning into these parliamentary processes that usually take two or three years, and as Jack said before, legislation is a slow instrument. Of course, we also need to look for quicker responses to upcoming issues.

>> MODERATOR: Thank you.

>> VITTORIO BERTOLA: Quickly: in the next two, three years we’ll see several. It is swinging like a pendulum; maybe in 20 years we’ll go back to deregulation and so on. It depends on geopolitical tension: if there is more war, there is more intervention by government.

On the other question, you have to tell one regulation from another. Regulating markets is very good for innovation, if we create opportunities for new services, new interoperable services and so on. Others may be bad: they introduce liabilities and responsibilities that are harder for a platform to deal with. It really depends.

>> MODERATOR: Thank you for this very interesting discussion. I think we will definitely have quite a different digital world when a bigger part of this regulation comes into force.

Thank you.

>> CHAIR: Thank you very much. Thank you very much for your moderation.

Of course, we would like to welcome you back at 3:45 for focus area 2, sub-topic 2, How to Enable Rapid Cybersecurity Standards Implementation for Impact. Please do go for a break, but do come back to join us for the next discussion.

(Break)