Impact of zero rating – WS 07 follow up 2016


10 June 2016 | 14:30-16:00
Programme overview 2016

Session teaser

This session aims to address the impact of zero rating (ZR) on stakeholders, including users.

Session description


Keywords

Competition, Consumer choice, Developing Countries, Business Models, Costs/Benefits, Walled Gardens, Opportunities, Innovation

Format

  • Agree on the key resources, formally identify them at the beginning of the session, and make sure that they can intervene individually
  • No panelists on stage
  • One main moderator in the audience

Further reading


People

  • Focal Point: Frederic Donck, Internet Society
  • Key participants


  • Moderator: Frederic Donck, Internet Society
  • Remote moderator: Valentina Pavel
  • Org team


  • Reporter: Konstantinos Komaitis, ISOC

Current discussion

See the discussion tab on the upper left side of this page.

Conference call. Schedules and minutes


Mailing list

Contact: ws7@eurodig.org

Messages

  • Is zero rating part of network neutrality, a business model, or both?
  • Is zero rating protecting the Internet as a system of innovation?
  • Does zero rating affect customer choice and experience?
  • Should lawmakers provide a general rule on zero rating? What role does competition play?
  • Discussions on zero rating should focus on principles, e.g. exclusive vs. non-exclusive arrangements.

Video record

See the video record on our YouTube channel

Transcript

Provided by: Caption First, Inc., P.O. Box 3066, Monument, CO 80132, Phone: +001-719-481-9835, www.captionfirst.com


This text is being provided in a rough draft format. Communication Access Realtime Translation (CART) is provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings.


>> MODERATOR: Let’s take another two or three minutes and wait for people to trickle in.

>> And hope we didn’t scare them away with the first session.

>> MODERATOR: No, I think everybody is still getting coffee and water.

>> MODERATOR: So #EuroDIGsec is the new trending hashtag to use.

>> If you’re going to tweet anything about this session, you can also do #EuroDIGsec. So your tweet will go right to this wall.

>> MODERATOR: This Twitter hashtag, because they stopped counting the @ addresses. These were impactful things on global society. I thought we

(Off microphone)

>> MODERATOR: Let’s just get started. Welcome, everybody. This is the second part of a two part discussion on cybersecurity. This morning we talked about current practices and issues with current practices, I would say. Fascinating panel. We went a little bit all over the map, I would say. We talked about various things. But today we want to look a little bit more forward and talk about trust. Specifically, the trust between stakeholders, what makes and breaks that trust. I think that trust of end users is critical to the success of a global Internet, a global open Internet will not work if users don’t trust that open Internet. They will walk away from it. They will demand measures that will in the end close that open Internet and put limits to innovation, limits to global reach, puts limits to the things that made the Internet a great medium for us all.

That’s my perspective. We have other perspectives around the table, I’m sure. And we’ve invited a number of great panelists to share their perspective; Louise Bennett is a consultant from the British Computer Society.

>> LOUISE BENNETT: The British Computer Society is in fact a professional association and a charity. So volunteers like myself work pro bono.

>> MODERATOR: Next to me is Mr. Rainer Stentzel, he deals with data protection, German Federal Ministry of Interior.

My colleague, Maarit Palovirta. She is a manager at the Internet Society.

Finally, Konstantinos Moulinos, ENISA, expert in network and information security.

>> MODERATOR: I’m co moderating today with Tatiana Tropina.

We have with us Mr. Kosta. He will be the remote moderator for today, and he will take the notes and make the report.

Something that is important is that we also have remote captioning. This morning I made the mistake of pointing to a gentleman in the back and saying that he is doing all the captioning, but in fact, there is somebody all the way in the U.S. who is listening to this and capturing it. Could you type your name into the captioning so that we can thank you?

>> CAPTIONER: Lesia Mervin in California.

>> MODERATOR: Lesia Mervin in California. Thank you for doing that work. It is highly appreciated.

So trust. Let me ask you a couple of questions, panel. Louise, when was your trust last violated?

>> You told me this started at ten o’clock. Seriously, I thought it was violated when the speaker from Google yesterday said security was at the top of their minds when designing the Internet of Things, because that is absolutely not my experience.

>> MODERATOR: Maarit, do you trust your partner with your passwords?

>> MAARIT PALOVIRTA: I have my passwords partially on a piece of paper and partially online in various places, because all in all I must have more than 50 of them, which is a challenge in terms of management.

>> MODERATOR: Okay. I did the count: 951 for myself, by the way. When we walk into your house, the game console is there, and there are a bunch of cameras. How many are there in your house? How many lenses with cameras?

>> KONSTANTINOS MOULINOS: Actually none. But I know the problem very well because I used to work in the data protection authority. Nowadays there are cameras everywhere, which, in my opinion and experience, I don’t think add a lot of trust. Perhaps there are less intrusive measures to achieve this.

>> MODERATOR: I see roughly 20 cameras here in the room, all in your laptops, probably not covered, and microphones. So you can be sure that

>> Who is covering the cameras?

>> MODERATOR: Rainer, I see you have an iPhone there. I know these things drain batteries. When is the last time that you put the charger cable in a foreign device, like an airplane seat?

>> RAINER STENTZEL: I can’t really remember when I did it last, but I don’t think that I really thought about it, because trust is a question of good and bad experiences, and I have to say, maybe I am the only lucky one in the room here, but I was not disappointed. There was no damage in the past which gave me a bad experience. So I would say my general rule of behavior is that I’m trustful and not suspicious. I’m trustful unless I have a bad experience.

>> MODERATOR: These were just some teaser questions. That is not the topic of the conversation. The topic of the conversation is going to be set by you all. What we’re going to do in the next 20 minutes or so is to have you all ask questions, share observations, and share ideas. And I want to focus it a little bit on this perspective: how do we make sure that users continue to trust the Internet that they interact with in the long term? What are the things that we need to consider going forward in order for users to trust the Internet? What can we do from our perspectives? That is, government, private sector, Civil Society: what are the things that we can do? That’s the question.

I want to ask you all in the next 20 minutes or so to provide that perspective, ask questions to our panelists, and then after that we will give the panelist 30 minutes to reflect on what happened in the room. Then we have ten minutes of more audience participation, so to speak, and then after that we will wrap up.

We might divert a little bit from this if a specific theme starts to develop. We might want to go around on that theme for a little while, but we’ll take that as we go.

So

>> TATIANA TROPINA: Those who were at the first panel know that I’m the person with the microphone, and if no one raises a hand, I will come to you anyway. Marko.

>> MARKO: Let me break the ice. Hello. Quickly, to jump back to what Lea said in her closing statement on the first panel: she said we have to start again. I’m not so sure we can. This is sort of an evolution. This is not something we can just walk away from and then try again. The Internet is there and we have to deal with what we have. So, how to increase trust: this is more about people than it is about technology. This is about humans, especially when you talk about trust. And my last comment: we’re very reactive. Everywhere we discuss cybersecurity, we’re always discussing the incidents. We’re always discussing finding the people who did something bad. I think we have to turn that into something more proactive. In an environment where we only highlight the incidents, how can we establish trust? What people on the street read in the newspaper are security incidents, hacks, data leakage. We should be more transparent about what we are doing to make this world a better place, showing how we respond to these incidents and make this a more secure environment, and doing it in such a way that the person on the street understands what we’re doing and understands that they, too, can play a role. They are part of the ecosystem and part of the Internet. They also have to play a role in making sure this becomes, and remains, a secure environment.

>> MODERATOR: So do you have a particular approach that you follow yourself that is positive? An example?

>> MARKO: One of the examples, I think, of showing what we do positively: in my work I spend a lot of time training cybercrime investigators in understanding the Internet and actually helping them to find those people. We also always encourage them to share the results of those investigations and show people that it really makes sense. It’s not all doom and gloom. We do manage to capture those people. Sometimes we do manage to bring to trial those who have stolen data or who have abused leaks in the system. We should continue to do that.

There is a way out. This is not something that we can’t solve, but everybody has a role to play in solving it.

>> So basically what you’re saying is to consider cybersecurity not as a result, but as a process, and to be more proactive, if I may sum up.

>> MARKO: Yep.

>> Yes, I also saw that Nigel was going to contribute and he was burning with impatience.

>> I talked about this in a brief session. I think the keyword would be openness. As I think Ian mentioned, we need to trust that nobody planted anything that we don’t know about in the products we use. And I would like to make a distinction on security. We have two parts of security: one is tools, and the other is procedures. For the tools, basically the IETF did a good job. I trust the tools, most of the tools, because they are open source and because they are built on open protocols. Nigel mentioned that we discussed this in different places. I think we always talk about procedures, actually policies. I think that the policies need to be open in a way that each actor affected by the policy is included. In that case, nobody can tell the difference. Openness would be the keyword.

>> TATIANA TROPINA: This is something Konstantinos, who works on security, can come back to later. I would have asked you another question: when was the last time you got grilled? But you might answer that later.

>> Good evening. My name is Angelo. I began to have an interest in cybersecurity in 2008, when I made a Facebook account that in three weeks gathered around 3,000 citizens from all around Europe, and suddenly Facebook closed the account without any reason for four years. So I investigated the issue. I researched how the whole thing works from the engineering point of view, and I began speaking in the European Commission, in various workshops, about what we could do. On the legal aspect, because I researched the legal aspect too, there was no conclusion of any kind, because the big brother in the U.S.A. owns the Net; the justice system had only some results in similar situations, which for the citizen is a matter of an investment of time and money that most citizens don’t have.

So we spoke with a German professor in Berlin who was very good at that time on the legislative issues. Again, we didn’t find any kind of result. So, as Europeans, we don’t have trust. All the citizens, we don’t have trust, but we can build trust, as we are suggesting, by making the Internet our Internet, with which the user can interact and which can be bridged with the Internet of today: our Internet, instituted by the universities and the SMEs, whereas the Big Brother has it under the military, under cover of a nonprofit organisation. We have to keep in mind that our Digital Economy is not going to develop if we don’t have our own Internet. The officers, according to our suggestion to the agency which was instituted five years ago, I think, created special agents for each territory to look over these issues.

>> MODERATOR: Can you get to the summary of the point, please.

>> If we are speaking for Europe, we have to construct our own Internet, our own Digital Economy, which can be competitive and also collaborative with other territorial zones and alliances. Thank you.

>> TATIANA TROPINA: Nigel.


>> NIGEL HICKSON: Nigel Hickson, ICANN. Thanks to our chair. Lea encapsulated something at the end when she said should we reboot, start again on cybersecurity? And I think there is a serious point here which would be good to get the panel’s reaction to in terms of trust; because I think trust has to be earned. Once trust is lost in institutions or in other entities, then it’s very difficult for those same entities to get it back unless there is a fundamental change.

>> MODERATOR: I think that was what the previous speaker was sort of alluding to.

>> NIGEL HICKSON: As we discussed this morning, trust has been lost in government and government institutions in terms of cybersecurity. Where do we go from there? Thanks.

>> TATIANA TROPINA: Any more questions?

>> I think if we are to reboot, which, like Marko, I don’t think is really possible because we have developed too far for that: if we look at cybersecurity, as was mentioned before, it’s so broad. What are we discussing? If it’s national security, you’re talking to a completely different set of people from a government than if you talk about digital rights or privacy rights. In other words, find the people that you want to discuss the specific topic with. If you do that, make sure that they come to a panel and discuss it with you. And, this is my personal experience and I’m going to reiterate it, it takes three meetings before something changes. In the first two, they’re only looking at each other: this is a very odd person sitting next to me and it’s a bit scary, but there’s another one there, so better sit as still as possible.

And a third meeting, and that is really what I’ve seen happening time and again, we have common problems. And then they start discussing solutions and start listening to each other. But you have to invest and have to have somebody who is dedicated to start that process up and continue that process. If that person’s not there, nobody’s going to show up a second time.

>> MODERATOR: I hear what you’re saying. What I’m trying to do here is focus on user trust in the Internet. So not trust between institutions, although we might meander there, but I wanted to start off with that. So, getting various institutions to the table and having them create a solution for things that bother us collectively: does that necessarily create trust in the ordinary user? Will that make my mother more confident to go on the Internet and do her business there?

>> To answer you, I think that it would if you identified the right things to discuss, because in the end, the institutions can solve the problem and not the individual end user. So you have to address the concerns, but with the right people at the table, which includes Civil Society, which will probably represent your mother in one way or another.

>> LOUISE BENNETT: I came to the ICANN board in October. I want to share an observation that I have made, since I come from a completely different background. Everyone in here is part of the Internet elite, by the fact that you’re here. I think the thing we have to think about is our mother who wants to go safely on the Internet. Most people don’t care how it’s done. It’s like with food. You buy it in the supermarket; you expect it to be safe. You go onto the Internet; you expect it to be safe. You want someone else to take care of it.

I think most people expect governments to be the ones to do that. So the question is: how can we help the governments provide that? How can we deliver that? And is it even possible for governments to do this? Governments have a huge responsibility, and they’re under serious pressure because they’re facing all kinds of threats and trying to find solutions to serious issues: terrorism, pedophilia, all these things happening. At the same time they lack the technical skills to actually find the solutions, and sometimes they will take a bulldozer to kill a mosquito because they don’t know how to get the mosquito. One of the challenges is how we get those different planets together to make it work.

>> TATIANA TROPINA: This is a very valid intervention. When we were planning the session, we had to shape it from the proposals we got. And most of the proposals were actually concentrating not on users’ trust, this perspective here and now. Most of the proposals concentrated on: how do we trust governments, how much intervention from governments do we need, how do we build trust between stakeholders. Of course, when we are talking about trust, there are these dichotomies. It’s sometimes like a chicken and egg debate. We don’t know what comes first: governments, users, or companies. This is a very valid point to consider in this discussion.

>> MARKO: So I have to respond. I was kind of letting go the point about technicians being scary. You raise a very valid point, Louise, that a lot of people think it’s scary. “Do you know your system administrator?”, as was just tweeted. You used to know your butcher and your baker, and we kind of moved it all to the supermarkets. Then we found out it wasn’t beef we were eating, and at that point you see government step in and regulate and monitor and measure. So an open question to the panel is: what can we do as an industry to make people trust us as technicians, and trust technology, enough that we don’t end up in the very regulated world that we now see in the food industry?

>> I give you the response we got yesterday from the very good judge of the Belgian state, who told us: I can make any kind of decision as a judge, but I cannot implement it.

>> That would be another question for the panel or maybe for the floor. Even if we have governments intervene, what about enforcement? What about actual implementation of what governments can develop? I’m sorry

>> MODERATOR: So to capture the flow of the conversation now: going from user trust, I think we sort of arrived at the place where we say the governments are a conduit for that user trust to be established; they can make policies, but they are not in a position to implement them. We’re now at the question: how do we make sure that things that are being proposed can be implemented? Is that the right captioning?

>> I think this is the right captioning. And how many interventions can we take?

>> MODERATOR: We have four minutes, 53, 52, 51.

>> I work for the government, so for once I can speak for it; there are not so many representatives from the governments here. I work for Belgium, for the Ministry of Economy. I’m a member of the GAC as well. I’m not responding directly to your question, but I appreciate the last intervention because, as you know, there is still a lack of trust. It seems to have been the debate since this morning: I have the impression there is a lack of trust between Civil Society and the governments. I don’t understand this in Europe, because I think we are living in a beautiful world with democracy. I appreciate your intervention because it’s really necessary for us.

We are not specialists. Of course, we have our own specialists, but it’s really important to work together. I think that we have the same aims. In Belgium we launched the cyber coalition, a private programme, to try to find some solutions. We have to work in a European context. You spoke about laws and enforcement.

In Belgium we have our own laws, but we have to work in the European context. Someone spoke this morning about the directive. It’s very important, and now we have some coordination; the laws have to be European laws, otherwise it doesn’t work. Because, as someone else mentioned this morning, the attacks come from other parts of the world. It’s very important to have this coordination.

This morning I was really frustrated to hear about so much lack of trust between the government and Civil Society, so I was very happy to hear this.

>> MODERATOR: It’s an issue to address in the panel. I saw a hand over there.

>> TATIANA TROPINA: I saw two hands over there.

>> MODERATOR: We need to close it off because we have two minutes, 28, 27.

>> My name is (?) I’m a Dutch youngster. My question is: how can we find the ideal compromise between all the different interests of the stakeholders: governments, Civil Society, and companies? How can we do that?

>> I’m (?) I used to work in the European Commission. I’ve been doing this sort of thing for a long, long time. I’m an old European, not a young Dutch guy.

(Laughter)

Two things. I’m one of these people who has been here a long time, and I should have been comfortable, but I have two problems these days. One is surveillance by websites. I just checked. It says here, “161 websites stored cookies or other data.” This is one browser, and I use two browsers to stop total surveillance. That’s probably since yesterday; I think I cleared my cookies, or most of them, yesterday. I have not been to 161 websites to my knowledge. When I go to a newspaper, suddenly I’m going elsewhere. So there is a scale of surveillance there which I feel uncomfortable with.

The other side is the government side. With the NSA, in Brussels at the end of the 1980s, it has been part of my day job to be under surveillance. But I feel now that a lot of people think that the governments are the enemies, and that makes me feel uncomfortable. We all live in democracies here, yet somehow we think the governments are now the enemy. I do know that when Nigel and I worked on the OECD guidelines on security, we all chatted and reached a consensus. At the last minute the governments came in and said: when it comes to national security, the gloves are off. And somehow we still have this idea that the military have the Geneva Convention, but when it comes to the cyberworld, there are no rules.

I’d like there to be clearer rules so that I could feel more comfortable and not be in conflict with my democratically elected governments. So I want to work in both areas. I want to work on the commercial side of the Internet and make it clearer what I can trust and what I can’t trust. A lot of it has to do with surveillance, not with cyber incidents. And I want to work with governments so I know that I’m not an enemy.

>> TATIANA TROPINA: A very short one, and we came back to user trust again.

>> MODERATOR: We came back to user trust again.

>> It’s about the industry, because we are not having any kind (?) We are discussing here as Europeans, and it’s not good for our alliance either, because we don’t have competition. Google, Facebook, Twitter, whatever: none of them is European, and (?) Yesterday it was mentioned that the judges and the justice system cannot implement; the military is outside; our satellites are selling our services to the terrorists, because ISIS cannot have any kind of presence on the Internet without our permission. How is this the democratic element which we are fighting for? We are fighting by being hungry, by being on the streets, by working very intensely on the democratic element.

Thank you very much.

>> MODERATOR: Over to the panel. We had a full circle, I think: from user trust, to governments as the conduit, to industry as the implementer, to user trust in the government, not only in terms of their processes, which is one aspect that we discussed a little this morning too, but also the impact of state surveillance, and industry surveillance, another aspect of trust. Those have clear impacts in a complex world. But we have, I think, a number of perspectives at the table. So, who wants to go first? Go ahead.

>> RAINER STENTZEL: Just listening to this discussion, also in the morning, I would like to go a step back and ask myself: is it really a social fact that we have to rebuild trust? This is a topic, a narrative. If you read, for example, proposals from the Commission or other lawmakers, most of them are for the sake of rebuilding trust. Data protection: rebuilding trust. Cybersecurity: rebuilding trust. Even the protection of organic food: rebuilding trust. And I wonder whether there really is an empirical study on whether consumers don’t buy organic food because they think it’s not produced in the right way. On the other hand, you have a situation where you see the Internet is growing, the services are developing, and my impression is not that people don’t use the Internet because they feel that there is a lack of trust. Even if there is a lack of trust, the question is whether there is an impact. We had a situation in Germany a few years ago where we had a big social network, a German social network. And I think it’s not a surprise if we say that its data protection rules were better than the terms and conditions of Facebook.

Facebook was only the second social network in Germany. Everyone was complaining about Facebook and its data protection policy. It took only one year, and they were the biggest one in Germany; although there was no trust, people used it. The other social network doesn’t play any role at all at the moment. So I wonder, what is this discussion about trust? Is that something

>> MODERATOR: Is that something that Louise just said? That we’re here as the elite? I’m just rephrasing what you said. Is this because we are the elite and we look at this from a completely different perspective than most users?

>> RAINER STENTZEL: Let me finish my sentence. My impression is that you always have some kind of general mood for policymaking. And I think the easiest illustration is if you go to the States: the big issue is always hope or fear. It’s always the same. It’s hope or fear. Obama was hope. Bush was fear. It always can change. And what is interesting in the whole multi stakeholder process, in the whole discussion: my impression is that when I came to the first IGFs and EuroDIGs, there was much more hope. It was about the sake of the Internet and so on. Now we have a discussion that is driven by fear: fear, lack of trust, we have to regulate in a certain way, and we don’t trust governments. But this morning I didn’t hear any concrete point about what we have to do. What is the point about cybersecurity? What do we have to do? It’s more a general “we don’t feel good.” I wonder if this is a very productive approach to the multi stakeholder process and to getting some issues done, if in the end you have to draft some law and find some decisions.

>> MODERATOR: So I’m going to do a shameless plug. We have this collaborative security approach. One of the things that I’m trying to test here is whether what we’re trying to do, what I hear, fits in this approach. One of the principles that we try to apply is fostering confidence and protecting opportunities. That is, I think, the realisation that the Internet only just works: people always find the right balance between risk, and it just continues working. Anyway, that also resonated with what Marko said at the beginning, by the way, a positive approach. So I think that’s a good takeaway.

>> LOUISE BENNETT: I think trust is important, and that value exchanges, whoever they’re with, should be fair and equitable. And I agree with what was said about the real world: if you know people, you know whether you trust them or not. I think the absolute key thing on the Internet is to be confident about the identity of the person or organisation that you’re actually interacting with. So I actually think the key to trust on the Internet is identity management and identity assurance. And, a shameless plug: for anyone who wants it, I have a thing here which we’ve just published on how to recognize a good online identity scheme. In Europe, we have a whole lot of identity schemes of different types in the different countries. Most of the countries in Europe base their identity schemes on ID cards. In the UK, ID cards are anathema. We do not like ID cards. The last government but one tried to introduce them, with an enormous database they tried to produce, and it was a complete failure. Now we have a scheme that went live last month, because it had to go live according to European law by the first of April, which is a federated identity scheme. So we have private sector identity providers, eight of them, to verify your identity to a certain level, not the very highest level, but a reasonable level, to interact with government. The hope is that it will be taken up in other places.

In some of the Scandinavian countries, very sensibly, I think, they use the bank ID schemes, the credit cards. Although we all say we hate banks and we don’t trust them after the banking crisis, actually, most of us trust using our credit cards to do interactions on the Internet, to buy things. One of the reasons we trust them is because we see the little lock and we think, oh, that’s safe. They may give you a dongle and you feel comfortable with that. We’re comfortable because they accept liability for what we’re doing. If something goes wrong, the credit card company will pick up the tab. We aren’t losing money.

Now, these are all important things, and I actually think there is a very positive way forward; I’m very positive about these things. We need to federate all these different types of identity schemes, ones that people in different cultures and different organisations are happy with, happy to use in different situations. As long as we have open standards for them to interact, then we can have a successful, trustworthy Internet, where you can buy things and feel confident, sell things and feel confident. You can interact with your government and feel confident. Very importantly, I think we’re missing one thing. I think Alibaba has a very good eCommerce strategy. You have Alipay, and it stands in escrow between the seller and the buyer. You only get the goods once Alipay is holding the money. And I think escrow, a trusted third party, is a very important element of developing trust on the Internet, giving everyone confidence that they will get things. So I would put that forward as a way forward.

>> MODERATOR: I think I hear what you're saying. The model of federation is much related to being able to spread trust and create a mesh of trust. The assertions being made about identities are standardized, and I believe there is new work on that. So I fully subscribe. Of course, it's not a magic bullet, but it's an important foundational technology that you're talking about, yes.
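
The standardized identity assertions mentioned here (in practice, protocols such as SAML or OpenID Connect) rest on one core idea: an identity provider signs a statement about a user, and a relying party verifies the signature instead of re-verifying the user itself. A toy sketch of that idea, using a shared HMAC key purely for illustration (real federations use public-key signatures and full protocol flows):

```python
import hashlib
import hmac
import json

# Illustrative shared secret between identity provider and relying party.
SHARED_KEY = b"demo-key-not-for-production"

def issue_assertion(subject: str, level: str) -> dict:
    """Identity provider signs a claim about a user (toy example)."""
    body = json.dumps({"sub": subject, "loa": level}, sort_keys=True)
    sig = hmac.new(SHARED_KEY, body.encode(), hashlib.sha256).hexdigest()
    return {"body": body, "sig": sig}

def verify_assertion(assertion: dict) -> bool:
    """Relying party checks the signature instead of re-identifying the user."""
    expected = hmac.new(SHARED_KEY, assertion["body"].encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, assertion["sig"])
```

Any tampering with the assertion body invalidates the signature, which is what lets different schemes federate: each party only needs to trust the issuing provider's key, not every counterparty.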

>> TATIANA TROPINA: I would like to frame the question within our previous debate about governments, companies, and users. It sounds very appealing to have these standards, but who is actually going to impose them? It's voluntary, right? Am I understanding right? Or do you think governments have some role to play here, and how far should they go if they have to intervene? So voluntary, co-regulatory, or pure intervention? What is the answer?

>> LOUISE BENNETT: I think you have to have open interoperability standards, so that schemes can interoperate, including between governments. Government has a role, but it can't force its citizens to do X, Y, or Z. You need to understand some very complicated things to choose. We're not saying there is a single right answer, but these are all the things you should think about, and this is why you should think about them. And I think that's the way forward.

>> MODERATOR: I see people in the audience raising hands, but I’m not going to go there. We’re going to take 20 minutes with the panel now and then come back to the audience.

>> TATIANA TROPINA: Unless there is some pressing issue.

>> MODERATOR: Only very pressing issues. There is a very pressing issue.

>> Sorry to interrupt, but I will be very short. You have a European regulation on eID. So we have open standards; we are working on them, and have been since last year, I believe. We have trust services, so we think it's in European law. I don't understand the point, actually.

>> TATIANA TROPINA: Was it your intervention?

>> KONSTANTINOS MOULINOS: Since July of 2015, and now coming into force on the first of July 2016, we have the electronic identification and trust services regulation, which more or less sets the mandatory framework around the regulation of electronic IDs. And I think what Louise said makes absolute sense, but now, apart from the federation schemes, we have a framework around identities so that they operate properly, in the same manner, towards the security and trust of the citizen.

>> TATIANA TROPINA: So in your opinion there is a need for regulation? Apparently you have no problem supporting this framework.

>> KONSTANTINOS MOULINOS: In the European Commission's opinion. Usually the European Commission intervenes whenever the market doesn't work.

>> LOUISE BENNETT: There is the trust services regulation. That's why the UK has had to rush in a scheme, and we've been taking a long time about it, and say that it's live, which it kind of is, to meet that regulation, because we're the only country in the EU that doesn't have an ID card.

So we've done it in a different way to the rest of the EU. It is for cross border signature of contracts and so on. However, there are big problems for people in the UK with those two regulations, because they're based on the premise of Napoleonic law, and it's very hard to comply with them under our rules of precedent. There are enormous problems in the UK with those regulations.

>> MODERATOR: Other reflections on what we talked about? I was sort of going through the panel asking for reflections on that arc that we went through.

>> KONSTANTINOS MOULINOS: First of all, I have so many thoughts I don't know where to start. We were discussing the matter with other experts, with multistakeholders, and so on and so forth. One observation is that, if you notice, in all the cybersecurity regulations the obligations are set at the end of the value chain. When we talk about security in the telecom sector, the obligations go to the telecom providers. When we talk now, with the directive, about security obligations, we go to the service providers. With the electronic IDs, they go to the ISPs. There are no obligations for those who provide the providers and the end users with equipment or software.

You know, the bottom line is that your computer or your mobile phone is your interface to the Internet. You access the Internet with these devices. Yet nobody sets rules or security requirements for how this interface should be built as regards security.

We started discussing and debating this a little bit with different stakeholders. We have to realize that it's not easy to put these requirements in place, and not only for lobbying reasons: the most important thing is that in a multiconnected, multinational market we don't know where to set the barrier. For example, when we talk today about a vendor, it's not one vendor. The vendor has other vendors. Some components are produced at the North Pole, others at the South Pole. So where does the liability lie? And this makes it very difficult to decide on an intervention in order to improve the situation. Because, as already said, it's true that there is no obligation for those who produce the devices and for those who produce the software. How can you push them to follow better practices? One might say, okay, let's put an obligation on a (?) superhuman, for example, to require from utilities, from end users, some security requirements, and these security requirements should be followed by the vendors. But, again, there are many, many difficulties in creating a common list of security requirements.

>> MODERATOR: A very funny mistype, there was a requirement for a superhuman.

(Laughter)

>> KONSTANTINOS MOULINOS: In order to set a meaningful set of requirements, you have to be very, very detailed, and this means that eventually the requirements would have to be set on an ad hoc basis. Another idea was certification. Okay, let's adopt a certification scheme for Europe, which doesn't exist, for example. This is wishful thinking, but we face numerous problems when it comes to that. Let's go to the solution of testing. Yes, nice. Compromise. Somebody spoke before about compromise, which in my opinion is the key. But testing against what? Is there any European accepted set of requirements that we can test our devices or software against? No.

So this is only a set of thoughts, as I said before.
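
The testing idea raised here, checking devices against a common list of security requirements, could look something like the sketch below. The requirement list is entirely hypothetical; as the speaker notes, no such agreed European set exists.

```python
# Hypothetical baseline: a minimal common list of security requirements
# of the kind discussed. The names and wording are invented for
# illustration, not taken from any real certification scheme.
BASELINE = {
    "no_default_password": "Must not ship with a universal default password",
    "signed_updates":      "Firmware updates must be cryptographically signed",
    "update_mechanism":    "Must remain updatable after sale",
    "encrypted_transport": "Must encrypt data in transit",
}

def check_device(declared: dict) -> list:
    """Return the ids of baseline requirements the device fails to declare."""
    return [req for req in BASELINE if not declared.get(req, False)]
```

A vendor (or a testing lab) would feed in what a device declares and get back the gaps, which is exactly the kind of shared yardstick the speaker says is missing.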

>> MODERATOR: So you're pointing specifically towards a minimum set of security parameters to be implemented in products. That's sort of the minimum amount of security you're thinking of.

>> KONSTANTINOS MOULINOS: Yes. How can you strengthen end user trust? These are thoughts; I don't have the solution. If there are thoughts on how we can set the barrier, where we stop, at which point of the supply chain we can place the liability, I am more than happy to hear them.

>> TATIANA TROPINA: I think we have one pressing intervention; Marko was raising his hand. We want to connect it to what we were talking about during the first session: you can have a set of rules, but how do you actually enforce it in a decentralized global market? Is your intervention pressing?

(Off microphone)

>> MODERATOR: We have seven minutes left in this block of panel interventions, so I want to give the floor to Maarit.

>> MAARIT PALOVIRTA: I will go back to the roots of user trust in the more institutional setting. And I was thinking, Louise, you asked when was the last time you lost trust, I think you meant on the Internet, and there are a couple of occasions I can think of. One thing that really ticks me off is when a social media platform, like Facebook or Microsoft Hotmail, asks for your phone number and it's impossible to log in unless you give it. I just log out, because I don't want to give them my phone number. And I would say that this kind of boils down to the control of your data. So from my perspective, one of the major components of user trust is control of your data and how much of that you can exercise.

The other example happened to me recently, well, not so recently, in my previous job: an update came onto my laptop. You know how it is, we are working, working, something pops up, I clicked it off. Then all these neon colors started running around my laptop screen. Thankfully it was my work laptop. That I would maybe classify under user responsibility as well. So maybe in this case it was an SME that didn't have a sufficient security system, but maybe I could also have been a bit more diligent with my clicking, etc.

So, going a bit into the control of data. Companies, and also governments through surveillance, are collecting, sharing, processing, and storing our data. The first thing, as we were saying in our discussion, is the data economy vis à vis the large Internet giants, if you like. Where is the trust relationship there? Are they selling my data, etc.? A lot of paranoia there.

The second point is about the governments. Is it necessary for governments to have my data, etc.? And if we don't link it to national security, I'm personally fairly comfortable with the European framework, so data protection, etc. There are still some gaps, namely data storage; there is some legislation that remains to be reviewed. But I think that is going pretty well.

The second part is then the private sector responsibility. And I think there should be some progress on how we can manage this data dichotomy a bit better.

As to the infrastructure, a regular user doesn't have the means to judge or assess whether the infrastructure of their ISP is good or not, or whether the vendor equipment it uses is solid or not. So it really comes down to the openness of the processes that they use vis à vis your data.

So I think I will end there. I wanted to mention the trust labels. This is something that I know has been discussed at many European Commission events on certification, especially vis à vis the Internet of Things, because the worry there is, when you start plugging in your devices, how does a consumer know this device is not going to ruin my Wi-Fi network at home, etc.? So I think this discussion is quite an interesting one. The kind of heavy certifications that we see in other areas and other sectors may not necessarily work in the area of the Internet, whereas a trust label for a device might be something that supports consumer choice, a kind of voluntary scheme if you like. So I will park it there for now.

>> MODERATOR: Thank you. Could you do us a favor and give us a sort of overview of what you as a reporter have heard so far?

>> TATIANA TROPINA: As usual, the problem is you’re going to do it in three or four bullet points.

>> I will certainly do my best. Thank you for that. We started off by establishing that we cannot start over again technically; we build upon what we have. When we talk about trust, we talk about human beings. That was essential to establish as well, I think. We need to be more proactive than just talking about incidents, and if we want to make the world a better place and have the average citizen understand this and engage them, that's something we need to work on. If we do that, a number of people said, they trust that we can solve issues, but everyone is part of it. Everyone has a role and a responsibility.

Interestingly, someone here raised the question, talking amongst each other here: aren't we an elite for thinking of the average citizen? Most people don't even care, or are not even aware. They expect their governments, and I take that as trust in the governments, to take care of it for them. But then, if a government makes a decision, or public officials, or even a judge, how do you technically implement and enforce the decisions that have been made?

Someone else said that more and more people tend to feel that comfort is the enemy. So how does that work in our democratic societies?

Trust, I think, was the main thing we talked about. Someone even said: do we actually need to rebuild trust? Is there a real problem? A couple of years ago we were very hopeful and the tone of the conversation was hopeful, and now it's dominated by a sense of fear.

I think an interesting point was made with regard to trust if you think about the identity aspect of it; a couple of the developments there were discussed, and also the regulation established at EU level. That should help there.

Finally, at least from my perspective, an important element with regard to user trust is the ability to control your own data. So how much control do you actually have, especially when using certain social media platforms? Where is the trust relation there?

>> MODERATOR: So, yeah, that's good to hear. So a couple of trends. I heard a technical trend: what is the infrastructure we need so that we can build upon it, the technical transition set in motion but not complete yet. I heard a policy trend. And also distrust between the entities and of the outcomes of the processes. Those are the three branches I heard here.

We have ten minutes now for you to respond to all of this.

>> TATIANA TROPINA: I count 3 people.

>> MODERATOR: Go ahead. I'm trying to get us to focus a little bit; I'm sort of fishing for that focus. I think it would benefit us if we could walk away with a conclusion.

>> TATIANA TROPINA: I would make a small amendment to these regulatory trends, to the policy trend.

>> I should keep an eye on my time. Here we go. Very quickly: Louise triggered it a bit, followed up by the comments, so I'm focusing on the regulatory trend. Yes, it's good to look at the supply chain and the responsibilities there, but I would also say look at the complete value chain, and that includes the end user. Yes, my credit card has some security, and when things go wrong I am covered, though I'm pretty sure I pay for it myself, because it's certainly not coming from their profit margin, and only when I'm not doing anything stupid. You have the wonderful example with Alipay and escrow. The industry is building a lot of tools and things that can help users to secure their environment and be more secure, but it's ultimately the user that has to use them. To draw an analogy that we can apply to IoT: yes, the car manufacturers didn't put seat belts in at first, but you also have to fault the end user for not wearing them. So I would really love to hear: what is the end user's responsibility when you introduce liability for the suppliers? Otherwise somebody else is always liable when users do something wrong.

>> TATIANA TROPINA: Sorry, just a question, and a question for the moderator too: do you think we have to regulate end user responsibility? Yes or no?

>> MARKO: No. I think if you regulate the industry, you should also take into account end user responsibility.

>> We have been discussing the private sector and all the rights of the consumer, but we tend to forget that the consumer also has some obligations. As my CEO said this morning, or on the first day of the EuroDIG meeting in the plenary, like the Spiderman quote: with great power comes great responsibility. The Googles and the Facebooks of these days, they do know their power, but they also know their obligations, including with the new GDPR; they know what's coming their way. Believe me, the private sector knows there are risks involved, and they will not just use your data for things they are not allowed to. So I'm a bit surprised, but also annoyed, when I hear people say, "and then they ask me my telephone number and I don't want to give them my telephone number." Well, it's a security measure, and they will not use it for other purposes; it's for cases where your account gets blocked, etc. But I want to take it to the domain. I'm the legal manager, and we run the WHOIS for .eu; I'm now talking about domain names only, the Internet is much broader than that. When we take it to domain names, I can say 80 to 90% of the complaints we get about abuse on the Internet relate to domain names that have inaccurate data or fake data. So for me a big concern is the privacy services, the proxy services, and even the obfuscating services, which take the data from the customer and do some weird things with it, and it ends up in our registration database and we can't do anything with it because it's just bad data. When we have to tell the public who is behind a domain name, nobody knows who is behind the domain name. That's a big trust element for the industry. So, on allowing people to hide on the Internet: I'm not saying that we need to give up privacy, because the way we do it, when you give us the registration data, which is your correct personal data, we only show your email address and your language.
So people are contactable via our WHOIS on the domain name. You can have a Gmail or Hotmail address, but you have to be contactable. We will not show your private data, but we know who you are, so in case we need to give it to law enforcement, etc., we can use the data for those purposes. But we will not show it to the public. People always say about cybersecurity and privacy that there must be a balance. The balance is already there, but it needs to be made clear that individuals who are online, all consumers, also have, from that perspective, some obligations and responsibilities.

>> MODERATOR: Just to throw in my own perspective here: that general approach of feeling responsibility for the public commons that the Internet is resonates a lot with me. I always say: when you're on the Internet, you're part of the Internet. And that means you have a responsibility to keep it clean.

As an end user, that means: install a virus scanner, please, and update your software. And I think that also extends to the responsibility of paying ten cents more for something that has a little bit higher security standard. Anyway, I'm digressing; I'm on my own soapbox now. We have a few interventions.

>> STACY WALSH: I'm going to go back a step, because I have a comment about an earlier topic: the interplay of governments and citizens and organisations, where the trust lies, and maybe not trusting governments or organisations anymore. And I agreed a lot with what Maarit was saying. We can't say we definitively do or don't trust governments or organisations, because what we're doing is really getting into the weeds of cybersecurity. If we don't know exactly what kind of "cybersecurity", air quotes, we're talking about, our trust relationship with that entity will change. So right now, when it comes to data protection, we are trusting and looking to our governments to regulate it, to put in some kind of structure, as the new GDPR is going in. And that dates back to when governments first started digitizing data in the 1960s; that's when you got your first data protection laws in government. So that has a long history.

Data protection rests within that kind of trust element, whereas national security has, for almost the whole history of the world, been a very private, closed conversation within government. A shift is happening, but governments are not opening up in the same way they did when they put in data protection laws. It's more about thinking about what we're talking about and finding the right resource to open up that conversation, rather than saying I do or don't trust somebody. Because in Apple versus the CIA, I trusted Apple, because the CIA isn't being open with me, so why be open with them?

>> MODERATOR: We have one or two minutes.

>> TATIANA TROPINA: I saw a hand over there.

>> Thank you. I already mentioned in the morning the network information security directive, the general data protection regulation, and the directive on data protection for law enforcement purposes. I think there are quite a lot of officials who have been working on rebuilding trust, so to speak. Something like 50% of people don't trust the Internet; that's why they don't use it. So there is a huge economic opportunity if you can build trust, and "trust" is used all the time; in the network information security directive you hear it 28 times. But the Austrian government acts on a legal basis, so everything they have to do, they do on legal grounds. The question is: do I trust the entities and private companies, and what they do? Trust also has to do with my skills. Do I know, am I aware of, what they're doing? What happens when I'm booking a flight online, which now takes 15 minutes? Twenty years ago it took one month to book a flight; you had to go to the travel agent, you had to pay cash, you had to give your postal address so you could receive the copy.

>> MODERATOR: Isn’t the Internet beautiful.

>> Exactly, but we learned to read and to write and to count in order to interact with each other and also with government. We don't learn in schools how to interact on the Internet. So the point is trust, insofar as I know what I am doing; this is also cybersecurity. I'm sorry, I have three more points. About Facebook and the telephone number: Facebook asked me several times. I didn't want to give it to them, but then they suggested my number to me. I don't know where they got it, probably from other people who have WhatsApp and have my number, so they just suggested: is this your number? So think about trust.

And I would also like to mention not just cybersecurity but also data security and data protection; this is all connected. Again: cybersecurity, digital skills. I didn't hear about it here, but we also have to educate people. This is the end user responsibility.

>> MODERATOR: Clear point. Capacity, skills.

>> Right. You have to know what you are doing over there. So that's my point. Thank you.

>> MODERATOR: Thank you. We have two more.

>> TATIANA TROPINA: We have two more interventions and we are closing.

>> MODERATOR: Keep them to 30 seconds.

>> I'm going to gesture like this, and in the worst case, I will just take the microphone back.

>> NICK: My name is Nick. I'm from the UK government. I apologize; I missed the earlier session on this, so I missed some of the dialogue. Wow, what a great discussion and loads of points. I think I'm maybe public enemy number one, going by some of the chats. Trust is a really, really interesting issue. I feel similarly in my work at ICANN, where I'm on the GAC. We had the transition, and trust has been a key feature of a lot of the debates. For me, personally, my fundamental trust in the Internet as a whole is that it's a decentralized operation where no single entity is in control.

I think whether we decide we like Facebook or we don’t like Facebook or whatever, the decentralized network gives us the ability to make a choice and say no we don’t like this, we’ll use something else.

We have open standards, and we have the IETF that develops these open standards. As for the decentralized network, there are pressures elsewhere, in other institutions, for greater governmental control of the Internet. I'm sure people around this table wouldn't like that. So I say, as Europeans: make sure you go and speak to your government people like me, tell them to get involved in those debates, defend this open decentralized network, and promote the multistakeholder model.

>> MODERATOR: Music to my ears.

(Laughter)

I heard that inviting him to the IGF actually paid off.

>> Anna speaking. Back to what Marko was saying, and the follow-up from Tatiana, about making users responsible, through regulation, for their security and privacy protection. A century ago, a car was something very new, as new as the Internet is now. Those who started driving and using cars had no idea that one day someone would give them a 40 Euro penalty for not using the seat belt, right? So it is as unimaginable to us to be held responsible for being stupid online as it was back then.

So I believe that at some point we need to stop taking the government and private companies as our parents and always relying that they will be the ones providing us security and privacy. But we need to start becoming responsible.

And I love this multistakeholder responsibility thing, and I put users at the very center of it. So if we need regulation to make users responsible for their privacy protection and security, I think we can start discussing it. And I think we need to treat our digital security as being as important as our physical security.

>> MODERATOR: Thank you for that intervention. Now I'm going to go back to the panel and give them two minutes each to respond. After that, I'm going to ask Bastian to make a summary.

>> TATIANA TROPINA: Four bullet points.

>> MODERATOR: To make a four bullet point summary. In the meantime I have a question: who wants to start? Maarit?

>> MAARIT PALOVIRTA: Yes, why not. There was a large range of comments; I don't think we are expected to answer all of them. I would like to pick up on Nick's comment on decentralization, and I think decentralization is not a bad thing; I agree with your comment. I was in a discussion with the European Commission a few weeks ago where we were talking about passwords and identity management, and they were talking about a centralized system that the European Commission would provide to citizens, to allow us to use one password for many things, with some kind of guarantee by the government to keep us safe.

And I asked the question then, and I would like to raise the point again here: I don't necessarily feel better if a government has all my identification methods centralized into one pot somewhere, with the government, let's say, watching over it. That doesn't necessarily make me trust the Internet more.

So I do like your point about decentralization. I would also raise a point on behalf of the technical community. Like Marko mentioned, part of the technical community works with law enforcement, doing technical training with the aim of allowing policymakers and law enforcement to understand the ways of working of the Internet. And recently, again, I was in an EU meeting; this was very officially recognized in Amsterdam a couple of weeks ago. They had a box called involvement, or engagement, with hackers. They have nicely called them the technical community, but they are now even using hackers, you know, for training, etc. So I think it's not only about governments and the private sector, and even users; we also have to make sure that the technical community, the guys who know the Internet, are involved somewhere along the way. So I will leave it there.

>> MODERATOR: Hackers are not per definition bad people, by the way.

>> There was a very nice hacker guy who was talking. He was very nice.

>> MODERATOR: It’s people wanting to tinker with technology.

>> KONSTANTINOS MOULINOS: I would conclude that trust is a multifaceted principle which revolves around some key issues: control of data, confidence that the security works, privacy compliance, transparency and openness, awareness, and user responsibility. I think those are the most important elements of this concept.

>> MODERATOR: That was far less than the two minutes I had planned for you, thank you.

>> LOUISE BENNETT: I can use a bit more than two minutes, then. I think a very important thing for trust in the Internet in the future, which we haven't touched on properly although it has come up in the questions both this morning and this afternoon, is the question of the security of the Internet of Things. And I think that is something we should really be thinking about very hard now. I'd just like to bring up three examples of where I think it's important. I think it's most important where there are safety issues. We have already seen stories in the papers, which I'm sure people around here have read, about the Internet of Things in cars and about interference with those systems causing accidents. These kinds of things should be covered in the safety regulations.

I think the other area where it's really important, and where trust would be lost if we don't do it properly, is things to do with health and social care in the Internet of Things, where it is tremendously valuable. I had to look after my mother with Alzheimer's for about 15 years. One of the things we had was sensors that told me if she had got out of bed, gone downstairs, opened the front door, and all the rest of it. At that time it was relatively unusual that someone would be monitoring their mother remotely in that way, and I could alert people to help her. Now I would be really frightened that there wasn't proper security on any of those systems that I was buying off the shelf and installing around the place. And I think it's incredibly important that we should regulate the safety of systems used in health and social care.

The other area: we have talked a lot about privacy and security, and I think there are consumer goods where the IoT is very intrusive. The one that comes most obviously to mind is TVs that actually watch you as the audience, record what you're saying, and see your reaction to advertisements and things. Most people who buy smart TVs don't actually realize what they're garnering in your home. And I think

>> MODERATOR: It’s in the Terms of Service.

>> LOUISE BENNETT: Not everybody reads all those Terms of Service. And I think on all these truly privacy-intrusive things, the default, we should say, is that you opt in to using them, not having to opt out. And I know that is hated by people who want to sell you those things, but I think that is the privacy-friendly and correct way to operate. And I think that is the kind of legislation that our governments could bring in. So I really wanted to touch on that, because I think it's such an important area that we hadn't had a chance to talk about.
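
The opt-in principle argued for here can be stated as a simple rule: every privacy-intrusive feature starts disabled, and only an explicit user action enables it. A minimal sketch, with made-up feature names:

```python
class PrivacySettings:
    """Opt-in by default: intrusive features are off until explicitly enabled."""

    # Illustrative feature names, invented for the example.
    INTRUSIVE_FEATURES = {"audience_tracking", "voice_recording", "ad_profiling"}

    def __init__(self):
        self._opted_in = set()  # empty by default: nothing intrusive is on

    def opt_in(self, feature: str):
        if feature not in self.INTRUSIVE_FEATURES:
            raise ValueError(f"unknown feature: {feature}")
        self._opted_in.add(feature)  # requires an explicit user action

    def is_enabled(self, feature: str) -> bool:
        return feature in self._opted_in
```

The design choice is that there is no code path that enables an intrusive feature without the user calling `opt_in`, which is the opposite of the opt-out model criticised in the discussion.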

>> MODERATOR: That was ten seconds of his time, but he's going to get the full two minutes.

>> RAINER STENTZEL: I have already missed three seconds. My main question is: what does all this mean in practice? For policy and law makers, what is the conclusion of all this discussion? I'd like to give an example, just to explain where we sometimes have even more problems when it comes to transparency, informed consent, and all these things. Think about airplane security: would you feel comfortable if the airline provided the construction plans of the plane before you boarded, saying please read this, and then you could confirm and give your consent depending on whether you find it secure or not?

I guess not many people would feel very comfortable with this solution, but this is exactly the solution we found in data protection regulation. It's informed consent. Everybody has to read the terms and conditions and then decide whether they are comfortable with the service or not. In the end, nobody does it. We shift responsibility to the user, instead of saying, okay, maybe it would be more appropriate to have more general rules for the providers of the services.

So that means, in the end, this was a discussion driven only by trust: rebuild trust, we need informed consent. That sounds very good. But if you see what it means in practice, you come to a very, very poor result. You say, oh, that's the solution. Oh, no, that was not what I meant; I meant a totally different thing.

And I can only give this advice for the multistakeholder process: if you would like to involve the governments and get them engaged in this process, come with practical points. Come with examples. Come with proposals and very detailed things. Because at a very, very high level it might sound comfortable to each other, but in the end, when you implement the consequences, it's not exactly what you wanted to do. Thanks.

>> MODERATOR: Gather together on a topical basis is what you’re saying. So before I hand over to Bastian for a summary, here is an assignment for all of you. Close your eyes and think of the divine intervention that you are able to make. You can impose one trust enhancing measure. What would that be? Please tweet it with the hashtags #EuroDIGsec and #EuroDIG16. What would you impose on humanity?

>> TATIANA TROPINA: Don’t forget it should be in 140 characters.

>> MODERATOR: Go ahead.

>> Thank you. I’ll skip the previous summary I did halfway through. What I took from the second part, from the start, is what I found interesting from someone who commented on the fact that we cannot say by default that we do not trust governments or private sector organisations. It’s important to agree beforehand, when we talk about these things, specifically what we’re trying to cover. The example was mentioned by Stacy, for instance, with regard to data protection. That is a transparent process and people can reflect on it. On the other hand, we have the work that security agencies do for legitimate purposes. All of that is very hidden. So exactly what are we talking about when referring to governments?

When you are on the Internet, you have responsibility. That brings a responsibility for the end user. If you talk about this particular responsibility, and also the trust that comes along with it, it’s not as easy as it sounds. It requires skills. Do you actually know the specifics and the details you’re forming an opinion about? Do you have the time and access to do so? To have an actual opinion? This requires capacity building, and I think that is lacking to an extent.

I liked the comment that Nick made on the decentralized aspect of the Internet as a network and the open standards that are used to connect everything together. Also, the call to interact with your governments and participate in multi stakeholder governance. Last, but not least, at the end of the session a couple of new items came into play. I noted them down as well. With regard to accepting Terms of Service, the role of informed consent: what does that actually mean? People do not read the terms. We accept them blindly. Another suggestion that was made is to have people explicitly opt in, instead of opting out afterward once they have committed to something. That might help to gain trust in the services. That was it.

>> MODERATOR: Thank you. So most of the people I see here were in the first session. I’m going to ask you whether this was a fair reflection, a fair summary of our panel, or not, and you can hum on both of those questions so we can get a sense of the room. Please hum if you think that this was a fair reflection of the panel.

>> Hum.

>> MODERATOR: Please hum if you think this was not a fair reflection of the panel. Thank you. Bastian.

(Applause)

>> MODERATOR: Now, your divine intervention, panelists, please.

>> KONSTANTINOS MOULINOS: Education. I think a lesson in all European schools about the Internet and how to use the Internet.

>> MAARIT PALOVIRTA: I think we need some kind of virtual platform to break the security taboo and open it up to each other.

>> LOUISE BENNETT: I think trust is best enhanced by transparency and simplicity. And so you should have a traffic light system for good security and privacy.

>> RAINER STENTZEL: I enjoy discussions like this, but I would also enjoy discussions with more practical and concrete input on what policymakers could do to make the Internet a better and safer place.

>> MODERATOR: And with that, thank you all for being here. Thank you to the panelists. Have a good continuation of your EuroDIG day.

(Applause)

Session twitter hashtag

Hashtag: #zerorating