Privacy is everywhere: how to deal with emerging problems? – WS 02 2018

5 June 2018 | 14:30-16:00 | MASTER ROOM | YouTube video
Consolidated programme 2018 overview

Session teaser

Protection of privacy has always been an integral part of efforts to ensure the protection of human rights. There are both old and new challenges in this area, which call for new and innovative solutions, and their implications should be considered when designing new approaches.

Keywords

Ethical use of data, privacy, GDPR, transparency, privacy-enhancing technologies

Session description

The advent of new technologies has created new challenges for privacy and the protection of personal data. The spread of Internet of Things (IoT) devices and other emerging technologies means that more personal data is available for collection and misuse by companies. The ethical use of data has therefore become one of the major challenges for users and consumers of particular services and technologies. This session will attempt to foster discussion among different stakeholders about possible solutions to these challenges.

To this end, the importance and implications of the General Data Protection Regulation (GDPR) will be discussed, as it attempts to establish a privacy-centric approach. However, it also creates some legal uncertainties. Big companies - even Facebook and Google - might be better able to comply with the regulation than small companies and NGOs, yet the rules apply to both in the same way. In addition, some new technologies, like Big Data and Machine Learning, are based on large data lakes created independently of a specific purpose. The GDPR, however, requires setting the purpose of every act of data processing and data collection, which might be inhibitive for these kinds of technologies. Considering these challenges, it is important to discuss whether the issues addressed in the GDPR need a shift of focus.

Apart from regulation, self-regulatory approaches and mechanisms should also be considered. Transparency has always been an important tool for ensuring responsiveness and accountability in various organizations, including companies. The session participants will therefore review how companies can deal with privacy issues, and how new business models and new internal privacy policies can enhance consumer protection. The workshop will also consider the role of civil activists and civil society representatives in this regard. Moreover, emerging privacy-enhancing technologies offer new shields and protection mechanisms, and in the context of emerging privacy challenges, companies are attempting to change their business models in order to gain consumer trust. It is therefore essential to discuss the technical opportunities these technologies, such as blockchain and cryptography, offer, and what their capacities and limitations are.

Format

Four short presentations giving an overview of the issue and highlighting its major developments. The presentations will be followed by an interactive discussion (Q&A) with the active participation of the audience. The workshop will be run by a moderator.

People

Focal Point

  • Giorgi Kldiashvili, Institute for Development of Freedom of Information

Subject Matter Expert

  • Farzaneh Badii, Chair of the Non-Commercial Users Constituency and Research Associate at the Internet Governance Project at Georgia Tech in Atlanta, United States

Organising Team (Org Team)

  • Melle Tiel Groenestege, VEON (Global Telecom Operator), Digital Policy Advisor
  • Jörn Erbguth, University of Geneva, Geneva School of Diplomacy
  • Levan Avalishvili, Institute for Development of Freedom of Information, Programs Director
  • Teona Turashvili, Institute for Development of Freedom of Information, E-Governance Direction Head
  • Thomas Struett, Researcher, Istanbul Bilgi University, IT Law Institute
  • Elif Sert, Researcher, Istanbul Bilgi University, IT Law Institute
  • Adam Peake, ICANN, Executive Research Fellow at the Center for Global Communications (GLOCOM)
  • Laurin Weissinger, University of Oxford
  • Teemu Ropponen, Open Knowledge Finland
  • Fotjon Kosta, Albania IGF / Ministry of Infrastructure and Energy of Albania
  • Valentina Pavel, Association for Technology and Internet (ApTI)
  • Marina Shentsova, United Nations Economic Commission for Europe (UNECE)
  • Miguel Perez Subias, Spanish Internet Users Association
  • Tamar Alpaidze, Academia

Key Participants

  • Jean Gonié, VEON (Global Telecom Operator), Digital Policy Group Director
  • Jörn Erbguth, University of Geneva, Geneva School of Diplomacy
  • Ani Nozadze, Head of International Relations Department at the Office of Personal Data Protection Inspector of Georgia
  • Susanne Tarkowski Tempelhof, Founder at BITNATION

Moderator

  • Ceren Unal, Regional Policy Manager (Europe), Internet Society (ISOC)

Reporter

  • Aida Mahmutovic

Messages

  • Privacy should be an important issue in everything we do, and its central parts should be privacy by design and security by design.
  • Privacy is about trust, and companies need to demonstrate that they are trustworthy. Entities collecting user data need to be proactive in ensuring transparency and accountability.
  • Some users are willing to share their data in return for a service. Therefore, education and information are key for users to understand what is at stake, and to take responsibility.
  • Privacy-enhancing technologies should be everyone’s right.

Find an independent report of the session from the Geneva Internet Platform Digital Watch Observatory at https://dig.watch/resources/privacy-everywhere-how-deal-emerging-problems

Video record

https://youtu.be/wm-SsMlZqxs

Transcript

Provided by: Caption First, Inc., P.O. Box 3066, Monument, CO 80132. Phone: +001-877-825-5234, +001-719-481-9835, www.captionfirst.com


This text is based on live transcription. Communication Access Realtime Translation (CART), captioning, and/or live transcription are provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings. This text is not to be distributed or used in any way that may violate copyright law.


>> CEREN UNAL: Okay. I think we should start. We're already a bit late. Hello and welcome to the Privacy Workshop No. 3. Thanks for joining us here, especially despite the fact that it's not in the printed brochure, so we ended up having a very private privacy workshop.

Today we have a distinguished lineup of speakers, and we're going to discuss the current challenges that we are facing when it comes to online privacy and the way forward. The good old definition of "privacy" as the right to be let alone is suddenly not making that much sense, and it's constantly being challenged by new technologies, like IoT and AI, and we live in the era of "pics or it didn't happen."

We have the GDPR now. We thought it would never come, but as of the 25th of May we have this game-changing regulation, and of course, this will definitely come up.

My name is Ceren Unal, I'm working at the European Regional Bureau of Internet Society, and I'll be moderating this session. And today we have a surprise speaker, Mr. Pearse O'Donohue from the European Commission. He's the director of the Future Networks at DG Connect, and at first we thought he wouldn't be able to join us, so that's why his name is not on the list of speakers, but since he has limited time, I would like to start with you, Mr. O'Donohue. Thank you.

(Applause)

>> PEARSE O'DONOHUE: Thank you very much, Ceren, and thank you for giving me the opportunity to speak.

(Audio feedback)

Sorry for gate-crashing the party and making that horrible noise, but it's because the theme is very important. I was misled by the old version of the program, so even though a colleague was making arrangements for me to be here, I was making arrangements to be somewhere else.

(Laughter)

So it's great that you could squeeze me in because, of course, this is an issue which is central to the work that I do with my team. We work on 5G and Internet of Things and the Next Generation Internet on Cloud, on software, and, obviously, we have a lot of involvement in virtual networks on artificial intelligence, et cetera.

And so privacy is everywhere. Could you pick a more obvious title?

(Laughter)

So it allows me to play around with that in the sense that everything we do, the protection of personal data or privacy is actually a central issue, and, therefore, we need to not only do what we do with the tech community, we spend quite a bit of European taxpayers' money funding research on privacy technologies, on ensuring that, for example, Internet of Things, technologies, and the standards going with IoT are privacy enabled and take that into account, but we actually have to go further, and this is why talking with the multi-stakeholder community in EuroDIG and the IGF is so important.

There are a few points I wanted to make. First of all, the GDPR. It was talked about in the session this morning. It will keep being talked about. It is something we are actually quite proud of. I know it's something we are still criticized a lot about, but it is not the high-water mark, but it is what we feel should become the benchmark for protection across the whole ICT space.

Now, it does impose a cost on companies, particularly small companies, and that's something which we are still looking at in terms of the compliance cost, and something that we do have to take account of when the national DPAs are looking at just how things are done. But, of course, it makes it all the more important that regulators, governments, and the European Commission, as well as the very large companies and platforms that control or have a bottleneck control on certain elements of the Internet -- they, we, need to take even greater responsibility to ensure that others can understand and be put in a place where they can comply with privacy requirements, and with personal data security requirements as well.

So when you think of it in those terms -- and we have done quite a lot of work, for example, when we looked at the economic data supporting our free flow of data proposal that's currently being discussed in council and parliament, we have clear data, not just sentiment, that shows that a lack of trust and security is one of the biggest inhibitors to the development of the Internet in wide terms, the development and take-up of digital technologies, the digital technologies which will help business to advance and drive the economy, but even more importantly, the digital technologies that will actually facilitate individuals, create better living conditions, allow access to government, health, and other public services in a low-cost but highly secure and efficient manner.

The lack of security and trust, therefore, is our enemy in achieving our goal of rolling out those technologies to all parts of society, and, therefore, that's why privacy is everywhere. That's why privacy has to be in everything that we do.

It's also a situation which the more mature businesses, of which there are many, realize that it is also in their interest to ensure that they are themselves secure, seem to be secure, that they're part of a secure ecosystem, and, therefore, we should be able to exploit even more the incentive for business to actually provide that privacy to their customers, whether it's because they're selling equipment to intermediaries or they're actually selling equipment or services to an end user.

And so in the work that we are doing, whether it is on the design of Cloud platforms, particularly as computing power moves out to the edge of the network to take account of artificial intelligence, to take account of the low latency required for many of the realtime services that we are pushing, for example, in 5G, with online e-Health services or even telemedicine with connected cars, et cetera, the security again has to be a central part of what is put in place and, hence, we come to security by design, but on the same path, increasingly we are now coming to privacy by design, and, in fact, whereas, the Commission is often criticized for regulating first, not thinking of how else to do things, we have tended to move away from that model in many areas in recent years, but in this situation, there is a place for a floor, there is a place for a minimum set of rules. Why? Because those who are playing fair, those who are actually having the secure technologies and designing secure applications and services should not be undermined or undercut by somebody who's going for the lowest common denominator; i.e., I drive down my cost, I can offer my service or product more cheaply and, therefore, I will corner the market.

That's something where we, as regulators, we actually have an obligation to ensure that that does not happen. It's a market-structuring measure which is often criticized in other parts of the world, but it's something which we have resigned ourselves to having to do while we create what we hope is a virtuous circle. So that's not perhaps what you expected me to say, but that is actually why for so much of what we do, we need to listen to you as to how can we actually do that.

There are experts in the room here on safety technologies, on privacy-enhancing technologies, but also experts on governance. How do we actually set rules, how do we actually set guidance for those who are actually supposed to implement this privacy by design? I'd love to hear the discussion. Thank you very much.

>> CEREN UNAL: Thank you very much.

(Audio feedback)

This was great for setting the scene. Is it my mic? Speaking of GDPR, we have Ani Nozadze, who is the head of International Relations Department at the Office of Personal Data Protection Inspector of Georgia. Boy, that's a long title. And she will bring us the DPA perspective, and perhaps you might want to bring the Georgian experience too because you're harmonized with the EU and -- yes. Thank you.

>> ANI NOZADZE: Thank you, Ceren. I think as we are now talking about the emerging problems and their solutions, what GDPR is trying to do is to attempt -- well, attempting to do is to provide solutions from a legal perspective, and these legal regulations, these regulations have become necessary because we see that organizations are collecting more and more data, they're using this data; however, they are not proactively self-regulating and ensuring transparency and accountability, so I think in this case, there is a need for legal -- for legal regulations to offer new obligations; however, it is also -- it also needs to be noted, when we were discussing this, that GDPR really stands on the same core principles that were there long before GDPR was adopted and even further beyond the -- beyond the EU, including Georgia, and I think what the Data Protection Authorities will still be -- will still be looking at would be the same principles, the same core principles of data processing that were there and that are still there, such as do not collect and use more data than is necessary and do not do it longer than it is necessary, so these principles are still there.

I think -- well, as the GDPR applies to some Georgian companies as well, what the Georgian DPA has been doing is offering guidance and advice in the form of free consultations to companies and organizations in general. We have elaborated a couple of thematic and textual guidelines, and we recently published a short informational guide on the GDPR in the Georgian language as well. Most importantly, we are also working on a new draft law on personal data protection that will try to advance data protection standards in the country, of course taking into consideration all of the developments, the recent challenges that we are facing in Georgia as well as across the globe, and also the developments on the legislative level, including the EU GDPR and other regulations, as well as the modernization of the Council of Europe's Convention 108, the text of which has been finally agreed upon recently.

Additionally, in how we work with the data controllers, we also raise awareness and make them realize that personal data protection, the protection of privacy, is not only a legal obligation but also a way to gain the trust of your consumers.

So a recent survey in Georgia, for example, that was carried out last year by the EU and the UNDP joint project showed that 81% of Georgian citizens think that personal data protection is a very important issue, so this, once again, highlights that people are more concerned about privacy, and it is them who also want to have their personal data protected more in the face of and in light of all the developments that are taking place.

So I think if the organizations understand that protecting privacy well would be a competitive advantage for them, that could also play a role for them to do more themselves and to protect privacy better, so due to time constraints, I will stop here and hopefully --

>> CEREN UNAL: Okay.

>> ANI NOZADZE: -- we can discuss further.

>> CEREN UNAL: Yes. There will be plenty of room for discussion.

And now we have Jean Gonie from VEON. They are a global telecom operator, so we look forward to hearing your perspective, the private sector's perspective, on the GDPR and privacy online.

>> JEAN GONIE: Thanks. Yes. So I'm very happy to be here. Thanks to all of you, and, yes, indeed, VEON is a telecom operator. This is also an OTT. VEON has developed a platform, an app, which is kind of a schizophrenic attitude between a telecom and an OTT, so we are on both fronts.

The telecom was previously known as VimpelCom -- sorry -- and the name was changed more than a year ago. VEON has a footprint here in Georgia, so this is VEON Georgia, and we operate in 11 countries. One of them is Italy, where we have a joint venture with Hutchison, and the other countries are not in Europe, so you can ask yourself, why does this guy want to speak about GDPR? Well, his footprint is not in Europe.

So, in fact, our HQ is in Amsterdam, so we are a European company with HQ in Amsterdam, offices in London, a presence in Italy, and more -- more activities in Europe, so this is why GDPR is key for us. This is key for us because, of course, telecom and also an app, so this is, of course, key for us to comply, but we also want to and try to do more.

I've had the chance of working for almost 20 years now, and in 2001, quite a long time ago -- I don't know if you remember where you were in 2001 -- I was, like you, a member of the French Data Protection Authority, and privacy at that time was not that important, and the big Facebook, Google, Amazon, and others were very small operations. And it's very good, I'm very happy to see that now privacy has become a top priority for everybody, for every company - telecom, banking, the hotel industry. This is really good because we have seen changes, and this is why VEON wants to also try to take the lead on privacy. So, of course, we comply with GDPR. We have developed a GDPR readiness plan, established a lot of workstreams, and, of course, accomplished what we had to do by May 25. You know, this is the date by which everybody had to comply with GDPR.

But what I wanted to share with you is that, as you may know, "compliance" -- I don't know if this is the most appropriate word for GDPR, because there are no specific guidelines, so there are no rules. You don't know -- in fact, there is no European Commission document that says if you do A, B, and C, you will comply. It's not a tick-the-box exercise, which is what is interesting about GDPR.

So some Data Protection Authorities in the world, in the UK, in Spain, have drafted their own guidance to their own companies; some haven't, which is fine as well.

So this is really about a new concept that I like with GDPR. When you think about compliance, this is about accountability, and I really think that this is a new concept. We are in discussion with colleagues about accountability.

I remember in 2010 when Commissioner Reding was leading the GDPR -- so it took eight years before the implementation -- I was at Microsoft at that time. I was leading data protection for Europe and Middle East and Africa, and we discussed with the Commissioner Reding's staff about accountability, and most of them weren't aware about this concept because this is, I would say, from a U.S. Anglo-Saxon environment. This basically means that you are accountable, it means it's more of a risk-based approach.

If you as a company do the right thing, you shall not be punished. It's really about self-certification. This is totally new in Europe. For a French guy -- I'm French, obviously -- this is new. You don't translate "accountability." Is it liability? We don't really translate it, and the same in Spanish and in Russian. It is a word that is very U.S. and Anglo-Saxon but works today, so it's interesting to see that at VEON we took an accountability approach. We have established our own workstreams and we worked with the business units, and we really wanted to make sure that we complied, if the word "comply" can apply, with GDPR, so we appointed a data protection officer, and this person will be in charge of relations with the Dutch Data Protection Authority, so our authority.

And I just wanted to finish -- because I know the idea is to be short -- I just wanted -- no pressure. I just wanted to finish because I have, as you can imagine, many things to say about privacy. But as a company, what is key around this is, of course, that privacy is about trust, we all know that, but privacy is about a demonstration, a proof of trust. If a company is not in a position to explain to its customers or users that it's doing the right things to comply with the data protection environment, this is totally useless, so this is why we adopted privacy by design rules, we adopted a dashboard, and many things. Many companies will say the same, so it's very important that they continue to say the same and to offer the new data subject rights to their customers. This is a long journey, so May 25 is just the second or first step, because the privacy environment is not new, and I'm sure that we will be able to discuss this in the years to come. Thank you.

>> CEREN UNAL: Thank you very much.

Now we have Jorn Erbguth from Geneva School of Diplomacy. We have some slides.

>> JORN ERBGUTH: Yes.

>> CEREN UNAL: Yes, perfect. And since you'll be talking about Blockchain, you've prepared some slides for us mere mortals in the audience, so, yes, please.

>> JORN ERBGUTH: Thank you. I don't want to put too many slides, I just want to say first --

(No audio)

Sorry. Sorry. Thank you. The first thing is: don't put any personally identifiable information directly on a Blockchain in plain text, because then it doesn't work together with the GDPR, that's clear. But there are lots of possibilities to use Blockchain in a very GDPR-compliant way. You can use encryption, hash values, hash functions, zero-knowledge proofs, homomorphic encryption, and there are even specialized Blockchains, like Zcash, Hyperledger, or Blockchains using chameleon hash functions. I won't go into detail about what that means, otherwise I'd be talking for the rest of the session, but these technologies provide a high level of privacy, and the difference between existing technology and these technologies is that they ensure privacy by design, meaning even an administrator cannot circumvent it.
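To make the hash-not-plaintext approach concrete, here is a minimal sketch in Python (hypothetical names; a salted SHA-256 commitment stands in for the on-chain record, and a real system would add key management or zero-knowledge proofs on top):

import hashlib
import os

def commit_pii(pii: str):
    # Hash the personal data with a random salt; only the digest would be
    # written to the Blockchain. The salt stays off-chain with the data
    # subject, so the digest alone cannot be brute-forced from known values.
    salt = os.urandom(32)
    digest = hashlib.sha256(salt + pii.encode("utf-8")).digest()
    return salt, digest

def matches_commitment(pii: str, salt: bytes, digest: bytes) -> bool:
    # Anyone holding the off-chain data and salt can prove it matches the
    # on-chain digest, without the chain ever storing the data itself.
    return hashlib.sha256(salt + pii.encode("utf-8")).digest() == digest

salt, digest = commit_pii("Jane Doe, born 1980-01-01")
assert matches_commitment("Jane Doe, born 1980-01-01", salt, digest)

If the data subject later destroys the salt, the on-chain digest is no longer linkable to them by any means reasonably likely to be used, which is the Recital 26 argument made below.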

If you have a conventional database, an administrator can always set privileges in a way that people -- administrators can access any kind of data, and there is a higher chance of abuse, higher rate of abuse.

With these technologies, this problem does not exist. You ensure it for everybody. It's not just for the outside but for everybody.

But, of course, there are drawbacks. The Article 29 Working Party has discussed these technologies and has taken a very strict stance on them, and I think it's a much too strict stance, because you should take the Recital 26 approach evenly for all technologies: it means that if you cannot identify a person by all the means reasonably likely to be used, then it isn't personally identifiable information, and you should apply the same approach there as well.

And I think there is some possibility to use privacy-enabling technology, even in the domain of public Blockchains, which will ensure privacy even above current standards in conventional technology. Thank you.

>> CEREN UNAL: Well, that was quick. Thank you, Jorn.

Last but not least, we have Susanne Tempelhof, founder of BITNATION, and we're hoping to hear from you about transparency and accountability and the Civil Society perspective, from a rights-based approach as well.

>> SUSANNE TARKOWSKI TEMPELHOF: Thank you. Yes, hello. Okay. So I have several things I would like to address that have been discussed today. I will start by just mentioning what you said briefly, which is that I completely agree with you that nothing personal should ever be put on a Blockchain apart from hashes and a couple of other things that I will come to later, but I would like to discuss a much deeper issue that I think is at the core of everything we're discussing here.

So I'm very attached to something that I'd like to call the battle between positive rights vs. negative rights, right? If you look at something like the U.S. Constitution, for instance -- and please mark my words, I'm not a U.S. citizen, and I'm not even alt-right, even though I come across as one from time to time -- I do think the U.S. Constitution has something really interesting to offer, in the sense that every constitutional article just states a negative right: the government is not allowed to do this, the government is not allowed to do that. And if we look at everything we have discussed today, and in particular GDPR, we're talking about positive rights, saying the government must do this or the government must do that, and I'm actually hugely opposed to that, and I personally feel offended by it. I feel offended by every single person in this entire room for trying to tell me how to live my life. You have no right to do so.

Even if that -- even if you're telling me that you need to protect my data privacy, which I do just fine myself, thank you very much, you have no single right to tell me what to do with my life, and I don't think I speak only for myself, I think I speak for a large number of my generation, whether right-wing or left-wing or in the middle. It's just offensive that a bunch of bureaucrats in this office are trying to tell us what to do.

So I'm all for negative rights, and I think you should all swallow your pride and forget everything about so-called positive rights, because it's not positive in anyone's mind other than people who are getting paid by Brussels or by Frankfurt or by international government or anything else. Forget it.

So that was my main point, really. Thank you.

>> CEREN UNAL: Oh, okay. So the discussion is heating up.

(Applause)

So, again, from a rights-based perspective -- I mean, you're more than welcome to share your comments and questions, but before Mr. O'Donohue leaves: the GDPR has a very ambitious aim, putting the user back into the equation, in the center, empowered, in charge of their own data. So what are actually the additional steps to be taken to achieve user- or human-centric data protection?

>> PEARSE O'DONOHUE: Thank you. Well, as a Brussels bureaucrat, the first thing I would say is that we would love it if we were in a legal framework in which we could have legislation that asserted the rights of people, which is the positive --

>> SUSANNE TARKOWSKI TEMPELHOF: (Off microphone)

>> PEARSE O'DONOHUE: Yeah, I'm not sure I understand that. To assert the rights of people and then create an environment in which people could assert that right would mean that they would have to spend their time litigating. But also, because of the way in which the sector that we're dealing with works, it appears to me that there will always be a technical work-around to any such rules that we create. So the GDPR has actually been criticized, though not in the way that you've criticized it, as being a set of negative prescriptions, because it is clearly forbidding people from doing things with data.

>> SUSANNE TARKOWSKI TEMPELHOF: No, actually positive rights.

>> PEARSE O'DONOHUE: No, no, that's not correct. While it seeks -- while it frames itself as something which is seeking to assert the rights of individuals and to create an environment for them, it is nevertheless basically articulated on a set of things that data processors and data controllers may not do.

>> SUSANNE TARKOWSKI TEMPELHOF: (Off microphone)

>> PEARSE O'DONOHUE: Well, I'm not talking from a constitutional perspective, I'm actually talking from a practical implementation of law perspective.

>> SUSANNE TARKOWSKI TEMPELHOF: (Off microphone)

>> PEARSE O'DONOHUE: But -- okay. Well, that's fine. I mean, I'd love to hear you explain your position. I don't quite understand everything that you have said. I'll try to make myself clear, and then I'd love to hear you.

>> SUSANNE TARKOWSKI TEMPELHOF: (Off microphone)

>> PEARSE O'DONOHUE: But in terms of your question, Ceren: that is just the legal framework. What we really need to do now is to ensure that the technologies that we are working on -- which can, unfortunately, as we see in examples, work against the individual and work for the machinery, the system, the commercial interest that has put them on the market -- nevertheless, with this set of technologies, privacy-enhancing technologies, we want to be in the position where, through our Next Generation Internet initiative, we increasingly allow the user to control their Internet environment to a much greater extent, so there will be a greater element of choice, rather than the Internet environment defining them.

So the most recent example of a disconnect that we're seeing is that, as individuals, all of us are now receiving huge amounts of email traffic, in some cases completely pointless, from operators who are saying, on the basis of what our obligations now are with regard to the GDPR, please click on something, please unclick something, please inform us of your choices or whatever, when, in fact, there was no need, there was no break in continuity; but there was a greater obligation now on those who held any of your data to ensure that you were aware of it.

And what we are clear about is that, with the GDPR, an awful lot of users are not aware of how their data is held, what data is held, and how it is processed, and that is where the privacy-enhancing technologies have to come in, sitting on top of the GDPR, where we create an Internet that actually allows people to make these choices. Because, as we know from practice with regard to a lot of the social media, for example, a significant number of users are not concerned about the use of their data. They are prepared to trade their data for services or for other things, and it is just a question of ensuring that they can do so in a dynamic way and with greater transparency, and that's really where we need to go now in the work that we're doing on privacy-enhancing technologies.

>> CEREN UNAL: (Off microphone)

>> JONATHAN CAVE: Okay. Jonathan Cave, from various places, mainly the UK. The thing about the burden imposed on people by these GDPR emails, leaving aside the question of whether those emails themselves are not an illegal infringement of privacy, which, in many cases, of course, they are, is that it works like the cookie law. I'm concerned that the lessons of that cookie law have not been learned, which is that, as with demonstrations of the right to live in a place or work in a place or something like that, the process creates a requirement for making people aware and documenting that you have made them aware. It doesn't ask whether that's proportionate, whether people can understand the things to which they are being asked to give consent.

The thing about privacy is that it's not just about the identification of individuals, because privacy is more than data protection. I mean, it's interesting that you mentioned the United States. I'm a U.S. citizen as well as a European citizen.

In the U.S., with these positive rights -- rather, negative rights against the state -- you do actually have privacy protection in the Fourth and Fourteenth Amendments. There's nothing analogous over here; it's about data protection. I am not my data. I resent being reduced to my data, and being asked to spend a large chunk of my life curating my data so that people who are in control of this data and have a fiduciary duty to understand it will be able to lay off liability onto me. Consent may not be an adequate basis for regulation in a world of such complexity and such speed, and that being the case, I'm not sure that the GDPR doesn't represent a precautionary approach of a system trying to retain its traction within a domain where it may not be the most effective insurer of a well-regulated system in which the use of these data actually doesn't work against people's interests.

So I'm concerned that it's following the same work that's been done before, which is inform people, treat them as rational beings despite the fact that we know that they aren't, and do so by reducing privacy, which is a fundamental philosophical concept, to something which is manageable within black-letter law.

So I understand why it's happened that way. I welcome your notion that, like the Data Protection Directive, it's going to be reinterpreted and reinvented, and we'll pursue this natural experiment, but I would be careful about claiming too much for it, and I know civil law processes tend to do that.

>> CEREN UNAL: Do you want to --

(No audio)

>> JAMES FENNELL: (No audio) -- encryption enables everyone to control their own data, and yet within the European Union, there's quite a lot of fear about encrypted technology, and, indeed, legislation to prevent encrypted technologies -- the possibility of encrypted technologies within the European Union.

>> PEARSE O'DONOHUE: Could you give me an example, please.

>> JAMES FENNELL: So there has been discussion, I think, between France and Germany about encryption. Maybe I'm wrong. The UK --

>> PEARSE O'DONOHUE: In the member states, several of the more security conscious or security (Speaking in non-English language)-- as you would say in French -- Member States --

>> SUSANNE TARKOWSKI TEMPELHOF: (Off microphone)

>> AUDIENCE MEMBER: Susanne.

>> SUSANNE TARKOWSKI TEMPELHOF: (Off microphone)

>> AUDIENCE MEMBER: Susanne, Susanne.

>> AUDIENCE MEMBER: Can you get closer so we can listen. The mic is not open.

>> JAMES FENNELL: So the point I'm making is there is a possibility of returning to a situation where -- privacy was a given, you know. If you sent a letter to someone in the 1990s, if somebody opened that letter, they had immediately committed a crime, but now privacy is not -- we've shifted, so privacy is now in the gift of the state. We may have it, perhaps, but it's not a guarantee, and encryption provides the possibility for everyone to reclaim privacy over their data, to own their data, and --

>> SUSANNE TARKOWSKI TEMPELHOF: (Off microphone)

>> JAMES FENNELL: Yes, to be sovereign over their data, and yet on the one hand, we're trying to use a regulatory framework, like GDPR, to somehow control or limit -- or to somehow give privacy back to people rather -- where it's already, to some degree, been taken, but on the other hand --

>> SUSANNE TARKOWSKI TEMPELHOF: (Off microphone)

>> JAMES FENNELL: -- isn't the future about people reclaiming their privacy by allowing everyone to encrypt their data and to not -- and, therefore, to be able to give it, as you said -- many people want to give it to other people, but to have control over that themselves.

>> PEARSE O'DONOHUE: Can I answer the question that was put to me?

>> SUSANNE TARKOWSKI TEMPELHOF: (Off microphone) freedom of expression. If you don't subscribe to privacy --

>> CEREN UNAL: I'm so sorry. The mic is not open. If you could just introduce yourself.

>> JAMES FENNELL: My name is James Fennell, and I'm a former UK government employee, but I'm also working with BITNATION right now.

>> SUSANNE TARKOWSKI TEMPELHOF: (Off microphone)

>> PEARSE O'DONOHUE: Okay. Thank you. The question that was put to me on the previous comment as well, which I think is very relevant, is that we can be very clear as the European Commission that we have stated very clearly that we want to allow end-to-end encryption, controlled by the user, and there have been discussions that -- and that's why I asked you to be precise as to when you said within the European Union.

>> JAMES FENNELL: (Off microphone)

>> SUSANNE TARKOWSKI TEMPELHOF: (Off microphone)

>> PEARSE O'DONOHUE: Several member states have suggested that there should automatically be back doors. That's my experience from having discussions with several member states, and the Commission has flatly refused that, for the very thing I was saying earlier on about trust and security. As to what you have said about privacy being in the gift of the state: we don't believe that privacy should be in the gift of governments. It should be a right. In Europe, under our charter --

>> JAMES FENNELL: (Off microphone)

>> PEARSE O'DONOHUE: In our charter, it is a fundamental right -- it is a fundamental right which has been backed up by the European Court of Justice. Some of the cases that people are very familiar with now, Facebook, et cetera, have been taken not in relation to an article of the law but actually on the provisions of that fundamental -- that charter and fundamental rights, so --

>> SUSANNE TARKOWSKI TEMPELHOF: (Off microphone)

>> PEARSE O'DONOHUE: That's an interesting statement. What I will admit to --

(Overlapping speakers)

>> AUDIENCE MEMBER: Please, you keep doing that.

>> PEARSE O'DONOHUE: There have been over-literal interpretations and, in fact, in some cases, inaccurate interpretations, which have, if anything, created greater ill will among average users toward what is legislated, which was supposed to support them. What we would, of course, ideally like to have is a situation in which the technology allows the individual to make their choices. But where I cannot agree with you is in accepting, at the end of the day -- because it's a slippery slope -- that the individual is not a rational and cautious human being. In other words, individuals have to take responsibility. They have to curate their data.

>> SUSANNE TARKOWSKI TEMPELHOF: Are they capable?

>> PEARSE O'DONOHUE: Well, they are capable, that's just it, but we're making it more difficult for them because of the volume and the complexity of this being repeated: every single website you've ever given a piece of data to is now coming back to ask you. Whereas we should have systems which allow you, at the point of access, to have the same controls, perhaps AI-based, which will allow you to curate, as you called it, all of your interfaces, and maybe to break them down into different sections so you don't have to do it on a piecemeal, bit-by-bit basis.

>> AUDIENCE MEMBER: (Off microphone).

(No audio)

>> AUDIENCE MEMBER: -- then, with this discussion that we just had, we come to a point: if there is this collecting of data by different social media and different people who actually collect our data, then, in order for us to be more responsible for our data, shouldn't what they are asking of us be controlled? Because if you use one of the social media, it asks for permission to collect your data, and if you don't give this access, you don't get the option of using the service. Then where is the state controlling this data? It's like obligatory information that it wants to take from me to use Facebook, let's say. Then why don't we give people the right to be more aware of their data and the way they use their data, and then control, in a better way, the others who are collecting this data?

>> JEAN GONIE: If I may -- sorry -- you are very correct. I mean, this is a passionate debate. This is nice, interesting, I would say, but this is not new. The privacy paradox is something we have discussed for ages. On the one hand, effectively, you can imagine that people accept to put all this data online because they want to; at the moment they are very happy with this, and this is something that is ongoing. But I agree with you, this is about information and education, and this is very important, and I don't see any type of control. It's very important to explain to any citizen what is good and what is not online, because a teenager doesn't know, doesn't want to know, has no idea about the right to be forgotten and that stuff, and the consequences are very important. So I say to my kids, well, don't do that, don't do that, but not everybody can do that with their own kids and even with their family, so that's very interesting, but I'm a strong supporter of this action.

We can spend hours discussing GDPR, and we just have ten minutes left, but really, I think this is a very good step for protecting individuals, because this is not rocket science. Maybe there is no optimal solution, but this is really a very important one, and as you know very well, with the new artificial intelligence and the Internet of Things, this may no longer be fully adapted, maybe even today already, but that's fine. We will adapt regulation. We're always lagging behind the new technologies, but that's fine; we just want to educate. Education is key, and people are smart enough to understand this.

>> JORN ERBGUTH: Well, I think Susanne asked a very important question, whom did we really empower with GDPR? Did we empower the user, did we empower governments, DPAs, Data Protection Authorities? Did we really empower maybe the big data processing companies?

One thing: when you look at the GDPR, government is mostly excluded. Governments can make laws and they can process even more than they used to process. And -- an important point -- peer-to-peer technology is disfavored. The GDPR has an image of a centralized architecture in mind, and peer-to-peer technology runs into trouble, and we are discussing this heavily right now. And when you take a look -- I have prepared a short example. For example, I have an image that I put on some -- well, some platform --

>> CEREN UNAL: Sophia.

>> JORN ERBGUTH: Yeah, Sophia. She has no privacy rights because she's not human. But, of course, I do have them. When this image is then transferred to other platforms, it means I lose control. Of course, I can ask people to follow it, and what the GDPR basically says is that everybody else needs a justification, everybody needs to keep track -- they need to keep track that that's my image, et cetera -- and what this does, basically, is that the GDPR is producing even more personalized data in order to protect my personalized data.

>> SUSANNE TARKOWSKI TEMPELHOF: Might I add something to that.

>> JORN ERBGUTH: If I can just -- I'm finished in a bit, and then -- thank you. When I put the permission on the Blockchain, I can put it there in a way that I cannot be followed and cannot be identified with the image itself, and then I can link the image to the permission. When I revoke the permission, all providers always have to check whether they still have the permission to keep the data, keep the image, and then they will need to delete the image to comply with the regulation. So there's no need to keep a trace; you can put it on a decentralized system and prevent the collection of even more data in order to be compliant with GDPR.
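A minimal sketch of the permission pattern just described, assuming a Python stand-in for the on-chain registry (hypothetical names; a real deployment would be a smart contract, and only content hashes, never images or identities, would be stored):

import hashlib

class PermissionRegistry:
    # Toy stand-in for an on-chain permission registry. It stores only
    # content hashes, never the image itself or the owner's identity.
    def __init__(self):
        self._granted = set()

    def grant(self, content_hash: bytes) -> None:
        self._granted.add(content_hash)

    def revoke(self, content_hash: bytes) -> None:
        self._granted.discard(content_hash)

    def is_permitted(self, content_hash: bytes) -> bool:
        return content_hash in self._granted

def provider_may_keep(image_bytes: bytes, registry: PermissionRegistry) -> bool:
    # A platform re-hosting the image checks the registry instead of
    # keeping its own records of who shared what with whom.
    return registry.is_permitted(hashlib.sha256(image_bytes).digest())

registry = PermissionRegistry()
image = b"...image bytes..."
registry.grant(hashlib.sha256(image).digest())
assert provider_may_keep(image, registry)

registry.revoke(hashlib.sha256(image).digest())
assert not provider_may_keep(image, registry)  # the provider must now delete

Compliance is checked against the chain itself, so, as the speaker notes, no additional personal data has to be created just to track consent.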

>> SUSANNE TARKOWSKI TEMPELHOF: And then the very question, of course, is who is the "they" who are going to check whose data it is, whose IP it is -- Intellectual Property -- or whose data it is? That means a third party can access all of your communication, all of your messages, all your private messages, all your government messages, all of it, right? So here -- here you go. It's not about tracing terrorism, it's not about tracing IP, it's about the ability of a few select individuals, like individuals in this room, to access everyone's personal data at all times for no particular reason.

>> CEREN UNAL: (No audio)

>> SUSANNE TARKOWSKI TEMPELHOF: (Off microphone)

>> MIGUEL PEREZ: Hi, everybody. My name is Miguel Perez. I come from the Spanish Internet Users Association, and I want to share with you some experiences that we have had working with privacy. In the last three years we developed some tools in order to have some information about who takes your data in the network, or what the value is that you generate with your data. We developed a tool to know the money that you generate for Facebook, for example: you put this plug-in in your browser, and you get the money that you generate each time you use Facebook, and you have an idea of the value of the time that you spend on this social network.

But now we are working -- the tools are really okay, they help, but it seems that nobody is interested in their privacy. Then we detected that there are some things that we can change. For example, we don't believe in the Terms of Use: every time, you have long Terms of Use in your services, and nobody reads them, and everyone says, okay, I agree with these Terms of Use. We thought it would be interesting to replace these Terms of Use with labels, with transparency labels, in order to have something very simple, like food labels, in order to have the information -- not the information that the company wants to give us, but the information that is really important for us. Then we need to define these labels and try to convince the other party -- the companies -- to adopt these labels, and then to begin the discussion about whether it is possible to generate those privacy labels with a very simple message for end users. We think this will be possible. Then we want to discuss.

Another question is about personal data about us: now you need to say in every application, okay, I want that information, I want to share that or not. It would be very easy if we stated our preferences only once and the companies checked what those preferences are. That would be more rational for end users, and I think that will be possible. So it's just to share with you these two projects.

And also, if you are interested in our tools, you can check (?)privacy.org. You can find these tools that we have there. Thank you.
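As an illustration of the "state your preferences once" idea above, a minimal sketch assuming a machine-readable preference store that services would consult before asking for anything (entirely hypothetical; no existing standard is implied):

# A user states preferences once; each service checks them per purpose.
USER_PREFERENCES = {
    "advertising": False,         # no ad targeting
    "analytics": True,            # aggregate analytics are fine
    "third_party_sharing": False,
}

def may_process(purpose: str) -> bool:
    # Deny by default for any purpose the user never stated a preference on.
    return USER_PREFERENCES.get(purpose, False)

for purpose in ("advertising", "analytics", "location_history"):
    print(purpose, "->", "allowed" if may_process(purpose) else "denied")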

>> CEREN UNAL: Thank you very much. Before the session, we were having this discussion about the prior information to be given to the end user, and we talked about this consent mania, consent vs. legitimate interest. I'm sure I'm not the only one who received tons of emails asking for my explicit consent. I mean, some --

>> AUDIENCE MEMBER: (Off microphone) who had no idea you were on their list.

>> CEREN UNAL: Yeah, exactly. That was interesting. Maybe Ani and Jean.

>> JEAN GONIE: I have a question for you. What's the -- I don't really know, in fact, because we see so many figures -- so what is the value you generate if you spend, like, half of your day on Facebook and social media?

>> MIGUEL PEREZ: You can get this information there.

>> JEAN GONIE: Yes, I will visit your website, but can you share something with us?

>> AUDIENCE MEMBER: What we do is check the value that advertisers add for your profile on the platform, and then I check how many ads you receive, how many you click, and we can calculate the revenue from that.
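As a rough illustration of that calculation, a sketch with invented numbers (real CPM and CPC rates vary widely by market and profile):

# Hypothetical figures, purely illustrative of the plug-in's arithmetic:
# platforms are typically paid per thousand ad impressions (CPM) and per click (CPC).
ads_shown = 120     # ads displayed to this profile over a period
ads_clicked = 3     # of which the user clicked
cpm_eur = 2.50      # assumed revenue per 1,000 impressions
cpc_eur = 0.40      # assumed revenue per click

estimated_revenue = ads_shown / 1000 * cpm_eur + ads_clicked * cpc_eur
print("Estimated revenue generated for the platform: EUR %.2f" % estimated_revenue)
# prints: Estimated revenue generated for the platform: EUR 1.50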

>> JEAN GONIE: The debate at this moment is precisely about the value of your data -- but it says many things about -- you know, this is -- okay. What is interesting is that in France this is debated. Okay, on the one hand, if you want to have access to something, and because you now own your data, you can have a choice. Choice 1 is you accept to give this access, but this has a cost, so the company should send you some money back. Option 2, you don't want to give this access, you don't want to have any advertising, and you don't have access to the service. This is really an interesting debate that we have today in France: basically, I'm fine to receive some money back. For example, if I use Waze, you know, this application, thanks to Waze I can drive all around the place and I get a lot of advice, and this is for free. Nothing is for free, as we know, so it means that today, if you want to change this paradigm, Waze can give you some money because you give your data that they use, and you have access to the service.

On the other hand, Option 2: you don't want to use Waze, or you don't want Waze to use your data, so you don't have access to Waze. So this is the debate.

>> JAMES FENNELL: (Off microphone)

>> Microphone.

>> JAMES FENNELL: Sorry. So there's already (No audio) by --

>> CEREN UNAL: Sorry. There are people online. Or that. Yeah.

>> JAMES FENNELL: So there are already technologies which are -- I mean, if you like, the Internet of Value is the next technological step -- but there are already technologies and very successful businesses which are paying people for their data to create content on the Internet, so in many ways I think the next --

>> SUSANNE TARKOWSKI TEMPELHOF: (Off microphone)

>> JAMES FENNELL: -- technological leap is going to be that people will automatically be paid for providing data, and that will create, hopefully, vast amounts of new value and new economic growth, you know, globally.

>> SUSANNE TARKOWSKI TEMPELHOF: No, if you look at (Off microphone)

(No audio)

>> ANI NOZADZE: Thank you. If we go back to the consent mania: of course, consent has to be voluntary. In the other room, when we were discussing the GDPR issues, one of the issues with consent was, for example, if a company is offering you a service in exchange for your consent, how voluntary can that consent be considered? And we were also discussing informed consent, and how informed the users really are when they are giving consent.

I guess one of the roles of the DPAs, while assessing whether consent was duly obtained from people, is to see whether people really understand what they are saying yes to. However, that is, of course, not easy, because different people have different understandings of things, and at the least, what we would encourage would be to draft these consent clauses in as simple a language as possible. But we should also not forget -- and the companies, I think, forgot about that -- that consent is not really the only basis for collecting and using data; there could be other legal bases for that as well, so we should definitely not be overwhelming our users with this.

>> SUSANNE TARKOWSKI TEMPELHOF: Thank you. Actually, if you look at Steemit, it does not sell user data; Steemit sells attention, which is a completely different issue. What Steemit rewards users for is the number of views, which is run against a very clever reward program, but it can most closely be associated with something like flight miles, for instance. That's the closest description you could have for something like Steemit, and it does not gather or sell anyone's personal data whatsoever.

>> CEREN UNAL: (Off microphone)

>> JORN ERBGUTH: Of course, personal information is valuable, but I'm not sure if we are already ready to go that way. When you look at what is imposed now, some services have a special EU subscription without any advertisement, which is much more expensive than the usual subscription model. We are talking about choice, but EU citizens no longer have the choice to be exploited and receive cheap content, so are we ready to accept that, or are we not?

>> CEREN UNAL: (Off microphone)

>> PEARSE O'DONOHUE: Since quite surprisingly Europe has drunk the Kool-Aid of competitive markets and their particular notion of consent, you would have to say no because nobody ever asked us, at least they never asked us in those terms.

On the question of consent, I think it's important to be clear about this. You're right, many of these contracts are contracts of adhesion. You don't have a fair choice, and intermediate options between clicking yes and clicking no are not negotiable. In responsible research and innovation, we talk about the negotiability of these ethical costs, and it's absent from this, of course, because it's a regulation. And it's a cost to become informed.

For example, there was a trichloroethylene spill in Los Angeles, and they notified people that their water was poisoned, and there was nothing they could individually do about it; but consent is imposed on individuals, not groups. What they could do is get angry politically and do something about it, but they couldn't afford bottled water. All the notification did was take a risk they couldn't affect and shove it right in their face, which unambiguously made them worse off, and that fear does have a cost. I was very struck by the notion that we shouldn't be thinking of people as irrational and stupid. But the me that releases information at one point in my life is not the me that would like it to be forgotten at another point, or remembered at another point. Because my positions are not consistent, you can't hold me today accountable, in any normative sense, for the actions of my future self. Moreover, it may be an event that causes me to change my mind about whether my data should be released; if I'm not aware of the event, I can't choose that, so you can't infer what my positions would have been if it's not feasible to replicate them.

Another point is that it's not just my personal data -- as your picture example shows, it's many people's personal data, and consent may be difficult to sustain across all the different uses. That magnifies the cost, because most of what's observed about us is not observed about us as private individuals but rather as people acting in society. Then, finally, the issue of value, the value which I'm controlling by my consent. My data have one use, to target me, and part of the discussion is that I might like that to make things better, but of course, you don't have to identify me for this; that's just the way the GDPR chose to frame it. There is such a thing as differential privacy: you can get part of the value by only partially identifying me. We quite understand this in general, and the two extremes are: you observe me and target me and give me an ad, or you collect my data, put it into a database, and use analytics to understand how people like me behave.

In both cases you release value from it. In one case, the value is extracted from my pocket; in the other case, the value is more of a social value. If we give people too much consent, for example around health data, then some people will consent and other people will not. The data sets will change, and they may change in a way that is correlated with the thing we care about. That selection bias undercuts the use of analytics on these data. If I only get privacy-insensitive people volunteering to tell me about the side effects of the drugs they take, and if that privacy insensitivity is correlated with the thing that makes them susceptible or not, then our decisions, our decisions as to how to treat people with medical conditions and so on, are undercut. These are not simple things to be dealt with, either at the level of society as a whole or by the lone individual, and so I think there are real dangers in trying to stick to one or another end of that continuum.
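
A minimal, hypothetical Python sketch of the two ideas above: releasing value through a differentially private aggregate, and the selection bias introduced by consent-based sampling. All numbers, the epsilon value, and the drug scenario are illustrative assumptions, not anything presented in the session.

```python
import math
import random

random.seed(42)

# --- Part 1: a differentially private count (Laplace mechanism) ---

def laplace_noise(scale: float) -> float:
    # Inverse-CDF sample from a Laplace(0, scale) distribution.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(flags, epsilon: float) -> float:
    # A counting query has sensitivity 1 (one person changes the count
    # by at most 1), so Laplace noise with scale 1/epsilon suffices.
    return sum(flags) + laplace_noise(1.0 / epsilon)

# 1,000 hypothetical users, each flagged True/False for some attribute.
flags = [random.random() < 0.3 for _ in range(1000)]
print("true count:   ", sum(flags))
print("private count:", round(private_count(flags, epsilon=0.5), 1))

# --- Part 2: selection bias from consent-based sampling ---

population = []
for _ in range(100_000):
    privacy_sensitive = random.random() < 0.5
    # Assume the true side-effect rate differs between the two groups.
    rate = 0.30 if privacy_sensitive else 0.10
    population.append((privacy_sensitive, random.random() < rate))

true_rate = sum(effect for _, effect in population) / len(population)

# Only privacy-insensitive people consent to report their side effects.
volunteers = [effect for sensitive, effect in population if not sensitive]
observed_rate = sum(volunteers) / len(volunteers)

print(f"true side-effect rate:      {true_rate:.3f}")    # ~0.20
print(f"rate seen among volunteers: {observed_rate:.3f}")  # ~0.10, biased low
```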

>> CEREN UNAL: (Off microphone)

>> SUSANNE TARKOWSKI TEMPELHOF: Okay. We have been debating for a long time and it's coming to an end, so if I can say something that really matters to me as an entrepreneur, as a European citizen, and as a software developer: keep your hands off my code. Don't pressure the app stores, don't pressure the Google Play Store, don't make our lives impossible. We're just trying to make life livable and provide better software for everyone around us. We are not asking you for money, we don't want your money, we just want the freedom to create software that people find usable, and we don't want anyone to get in our way while doing so. Leave our encryption alone; we want our crypto.

>> CEREN UNAL: Oh, Jean, please.

>> JEAN GONIE: Thanks. I just wanted to say two additional things, one based on what you said, which I like very much, and the other on what Susanne said. Generally speaking, of course it's important to encrypt and to protect as much as possible, but I'm always very surprised by the fact that we put far more pressure and passion on the online world versus the offline world. When you buy your machine, whatever it is, no one reads the conditions. Maybe someone in this room is reading the Terms and Conditions, 20 pages, but no one is reading that. The same goes for the offline world. Maybe that's good, maybe that's not good, I have no idea, but this exists. As a consumer, you face the same rights and issues when you buy your car and sign to say you read everything. This is exactly the same. It's good that we protect ourselves more because of the effect of the Internet, but we should always remember that if we are doing that for the online world, the same applies to the offline world.

When you go to a supermarket, what you maybe don't want to buy is right at the front, because companies pay for this, which makes sense, and this is advertising. Online, in digital advertising, you have a lot of rules and safeguards, which is very good as well, but never forget the offline world, the world we live in every day, where the conditions aren't specific enough either.

On the point that you mentioned: of course, let's say that by design or by default people are smart, okay, but the problem of those guys, of --

>> AUDIENCE MEMBER: (Off microphone)

>> JEAN GONIE: Intelligence. We can discuss semantics. Let's say people have a strong appetite to do what is good for them. The problem is that what people want is basically to be informed, so we come back to what was said: this is, for me, about education. When you go to a certain web page, what you need to understand is the consequences. Say you want to buy a ticket online; no one will read the privacy settings and privacy statements, they just want to buy that ticket, so it's important that there is clear terminology explaining that if they buy the ticket, one, they will get advertising from a lot of companies, and two, what those companies will and will not be able to do, and this has to be done in a very simple and explicit way. So we come back to user control, which is not new. User control is something we have all mentioned for many years, but maybe we can imagine a privacy dashboard or explicit notices. This is, to some extent, what we have with the GDPR, and maybe with other directives, but, again, this is about information and understanding the consequences of what you're doing online. That is for me the most important thing. Thank you.

>> JORN ERBGUTH: Well, I'm afraid this will not be a very user-friendly future. We will have more check boxes, and it's no longer enough to accept a 50-page document. You have to give explicit consent to every single thing, so you will be checking dozens of check boxes in the future, and we will all feel very empowered by that.

(Laughter)

So what is the solution to this? The solution is something like a do-not-track system, but not in the way Do Not Track does it. It's not that you say I don't want to be tracked; rather, you state what you are allowing people to do, and then this configuration can be sent as a default setting. Services don't have to ask you manually, but you give consent because you have said: I'm okay with this, this, and this, and I'm not okay with the rest.
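
A minimal sketch of the preference-broadcasting idea just described, in Python. The purpose names and the shape of the preference set are hypothetical assumptions, chosen only to illustrate the mechanism: preferences declared once, evaluated as a default, with anything not explicitly allowed denied.

```python
# The user's standing preferences, e.g. configured once in the browser.
user_preferences = {
    "functional_cookies": True,
    "usage_analytics": True,
    "personalized_ads": False,
    "third_party_sharing": False,
}

# Processing purposes a particular site declares it wants consent for.
site_purposes = ["functional_cookies", "usage_analytics", "personalized_ads"]

def evaluate_consent(preferences: dict, purposes: list) -> dict:
    # Anything not explicitly allowed is denied by default,
    # so the site never needs to ask the user manually.
    return {purpose: preferences.get(purpose, False) for purpose in purposes}

print(evaluate_consent(user_preferences, site_purposes))
# {'functional_cookies': True, 'usage_analytics': True, 'personalized_ads': False}
```

The design choice here is that consent is computed from the user's stated configuration rather than from a per-site dialog, which is the difference the speaker draws between this and a bare do-not-track flag.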

>> JEAN GONIE: If I may, yes, but I think this already exists, the idea of privacy settings and a privacy dashboard. But, again, just because something is the default does not mean it's what you want, so as a user, as a citizen, I would like to know what has been permitted by default, because I might like to change it. There is no single answer; it's not something that everybody wants.

>> JORN ERBGUTH: By default, it would be optimum privacy. That's a lot.

>> JEAN GONIE: I mean, what is optimum privacy? For some, it would be: I don't care, I just want to do this. For others, it's privacy-protecting products. It's not easy. Who can decide what is best for the citizen? That is why you have these rules today.

>> SUSANNE TARKOWSKI TEMPELHOF: Okay. So I've been in the crypto world for quite some time, and I do think the next great battle in crypto over the decade to come will be between the extreme privacy camp and the extreme transparency camp, which you may not think are opposed but which are opposed, and I know which side I stand on. When people speak about radical privacy, it's easy to say there is no such thing as radical privacy. Okay, so let's take an example. Say you want to buy a phone in a store. Obviously you can't drive there in your car, because your car has a GPS system; you can't be seen on the street, because there might be cameras, so you need to mask yourself. Obviously you can't use a card, you can only use cash, but then where do you get the cash from? And when you use the phone, you can't use it with any of your other devices, because the devices connect to each other, so if you're followed by intelligence services, they make the connection. So even to buy a burner phone, if you actually care about privacy, you have to go to extremes. Many people come from the perspective of saying, well, there's no such thing as radical privacy. That might be true, but there is also no such thing as radical transparency on the other side. But there is such a thing as plausible privacy. There is such a thing as controlling your own data. There is such a thing as not being monitored every single day, every single waking hour, by every single means of communication. There is such a thing as actually having a private life and a public life that are separate, and the government cannot and should not get involved in that, because that would guarantee we'll never have it; that would guarantee we are right back in an Orwellian, 1984 sort of setting.

The only people who can guarantee we'll have it are individuals themselves who are claiming their own privacy and their own security.

>> CEREN UNAL: (Off microphone)

>> AUDIENCE MEMBER: Oh, sorry, sorry, sorry, sorry.

>> AUDIENCE MEMBER: I think it's a common misunderstanding that transparency and privacy fight each other. I really don't think that's the case. What a lot of people who feel strongly about having good privacy want is transparency of administration and governance, so that's a usual confusion. People don't have to be transparent about their own lives, so I don't think there's a contradiction there. There doesn't need to be.

>> SUSANNE TARKOWSKI TEMPELHOF: Okay. So please define governance. What is governance? Is governance the United Nations? Is governance the European Union? Is governance a company? Is governance an NGO? Is governance a family? Is governance a relationship? Is governance a startup? Anything that is more than one unit becomes governance; as soon as there is more than one person, it becomes governance. So if you are a unit of three, let's say a polyamorous unit, that becomes a governance situation which should then be transparent to the world, even if you might live in a place like Iran, where it's punishable by death.

>> AUDIENCE MEMBER: (Off microphone) how do you think that what you're explaining right now can be managed? You are saying something that maybe you think you are explaining, but you are not explaining it. We are trying to manage, in the best way possible, the way we -- they, whoever -- control these data. That's what we're trying to do, no, through regulation? I don't understand your point.

>> CEREN UNAL: I think the main objective of this EuroDIG meeting is creating -- regulation is not the only solution, as we say --

>> AUDIENCE MEMBER: Yes, I agree with that, but how she's proposing it --

>> CEREN UNAL: No, I'm afraid we're running out of time. I would be happy to continue the discussion, but we started eight minutes late, so we're going to end eight minutes late. Before that: especially when I'm discussing this with younger people, I'm observing that they don't share our concerns. I mean, almost all of the discussion we're having right now is based on our analog notion of privacy, and since we have some youth fellows here, I wonder how you perceive privacy, because I think we don't define privacy the same way and we don't experience privacy the same way. Any youth fellows who would like to contribute on that? No? Out of the blue?

>> AUDIENCE MEMBER: So, to me -- I mean, I do research on privacy, so I do care about privacy, but I could be an exception among my friends, because I see ridiculous stuff where they do not care about it at all; I think somehow that is how it was taught to them. But there are also friends of mine who do care about privacy, so among people my age I think it really varies, and if you go to younger generations, like Generation Z, I think it's the age. Thank you.

>> AUDIENCE MEMBER: That's fine. I just wanted to share my opinion. You know, I'm also part of the young generation, or however we call it.

>> CEREN UNAL: You have to be proud. Don't be shy.

>> AUDIENCE MEMBER: I'm proud. I'm proud. And the thing is that, you know, it's my decision what I share with the world, and I can set it up with any service I use. If I agree to the Terms and Conditions and the things they use, I don't really see a point in being concerned about the data I share. You know, I'm aware of what I share and I'm okay with that, so I don't really see a reason why I should pay too much attention to data protection.

>> CEREN UNAL: So it's -- I'd like to have some final remarks. Yes.

>> SUSANNE TARKOWSKI TEMPELHOF: Just very quickly -- I'm sorry to interrupt your final remarks -- I would say, no, that's incorrect. You need to own and control your own data. It's not just that you choose whatever Facebook says you can share with everyone; you actually need to own it yourself, on whatever you like, IPFS or a server or your own computer. Do not ever, ever trust a provider. Sorry about that. Thank you.

>> JORN ERBGUTH: Well, I think privacy is about self-determination. If you say, I want to shout this out to the world, I want to publish, I want to say I've been there, please, everybody should know it, that's fine. That's not against privacy. It's your privacy right to decide that you want this to be public, and it's your privacy right, if you change your mind, to say afterwards --

>> AUDIENCE MEMBER: (Off microphone)

>> JORN ERBGUTH: So it's perfect. There's no contradiction here. I don't see it.

>> CEREN UNAL: (Off microphone)

>> JEAN GONIE: Everyone has their own views on privacy, which is nice and interesting, but for me, privacy is about trust. We can discuss this notion, but it is really about trust. Why? Because it is about controlling our valuable data, as someone who owns his personal data. But, again, we come back to the privacy paradox with the young generation; even if we are not that old, my teenagers are doing the same. I know you are no longer teenagers, but they are still doing the same, and they still need to be educated. This is about education. With the privacy paradox, maybe some people don't care about having their data used by another company; others do care, so we need to take care of those that really care and educate those that don't know the impact, the consequences. So privacy, again, is about trust, because without this trust we get the kinds of things we've heard about over the last two weeks, and I'm sure dozens and dozens more will occur in the future. So it is the role of us as a company to be accountable, meaning, of course, that we need to behave very well. We need, of course, to follow the law, but we need to do more if we can. We can offer a privacy dashboard; we can encourage companies to do more, not just to comply -- that goes without saying -- but to be a good company and set an example in the way they deal with data.

>> ANI NOZADZE: I would add to that that one role of the DPAs is also raising public awareness, and that is one of the top priorities of the Georgian DPA, especially considering that we have been operating for only five years now. When we're talking about people not realizing the decisions they're making and not being able to make informed decisions, I think one of the biggest roles of Data Protection Authorities is definitely to educate the public, so that they can then decide whether to share and not care about privacy, or to protect it more, so I guess --

>> CEREN UNAL: Thank you. (No audio)

>> AUDIENCE MEMBER: The big problem at the moment, in the way the Internet has evolved, is that our data is held by too few institutions, too few companies, too few governments. What we've had is a process of all our data being consolidated in very few places, and for that reason it is not private: it can be compromised, and often it is compromised. We now have a new wave of technology which makes it possible for us to hold our data in a decentralized way, which means that each of us can control that data and control who gets access to it. Then we are not dependent on the regulations placed on these very few institutions which hold so much data, so much control over the information that runs our world.

For me, the hope is that this technology will take hold and will transform the whole context in which this discussion is taking place, so that data is in the control of individuals, and individuals decide, through encryption, whether or not to give it to institutions in the future. Right now, I think we are having to over-regulate because we have very few institutions holding all this data. In the future that will not be necessary, because each of us will have control over our own data.
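
A minimal Python sketch of that decentralized model, using the third-party cryptography library's Fernet recipe. The data and the key handling are deliberately simplified illustrations, not a definitive implementation: the point is only that the individual holds the key, so granting an institution access means sharing a key rather than surrendering the data.

```python
# Requires the third-party package: pip install cryptography
from cryptography.fernet import Fernet

# The individual generates and keeps the key locally.
key = Fernet.generate_key()
vault = Fernet(key)

# Personal data is stored only in encrypted form, wherever convenient
# (a personal server, IPFS, or any untrusted host).
ciphertext = vault.encrypt(b"date of birth: 1990-01-01")

# Granting access means handing over the key, not the data itself.
institution = Fernet(key)
print(institution.decrypt(ciphertext))  # b'date of birth: 1990-01-01'
```

In practice one would use per-record keys so that access can be granted and revoked selectively, but the control boundary stays with the individual rather than with the host.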

>> CEREN UNAL: Thank you very much. That was a good wrapping-up point. We're going to turn to our reporter now. She's going to share the bullet points that she took note of, and you're free to object at any time.

>> REPORTER: That's correct. I'm here on behalf of the Geneva Internet Platform and the Digital Watch Observatory. We're going to publish full summary reports of all sessions, including this one, which you can find at dig.watch as well as on the EuroDIG wiki page, but now I will share with you five messages that I gathered from all this fantastic discussion. What we're trying to do is gauge rough consensus in the room, so bear with me.

First, privacy should be the central issue in everything we do, and its central parts should be privacy by design and security by design.

>> JORN ERBGUTH: No. But -- privacy's important, but it's not the most important thing in the world.

>> SUSANNE TARKOWSKI TEMPELHOF: Yes it is. Come on.

>> REPORTER: I will make sure to note if someone says no.

>> SUSANNE TARKOWSKI TEMPELHOF: Privacy is the most important thing.

>> REPORTER: Yes, that is the statement, and if there are hands for no, I will make sure to note it.

>> PEARSE O'DONOHUE: Can I just point out that pursuant to what you say, health is the most important thing in the world when you're ill. Financial security is the most important thing in the world when you're poor.

>> SUSANNE TARKOWSKI TEMPELHOF: We see your point.

>> PEARSE O'DONOHUE: Privacy is important (Off microphone).

>> SUSANNE TARKOWSKI TEMPELHOF: (Off microphone)

(Overlapping speakers)

>> CEREN UNAL: I think we should move on.

>> REPORTER: Okay. Second, privacy is about the demonstration and proof of trust, and entities collecting users' data need to be proactive in ensuring transparency and accountability.

>> SUSANNE TARKOWSKI TEMPELHOF: Wrong. Please note the objections.

>> CEREN UNAL: Okay. (Off microphone)

>> REPORTER: I'm sorry, I apologize, maybe I wasn't clear. This is not about prolonging the discussion, but just about you saying --

>> (Off microphone)

>> REPORTER: Exactly. You can put up fingers. Well, these fingers at least.

Three, users are willing to share their data in return for a service; therefore, education and information are key for users to understand what is at stake and to take responsibility.

>> ANI NOZADZE: Well, maybe some users are willing to --

>> REPORTER: Okay. Thank you. I'm noting that. Two more to go. Four, the information that companies ask of users in order to access a service needs to be controlled.

>> ANI NOZADZE: Sorry, can you repeat that?

>> REPORTER: Yes. The information that companies ask of users in order to access a service needs to be controlled.

>> AUDIENCE MEMBER: What do you mean by controlled?

>> REPORTER: Because you were talking before about what questions they were asking and what type of information they were asking users for, because sometimes it makes no sense --

>> AUDIENCE MEMBER: That is not very clear.

>> (Off microphone)

>> JEAN GONIE: (Off microphone)

>> AUDIENCE MEMBER: Yeah. User control would be a better term.

>> PEARSE O'DONOHUE: Maybe not. I like the way it's phrased now. (Off microphone) government regulation.

(Overlapping speakers)

That if there are legitimate interests, you have a voice. I think it shouldn't be made longer. It just gets --

>> REPORTER: And five, the right to encrypt. We heard a lot about encryption. The right to encrypt should be everyone's right.

(Overlapping speakers)

>> SUSANNE TARKOWSKI TEMPELHOF: It should be a negative right, meaning like the First Amendment. It's a right the government cannot infringe upon.

(Overlapping speakers)

It's something the government can't infringe on.

>> REPORTER: May I suggest a right to encrypt should be available to everyone?

>> SUSANNE TARKOWSKI TEMPELHOF: No. That's a positive right. We're talking about negative rights, something the government should never be allowed to prevent.

>> AUDIENCE MEMBER: (Off microphone)

>> AUDIENCE MEMBER: (Off microphone) First Amendment right.

>> JORN ERBGUTH: Can we go further and say government should not have the right to break encryption?

>> SUSANNE TARKOWSKI TEMPELHOF: No, no. Government should --

(Overlapping speakers)

-- to prevent encryption, to even look at encryption. (Off microphone)

>> JEAN GONIE: You had a good formula. Maybe we can use that -- maybe we can say the right to encryption -- I disagree, but that's fine -- could be offered together with privacy enhanced technologies.

>> AUDIENCE MEMBER: Could be offered with?

>> JEAN GONIE: Could be offered with privacy enhanced technologies to users.

>> SUSANNE TARKOWSKI TEMPELHOF: Absolutely not. This is not something the --

>> CEREN UNAL: Please just --

>> SUSANNE TARKOWSKI TEMPELHOF: This is something a government should not be able to comment on or be involved in.

>> CEREN UNAL: (Off microphone). Okay. Thank you, everyone.

(Session concluded at 1604 GET)


This text is based on live transcription. Communication Access Realtime Translation (CART), captioning, and/or live transcription are provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings. This text is not to be distributed or used in any way that may violate copyright law.