Crypto Wars 3.0 – can privacy, security and encryption co-exist? – WS 05 2021


29 June 2021 | 14:45-15:45 CEST | Studio Trieste | Video recording | Transcript
Consolidated programme 2021 overview / Day 1

Proposals: #11 #22 #47 #59

You are invited to become a member of the session Org Team! By joining an Org Team, you agree to your name and affiliation being published on the respective wiki page of the session for transparency. Please subscribe to the mailing list to join the Org Team and answer the email that will be sent to you requesting your subscription confirmation.

Session teaser

Providing law enforcement with a means to break encryption will always weaken the privacy of communications for everyone. However, law enforcement can still lawfully access encrypted information, for example by intercepting devices when the information is decrypted, or via many other solutions put forward by technical experts that respect the rule of law and fundamental rights; technical proposals to this effect have been circulated in Brussels in the past.

Session description

In general, lawmakers, civil society and the tech industry agree that the use of encryption is a necessary means of protecting fundamental rights and the digital security of citizens, governments, industry and society. However, the debate on the relationships between encryption, privacy, online harms and the needs of law enforcement agencies has become more topical once again.

Broadly speaking, civil society and tech companies have been in agreement that further encryption is necessary to protect the privacy of individuals, albeit with some differences of view in how this should be implemented, noting for example the current public skirmishing between Facebook and Apple. On the other hand, many law enforcement agencies and legislators have been advocating the need for access to data to fulfil their obligations to protect society from crime, terrorism and other harms.

Representatives from some of the key groups are being assembled to take part in the workshop and also to identify relevant background materials so that the discussion can be focused on the key points without spending time on context setting.

Some of the questions to be addressed during the workshop:

  • Should privacy of the individual take primacy over all other considerations? Is encryption erroneously being conflated with privacy?
  • Are tech companies hoping to use encryption to avoid having to comply with potentially arduous regulatory requirements that relate to content?
  • Are law enforcement agencies and others using unjustified scare tactics in an attempt to push lawmakers to break encryption?
  • Will methods that allow law enforcement agencies to break or circumvent encryption always weaken that encryption and ultimately help bad actors?
  • Is the argument moot anyway because a combination of AI and quantum computing will render most encryption ineffective?
  • If some kind of backdoor were to be built in, which countries should have access to it?

Format

Until .

Please try out new interactive formats. EuroDIG is about dialogue not about statements, presentations and speeches. Workshops should not be organised as a small plenary.

Further reading

Amnesty International: Encryption, a matter of human rights

EDRi: Encryption workarounds

EDRi: Position paper on encryption

European Council on Foreign Relations: No middle ground: Moving on from the crypto wars

Europol: Second report of the observatory function on encryption (2020)

Europol: First report of the observatory function on encryption (2019)

People

Until .

Please provide name and institution for all people you list here.

Focal Point

Focal Points take over the responsibility and lead of the session organisation. They work in close cooperation with the respective Subject Matter Expert (SME) and the EuroDIG Secretariat and are kindly requested to follow EuroDIG’s session principles

  • Andrew Campling
  • Diego Naranjo

Organising Team (Org Team) List Org Team members here as they sign up.

Subject Matter Experts (SMEs)

  • Tatiana Tropina
  • Polina Malaja
  • Jörn Erbguth

The Org Team is a group of people shaping the session. Org Teams are open and every interested individual can become a member by subscribing to the mailing list.

  • André Melancia
  • Vittorio Bertola, Open-Xchange
  • Diego Naranjo
  • Andrew Campling

Key Participants

  • Dan Sexton, CTO, The Internet Watch Foundation
  • Iverna McGowan, Director Europe Office, Centre for Democracy and Technology
Iverna McGowan is Director of CDT’s Europe Office, and an advocate for ensuring international human rights standards are at the core of law and policy related to technology. At CDT, Iverna leads the Brussels-based Europe team that works to put human rights and democracy at the center of the European Union and its member countries’ tech policy agendas.
Prior to joining CDT, Iverna served as a Senior Advisor to the UN Office of the High Commissioner for Human Rights.
  • Jan Ellermann, Senior Data Protection Specialist, Europol
Jan Ellermann works as Senior Specialist in Europol's Data Protection Function and provides operational data protection related guidance across the organisation. He holds a doctoral degree in law and has published various articles on data protection and information security related topics. Jan is a certified data protection auditor and has obtained a Master of Science degree in Forensic Computing and Cybercrime Investigation at the University College in Dublin (UCD).
  • Robin Wilton, Director Internet Trust, Internet Society
Robin Wilton is the Internet Society’s Director, Internet Trust. He is a specialist in online privacy and digital identity, with over 30 years’ experience in systems engineering, consulting and industry analyst roles.
Robin joined the Internet Society in 2012, and has represented it in the OECD’s Internet Technical Advisory Committee, the Council of Europe’s committee on privacy and data protection, and in numerous industry forums. He recently led a project to produce Personal Data Protection Guidelines for Africa in conjunction with the African Union Commission.
  • Dr Stephen Farrell, Trinity College Dublin (an active IETF participant, former IAB member)
  • Professor Ulrich Kelber, the German Federal Commissioner for Data Protection and Freedom of Information

Moderator

  • Tatiana Tropina

The moderator is the facilitator of the session at the event. Moderators are responsible for including the audience and encouraging a lively interaction among all session attendants. Please make sure the moderator takes a neutral role and can balance between all speakers. Please provide a short CV of the moderator of your session at the wiki, or a link to another source.

Remote Moderator

Trained remote moderators will be assigned on the spot by the EuroDIG secretariat to each session.

Reporter

Reporters will be assigned by the EuroDIG secretariat in cooperation with the Geneva Internet Platform. The Reporter takes notes during the session and formulates 3 (max. 5) bullet points at the end of each session that:

  • are summarised on a slide and presented to the audience at the end of each session
  • relate to the particular session and to European Internet governance policy
  • are forward looking and propose goals and activities that can be initiated after EuroDIG (recommendations)
  • are in (rough) consensus with the audience

Current discussion, conference calls, schedules and minutes

See the discussion tab on the upper left side of this page. Please use this page to publish:

  • dates for virtual meetings or coordination calls
  • short summary of calls or email exchange

Please be as open and transparent as possible in order to allow others to get involved and contact you. Use the wiki not only as the place to publish results but also to summarize the discussion process.

Messages

  • Trust in encrypted communication is necessary in a democratic society, but this trust would be undermined by giving authorities simple access to encrypted messages. This would be the end of free communication: it would not prevent criminals from encrypting their communication in an unbreakable way, but it would weaken everybody’s encryption. The solution thus cannot be worse than the problem.
  • The focus should shift from generic regulations on allowing law enforcement to break encryption to open discussions with law enforcement agencies on the requirements for when and how to do so.
  • False framings and false dichotomies around encryption, privacy, and security should be avoided. Terms and concepts need to be better specified to avoid misunderstanding and inconsistency around their use.
  • Better multistakeholder engagement to consider the consequences of technological advances is necessary. At the EU level, concrete actions need to be taken to ensure formal structures for such engagement and to overcome the existing divide of frameworks and scattered discussions and debates.

Find an independent report of the session from the Geneva Internet Platform Digital Watch Observatory at https://dig.watch/resources/crypto-wars-30-can-privacy-security-and-encryption-co-exist.

Video record

https://youtu.be/UPKluRbP77A?t=17687s

Transcript

Provided by: Caption First, Inc., P.O. Box 3066, Monument, CO 80132, Phone: +001-719-482-9835, www.captionfirst.com


This text, document, or file is based on live transcription. Communication Access Realtime Translation (CART), captioning, and/or live transcription are provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings. This text, document, or file is not to be distributed or used in any way that may violate copyright law.


>> TATIANA TROPINA: Hello, everyone. I hope that I’m in the right room, in Trieste. Our host is Markko Zennaro.

>> MARKKO ZENNARO: Yes, hello.

>> TATIANA TROPINA: Very nice to see your faces and very nice to see your names and very nice to see so many friends here. We are here for the workshop number – and I forgot the number of the workshop. I’m sorry. Five. Right. Thank you.

See how bad I am? I’m a lawyer by background.

But we are here for workshop number five on “Crypto Wars 3.0” – even the title of this makes me wonder, are these really wars? Can privacy, security and encryption coexist? I’m sure that we perhaps will not be able to dissect this issue, to unpack it fully, in the short amount of time that we have, but I hope we can kick off a very good discussion because we have a stellar panel in front of us. And we are joined, first of all, by Dr. Stephen Farrell from Trinity College Dublin, who is an active IETF participant and former member of the Internet Architecture Board.

Then we have Jan Ellerman who works for Europol as the senior data protection specialist. Hi, Jan. And looking at your profile picture on Zoom, I wonder if cats can be your post specialist, but this is something outside of our discussion.

And we are joined by Iverna McGowan, who is the Director of the Europe Office of the Center for Democracy and Technology. We have Professor Ulrich Kelber, the German Federal Commissioner for Data Protection and Freedom of Information. And then Dan Sexton, CTO of the Internet Watch Foundation, and Robin Wilton, who is the Director of Internet Trust at the Internet Society.

So we will start, of course, with a short intervention from our speakers. But before that, I would like to ask Markko to display a poll which we created for all. We do hope that the speakers and also those who attend can participate in this poll. Markko, please ping me when the results are ready to be shared. And the questions we want to hear from you about – by the way, our speakers, feel free to press the answers in the poll as well. Option A, encryption should not be a barrier to law enforcement actions. Option B, defending encryption from any backdoors is crucial to protect privacy. Option C, encryption is a crucial privacy technology that requires some limited revision to allow law enforcement action. And the last option, you haven’t thought about this. I hope this will not be the speakers.

And with this, let me start with Dr. Stephen Farrell. Stephen, what do you think? Can privacy security and encryption coexist? And the floor is yours. You have roughly three minutes.

And I cannot hear or see Stephen. You are online.

>> STEPHEN FARRELL: So the audio is fine?

>> TATIANA TROPINA: Yes, absolutely.

>> STEPHEN FARRELL: Yes, I guess I would say this is not crypto wars. In the ’90s, crypto was a kind of special product category. It kind of wasn’t that hugely important, but that’s actually not the case today. We don’t really have that many specialist crypto companies any more. Crypto is everywhere and freely available everywhere. Nearly every product uses it, nearly every open source package uses crypto, and programming languages use it: if you look at Go or Rust, you have built-in things like TLS. It’s not the case that we are dealing with the same scenario.
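Farrell’s point that crypto now ships inside ordinary programming languages can be illustrated with, for example, Python’s standard library (chosen here purely as an illustration): hashing and TLS are available with no specialist crypto product installed.

```python
# A minimal sketch: cryptographic building blocks ship in the standard
# library of mainstream languages (Python here; the Go and Rust ecosystems
# similarly bundle TLS), so "regulating crypto" means regulating code
# that every developer already has.
import hashlib
import ssl

# A modern hash function, one line, no third-party install.
digest = hashlib.sha256(b"hello, world").hexdigest()
print(len(digest))  # 64 hex characters

# A TLS client context with secure defaults, also from the stdlib.
ctx = ssl.create_default_context()
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # certificate checks on by default
```

The same holds in compiled languages: Go’s crypto/tls package, for instance, is part of its standard library, which is presumably the built-in TLS Farrell alludes to.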

Another way to think about it: if you think regulating crypto is a good thing to do, you are suggesting something equivalent to regulating the for loop in a programming language. To some that makes sense; at some level, it doesn’t make sense to regulate what is simply mathematics and programming.

So going forward from that, I think if you look at the title of the session, I mean, I would say it’s really not possible to have any privacy or security without strong crypto. So when I say strong crypto I mean well implemented, modern ciphers that have essentially no backdoors.

The idea of these weakened models where you put a backdoor into the crypto is not technically doable in any sensible way. The last concrete proposal for how to do that was essentially the Clipper proposal in the 1990s from the Clinton administration. That didn’t work.

I think the same is true today. We have heard some halfhearted things that might be proposed, might not be, but without detail it’s like squaring the circle. It’s hard enough to implement crypto well in a good application; building in a weakness of any kind is just asking for big trouble.

So I think what we should be discussing is what requirements really exist. Law enforcement have real requirements, but historically they have said: we want to be able to record anything, decipher anything, see it all. That’s not clearly valid anymore, right? We have people using Internet protocols to interact with heart monitors; there’s all sorts of telemetry flying around the world. It’s not clear that there’s any real need for any law enforcement agency to ever interact with that traffic.

If there’s some traffic that law enforcement should not be able to get at, that might lead to an interesting requirements discussion as to what really needs to be got at and how you go about getting at it. And the last point I would make, to be brief, concerns treating crypto as the technology that is the target of what needs to be regulated or broken or whatever you want to call it.

I think that’s the wrong way to think about it. Instead, I think what we should be thinking about are more important things, like the large commercial entities breaching privacy in other ways. It is a much better use of our time to look at that situation than to stumble into the unknown by trying to break things that are necessary everywhere. And they are literally everywhere: every open source package, every programming language, every tool that’s developed these days uses and requires unbroken crypto.

I could keep talking if you like or hand over to the next person.

>> TATIANA TROPINA: Thank you very much, Stephen. We’ll see if people have questions for you. You pointed to a lot of things, and we probably cannot discuss all of them during this session. But I would like to hand it over to the next discussant, Jan. You work for Europol. You are the law enforcement.

We would like to hear from you: how much data is too much? How much do you want to intercept? How much does this actually break your ability to investigate, and what is to be done about it?

>> JAN ELLERMANN: Yeah. Thank you very much, Tatiana, for the floor. And good afternoon, everyone. Indeed, my name is Jan Ellermann and I work as a senior specialist in Europol’s Data Protection Function, and I joined the organisation back in 2007.

I’m advising mainly on what we call operational data protection, so advising our operations department on data protection matters. And, yeah, thank you for having me for this important debate. A special thanks to Geovany from IWIP. I’m honored to be here and share this session with very honorable co-speakers, and there are many participants joining us here today who are bringing in their own expertise, which is great.

So my understanding is that this is not a normal panel discussion but rather a workshop and in this context, let me say that I feel that sometimes this encryption debate is led in a very emotional way.

If we look at this panel, for instance, we read “crypto wars”. That sounds cruel. As law enforcement, we talk about “going dark”. So dark is maybe a good keyword here: our whole debate is a little bit too much in terms of black and white, as if there’s no middle ground, a kind of zero-sum game.

And having said this, encryption is binary if you look at it from a computing perspective: there are, so far, only ones and zeros, and until we have quantum computing there’s nothing in between. Maybe this is a bit illustrative of the debate we are having. On the other hand, at Europol, within the Data Protection Function and beyond, we firmly believe in enhancing freedom and security.

So, to provide you with one example: in May of 2016, we organized the first event for EDEN; the next event will be in Rome on the 18th and 19th of October, and if you would like to join us there, you are more than welcome. Back in the day, we had this slightly catchy header: privacy in the digital age of encryption and anonymity online.

As an outcome of this event, we had a joint statement by the EU cybersecurity agency ENISA and Europol on lawful criminal investigation that respects 21st century data protection. Our core message is that we do not believe in a concept of backdoors, simply because ultimately we would be shooting ourselves in the foot. Backdoors would not only be used by the good guys – and with good guys I obviously refer to us, even though I’m a bit biased – but, as a matter of fact, criminals would inevitably use them too. And, as Stephen mentioned, there’s the importance of encryption for those who need to survive repressive regimes and, certainly not least, the importance of encryption as a privacy-preserving technology, which is, if you want, a value in itself in any democratic society.

However, and here’s the but: we certainly need to be conscious that criminals take advantage of encryption in order to pursue their very own goals. One of the questions to be addressed according to the workshop wiki is: should privacy of the individual take primacy over all other considerations? You will probably not be very surprised that, working in law enforcement, my answer can only be a clear no. I mean, of course, privacy does not prevail over everything else. Of course, we need to find ways to fight serious crime and terrorism, despite the fact that we also need encryption.

So in other words, we need to be conscious of the damage caused by criminals, and when we bring forward this argument, there’s sometimes a response of “okay, now they are using scare tactics again”, which I find a bit unfair maybe, but we can talk about it later if you want.

But for me, maximizing your own freedoms by claiming that privacy always prevails is a bit like accepting the infringement of the rights of victims out there as collateral damage of your own libertarian lifestyle. And at least for me, this is a bit too simplistic. We have a joint responsibility to do better than that, I believe. In this context, a lot has been said about balancing rights.

So I don’t like balancing that much, to be honest – this idea that if you want more security, you need to sacrifice your freedoms, or if you want more freedoms, you need to sacrifice security. Again, we can probably do better on many occasions. In concrete terms, again, I don’t believe in backdoors, but I do believe that law enforcement needs a very clear mandate to break encryption, as you call it, in individual and justified cases.

And I know that some of that is then framed again as government hacking or hacking back, and, again, already the terminology seems to imply that law enforcement is inclined to use illegal techniques. That’s an issue.

I also believe that the famous argument that a suspect can under no circumstances be compelled to turn over a password or unlock his or her phone needs to be carefully assessed on a case-by-case basis as well.

The right not to self-incriminate is important, no doubt, but let’s not forget that we also generally accept that an individual can be forced to provide a blood sample in case he or she caused a fatal accident under the influence of alcohol or drugs.

Can a suspect be forced to place his or her thumb on the device because it might contain hints of where abused children are? A Dutch court, in one particular case, said yes, that’s proportionate, you can go ahead. Other courts might have a different view. So these are areas where I believe there’s room for establishing more common ground, and I would once again like to thank you for granting me an opportunity here to also provide a law enforcement perspective. I will leave it at that and I’m looking forward to the discussion.

>> TATIANA TROPINA: Thank you, Jan. Before I go to another speaker, I want to ask you about the terminology. Lawful hacking, governmental hacking. Do you prefer goodware? I know that –

>> JAN ELLERMANN: Do I prefer?

>> TATIANA TROPINA: I know that there are some using “goodware”, like good remote forensic software. Do you prefer this terminology – you know, “goodware” instead of malware? I don’t know if you are aware of that case. But this has been like terminology –

>> JAN ELLERMANN: Yeah, yeah.

>> TATIANA TROPINA: Like through the discourse, but I think the essence here stays the same. So thanks, Jan.

And with this, so we brought here the self-incrimination. We brought privacy and security and children being endangered.

>> JAN ELLERMANN: Scare tactics.

>> TATIANA TROPINA: Yes, but we agree that there are no crypto wars.

And with this, I would like to go to our next discussant which will be Iverna. The floor is yours.

>> IVERNA McGOWAN: Good afternoon, everybody. And thank you very much for the invitation to speak today.

To the previous speaker, Jan: I was very encouraged, as a human rights lawyer myself, to see that you are being watched over by the late, great Ruth Bader Ginsburg in the background. I hope that’s an unwavering protection, even in law enforcement.

As I looked at the questions – and I have dealt for a very long time with these questions coming up in human rights and democracy – I do think we should ask ourselves: why are we here again? One of the questions I ask myself is whether the framing in and of itself is problematic. This question that we ask about whether the privacy of individuals trumps everything else actually undermines the point that it’s not only privacy. Privacy is important, but privacy is also a gateway to so many other rights in a functioning democracy.

Just to give you a concrete example: I worked in the past for many years at Amnesty International, and I saw so many real-life cases whereby people on the front line of defending democracy and human rights ended up, you know, being imprisoned or incarcerated by governments just for using encryption as a technology and a tool.

So I think we need to challenge ourselves to avoid some of these false dichotomies; it’s much more complex than that.

Another really important point, and I’m glad that the previous speakers have already gone into more detail on it: we did see some leaked proposals, which some of our technologists at the Center for Democracy and Technology examined in great detail, and unfortunately what they amounted to was backdoors and weakening encryption. It’s unfortunate to see that in 2021, with the widespread use of and reliance on encryption, we are still ceding these points. I’m heartened by the statement from Europol; it’s something that the Center for Democracy and Technology would strongly concur with. And we really need to look at this each time – we can’t look at these questions in a vacuum. It’s tempting to use the emotional arguments: is it a ticking time bomb with a terrorist? Is it a child’s rights versus other rights? But the reality is that the myriad measures we need must be assessed on a case-by-case basis. And it’s always disproportionate to undermine an entire technology in the pursuit of one goal.

I think that’s something that we as a community need to take forward and reflect on, rather than these constant false dichotomies or binary discussions, when I think at this stage we all fundamentally agree, actually, that backdoors are a very bad idea and undermining encryption is detrimental to our democracies.

Maybe one last point that I think is very important, something that struck me during the year for those of you who watched it, and without getting into the discussions about what WhatsApp did or did not do: there was a huge public response, you know, a mass change to other technologies like Signal and others. And I actually think that as a community we also have to challenge ourselves – especially in Brussels or other places – to understand just how strongly broader society in Europe and elsewhere values that ability to communicate in encrypted ways, and not forget the political element as we consider other options that are on the table.

I think we need to be mindful of that as a strong European value and something that the public is paying attention to. It may seem like a niche issue, but any legislation that actually comes forward and would undermine encryption is likely to be faced with a very strong public reaction in that regard.

So I’ve got lots more to say but I will keep it to my three minutes for now and I look forward to the discussions.

>> TATIANA TROPINA: Thank you very much, Iverna. I see there is a little discussion on the chat already. Please do keep it going and please do intervene, all of you when we will have a few more people to go still.

I want to ask Markko: are there results that are worthy of displaying? If there are only two answers, then let’s not. If there are more, maybe we want to think about displaying the slide with the first one?

Most of the people say defending encryption from any backdoors is crucial to protect privacy.

And actually, whatever stakeholder group our speakers belong to, they largely agree on this. But I also think that one of the cores of this debate was first highlighted by Stephen: yes, there should be no backdoors, but what about the requirements for law enforcement access, which were highlighted by Jan, for example around self-incrimination and some other things?

And there is the debate in the wider society, which was highlighted by Iverna. With that, I will go to our next discussant, which is Professor Ulrich Kelber. What can you say about this as a federal commissioner?

>> ULRICH KELBER: Yes, good afternoon. It’s good to be here with you, and greetings from the city of Bonn. From my point of view, data protection and the protection of the confidentiality of communication are not possible without data security, and data security is not possible without encryption.

To say it right up front: the same way there’s no such thing as being a bit pregnant, there’s no such thing as a bit of encryption. So effective encryption of communication means end-to-end encryption with authentication of all communication partners.

We regularly talk about backdoors, and I see all the pathological cases, as people from the law enforcement authorities see the pathological cases on their side too. But experience teaches that backdoors of any kind are misused sooner or later by governments or criminals.

In the end, service providers would be obliged to provide information to security and law enforcement agencies by abandoning or faking secure encryption and selecting the relevant information unnoticed, via ghost protocols or key escrow. I consider these measures to be dangerous, as they would considerably weaken the online communication infrastructure of our society, create new attack scenarios for criminals and cyber war, and incidentally also create the conditions for broad surveillance of our population.

And not least because of that, confidentiality of communication is a constitutionally guaranteed fundamental right. In the 21st century, people must be able to trust the IT infrastructure, especially digital means of communication, to a large extent in order to be able to exercise their fundamental rights. This trust would be permanently destroyed by built-in access for third parties.

We need to talk about proper proportionality. Criminals will avoid weakened encryption, and in the end the question is whether such restricted applications would be used at all. This would be the end of free communication as we know it, and it would still not stop hard-core criminals. So in many ways I see a gross disproportionality here. I am therefore very curious to find out what will be added in our exchange today, but one thing is very clear to me: encryption is nothing less than an important means to protect fundamental rights and the basis of a free, democratic society in a digitalized world.

>> TATIANA TROPINA: Thank you very much, Ulrich. I want to apologize for mispronouncing your name.

Big massive apologies. But now we will remember it forever.

So in addition to crypto wars, we have now even heard cyber war coming up. I want to note that yet again, all our discussants and, from what I see in the chat, every participant agree that there should be no backdoor to encryption.

And to our next two discussants, I have to warn you: if you think there should be a backdoor, please speak up, but otherwise I’m banning the word backdoor for the next couple of minutes. And we will go to Dan Sexton, who is from the Internet Watch Foundation.

>> DAN SEXTON: Thank you. It’s good that you banned this word, because that’s where I was going to start.

>> TATIANA TROPINA: Okay. I refuse to use my superpower and, of course, will not restrict what you are going to speak about.

>> DAN SEXTON: I will just say, the Internet Watch Foundation, we are a charity and our mission is to eliminate child sexual abuse online. So we have a very, very specific, focused angle on encryption. Jan was talking about backdoors for the good guys. I was an IT manager for a small team, and I remember EternalBlue being held for the good guys, and then that was hacked, leaked and has caused chaos across the world and continues to do so. We are still being hit by that again and again. So I very much agree that backdoors, whatever the intentions, will be misused.

For my focus and the Internet Watch Foundation’s, encryption is not in itself a problem. It’s very much a protection of privacy, and it’s incredibly important; I’m fully supportive of that. Our focus is the elimination of child abuse. And this is why I was glad to have the opportunity to join in and talk to my colleagues today and the other participants. How do we stop child abuse material getting into these encrypted environments? How do we find it, block it, or remove it in those environments? How do we stop it from getting out and circulating?

There’s a lot of questions. We have a team of analysts here who go on to the Internet and look for this material. They find this material. They block it.

They take it down. We take fingerprints, digital fingerprints, and that enables the use of automated scanning tools. We stop people being exposed to this content; I’m sure nobody would like to be exposed to it. We stop children from being exploited, and fundamentally we stop these children from being revictimized again and again, as some of this content has been circulating for decades and decades. The question is how we stop that from happening. Encryption is here, and it has enabled huge, huge advances in privacy and democracy across the world.
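The fingerprinting workflow described above is commonly implemented as hash matching: known images are reduced to digests, and uploads are compared against that list. The sketch below is a minimal illustration using exact cryptographic hashes; real deployments such as the IWF’s use perceptual hashes (e.g. PhotoDNA) that survive re-encoding, and all names and the sample digest here are illustrative, not the IWF’s actual system.

```python
import hashlib

# Hypothetical hash list of known material. In practice such lists are
# curated by hotlines and distributed to services as perceptual hashes.
KNOWN_HASHES = {
    # SHA-256 digest of the bytes b"test", standing in for a real entry.
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(data: bytes) -> str:
    """Return the SHA-256 digest of the content as a hex string."""
    return hashlib.sha256(data).hexdigest()

def is_known(data: bytes) -> bool:
    """Check an upload against the known-hash list (exact match only)."""
    return fingerprint(data) in KNOWN_HASHES

print(is_known(b"test"))        # True: digest is on the list
print(is_known(b"other data"))  # False: unknown content
```

Note the limitation this toy version makes visible: changing a single byte changes the digest entirely, which is why deployed scanning relies on perceptual hashing, and why proposals to run such matching on end-to-end encrypted services (client-side scanning) are contested later in this session.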

Stephen asked about what content needs to be gotten at. Child sexual abuse material is what needs to be gotten at. I haven’t answered these questions: how do we get at content in a way that doesn’t compromise privacy and doesn’t compromise people’s rights? There are theoretical solutions, and I’m interested in other people’s views on those. Thank you, Tatiana.

>> TATIANA TROPINA: Thank you, Dan. I think we now have a discussion on the chat. So far I have heard three options here. Before I move to the next and last discussant, I will just list them. The first word is backdoor, and we all agree that it shouldn’t happen. Okay. Then there is the so-called goodware, governmental hacking. I see the discussion going on: what is worse, what is better? We can think about the Exodus scandal in Italy, when the spyware, the remote forensic software used by law enforcement, turned out to be greatly compromised and there were no proper security measures in place.

And we also talked, or rather Jan raised this issue, about the provision of the encryption key. So the issue of self-incrimination, and in which instances this is going to be allowed.

Please do let me know if I missed something among these three practical solutions I can think of. But I think this is what has been highlighted, and it seems to me from these discussions, from various interventions, that choosing one of the three is like choosing among evils and picking the lesser evil.

I’m listing these solutions only for further discussion. And I’m moving to our next discussant, Robin. You have been in this debate for how long? Probably a good decade, no less, right? So what are your takeaways from this decade, or two or three decades, I don’t know. You tell us.

>> ROBIN WILTON: Are you calling me old?

>> TATIANA TROPINA: Never! But you might have gotten involved when you were five.

>> ROBIN WILTON: That’s true. Okay, so in 1985, I joined IBM as a systems engineer on banking products, which, of course, incorporated encryption, one of the few technologies under regulatory constraint at the time.

So basically since about 1987 onwards I have been working with encryption products.

I remember one example of having to go to a meeting with a cabinet office in Paris in order to get ministerial approval for the import into France of network controllers which had been specifically designed to encrypt banking traffic. And after we described what the product did, the civil servant who was responsible for making the decision said: yes, that’s absolutely no problem. You can import these devices and you can use them in the banking network. There’s only one small condition: the master keys have to be lodged with the interior ministry. So it was fine, as long as we accepted key escrow. I still bear the scars of Crypto Wars One.

In fact, I left IBM in 1997 to go to work for a start-up in the UK that was writing its own crypto libraries in C and implementing SSL version 3, as it was then, because of the damaging effect that restrictions on crypto were having on business.

That was during the dot-com bubble. It was worth starting a company to try to find ways to deliver reliable encryption to people outside of the banking industry.

Sorry, that was a digression, but that was not where I intended to go.

The question for this session was: can privacy, security and encryption coexist? And I think I can cram most of this into one sentence. We live in digital nations, so there’s a national security element to this. We live in information societies, so there is a societal well-being and fundamental rights dimension to it. And we live in data-driven economies, so there’s an economic impact perspective to this too. And all of these happen in globally connected contexts.

So my very simple position is that privacy, security and encryption must coexist. But that has to be more than just taking the words and shoving them into the same sentence.

A recent example of that is the one from the European Commission, which published a statement about achieving security through encryption and security despite encryption. And that’s really extremely unhelpful, because it uses the word “security” in the two halves of the sentence as if it meant the same thing in both. And it can’t. So we have to be more nuanced about it. We have to be more thoughtful about how we use these words. We have to define our words carefully, use them consistently, and complete the sentence.

Security of whom? Against what threat? Privacy from whom? Confidentiality of what? Unless we start completing the sentences we can’t actually have a constructive conversation about this.

Certainly sound bites like that do not translate into workable policy, viable technology, beneficial societal and economic impact, and sustainable governance.

So we need to get past that sound bite approach, and as Iverna said, we need to get past the false framings and the false dichotomies, you must have either this or that.

So I think we have to be very wary of policies that claim to fix societal problems just by regulating technology. In purely technical terms that is an absurdity and, as one of the other discussants has pointed out, trying to ban the use of strong encryption by law-abiding people in no way prevents those who don’t regard the law as an obstacle from finding and using strong encryption of their own.

Now, another policy approach I think we need to be very wary of is those policies that don’t mention encryption at all. A current example is the UK’s draft Online Safety Bill. It used to be called the Online Harms Bill, which was a far more accurate description; they have renamed it the Online Safety Bill. It doesn’t mention encryption, but it places such liabilities on service providers that some of them will inevitably withdraw encrypted services rather than risk being fined or jailed for someone else’s misconduct.

It’s a little bit as if the government said to supermarkets: if you sell a kitchen knife and it’s later used in a knife crime, we will hold you responsible. If that happened, overnight kitchen knives would disappear from supermarket shelves, and nothing would have been done about the behavior of the person wielding the knife.

Policies like that would leave everyone less secure. I will round off by quoting two people who have been involved in safeguarding the public from online and physical harm. The first one is Robert Hannigan in the UK, who said it is not a good idea to weaken security for everybody in order to tackle a minority.

He is often quoted as saying that encryption is overwhelmingly a good thing, but what he was actually saying is that encryption in secure messaging services is overwhelmingly a good thing.

And the second one is much shorter, and it’s from a representative of ANSSI, the French information security agency: the solution must not be worse than the problem.

>> TATIANA TROPINA: Thank you very much, Robin. And I actually think that this goes in line with what Ulrich Kelber posted earlier: nothing will stop hard-core criminals from encrypting their communication in an unbreakable way, but it will weaken everybody’s encryption.

And while I don’t want to make this discussion revolve around the UK: if anybody from the UK, and especially the UK government, because I see them among the participants, wants to comment on what Robin said about online harms and online safety, please feel free to.

As we reached the end of the first speakers’ statements, I would like to open the door for anybody who wants to contribute. Please do raise your hand and I do hope that I will see the queue forming because I see a lot of interesting online discussion. Vittorio.

>> VITTORIO BERTOLA: I have been a moderator, and at some point you hope for someone to raise their hand and take the ball off you. No, I just wanted to make a couple of considerations and see the reaction from some of the panelists.

So one thing I keep hearing is that several people keep putting perfection as a requirement for a solution, and I think that leads to nowhere. There is no possible solution that is guaranteed never to be abused, and there’s no solution that will prevent a motivated criminal with good resources from still hiding their conversations.

But the point is whether we can find middle grounds and partial solutions that are better than nothing. Of course, if they bring more risks than benefits, then we should not adopt them. But there might be at least something that helps law enforcement in many cases without being this perfect solution that will never exist.

And if I may, I think we should stop focusing on encryption only. It would be easier to find a reasonable solution if we factor in the mechanisms for access and the role of architecture in internet services. It’s important to look for access opportunities at the edge of the connections rather than within the encryption, and that could already be enough for many, many access needs.

In the end, this is what was done with the telephone. You cannot run a telephone network without lawful interception access points, at least in my country, and no one finds that weird or an unacceptable risk. You don’t get technological guarantees; you have policy guarantees instead. But this needs to be done service by service.

Perhaps rather than looking for a general solution, we should look at services one by one. Let’s talk about instant messaging: how do we build access to instant messaging applications? There’s a mix of policy and technology requirements that can help with that, including screening opportunities for child protection. Maybe that would be a more fruitful approach than just looking for a way to break encryption without breaking encryption.

>> TATIANA TROPINA: Thank you, Vittorio. Andrew, you’re next.

>> ANDREW CAMPLING: I wanted to bring up something which I raised on the chat, which provoked a comment. I have been critical of the tech sector, and it’s always awkward to generalize, but the tech sector often objects to particular steps being taken, particularly around law enforcement it seems, without putting forward alternatives that might work.

Now, Robin quite rightly in the chat gave some examples, if I can just find that point on my screen, of areas where tech companies have been helpful, at least individual tech companies rather than necessarily the sector as a whole.

He rightly highlighted things like client-side scanning, metadata analysis and forensic analysis, all of which he rightly says are helpful to law enforcement. But that reminded me that, conversely, some of the steps being taken in the standards community would make things like filtering technology a lot less useful.

In some cases, some of the currently planned enhancements to internet standards are actually trying to circumvent content filtering technology.

And what that made me realize is that what we often lack in some of these cases is proper multi-stakeholder engagement to consider in the round what the consequences of these potential advances in the technologies are, and whether those consequences are worth suffering because the advances bring greater benefits. So my point would be that there needs to be better multi-stakeholder engagement; that’s something that, certainly in the standards arena for example, doesn’t really happen very well or very often, and it might get us progress in this area amongst many others. Thank you.

>> TATIANA TROPINA: Thank you, Andrew. Bearing in mind that speakers and discussants sometimes addressed each other’s points as they were talking, please do raise your hand. We don’t have a lot of time, but we have another ten minutes or so for some discussion. So I would urge both speakers and the audience to raise your hand if you have something to say. Robin, over to you, briefly.

>> ROBIN WILTON: Sure. Thanks Tatiana. And I will keep it brief.

Like Stephen Farrell, I am a long-time participant in several standards definition organizations, and I entirely dispute the claim that they are closed to multi-stakeholder participation. We see participation on encryption from government agencies, from tech companies, from civil society and so on.

Where I do think that multi-stakeholder engagement is lacking is in places like the EU Internet Forum, not to be confused with the European Internet Forum. The EU Internet Forum is a closed-door session, primarily on content filtering and client-side scanning, to which civil society is not allowed access. I have asked for permission to attend the meetings: no. Permission to see the proceedings: no. Permission to see the reports: no.

I agree there’s a credibility gap on multi-stakeholder engagement, but I don’t think it lies at the door of the standards bodies.

>> TATIANA TROPINA: Thank you very much, Robin. I give the floor to Andre. After Andre, I would like to ask Iverna to comment on multi-stakeholder involvement. But first, Andre, the floor is yours.

>> ANDRE: Okay, so my name is Andre Melancia, from Portugal.

In terms of all the discussions we have seen today, some are legal, and we talked about government hacking and things like that, but I want to limit the range of discussion to technical scenarios, because if we consider the technical scenarios, we can limit this discussion to whatever is really possible. So let’s consider a few things.

You mentioned a word starting with “b.” Let’s call it alternative access. If you put in one of those alternative accesses, someone will always find a way to abuse it, and chances are, based on previous history, that it will be the criminals. Someone also mentioned before the export limitations that the United States had for cyber tools, and that’s not going to work either, because people will come up with an alternative way to do whatever they need to do to pass the message and engage in criminal activities. That will happen no matter how you limit it. There’s always a bigger fish, as they usually say.

Now, whatever kind of encryption you use, there’s always going to be a bigger fish called quantum, and that will eventually hack anything. And guess what: governments are not actually using it as much as civil society is using it right now. It is available in a lot of different clouds, IBM, Microsoft, et cetera. Anyone can use it; you just have to use those tools and take advantage of that.

Now, what I really wanted to mention as well, besides the technical limitations that we will have, which actually limit the range of what we can talk about in terms of legality, morality, et cetera, is why these things are actually happening.

And the reason we have terrorists and criminals is that, in the case of criminals, in some places, let’s say Nigeria, we have a lot of poverty. In the case of terrorism, it means that someone is unhappy with someone else’s country, and there are a lot of reasons why that happens. The easiest way to solve that problem (easiest, well, it’s not that easy) is to try to find the original causes and come up with peaceful solutions in the case of terrorism and, in the case of poverty, with ways to guarantee equality. This is a big topic. We will need encryption for a long time, but we also need to focus on the underlying reasons why these things happen. Okay?

Thank you for the opportunity.

>> TATIANA TROPINA: Thank you very much, Andre. Until you mentioned that this is a big topic, I was going to call you the optimist of this session.

Ulrich, I will come to you in a minute. I wanted to ask Iverna if she wants to talk about civil society’s perspective on these debates.

>> IVERNA McGOWAN: Sure. The Center for Democracy and Technology is well known for pulling different stakeholders together around problems. There are two elements I would like to highlight. Multi-stakeholder is a buzzword, but at the European Union level we are definitely not seeing formal structures to ensure multi-stakeholder dialogue around these issues. It sometimes feels like divide and rule, where there are just lateral conversations between civil society and other groups separately, which is not genuine dialogue. We do need to challenge ourselves, I believe, on how we can have a genuine debate and discussion.

Because obviously it’s about understanding how the technologies work, and sometimes the companies are better placed to explain that, but talking about the democratic and citizen elements should come from civil society. That’s really important.

A second point, and this is a much broader societal challenge, is that at CDT we are dealing with those who bear the adverse impact of digital policy gone wrong, whether it’s journalists and human rights defenders, who rely on this technology a lot, or marginalized groups that AI can discriminate against, and it’s those groups in particular that are excluded from policy-making decisions despite facing the most adverse impact. I think it’s really important that we challenge ourselves, and we need to be really wary when it’s just company and government in a room together, because there you have the really powerful actors without any checks and balances from watchdogs or civil society or others in the room.

You know, people were talking about client-side scanning as if it were a totally harmless solution, whereas across civil society we would have serious human rights concerns with client-side scanning, in the same way that we would with backdoors in encryption. It’s important to challenge ourselves to move from a buzzword to actual practice on the laws that really matter and will ultimately find their way into different jurisdictions.

>> TATIANA TROPINA: We have seven to ten minutes left, correct me if I’m wrong, dear host. What I will do right now is go to Ulrich and then Jan, and then perhaps I will give the floor to Stephen as another speaker who didn’t have a chance to state his position at the end of the session.

Meanwhile, I would like the host to project the poll again to see if anything changed during the session in our opinions. In any case, Ulrich, the floor is yours.

>> ULRICH KELBER: I would like to come back to that “going dark” argument. It fails in two ways from my point of view. First, there’s no one-to-one transfer from analog communication to encrypted digital communication: where analog communication was a small line, we are now in a digital communication cloud. So there are many more things to look at, much more information, not only from the different channels but also from the metadata of the communication, which could be investigated if the resources were there, and skilled people to look at that data.

That’s the second point.

So law enforcement authorities and politicians are asking for additional possibilities to investigate everyday and everybody’s communication, and they have got them for almost two decades, every year, in several parts of the world. And we have to point out that in the most common and most well-known cases, for example terrorist or child abuse cases, a lot of things were already known to the law enforcement authorities. There was a lack of communication. There was a lack of resources. There was a lack of skilled people to deal with the data that was known, and it makes no sense to just put another heap of data on those few people who are experienced and skilled enough to work with it. So the real debate is not about weakening everybody’s communication and the very basics of our society, but about giving law enforcement the resources and the skills so they can do their work without undermining our freedom rights.

>> TATIANA TROPINA: Thank you very much, Ulrich.

And I have just got my superpower powered up, and we can eat a bit into the coffee break, because the discussion deserves it. It doesn’t mean that we will stay here for another 40 minutes, no. I will give a proper word to Jan, Dan and Stephen. A proper word doesn’t mean ten minutes.

Jan, it’s very much appreciated that you, as law enforcement, joined this debate. Sometimes it can be a hot seat, but we would like to hear from you from this hot seat.

>> JAN ELLERMANN: So far, you have all behaved very politely towards me. I’m very happy with that.

And regarding Ulrich’s statement on resources, I think he’s pushing at an open door with law enforcement. I’m confident to say that.

Ulrich Kelber also addressed the issue of trust and said breaking encryption would break the trust of citizens. I subscribe to that statement. I also saw in the chat some discussion about the topic of infiltration, so let me briefly comment on that. In law enforcement, we have run a number of undercover operations with the specific aim of shattering the trust of criminals in the tools they are using to pursue their aims.

So in July of 2020, there was EncroChat, which you may have heard about. That was an encrypted platform which was dismantled by French and Dutch law enforcement. Here we talk about 120 million messages amongst criminals who felt safe and were quite openly sharing their plans. And our plan was to look at these masses of data in the quickest possible way. Here, again, we ran into data protection challenges: in order to cope with these masses of information, you can’t do it just by human analysis, so machine learning and AI form part of the discussion here. We tried to identify threats to life as quickly as we could. We are in ongoing conversations with the data protection supervisory authority in order to make that happen in a way which is fully compliant with fundamental rights and in particular the right to data protection.

I could refer to other operations, such as the takedown of Sky ECC in March 2021. That was a similar case, but with much higher amounts of data.

And when that one was taken down, talking about infiltration and what is left in order to build our cases, there was an amazing operation which was run by the FBI. The FBI created a honeypot, if you want: they set up their own encrypted network, and in the end they used undercover agents to promote this network, particularly in organized crime groups. Ultimately, they ended up with 12,000 encrypted devices used by 300 criminal syndicates operating in 100 countries, including motorcycle gangs and international drug traffickers.

And here we speak about 27 million messages which were obtained that way. Just to give a sense of proportion: 800 arrests, over 8 tons of cocaine, 22 tons of cannabis, and so on and so forth. If I look at these figures and at what is happening there, I feel I understand colleagues who are confronted with the issue of encryption and are desperate about what we see going on out there. And then maybe moving on, because Dan earlier made a brief remark that it’s not just about encryption, we need to think outside the box: that involves hashing. We had a heated debate about whether the ePrivacy rules could be adjusted in order to allow for the continuation of the automatic scanning of messages to identify child sexual abuse material, and that has shown to be very difficult too. I won’t go into too much detail, because Dan will be much more skilled to run you through that.

The bottom line is, I very much welcomed the debate we had today, but as a society we also need solutions. We can’t just say that a backdoor to encryption is not an option and leave law enforcement without the means to investigate.

We need a legal framework in which we can tackle all forms of organized crime in a way which respects fundamental rights.

With that, I stop. Thank you.

>> TATIANA TROPINA: Thank you very much, Jan.

I’m going to give the floor to Nigel Hickson and then Dan, and I believe for the reporter it’s a huge task, because there is a lot of content.

Nigel Hickson, take the hot seat over from Europol and tell us something. If you are still here, Nigel. Yes, I can see you.

(No audio)

>> NIGEL HICKSON: These new Zoom platforms are far too complicated. I only just got used to it. Anyway, Nigel Hickson; I work for the UK Department for Digital, Culture, Media and Sport. I don’t work on the online harms or online safety policies and legislation, but I would say that the team in DCMS that works on this has had a lot of input ahead of the drafting of the legislation, and I know they would welcome any further input into the policy. There will be a chance to discuss it, I suspect, at various venues, not least the UK IGF. So I’m giving a plug for the UK IGF later in October, when there will be a session on the online safety legislation. I just wanted to say two things on encryption. You might think: what is this idiot talking about, he’s not responsible for encryption? I’m not.

I work on Internet Governance. Back in the day, if you worked on internet issues in the ’90s, you worked on everything, but thankfully now I don’t. I think there are two things to say on encryption. You have to learn from the past, to an extent. I was involved in the key escrow policies of ’99 for the UK government; we asked people to put keys in escrow. I think there are lessons there: lessons about going against the grain of technological development, and about the absolute need to take with you the people in the technology sector and the business sector who played such an important role in that debate.

So I think I will finish there, but this has been an excellent session.

>> TATIANA TROPINA: Thank you very much, Nigel, and for the shameless plug. Now to our final interventions: Dan and Stephen.

>> DAN SEXTON: Hi, Tatiana. I know that law enforcement’s scope on this is massive, covering many different harms and different things going on there, but the debate is often about the offenders, about finding the bad guys and the ethics of doing so. I think we need to remember and not forget that we are talking about the sexual abuse of children. These are children, real people, that are out there being victimized, and we have to find ways of stopping those images getting into these systems, encrypted or otherwise. We have to be able to say: that’s an image of a child being sexually abused, can you take it off?

These children can’t just be collateral damage because of other reasons.

This is not a binary choice; it should be iterative, and it should be focused on what we can do and what we should be doing. What we can do is protect the most vulnerable, the people who can’t protect themselves. We have a responsibility to do that, and we should be looking at solutions that do it in a private way, in a way that doesn’t compromise other things. You can do that without a backdoor: you don’t need to read someone’s entire text messages to stop an image of a child being transmitted. There are solutions out there to do that, and I think we, and our colleagues in industry who have the resources to do so, should put more effort into them. The risk and the concern from our side, with so much being pushed onto encryption for very, very good reasons, is that the team working on encryption is significantly bigger than the team looking at ways to stop child abuse material from getting into those systems. We need to look at both of those things, not one at the expense of the other.

>> TATIANA TROPINA: Thank you. I got a message that we have to end in six minutes. Stephen, if you could wrap it up in a sort of tweet, that would be great, because then I will bring in the reporter. Stephen, the floor is yours, albeit briefly. I’m sorry about that.

>> STEPHEN FARRELL: I will be brief. I guess, reacting to some of the discussion, I think that a backdoor is not really about the crypto.

Typically, we layer services upon services: down at the bottom there’s a crypto algorithm, but you are using TLS or an instant messaging system and building something else on top of it.

So there are lots of side effects if you try to do the access at the service level.
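The layering point can be illustrated with a toy sketch. Assume a minimal two-layer stack in which an application layer seals a payload with an HMAC before handing it to a lower transport layer: any access mechanism that modifies the bytes at the lower “service level” is detected when the layer above verifies. All names and the key here are illustrative; this is not TLS, just a demonstration of why interventions at one layer have side effects on the layers built on top of it.

```python
import hashlib
import hmac

APP_KEY = b"application-layer-key"  # illustrative key, not a real secret

def app_seal(payload: bytes) -> bytes:
    """Application layer: append a 32-byte HMAC-SHA256 tag over the payload."""
    tag = hmac.new(APP_KEY, payload, hashlib.sha256).digest()
    return payload + tag

def app_open(frame: bytes) -> bytes:
    """Verify and strip the tag; raise if a lower layer altered the payload."""
    payload, tag = frame[:-32], frame[-32:]
    expected = hmac.new(APP_KEY, payload, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("integrity check failed: lower layer was tampered with")
    return payload

sealed = app_seal(b"hello")
print(app_open(sealed))            # untouched stack verifies fine

tampered = b"HELLO" + sealed[5:]   # a 'service level' modification below the app
try:
    app_open(tampered)
except ValueError as e:
    print(e)                       # the upper layer notices the interference
```

The design point the sketch makes is that integrity protection composes upward: an intermediary granted access at the transport layer cannot silently rewrite content without breaking verification in the layers above, which is one reason service-level access mechanisms ripple through the whole stack.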

As Jan was saying, I think law enforcement agencies are getting more technically savvy these days and understand these issues better. A couple of years ago I would have said there’s no hope of having an open requirements discussion with law enforcement, and maybe it’s becoming less hopeless. I think there’s an opportunity there to somehow do it.

Because, you know, we can never really discuss signals intelligence requirements: they want to break everything, and they don’t have to abide by many laws. But law enforcement can articulate and enumerate their requirements these days.

It might be time to have the open discussion. It has to be an open discussion, one where the results are amenable to implementation by open source and stay within the laws of physics and what’s possible with technology in this century.

But, again, I guess I finish by being a little bit more hopeful than I was a couple of years ago that that discussion could be something we have now. Thank you.

>> TATIANA TROPINA: Thank you very much, Stephen. Thank you to everyone for this absolutely fantastic discussion. Thank you to Andrew and Diego for putting this together. And the last thing: Boris, the floor is yours for the messages. Please stay with us.

>> BORIS: Hello, my name is Boris. The Geneva Internet Platform is the official reporting platform for EuroDIG, and we are providing key messages and session reports for all workshops. I will present the messages from this session, and the report will be posted on the GIP Digital Watch observatory. A kind reminder that the messages will be available for additional comments at EuroDIG. The first message: the focus of discussion should change from the regulation of encryption to the requirements that should exist, and this will be beneficial from the law enforcement perspective.

Unless there’s strong objection, we’ll consider that there is rough consensus on the messages. The second message: false framings and false dichotomies around encryption, privacy and security need to be avoided. The third message: trust in encrypted communication is necessary in a democratic society, but it would be undermined by simple access of authorities to encrypted messages. This would be the end of free communication. It will not prevent criminals from encrypting their communication in an unbreakable way, but it will weaken everybody’s encryption. The solution cannot be worse than the problem.

And the fourth message: proper and better multi-stakeholder engagement is needed to consider the consequences of technological advances. At the EU level, concrete actions need to be taken to ensure formal structures for such engagement and to overcome the existing divide of frameworks and scattered discussions and debates.

>> TATIANA TROPINA: Thank you very much. We have two minutes left, and I will give one minute back to your lives. I would like to thank everybody yet again: everybody who wrote in the chat, everybody who spoke at this session. Unfortunately, it was very short and we had a lot of discussants, but it was an absolutely fantastic hour and nine minutes. Thanks so much, everybody.

Thank you.

And see you later at EuroDIG.