Content moderation on the Internet infrastructure level – Where does censorship begin? – WS 09 2021

29 June 2021 | 16:30-17:30 CEST | Amphitheatre | [[image:Icons_live_20px.png | Video recording | link=https://youtu.be/JUXnE4WLMUI?t=18001]] | [[image:Icon_transcript_20px.png | Transcript | link=Content moderation on the Internet infrastructure level – Where does censorship begin? – WS 09 2021#Transcript]]<br />
[[Consolidated_programme_2021#day-1|'''Consolidated programme 2021 overview / Day 1''']]<br /><br />
Proposals: [[List of proposals for EuroDIG 2021#prop_7|#7]] [[List of proposals for EuroDIG 2021#prop_55|#55]] [[List of proposals for EuroDIG 2021#prop_63|#63]]<br /><br />


== Messages ==
*The DSA is aimed at modernising relevant rules and limiting legal uncertainty for addressing risks in digital space, including the key question of the liability of service providers, among them providers of necessary technical auxiliary functions.
*Infrastructure intermediaries have been in the legal ‘shadows’ since the 2000 e-Commerce Directive. Today we face the necessity to clarify liability exemptions and elaborate explicit categorisation of digital services.
*Recent cases show that certain infrastructure providers reluctantly take action that could be argued to be content moderation, suspending services for platforms in an ad-hoc manner without any transparent policy. But infrastructure services have limited options, which tend to be temporary solutions (clearing caches), overbroad reactions (limiting access) or options that open websites up to cyberattack (terminating services of particular users).
*We should keep in mind the proportionality of measures to target illegal content and foresee the unintended consequences at the infrastructure level. Current content moderation practices are not ideal and transparency reports are not very helpful in avoiding mistakes, so what can be expected from the broader infrastructure level?


Find an independent report of the session from the Geneva Internet Platform Digital Watch Observatory at https://dig.watch/resources/content-moderation-internet-infrastructure-level-where-does-censorship-begin.


== Transcript ==
Provided by: Caption First, Inc., P.O. Box 3066, Monument, CO 80132, Phone: +001-719-482-9835, www.captionfirst.com
 
 
 
This text, document, or file is based on live transcription. Communication Access Realtime Translation (CART), captioning, and/or live transcription are provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings. This text, document, or file is not to be distributed or used in any way that may violate copyright law.
 
 
 
>> ELISABETH SCHAUERMANN: All right, everyone. Welcome to the Amphitheatre and Workshop 9 dealing with content moderation on the infrastructure level. Before I hand over to your moderator and host, Sebastian Schwemer, I will just remind you of the session rules.
 
Please be in the room with your full name so that we can identify you. If you want to make a point or raise a question and you are not one of the assigned speakers, please raise your hand and the right to speak will be given to you. When you are unmuted and the floor is given to you, please state your name and your affiliation and switch on your camera as well, if you can.
 
The chat will not be published and please avoid sharing the Zoom link outside of this room because we, for safety reasons, need people to enter through the official way. All right. And with that, I would give the floor to Sebastian and wish you all a lot of fun and good discussion.
 
>> SEBASTIAN SCHWEMER: Thank you very much for the kind introduction. Good afternoon, everyone. I hope you can hear me well. And a warm welcome to this workshop on content moderation on the Internet infrastructure level and the question: Where does censorship begin? I’m your moderator and the focal point of this workshop, Sebastian Schwemer, with the Centre for Information and Innovation Law at the University of Copenhagen and the Norwegian Research Center for Computers and Law at the University of Oslo. I was involved with some of the preparatory work for the Digital Services Act and am the author of one of the background studies to it, which deals with non-hosting intermediaries.
 
Today we have six fantastic key participants to discuss with us some, I think, very, very important questions. We also have you, the audience. So, for the process, I would invite the six key participants to open with a short, three-to-four-minute intervention each. Please feel free to ask questions in the chat and I will do my best to integrate you along the way. Good!
 
So, who do we have with us? We have with us, and I come with a better introduction in a second, Denis Sparas from the European Commission, and I’m very happy that he’s able to join us. We have Dr. Corinne Cath from the Oxford Internet Institute, Petra Arts from Cloudflare, Corynne McSherry from EFF and Fred Langford from OFCOM. And I think this is a fantastic panel to discuss these questions. And if we look at content moderation at the infrastructure level, first of all, I think we can ask the question: content moderation, is it really content moderation, or should we put quotation marks around what can happen at the infrastructure level? It can be fraught with a lot of problems. The European proposal for the Digital Services Act addresses providers of services that establish and facilitate the underlying logical architecture and the proper functioning of the Internet, including technical auxiliary functions, and I think that is a good starting point to discuss where we are at, what is the current and future role of providers of Internet infrastructure in content moderation, again, maybe in quotation marks.
 
We have three rough themes that I think will guide us through our conversation, and the first relates to stock-taking: what is the current picture of content moderation at non-content layers? And it is important, I think, to ask both what we know and what we don’t know. We also should talk about the problems. What are the issues around moderation at non-content layers? And these can be technical, societal, political in nature.
 
And lastly, and this is, I hope, where we can feed into the ongoing discussions in this workshop, the regulatory landscape. How is self-regulation looking, and how does the Digital Services Act propose to address these rules, this phenomenon? And is it sufficient – what is working, what is not, what is missing, et cetera.
 
So, without further ado, I would like to introduce, to give his first intervention, Denis Sparas. Denis is a legal officer at the European Commission, and I think there could be no one better to present the Commission’s ideas and visions. Denis has been working on the DSA file, but also, quite some years ago, on the e-Commerce Directive. So, Denis, the floor is all yours.
 
>> DENIS SPARAS: Thank you, Sebastian, for a very nice introduction. I hope you can all hear me, because I have the feeling that my system is showing me some problems with the connection, so I hope everything goes well.
 
I will just briefly touch upon the, essentially, the Digital Services Act and where we are coming from, because you know, as Sebastian was saying, we have an excellent line of speakers and panelists, which I personally also want to listen to, and then also, I’m sure that there may be many questions from the audience, itself, so I’m really looking forward to this discussion.
 
I mean, just very briefly on the Digital Services Act, as already mentioned by Sebastian: this is part of the Digital Services Package adopted at the end of last year and is essentially one of the flagship initiatives following the Digital Agenda for Europe, which was presented by the Commission in March of 2020.
 
What are the objectives of the Digital Services Act, just in a nutshell? On one side, we want to modernize the rules which we have today – and Sebastian was already referring to my past work – which is essentially the framework of rules that we have, the e-Commerce Directive. And essentially, the idea is to make these rules more effective in addressing illegal content and systemic risks in the online space. But what we also aim to do, and it’s very important in the context of today’s discussion, is to clarify the rules on liability at any layer, let’s say, so the infrastructure layer, but also in the context of hosting services.
 
And the idea here is primarily to give companies and market operators, but also users, legal certainty in different aspects, including in relation to any voluntary action that they may take in a proportionate manner. We can discuss at what layer this is essentially possible and relevant. And what is also very important as one of the objectives of the Digital Services Act is to increase transparency of content moderation decisions, ensure accountability, and facilitate better oversight. So, you know, it’s not only about responsibilities, but also about accountability.
 
What is important to say, in the context of the relationship between the Digital Services Act and the e-Commerce Directive, is that the proposal is essentially to be perceived as complementary to the e-Commerce Directive, which is the currently valid framework and which will essentially remain the valid framework. But of course, what is very important to keep in mind is that because the Digital Services Act covers the providers of intermediary services, it also takes over, if that’s the best possible term in this context, the provisions which are the most interesting for today’s discussions, so, Section 4 of the e-Commerce Directive dealing with the conditional liability exemptions.
 
The DSA essentially builds on the liability exemptions of providers of intermediary services, so it replaces the respective parts of the e-Commerce Directive. Because we are talking about a regulation, it’s simply a question of legal technique: in the directive, the provisions were addressed to Member States, while in the regulation, essentially, the provisions are addressed to the providers of intermediary services.
 
To the extent we have been looking into these issues and trying to tackle the identified problems, it has been considered that, since the evidence showed that the existing provisions are still very much valid and provide the cornerstones of the provision of these services in the EU as we know them today, we shouldn’t be changing something that works, but we should adapt and, you know, modify where this is necessary.
 
So, for example, one point which we found important to clarify is to what extent the liability exemption regime also applies to what we called infrastructure layer services. And what we aim to do through Recital 27 of the DSA is to clarify that services establishing and facilitating the underlying logical infrastructure and the proper functioning of the Internet can also benefit from the exemption from liability.
 
And then what we also tried to do, we tried to clarify, by providing certain examples, which these services could be. But of course, here it’s very important to add a clarification and disclaimer, if that’s the good word to use: it will always very much depend on a case-by-case basis whether a specific service, existing today or added in the future – because we still don’t know how these services will look in the future or whether there will be some new services – falls within one of the three categories of intermediary services, so the classical mere conduit, caching and hosting, which I think we all know well.
 
I think also what is important to keep in mind is that the DSA also proposes to maintain the prohibition of general monitoring obligation, but which is probably something which is a bit more relevant in the context of the hosting services and maybe less in the context of the infrastructure layer services.
 
And then what is also an important and novel part of the Digital Services Act is the part on the due diligence obligations, you know, obligations which providers of intermediary services should comply with. And I think here, again, it is very important to stipulate that the Digital Services Act is based on the principle of asymmetric due diligence obligations, which means that, depending on the size, nature, and reach of the specific service, service providers will be subject to different types of obligations.
 
So, for example, when we talk about, let’s say, infrastructure layer services, the new obligations would essentially cover a relatively limited set of obligations – assuming that, for the moment, none of these would be hosting services – so these would be very limited obligations, like having a point of contact or a legal representative, if we talk about a service provider coming from outside the EU, and then having some standard terms and conditions, plus transparency about the content moderation policy.
 
What is also important to keep in mind is that there is a specific exemption for micro and small enterprises. And the idea behind all this is to have, you know, a balanced and proportionate approach to the role that the intermediaries play in the daily provision of such services, and in particular, when we talk about non-hosting intermediaries in the present context.
 
I mean, all this is based on, you know, vast evidence collection, including the study that Sebastian was referring to, which was prepared for the Commission. And for any follow-up – because we really don’t have much time for this – I would really recommend reading Annex 9 of our Impact Assessment, which accompanied the proposal and which deals with a lot of the evidence and the elements of why the proposal looks the way it looks, in particular when it comes to infrastructure layer services. And maybe I would stop here, just to give others the opportunity as well. I’m really looking forward to our discussion.
 
>> SEBASTIAN SCHWEMER: Thank you very much, Denis. And I’m sure we’ll get back to you with lots of questions and comments, but thank you so much for the first introduction and the very concrete way that it was put forward.
 
I want to invite up Dr. Corinne Cath, recent graduate of the Oxford Internet Institute, to tell us a little bit more about the problem identification. What are we looking at here? And maybe you want to tell us a little bit more about the problems that we see with this phenomenon of looking at the infrastructure layer. Corinne?
 
>> CORINNE CATH-SPETH: Thank you so much, Sebastian, and thank you for the invitation to be here today. As mentioned, I am an anthropologist who studies the politics of infrastructure, which essentially means that I look at how organizations and companies that operate at the Internet infrastructure level engage in political gatekeeping, including in what today we’re calling content moderation, through their day-to-day operational work.
 
And so, in the past, I have written about, for example, Cloudflare’s decision to take down the message board 8chan in the wake of the El Paso shooting in the U.S. I’ve also considered Amazon’s suspension of services to the right-wing social media outlet Parler earlier this year after the storming of the U.S. Capitol. And I think there are two things we see in both of these cases, namely, that Internet infrastructure companies intervene in political debates by deciding to rescind their services to online platforms. Now, this isn’t necessarily a new phenomenon. Academics like Laura DeNardis have shown how essentially this is a thing that’s been going on for at least a decade, if not longer, but it seems like it’s becoming, if not more frequent, more visible to policy debates, which is a great thing.
 
But that does also bring me to one of the guiding questions for today’s session, which is: what are the issues around content moderation at the infrastructure level? And for me, one of the big things I see within these decisions is that they often, at least from the outside, seem to be happening quite ad hoc and at the whims of CEOs of these infrastructure companies. And I wrote a piece about this last year for the Brookings Institution together with Savannah, in which we dubbed these decisions as coming from the Internet’s reluctant sheriffs, by which we mean that, for the most part, Internet infrastructure companies are naturally hesitant to position themselves as explicit political players, but when they do intervene, they often attempt to do so without a strong policy framework around how they make these decisions and also with limited follow-up in terms of accountability measures afterwards.
 
So, I think one of the issues that I would like to focus on and that I think is important for this debate is, given that, you know, political gatekeeping and content moderation will continue to happen at the infrastructure level, there is a real need to make sure that its main players have a much more mature and robust framework around how to respond to those kinds of situations, and these frameworks need to be publicly available so that researchers like myself, but also policymakers and others can actually scrutinize them, as we do with platforms, and also because that creates a certain level of predictability and accountability, both towards the consumers of these infrastructure companies, but also to the Internet users at large.
 
>> SEBASTIAN SCHWEMER: Thank you so much for your intervention, Corinne. And I took some interesting notes that I will send back, maybe as questions to Denis and then on to you, on whether the framework contains these safeguards and the policy framework that you identify as lacking.
 
I would invite over Fred Langford, who is Online Technology Principal at OFCOM and has two decades of experience with online architecture and the content moderation landscape at large. He was previously the IWF Deputy CEO and CTO prior to joining OFCOM, and can provide us insight into where we are at and what is needed in terms of governing content moderation by infrastructure actors.
 
>> FRED LANGFORD: Thank you very much, Sebastian. And as Denis said, I’m having occasional intermittent network issues, so I do apologize if I freeze occasionally.
 
The issues at hand, I suppose, yeah, very, very interesting, I think. So, I mean, from a regulatory perspective, I’m interested in the options that are available, so what comes out at the end, after all these discussions, is really key from a regulatory point of view, and where they fit within the flow of data. So, in the stack, if we’re talking about how things go up and down the stack and how they interact with each other, particularly who is providing those services on that TCP/IP stack, that means that action could actually be taken. I think we’ve already touched on jurisdictional issues. What happens – what are the expectations on providers of these sorts of services who are operating outside of the regulatory zone? What enforcements? I’ll use an example. There was recently a news article relating to a court judgment for Quad9, who are based in Switzerland, and what were the implications going to be for copyright protection for Quad9? What legal jurisdiction did that court have in making that decision? It’s still open to challenge, I understand, but that is an interesting concept.
 
But assuming the two points I’ve mentioned have already been addressed, there are other considerations, and how effective are the approaches in the first place? Is it going to provide a universal expectation of stopping access to the sort of content that people don’t want to see? But really, the main issue is: is it proportionate? What collateral damage is taking place? So, if I use examples around the sorts of things that are taking place on networks and architectures already: we have URL blocking and there’s DNS blocking; the use of key words with registries to stop domains; we have CDNs clearing caches, or not caching access from sites, which Corinne already mentioned with what Cloudflare did with 8chan, based upon that decision as well. We have filtering solutions on Wi-Fi and also in search. But so far, these have been voluntary, and this sort of leads into the need of really being able to address those providers who should be doing more on their platforms but don’t engage. And then you need to have alternative options available to be able to do something.
 
I mean, some of the problems we have are, obviously, how to provide clear definitions to give that legal certainty for providers, if that was an expectation of them, and future-proofing. But I’ll also say that some of them are very blunt instruments, particularly when we’re talking about content moderation around particular pieces of content on large providers’ networks, so DNS blocking as an example. That would black-hole the entire site, and that was the proportionality discussion I was raising. Is that proportionate? I mean, what is the volume of content on the site overall compared to the content that’s trying to be blocked? Are we talking one image out of a billion, or one image out of two? And the proportionality is very, very important here.
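To make the proportionality point concrete, here is a minimal, purely illustrative sketch; the domain names, rules and URLs are hypothetical, not taken from the session. A DNS-level block only ever sees a hostname, never a path, so a rule aimed at one page black-holes every page that shares the domain, whereas a URL-level filter higher up the stack can target a single item.

<syntaxhighlight lang="python">
from urllib.parse import urlparse

# Domain-level rule, as applied at the DNS layer (resolver/registry).
dns_blocklist = {"example-forum.invalid"}
# URL-level rule, as applied by a filtering proxy higher up the stack.
url_blocklist = {"https://example-forum.invalid/thread/123"}

def blocked_by_dns(url):
    # The DNS layer only sees the hostname, never the path, so one
    # offending page and every other page on the same domain are
    # indistinguishable to it.
    return urlparse(url).hostname in dns_blocklist

def blocked_by_url(url):
    # A URL-level filter can target the specific piece of content.
    return url in url_blocklist

for u in ("https://example-forum.invalid/thread/123",      # the target
          "https://example-forum.invalid/recipes/bread"):  # collateral
    print(u, "| DNS block:", blocked_by_dns(u), "| URL block:", blocked_by_url(u))
</syntaxhighlight>

The one-image-out-of-a-billion case and the one-image-out-of-two case look identical at the DNS layer; only layers that see full URLs can tell them apart.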
 
Also, who’s going to see this sort of content? Who are you trying to protect by stopping people seeing this content? So, there’s lots and lots of considerations, which makes our job interesting. That’s why we do it. And I think that I’m really looking forward to this debate and conversation to find out more.
 
>> SEBASTIAN SCHWEMER: Thank you so much, Fred, for your intervention and interesting thoughts, and this issue of the blunt instrument when intervening at the infrastructure level. I might invite next up a representative of one of those actors, Polina Malaja. She’s a Policy Advisor at CENTR, the Council of European National Top-Level Domain Registries, where she leads their policy work and liaises with other institutions in the ecosystem. And maybe, Polina, do you want to address how you see your industry in the context of the DSA? And maybe also what Corinne mentioned, this policy framework – is that something you’re looking for, and is that something you’re finding now? Polina, the floor is yours.
 
>> POLINA MALAJA: Thank you, Sebastian. I hope you’re all hearing me well and seeing me as well. Wonderful. Thanks so much for organizing the discussion and inviting CENTR to outline the European ccTLD registries’ view of the debate and of the proposal.
 
As we heard earlier and as mentioned by Denis, the DSA is essentially an update of the e-Commerce Directive. Back in 2000, when the directive was put in place, DNS service providers and domain name registries were not intended to be intermediary service providers, as the definitions of intermediaries in the e-Commerce Directive, such as caching and hosting service providers, simply do not reflect the technical function of the DNS. So, specifically talking about registries: they do not provide access to communication networks, and they do not store nor transmit any content through their managed infrastructure, as suggested by the definitions in the e-Commerce Directive. And now, fast-forwarding to the DSA proposal at the end of 2020, it is noteworthy, as Denis also already pointed out, that the proposal recognizes the limited but very important role of certain technical auxiliary functions, as they are called in the proposal, of Internet infrastructure services.
 
And as Denis said, it’s Recital 27 that is important here. And notably, the DSA confirms that domain name registries are intermediaries and should benefit from an exemption from liability for illegal content provided by the end user, but only to the extent that their services qualify as one of the intermediary categories, such as mere conduit, caching or hosting services. And as we have established already before, at no point in time can registries be considered to offer services falling in any of these categories. And this, in our opinion, creates legal uncertainty for the operators, as they find themselves in a vulnerable position: technically, there is no way to target specific content at the technical level, at the registry level.
 
So, what registries can do is only to suspend the underlying infrastructure. That will, of course, have an effect on all services linked to it, and of course it will have an effect on the accessibility of the service. Sorry, I had a video malfunction. So, by suspending a domain name, all the services connected to that domain name become inaccessible to end users, while the content actually still stays online. So, to use an analogue example here: suspending a domain name is like erasing the location of a house from the map, but the actual house is still there. You’re just not able to navigate to the house. So, in this way, it’s not really the most proportionate way to go to the registry in the beginning.
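Polina’s house-and-map analogy can be sketched in a few lines of illustrative code; all names and addresses here are made up. The registry only controls the name-to-address mapping, so suspending a domain makes the name unresolvable, while the content itself stays online at its address.

<syntaxhighlight lang="python">
# The registry-managed zone: the "map" from names to addresses.
dns_zone = {"example-site.invalid": "203.0.113.7"}
# The hosting provider: the "house" where the content actually lives.
web_hosts = {"203.0.113.7": "<html>the content is still online</html>"}

def fetch(name):
    ip = dns_zone.get(name)  # a missing entry behaves like NXDOMAIN
    if ip is None:
        return "name does not resolve: cannot navigate to the house"
    return web_hosts[ip]

print(fetch("example-site.invalid"))  # reachable via the name
del dns_zone["example-site.invalid"]  # the registry suspends the domain
print(fetch("example-site.invalid"))  # the map entry is erased...
print(web_hosts["203.0.113.7"])       # ...but the house is still there
</syntaxhighlight>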
 
And we can actually already anticipate the unintended consequences of not applying a proper proportionality test when it comes to mandating action at the registry level. Notably, I want to highlight the European Parliament’s draft position, which suggests amendments mandating the suspension of underlying infrastructure, such as the domain name, in cases of repeated infringements under the DSA, and this also includes the case of very large online platforms, such as Facebook.com, for example. And of course, this can have grave consequences for all users of the service, and not only the ones who are engaging in illegal activities.
 
And these parliamentary amendments are actually inspired by the Consumer Protection Cooperation regulation; however, they do not follow the same logic as the CPC, to abbreviate the Consumer Protection Cooperation regulation. Under the CPC, action at the registry level can only be taken as a measure of last resort, and that important part was not carried over in the parliamentary amendments.
 
So, I know I’m already running out of time, but I want to conclude very briefly. From CENTR’s perspective, there is a need for an explicit fourth category of exemption for intermediaries under the DSA, reserved for these technical auxiliary functions of the Internet infrastructure. This is specifically important for the DNS level, since without it, the DNS cannot be shielded from content liability, while registries specifically lack any effective control over content removal.
 
Of course, in our position and technically speaking, action at the registry level will have a disproportionate effect on the functioning of the other services associated with and functioning on top of the DNS. We consider action at the registry level to be this type of nuclear measure, and we also feel it can only be reserved for exceptional circumstances, when all other more effective means involving other intermediaries are exhausted and, obviously, after very careful deliberation by the competent public authority. So, thank you, and I’m looking forward to the discussion.
 
>> SEBASTIAN SCHWEMER: Thanks, Polina, for presenting us with the view of the European ccTLD registries. I would also like to see whether some suggestions from the study are going to be picked up again, and I think we’ll have an interesting discussion in just a bit.
 
I would invite next up, to hear another industry perspective, Petra Arts. Petra is Senior Manager for Public Policy at Cloudflare. She joined Cloudflare a little under a year ago, I think, but has worked in similar positions for the telecommunications and media company Liberty Global in Brussels and has a lot of experience in the field. So, Petra, tell us a little about the perspective that you have as yet another non-hosting intermediary actor in this field. How do you see the developments?
 
>> PETRA ARTS: Thanks. I hope you can hear me and see me okay. Yes, thanks very much for inviting me to speak. And this is a very important topic and I’ve heard already many interesting things from the other speakers, so it’s already a very interesting session for me.
 
Maybe just a few words about Cloudflare and who we are, for those that may not be as familiar with our services, first of all, just to set the scene. So, we are a global provider of security, reliability, and performance services for websites. We provide a wide range of services that are based on our global network of 200 data centers in 100 countries. And probably the best known are our cybersecurity protection services, DDoS protection specifically, but we also run a content delivery network service and DNS services, including the public resolver 1.1.1.1. And we have always believed in providing security protections, but also services to make the Internet a better place, essentially. So, we believe that in some cases, specifically for websites, sitting behind a large network like ours is the only way for them to be protected from cyberattack. So, we find it incredibly important that we provide those cybersecurity protections.
 
I think there are a couple of things that I wanted to mention that probably have been mentioned before, so apologies for some repetition, maybe, from other speakers. But it maybe goes without saying that Internet infrastructure services like the ones that we provide typically operate lower down the stack, and they are usually far removed from interaction with actual pieces of content. So, maybe just first of all, and Sebastian mentioned it at the beginning: the term content moderation as a broad term for, let’s say, the lower layers of the stack – we might want to be careful with using that term, because there are few proactive actions that would be proportionate, as already mentioned, to really address the specific types of illegal content that regulation like the DSA wants to address.
 
From our perspective, for services that cannot see, or modify for that matter, specific content, we have very limited responses to that content. Clearing cache is one of them, as Fred was also mentioning. Limiting public access to a website, as also mentioned, at the DNS level. Or terminating services of particular users. Those are kind of the options that we have at that level. But they do not remove content from the Internet, you know. This is important, I think, in the context of the DSA as well – the proportionality of it. Clearing cache, as we noted, is a temporary solution: it doesn’t remove content, but it can temporarily close down access to the content. Limiting access at the DNS level is a very overbroad action, a very blunt instrument, as we have noted before. And in our case, terminating a customer or a user opens that website up to cyberattack, which is also not something I think we would want to cause necessarily.
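As a minimal sketch of why clearing a cache is only a temporary measure (the paths and content are hypothetical): a CDN does not control the origin server, so the very next request for a purged resource simply re-populates the cache from the origin.

<syntaxhighlight lang="python">
# The origin server's content, which the CDN has no power to remove.
origin = {"/page": "content the CDN cannot delete at the source"}

cache = {}

def cdn_get(path):
    # On a cache miss, the CDN transparently fetches from the origin.
    if path not in cache:
        cache[path] = origin[path]
    return cache[path]

print(cdn_get("/page"))  # served and cached
cache.clear()            # "clearing cache": the CDN's main lever
print(cdn_get("/page"))  # back again on the very next request
</syntaxhighlight>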
 
So, as Polina was saying, from our perspective, the measures that are, for example, described in the DSA should be targeted primarily at those who can take proportionate action first. So infrastructure services should come in only as a very last resort, when all of the other options are exhausted and when the more appropriate players higher up in the stack have been able to take action.
 
Maybe specifically on the DSA, as a last remark. As Denis was outlining, there have been a number of additions, let’s say additional confirmations, especially in the recitals, regarding infrastructure intermediaries, and Recital 27 contains some helpful clarifications about liability. And I think particularly the recital proposed around proportionality, so the staggered approach, let’s say – not only towards the different categories of the liability regime, but also to outline this hierarchy of going first to those that can take the best, most appropriate, most effective action – is extremely important to reaffirm, I would say.
 
And the last point, on transparency. We believe that this is a very important element to make sure that decisions about digital services are transparent generally, and in our view we have not only to be more transparent about any actions we might take voluntarily, but we also have to be able to be transparent about any orders or any requests, any kind of mandated action that we are required to take, for example, from courts or competent authorities. This is obviously the case in the DSA, but also in the e-Commerce Directive. At the moment, there has not been a lot of push for transparency around judicial decisions, for example, which we think is very important because they’re such a blunt instrument, which potentially affects a lot of users in any given country at the moment, or more broadly under the DSA, depending on how that will work out with the regulation. It could have an impact on freedom of expression and various fundamental rights questions, so we would want to be as transparent as possible about being mandated to take action.
 
Just to also say, from our perspective, to underline how much we value transparency: we are the only company in our sector, on our side of the sector, which publishes transparency reports on a regular basis, twice a year. We have another one coming up very shortly. So, we encourage others in the industry to be more transparent as well and to also start reporting. The DSA, of course, mentions this already, and this will probably come once the regulation has been adopted, but we started with that because we think that’s just a very good practice. I’ll stop here because I think I’m already over time.
 
>> SEBASTIAN SCHWEMER: Thank you very much, Petra, for your perspectives on both the current landscape, but also the DSA and what it means for you and what would be the way forward. Last but not least, I would invite Corynne McSherry to give us her perspective. She’s the Legal Director at the Electronic Frontier Foundation, specializing in copyright, liability, open access and free expression issues, and has broad experience. And I think, Corynne, it will be quite interesting for you to hear the kind of top-level issues that we’re dealing with here. And yeah, the floor is all yours.
 
>> CORYNNE MCSHERRY: Okay. Hi! Thank you so much. And I apologize, I don’t know quite what’s going on with my camera, but I’m not actually sitting in the dark. It just seems to look like it. But I’m so glad to be participating in this conversation. And some of what I was going to say is actually a little repetitive, so I’m going to edit it down because I really want to make sure we have time for discussion.
 
I think one of the things to keep in mind is that the problems that we’re going to confront with content moderation at the infrastructure level start with content moderation itself. And at the Electronic Frontier Foundation, we’ve been working on these issues for a very long time and have seen over and over how mistakes are made, inevitably. And over and over we see perfectly legal, legitimate content taken down, often mistakenly. But over a decade or so, many companies at the platform layer have put in place at least some processes to try to address that. But nonetheless, there continue to be lots and lots of mistakes, and it’s very frustrating from a user perspective, because what those mistakes mean is that legitimate expression is being taken off the Internet or made harder to see and so on.
 
So, we’re sort of starting with that problem that the content moderation system is really pretty broken, even at the platform level. So, then it’s really worrisome when we extend it to the infrastructure level, as is already happening, as everybody knows, for a few different reasons. One of them that several people have hit on is, of course, a problem of proportionality. Depending where you are on the infrastructure level, it is extremely difficult to target your response to a specific piece of content, even if you want to, right? So, you have to take down entire sites. You take down entire services as a practical matter. This is not the fault of the companies; it’s just a matter of, you know, how the Internet works. Again, people have pointed this out. So, I won’t repeat that too much.
 
But there are some other things that are happening here that I think are important. One is, we’ve talked about transparency. From a user perspective, that’s really crucial. You could actually, a company could put out lots of transparency reports and even make its rules and guidelines transparent, but if I’m a user, I don’t actually realize, usually, all the different services that are involved in getting my speech out to an audience. And so, I may not even know how to comply with all of the guidelines and all of the terms of service of all of the different services that I’m relying on to share my speech. And I think for an ordinary user, it’s really an impossible thing to ask them to be familiar with everybody’s guidelines, so the guideline of the DNS provider and the guideline of the CDN and so on. So, it’s a process that’s really fundamentally very opaque for users and really challenging to figure out how to make transparency meaningful.
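As an illustrative sketch of this point (the delivery chain and the toy policies are entirely hypothetical): a single piece of speech must simultaneously satisfy the terms of every service in the chain, and action by any one layer is enough to make it unreachable, usually without the speaker ever having knowingly chosen most of those services.

<syntaxhighlight lang="python">
# Each layer of a (hypothetical) delivery chain has its own terms of
# service, modeled here as a simple predicate over the posted text.
delivery_chain = [
    ("domain registry",     lambda text: True),
    ("DNS provider",        lambda text: True),
    ("CDN / reverse proxy", lambda text: "attack-bait" not in text),
    ("hosting provider",    lambda text: "banned-word" not in text),
    ("platform",            lambda text: len(text) <= 500),
]

def layers_that_would_act(text):
    # Action by any single layer is enough to make the speech
    # unreachable, however the other layers decide.
    return [name for name, allows in delivery_chain if not allows(text)]

post = "an otherwise ordinary post containing banned-word"
print(layers_that_would_act(post))  # -> ['hosting provider']
</syntaxhighlight>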
 
And then as a related problem when you hit the infrastructure level – and I just want to – I think it’s just useful for the conversation to concretize this from a user perspective. What are your options for redress? What are your options for appeal? If I’m a user and I see, you know, speech is being taken down, first of all, I have to figure out who at the infrastructure level is, you know, is causing the takedown. And then maybe I need to figure out how to reach them. And very often at the infrastructure level, you just don’t have that relationship between the speaker and the service that they’re relying on.
 
So, this is why, when we move to the infrastructure level, the problems that we already have with content moderation get especially worrisome. And I think that the Commission has recognized this, and the DSA tries to deal with it, but I think we still have a long way to go. And while transparency is crucial, it’s not going to get us all the way there.
 
Then the other thing I want to say is that, often people don’t necessarily have alternatives. So, you can’t necessarily just switch to another provider if you think, well, this provider has made a mistake. I figured out who’s taking my content down, I figured out what’s going on here, but I don’t have a choice. I don’t have anywhere else to go. So, I think this person’s made a mistake, but I’m still stuck with those decisions because I can’t just switch providers. And that is what makes sort of the content moderation choices even worse at the infrastructure level.
 
And one thing I want to point out is that, at least at the platform level, content moderation has now extended into conduct moderation, by which I mean that we have seen platforms hiring law firms to investigate the conduct of users and then kicking those users off their services based not on what they’ve done on the platforms, not on what they’ve done online, but based on their offline conduct. If that starts getting extended to the infrastructure level, that could get even worse. I worry every time I see what’s happening at the higher level, the platform level, getting extended down; it’s just going to have really negative impacts, I think, on users’ expression.
 
So, the last thing I’ll say, or just a couple things. One thing that I think is very valuable in the DSA is the attempt to limit legal uncertainty, because legal uncertainty just makes all of these problems worse, right? When you don’t know what you’re going to be on the hook for, if you’re a company, what that usually leads to is over-removal, because that’s safer. And so, it’s really important to limit legal uncertainty as much as possible.
 
I think, though, that as I understand the DSA as it’s currently drafted, there’s still going to be a fair amount of uncertainty as courts kind of figure out what different kinds of services fall within which buckets, and it’s going to be expensive for the companies that decide to litigate it, and there are going to be a lot of questions along the way.
 
Then finally, so, EFF has been involved in the discussions around the DSA in a bunch of different ways. We appreciate the effort that we see in Article 12, efforts in the DSA to call out intermediaries and ask them and require them to respect fundamental rights. But for the reasons we’ve just been talking about, all of us, I really worry about whether that’s going to be possible as a practical matter.
 
And so, one of the things that we’ve proposed and I think is worth thinking about is thinking about ways to actually put users in control, rather than just the companies in control of online expression. So, for example, promoting interoperability so that users have alternatives, so people have other services they can turn to. Transparency is very, very important, but as I said, that’s not going to take us all the way, but it’s a good start and it’s more than we get, actually, right now.
 
And then also actually encouraging companies to invest in tools that actually put users in control of the content that they see online, rather than relying on intermediaries to make all of those decisions, which are often going to be difficult because expression is contextual, and you can’t always figure out, from the point of view of an intermediary, especially someone down the stack, whether the content’s legal, whether it’s offensive, and so on. You don’t have the context to be able to make those decisions. So, we’d love to see more efforts to actually focus on empowering users, who do have the context and are better positioned to make those kinds of decisions. And I will close there.
 
>> SEBASTIAN SCHWEMER: Thank you so much for your intervention, Corynne, and some really important, interesting points. There’s also a good discussion in the chat already going on, which I’ll try to follow up on in a second. So, one thing I was wondering – thanks, everyone, for your contributions. And it seems there’s a common theme: that this, at the infrastructure level, is problematic; that there is something about transparency, about proportionality, et cetera. And I wonder, in a way – so, now there’s this framework put forward and there seems to still be some legal uncertainty, Denis, so that question I think I will play to you in a second. The second point I also want to raise is voluntary moderation. Now, the DSA also says: dear intermediary services, your liability exemption won’t be lost if you try to do something good, which is great. But I wonder, in this sphere of voluntary moderation, shouldn’t the same principles apply that I think both Corynne and Corinne touched upon, and Polina and Petra to some extent – shouldn’t those apply to those voluntary measures as well, irrespective of whether it’s a platform taking those content moderation decisions or an infrastructure service provider, in terms of transparency, redress mechanisms and appeal, et cetera? But Denis, maybe first playing back to you. So, what are your thoughts on that question?
 
>> DENIS SPARAS: Sure. So, first thing: I think we are all in agreement, as you yourself said, on the question of transparency and legal certainty. So, transparency in terms of the reporting obligations of the DSA is something which is very important here. And this is what we tried to come up with: for the providers of intermediary services to transparently present when they are making certain decisions.
 
And we tried to break it down by the different means by which the decisions are made. Because, I hear when someone is saying it is a decision based on terms and conditions, and very often indeed what we have observed in our preparatory work is that those decisions are taken with no clear indication on what basis, you know, and with what consequences. And this is what the DSA is now trying to tackle, to say: look, when you are taking these decisions, it needs to be transparently presented, not only at the time of decision-making, but potentially before.
 
So, if you as a service provider want to have a chance to do this, you should tell that to the user before. And, you know, the classical example would be AWS, let’s say, taking down – or however you want to call it – essentially, as a recipient of the service, you want to know that. Certain ambiguity will stay. And, you know, I think we see from our work, and you as an academic, and many people around this table will see: there are questions, and replies are very difficult. And as we always say, it will depend on a case-by-case basis, and even there you may have different solutions depending on who is looking at it, which is unfortunate, because it will lead to legal uncertainty and fragmentation. But unfortunately, at this point in time, it’s very difficult to prevent all that.
 
I think we have the knowledge, we have the experience, but I think we are still lacking some of these things. And again, the services are developing so fast, it is very, very difficult to pinpoint at a given point in time that it’s for sure this or it’s for sure that. Again, I’m just thinking about studies and academic papers I have read. I mean, you can’t find an agreement for one single service. So, it’s very difficult. I’m not saying we shouldn’t be trying; what I’m saying is, we have tried.
 
And like you said, when it comes to voluntary measures, again, you know, even in case of voluntary measures, we would want to know when these are employed and what are the consequences. What we wanted to provide with Article 6 is essentially legal certainty, that the mere fact that you are employing these voluntary measures does not mean, irrespective of which of the intermediary services you are providing, that you are automatically, you know, outside the liability exemption.
 
I think, again, let’s say what we tried to address are the calls from the stakeholders, from the industry, to address some of these open questions. I mean, again, the text is still improving, many people are contributing to it, and discussions like today’s are also probably very helpful in that. Whether we will have a perfect text, that I’m not sure, but you know, we can try our best.
 
>> SEBASTIAN SCHWEMER: Thanks, Denis. There was a question in the chat regarding the view of the DSA and the e-Commerce Directive in terms of the network layers and where CDNs and the DNS sit. Polina and Petra, do you think the proposed model, which is very function-oriented, is sufficiently clear on that, and is that the right model to work with? The question in the chat was talking about layers. I have mixed feelings, because if you think of network layers, CDNs would be in the content layer, or not – I am sometimes in doubt whether to see a CDN as an infrastructure intermediary as opposed to a DNS operator.
 
So, Petra, maybe you want to quickly react with your thoughts? Is that world view the right world view in the DSA?
 
>> PETRA ARTS: Thanks, Sebastian. Thanks for the discussion. I think, you know, we definitely think there is still some merit in the categories as they were originally foreseen in the e-Commerce Directive, and, as Polina mentioned, maybe there is obscurity about the DNS and how it fits in the categories, but from a CDN perspective, the main functionality is to cache. So, for us, that is still a very relevant category to have in the liability framework.
 
I think, you know, the main important thing for us, from our perspective – because we provide so many types of services, and many of them are based on the same core network that we have, and the functionalities are based on the same technology, if you will – I think the service-oriented approach is very good, rather than, let’s say, a provider approach, because a provider can provide many different services, and then you get a very muddled kind of situation where you’re not sure whether you’re covered. But the service-by-service approach, if you will, with a very clear legal framework that applies to the core functionality of what the service does, I think is very important from the DSA’s perspective, as it also was really important from the e-Commerce Directive’s point of view.
 
So, in that sense, I think from our perspective, one of the things that we have seen that could be further clarified in that respect is that when there is a service that we provide – I think it was mentioned in the chat; there is often a service that provides a CDN, but also a reverse proxy cybersecurity protection service – that when there is a consideration of, you know, liability, for example, by an authority or a court, there is consideration of what is the core functionality and what liability is attached to the different functionalities of the same service, and that they’re not just put together so that you lose your liability protections because there is a different functionality that is also offered on the same network, for example. But the CDN is the core service. So, that is something that we have seen that could be clarified maybe further, as a kind of bit of the modeling of the different layers.
 
And as others have said and Denis just mentioned, I think also for us, the lack of certainty in the categorization of the services in Recital 27 is something that might not be helpful for legal certainty across Europe as well, because there will obviously still be different national interpretations, presumably, from different governments on how they would categorize services. So, that is also somewhere you can maybe see a little uncertainty happening.
 
>> SEBASTIAN SCHWEMER: Thanks. Polina, how would you explain the domain registry’s perspective on this? Is the issue of the legal certainty related to the liability exemptions? Is it related to the potential due diligence obligations? Is it related to both? Where’s the issue for your part of the industry in that respect?
 
>> POLINA MALAJA: Thank you, Sebastian, for directing this question to me. I would simply have to reiterate myself from my opening remarks. From our side, of course, the main issue is the unclarity of the liability exemption. So, there is some type of intention from the legislators to shield domain name registries specifically from liability for, let’s say, so-called “content moderation,” using air quotes for that. But in practice, it’s just not something that can work under the current definitions of these three categories, which simply do not reflect how the DNS functions.
 
And I know, for example, Sebastian, from your previous work, that you also argued that maybe, potentially, registries can be considered mere conduit under some interpretation. But for us, it’s very clear that, unfortunately, the DNS cannot be equated to providing access to a communication network. So, for these reasons, of course, while there are really great intentions from the legislature and from the Commission’s proposal specifically, in practice it will be left for national courts to decide case by case. And of course, this will definitely not improve the aimed-for harmonization and, yeah, the avoidance of fragmentation of the legislation within the EU.
 
And specifically on the due diligence obligations: these we actually fully support, because we also feel that transparency is very important and that clarity in terms of service needs to be ensured, irrespective of whether a service is an intermediary or not. But maybe I will very briefly touch upon the voluntary measures. This is another issue for registries, of course: we are not even shielded from liability at the moment, so the intention of self-regulation, in order to be able to take voluntary measures, is for us even one step further away, because we don’t even have a proper liability exemption. So, that’s from my side.
 
>> SEBASTIAN SCHWEMER: Thanks, Polina. We only have a minute and a half left before we need to present the messages. And I still would love to hear two very brief thoughts. Fred, you mentioned jurisdictional issues, which I think are quite relevant in this context. Is the framework that the DSA is setting out something that can help resolve those, or does it just shift around some of the issues that you mentioned in your intervention?
 
>> FRED LANGFORD: Well, I’ll pick up on something that Denis said earlier, and I completely agree: there is not a one-size-fits-all. And I think that each case does have its merits. And to pull it back to the jurisdictional issues, it would depend: if the intention was to remove access to an entire site, then it may well be proportionate to take action.
 
As far as the jurisdictional issues go, I do think it’s a struggle, and I think this is probably better placed for lawyers than myself to answer. A lot of it comes back to the agreements in place between particular countries and how quickly people can actually take action. And there are historical issues around that, legal, historical issues, that I couldn’t possibly comment on.
 
But I would say the framework, I think, is a step forward. It’s definitely trying to address those things. It’s trying to take action, and I think, again, just to pick up on what Denis was saying, it’s trying to fix things and stay future-proof with what was there and what’s coming, and it’s making best endeavors; not everything is going to be perfect.
 
>> SEBASTIAN SCHWEMER: Thanks so much, Fred. Corinne Cath, I directed this question to you before and want to hear from you again. You mentioned the lack of a strong policy framework and of accountability measures. Do you think, if we talk again at the next EuroDIG, this issue will be resolved with the DSA?
 
>> CORINNE CATH-SPETH: That’s a tricky guess to make. The thing is, we need to also keep in mind that there’s two sides to this debate. So, on the one hand, what are the different obligations that regulators like the European Commission are putting on companies, right? And to what extent do questions around proportionality come up in those cases?
 
But there’s also the other side of the coin which we haven’t really spoken about during this debate, which is, when these companies proactively, themselves, decide to make political decisions about where to intervene. And the thing that I’m concerned about is those decisions. Like, I’m very receptive to the fact that there’s all sorts of concerns industry has when it comes to the DSA, but what about the moments in which they, themselves, decide, we don’t think this is okay, we don’t think this is ethical? Like, where are the policy frameworks for understanding those decisions and what accountability measures should those be followed up with?
 
>> SEBASTIAN SCHWEMER: Thank you so much, Corinne. Thank all of you for this fantastic discussion. I think we should have scheduled two hours. We don’t have that. Without further ado, I will give the floor to Ilona Stadnik, who is now presenting the three to five key messages from this session. I hope you will have a good follow-up following this discussion. Thank you so much.
 
>> ILONA STADNIK: Thank you, Sebastian. Can you hear me well? Okay. So, I was following your session and I must confess that it’s a bit sad, because there are more questions than answers and solutions. So, I tried to summarize it briefly in several points. Since I’m not sharing the presentation, could the presenter turn to the next slide? Okay, thank you.
 
So, I will be reading it out. And if there is a rough consensus, we can leave it as it is, or we can do some minor revisions. So, the DSA is aimed at modernizing the rules and limiting legal uncertainty for addressing risks in digital space, including the key question of liability of service providers, including necessary technical auxiliary functions. Any objections? Okay, I don’t hear any.
 
Infrastructure intermediaries were in the legal shadow since the 2000 e-commerce directive. Today we face the necessity to define liability exemptions for them and elaborate explicit categorization. Any objections? Corrections?
 
>> DENIS SPARAS: I would just like to clarify that we don’t define any liability exemption for them. I mean, it’s still the same as it was in the e-commerce directive, you know. We don’t really change much in the Articles 3-5 of the DSA.
 
>> ILONA STADNIK: Maybe I misheard something, but there was some kind of intention to clarify maybe these exemptions.
 
>> DENIS SPARAS: Yeah, maybe it’s more about the clarifications, rather than definitions. Not to give the impression we are redefining the liability exemption regime, because we are not.
 
>> ILONA STADNIK: Okay. Let’s go further. Recent cases show that infrastructure providers unwillingly do content moderation by suspending services for the platforms in an ad hoc manner without any transparent policy. Okay, I don’t hear any objections.
 
>> POLINA MALAJA: Maybe we can say “certain” infrastructure providers, because I don’t think we can generalize them. Thank you.
 
>> ILONA STADNIK: Okay. The next one – we should keep in mind the proportionality of measures on the infrastructure level to take down illegal content and foresee the collateral damage. Sorry.
 
>> DENIS SPARAS: Can we use “unintended consequences,” instead of “collateral damage”?
 
>> ILONA STADNIK: No, it’s okay.
 
>> CORINNE CATH-SPETH: And maybe add: including when these takedowns are the choice of the company, right, not forced by regulation, because they choose to.
 
>> ILONA STADNIK: Takedowns are the choice of the – could you repeat?
 
>> SEBASTIAN SCHWEMER: Related to the voluntary measures. Ilona, let me also point to the chat, where it was pointed out that at the infrastructure level there is no “takedown”, and I think maybe we can sharpen that a little in the follow-up so that the notions, the concepts, are not confused. Yeah, blocking is suggested as an alternative.
 
>> ILONA STADNIK: Okay, so, what do you propose?
 
>> CORYNNE MCSHERRY: Perhaps say “to target”? Because that covers a lot of territory.
 
>> ELISABETH SCHAUERMANN: And not to cut you short, but we are already over time and my colleagues are urging us to stop so that we can stream the closing. So, please keep it to strong objections now; more elaborate statements can be made in writing after we’re done.
 
>> ILONA STADNIK: Yes, yes. There is a possibility to edit them afterwards. So, I will finish this message and the last one. Current content moderation practices on the content level are not ideal, transparency reports are not very helpful to escape the mistakes, so what to expect from the broader infrastructure level?
 
>> CORYNNE MCSHERRY: I would say it will only get worse at the broader infrastructure level. But I’m okay with editing later.
 
>> ILONA STADNIK: Okay. So, with this –
 
>> PETRA ARTS: Sorry, just one last point, Ilona? I’m sorry. Escape the mistakes. I don’t know exactly what we’re referring to there, but I can follow up later, if we have to.
 
>> CORYNNE MCSHERRY: I agree with you, Petra. I would probably say something a little different there as well.
 
>> ILONA STADNIK: All messages will be open to public commenting afterwards, so I really invite you to do that. Okay, I think we can finish with this. Thank you very much for the session. It was really interesting to follow all of you.
 
>> ELISABETH SCHAUERMANN: Also, thank you from my side. Thank you, Sebastian, for this wonderful exchange. In a few seconds, probably the studio will be here with the closing of the day, so I encourage you to stay in this room, because also afterwards at 6:30, the mentalist show will be happening right here. So, I wish everyone a good evening.  


[[Category:2021]][[Category:Sessions 2021]][[Category:Sessions]][[Category:Cross cutting/other issues 2021]]

Revision as of 15:18, 19 July 2021

29 June 2021 | 16:30-17:30 CEST | Amphitheatre | Video recording | Transcript
Consolidated programme 2021 overview / Day 1

Proposals: #7 #55 #63

You are invited to become a member of the session Org Team! By joining an Org Team, you agree to your name and affiliation being published on the respective wiki page of the session for transparency. Please subscribe to the mailing list to join the Org Team and answer the email that will be sent to you requesting your subscription confirmation.

Session teaser

Content moderation on the Internet infrastructure level can be fraught with problems. The EU Digital Services Act (DSA) proposal addresses providers of services establishing and facilitating the underlying logical architecture and proper functioning of the Internet, including technical auxiliary functions. This workshop explores the current and future role of providers of Internet infrastructure (wireless networks, DNS service providers, TLD registries, certificate authorities, content delivery networks and others) in content moderation and how it is being addressed in the proposed DSA.

Session description

Interfering with non-content layers of the Internet in order to censor content, especially content on social media platforms, is an ongoing reality in some parts of Europe. In a way, the workshop provides a counterpoint to the discourse about restraining the excessive power of platforms, and presents cases where global platforms are instrumental in maintaining freedom of expression in a tightly controlled domestic digital environment.

At this workshop the two different realities of what is behind the concept at issue are presented side by side. There will be a contrast, but the workshop also offers opportunities to look for a minimum of common ground and to present ideas of how to avoid at least unintended disruptions of service and damage to the infrastructure.

The workshop will be structured around three main themes:

  1. Stocktaking: what is the current picture of content moderation at non-content layers (what do we know; what do we not know)?
  2. Problem identification: what are issues around moderation at non-content layers (legal, technical, societal)?
  3. Regulatory landscape: how is self-regulation looking? How does the Digital Services Act proposal suggest to mitigate issues (what is good/bad, what is missing)?

Format

The workshop will take place online. Key participants have been invited to give a short intervention followed by an interactive facilitated discussion with the audience.

Further reading

  • Digital Services Act proposal (link)
  • Study Legal analysis of the intermediary service providers of non-hosting nature (link)


People

Focal Point

Focal Points take over the responsibility and lead of the session organisation. They work in close cooperation with the respective Subject Matter Expert (SME) and the EuroDIG Secretariat and are kindly requested to follow EuroDIG’s session principles.


Organising Team (Org Team)

Subject Matter Experts (SMEs)

  • Polina Malaja (Council of European National Top-Level Domain Registries (Centr))
  • Yrjö Länsipuro (N/A)

The Org Team is a group of people shaping the session. Org Teams are open and every interested individual can become a member by subscribing to the mailing list.

  • Vittorio Bertola (Open-Xchange)
  • Giacomo Mazzone (UNDRR & Eurovisioni)
  • Collin Kurre (Ofcom)
  • Claire Pershan (EU DisinfoLab)
  • Velimira Grau (European Commission)
  • Vladislav Ivanets (N/A)


Key Participants

  • Denis Sparas, Legal Officer, European Commission
  • Dr. Corinne Cath-Speth (@C___S), Oxford Internet Institute (OII), University of Oxford
  • Petra Arts (@Petra_Arts), Senior Manager Public Policy, Europe, Cloudflare
  • Polina Malaja (@CENTR_Polina), Policy Advisor, Council of European National Top-Level Domain Registries (Centr)
  • Corynne McSherry (@cmsherr), Legal Director, Electronic Frontier Foundation (EFF)
  • Fred Langford, Online Technology Principal, Ofcom


Moderator

  • Sebastian Felix Schwemer (@schwemer), Associate Professor, Centre for Information and Innovation Law (CIIR), University of Copenhagen


Remote Moderator

Trained remote moderators will be assigned on the spot by the EuroDIG secretariat to each session.


Reporter

Reporters will be assigned by the EuroDIG secretariat in cooperation with the Geneva Internet Platform. The Reporter takes notes during the session and formulates 3 (max. 5) bullet points at the end of each session that:

  • are summarised on a slide and presented to the audience at the end of each session
  • relate to the particular session and to European Internet governance policy
  • are forward looking and propose goals and activities that can be initiated after EuroDIG (recommendations)
  • are in (rough) consensus with the audience

Current discussion, conference calls, schedules and minutes

See the discussion tab on the upper left side of this page. Please use this page to publish:

  • dates for virtual meetings or coordination calls
  • short summary of calls or email exchange

Please be as open and transparent as possible in order to allow others to get involved and contact you. Use the wiki not only as the place to publish results but also to summarize the discussion process.

Messages

  • The DSA is aimed at modernising relevant rules and limiting legal uncertainty for addressing risks in digital space, including the key question of liability of service providers, including necessary technical auxiliary functions.
  • Infrastructure intermediaries have been in the legal ‘shadows’ since the 2000 e-Commerce Directive. Today we face the necessity to clarify liability exemptions and elaborate explicit categorisation of digital services.
  • Recent cases show that certain infrastructure providers unwillingly take action that could be argued to be content moderation by suspending services for the platforms in an ad-hoc manner without any transparent policy. But infrastructure services have limited possible options, which tend to be temporary solutions (clearing cache), overbroad reactions (limiting access) or options that open up websites to cyberattack (terminating services of particular users).
  • We should keep in mind the proportionality of measures to target illegal content and foresee the unintended consequences on the infrastructure level. Current content moderation practices are not ideal and transparency reports are not very helpful to escape the mistakes, so what can be expected from the broader infrastructure level?

Find an independent report of the session from the Geneva Internet Platform Digital Watch Observatory at https://dig.watch/resources/content-moderation-internet-infrastructure-level-where-does-censorship-begin.

Video record

https://youtu.be/JUXnE4WLMUI?t=18001

Transcript

Provided by: Caption First, Inc., P.O. Box 3066, Monument, CO 80132, Phone: +001-719-482-9835, www.captionfirst.com


This text, document, or file is based on live transcription. Communication Access Realtime Translation (CART), captioning, and/or live transcription are provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings. This text, document, or file is not to be distributed or used in any way that may violate copyright law.


>> ELISABETH SCHAUERMANN: All right, everyone. Welcome to the Amphitheatre and Workshop 9 dealing with content moderation on the infrastructure level. Before I hand over to your moderator and host, Sebastian Schwemer, I will just remind you of the session rules.

Please be in the room with your full name so that we can identify you. If you want to make a point or raise a question and you are not one of the assigned speakers, please raise your hand and the right to speak will be given to you. When you are unmuted and the floor is given to you, please state your name and your affiliation and switch on your camera as well, if you can.

The chat will not be published and please avoid sharing the Zoom link outside of this room because we, for safety reasons, need people to enter through the official way. All right. And with that, I would give the floor to Sebastian and wish you all a lot of fun and good discussion.

>> SEBASTIAN SCHWEMER: Thank you very much for the kind introduction. Good afternoon, everyone. I hope you can hear me well. And a warm welcome to this workshop on content moderation on the Internet infrastructure level and the question: Where does censorship begin? I’m your moderator and the focal point of this workshop, Sebastian Schwemer, from the Centre for Information and Innovation Law at the University of Copenhagen and the University of Oslo’s Norwegian Research Center for Computers and Law. I was involved with some of the preparatory work for the Digital Services Act and am the author of one of the background studies to it, which deals with non-hosting intermediaries.

Today we have six fantastic key participants to discuss with us some, I think, very, very important questions. We also have you, the audience. So, for the process, I would invite the six key participants to come with a short thought, three to four-minute intervention in the beginning. Please, feel free to ask questions in the chat and I will try to do my best to integrate you along the way. Good!

So, who do we have with us? We have with us, and I will come with a better introduction in a second, Denis Sparas from the European Commission, and I’m very happy that he’s able to join us. We have Dr. Corinne Cath from the Oxford Internet Institute, Petra Arts from Cloudflare, Corynne McSherry from EFF and Fred Langford from Ofcom. And I think this is a fantastic panel to discuss these questions. And if we look at content moderation at the infrastructure level, first of all, I think we can ask the question: content moderation, is it really content moderation, or should we put quotation marks around what can happen at the infrastructure level? It can be fraught with a lot of problems. The European proposal for the Digital Services Act addresses providers of services that establish and facilitate the underlying logical architecture and the proper functioning of the Internet, including technical auxiliary functions, and I think that is a good starting point to discuss where we are at and what is the current and future role of providers of Internet infrastructure in content moderation, again, maybe in quotation marks.

We have three rough themes that I think will guide us through our conversation, and the first relates to stock-taking. The question of: what is the current picture of content moderation at non-content layers? And it is important, I think, to ask both what do we know and what don’t we know? We also should talk about the problems. What are the issues around moderation at non-content layers? And these can be technical, societal, political in nature.

And lastly, and this is, I hope, where we can feed into the ongoing discussions in this workshop, the regulatory landscape. How is self-regulation looking, and how does the Digital Services Act propose to address this phenomenon? Is it sufficient, what is working, what is not, what is missing, et cetera.

So, without further ado, I would like to introduce, to give his first intervention, Denis Sparas. Denis is a legal officer at the European Commission, and I think there could be no one better to present the Commission’s ideas and visions. Denis has been working on the DSA file, but also, quite some years ago, on the e-Commerce Directive. So, Denis, the floor is all yours.

>> DENIS SPARAS: Thank you, Sebastian, for a very nice introduction. I hope you can all hear me, because I have the feeling that my system is showing me some problems with the connection, so I hope everything goes well.

I will just briefly touch upon the, essentially, the Digital Services Act and where we are coming from, because you know, as Sebastian was saying, we have an excellent line of speakers and panelists, which I personally also want to listen to, and then also, I’m sure that there may be many questions from the audience, itself, so I’m really looking forward to this discussion.

I mean, just very briefly on the Digital Services Act: as already mentioned by Sebastian, this is part of the Digital Services Package adopted at the end of last year, and it is essentially one of the flagship initiatives following the Digital Agenda for Europe, which was presented by the Commission in March of 2020.

What are the objectives of the Digital Services Act, just in a nutshell? On one side, we want to modernize the rules which we have today, and Sebastian was already referring to my past work, which is essentially the framework of rules that we have, the e-Commerce Directive. And essentially, the idea is to make these rules more effective in addressing illegal content and systemic risks in the online space. But what we also aim to do, and it’s very important in the context of today’s discussion, is to clarify the rules on liability at any layer, let’s say, so the infrastructure layer, but also in the context of the hosting services.

And the idea here is primarily to give companies and market operators, but also users, legal certainty in different aspects, including in relation to any voluntary action that they may take in a proportionate manner. We can discuss at what layer this is essentially possible and relevant. And then what is also very important as one of the objectives of the Digital Services Act is to increase the transparency of content moderation decisions, ensure accountability, and facilitate better oversight. So, you know, it’s not only about responsibilities, but also about accountability.

What is important to say, in the context of the relationship between the Digital Services Act and the e-Commerce Directive, is that the proposal is essentially to be perceived as complementary to the e-Commerce Directive, which is the currently valid framework and which will essentially remain the valid framework. But of course, what is very important to keep in mind is that because the Digital Services Act covers the providers of intermediary services, it also takes over, if that’s the best possible term in this context, the provisions which are the most interesting for today’s discussion, namely Section 4 of the e-Commerce Directive dealing with the conditional liability exemptions.

The DSA essentially builds on the liability exemptions of providers of intermediary services, so it replaces the respective parts of the e-Commerce Directive. Because we are talking about a regulation, it’s simply a question of legal technique: in the directive, the provisions were addressed to Member States, while in the regulation, essentially, the provisions are addressed to the providers of intermediary services.

To the extent we have been looking into these issues and trying to tackle the identified problems, it has been considered that, since the evidence showed that the existing provisions are still very much valid and provide the cornerstones of the provision of these services in the EU as we know them today, we shouldn’t be changing something that works, but we should adapt and modify where this is necessary.

So, for example, one point which we found important to clarify is to what extent the liability exemption regime also applies to what we call infrastructure layer services. And what we aim to do through Recital 27 of the DSA is to clarify that services establishing and facilitating the underlying logical infrastructure and the proper functioning of the Internet can also benefit from the exemption from liability.

And then we also tried to clarify, by providing certain examples, which these services could be. But of course, here it’s very important to add a clarification and disclaimer, if that’s the good word to use: it will always very much depend on a case-by-case basis whether a specific service, existing today or added in the future, because we still don’t know how these services will look in the future or whether there will be some new services, falls within one of the three categories of intermediary services, so the classical mere conduit, caching and hosting, which I think we all know well.

I think also what is important to keep in mind is that the DSA also proposes to maintain the prohibition of a general monitoring obligation, which is probably something a bit more relevant in the context of the hosting services and maybe less in the context of the infrastructure layer services.

And then what is also an important and novel part of the Digital Services Act is the part on the due diligence obligations, the obligations which providers of intermediary services should comply with. And I think here, again, it is very important to stipulate that the Digital Services Act is based on the principle of asymmetric due diligence obligations, which means that, depending on the size, nature, and reach of the specific service, service providers will be subject to different types of obligations.

So, for example, when we talk about, let’s say, infrastructure layer services, the new obligations would essentially cover a relatively limited set of obligations, assuming that, for the moment, none of these would be hosting services. So these would be very limited obligations, like having a point of contact or a legal representative, if we talk about a service provider coming from outside the EU, and then having some standard terms and conditions, plus transparency about the content moderation policy.

What is also important to keep in mind is that there is a specific exemption for micro and small enterprises. And the idea behind all this is to have, you know, a balanced and proportionate approach to the role that the intermediaries play in the daily provision of such services, and in particular, when we talk about non-hosting intermediaries in the present context.

I mean, all this is based on vast evidence collection, including the study that Sebastian was referring to, which was prepared for the Commission. And for any follow-up, because we really don’t have much time for this, I would really recommend reading Annex 9 of our Impact Assessment, which accompanied the proposal and which deals with a lot of the evidence and the elements explaining why the proposal looks the way it looks, in particular when it comes to infrastructure layer services. And maybe I would stop here, just to give others the opportunity as well. I’m really looking forward to our discussion.

>> SEBASTIAN SCHWEMER: Thank you very much, Denis. And I’m sure we’ll get back to you with lots of questions and comments, but thank you so much for the first introduction and the very concrete way it was put forward.

I want to invite up Dr. Corinne Cath, recent graduate of the Oxford Internet Institute, to tell us a little bit more about the problem identification. What are we looking at here? And maybe you want to tell us a little bit more about the problems that we see with this phenomenon of looking at the infrastructure layer. Corinne?

>> CORINNE CATH-SPETH: Thank you so much, Sebastian, and thank you for the invitation to be here today. As mentioned, I am an anthropologist who studies the politics of infrastructure, which essentially means that I look at how organizations and companies that operate at the Internet infrastructure level engage in political gatekeeping, including in what today we’re calling content moderation, through their day-to-day operational work.

And so, in the past, I have written about, for example, Cloudflare’s decision to take down the messaging board 8chan in the wake of the El Paso shooting in the U.S. I’ve also considered Amazon’s suspension of services to the right-wing media outlet Parler earlier this year after the storming of the U.S. Capitol. And I think there’s two things we see in both of these cases, namely, that Internet infrastructure companies intervene in political debates by deciding to rescind their services to online platforms. Now, this isn’t necessarily a new phenomenon. Academics like Laura DeNardis have shown how essentially this is a thing that’s been going on for at least a decade, if not longer, but it seems like it’s becoming, if not more frequent, more visible to policy debates, which is a great thing.

But that does also bring me to one of the guiding questions for today’s session, which is: what are the issues around content moderation at the infrastructure level? And for me, one of the big things I see within these decisions is that they often, at least from the outside, seem to be happening quite ad hoc and at the whims of CEOs of these infrastructure companies. And I wrote a piece about this last year for the Brookings Institute together with Savannah, in which we dubbed the companies making these decisions the Internet’s reluctant sheriffs, by which we mean that for the most part, Internet infrastructure companies are naturally hesitant to position themselves as explicit political players, but when they do intervene, they often attempt to do so without a strong policy framework around how they make these decisions, and also with limited follow-up in terms of accountability measures afterwards.

So, I think one of the issues that I would like to focus on and that I think is important for this debate is, given that, you know, political gatekeeping and content moderation will continue to happen at the infrastructure level, there is a real need to make sure that its main players have a much more mature and robust framework around how to respond to those kinds of situations, and these frameworks need to be publicly available so that researchers like myself, but also policymakers and others can actually scrutinize them, as we do with platforms, and also because that creates a certain level of predictability and accountability, both towards the consumers of these infrastructure companies, but also to the Internet users at large.

>> SEBASTIAN SCHWEMER: Thank you so much for your intervention, Corinne. And I took some interesting notes that I will send back as questions, maybe to Denis and on to you: whether the framework contains these safeguards and the policy framework that you identify as lacking.

I would invite over Fred Langford, who is Online Technology Principal at Ofcom and has two decades of experience with online architecture and the content moderation landscape at large. He was also previously the IWF Deputy CEO and CTO prior to joining Ofcom, and he can provide us insight into where we are at and what is needed in terms of governing content moderation by infrastructure actors.

>> FRED LANGFORD: Thank you very much, Sebastian. And as Denis said, I’m having occasional intermittent network issues, so I do apologize if I freeze occasionally.

The issues at hand, I suppose, yeah, are very, very interesting, I think. So, from a regulatory perspective, I’m interested in the options that are available, so what comes out at the end, after all these discussions, is really key from a regulatory point of view, and where those options fit within the flow of data. So, in the stack, if we’re talking about how things go up and down the stack and how they interact with each other, it is particularly who is providing those services on that TCP/IP stack that means that action could actually be taken. I think we’ve already touched on jurisdictional issues. What are the expectations on providers of these sorts of services who are operating outside of the regulatory zone? What enforcement? I’ll use an example. There was recently a news article relating to a court judgment for Quad9, who are based in Switzerland, and what the implications were going to be for copyright protection for Quad9. What legal jurisdiction did that court have in making that decision? It’s still open to challenge, I understand, but that is an interesting concept.
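To make the stack picture Fred refers to more concrete, here is a rough Python sketch. It is an editorial illustration only; the layer names and the actions listed for each are simplified assumptions, not any regulator’s or provider’s taxonomy:

  # Illustrative mapping of intermediaries to the "moderation" actions
  # technically available at each layer. The lower the layer, the blunter
  # the instrument: actions hit whole sites or networks, not single items.
  STACK_ACTIONS = {
      "application (platform / host)": ["remove a single post", "suspend an account"],
      "CDN / reverse proxy": ["clear cached copies", "stop proxying a whole site"],
      "DNS (resolver / registry)": ["refuse to resolve a name", "suspend a whole domain"],
      "network / access provider": ["block an IP address or URL", "filter traffic"],
  }

  for layer, actions in STACK_ACTIONS.items():
      print(f"{layer}: {', '.join(actions)}")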

But assuming the two points I’ve mentioned have already been addressed, there are other considerations, such as how effective the approaches are in the first place. Is it going to provide a universal expectation of stopping access to the sort of content that people don’t want to see? But really, the main issue is: is it proportionate? What collateral damage is taking place? So, to use examples of the sorts of things that are taking place on networks and architectures already: we have URL blocking and DNS blocking; the use of key words with registries to stop domains; we have CDNs clearing caches, or not caching, access to sites, as Corinne already mentioned with what Cloudflare did with 8chan, based upon that decision as well. We have filtering solutions on Wi-Fi and also in search. But so far, these have been voluntary, and this sort of leads into the need of really being able to address those providers who should be doing more on their platforms but don’t engage. And then you need to have alternative options available to be able to do something.

I mean, some of the problems we have are obvious and have been mentioned in conversations around this: providing clear definitions to give that legal certainty to providers, if that was an expectation of them, and future-proofing. But I’ll also say that some of these are very blunt instruments, particularly when we’re talking about content moderation around particular pieces of content on large providers’ networks, so DNS blocking as an example. That would black-hole the entire site, and that was the proportionality discussion I was raising. Is that proportionate? I mean, what is the volume of content on the site overall compared to the content that’s trying to be blocked? Are we talking one image out of a billion or one image out of two? And the proportionality is very, very important here.
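To illustrate why DNS-level blocking is all-or-nothing, here is a minimal Python sketch; it is an editorial illustration with hypothetical names and toy data, not any resolver’s actual implementation:

  BLOCKLIST = {"example.org"}              # hypothetical blocked domain
  ZONE = {"example.org": "93.184.216.34"}  # toy zone data, one record

  def resolve(name: str) -> str:
      # A DNS-level block can only answer for the whole name: there is
      # no way to say "resolve the site, but not one image on one page".
      if name in BLOCKLIST:
          raise LookupError("NXDOMAIN")    # the entire site is black-holed
      return ZONE[name]

  try:
      resolve("example.org")
  except LookupError:
      print("example.org is now unreachable in its entirety")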

Also, who’s going to see this sort of content? Who are you trying to protect by stopping people seeing this content? So, there’s lots and lots of considerations, which makes our job interesting. That’s why we do it. And I think that I’m really looking forward to this debate and conversation to find out more.

>> SEBASTIAN SCHWEMER: Thank you so much, Fred, for your intervention and interesting thoughts, and this issue of the blunt instrument when intervening at the infrastructure level. I might invite next up one of the representatives of those actors, Polina Malaja. She’s a Policy Advisor at the Council of European National Top-Level Domain Registries (CENTR), where she leads their policy work and liaises with other institutions in the ecosystem. And maybe, Polina, do you want to address how you see your industry in the context of the DSA? And maybe also what Corinne mentioned, this policy framework: is that something you’re looking for, and is that something you’re finding now? Polina, the floor is yours.

>> POLINA MALAJA: Thank you, Sebastian. I hope you’re all hearing me well and seeing me as well. Wonderful. Thanks so much for organizing the discussion and inviting CENTR to outline the European ccTLD registries’ view of the debate and of the proposal.

As we heard earlier and as mentioned by Denis, the DSA is essentially an update of the e-Commerce Directive. In 2000, when the Directive was put in place, DNS service providers and domain name registries were not envisaged as intermediary service providers, as the definitions of intermediaries in the e-Commerce Directive, such as caching and hosting service providers, simply do not reflect the technical function of the DNS. So, specifically talking about the registries: they do not provide access to communication networks, and they do not store nor transmit any content through their managed infrastructure, as suggested by the definitions in the e-Commerce Directive. And now, fast-forwarding to the DSA proposal at the end of 2020, it is noteworthy, as Denis also already pointed out, that the proposal recognizes the limited but very important role of certain technical auxiliary functions, as they are called in the proposal, of Internet infrastructure services.

And as Denis said, Recital 27 is important here. Notably, the DSA confirms that domain name registries are intermediaries and should benefit from an exemption from liability for illegal content provided by the end user, but only to the extent that their services qualify as one of the intermediary categories, such as mere conduit, caching or hosting services. And as we have established already, at no point in time can registries be considered to offer services falling within any of these categories. And this, in our opinion, creates legal uncertainty for the operators, as they find themselves in a vulnerable position: technically, there is no way to target specific content at the registry level.

So, what registries can do is only to suspend the underlying infrastructure. That will, of course, have an effect on all services linked to it and on the accessibility of the service. Sorry, I had a video malfunction. So, by suspending a domain name, all the services connected to that domain name become inaccessible to end users, while the content actually still stays online. To use an analogue example here: suspending a domain name is like erasing the location of the house on the map, but the actual house is still there. You’re just not able to navigate to the house. So, in this way, it’s not really the most proportionate way to go to the registry in the first instance.
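Polina’s map analogy can be made concrete with a minimal Python sketch; this is an editorial illustration, and the domain, address and content shown are hypothetical:

  dns_records = {"news-site.example": "203.0.113.7"}                   # the "map"
  web_servers = {"203.0.113.7": "<html>content is still here</html>"}  # the "house"

  def fetch(domain: str) -> str:
      ip = dns_records[domain]  # look the house up on the map
      return web_servers[ip]    # then walk to the house

  dns_records.pop("news-site.example")  # the registry suspends the domain
  # fetch("news-site.example") would now raise a KeyError: the entry is
  # gone from the map, but the server and its content remain untouched.
  print("203.0.113.7" in web_servers)   # True: nothing was removed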

And we can actually already anticipate the unintended consequences of not applying a proper proportionality test when it comes to mandating action on the registry level. Notably, I want to highlight the European Parliament’s draft position, which suggests amendments mandating the suspension of underlying infrastructure, such as the domain name, in cases of repeated infringements under the DSA, and this also includes cases of very large online platforms, such as Facebook.com, for example. And of course, this can have grave consequences for all users of the service, and not only the ones who are engaging in illegal activities.

And these parliamentary amendments are actually inspired by the Consumer Protection Cooperation Regulation; however, they do not follow the same logic as the CPC. Under the CPC, action at the registry level can only be taken as a measure of last resort, and that important part was not carried over by the parliamentary amendments.

So, I know I’m already running out of time, but I want to very briefly conclude. From CENTR’s perspective, there is a need for an explicit fourth category of intermediary exemption under the DSA, reserved for these technical auxiliary functions of the Internet infrastructure. This is specifically important for the DNS level, since without it, the DNS cannot be shielded from content liability, while registries specifically lack any effective control over content removal.

Of course, in our position and technically speaking, action at the registry level will have a disproportionate effect on the functioning of the other services associated with and functioning on top of the DNS. And since we consider action at the registry level to be this type of nuclear measure, we also feel it can only be reserved for exceptional circumstances, when all other more effective means with other intermediaries are exhausted, and obviously, of course, after very careful deliberation by the competent public authority. So, thank you, and I’m looking forward to the discussion.

>> SEBASTIAN SCHWEMER: Thanks, Polina, for presenting us with the view of the European ccTLD registries. I would also like to see whether some suggestions from the study are going to be picked up again, and I think we’ll have an interesting discussion in just a bit.

I would invite next up, to hear another industry perspective, Petra Arts. Petra is Senior Manager for Public Policy at Cloudflare. She joined Cloudflare a little under a year ago, I think, but has worked in similar positions for the telecommunications and media company Liberty Global in Brussels and has a lot of experience in the field. So, Petra, tell us a little about the perspective that you have as yet another non-hosting intermediary actor in this field. How do you see the developments?

>> PETRA ARTS: Thanks. I hope you can hear me and see me okay. Yes, thanks very much for inviting me to speak. And this is a very important topic and I’ve heard already many interesting things from the other speakers, so it’s already a very interesting session for me.

Maybe just a few words about Cloudflare and who we are, for those that may not be as familiar with our services, first of all, just to set the scene. We are a global provider of security, reliability, and performance services for websites. We provide a wide range of services that are based on our global network of 200 data centers in 100 countries. Probably the best known are the cybersecurity protection services that we provide, DDoS protection specifically, but we also run a content delivery network service and DNS services, including the public resolver 1.1.1.1. And we have always believed in providing security protections, but also services to make the Internet a better place, essentially. So, we believe that in some cases, specifically for websites, sitting behind a large network like ours is the only way for them to be protected from cyber attack. So, we find it incredibly important that we provide those cybersecurity protections.
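To sketch why “sitting behind” a large network shields a website, here is a minimal Python illustration of a reverse-proxy arrangement; all names and addresses are hypothetical, and this is an editorial simplification, not Cloudflare’s actual setup:

  public_dns = {"shop.example": "198.51.100.1"}  # resolves to the provider's edge
  ORIGIN_IP = "203.0.113.50"                     # real server, never published

  def edge_proxy(host: str, request: str) -> str:
      # The edge terminates connections and absorbs hostile traffic;
      # only legitimate requests are forwarded to the hidden origin.
      return f"forwarding {request} for {host} -> {ORIGIN_IP}"

  print(edge_proxy("shop.example", "GET /"))
  # Terminating the customer means DNS must point at the origin again,
  # exposing 203.0.113.50 to the attack traffic the edge used to absorb.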

I think there are a couple of things that I wanted to mention that have probably been mentioned before, so apologies for some repetition from the other speakers. But it maybe goes without saying that Internet infrastructure services like the ones that we provide typically operate lower down the stack, and they are usually far removed from interaction with actual pieces of content. So, maybe first of all, and Sebastian mentioned it at the beginning: the term content moderation as a broad term for, let’s say, lower layers of the stack. We might want to be careful with using that term, because there are few proactive actions that would be proportionate, as already mentioned, to really address the specific types of illegal content that regulation like the DSA wants to address.

From our perspective, for services that cannot see, or modify for that matter, specific content, we have very limited responses to that content. Clearing cache is one of them, as Fred was also mentioning. Limiting public access to a website, as also mentioned, at the DNS level. Or terminating services of particular users. Those are the kinds of options that we have at that level. But they do not remove content from the Internet, and this is important, I think, in the context of the DSA as well, for the proportionality of it. Clearing cache is a temporary solution: it doesn’t remove content, but it can temporarily close down access to the content. Limiting access at the DNS level is a very overbroad action, a very blunt instrument, as we have noted before. And in our case, terminating a customer or a user opens them up to cyber attack, which is also not something we would want to cause, necessarily.
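Petra’s point that a cache purge removes nothing can be shown with a minimal Python sketch; the path and content are hypothetical, and the cache logic is deliberately simplified:

  origin = {"/page": "disputed content"}  # the origin server copy is untouched
  cache = {}

  def serve(path: str) -> str:
      if path not in cache:           # cache miss
          cache[path] = origin[path]  # refill straight from the origin
      return cache[path]

  serve("/page")              # the content is cached at the edge
  cache.clear()               # the provider "clears cache"
  print(serve("/page"))       # the very next request re-caches it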

So, as Polina was saying, from our perspective, the measures that are, for example, described in the DSA should be targeted primarily at those who can take proportionate action first. Infrastructure services should come in only as a very last resort, when all of the other options are exhausted and when the more appropriate players higher up in the stack have been able to take action.

Maybe specifically on the DSA, as a last remark. As Denis was outlining, there have been a number of additions, let’s say additional confirmations, especially in the recitals, regarding infrastructure intermediaries, and in Recital 27 there are some helpful clarifications about liability. And I think particularly the recital proposed around proportionality, so the staggered approach, let’s say, not only towards the different categories of the liability regime, but also outlining this hierarchy of going first to those that can take the best, most appropriate, most effective action, is extremely important to reaffirm, I would say.

And the last point, on transparency. We believe that this is a very important element to make sure that decisions about digital services are transparent generally. In our view, providers have to be more transparent not only about any actions they might take voluntarily, but also about any orders, any requests, or any kind of mandated action that they are required to take, for example from courts or competent authorities. This is obviously the case in the DSA, but also in the e-Commerce Directive. At the moment, there has not been a lot of push for transparency around judicial decisions, for example, which we think is very important, because they are such a blunt instrument which potentially affects a lot of users in any given country, or more broadly under the DSA, depending on how that will work out in the regulation. It could have an impact on freedom of expression and various fundamental rights questions, so we would want it to be as transparent as possible when providers are mandated to take action.

Just to also say, from our perspective, to underline how much we value transparency: we are the only company in our sector, on our side of the sector, which publishes transparency reports on a regular basis, twice a year. We have another one coming up very shortly. So, we encourage others in the industry to be more transparent and to also start doing this. The DSA, of course, mentions this already, and this will probably come once the regulation has been adopted, but we started with it because we think that’s just a very good practice. I’ll stop here because I think I’m already over time.

>> SEBASTIAN SCHWEMER: Thank you very much, Petra, for your perspectives on both the current landscape and the DSA, what it means for you and what would be a way forward. Last but not least, I would invite Corynne McSherry to give us her perspective. She’s the Legal Director at the Electronic Frontier Foundation, specializing in copyright, liability, open access and free expression issues, and she has broad experience. And I think, Corynne, it will be quite interesting for you to hear the kind of top-level issues that we’re dealing with here. So, the floor is all yours.

>> CORYNNE MCSHERRY: Okay. Hi! Thank you so much. And I apologize, I don’t know quite what’s going on with my camera, but I’m not actually sitting in the dark. It just seems to look like it. But I’m so glad to be participating in this conversation. And some of what I was going to say is actually a little repetitive, so I’m going to edit it down because I really want to make sure we have time for discussion.

I think one of the things to keep in mind is that the problems that we’re going to confront with content moderation at the infrastructure level start with content moderation itself. At the Electronic Frontier Foundation, we’ve been working on these issues for a very long time and have seen over and over how mistakes are made, inevitably. And over and over we see perfectly legal, legitimate content taken down, often mistakenly. But over a decade or so, many companies at the platform layer have put in place at least some processes to try to address that. But nonetheless, there continue to be lots and lots of mistakes, and it’s very frustrating from a user perspective, because what those mistakes mean is that legitimate expression is being taken off the Internet or made harder to see, and so on.

So, we’re sort of starting with that problem that the content moderation system is really pretty broken, even at the platform level. So, then it’s really worrisome when we extend it to the infrastructure level, as is already happening, as everybody knows, for a few different reasons. One of them that several people have hit on is, of course, a problem of proportionality. Depending where you are on the infrastructure level, it is extremely difficult to target your response to a specific piece of content, even if you want to, right? So, you have to take down entire sites. You take down entire services as a practical matter. This is not the fault of the companies; it’s just a matter of, you know, how the internet works. Again, people have pointed this out. So, I won’t repeat that too much.

But there are some other things that are happening here that I think are important. One is, we’ve talked about transparency. From a user perspective, that’s really crucial. You could actually, a company could put out lots of transparency reports and even make its rules and guidelines transparent, but if I’m a user, I don’t actually realize, usually, all the different services that are involved in getting my speech out to an audience. And so, I may not even know how to comply with all of the guidelines and all of the terms of service of all of the different services that I’m relying on to share my speech. And I think for an ordinary user, it’s really an impossible thing to ask them to be familiar with everybody’s guidelines, so the guideline of the DNS provider and the guideline of the CDN and so on. So, it’s a process that’s really fundamentally very opaque for users and really challenging to figure out how to make transparency meaningful.

And then as a related problem when you hit the infrastructure level – and I just want to – I think it’s just useful for the conversation to concretize this from a user perspective. What are your options for redress? What are your options for appeal? If I’m a user and I see, you know, speech is being taken down, first of all, I have to figure out who at the infrastructure level is, you know, is causing the takedown. And then maybe I need to figure out how to reach them. And very often at the infrastructure level, you just don’t have that relationship between the speaker and the service that they’re relying on.

So, this is why, when we move to the infrastructure level, the problems that we already have with content moderation get especially worrisome. And I think that the Commission has recognized this, and the DSA tries to deal with it, but I think we still have a long way to go. And while transparency is crucial, it’s not going to get us all the way there.

Then the other thing I want to say is that, often people don’t necessarily have alternatives. So, you can’t necessarily just switch to another provider if you think, well, this provider has made a mistake. I figured out who’s taking my content down, I figured out what’s going on here, but I don’t have a choice. I don’t have anywhere else to go. So, I think this person’s made a mistake, but I’m still stuck with those decisions because I can’t just switch providers. And that is what makes sort of the content moderation choices even worse at the infrastructure level.

And one thing I want to point out is that, at least at the platform level, content moderation has now extended into conduct moderation, by which I mean that we have seen platforms hiring law firms to investigate the conduct of users and then kicking those users off their services based not on what they’ve done on the platforms, not on what they’ve done online, but on their offline conduct. If that starts getting extended to the infrastructure level, that could get even worse. I worry every time I see what’s happening at the higher level, the platform level, getting extended down; it’s just going to have really negative impacts, I think, on users’ expression.

So, the last thing I’ll say, or just a couple things. One thing that I think is very valuable in the DSA is the attempt to limit legal uncertainty, because legal uncertainty just makes all of these problems worse, right? When you don’t know what you’re going to be on the hook for, if you’re a company, what that usually leads to is over-removal, because that’s safer. And so, it’s really important to limit legal uncertainty as much as possible.

I think that, as I understand the DSA as it’s currently drafted, though, there’s still going to be a fair amount of uncertainty as courts figure out which kinds of services fall within which buckets, and it’s going to be expensive for the companies that decide to litigate it, and there are going to be a lot of questions along the way.

Then finally, so, EFF has been involved in the discussions around the DSA in a bunch of different ways. We appreciate the effort that we see in Article 12, efforts in the DSA to call out intermediaries and ask them and require them to respect fundamental rights. But for the reasons we’ve just been talking about, all of us, I really worry about whether that’s going to be possible as a practical matter.

And so, one of the things that we’ve proposed and I think is worth thinking about is thinking about ways to actually put users in control, rather than just the companies in control of online expression. So, for example, promoting interoperability so that users have alternatives, so people have other services they can turn to. Transparency is very, very important, but as I said, that’s not going to take us all the way, but it’s a good start and it’s more than we get, actually, right now.

And then also actually encouraging companies to invest in tools that actually put users in control of the content that they see online, rather than relying on intermediaries to make all of those decisions, which are often going to be difficult because expression is contextual, and you can’t always figure out, from the point of view of an intermediary, especially someone down the stack, whether the content is legal, whether it’s offensive, and so on. You don’t have the context to be able to make those decisions. So, we’d love to see more efforts to actually focus on empowering users, who do have the context and are better positioned to make those kinds of decisions. And I will close there.

>> SEBASTIAN SCHWEMER: Thank you so much for your intervention, Corynne, and some really important, interesting points. There’s also a good discussion in the chat already going on, which I will try to follow up on in a second. So, one thing I was wondering, and thanks, everyone, for your contributions: there seems to be a common theme that content moderation at the infrastructure level is problematic; there is something about transparency, about proportionality, et cetera. And I wonder, now that this framework is put forward, there seems to still be some legal uncertainty, Denis, so that question I think I will play to you in a second. The second point I also want to raise is voluntary moderation. The DSA also says: dear intermediary services, your liability exemption won’t be lost if you try to do something good, which is great. But I wonder, in this sphere of voluntary moderation, shouldn’t the same principles apply that I think both Corynne and Corinne touched upon, and Polina and Petra to some extent? Shouldn’t those apply to voluntary measures as well, irrespective of whether it’s a platform taking those content moderation decisions or an infrastructure service provider, in terms of transparency, redress mechanisms and appeal, et cetera? But Denis, maybe first playing back to you. What are your thoughts on that question?

>> DENIS SPARAS: Sure. So, first thing: I think we are all in agreement, as you yourself said, on the question of transparency and legal certainty. Transparency in terms of the reporting obligations of the DSA is something which is very important here. And this is what we tried to come up with: for the providers of intermediary services to transparently present when they are making certain decisions.

And we tried to break it down by the different means by which the decisions are made. Because, when someone says it is a decision based on terms and conditions, very often indeed what we have observed in our preparatory work is that those decisions are taken with no clear indication on what basis and with what consequences. And this is what the DSA is now trying to tackle: to say, look, when you are taking these decisions, they need to be transparently presented, not only at the time of decision-making, but potentially before.

So, if you as a service provider want to have the chance to do this, you should tell that to the user beforehand. The classical example would be AWS, let’s say, taking down a service, or however you want to call it; essentially, as a recipient of the service, you want to know that. Certain ambiguity will stay. And I think we see from our work, and you as an academic, and many people around this table will see, that there are questions where replies are very difficult. And as we always say, it will depend on a case-by-case basis, and even there you may have different solutions depending on who is looking at it, which is unfortunate, because it will lead to legal uncertainty and fragmentation. But unfortunately, at this point in time, it’s very difficult to prevent all that.

I think we have the knowledge, we have the experience, but I think we are still lacking some of these things. And again, the services are developing so fast that it is very, very difficult to pinpoint at a given point in time that it’s for sure this or for sure that. Again, I’m just thinking about the studies and academic papers I have read. I mean, you can’t find agreement on one single service. So, it’s very difficult. I’m not saying we shouldn’t be trying; what I’m saying is, we have tried.

And like you said, when it comes to voluntary measures: again, even in the case of voluntary measures, we would want to know when these are employed and what the consequences are. What we wanted to provide with Article 6 is essentially legal certainty, that the mere fact that you are employing these voluntary measures does not mean, irrespective of which of the intermediary services you are providing, that you are automatically outside the liability exemption.

I think, again, what we tried to address are the calls from the stakeholders, from the industry, to address some of these open questions. I mean, the text is still improving, many people are contributing to it, and discussions like today’s are also probably very helpful in that. Whether we will have a perfect text, I’m not sure, but we can try our best.

>> SEBASTIAN SCHWEMER: Thanks, Denis. There was a question in the chat regarding the view of the DSA and the e-Commerce Directive in terms of the network layers and where CDNs and the DNS fit. Polina and Petra, do you think the proposed model, which is very function-oriented, is sufficiently clear on that, and is that the right model to work with? The question in the chat was talking about layers. I have mixed feelings, because if you think of network layers, CDNs would be in the content layer or not; one is sometimes in doubt whether to consider a CDN an infrastructure intermediary as opposed to a DNS operator.

So, Petra, maybe you want to quickly react with your thoughts? Is the DSA's world view the right world view?

>> PETRA ARTS: Thanks, Sebastian. Thanks for the discussion. We definitely think there is still some merit in the categories as they were originally foreseen in the e-Commerce Directive. As Polina mentioned, maybe there is obscurity about the DNS and how it fits into those categories, but from a CDN perspective, the main function is to cache. So, for us, that is still a very relevant category to have in the liability framework.

The main important thing for us, from our perspective – because we provide so many types of services, many of them based on the same core network, and the functionalities are based on the same technology, if you will – is that the service-oriented approach is very good, rather than, let's say, a provider approach. A provider can provide many different services, and then you get a very muddled kind of situation where you're not sure whether you're covered. But a service-by-service approach, if you will, with a very clear legal framework that applies to the core functionality of what the service does, is, I think, very important from the DSA's perspective, and it was also really important from the e-Commerce Directive's point of view.

So, in that sense, one of the things we have seen that could be further clarified in that respect is the following: there is often a service – I think it was mentioned in the chat – that provides a CDN but also a reverse proxy or security protection service. When there is a consideration of liability, for example by an authority or a court, there should be consideration of what the core functionality is and what liability is attached to each functionality of the same service, so that they are not lumped together and you lose your liability protections because a different functionality is also offered on the same network, for example. But the CDN is the core service. So, that is something we have seen in the modeling of the different layers that could maybe be clarified further.

And as others have said, and Denis just mentioned it, I think also for us the lack of certainty in the categorization of services in Recital 27 is something that might not be helpful for legal certainty across Europe, because there will still, presumably, be different national interpretations from different governments on how they would categorize services. So, that is also somewhere you can maybe see a little uncertainty arising.
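The caching function Petra describes can be made concrete. Below is a minimal sketch – a toy for illustration, not any provider's actual system – of an HTTP reverse proxy that only stores and replays responses fetched from an origin it does not control; the origin address, listening port, and cache lifetime are all assumptions made up for the example. The point it shows is that such an intermediary handles temporary copies of third-party content rather than originating it, which is what the "caching" category of the liability framework is meant to capture.

<pre>
# Minimal caching reverse proxy sketch (illustrative assumptions only:
# the origin URL, port, and TTL below are placeholders, not real values).
import time
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

UPSTREAM = "http://origin.example"  # hypothetical origin server
TTL_SECONDS = 60                    # assumed cache lifetime
_cache = {}                         # path -> (fetched_at, body)

class CachingProxy(BaseHTTPRequestHandler):
    """Serve a stored copy if it is still fresh; otherwise re-fetch."""

    def do_GET(self):
        entry = _cache.get(self.path)
        if entry and time.time() - entry[0] < TTL_SECONDS:
            body = entry[1]                        # replay the stored copy
        else:
            with urllib.request.urlopen(UPSTREAM + self.path) as resp:
                body = resp.read()                 # fetch from the origin
            _cache[self.path] = (time.time(), body)
        self.send_response(200)
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Real CDNs add TLS, header handling, eviction, distribution, etc.
    HTTPServer(("localhost", 8080), CachingProxy).serve_forever()
</pre>

Note that the only content-level lever such a service has is to drop a cached entry, and the copy simply reappears on the next fetch from the origin – a temporary measure by construction.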

>> SEBASTIAN SCHWEMER: Thanks. Polina, how would you explain the domain registry's perspective on this? Is the issue of legal certainty related to the liability exemptions? Is it related to the potential due diligence obligations? Is it related to both? Where's the issue for your part of the industry in that respect?

>> POLINA MALAJA: Thank you, Sebastian, for directing this question to me. I will simply have to reiterate myself from my opening remarks. From our side, of course, the main issue is the lack of clarity on the liability exemption. There is some intention from the legislators to shield domain name registries specifically from liability for, let's say, so-called "content moderation" – using air quotes for that. But in practice, that just cannot work under the current definitions of the three categories, which simply do not reflect how the DNS functions.

And I know, for example, Sebastian, from your previous work, that you have argued that registries could potentially be considered mere conduits under some interpretation. But for us it is very clear that, unfortunately, the DNS cannot simply be equated with providing access to a communication network. So, for these reasons, while there are really good intentions from the legislature, and from the Commission's proposal specifically, in practice it will be left to national courts to decide case by case. And of course, this will definitely not improve the aimed-for harmonization, or help avoid fragmentation of legislation within the EU.

And specifically on the due diligence obligations: these we actually fully support, because we also feel that transparency is very important, and clarity in terms of service needs to be ensured irrespective of whether a service is an intermediary or not. But maybe I will very briefly touch upon the voluntary measures. This is another issue for registries: we are not even shielded from liability at the moment, so the intention to enable voluntary measures through self-regulation is, for us, one step further away, because we do not even have a proper liability exemption to build on. So, that's from my side.
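To make Polina's point about the ill-fitting categories tangible: a name lookup is the entire interaction at the DNS layer. The sketch below – using only Python's standard library and a placeholder domain – shows everything this layer handles: a name goes in, an address comes out, and the content served under that name is never touched.

<pre>
# What the DNS layer actually handles (the domain is a placeholder).
import socket

domain = "example.org"
address = socket.gethostbyname(domain)  # name in, IP address out
print(f"{domain} resolves to {address}")

# Nothing here transmits or stores the pages behind the name, which is
# why the mere conduit / caching / hosting definitions fit poorly; and
# a registry's only lever is to stop resolving the whole domain - there
# is no per-page "takedown" at this layer, only domain-wide blocking.
</pre>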

>> SEBASTIAN SCHWEMER: Thanks, Polina. We only have one and a half minutes left before we need to present the messages, and I would still love to hear two very brief thoughts. Fred, you mentioned jurisdictional issues, which I think are quite relevant in this context. Is the framework that the DSA is setting out something that can help resolve those, or is it just skirting around some of the issues that you mentioned in your intervention?

>> FRED LANGFORD: Well, I'll pick up on something that Denis said earlier, and I completely agree: there is no one size fits all, and each case has to be judged on its merits. To pull it back to the jurisdictional issues, it would depend: if the intention was to remove access to an entire site, then it may well be proportionate to take action.

As far as the jurisdictional issues go, I do think it's a struggle, and this question is probably better placed with lawyers than with myself. A lot of it comes back to the agreements in place between particular countries and how quickly people can actually take action. And there are historical issues around that – legal, historical issues – that I couldn't possibly comment on.

But I would say the framework is, I think, a step forward. It's definitely trying to address those things. And, just to pick up on what Denis was saying, it's trying to fix what was there and stay future-proof for what's coming; it's making best endeavors, and not everything is going to be perfect.

>> SEBASTIAN SCHWEMER: Thanks so much, Fred. Corinne, I directed this question to you before and I want to hear your view again: you mentioned the lack of a strong accountability framework and accountability measures. If we talk again at the next EuroDIG, do you think this issue will have been resolved by the DSA?

>> CORINNE CATH-SPETH: That's a tricky guess to make. The thing is, we need to keep in mind that there are two sides to this debate. On the one hand, what are the different obligations that regulators like the European Commission are putting on companies, and to what extent do questions around proportionality come up in those cases?

But there's also the other side of the coin, which we haven't really spoken about during this debate: the moments when these companies proactively decide, themselves, to make political decisions about where to intervene. And it's those decisions I'm concerned about. I'm very receptive to the fact that industry has all sorts of concerns when it comes to the DSA, but what about the moments in which they themselves decide: we don't think this is okay, we don't think this is ethical? Where are the policy frameworks for understanding those decisions, and what accountability measures should follow from them?

>> SEBASTIAN SCHWEMER: Thank you so much, Corinne, and thank you all for this fantastic discussion. I think we should have scheduled two hours; we don't have that. Without further ado, I will give the floor to Ilona Stadnik, who will now present the three to five key messages from this session. I hope you will have a good follow-up to this discussion. Thank you so much.

>> ILONA STADNIK: Thank you, Sebastian. Can you hear me well? Okay. So, I was following your session, and I must confess it's a bit sad, because there are more questions than answers and solutions. I have tried to summarize it briefly in several points. Since I'm not sharing the presentation, could the presenter turn to the next slide? Okay, thank you.

So, I will read them out, and if there is rough consensus, we can leave them as they are, or we can do some minor revisions. First: the DSA is aimed at modernizing the rules and limiting legal uncertainty for addressing risks in digital space, including the key question of liability of service providers, including necessary technical auxiliary functions. Any objections? Okay, I don't hear any.

Infrastructure intermediaries have been in the legal shadows since the 2000 e-Commerce Directive. Today we face the necessity to define liability exemptions for them and elaborate an explicit categorization. Any objections? Corrections?

>> DENIS SPARAS: I would just like to clarify that we don't define any liability exemptions for them. It's still the same as it was in the e-Commerce Directive; we don't really change much in Articles 3–5 of the DSA.

>> ILONA STADNIK: Maybe I misheard something, but there was some kind of intention to clarify these exemptions.

>> DENIS SPARAS: Yeah, maybe it's more about clarification rather than definition – not to give the impression that we are redefining the liability exemption regime, because we are not.

>> ILONA STADNIK: Okay. Let's go further. Recent cases show that infrastructure providers unwillingly do content moderation by suspending services for platforms in an ad hoc manner, without any transparent policy. Okay, I don't hear any objections.

>> POLINA MALAJA: Maybe we can say “certain” infrastructure providers, because I don’t think we can generalize them. Thank you.

>> ILONA STADNIK: Okay. The next one – we should keep in mind the proportionality of measures on the infrastructure level to take down illegal content and foresee the collateral damage. Sorry.

>> DENIS SPARAS: Can we use “unintended consequences,” instead of “collateral damage”?

>> ILONA STADNIK: No, it’s okay.

>> CORINNE CATH-SPETH: And maybe add: including when these takedowns are the company's own choice, right – not the consequence of regulation forcing them, because they choose to.

>> ILONA STADNIK: Takedowns are the choice of the – could you repeat?

>> SEBASTIAN SCHWEMER: Related to the voluntary measures. Ilona, let me also point to the chat, where it was pointed out that at the infrastructure level there is no “takedown”, and I think maybe we can sharpen that a little in the follow-up so that the notions – the concepts – are not confused. Yeah, “blocking” was suggested as an alternative.

>> ILONA STADNIK: Okay, so, what do you propose?

>> CORYNNE MCSHERRY: Perhaps say “to target”? Because that covers a lot of territory.

>> ELISABETH SCHAUERMANN: And not to cut you short, but we are already over time, and my colleagues are urging us to stop so that we can stream the closing. So, please keep it to strong objections now; more elaborate statements can be made in writing after we're done.

>> ILONA STADNIK: Yes, yes. There is a possibility to edit them afterwards. So, I will finish this message and read the last one: current content moderation practices on the content level are not ideal, and transparency reports are not very helpful to escape the mistakes, so what can be expected from the broader infrastructure level?

>> CORYNNE MCSHERRY: I would say it will only get worse at the broader infrastructure level. But I’m okay with editing later.

>> ILONA STADNIK: Okay. So, with this –

>> PETRA ARTS: Sorry, just one last point, Ilona? I'm sorry – “escape the mistakes”: I don't know exactly what we're referring to there, but I can follow up later, if we have to.

>> CORYNNE MCSHERRY: I agree with you, Petra. I would probably say something a little different there as well.

>> ILONA STADNIK: All messages will be open to public commenting afterwards, so I really invite you to do that. Okay, I think we can finish with this. Thank you very much for the session; it was really interesting to follow you all.

>> ELISABETH SCHAUERMANN: Also, thank you from my side. Thank you, Sebastian, for this wonderful exchange. In a few seconds, the studio will probably be here with the closing of the day, so I encourage you to stay in this room, because afterwards, at 6:30, the mentalist show will be happening right here. I wish everyone a good evening.