Copyright – Implementation of the EU directive – WS 06 2021


29 June 2021 | 14:45-15:45 CEST | Amphitheatre | Video recording | Transcript
Consolidated programme 2021 overview / Day 1

Proposals: #4 #71 #81

You are invited to become a member of the session Org Team! By joining an Org Team, you agree to your name and affiliation being published on the respective wiki page of the session for transparency. Please subscribe to the mailing list to join the Org Team and answer the email that will be sent to you requesting your subscription confirmation.

Session teaser

EU Member States had to implement the new Copyright Directive by 7 June 2021. This workshop looks at the situation at the end of June, especially the results of the stakeholder dialogues in which the European Commission was to discuss best practices for cooperation between online content-sharing service providers and rightholders under Article 17. The workshop may also review possible ripple effects of recent and ongoing cases involving demands for compensation for the use by platforms of news produced by traditional media.

Session description

In 2019, the EU copyright directive reform was adopted, despite protests from different stakeholder groups. We are now at the point where the deadline for implementing the directive has passed, yet only a handful of Member States have implemented it in national legislation. At the same time, court decisions are looming on the compatibility of Article 17 with the fundamental rights to freedom of expression and information. In this session, we will look at the current state of the national implementations of the EU Copyright Directive and the outlook for the near future in this field.

Format

Kick-off presentations, followed by a panel discussion.

People

Please provide name and institution for all people you list here.

Focal Point

Focal Points take over the responsibility and lead of the session organisation. They work in close cooperation with the respective Subject Matter Expert (SME) and the EuroDIG Secretariat and are kindly requested to follow EuroDIG’s session principles.

  • Gregory Engels, Pirate Parties International

Organising Team (Org Team)

The Org Team is a group of people shaping the session. Org Teams are open and every interested individual can become a member by subscribing to the mailing list. List Org Team members here as they sign up.

  • Giacomo Mazzone, Eurovisioni
  • Marco Ciurcina
  • Amali De Silva-Mitchell

Subject Matter Expert (SME)

  • Yrjö Länsipuro

Key Participants

  • Patrick Breyer, MEP
  • Paul Keller, Open Future Foundation
  • Justin Jütte, UCD Sutherland School of Law
  • Ania Helseth, Strategic Policy Manager EU Affairs at Facebook

Moderator

  • Gregory Engels, Vice Chair Pirate Parties International

Remote Moderator

Trained remote moderators will be assigned on the spot by the EuroDIG secretariat to each session.

Reporter

Reporters will be assigned by the EuroDIG secretariat in cooperation with the Geneva Internet Platform. The Reporter takes notes during the session and formulates 3 (max. 5) bullet points at the end of each session that:

  • are summarised on a slide and presented to the audience at the end of each session
  • relate to the particular session and to European Internet governance policy
  • are forward looking and propose goals and activities that can be initiated after EuroDIG (recommendations)
  • are in (rough) consensus with the audience

Current discussion, conference calls, schedules and minutes

See the discussion tab on the upper left side of this page. Please use this page to publish:

  • dates for virtual meetings or coordination calls
  • short summary of calls or email exchange

Please be as open and transparent as possible in order to allow others to get involved and contact you. Use the wiki not only as the place to publish results but also to summarize the discussion process.

Messages

  • Copyright can contribute to negative trends such as surveillance and control on the internet. There is therefore the need to combat such a practice and the idea that machines can govern what we can see or read on the internet.
  • On copyright infringement: Whereas filters are designed to identify matches between different sources, they are not designed to recognise context.
  • Content control requires supervision to make sure that the impact on different stakeholders is proportionate. An institutional mediator needs to be established or an existing actor should be given legitimacy to deal with content control.

Video record

https://youtu.be/JUXnE4WLMUI?t=12454

Transcript

Provided by: Caption First, Inc., P.O. Box 3066, Monument, CO 80132, Phone: +001-719-482-9835, www.captionfirst.com


This text, document, or file is based on live transcription. Communication Access Realtime Translation (CART), captioning, and/or live transcription are provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings. This text, document, or file is not to be distributed or used in any way that may violate copyright law.


>> ELISABETH SCHAUERMANN: All right. Hello. Welcome, everyone. I would just briefly share my screen. My name is Elisabeth Schauermann. I have the distinct pleasure of hosting this room. And in order not to lose too much time, I will in a few seconds hand over to Gregory Engels, who is the moderator of the Workshop 6, dealing with copyright, as we just heard.

Just two quick words on the session rules in the room, as you might already know. Please be in the room with your real name, with your full name. If you want to ask a question or make a statement during this interactive workshop, please use the hand raise function and I will then assign the speaking rights to everyone who wants to make a point.

When taking the floor, please state your name and your affiliation. The chat will not be published, so you can freely exchange there. And please, do not share the Zoom link out of this room because we need to make sure that we stay protected there. All right, and with this, I hand over to you, Gregory, for this very interesting workshop.

>> GREGORY ENGELS: Well, thank you very much. We are a tad late, but I heard we can go a little bit over time, since after our session there is a break scheduled. I hope this is okay with all your time constraints.

So, the topic of Workshop 6, the session that is now to begin, is copyright, and specifically the state of the implementation of the EU Copyright Directive, and also the outlook on what is next and what will follow from it. We have quite a lineup of speakers and experts here. I’d like to begin with Paul Keller, who has been following the implementation of the EU Copyright Directive from the activist perspective since the beginning, and who actually also maintains a website that, Eurovision-style, gives points for the implementation of the Copyright Directive in different countries, pertaining to different aspects: how transparent the process is, how well it has been adapted to input from stakeholders, and so on.

Then we have Patrick Breyer, Member of the European Parliament. Hello, Patrick. He is with the Pirate Party from Germany, same as myself. I forgot to introduce myself in the beginning, so I will introduce myself in a minute. Patrick is in the European Parliament as part of the Greens/EFA Group and is also working, not exactly on the Copyright Directive -- it’s already set -- but on other topics that are yet to come or that will have influence in the same area of governance, and I hope Patrick will tell us more about it.

Then we have Bernd Justin Jütte, who is a researcher on copyright and also explores what the consequences of the implementations are, well, for everybody, I guess. And I’m also glad that we have Ania Helseth with us, who is with Facebook, which as a platform operator is directly targeted by this whole legislation and now has to implement it, and we will hear from her about it.

So, my name is Gregory Engels. I’m Co-chair of Pirate Parties International. I live in Germany. And it was my suggestion to EuroDIG to look at the outlook after the EU Copyright Directive had been passed, now that it is being transposed by the Member States. So, I ended up being asked to be the Focal Point for this session, which is why I’m organizing it. So, in case you’re asking.

So, this session will consist of brief presentations from the speakers in the beginning, and then we go into discussion. And we also have a very important participant here: Natasa Perucica from the Geneva Internet Platform, who will be taking down key messages from this session and will present them at the end. We will reserve five minutes at the end so we can agree on a rough consensus, and those messages will be submitted to the main conference and then finally to the IGF. So, whatever we discuss here, we have a chance to make an impact and influence others.

So, without further ado, I would like to hand over to Paul, Paul Keller, to give his initial presentation. Paul, are you ready?

>> PAUL KELLER: I’m ready.

>> GREGORY ENGELS: Great.

>> PAUL KELLER: Hello, everybody. Can you hear me? Okay. People can hear me. Okay. So, my name is Paul Keller. I’m President of the Association for the Public Domain and Director of a think tank called Open Future. We’ve been working on the copyright reform package for its entire duration, so since about 2014. We’ve done preparatory work. We’ve tried to influence what would come into the proposal for the directive and then have worked on the directive throughout the entire legislative process. And after the directive was adopted in 2019, we also started an implementation tracking project, which means that we’ve worked with local partners in almost all of the EU Member States, helped them understand the provisions of the directive, helped them engage with the respective government agencies tasked with implementing this in national law, and tried to help them provide input to make sure that the directive gets implemented in what we would call a user rights-preserving way.

And as most of the people here in this call, I assume, will know, the directive had an implementation deadline, as all new directives have, which is the deadline by which the EU Member States would have needed to implement the provisions of the directive into their national laws.

Now, why do I say “would”? Because it’s pretty common that not all Member States manage to do this by the deadline. This is not an unusual thing. I think we also went through a pretty unusual time where a lot of government attention was focused on other things, so it’s not a big surprise that by now only a minority of the EU Member States have actually fully implemented the directive.

And as Gregory mentioned in the introduction – and I’ll try, if I can share my screen, to show this a little bit – we’ve actually made an implementation tracking website, which you should be able to see now, which tracks the implementation progress of the directive in the 27 EU Member States. And as of today, to our knowledge, there are only four Member States which have fully implemented the provisions of the directive into their law; that is, in the order in which they implemented: Hungary, the Netherlands, Germany and, we just learned about this, Malta, where it apparently happened on the 18th of June, so almost two weeks ago. There are two other Member States which have implemented parts of the directive, and these are Denmark and France. Both of them have only implemented those parts of the directive which protect or strengthen the rights of authors and rights-holders, so the controversial Articles 15 and 17. They have not implemented the provisions that contain exceptions and limitations, for example for education or for the use by cultural heritage institutions, et cetera.

And so, what we see at the moment is a very incomplete picture of implementation. For this website, we have tried to rank these implementations based on the issues in the directive that have been important to us, and I encourage all of you to go to this website and check out some of the countries. Generally, what has been important to us has been, on the one hand, the transparency of the process: have Member States actually given the interested public and other stakeholders the ability to express their concerns and maybe also submit suggestions as part of the national implementation processes? And the other thing that has featured very highly is, of course, the discussion around the controversial Article 17, which implements these upload filters. There we’ve looked at how far Member States have focused on implementing user rights safeguards. Those countries which you see here which score highly, for example Germany, are countries which have implemented the directive in a user rights-preserving way. Most of the other countries (look at France, or Italy, which hasn’t implemented yet, but also the Netherlands) have done relatively little to preserve and protect user rights in the implementation of Article 17.

And we believe this is at the moment probably the main point which keeps a lot of Member States from implementing the directive, because there’s still a lot of uncertainty about how Article 17 should actually be implemented. The European Commission at the beginning of June issued implementation guidance which gives Member States some direction on how to implement Article 17, and that guidance stresses the important element that there must be no fully automatic filtering of user uploads unless an upload can be recognized as manifestly infringing, so it can be clearly seen that something is infringing. In other words, the Commission says: when in doubt, do not filter; keep the content up and do a human ex-post assessment.
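
The decision rule Paul describes here can be made concrete. The following is a minimal, hypothetical sketch in Python (the threshold values and field names are invented for illustration; they are not taken from the guidance or from any platform's code): automated blocking applies only to matches that look manifestly infringing, and everything else stays up and is queued for human ex-post review.

    # Hypothetical sketch of the guidance logic: block automatically only
    # what is "manifestly infringing"; when in doubt, keep the upload
    # online and queue it for human ex-post review.
    from dataclasses import dataclass

    @dataclass
    class Upload:
        content_id: str
        match_score: float      # similarity to a rightholder's reference file (0..1)
        matched_portion: float  # fraction of the upload covered by the match (0..1)

    MANIFEST_THRESHOLD = 0.95   # illustrative value, not from the guidance

    def moderate(upload: Upload) -> str:
        # A near-exact match covering almost the whole upload is treated
        # as manifestly infringing and may be blocked automatically.
        if (upload.match_score >= MANIFEST_THRESHOLD
                and upload.matched_portion >= MANIFEST_THRESHOLD):
            return "block"
        # Partial or uncertain matches may be quotation, parody, etc.:
        # do not filter, keep the content up, review afterwards.
        if upload.match_score > 0:
            return "keep_up_and_queue_for_human_review"
        return "keep_up"

    print(moderate(Upload("a", 0.99, 0.99)))  # block
    print(moderate(Upload("b", 0.99, 0.30)))  # keep_up_and_queue_for_human_review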

We haven’t seen much impact of this guidance on the Member States, and we believe that much of this is probably due to the fact that there is also, at the moment, a case pending before the European Court of Justice as a result of a challenge from the Government of Poland, where the Court will need to decide if Article 17 is in line with the fundamental rights protections enshrined in the Charter of Fundamental Rights. That court case will not be decided until the end of this year at the earliest. So, there’s still a lot to happen around the implementation, both for Member States to implement, but also with a decision from the Court of Justice on whether this controversial Article 17 is actually in line with fundamental rights or not. And with that, I’ll give it back to you, Gregory.

>> GREGORY ENGELS: I need to find my mouse to unmute. Thank you very much. Then, Patrick, if you don’t mind, if you’re ready. Great then.

>> PATRICK BREYER: Thank you, Gregory, and thank you for inviting me. I’m happy to be participating in this EuroDIG event on the reform of copyright. And allow me to say some words about myself. I’m a longtime digital freedom fighter, a civil rights activist and, since 2019, a Member of the European Parliament for the Pirate Party. There are in total four Pirate Party MEPs in the Parliament, so that certainly boosts the importance of digital rights in the European Parliament.

Now, in 2019, shortly before the elections, the European Parliament adopted the controversial copyright reform, which includes mandatory upload filters, in spite of massive criticism from NGOs and experts, more than 150,000 people protesting in the streets on the 23rd of March, and more than 5 million citizens who signed petitions against it.

I would like to compliment everybody who was involved in defending fundamental rights back then and taking this to the streets, even if, in the end, thanks to those massive protests, the opposition to the directive in the European Parliament fell only a few votes short. The greatest protests that ever took place for the Internet were ignored by the European Parliament and most national governments, and by adopting this reform, a substantial part of our digital freedom of speech online has been sold out in exchange for hopes of increased profits for the content industry. This has created a precedent for an easily abusable, error-prone and ineffective automated censorship infrastructure. Depending on how this directive is implemented, it will cause more or less damage to Internet users and digital rights. Only together, and with sufficient public and media attention, can we manage to minimize the damage caused by this copyright reform.

Now, when it comes to the details of the implementation, I’m most familiar with the situation in Germany, which has implemented Article 17 in specific legislation, and which is also the only country that has come up with mechanisms that attempt to protect rights while still having upload filters in place, which is really a contradiction in itself, by introducing the idea of presumably legitimate uses for which automated filtering and blocking is not permitted. This system somehow alludes to the guidance that the European Commission has published, which says that automated blocking can only be applied in cases of manifestly infringing uploads. But in the last-minute version of the law, an exemption was introduced for cases where copyright violations would result in high economic damage: there, automated blocking is still allowed, even if it is unclear whether the use is illegal at all. That’s really a dangerous erosion of the Commission’s position, and it also calls into question its position in court.

Now, after this vote in the European Parliament, many people were really depressed because it seemed all had been lost in the end. Still, the protests were so effective that they compelled the parties in the German government to publicly promise that they would mitigate the problems as well as possible; some even promised that they would do without any upload filters at all, which, of course, they couldn’t keep. It had a massive effect on the elections in Germany, because the government parties lost a lot of votes. It had an effect on the negotiations on the terrorist content online file, for which I was rapporteur for my group: even though in the future they will be able to have content deleted in all EU countries, which is terrible, we managed to keep mandatory upload filters out of it. There is an express exclusion of any filtering obligations in the regulation, and that is thanks to the protests and attention that you have generated.

Let’s see. Yes, so, we can tell that those countries where the protests have been most intense are also the ones that are most careful when it comes to the implementation, such as Germany, which I’ve mentioned, but also Austria, or Poland, which is even challenging the legality of Article 17 in court. Now we can expect the Advocate General to give their opinion soon and the judgment to come out next year, and I expect that there will be a restrictive interpretation in our favor, or maybe even an annulment of the mechanism as insufficient to protect fundamental rights.

But as Gregory announced that I would open up the perspective a little, let me make a few more general remarks. Copyright will always be a driver of surveillance and control on the Internet. So, we really need to tackle the problem at the root, and there is a European Citizens’ Initiative called Freedom to Share, which is collecting signatures to legalize personal, noncommercial file-sharing and to compensate authors for it. That is a much more reasonable approach and would basically make superfluous a lot of the surveillance and control mechanisms that are attached to copyright law and copyright enforcement.

Furthermore, we are currently negotiating the Digital Services Act in the European Union. And the proposal from the Commission lacks a ban on the error-prone upload filter censorship machines that result in overblocking. The proposal could even be read as allowing these to be made mandatory for large platforms; there is no exclusion of it. Also, platforms are to be held responsible for illegal content as soon as they are informed about it, which could really only be handled with automated filters. So, it’s a dangerous proposal, even though it doesn’t explicitly mandate filtering.

Nonetheless, in the European Parliament, the position is clear that there should be no obligation to use these error-prone filters. We want to exclude that. And I’m also working and fighting to restrict voluntary filtering, because for the users affected by overblocking, it doesn’t matter if a filter is mandated by law or mandated by YouTube. So, we’ll see how far we get in imposing limits and safeguards on the filtering that the platforms already do.

Finally, the EU is currently preparing legislation that would oblige electronic communication services, such as email, messaging and chat services, to indiscriminately and automatically filter all private correspondence for alleged child pornography and to automatically report any suspicious content to the police, which is a system unprecedented in Western countries. And I think this is one of the repercussions of this copyright debate, because the EU Commissioner for Home Affairs, Johansson, explicitly referred to this Copyright Directive and said: look, we can protect copyright, but we can’t protect our children? So, this is the harm done by that kind of legislation: now they are using the filters even on private communications and such. General monitoring would really have a chilling effect.

It would also have a chilling effect on the counseling of victims; it would criminalize children, who are often the subject of such investigations; and it would mostly accuse innocent citizens of crimes they have not committed, because these filters are so error-prone. It’s as if the post office would open all letters, just in case there is illegal content in them. This so-called chat control legislation is something we need to fight, so please take the time to check it out and make sure that the secrecy of correspondence is respected on the Internet.

So, to close: the Internet makes it technically possible to apply blanket censorship at an unprecedented scale, which is a road that China is going down, for example, and which we must not allow. We really need to fight the idea that machines programmed by international corporations should govern what we can say, read and know on the Internet. That is not something we should accept as a new normality. Let’s work and struggle to get back to a human-centered content moderation system that safeguards user rights. Thank you.

>> GREGORY ENGELS: Thank you, Patrick. That was interesting input, and I think we will come back to it in the discussion; I already have some questions. But first, I will hand over to Justin Jütte to hear the academic perspective.

>> JUSTIN JUTTE: Thank you, and thank you for having me. I’ll keep it brief and focus on three points, mainly in relation to the discussion as it concerns Article 17, but I’ll also try to go slightly beyond that, always within this microcosm of the CDSM Directive. And I think Paul has already given us the main lines of discussion here.

The first point I want to make, more underlining a point already made: we are currently in an implementation limbo, and I think this is a symptom of a larger illness we’re facing. There are three weird things we’re currently juggling. We have one directive, passed more than two years ago. We had the guidance that came far too late. And we have a pending action for annulment, which is nothing very unusual in itself, but one that has been so intensively discussed over the last couple of years; it’s not unprecedented, but it’s certainly unusual.

One brief contradiction of Paul: he mentioned we might have to wait until the end of the year. There is one precedent where it was decided quickly (not legislation, but an act), one of the cases where the Court of Justice took, I think, less than a month after the AG published the opinion to come down with a judgment. But let’s hope the Court of Justice will reflect a bit more on some of the more fundamental and critical questions that have to be asked in this context.

But one of the reasons why we have this implementation limbo is that the directive in itself, specifically Article 17, is, I would argue, fundamentally and systematically unsound. We have provisions that are largely in contradiction. One example is the bold statement that there shall not be any filtering or monitoring of information for the purposes of finding copyright infringement. It really comes down to the question: how do we define filtering? Because if we do, indeed, allow private right-holders to earmark content that then has to be pre-checked in some form under human control, this is effectively pre-filtering or pre-monitoring. You can call it whatever you want to; I think this is a dangerous development.

On the other side, I see no other way to effectively enforce against blatant violations of copyright if we do not allow some form of filtering, but this has to be absolutely controlled, and I’ll come to that in a minute or two. But this demonstrates that there is an inherent contradiction in Article 17 that somehow has to be resolved, and I would argue it is probably not easily resolvable.

You can also look slightly outside of Article 17, into Articles 4 to 7, where we had a positive development in European copyright law, in the sense that we suddenly have mandatory exceptions beyond what we find in Article 5 of the Information Society Directive. But again, the European legislature takes one step forward and one step back: it doesn’t really retract what it gives, but it provides the opportunity, for example with Article 5, to take advantage of a carve-out, to provide an exception to an exception. This, again, systematically doesn’t make much sense to me. We can also look at Article 17, Paragraph 7, where the European legislature says we have to enable certain exceptions and limitations online, namely quotation, parody, and so on and so forth. But the guidance provides that these exceptions, now mandatory online, are not necessarily mandatory in the offline world, and this is not something that necessarily contributes to what I, as a legal scholar, would call legal certainty.

And finally, coming to the Polish challenge to Article 17, or to certain parts of it, though by way of a subsentence or subclause potentially to the entire provision: there is a constitutional question. One question is: is this indeed in compliance with fundamental human rights? For reasons I have briefly explained, I personally don’t think that Article 17 is in compliance with fundamental rights. You can argue about the question: does filtering suppress freedom of expression? Certainly it does, but it might be justified. But there are also larger issues. Is the right to a fair trial engrained in Article 17? Is the enforcement of copyright by private actors, whether under monitoring or filtering or human control (I don’t want to call it policing), in line with the right to a fair trial as laid down in Article 47 of the EU Charter of Fundamental Rights? I would argue most probably not. Again, whether something else is possible remains open to discussion.

And the other question, of course, is: is this in line with the freedom to conduct a business under the EU Charter? We are laying incredible burdens on private platform operators, which will significantly force them to invest and to undertake efforts, and this is also something we have to take into consideration, in addition to the rights of the right-holders that we are trying to protect here.

These are many interests that we have to balance, but the way Article 17 does it does not provide legal certainty, and I think the platforms are significantly struggling to get their heads around this weird provision. The second question we can ask ourselves, which is also a constitutional question: does Article 17, looking at the difficulties Member States have in implementing it, actually contribute to efficient harmonization, which is the aim and purpose of every single piece of EU legislation? I would argue, certainly not. Just look, and Paul has done that expertly in a number of blog posts, including on the Kluwer Copyright Blog: Finland has one model. Germany has another model. Hungary has yet another model. And there are more to come, and this doesn’t contribute to a level playing field for platforms or users or rights-holders anywhere in the European Union.

I’m not so grim as to say that the Court of Justice will strike down Article 17. I hope it will provide proper guidelines and draw very, very clear red lines in the sand to say where right-holders can go, where platforms can go, but also where users can go within the scope of Article 17. And that’s it from my side.

>> GREGORY ENGELS: Thank you, Bernd Justin Jütte. So, we’re coming to our last presenter here, and then we go into discussion. And since we started late, we can extend a little bit into the break, so we are good on time; take whatever you need. So, Ania Helseth from Facebook, the stage is yours. Basically, how do you feel? All this criticism about Article 17, and now you have to implement what others decided for you? But I don’t want to put any words in your mouth, so speak for yourself.

>> ANIA HELSETH: Thank you for having me, Gregory. And thank you, everyone, for having me here today. Just as a brief introduction: I think I have met some of you already. I work in the Brussels office of Facebook, where I used to deal with the Copyright Directive. Right now, I have moved on to also deal with the Digital Services Act, so I’m not as close as all of you here to the implementation work, but I remain very engaged, of course.

From our perspective, I wanted first to reflect on the compliance process that we, obviously, set up at Facebook, and then I’m very happy to discuss some of the issues raised by my co-panelists, especially around what platforms do with regard to monitoring and so-called filters, which I know always attracted lots of attention during the policy discussion.

But from the perspective of a company: on June 7th, the clock ran out, and it was for us to show our compliance. I know some countries, as was helpfully shown on the graph here, are not yet there, but the company always needs to be there, just in case the regulators would like to check. I think that makes sense.

We always engaged with the Copyright Directive discussion from the premise that we take IP protection rights on the platform very, very seriously. I know lots of points were raised by what we used to call the other side, though I don’t think that reflects it well: right-holders are just part of the Facebook community, and we take their rights very seriously, just as we prioritize the rights of our users. And obviously, there was lots of discussion about how that might impact the possibility of engaging in discussions, et cetera.

Right now, the law is here, as I said, so we need to get going and show our compliance. We also wanted to factor in the discussions that we had with right-holder groups and with the European Commission. Our platform is genuinely for engagement by people who really want to connect with friends and communities, rather than for what we call professional content. But we also know that it is used by the news and creative industries, and that’s how they find new audiences and support their own business models. We’re very happy that it’s useful; it’s just that, with the Copyright Directive too, this is where our priority as a company lies.

In terms of practical arrangements for Copyright Directive compliance, we are making some changes, or have already started making some changes, to make sure that the content owner has even more control over the content and how it is displayed. So, for instance, we will no longer display the full link previews that we currently show, unless the publisher allows for it. Other people will then only see the hyperlink and a very short extract. Of course, for every post, the publisher will have an opportunity to indicate the display they prefer. They can do this through the Business Manager notification, which is a kind of platform for them to manage their content on our platform.
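
As a rough illustration of the display change described here, consider this hypothetical sketch (function and field names are invented for illustration; this is not Facebook's actual API): the full preview is rendered only if the publisher has opted in; otherwise only the hyperlink and a very short extract are shown.

    # Hypothetical sketch of Article 15-style link preview gating.
    # Names are invented for illustration; this is not Facebook code.
    MAX_EXTRACT_CHARS = 90  # "very short extract"; the directive leaves the length undefined

    def render_link_post(url: str, title: str, summary: str,
                         thumbnail_url: str, publisher_opted_in: bool) -> dict:
        if publisher_opted_in:
            # Publisher allows the full preview (title, summary, thumbnail).
            return {"url": url, "title": title, "summary": summary,
                    "thumbnail": thumbnail_url}
        # Default after the directive: hyperlink plus a very short extract only.
        return {"url": url, "extract": summary[:MAX_EXTRACT_CHARS]}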

In addition to that, and I think this is less related to the compliance work and more to the work that we undertake with the news industry more broadly, in the space of the ongoing work streams we have on how to tackle misinformation, how to handle disinformation, and how to make sure we are contributing to a healthy news ecosystem: we have been making very significant investments in the news industry since 2018, amounting to $600 million, and we plan to invest over $1 billion in the next three years, on top of the compliance processes.

We also know that a lot of the time people use Facebook to connect to the publishers, and we are making sure that the traffic we are generating is clearly showcased to our publishers, who can then better understand the trends and how users are consuming the news on our platform.

In terms of Article 17 (because what I just described was mainly related to our work on Article 15, which I think is also quite important): we were waiting for the guidelines for a very long time, because to us, and to the Member State regulators we were engaging with, these guidelines were a quite helpful tool that wasn’t coming, in terms of explaining how the principles of Article 17 are to be implemented in a practical manner. Obviously, some countries went ahead and didn’t wait for the guidelines. I think that was unfortunate, but I also understand that they were trying to meet the deadline.

From our perspective, there was, of course, always space for the guidelines, or the clarification, to give us more. But two important factors are there, which we kept mentioning in various stakeholder meetings during the copyright dialogue discussions. First and foremost, there is an understanding that there is freedom of contract, so the authorization is not something that has to happen in one particular way, pushed by one particular group; there is actually a broader understanding of how content can be authorized, and licensing is not the only path forward.

And the second part of these guidelines that we were looking into really carefully is the understanding that there are some technical realities that need to be factored in. Having that understanding was really helpful from our side: the fact that it has to be technologically neutral, it has to be future-proof. So, indeed, to the points that were raised: obviously, lots of the discussion focuses on content filters, or the techniques that various platforms use to monitor content. By the way, this does not just refer to copyright; it also refers to content that is potentially in violation of all kinds of laws, where we undertake certain proactive measures. But there is an understanding that this is not the only solution, and that the work being undertaken can potentially include other tools, work that focuses more on collaboration with right-holders rather than just an aggressive approach of taking everything down and then, perhaps, checking whether it was taken down correctly.

As for the court case, which we obviously monitor and whose result we are very interested in, as you can imagine, I am not in a position, as a company representative, to comment on any ongoing cases. But I just want to confirm that we are reading all of the referenced blogs and materials, and we would, perhaps, love to say more, but we need to wait for the court ruling to address it and make an official comment. Thank you.

>> GREGORY ENGELS: Well, yeah, thank you very much. So, first, Patrick mentioned the European Citizens’ Initiative, Freedom to Share. And I need to take this opportunity, because I am part of the citizens’ committee that is actually organizing this initiative, which is aiming to collect 1 million signatures to propose to the European Commission to allow private, non-commercial file-sharing of copyrighted materials, and at the same time to come up with regulation that would reimburse the right-holders in a fair way. And we don’t say how it should happen. It could happen in different ways, for instance through a tax on Internet advertisement or mandatory licensing through collecting societies. We don’t say what the right way is, but we do say that there should be a way that ensures that people get paid for their work and, at the same time, allows private individuals to share what they own with their friends. If that were implemented, there would be no effective need for upload filters, since it would all be legal and licensed.

But I hear now that the filters being implemented have been planned in a way that allows them to be used for different requirements, including those pertaining to the anti-terrorist regulation. And I also heard from Patrick that, since upload filters are now established as a tool that the European Parliament is willing to pass into law, there have been attempts to extend them to other areas, like filtering private communication and email communication with regard to protecting children and fighting other things. So, this is basically what the activists had warned about: that any filters introduced would be the first step towards some kind of censorship in a global way.

And I see now hands raised from – I think Paul was first, and then, well, everybody. Okay. So, Paul, then Ania, then Bernd. Justin. Sorry. Okay, Paul.

>> PAUL KELLER: Thank you. Since you came back to the filters, I wanted to make one point about them, which I did not really talk about before. There was, of course, the stakeholder process that the Commission had to organize subsequent to the adoption of Article 17, which was meant to feed into this guidance that Ania and others have mentioned. I think I mentioned it as well.

This was, until Corona struck, an actual physical process, where there were about 100 stakeholders for six meetings in rooms in Brussels, and the Commission really diligently organized this and had various rounds that included representatives from platforms explaining the technology they use, and a lot of companies who sell content recognition technology presenting the state of the art of that technology. So, there was some discussion. And one of the things that became very, very clear during this stakeholder dialogue is that filters are incapable of doing what the proponents of this article expected from them. Filters can, to a large degree, recognize if a song that is uploaded is the same as or similar to a song that somebody owns, right? So, they can identify matches between things; they can recognize one thing in another place.

What filters cannot do, and what all of the technology providers, from YouTube to Facebook to the companies who sell this as a service, have said, is understand the context in which a use is being made; these tools are not designed for that. And at least for copyright, the specific thing is that a work is not legal or illegal as a characteristic of that work. It is always dependent on the context, the way it’s used.

So, what came out of this stakeholder process is that everybody agreed, some stating it more happily than others, that essentially, filters cannot be used to determine if there’s copyright infringement or not. And that is one of the very important clarifications; it’s one of the key statements contained in this Commission guidance. The Commission says filters cannot deliver this, so we need to find mechanisms to work around this inability of filters.

Filters may work in other contexts. Certain types of content may be illegal as a characteristic of the content itself. But in the copyright context, that’s simply not the case.
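
A toy example makes the match-versus-context point concrete. Content fingerprinting of the kind the vendors presented can be sketched as overlap between hashed byte "shingles"; nothing in that computation sees whether the reuse is licensed, a quotation, or a parody. This is a hypothetical sketch, not any vendor's actual algorithm.

    # Toy content fingerprinting: overlap of hashed n-byte "shingles".
    # It can tell that two items share material; it cannot tell *why*,
    # which is exactly the context question copyright turns on.
    def fingerprint(data: bytes, n: int = 8) -> set:
        return {hash(data[i:i + n]) for i in range(len(data) - n + 1)}

    def containment(a: bytes, b: bytes) -> float:
        fa, fb = fingerprint(a), fingerprint(b)
        return len(fa & fb) / max(min(len(fa), len(fb)), 1)

    original = b"verse 1 of a protected song, repeated hook, bridge, outro"
    parody = b"verse 1 of a protected song, with new satirical lyrics over it"

    # Prints a substantial match score; the number is identical whether
    # the reuse is infringement, quotation, or parody.
    print(round(containment(original, parody), 2))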

>> GREGORY ENGELS: Okay. Thank you. Ania?

>> ANIA HELSETH: So, sorry, I don’t know how to put down my hand, but I will deal with that after. No, I absolutely agree with Paul here. What I wanted to clarify is, indeed, that in the discussion, which at some points during the copyright debate was very heated, there was a kind of understanding, or maybe, actually, a lack of understanding, of what companies such as Facebook are trying to achieve when they try to deal with content such as terrorist content or child sexual abuse material (CSAM), versus copyright, where the content may infringe someone’s rights while being perfectly legal per se; it’s just a question of who is trying to post that content. Because, obviously, if you take CSAM-type content, that is never legal, no matter who is trying to post it. It just doesn’t have any space on the platform whatsoever. It also includes us trying to reach law enforcement and to inform them about any attempt made to upload such content to the platform.

So, the approach we have there, the PhotoDNA kind of technology, the terrorist content regulation, the shared database that we have with other companies: it’s all there to address this type of content in a very fast way, assuming, basically, that this content should just not come onto the platform at all, and it really doesn’t matter who is trying to post it. It could be anyone making the attempt, and that person would be treated in the same way. Whereas in the Copyright Directive discussion, I think the assumption was that this same approach would be taken, and I think this is the really interesting misconception here. Because from our perspective as Facebook, the only entity who can really tell us who owns certain content is the right-holder. We can’t really do anything very proactively here, because we risk impacting people’s freedom to engage in discussions and, basically, freedom of speech.

Obviously, to accommodate, on the other side, the very valid complaint from the right-holders that it is a little bit burdensome to keep telling us all the time that something they own is being shared illegally, we did set up this tool called Rights Manager. But it is not really a filter in the sense that it hits every type of content, takes it down, and has no recognition that there are false positives we need to address. It is based on a discussion with the right-holder, an explanation from their side of what kind of content they own and on which basis they own it. By the way, that then leads to interesting discussions the right-holders often have with each other, because oftentimes they claim ownership of content and it then turns out that they don’t own it, or that they share it with some other groups. So, it is a much more sophisticated tool, needed to address these types of situations where we have to factor in so many different rights and, first and foremost, make sure that our users are not impacted by aggressive takedowns of content. I would say Rights Manager is not just some blind filter that tries to take everything down and stifle people’s ability to speak. So, these are just a few points from our side.
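
To illustrate the difference drawn here between a blind filter and a claim-based tool, here is a hypothetical sketch (names and structure invented; this is not the actual Rights Manager implementation): matches are evaluated against a rightholder's declared ownership and chosen policy, overlapping claims are routed to a dispute between the claimants, and content without an applicable claim stays up.

    # Hypothetical sketch of a claim-based rights tool (invented, simplified).
    from dataclasses import dataclass

    @dataclass
    class Claim:
        rightholder: str
        territories: set    # where the ownership claim applies
        action: str         # "block", "monitor", or "review"

    def resolve(claims: list, upload_territory: str) -> str:
        applicable = [c for c in claims if upload_territory in c.territories]
        if not applicable:
            return "keep_up"
        if len(applicable) > 1:
            # Overlapping ownership claims: the rightholders dispute it
            # between themselves before anything happens to the upload.
            return "open_ownership_dispute"
        return applicable[0].action

    claims = [Claim("LabelA", {"DE", "FR"}, "block"),
              Claim("LabelB", {"FR"}, "monitor")]
    print(resolve(claims, "DE"))  # block
    print(resolve(claims, "FR"))  # open_ownership_dispute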

>> GREGORY ENGELS: Thank you. And now Justin?

>> JUSTIN JUTTE: Yeah. Well, just a few remarks, and I think Ania and Paul have already essentially drawn the main lines here. I’ll focus on a very simple question. In the end, we won’t get around it: whatever content control we are applying (and I think I need a lesson on what filtering actually means in the context of this legislation, because the more I talk about it, the less I tend to understand it), and it will have to be automated, requires some sort of supervision to make sure that the impact on all the relevant groups is proportionate.

And as a lawyer, I do get stomach aches, and I don’t want to blame anyone for this, if right-holders and the platforms then negotiate privately, which probably is necessary in some way, how content will be kept up or taken down, and in which time frame. What we were hoping the guidelines would provide, in that sense, wasn’t really there. I think there’s no way around it: there must be ample oversight of who decides what content stays up, is blocked or is temporarily blocked, and in which instances users can quickly intervene. But this is missing: some control that is intermediate and doesn’t require a user or right-holder to go through various stages of litigation, potentially up to the European Court of Justice or the Court of Human Rights, to have their rights properly vindicated. There must be a shorter way. Otherwise we either handle it purely economically, and thereby infringe on the rights of large numbers of rights-holders, or we render copyright enforcement completely unfeasible on the Internet, which is also not how I would prefer that we operate on the Internet.

I think some sort of institutional mediator, whether it has to be newly established or whether existing institutions can take this up, would be absolutely necessary just to give the system legitimacy. Or we have to say we have to rethink the entire thing. Martin said it pointedly recently: it’s quite a mess. Nobody really wants to deal with it, and I don’t see how, realistically, we can do that to the satisfaction of everyone. We won’t be able to make everyone happy, but I think there’s probably a better way to do it somehow.

>> GREGORY ENGELS: Okay. So, we have two questions in the chat, and I think we will take them. After that, it will be time for the summarized statements from Natasa Perucica. So, the first question in the chat is from Nathalia Sautchuk: you talked about the upload filters, but another contentious point of the directive is the link tax; can you talk about this point, and how German law is dealing with it? So, this pertains to the Article 15 implementation. Patrick, do you know how it’s implemented in German legislation?

>> PATRICK BREYER: I think Paul can explain that for us.

>> GREGORY ENGELS: I think Paul is quite competent, yeah. You’re muted, Paul.

>> PAUL KELLER: Sorry. Let me speak not particularly to the German situation, but to the overall situation. This has been a somewhat controversial thing as well, but I think in the end, the compromise that was found is proving to be a little bit less harmful. There are conditions. There are a lot of open-ended definitions, such as what counts as “very short extracts”, which will probably be fought over in court. But the way this ended up in the directive, and the way it ended up in the Member State legislation, while not making a lot of sense, at least does not really create a situation that is directly problematic for users engaging online. So, I think we will see a lot of deal-making between platforms and press publishers, or maybe in some situations also legal conflict between platforms and press publishers, as they try to figure out the modalities of how to deal with this new right.

But I think it raises many fewer practical questions for users, and maybe also many fewer fundamental legal questions. You referenced the term “link tax”, which was mainly a defensive maneuver, a way of saying: this must really not overreach. And I think to some degree, the opponents of the link tax have also managed to contain this tool. It would have been better not to have this thing at all, but it has at least been contained to sensible proportions. And I think Bernd wants to add a little bit.

>> GREGORY ENGELS: Yes. Please, go ahead.

>> JUSTIN JUTTE: I think there is one fundamental distinction between Article 15 and Article 17. Article 15, you may like it or you may not like it; it’s good law and there is a normative decision that’s been made and we have to live with that.

Article 17, you can also decide to like or not, but there are fundamental issues with it that have to be resolved before we apply it. With Article 15, there was never a question about its constitutionality; it has just been a question of who actually wants to pay for it. But Article 17 is much more complicated. So, the questions we have to ask here are different ones, and they are more critical in relation to Article 17, I’d say.

>> GREGORY ENGELS: Okay. So, okay, Patrick?

>> PATRICK BREYER: Maybe I can add to this, because this ancillary right, or link tax, has also been a matter of great controversy in Australia, where it has really resulted in a clash. The problem with the link tax is not only that the creators won’t see any of the money it is supposed to generate; the problem for society is that it disincentivizes links, or at least previews, as Ania has explained for Facebook. We are already struggling with the problem of disinformation and fake news, and with the algorithms that promote sensationalist and oftentimes problematic content.

And if we now reduce the visibility of quality content and information, that is really counterproductive to the objectives we are trying to achieve. So, this is not the way to help publishers. What would help them much more, for example, is to create a level playing field when it comes to advertising, because here we have the practice of surveillance capitalism, where advertising is based on the surveillance of users. If we made sure that digital advertising is just as contextual as advertising in newspapers, on television, et cetera, and does not rely on far-reaching, intimate personal information, as it oftentimes does, then that would help the publishers a lot more than a mechanism that will basically not solve their economic problems.

>> GREGORY ENGELS: Yes. Thank you, Patrick. Given the time – we are allowed to run a little bit over time, but not endlessly – and since I heard that the takeaway messages are almost ready, and I see Natasa Perucica has switched on her cam, I will hand over to Natasa. Is that correct? You’re muted. Natasa, you’re muted. Thank you.

>> NATASA PERUCICA: Thank you. I wasn’t able to unmute myself. Thank you for the floor, Gregory. My name is Natasa Perucica, and I’m a Geneva Internet Platform Reporter. The Geneva Internet Platform is basically the official reporting partner of EuroDIG, and we will be providing, as already mentioned, key messages and session reports from all the workshops.

So, now I will present the messages, which will also be included in the report published on the GIP Digital Watch observatory. And I would just like to remind you that they will also be available for additional comments; EuroDIG will provide more detail on that. So, my first message from this session is that copyright can contribute to negative trends, such as surveillance and control on the Internet. There is, therefore, the need to combat such a practice and the idea that machines can govern what we can see or read on the Internet. In case you have any objections to this message, I would kindly ask you to include your comments in the chat. If not, we can take it that there is consensus on the message.

The second message is the following: Whereas filters are designed to identify matches between different sources, they are not designed to recognize context. The third and last message is content control requires supervision to make sure that the impact on different stakeholders is proportionate. An institutional mediator needs to be established or an existing actor should be given legitimacy to deal with content control. That would be all from my end.

>> GREGORY ENGELS: Okay. So, I look to my presenters and give you the opportunity to react; you have just heard the messages being read. So, do you have any strong objections against what’s been said, or is it okay?

>> ANIA HELSETH: So, I just have one comment. I do think these messages probably reflect everything that everyone said in this panel, so I’m not just trying to say what I disagree or agree with, but did we really come to a conclusion on the mediator for the last point? I’m not sure if I’m there, but otherwise, if the rest of the group is, then I’m happy to sign on to that as well.

>> GREGORY ENGELS: So, I wasn’t – I looked away, so I don’t know who had their hand raised first. Paul, please.

>> PAUL KELLER: Okay. First, thank you for the summary. I think it’s excellent, but with the third point I’m also wondering a little bit if it really reflects the discussion; I’m not so sure about the consensus. I tend to agree with what Bernd said, but I mainly have a bit of a problem with “content control”, because that can be used for very many things, and I think what Bernd said is more about the balance between automated content recognition or content control mechanisms and fundamental rights. So that may need to be a little bit reworded. The first two are really excellent summaries as far as I’m concerned.

>> JUSTIN JUTTE: I could quickly follow up on that. That wasn’t meant as a proposal for consensus; it was a provocation. Precisely because I think this is the only way to create a proper balance that would be acceptable. But I also understand this is something that platforms wouldn’t really be happy with, because it creates even more work for them to coordinate their work with such a mediator. And it would also cost money; we would have to set up an agency and so on and so forth. It’s a hypothetical construct that would need a lot of work, and I am fully aware that this would never generate consensus anywhere.

On the first point, where my co-panelist tends to agree, I think there is a certain danger of tainting copyright here. Copyright, I think, is an easy laboratory to experiment with enforcement in, and there might be spillover effects, but on the other side, it’s not where everything starts. Copyright is economically very important; this is why, in many instances, it is the area where new technologies are being tested in the field of enforcement. But putting it so bluntly, I think there is a bit of a danger of tainting copyright. So, I would only sign this half-heartedly. Saying that it can contribute, absolutely; but, and I speak as a lawyer, saying that it can contribute also means it cannot contribute. So, in that formulation, if you want to have some sort of a memorandum, I think it is actually relatively worthless. Either it has to be worded in much, much stronger terms or it has to be more specific.

>> I read that as copyright enforcement, which I think makes it more specific. Not the idea of copyright in itself, right? That, to be clear, is something that has its value, that is what a lot of people make their living on, and that should not be fundamentally questioned.

>> GREGORY ENGELS: Yes. And then Patrick also has his hand raised, and he is another lawyer.

>> PATRICK BREYER: I agree that that would be good. On the second point, while it’s true that filters can identify matches, there is still a problem where the databases used for that matching are incorrect. For example, when it comes to chat control and looking for child pornography, the Swiss police say that 86% of the hits that are reported to them are not even criminally relevant. So even if you use those filters, if the databases are flawed, you will still see a massive amount of false flags.
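
This point can be checked with simple base-rate arithmetic. In this illustrative calculation (all numbers assumed, except the 86% figure cited above), even a filter with a low technical error rate produces mostly false flags when true positives are rare in the scanned material.

    # Base-rate arithmetic behind error-prone filtering (illustrative numbers).
    scanned = 1_000_000
    illegal_rate = 0.001          # 0.1% of scanned items actually criminal (assumed)
    sensitivity = 0.99            # filter catches 99% of true positives (assumed)
    false_positive_rate = 0.005   # 0.5% of legal items wrongly flagged (assumed)

    true_hits = scanned * illegal_rate * sensitivity
    false_hits = scanned * (1 - illegal_rate) * false_positive_rate
    share_false = false_hits / (true_hits + false_hits)
    print(f"{share_false:.0%} of flags are false")  # ~83%, the order of the 86% cited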

And on the third bullet point, I think it might be important to differentiate between human supervision and human review. What we are trying to do, also in the German implementation, is to make sure that any flags by filters are reviewed by a human, who is able to understand the context, the intention, et cetera, and the identity of the person posting it. Mere human supervision of the filtering mechanism is not the same; that wouldn’t be sufficient.

>> GREGORY ENGELS: Okay, good. Thank you. And then we have a hand raised from Amali De Silva-Mitchell. I can’t read the full name. Mitchell, yes. So, can you unmute? Thank you.

>> AMALI DE SILVA-MITCHELL: Yes. Hello. Thank you. My name is Amali De Silva-Mitchell. What I’m trying to link here is privacy, copyright and Point 3. I know a little bit about privacy, and I am just wondering whether the traditions of the courts come from a similar place. As there are lawyers here, I would be really interested in that; it is quite an interdependent set of points. Thanks.

>> GREGORY ENGELS: I’m sorry, I need to – well, you had a question about how this is comparable to privacy laws, but currently we don’t have the time to go into that. The question now is about strong objections, and we have collected some input. I’m not entirely sure what happens next, so perhaps Natasa Perucica can explain whether the messages will be reworded. I know there is a mechanism by which we can send in comments before they are finalized, so I will distribute how to do that to the panelists. So, Natasa? Thank you.

>> NATASA PERUCICA: Okay. So, I’m unmuted. I will forward the messages to a colleague of mine, and they will be available on the EuroDIG platform, and then you can reword them and change them as you wish. I think, from what I’ve gathered, there will be a sort of discussion on the wording of the messages.

>> GREGORY ENGELS: Yes, yeah. I also think we cannot finalize this now; I don’t hear a very strong objection, but I think, to make it powerful, we would need to spend some time discussing. It’s 16:03, and I’ve been asked to end by 16:05 at the latest, so I am within the deadline. Thank you all very much for being brief and precise. And, well, I think it was an excellent session in terms of discussion and content.

The messages will also be presented tomorrow as part of focus session number 4, so if you like, you’re welcome to drop in and join there. And, well, I say good-bye. Thank you very much for coming.

>> ELISABETH SCHAUERMANN: Thank you, everyone, for the great session. Just for those who are unsure about how to get back to the Gather, here are the instructions. And in this room now we will have a 20-minute presentation in a few minutes on the regulatory approach to artificial intelligence and then the next workshop, Workshop No. 9 starts at 4:30 also in this room. So, you can stay or you can go, but I hope everyone has a good time. I’ll see you later.

>> ANIA HELSETH: Thank you so much. Thank you, Gregory, and the rest of the panel.

>> GREGORY ENGELS: Yes, thank you for jumping in at such short notice. It was really –

>> ANIA HELSETH: No problem.

>> GREGORY ENGELS: – win for us.

>> ANIA HELSETH: My pleasure. Thank you.

>> GREGORY ENGELS: Thank you very much. So, okay. So, I say bye. I have to run to a different meeting, too, as Paul did. So, see you later.