Hate speech and its consequences for human rights online – WS 08 2014

From EuroDIG Wiki

13 June 2014 | 11:30-13:00
Programme overview 2014

Session subject

An examination of how multi-stakeholder internet governance does or does not take into account human rights online, specifically with regard to hate speech issues

Session description

The Internet has become a global space for creativity, communication and participation. Online, we can create, share and use media content in a variety of ways and with very little effort. This is even more the case on social networks, where we can upload, forward, comment or promote contents. Internet users, and young people in particular, have a right to perceive their online interactions as benefitting from the freedoms of expression and information. Consequently they should expect what is communicated online to be uncensored. However, reality tells us that the online world is also a space where the values of human rights are often ignored or violated. Among others, hate speech online has become a major form of human rights abuse, with very serious consequences for people, both online and offline. Young people are directly concerned as victims, targets, active and passive agents. But hate speech affects all of society. Hate speech as such is not a new human rights issue. However, its online dimension and the potential negative impact on democratic development give new reasons for concern. One of these reasons is that the online manifestation of hate speech is difficult to monitor, measure and counter.

A debate/discussion on the responsibilities of all stakeholders on the internet – users, governments, civil society, social network corporations etc. – in this regard. The European Union Agency for Fundamental Rights (FRA) will present its findings related to cyber hate, including harassment and hate speech, from the latest FRA surveys: the EU LGBT survey, the survey of Jewish people’s experiences and perceptions of discrimination and hate crime in European Union Member States, and the survey on gender-based violence against women. The surveys asked about offensive, threatening comments made in person, sent by email or SMS, and posted through social networking sites.

People

  • Focal point: Bridget O’Loughlin, Council of Europe
  • Live moderator: Bridget O'Loughlin
  • Rapporteur: Mario Oetheimer
  • Remote participation moderator: Viktor Szabados, facilitator of the No Hate Speech Movement, IRPC board member, ICANN stakeholder
  • Digital facilitator: [[User:Sebastian|Sebastian Haselbeck]], CoLab
  • Panelists/speakers:
  • Satu Valtere, Finnish National Campaign co-ordinator of the No Hate Speech Movement
  • Maja Rakovic, Counsellor, Ministry of Foreign Affairs, Serbia (tbc)
  • Patricia Cartes, Twitter
  • Vida Beresneviciute, FRA
  • Steven Lockhart- Blogger, Activist
  • Andrej Bencel, National Campaign Co-ordinator Slovakia
  • Youth stakeholder, NHSM campaign online activist - tbd
  • Participants:

  • Sara Serrano Lattore, Blogger, Activist
  • Igor Beciric, Member, National Campaign Committee, Serbia

Format of this working group at EuroDIG

Workshop/Debate

Protocol / Discussions

See Discussion page

Additional notes (open collaborative Gdoc)

Further reading

Consult our site: http://www.nohatespeechmovement.org/

Messages

Reporter: Adriana Delgado, Activist, No Hate Speech Movement

  1. As a new form of public space, the internet needs to become more inclusive and matters of internet accessibility and literacy should be addressed, starting with next year’s EuroDIG session.
  2. The absence of a universal definition of hate speech is one of the major problems when addressing the subject in a borderless space such as the internet.
  3. Given how easily deleted content resurfaces on the internet, and how the online ethos is one of great freedom of speech, education and awareness raising could be better alternatives for dealing with hate speech.
  4. Although no consensus on this matter was reached, the topic of the limits of freedom of expression with regard to hate speech was discussed. Legally speaking, free speech is not an absolute right, and limitations are defined by national law.

Video Record

https://www.youtube.com/watch?v=3HBg5c9_osw

Transcript

Provided by: Caption First, Inc., P.O. Box 3066, Monument, CO 80132, Phone: +001-719-481-9835, www.captionfirst.com


This text is being provided in a rough draft format. Communication Access Realtime Translation (CART) is provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings.


>> BRIDGET O’LOUGHLIN: Good morning, everybody. I’m really happy to see so many people in the room for our short – or long debate, perhaps. Let’s start with a short movie on the issue of hate speech on the Internet. Thank you very much, Andre. Andre is handing badges around, you have to wear a badge.

I’d like to introduce our panel. My name is Bridget O’Loughlin, I’m the campaign coordinator for the No Hate Speech movement. I look after the European side. I bring our activists, who are all volunteers, in touch with our national campaign coordinators, who are usually people who work with youth councils, which are usually NGOs, or with the government.

We have an important lady here today from the Steering Committee on Media and Information Society at the Council of Europe, a committee that has obviously been looking a lot at issues to do with the Internet. Can you stop it for a moment, please? Has it started already? The sound?

(Video:)

>> There are now so many more ways to spread ideas to people worldwide than ever before. So we can see that the Internet –

>> BRIDGET O’LOUGHLIN: We also have with us the representative from the fundamental rights agency, Peter, who is going to provide to us surveys that have been done about hate speech.

We also have Steven and Yana who are two of our active activists who are dealing with hate speech on a daily basis, I think I can say.

And we also – sorry? Can you turn the sound off, maybe? Thank you. That’s better. I’m competing with an excellent cartoon video. So I don’t want you to see it yet.

We also have two members of committees here Andrej, who is from Slovakia, and Igor works with a very effective campaign.

We have as our remote moderator, Viktor Szabados and Sara Serrano Lattore, who is also an active activist. We rely on young people to be active and help us with the campaign, especially because they know more about the computers and Internet than I do.

So we’re going to show you to begin with a short video which was made by some of the activists in Portugal, a group that call themselves the No Hate Ninjas. I’d like to show it to you because I think it sums up in a very short time everything that this campaign is about.

[Music.]

>> The Internet is a global system of interconnected computer networks. It is a network of networks that consists of millions of private, academic, business and government networks that are linked by a broad array of electronic, wireless and optical networking technologies. As more people around the globe become connected, they see, read and hear more. There are now so many more ways to spread ideas to people worldwide than ever before, so we can say that the Internet is a virtual place without borders. It is everybody’s place and nobody’s place. And we are all in it together.

The Internet is the place where you can choose one or more identities; or, if you want, you can stay anonymous. You can also exchange messages with your friends, get bar codes from online shops or share pictures of unicorns or cats. No doubt, new technologies have been a force for good. But the Internet has a dark side, because it has also made it easier to share hate.

Online hate speech is one of the most serious problems that comes with the Internet. Hate speech, as defined by the Council of Europe, covers all forms of expression which spread, incite, promote or justify racial hatred, xenophobia or antisemitism, or other forms of hatred based on intolerance, including intolerance expressed by aggressive nationalism and ethnocentrism, and hostility against people with disabilities, minorities, migrants and people of immigrant origin. How do we respond? What do we do? On one hand we have the freedom of expression: everybody is free to share their opinions. On the other hand, freedom can only exist when it doesn’t limit others’ freedom.

Freedom of expression goes hand-in-hand with the demands of a democratic society. It is a necessary condition for the enjoyment of our democratic ideas, providing space for public discussion and debate, without which there is no democratic society. The question is: when does freedom of expression become an expression of hate? Where do we draw the line between speaking our minds and expressing hate? Or is hate speech part of freedom of speech? Whatever we say has consequences. Freedom of expression doesn’t give us the right to offend, discriminate, hurt, hate or be intolerant, because freedom comes with responsibilities. How do we stop hate? How do we regulate the Internet?

The first issue is that the Internet has no borders. The borderless nature of the Internet means that closing down an offending website, page or service provider doesn’t solve the problem, because there are many more waiting behind the walls or across the border.

The other issue is the anonymity of the Internet. People who are able to post anonymously are far more likely to say awful things, sometimes with awful effects. Speaking from behind a wall disassociates the responsibility from the person. It is far easier to hit the send button without second thought under those circumstances.

So what should be done?

It is very hard to create a prohibition or a prescription against the free flow of information. We have to deal with hate speech in other, more creative ways. We need some other solutions. We need to make people aware.

Public awareness of hate on the Internet – through reports and studies where they are needed – can go a long way to help sensitize the public.

We must never be indifferent to hate and discrimination. Everybody can do something against hate speech, and every action counts. Fighting hate is a shared responsibility. First, we have to remember that actions on the Internet are real actions, so use your real name. If you see a hateful comment, video or photo, report it to the administrator. And, most of all, think critically. Think before you comment and think before you share.

>> BRIDGET O’LOUGHLIN: Thank you. So I think that – I like this short movie because I think it does demonstrate the difficulties surrounding how to deal with hate speech and the fact that you can legislate as much as you like against it. It’s not necessarily going to stop it because people can go elsewhere or they can hide very easily.

So, now what we’d like to do next is to show you that there are real issues to deal with around hate speech. And Vida – I’m going to mispronounce your name – okay – is going to tell us about the surveys that the FRA has done on a number of vulnerable groups who have experienced hate speech. And I pass the floor to you.

>> Can I have my PowerPoint presentation? Thank you. So my name is Vida Beresneviciute and I represent the European Union Agency for Fundamental Rights. And I will talk today from the perspective of the surveys that the agency does. I will show the list of the surveys that have been carried out so far since 2008, but I will focus in my presentation today on the three last ones: the EU LGBT survey, the survey on discrimination against the Jewish population, and the large survey on violence against women.

So the first two are on the minority groups that have been mentioned; the last one is on women, who are not a minority but still, yeah, might be a vulnerable group in relation to certain issues.

So I won’t bother you with the methodological differences between the surveys, but I will be happy to answer any questions you might have. And I’ll give you some figures and facts about the issue. Just to mention that the data collected relate more to harassment – to what we call cyber hate – and not so much to hate speech as such.

I think what you’re going to see from the presentation is that, yes, the Internet is being used as a vehicle for hate and harassment; that the Internet and social network sites offer platforms for the expression of racist, xenophobic sentiments. And we do say that online manifestations of hate crime are an increasingly serious problem, and that some groups of the population are more vulnerable to becoming victims of hate crime.

So let me start with the survey of the Jewish population in 8 EU member states, which was carried out in 2012. It was an online survey of self-identified Jewish persons aged 18 and older. It asked a lot of questions, but let me show a few results from the survey. The sample was close to 6,000 respondents from 8 EU member states.

So what does the survey show? That the majority – like 3 in 4 respondents – said that antisemitism online or on the Internet is a big problem or a fairly big problem, and that it has been increasing over the past years, as the question was formulated. So we see a clear message that it is a problematic issue.

When asked about specific forms of anti-Semitic harassment, listing specific situations people had faced, we see that 1 in 10 of the respondents had received text messages, letters or other communication with offensive or threatening content. And a similar portion, like 1 in 10, had experienced offensive comments posted about them on the Internet, including social networking sites.

Let me turn now to the EU LGBT survey, which was also one of the biggest surveys the FRA has carried out. It was done in 2012. It was again an online survey, in 28 EU member states – at that time 27 plus Croatia. And a sample of over 90,000 respondents was reached. Here are some of the results from the survey: 1 in 5 of the respondents, like 19 percent, had been victims of anti-LGBT harassment in the 12 months prior to the survey. And when asked where it happened, you see that some cases happened on the street or at the workplace, but also like 10 percent report online incidents. And, yeah, just to remember that we are talking about offensive or threatening incidents that offend people.

And also, some of the most serious cases happened online, when we asked the respondents what the most serious cases were.

So once again it is in line with the survey mentioned previously: a share of the population faces these issues in life.

And, finally, I would like to present some findings of the EU-wide survey on violence against women. This is the largest survey the agency has carried out so far. It covered all 28 member states and was carried out through a random sampling technique with a sample of 42,000 women. While the previous surveys were of people who self-identify with certain groups, this one used random sampling, where each woman had the same probability of taking part.

So the survey asked a lot of questions about experiences of physical, sexual and psychological violence. I won’t elaborate on the richness of the data; I would just like to share some facts. The survey asked about stalking. Of course it did not use the word stalking, but what we mean is repeated offensive or threatening acts perpetrated several times by the same person against the respondent. From the data, we see that 15 percent of women across the EU, across the 28 countries, have faced offensive or threatening communication since the age of 15; 8 percent have been victims of following or loitering; and 3 percent of damage to property.

When focusing on cyber stalking, we asked whether a respondent had received text messages or emails that were offensive or threatening, whether any such comments had been posted on the Internet, or whether any videos or photos had been shared – just to have an idea of the specific situations that have been faced.

So the next figure shows the distinction between stalking – any form of stalking experienced – and cyber stalking. Based on the FRA survey, we see that 5 percent of women in the European Union have experienced one or more forms of cyber stalking since the age of 15, and 2 percent in the last 12 months before the survey.

But you should know that the frequency was twice as high in the 18 to 29 age group as in the other groups. That’s the group that is most involved with the Internet, but we also see that young women are several times more likely to become victims of cyber hate.

And it has consequences. When asked about the impact – what have you done after the most serious case of stalking? – like 23 percent of women said that they had changed their phone number or email address, and less than 7 percent closed their social network account, which, as you know, involves a lot of personal time and effort.

And just to finish the presentation of the data, some data on sexual harassment specifically, or cyber harassment. We asked the respondents: did they receive any unwanted, offensive, sexually explicit emails or SMS messages? And then, were there any such postings online, in Internet chatrooms or on websites? Here again I would like to draw attention to the age differences. The blue graph shows the experiences since the age of 15; the red, or reddish, column shows the experiences in the last 12 months. And here again we see that the younger the respondents, the more likely they are to become targets of cyber harassment. It is twice as high in the 18 to 29 group as in the 30 to 39 and 40 to 49 groups, and several times lower in the older groups.

So once again, what we see is that it is happening, and that some groups are more targeted than others.

So you can find more data on our website, which has a data visualization tool where you can actually go question-by-question and select what you want, to have a deeper look at the details of the surveys. But as I said at the beginning, yeah, it happens on the Internet: the platforms are used for expressing hate-motivated sentiment, and certain groups are more likely to become the targets. So here is the email address where you can ask any questions you would like, and have a look at our reports and data visualization tools. Thank you.

>> BRIDGET O’LOUGHLIN: Thank you very much. To continue the theme and perhaps wake everybody up, because it’s the middle of the morning, I would like everyone who has encountered hate speech – has seen it, not necessarily had it addressed at them, but has seen hate speech online – to stand up.

And I would like anyone – no, stay standing. And now I would like anyone who had hate speech directed at them to stay standing. Everyone else can sit down.

And now I would like those people who have reacted to hate speech when they’ve seen it to stand up. Not so many. I think we all often witness it and we ignore it, either because it’s not directed at us or because we’re afraid to react to it. Thank you. You can sit now. You’ve had your exercise for the day.

So I just wanted to bring that out, because it is always shocking to me how many people stand up when asked whether they have witnessed hate speech online. I think we all know it’s a serious issue, or we wouldn’t be in this room.

Okay. So I would like to ask my key panelists a few questions. But this is certainly not going to be a series of presentations. We would very much like to get input from everybody in the room who would like to speak.

So I think my first question would be based on what I just said. Yana, how do you think we should react to hate speech when we see it?

>> YANA: Well, the things that we used to tell our activists – from the beginning, when I started to get involved with the campaign and afterwards organized trainings myself – it was, first: if you see the troll, don’t feed the troll, basically. But when you do counter-argue, you need to be really constructive in the things that you’re saying. You need to find the core of the idea and elaborate on that issue, so the person will see that you’re not just saying they’re an idiot for saying so, but that you argue your case. And if your arguments are interesting enough, more and more people will join the discussion, so you can get more and more arguments from the interested people.

But personally, the thing that happened to me – and I was probably going to tell this afterwards – there was the case of a forum, a racist forum from the States. The way it came into my life was quite unconnected with this thing. I was going to write a blog post for the No Hate Speech Movement about some cases of hate speech online and what we can do about it, and I wanted some challenge. Because we all know that there are Facebook rules that are built in; there are Twitter rules that are built in. So basically, when you collect enough reports, you can have content banned or report the person. This is not the case for forums or privately-owned web pages, where basically you cannot do anything.

And this forum – can you please put up the video presentation? Is that the only one? Yeah, this one. Thanks a lot.

And basically you have this forum where whoever has a darker skin color is called a monkey. And what they put there is hatred: crimes committed by “monkeys”, their supposedly typical attitudes, and so on. The basic message of this forum is a call for violence against those people. So I started to analyze, as a case study, the dangers of this forum: who is targeted? What is bad about it? The forum is quite messy. It is the third version, because the first ones were blocked after several attempts, but it’s like zombies arriving again and again. The motif is a zoo, and they have pictures of monkeys all around. They hide behind hidden identities. The first thing you see when you go to the rules of the forum is that no person should identify themselves; no person can post photos of themselves, so nobody can track them. But it has been shown that most of the hate crimes committed in the United States of America are committed by people from this forum; they were tracked by their IDs. Still, the forum exists. There is a whole filtering procedure when somebody wants to join the forum and see the discussions or participate in them – you can see them, but you cannot say a word; you need to go through a kind of trial procedure where you need to make at least three racist jokes. And if you are approved by the community and they say you’re cool enough, you can join. But afterwards there is a rule that no non-racist comment can be posted there. So whatever is there should be 100 percent racist, and if you are not active enough or not racist enough, you are banned from the forum.

So when I wrote the blog post, the first thing they did was put the logo of the Council of Europe on their front page. And they said that the No Hate Speech Movement and the Council of Europe were proudly advertising them – March 2014 is when it happened – trolling us and playing with this thing. So we got all the people there.

Well, the next day they put this up. So we were getting feedback from them. There were lots of people commenting on the blog. I started to get lots of hateful comments on my Facebook page, because I was the person who proposed to link the author of the blog with some personal information – because for me it was important that I stand by my words. So if a person has something to say to me, they can contact me and tell me. So I started to get hateful comments, and honestly I didn’t expect that many. But also, around the fourth day, we found a petition that had actually existed for one year, started by a darker-skinned woman whose daughter, who was 22, had been personally attacked. The daughter didn’t have one hand, and what the people from this forum did, without any previous conflict, was put photos of this girl all around the city. And they said: look what they produce; they shouldn’t multiply. So it was a direct call to target the girl. And she had to change first her school, then her city.

So what they basically do on the website is find targets who can’t react to them, and they put up their personal data. You can find the phone numbers of people who have disagreed with them. You can find emails, addresses, whatever. My email was also honored to be there for quite a while. Yeah. They put up my photo from Facebook. And they said that I’m an enemy of free speech, unless the forum targets me personally. I don’t know how this forum could personally target me, because I’m not their supposed target audience to begin with. But, still, we were trying to spread this one all around, and also the petition, which by that date had 110 signatures. So that was the idea we were telling our activists: that you should counter-argue, that you should get engaged in the dialogue and spread this.

In this case, it didn’t work out, because on the blog we were getting lots of hateful comments. And some of them were really epic ones, like “in your stupid country Europe” – and one of the activists just said, sorry, Europe is not a country. There was lots of stuff. And in the debate that we were trying to have, there were lots of people who were supportive of us. But it didn’t make any sense, because the more you write there, the more haters you get, the more trolls you get. So we stopped commenting on this.

On the side of protection, there was advice from the No Hate Speech Movement that I should probably use the Tor browser to hide my identity, or, if I got engaged in this case, that I should engage from hidden pages, profiles, whatever.

Still, for me it was probably easy, because first of all I am physically far from the thing. And I know that the people who are targeted by the forum are really struggling in their lives, because they have to change their jobs, they have to change their places, they have to change their lifestyle – just because there are some people who hate their skin color or something else.

And I didn’t know about this before: I think it’s called Cloudflare, the hosting service. And even with the examples of this website breaking so many international rules and regulations, and also the rules and regulations that exist in the United States of America, still nothing is done. The first three versions of the website were removed by different service providers, but this one is still there. So I don’t know – I’m asking you, if you feel supportive of this case, there is a QR code you can use just to sign the petition. That’s what we can do at the moment, because Cloudflare said that in order to remove the website, it’s not just a number of letters from the people who were targeted or who somehow felt bad about this; they need 25,000 signatures. And today, after one year – this lady did a lot, and people in the States did a lot – they only have 110 signatures. So I hope, because there is a legal paper in which Cloudflare takes the responsibility to remove the website at 25,000 signatures. If they don’t do this, it can become a criminal case; it will be further proceeded with in the States, so it will go to a litigation stage. For now, this is how we can contribute, because the other means of reporting hate speech don’t work in this case. But this one is an extreme one, I think. I know others would like to tell about other ways to report hate speech directly in less extreme cases.

>> BRIDGET O’LOUGHLIN: Thank you. That was an extreme case. But you can see that the activists or people who do try and counter hate speech can become targets themselves.

An alternative, of course, is to have the hate speech taken down. But that amounts to censorship, and then we get into a serious issue. And perhaps, Maja, you’d like to address the question of free speech versus – yes – the different rights involved. Thank you.

>> MAJA RAKOVIC: Thank you. Well, I’ll speak from the point of view of our work in this intergovernmental committee of the Council of Europe, which is a committee where all 47 member states of the Council of Europe are represented. And in our work, other stakeholders are also involved and included: representatives of civil society and other organizations.

The Steering Committee on Media and Information Society works under the authority of the Committee of Ministers. So basically, the drafts of all the Committee of Ministers’ documents in this field are prepared by this committee.

Hate speech has been one of its themes. It’s an old problem; it’s not something new. It’s been there for a long time. The first document directly addressing the theme of hate speech was adopted back in 1997, and it is a solid and very good basis for tackling this issue. But one thing is that since 1997, many, many things have happened in the media field. There are also other documents that were adopted over time; I’m just signaling and underlining this one as the flagship document.

In the Council of Europe field, what is important from the perspective of European human rights is the balance between freedom of expression and other rights, and whether there are limitations. Freedom of expression is not an absolute right; there can be limitations. These limitations need to be prescribed by law, they need to be necessary in a democratic society, and they need to fulfill a certain number of conditions on why certain things can be limited, such as national security, protection of health and morals, and a number of other things.

So in these documents we try to balance these two rights. The problem with hate speech is that if you put very strong limitations, you risk this right actually being misused, because there might be stakeholders that would wish to use these limitations in order to promote censorship and other interests.

On the other hand, if hate speech is completely uncontrolled, it can cause many other damages, and cases like the one we have just heard. So what is important in the approach to this theme is first to have a holistic approach, and then to have a differentiated approach depending on the type of hate speech. It depends on whom it is aimed at: there are differences, for instance, between the levels of protection of public figures and public officials and of ordinary citizens and users, because public officials and governmental representatives should be more tolerant of critical views and critical comments. There is also the question of where to draw the line between criticism and hate speech. On the other hand, it also depends on the impact, which means it is not the same if the hate speech is performed in the media or in a private sphere. Now, with new media, we have this blurred line between the private sphere and the public sphere. So the problem with hate speech in new media is also that it is becoming faster and interactive, with user-generated content. Before, in order to report in the media, you had to fulfill certain standards – ethical codes, journalism standards. Now, by generating comments, anybody can become a medium in themselves.

Then also age limits have changed, for instance, and now smaller and smaller children can become targets of cyberbullying, and very young age groups can become targets of hate speech without really understanding what is happening and how to react to all those things.

So the Council of Europe, within the work of its intergovernmental committees, is trying to give responses to all these developments.

The way we work – and the good approach to this – is to have complementary measures. On one hand there are regulatory measures: some forms of hate speech, such as incitement to violence, need to be tackled by measures prescribed by law.

And on the other hand, there are many self-regulatory measures that are proposed. In this respect, we in this committee also work with other stakeholders, for instance with Internet service providers and with online game service providers, in developing human rights standards for these particular groups of stakeholders, which also helps.

What is also necessary is media literacy. It is important that users are educated to understand, because sometimes people do not know how to defend themselves and others, and also how not to offend others. People can sometimes do things without really understanding that they are hurting others. For instance, one of the most recent documents adopted in this field, which is also a kind of media literacy tool, is the Guide to Human Rights for Internet Users, adopted just two months ago. It is basically a compendium of the human rights that users should know when they use the Internet.

So these are the measures. One of the framework documents, also very important for this particular theme, is the new notion of media, adopted in 2011. It recognizes that although the media environment has changed, media functions have not. Media are still very important for democracy: they are there to provide public debate, to provide scrutiny, and also to inform and educate. This document tries to establish how actors in the new media system can be identified as media or not media. It is no longer black and white as it used to be; there are many criteria. For instance: to what extent somebody can exercise editorial control; what the impact of the information is; how massive the dissemination might be; where the difference between the public and private sphere lies. This document also addresses hate speech and how to respond to hate speech risks in this new environment.

The Council of Europe also holds ministerial conferences every three to five years, where a road map for future work is provided. At the most recent conference, held in 2013 in Serbia, hate speech was also underlined as one of the very important themes. It was said there that one of the things that is important in the context of new media is not to put too much responsibility on intermediaries, but rather on the authors of hate speech, because the former might lead to prior censorship: intermediaries might censor things just to avoid any risks to their operations.

It has also been said in these discussions that it is important to stay flexible: it is not good to have a very rigid definition of what hate speech is, because that too might lead to restrictions that can be misused.

One of the documents that we will be looking into now concerns the concept of Internet freedom: what is Internet freedom, what rights do we have, and what responsibilities do different stakeholders have? Hate speech is one of the important themes there as well. So this is it in brief. There are many more things to speak of; many other branches of the Council of Europe have been working on this from their own points of view. Thank you.

>> BRIDGET O’LOUGHLIN: Thank you very much. I think it is clearly an issue which needs a multilateral approach, because obviously one element is education and awareness-raising, which, as we saw from the video, might be more important than bringing in lots of laws that people can avoid by just going somewhere else or hiding.

I've seen some tweets coming up in which people are actually indicating disagreement. And I'd love to hear from someone who wants to challenge a little bit what we're doing, on the basis that it's a restriction on freedom of speech. Anybody want to comment on that? Is anyone there? Yeah, please. I don't know whether everyone can hear you. You will need the mic.

>> All right. Hi. Jacob, from a Swedish think tank.

I think one of the problems with this discussion is that we're using a lot of different concepts at the same time. There has been talk of cyberbullying, which I think is perfectly fine in relation to free speech; it should not be covered by hate speech, depending on what the bullying actually consists of.

And the definition of what hate speech should be, the Council of Europe definition, I was trying to look up just now, so I can't actually say that I've looked it up. But I do think it is problematic that concepts are used and mixed in this discussion that create different problems with free speech. So I'm not sure about all of these: cyberbullying, hate speech. There was talk about stalking online, which is actually less prevalent online than offline, which I was surprised to see.

Yeah, I'm sorry I didn't have time to actually figure out what I was going to say. I'll come back if I have another question. But I do think it's problematic that we talk about "we have to limit speech in this way", especially in the video at the beginning.

>> BRIDGET O’LOUGHLIN: Yeah, go ahead.

>> Yes, I think there's also another aspect. I understand the issues here very clearly: abuse and revolting things exist in the world. The thing is, hate speech is a form of behavior. It produces unpleasant content, but it's also a behavior, how people behave, just as bullying is a behavior. And because we see it on a screen, we also see it as content. And this is why the media raises issues about censorship.

But if I could underscore the need for education and awareness: it isn't just about learning how to type properly or say the right things. It's about how to behave with respect. And that is also country-specific: directness in one place may be taken as offensive in another. So I'm interested in hearing more about the education and awareness-raising that's done in an enabling rather than scaremongering way. Because, as you all know, every platform has its own shadow side. How do you avoid making things so scary that, in a sense, people disengage? How do you deal with people misbehaving, and not do it in a parental way?

>> BRIDGET O’LOUGHLIN: I agree with you. A lot of it is behavior. None of this is new. Bullying has been going on, I'm sure, since time immemorial. I certainly remember seeing it in school, and even being a victim of it.

Hate speech is not new. The Council of Europe's definition is in a recommendation adopted by the Committee of Ministers in 1997, well before the Internet became what it is today.

So the reason we're tackling this issue online, there are two reasons. One is that the whole campaign was started as a result of the killings in Norway, where Anders Breivik killed all those people at a youth camp. His actions were preceded by a huge amount of hate speech that he put out online. So the feeling was that hate speech can clearly lead to hate crime. In this instance it was only one person, but somebody could have tried to motivate other people to be involved. And therefore we should be doing something about hate speech.

And I guess what bothers me with the Internet, and I see it with my kids and with the younger people that I associate with, especially in this campaign, is that everybody's online all the time. And it's very easy to forward on something that you think is funny, but it isn't funny to the person who is the target. And I know that Adriana wants to speak about the issue of being a target.

>> Is this working? Yes, it is. I will give my answer also in response to Jacob. Is that your name? Sorry. I was talking yesterday with Bridget, thinking also of Jana's case, of being at the receiving end of hate speech. It's not that we argue that everything should be deleted. What is really important is that just by discussing this, we are trying to make people think about the phenomenon, not saying "let's erase it". But we should think about behaviors and try to understand why they happen and how to respond to them. And when you are at the receiving end, and you spoke of bullying and how, to you, it's covered under free speech, the thing is, we have seen over the last two years or so many cases of people who actually committed suicide as a result of bullying, for example. And you say maybe they had problems, they already had depression? But precisely: they already had problems. It's not survival of the fittest, where only the completely strong survive and if you are weak you go with the tide. There are consequences when you are at the receiving end of this speech. Precisely because it is online, you do not know the person or the particular circumstances of that person; you cannot predict that the person is already dealing with suicidal thoughts, or how your actions are going to affect them.

I understand that there is such a thing as freedom of speech. But I also think that, as an Internet user, and everyone is using the Internet right now, you have the right, whether you have mental health problems or not, to not be permanently ostracized, as you can be by a very organized campaign of bullying. And those campaigns exist.

So I think, as Maja said, these are competing rights; they are not absolute rights. So we need to rethink how valuable freedom of speech is. I mean, someone once told me that my right to punch the air ends where your nose begins. I understand that we should have the right to speak our minds and share our thoughts, but there is someone very real on the other side, and this other person also has their own needs. And we need to balance them.

>> Thank you. Now I'm a bit less taken off guard. I do understand the argument that there are people who are very, very affected by bullying online and hate speech online. I myself have been bullied, and I'm guessing that more people in here have been bullied on occasion and have been subject to hate speech, as we saw before. That in itself is not a reason to, as the video we saw before explicitly said, regulate the Internet. It said that we have no, hold on, we have no right to discriminate, hate, offend, be intolerant or bully within freedom of speech. Yes, we do. We do have the ability to do that within free speech; that is covered by free speech. You are allowed to insult people, and being insulted is not a reason to limit others' speech. But, yes, there are people who are seriously hurt by this, and there are suicides related to this. But there have been suicides related to bullying before.

What it's about is rather creating a better environment. We've had a long discussion in Sweden about how parents should help their kids with their online presence. A lot of parents are hostile to the online environment and do not support kids who are being bullied online. Being bullied online is the same as being bullied offline; it's more present, maybe, since you can't really turn it off. But it's also about parents having a better understanding of what's going on, not just limiting speech, as some people in Sweden want to do so that there is no bullying.

>> BRIDGET O’LOUGHLIN: I'd like to respond to that. I agree with you: of course I can insult somebody, of course I can offend somebody. In fact, if anyone is interested, the European Law Students' Association has produced a very, very good booklet which analyzes the case law of the European Court of Human Rights, down to the fact that an ordinary person should be more protected than, for instance, a politician who has put himself out there and whose ideas can therefore obviously be criticized.

However, while I agree with you that I can, I don't think that I should be allowed to do anything online that I am not allowed to do in a public space, on the street. I can't go to the street corner down here and start saying all the Jews in Germany should die, can I? So I should not be able to do that online, because online is as public a place as any other public place. That is kind of the fundamental point of this campaign.

>> That’s not bullying.

>> BRIDGET O’LOUGHLIN: I agree with you. But I think bullying is a serious issue. We're not necessarily dealing specifically with bullying; I have colleagues who do deal with bullying, and I know it is a problem, and as I said I have experienced it myself. And let's be clear: most children do not go home and say "Mom, I was bullied at school today", nor do they tell their parents they were bullied online, because their parents will just take their computers away. So I agree parents have a responsibility, but I'm not sure they're always aware.

Steven?

>> STEVEN LOCKHART: To anyone, really. So if hate speech is something that we define as targeted at groups of people, why are we doing that? Why can it not be individuals and something that’s very specific to individuals? I’m not sure why. I don’t understand that logic. Perhaps, could you respond to that? Is that kind of something you were saying?

>> No, not at all. Hate speech can be directed at individuals, yes, of course.

>> STEVEN LOCKHART: Okay.

>> Yeah, I just want to respond to that. I don't agree with you that it's the same online and offline. Maybe from a moral point of view it's the same, but the reactions are not. And I think that's the important point, as you said before. So what are the consequences? I mean, freedom of expression doesn't mean freedom from the consequences these expressions may have. And that's exactly the point. Online, there is nobody who comes over and says "oh, stop doing that, stop saying that"; there is no real, physical room, it's a digital one. So that's the problem, I think that's the question. And that's also the question you maybe addressed: what could these consequences look like?

>> BRIDGET O’LOUGHLIN: I would like us to maybe – sorry. Just before we go on to reactions on this point, still? Yeah, okay, go ahead, then, Mariann, with your computer.

>> On the difference between offline and online, I think it's important to remember that every comment made online will stay, unlike whatever is said in a room. That's why I disagree with the idea that we shouldn't put the burden on intermediaries: if you don't allow an effective and prompt remedy against speech, it will stay there forever, and that's why we have some of these suicides. People did not want that (okay, maybe they were already troubled before that), but they did not want that information to stay online. They were very embarrassed by the fact that it was available to everyone, and that was the consequence.

So I think intermediaries should play a very important role in addressing that quickly; the justice system is not nearly as efficient as an intermediary can be. This is why I think the recent Google Spain judgment isn't all that bad: it recognizes that the information is out there, and that if you allow the search engine to spread information which can be harmful in the first place, that is a serious problem.

So I think in this respect Europe is perhaps a step ahead of the United States, which allows intermediaries to do whatever they want with the content. And I think this should also be considered as a way forward for other countries.

>> BRIDGET O’LOUGHLIN: Just in response to that: there is a case coming up in the Grand Chamber of the European Court of Human Rights which is exactly on that question of the responsibility of the intermediary, the Delfi case. It is going to be judged on the ninth of July, so it will be interesting to see what the Court says. There has been a lot of criticism of the initial decision because it puts a great burden on the intermediaries. And the risk, I think it was Maja who mentioned it, is that the intermediaries will just censor. I'm not sure we want to put censorship in the hands of Google. In fact, they are supporting us; they're giving us some advertising support and some training support. And I think that's probably the way forward: to help them develop some sort of code that we can all more or less agree on.

They know that if they don’t do something, they’ll probably end up being regulated by government. So I think they’re aware of that. But I really personally have a great reticence in putting Facebook in charge of what I put on Facebook.

>> Can I quickly respond to that? I agree: it should be a balanced framework, not something that entirely regulates what they do with content. But when a website deliberately encourages hate speech, like in the case that was put to us, and only allows people who can make racist jokes to join, I mean, that is a no-brainer.

>> BRIDGET O’LOUGHLIN: Well, I agree with you, but maybe the people who go on that site wouldn't, and we're never going to persuade them that they're wrong. So I'm more interested in the rest of the world, who might be looking at it and see our "no hate speech" message. I know there's a remote comment, but Andre wants to say something. And I was just wondering if you could tell us something about the way the campaign is run at the national level.

>> ANDRE BENCEL: Thank you, Bridget. I would also like to say that I absolutely agree with your statement that there is no difference between acting in real life and online. That's a really, really important thing. In my point of view, it's also really important to think about and to know where our rights end and where the rights of others begin. So it's up to us how we act and what we do in our lives, and also what kind of comments and ideas we share on Facebook, Twitter and so on.

But back to your question. When the campaign launched last year, in 2013, we also set up a national committee, with representatives of ministries, NGOs and bloggers in Slovakia. We tried to promote the campaign at festivals, and we tried to get into schools as much as possible.

The head of our committee is the Minister of Education of Slovakia. So it's not only an NGO thing; it's also a governmental thing. And we try to get the two sectors, governmental and nongovernmental, to cooperate.

As I mentioned, education in schools is very important. We are also going to launch a new competition for the best school with no hate speech. And we try to motivate young people to be aware of their comments and of their thinking about human rights.

What else to add? We also cooperate with other countries at the regional level. I would like to mention that we have really good bilateral cooperation with Serbia. In this context I can also introduce a new project, starting in August, called the bus trip. The main idea is to promote "no hate speech" as much as possible. There will be eight countries participating, and the bus will go from Belgrade to Budapest, Vienna, (other towns) and Bosnia, and back to Belgrade. In each city the participants are supposed to stage a flash mob; there will also be officials from the ministries, and other really non-formal activities in the city.

I would also like to add that non-formal education is really important. As a non-formal facilitator, I would like to really push non-formal education, not only in schools but also in real life. And that's a really good thing.

>> BRIDGET O’LOUGHLIN: Thank you. Sorry, there is a manual for teaching in formal and non-formal education. It now exists in French as well as in English, and it's being translated into a number of languages thanks to the grants that are helping us fund the translation. It's an excellent tool for anybody who is a teacher or a non-formal educator.

There’s a question at the back of the room. And then after that I’d like to go to see – there’s some online questions? Okay.

>> Thank you. Actually, not only a question. My name is Imid Povich, from Bosnia and Herzegovina. My question is also to you, who mentioned the documents and the agenda we have in the CDCMI. My question is: do we need to do more than that? Because we have enough documents, but unfortunately the implementation of those recommendations at the national level can be the problem. Is it the same with hate speech? From the point of view of a regulator dealing with communications, including broadcasting and telecommunications, we have had hate speech cases, and it is easy for us to deal with them because we have a directive. And of course it's always a problem to do that on the Internet.

But my question to you is: do you think we need to do something more from the Council of Europe side? Do we need more standard-setting, more recommendations? Or, from the regulator's perspective, is self-regulation enough for the Internet, or is there something else we could do? That's my question to you. Thanks.

>> BRIDGET O’LOUGHLIN: I don't know if anybody wants to respond to that. I can give you my very personal viewpoint. I've been working on this campaign since January, so it's all very new to me; I've had to start a Facebook account and a Twitter account, so I'm learning a great deal. But my dream would be to try to develop some form of standard-setting, having worked a lot at the Council of Europe on a multilateral, transversal basis, which wouldn't necessarily be purely in the criminal law field or in the field of media regulation, but would also cover questions of education and awareness-raising: teaching people how to behave properly, perhaps, or encouraging it. It's a bit of a vague idea that I have at the moment, but I'm really hoping we can get there. This campaign within the Council of Europe seeks to be transversal. We have colleagues in the Parliamentary Assembly who are picking it up, colleagues in the regional authorities who are picking it up, colleagues in different areas, whether it's Internet governance or education or the media, working on the question. And I would like to see that translated into maybe guidelines or best practices, because I don't think you can solve this problem with laws. I just don't. So I think we need to think a bit more holistically about the question. But that's my very personal opinion.

>> About this common ground on hate speech: I'm coming from Russia. In my country we have a really interesting approach to defining hate speech. By law, hate speech is speech that incites violence; that is the general approach in many countries. So when there is direct incitement to violence, it is considered to be hate speech.

If there is nothing like saying "okay, let's gather at this bus stop at that time and kick those people", it will not be considered to be hate speech.

But recent cases in Russia show that things like obscene language against politicians, or swearing at people in power, are already being considered hate speech, though this basically has nothing in common with that definition. So there should be some common understanding, and it should not become a field for manipulation.

Also, the arguments about hate speech or incitement to violence have been used a lot in cases against activists. They were labelled extremists, or whatever it was, during proceedings against opponents of the government in many cases. So it was a manipulation: calling people to criminal responsibility not necessarily for actual hate speech, but for something that could be presented as hate speech.

And the second thing, about educating children in schools and talking about these issues: in my region we have two successful examples. Only two, unfortunately, so far. These are schools where we tried to involve different stakeholders in cases of hate speech at school. Basically, the idea was not for the school psychologists to watch the kids' profiles (although for younger kids that can make sense), but for kids to be able to go to those people and tell them about the things happening to them online. In the beginning there were really few cases. Then the first girl started to talk with the psychologist, and some kids began showing their profiles and the things they were receiving online, not to their parents, but to these people at their school. So we were trying to involve different stakeholders in this. But it's also a field that needs a lot of development: where is the borderline between a person who watches how the kids behave online, and a person who shows them how to behave responsibly, that you shouldn't just say "you idiot, I'll come and hit you" on the Internet, while trying to get involved in the communication somehow?

So there may be cases where other stakeholders need to get involved, such as parents or psychologists in the school. Thank you.

>> BRIDGET O’LOUGHLIN: We've got 15 minutes left. I would very much like to ask Viktor to bring in some of the online comments, because otherwise they'll think we have lost all our Internet connections. Thank you.

>> VIKTOR SZABADOS: Thank you, Bridget. Yes, we have two hubs here and some individuals following the meeting. There are questions from the Armenian hub, which I will read out: "Are there any steps undertaken to come up with a universal definition of hate speech?" And then: "How do you ensure protection of human rights online in some developing countries where they are not protected offline?"

So we have questions on the open Internet and on the general definition and its scope. We also have a very good discussion and exchange on Twitter; we are re-tweeting and responding to some of the comments and presentations. That's it so far.

>> BRIDGET O’LOUGHLIN: Do we have any hate speech comments?

[Laughter]

Yes?

>> (Name inaudible) from the Fundamental Rights Agency. I think that question from Armenia is very important, and one I haven't heard addressed so far. I'd like to ask you, the campaigners: have you tackled the issue of remedies? Because the Armenian follower was basically asking how you reconcile freedom of expression with the fight against hate speech. But this is it: what are you doing? What are you suggesting to Twitter, Facebook and your counterparts in this campaign? What are you promoting as good practices? Because of course the judicial mechanisms are very slow; the Court sets great standards, but it takes a lot of time. So I think it would be very useful to spend a bit of time on this issue.

>> BRIDGET O’LOUGHLIN: And that question – [Inaudible]

>> Yes, thank you. My name is Christopher Newman; I work in public affairs here in Berlin. My question also relates to the remedies we just heard about. On Twitter, we've seen some very good comments come in here, but there are also cases, maybe some of you are aware of them: for example, when the Miss America competition was won by an Indian-American lady, there were a lot of very hateful, very ignorant comments about her looks, things like that. Those tweets, which were not anonymous, so people were not even hiding behind the cover of anonymity, were collected on a so-called "wall of shame", a public-shaming Tumblr blog. I was wondering what the campaigners think: is this an effective mechanism to engage with or counter this, perhaps through induced self-censorship?

>> BRIDGET O’LOUGHLIN: Thank you very much. We obviously have considered these issues.

On the question about a universal definition of hate speech, are there any steps being made towards developing one? No, I don't think so. There is a European one, in the 1997 Committee of Ministers' recommendation, and I think it's very clear. There has been discussion about whether we should go back to the Committee of Ministers and have it revised, because maybe it should be widened to include some other minorities who weren't included in the definition the first time.

But at the Council of Europe, going back to the committee of ministers and asking them to reopen a discussion sometimes leads to less rather than more. So we’re not really sure whether that would be a good idea.

But I believe, as well, that if you look at the UN Universal Declaration of Human Rights, there is a definition of hate speech in there. There is: after freedom of speech, there is what you should not be allowed to say. I can't remember the number of the article. It is the same in the European Convention on Human Rights: there is a limit on freedom of speech, very clearly expressed.

And I'm not sure whether a universal definition is even possible, because in many countries, even within Europe, what is considered hate speech in one country might not be considered hate speech in another, simply because of cultural differences. So it's a little difficult, I think, to come up with a "universal" definition.

I cannot answer the question about how to ensure protection of human rights online when they're not protected offline. Maybe that will be our next campaign. Because, obviously, if they're not protected offline in developing countries, that is an issue. However, I do think the Internet, where people have access to it, is a great tool for people in less developed countries. I think people have seen some of what's happened in China as very interesting: with the new technological tools, people can communicate with each other more easily.

The question about what we are doing about remedies is an interesting one. And, yes, I agree, I think the courts are too slow. Approaching it as a crime is obviously important, but it's not necessarily a solution.

We are working with Twitter and Facebook, and we are hoping to get Twitter, Facebook and Google around the table to work on some basic policies, to offer remedies, and to let people say, "This is offending me. Please take it down." They already do take material down: you can report content and have it taken down.

There have been a couple of very famous cases in the UK where people were attacked on Twitter. One was a black football commentator who received so many hateful tweets, with no response from Twitter, that eventually he closed his account. This caused so much scandal in the press that Twitter actually called us and said, "Please can we say we're working with you on your campaign, so we can show that we're not completely bad?"

So we do have a certain amount of support from them; we have support in advertising from them. We got 1,000 new followers on Twitter yesterday, which was quite a lot for us. So what's important, and here I'm going to do a bit of my spiel, is to make sure that we get lots and lots more followers on Facebook and on Twitter, because it gives us much more leverage with those companies: we can say, okay, we are a serious campaign, we have all these people following us, and we would very much like now to sit down and talk about policy, about how you can perhaps react more rapidly, or more appropriately, to people who are complaining. So that's where we are with them for the moment.

>> The remedy question is an extremely important question, because if there are no effective means of remedy, Human Rights cannot be assured. The document that I’ve mentioned, the Guide to Human Rights for Internet Users, speaks about remedies. What we are doing now within this committee, together with the secretariat of the Council of Europe, is developing a website on the Human Rights guide for Internet users. It does not treat only hate speech; it treats many, many other themes and questions. But what has now been agreed within this committee is that all member states will provide information about effective remedies for Human Rights violations on the Internet. This information will be available on this website, presented in a manner that ordinary users, not necessarily experts, can see in which way they can seek effective remedies within their jurisdiction.

It is an ongoing process. The guide is the first step. But it will continue to live. So maybe this is also one of the ways to look forward.

And just going back to what you have mentioned as your private idea and impression: yes, it is actually something that we are doing within this field. We do constantly make the point that nobody is talking about a need to regulate the Internet. On the contrary, it is a very complex issue, so we insist on measures of self-regulation and underline the importance of media literacy, together with legal and policy measures depending on the subject and on the level of impact of the speech in question. Thank you.

>> BRIDGET O’LOUGHLIN: There was a question? Yes.

>> Yeah, I was wondering if you did anything around – you were talking about culture. And I was thinking about social norms and how Facebook and Twitter and so on can kind of create those, rather than it being censorship or consequences – the way we just kind of know what is and isn’t acceptable. It’s been funny being here in Berlin and seeing people wait at the traffic lights until it goes green, which is not something that we do in the UK so much. There’s no policeman standing there stopping people. It’s just become something that people do.

>> BRIDGET O’LOUGHLIN: Special little men.

>> Yeah. And so I kind of think it’s somewhat of a catch-22 situation. In order to create those inclusive environments, people from lots of different groups need to be within the environment and represented in the environment. And if they’re being harassed and leaving, then that’s not helping change the situation. But I also read that apparently Facebook customises adverts to match your group and the people you know. So it means that you’re seeing more of what you already see. If you don’t have many friends from different backgrounds, or if you go to a lot of sites that are objectifying to women, then the adverts you’re going to see are going to respond to that. It will be like a feedback loop that says this is fine in this environment. So I was wondering if you have any thoughts about how we can shape the culture. With a physical space, like a room, we try to make it welcoming to everybody; if there isn’t a woman on a panel, we make a fuss about it, as well. So how do we do that in an online space?

>> BRIDGET O’LOUGHLIN: Very interesting question. And I’m not sure how you would do it because Facebook and Twitter make their money by getting – targeting their adverts to the right people. But I agree with everything you’re saying. And I do think it’s a culture issue.

One of the interesting things we have discussed with Twitter and Facebook, and discuss with our activists who are working on this all the time, is the importance of what’s called the counter narrative. Twitter in particular, in the case of Miss America that the gentleman was talking about, decided not to take down all these hateful tweets because they wanted to show how horrible people were being. And they also wanted to show the counter arguments that then came in, which were very good. So sometimes you actually want to leave hate speech out there because you want to be able to show how cleverly it has been countered by somebody.

This is one of the ideas for how to deal with it: rather than taking it down or censoring it, rely on the counter-narrative. If it’s a serious incitement to go out and kill somebody, you’d want to take it down.

But if it’s just silly hate speech that somebody hasn’t thought about, sometimes the counter can be very helpful. And Twitter and Facebook have a lot of employees who spend a lot of time thinking about policies and these issues, as well. They are not blind to it. They have a certain sense of corporate responsibility. They know it could affect their bottom line. So they will, I think, be very cooperative. There are some other sites and social networks out there which are a lot less socially responsible, but it’s more difficult to get to them.

We had a question come in – Jan is going to hand it over so she can tell us what the question is.

>> I picked up the one from the Twitter feed basically. Mostly a comment. Would like to hear about strategies about hate speech while accepting the need for and the right to anonymity online.

I was curious about this one because it was pretty much my case. If I hadn’t revealed my identity on the blog and hadn’t put the link there, I wouldn’t have got all these hateful messages – even in Russia, from U.S. numbers.

So for me it is mostly my attitude. If I’m saying something online, I want to carry the responsibility for saying it. I wouldn’t hide behind saying I was not in a good mood, or that I was out of my mind at that moment. I really want to take responsibility for the words that I say, and that will stay true in the future.

But there are actions that actually make sense with anonymous profiles, and this was the one I found fascinating when I found the No Hate Speech campaign: thunderclapping. There is one message, you subscribe to this message and set a time, and it goes to the targeted profile. So, for example, if there is a hateful profile on Twitter, many people can “clap” at the same time. You just set the time, and the tweet – “spread love not hate”, something like this, with pictures, something really nice, even smiling kittens, whatever – goes to the hateful profile from, I don’t know, 100 people, 3,000 people. So first, you can almost block the person. Second, you really make them annoyed. And if it happens consistently from time to time, once a week, you can of course ask this person to stop doing what they’re doing. It’s toeing some lines, I think, but anonymous profiles can make sense because it’s a joint action.

>> BRIDGET O’LOUGHLIN: It’s almost like an online flash mob.

There was a question over there? Yes. Do you have a question? Yeah or comment?

>> I just wanted to contest your statement that social networks such as Facebook and Google are always on our side.

>> BRIDGET O’LOUGHLIN: I didn’t say always.

>> I mean lately. I think you said lately or recently. I don’t think that’s true at all. I’m coming from the new media summer school and we had a person from Google there, responsible for Human Rights at some point. And he was saying that he was not worried about hate speech at all, and that it’s good that it’s online so we can see it, so we know who the idiots are and can ignore them. So this is a person from Google, responsible for Human Rights, saying this.

And if you look at Facebook and their policies regarding a lot of women’s rights issues: two weeks ago they removed a picture of two women kissing, posted by someone in Italy celebrating LGBT pride. Of course they put the picture back. And remember the last time Facebook really questioned their policy on removing content related to violence against women – it was because there was an initiative targeting people who place ads on Facebook. So they only changed their policy and invited feminists in to rethink their strategies because they were losing money, and losing advertisers. So we need to fight this, because these corporations, which is what they are, don’t have a Human Rights approach at all if we don’t demand it.

>> BRIDGET O’LOUGHLIN: I agree with you. It’s very unfortunate. A representative of Twitter was supposed to be here today and she unfortunately could not come at the last moment because she had to go deal with an emergency in Spain. But I agree with you. Human Rights are not their problem. Their problem is their bottom line. But when they realise that their bottom line may be affected, they are a lot more cooperative.

>> I just want to follow on from many of the points that have been made, and make a plug for some work that we’ll do at the IGF in Istanbul in September. We established a Dynamic Coalition, called the Dynamic Coalition on Platform Responsibility, which will aim to develop model contract clauses to be used by platforms as terms of service in a way that allows them to respect Human Rights – so freedom of expression, privacy and due process. If anyone is interested in taking this further, please join us in Istanbul on the fourth of September, which is when the Dynamic Coalition meets. And we also have a workshop on the second day of the IGF, No. 158, on promoting platform responsibility for content management, which is also very much about these issues. Thank you.

>> BRIDGET O’LOUGHLIN: Anybody else have a comment or a question? Everybody’s hungry. Is there anything else? Yes, please.

>> Just to bring down the enthusiasm about the Internet a bit: the data also confirmed that there is no great difference in the reporting of offences experienced online versus offline. It’s not the case that if you experience it online, and you are active and more involved, that this somehow induces you to report it. The non-reporting levels are very high in both fora, just to piggyback on that.

>> BRIDGET O’LOUGHLIN: Adriana?

>> Adriana: Just coming back to the strategies on how to deal with it, I have a little bit of a negative strategy – something that maybe isn’t a very good idea – that I’ve learned with the campaign, concerning Facebook and how they are not necessarily with us. There were a couple of pages saying, for example, “say no to Roma”, “say no to Roma gypsies”, these kinds of things. We managed to have some organised reporting of these pages. At first Facebook said, oh, it was just humour, controversial humour. But eventually they deleted it. What we learned, though, is that deletion is useless, because the page quickly came up again with a slightly different name. So it’s not “say no to Roma”; it’s now “say no to Roma gypsies”, or “Roma gypsies say no”. It very quickly comes up again.

Concerning strategies, to be honest, what I have learned so far is that I still don’t quite know how to deal with hate speech when it happens. What I do think is important is this: one thing I was told at the beginning of the campaign is that if there are 10 percent of people who are for and 10 percent who are against, there are 80 percent of people in the middle. It’s the people in the middle that you usually want to target. If you go after Chimpmania, how likely is it they will change their mind? Not very. So I guess the most interesting strategy for dealing with hate speech is about presenting a counter narrative, and usually using humour to deconstruct the arguments that those who make hateful comments create. There is this anti-homophobia thing called fuckhate and they have very funny videos. So I guess this is the one strategy that I prefer.

>> BRIDGET O’LOUGHLIN: Thank you very much, everyone, for coming. Now, Igor was going to make us all dance, but he’s suffering with a nosebleed so I can’t even make him do that. So you have had a lucky escape. I’m sure he’ll do a video for us so that we can all learn how to do the no hate speech dance.

Please all go out and wear your badges with pride. Take leaflets. Take the information that’s out there. And please, please, follow us. Like us on Facebook and follow us on Twitter. It does make a difference in our negotiations with them. Thank you very much for coming, everybody. Thank you.

[Applause.]

Pictures from working group

Session twitter hashtag

Hashtag: #eurodig_ws8