Privacy and E-Commerce – implications for children and young people – PL 03 2013


20 June 2013 | 16:30-18:00
Programme overview 2013


Key Participants

  • John Carr, eNASCO
  • Cornelia Kutterer, Microsoft
  • Clara Guerra, Portuguese Data Protection Authority
  • Peter Matjasic, European Youth Forum


Moderator

  • Sophie Kwasny, Council of Europe

Reporter

  • Martin Fischer, Young European Federalists

Messages

  • Understanding that children and young people have very different needs and interests, which must be addressed differently.
  • Accepting that there is no common solution for conflicts amongst particular user groups.

Session report

Plenary 3 outlined the special status of children and young people on the internet. Young people, referred to in the session as users between 13 and 18, should be considered stronger users: digital natives with greater media literacy. Children, defined as those below the age of 13, are by contrast still very impressionable and “digitally naive”. They learn fast, but the learning process runs many risks if unsupervised.

There are also disparities amongst young people and children themselves. They should not be considered a homogeneous group, but they are easy to reach with educational measures that set a basic level of media competency. An emerging problem is that young people and their parents do not necessarily “speak the same language” in terms of media use; a lot of media learning happens through peers.

The spark for the debate was the EU draft regulation on privacy and e-commerce, which in its current form would loosen the regulatory framework for ISPs, e.g. by reducing the role of parental consent in online forms. It would also reduce to 13 the minimum age at which people can register online and share their data. The age appears to have been picked arbitrarily; the plenary missed any reasoning for that exact limit. Furthermore, the regulation lacks enforceability: as under the current rules, ISPs can shift responsibility to parents and are not required to verify the information provided.

The debate quickly moved towards the problem of child abuse. The number of child abuse cases on the internet is apparently increasing, and there seems to be a lack of police and international cooperation to deal with the topic. At the same time, the organisations pledging support to the fight against child abuse are growing in number. Many questions arose about the implementability and effectiveness of current technology to track child abuse, but they remained unanswered. Young people in the audience largely stressed the importance of privacy for every user and the potential loss of freedoms on the internet due to the measures implemented.

This showcased two different approaches to the topic: blocking and taking content offline. European states deal with it in different ways. With blocking, the content remains online but is harder to access, although with high media competency it is still easy to obtain. With takedown, the content remains available until the source is found and prosecuted. Questions revolved around the harm inflicted by pictures remaining online, the effectiveness of blocking, and the communication channels of child abusers and their traceability.

The plenary did not come to a common solution but outlined ways to address the issues and conflicts amongst particular user groups. Young people and children have very different needs and interests and need to be addressed differently.


Provided by: Caption First, Inc., P.O. Box 3066, Monument, CO 80132, Phone: +001-719-481-9835.

This text is being provided in a rough draft format. Communication Access Realtime Translation (CART) is provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings.

>> SOPHIE KWASNY: Good afternoon. Welcome to the plenary on privacy and e-commerce, implications for children and young people. I’m Sophie Kwasny, I work for the Council of Europe. We have a stand outside by the coffee if you want to learn more about what the Council of Europe is doing. We have plenty of materials.

This session has started with a bit of delay, and you should note that it’s a 60-minute session, one hour. And like the other plenaries, we have a very important topic to discuss. So please bear with us for an hour.

Privacy and e-Commerce. So in the title we restrict ourselves to a specific type of activity, but privacy, the right to private life, which is protected by Article 8 of the European Convention on Human Rights, applies to any individual, whatever their age, children and adults. And it applies in all spheres, so not only to e-Commerce. Here it’s about children and young people. Children, if you take the legal definition from the Convention on the Rights of the Child, means any individual below the age of 18. And here we also want to speak about youth, and there the age goes further, to 30 or 35 years old. So we see that we are addressing categories of individuals with very different needs, very different uses of the Internet, very different situations.

In the field of privacy and e-Commerce, a very important reference is the reference to age in particular, and there is the age of 13, which is often taken as a reference to start with, because in the USA the act which is applicable to that field, the COPPA Act, refers to the age of 13 for children to be able to consent to the service. And because children in Europe are using those online services, this age of 13 also became relevant in Europe, and is now even mentioned in the draft Regulation proposed by the European Commission.

So is this age of 13 relevant or not? Is it the right approach? How to really enforce or show that this age is a reality for the person who is behind the computer? That is something we will be discussing.

The Council of Europe has done some work on that topic. I will just mention those instruments, and invite you to go on our website if you want to learn more. We have adopted a Declaration on the Protection of the Dignity, Security and Privacy of Children on the Internet, and we also have a recommendation which tackles broader topics on the measures to protect children against harmful concerns, behavior, and to promote their active participation in the new information and communication environment.

I would like to welcome our panelists today that will be trying to discuss this issue. Let me start on my left. Carolina Goncalvez. She is from Lisbon. She is 11 years old and she is representing the World Association of Girl Guides and Girl Scouts.

On the left will be her interpreter that will translate for you what Carolina will be telling us.

Further on the left, we have Cornelia Kutterer, Director of Communications for Microsoft.

On the left, President of the European Youth Forum, Peter Matjasic.

To my right, Clara Guerra, she is working at the Portuguese Data Protection Authority.

And finally, John Carr, further on the right, Advocacy and Policy Advisor to the European NGO Alliance for Child Safety Online.

So having introduced the panel, we will start by showing you a video that has been put forward by the Youth Forum. And after that, immediately, we will start discussing the issues.


>> Internet is all about young people. They are not users, they are creators.

>> Young people are special not because they are young. They are special because they are using the Internet in a very much different way than the older ones.

>> Every new generation has a responsibility to step up and to speak out what they stand for and how they want to see this world.

>> We are trying here through these programmes to develop their capacities to express themselves on the topic, otherwise we have a dialogue on young people without young people, and that’s not what we should have.

>> I think that we put our privacy in risk every day, whether it’s offline or online.

>> Without access to the Internet nowadays, you don’t really take part in life anymore.

>> I felt like a world inhabitant all of my life. And Internet is making it a lot more easy to live that life.

>> SOPHIE KWASNY: So this was basically to show you how, depending on the age, depending on the person, the Internet can be an essential part of our lives or not. Myself, when I was a child, I didn’t have the Internet at that time. I started using it gradually. Today, children like Carolina were born with it.

Let me first start asking our panelists precisely in this use of Internet how they are using it. What are the challenges they identify, if any? What are the risks? And we will try at a later stage to see if some solutions can be proposed, some actions taken to remedy those challenges.

So, Carolina, please, for you, are there any risks on the Internet?

>> INTERPRETER FOR CAROLINA GONCALVEZ: She was saying she uses the Internet to go to social networks, such as Facebook. She also uses it to listen to music on YouTube. And she uses it to search for things for the Girl Guides, but mainly for school.

The risks she has identified are fake identities on Facebook, like if someone is using Facebook to become someone they are not, and she was also talking about hackers. So those are the risks.

>> SOPHIE KWASNY: Thank you, Carolina.

So indeed, identity theft, for instance: there is a clear link with privacy. I would just underline that Carolina is 11, and she is using Facebook, which is normally for 13 year olds and above.

Cornelia, for you, what are the privacy risks for children and even young adults on the Internet?

>> CORNELIA KUTTERER: I was hoping that you would ask me how I use it, because for my children it’s a great tool to answer all the questions that they have, which otherwise I wouldn’t be able to answer.

When it comes to how to protect children, I think, as we are sitting here at EuroDIG and talking about a number of human rights, the challenge with children’s privacy is that it sits in between the tensions of a number of human rights, depending on how you look at it. So you want to protect the children by having their parents help them. But of course, and my colleague might get into it, you might have privacy risks there as well. You can use technology to help the privacy of children, but the technology has challenges itself. So you’re constantly trying to weigh these against each other.

But each step has a kind of a challenge.

For example, you mentioned that Carolina is only 11, while the terms of use of Facebook allow the service only for children above 13. One of the problems in the technology field here is, of course, age verification and how you do that without risking some of the privacy of Internet users as a whole. So these are the kinds of challenges that we constantly have to weigh against one another in this field. And in particular when it comes to privacy, that is very relevant: if you think about technical tools to protect children in the context of child sexual abuse, that is something where you can delineate better than when you use them for privacy settings, or in the case of the underage use of services.

>> SOPHIE KWASNY: Thank you, Cornelia.

Peter, to broaden the focus, we have heard about children. What about youth in general, what are the risks, if any?

>> PETER MATJASIC: There are many risks, and I’m happy that we have a child with us today. Because in the Youth Forum we believe in the empowerment of young people, and young people for us include children. Those between 16 and 35 I refer to as young adults, and everything below for us are children, but it doesn’t mean that they don’t have an opinion. It doesn’t always fit with the age definitions that exist when it comes to law and law enforcement. What is important is that young people need to be empowered through education and awareness raising, so that they can avoid the risks, and we know that there are many risks out there: cyberbullying, a lot of things that breach privacy.

The best way to combat that is not by reactive strategy, but by preventive strategies, by informing, empowering, and thus enabling young people to participate. Because for us, we are in the EuroDIG for the sixth time running, because we believe it’s important to have the voice of young people directly present. Then young people have very different opinions. We as the Youth Forum are one representative body that works at the European level and we can claim that representation.

So we have one view, that is, for example, preventative strategies are better than reactive ones. But a lot of people will use Twitter and they will say that no, we believe that the state should get involved to a certain extent. And that’s what we want. We want youth participation to be involved in this, so that young girls and young boys and young adults can sit here and discuss. And then we might see that we have more things in common than we believe.

But it’s important to have this involvement into the discussion. Because only with that, we believe, we can then have solutions that will be made for the users and with the users.


>> SOPHIE KWASNY: Thank you, Peter.

Clara, the Portuguese Data Protection Authority. So you are there to enforce the law, to make sure that the individuals have their right to data protection, effectively protected. So, how do you see the risks for the privacy of children and youth on the Internet?

>> CLARA GUERRA: I would like to point out that one of the biggest problems we face is the lack of enforcement power. The Internet is global, so we always have problems of applicable law and jurisdiction. And the result is, at times, a huge immunity from deceptive commercial practices.

Lack of transparency towards people, and of course here we recognize children, given their age, as a more vulnerable group.

But the risks are for everybody, because the problems that we face today affect the general public. And apart from specific services that are aimed directly at children, most services are for the public, including children. So it’s very difficult at times, and it will be in any regulation, to divide these two spaces. Because children all use Internet services and these new media services, even when the services are not addressed directly to them.

So one of the biggest problems we have is that lack of transparency, the problem with these deceptive and disloyal commercial practices, and of course the lack of regulation, regulation that has to bridge across and beyond Europe. The Internet is global. Even if we had the most perfect legislation in Europe, it still wouldn’t be effective, because most of the companies are based outside Europe. So there is this lack of appropriate regulation too, because there is nothing concerning privacy and children in Europe. There is a law in the United States.

And like Peter said, now the third lack is lack of awareness. To be able – and I share his views that we should involve children. We have to educate children. In the former plenary, someone said education, education, education. That’s right. And we can repeat education. And to be able to empower children and young people, we need them to be informed. Because if they are not informed, they cannot make good decisions to decide and to discuss.

So awareness is very important for everybody, for adults I say again, but also for children. So I would say that the biggest challenge is to face these three lacks: lack of transparency, lack of appropriate regulation, and lack of awareness. How to face this, we will discuss later. It’s the most challenging problem we have now.

>> SOPHIE KWASNY: Thank you very much. Now, John?

>> JOHN CARR: Okay. I’ll give you my take on it. Just a couple quick points. In these conversations inevitably we tend to speak with very broad brush strokes. But the truth is, this is quite an important point, not all young children, not all young people are the same. It is not true that all children and all young people are super cool, Internet literate, Internet savvy dudes who go everywhere all the time and feel good about everything that is going on in the online space. There is a huge variety out there amongst children and young people just as there is in the adult world.

So when we make assertions about the importance of education and awareness, we should be aware, also, that that will not work equally well for all children, just as it doesn’t work equally well for all adults.

So that kind of takes me to the second point. Of course, absolutely, without question, the best defense, the best weapons that we can give our children to deal with any parts of life, including the Internet, is knowledge. Their own knowledge and their own resilience. 100 percent, no question.

And in that context, education and awareness are singularly important, because that is about equipping our children to deal with life, all aspects of life, including the Internet. But let’s not forget, as I said in my first point, that not all children have the same capability to respond to these messages. Education and awareness are never going to be enough, and that’s why the question of regulations and rules also kicks in. We will come to that later.

But just quickly on what are the challenges? Well, a couple years ago, in the United Kingdom, the admissions tutor of a very famous University, the kind of University that many children and young people would have dreamed of getting into, worked hard to get into all of their life, right, admitted that before he made a decision about who to offer a place to and who not to offer a place to, he went and looked at their profiles – excuse me. It was on Facebook, he said, on social networking sites.

Well, a couple of years ago, there was a survey done of employment agencies. Two-thirds of the employment agencies said that they expressly searched the publicly available information on social networking sites about individuals they were thinking about inviting for a job interview.

Now, there is a fundamental principle, isn’t there, in data protection law, that you should only ever use data for the purpose for which it was intended. Now, when kids, when young people put stuff up on Facebook or on some of the social networking sites, maybe when they have had too much alcohol, and I use the very Anglo-Saxon expression there, maybe when they have had too much alcohol at a party or whatever and they do stupid things and pictures of them get put up there, they’re not making an application for a university. They’re not making an application for a job. But these things can be seen and taken completely out of context and have a completely disastrous effect for that child.

Imagine that’s the job you wanted all of your life. Imagine that’s the University you really dreamed of getting into, and you would never know, you would never know that the reason you didn’t get the interview for the job or the place in the University is because somebody looked at your Facebook profile where you had said and done something stupid.

So that seems to me completely wrong. But it reminds us of the singular importance of the regulatory framework within which these things happen. And my fundamental view is that the rule of 13 is a badly thought out law, a badly thought out proposition, and I can see no reason why we should import it into the EU, which is currently what is on the cards.

The rule of 13 was adopted into US law in 1998. Right? I think Google came into existence in 1998. Facebook was several years into the future at that point. The rule of 13 came – it’s a 20th century law developed before social networking sites were created in any substantial way, and it has been transformed into a privacy law. It was never meant to be a privacy law. It was meant to be a law about protecting children from commercial advertising. Yet, it has become an all purpose privacy law.

My – I’m going on for too long, so I’ll close now. My basic point is between the age of 13 and the age of 18, when they become adults, so I step out of the picture at that point, between the age of 13 and 17, children do an awful lot of growing up. The idea that one rule governs all of that span of maturing and growing up seems to me a stupid idea. And it’s never been properly researched or properly documented.

>> SOPHIE KWASNY: Thank you very much, John. And I think this point that you’re making about the traces you leave on the Internet and how they can be used in the future is an important one. And, indeed, we have data protection principles that should apply and that should enable what the European Commission has qualified as a right to be forgotten. Even if we don’t use this language in the Council of Europe, we think that indeed the specification of the purpose, limits on the duration of the conservation of the data, and the right to have your data deleted should enable children precisely not to leave those traces for the future.

So those are some of the problems that we face as we try to protect children online.

We have another film from the Youth Forum. So let’s see their view on that.


>> I think that we put our privacy in risk every day, whether it’s offline or online.

>> I don’t like people looking through my stuff anyway, not just online but in general.

>> It’s difficult to track all the new developments and to predict how the new tools can be used.

>> Facebook, our Twitter account or gmail, there are rules. The rules are not well-defined. But still, everything is changing.

>> If you want to make a gmail account, they want to know your phone number, and they say because you can forget your password, but I don’t think so.

>> You don’t really know what is going on with the company sharing your data with another. There are others, like 500 pages long terms and agreements that you have to sign. If you read the terms and condition of Facebook, nobody would really be smart about it, because nobody really understands it.

>> Whatever you post online, whatever you have online, whatever is your data should always be your data and you should be in full control of it.

>> We value a few things about our privacy and how our data, for example, would be used.

>> There is not enough empowerment in terms of how far you want your privacy to be and how are you going to arrange that?

>> I cannot control what information I can take back from the Internet.

>> I think you should be able to say I shared this with this person or I didn’t share with that person or I want it deleted now, right now.

>> Only when you don’t use freedoms, then there is a risk that it slowly disappears.

>> If I was able to identify that gap, practice privacy threat, I think that would already make me feel safer.

>> Right now there is not a lot of privacy left. I just try to moderate what I can still put out there.

>> You have to test the borders of the law, otherwise they will slowly encircle you a bit tighter. You know, there is a box that you tick, “I accept all the policies.” Read that, read that and be conscious about it.

>> This feeling that somebody is standing behind you and looking over your shoulder and maybe doing something unexpected, I don’t really enjoy thinking or having this idea.

>> SOPHIE KWASNY: So you see some of the comments there raised the fact that there is a lack of understanding of how the data can be used. So lack of transparency. Even when we are supposed to click to accept some terms and conditions, do we read them? Do we understand them? So those are amongst the issues.

So how can we tackle that, how can we make sure that the privacy of children and youth be better protected?

All of our panelists have their take on this. I will start again with Carolina, to see what can be done, for instance, so that the children in the scout groups, the Girl Guides, can be better informed. Carolina?

>> INTERPRETER: She was saying that one thing we can do to protect ourselves is not to share too much information about our lives, and to be aware of the photographs we put online.

And then for the Girl Guides: this is an activity pack from the World Association of Girl Guides and Girl Scouts, which is called World Smart. Girl Guides have to do six of these activities to earn a badge for surfing securely online. It’s a project that WAGGGS is working on right now, around this theme of privacy and safe surfing on the Internet.

>> SOPHIE KWASNY: Thank you. So it’s an educational tool for the children to better know how to handle it.

>> INTERPRETER: This is the pack she talked about, and it can be downloaded from the WAGGGS site. So it’s online also.

>> SOPHIE KWASNY: Thank you very much.

Cornelia, what is Microsoft doing to address those challenges?

>> CORNELIA KUTTERER: So we take a three-pronged approach to children’s safety and privacy online. Obviously, as a technology provider, we invest in technology to help with that. We also work with NGOs on education, and we try to participate in those communities to understand better what is necessary. And that goes right down to product development, when we look at what needs to be done.

To give you one example, in Windows 8 we have entirely shifted our approach to parental controls. And that is based on surveys we undertook in multiple regions around the globe to see how parents and children react to parental controls. And there were interesting outcomes: and I swear, I have made the experience myself, blocking content or time blocking is often not really helpful for the conversations between your child and you as a parent. But understanding what your children do is much better.

And so in the approach we took in Windows 8, the parental control default, when you turn it on, is being informed of what your kid is doing. And here I’d say this is really useful when the kids are smaller. You don’t necessarily want to do that when you are dealing with your 17 year old teenager. But you can then adapt, according to the maturity of the kid, whether you want to block, for example by using a white list of adequate Web sites for your children, or, when they grow older, by using a block list and blocking only those sites that are really harmful. So you can use different approaches according to the maturity of your child. And it enables you to have that discussion.

That is just one example. We also have examples in our products where we have privacy-by-default settings. For example, in Windows Live, about a year ago we changed the settings to private by default, so there is no profile available on the Internet. Another default setting that we use is the do not track setting: the do not track signal is on.

So it’s just a variety of usages and technology that can be used. There is not a one-size-fits-all solution.

>> SOPHIE KWASNY: I’m sure that Peter will back you up, because in the Youth Forum’s policy paper on Internet Governance, privacy by design is one of the elements put forward for protecting privacy. So Peter, if you want to...

>> PETER MATJASIC: Exactly. You’ve been talking about data protection and privacy regulation. These, as you know, are increasingly central in defining individual rights online. And what we do in the European Youth Forum is look at the rights of youth and children, and we have that in mind. And you heard from the video what young people want: they want to be aware of what is happening with their rights.

As we said, and as we are discussing at EuroDIG and the IGF, there is no distinction between the offline and the online world. We have human rights, and part of that needs to be linked to the debate on your data protection and your privacy. But from the perspective of empowerment and awareness raising, it brings us back to the notion we launched in the past: that the Internet is a sandbox for children and youth, where they should play, discover, and try things out, within limits of course.

We have limits, as we have limits in the offline world. But they should always be built with them and discussed with them, like in a parent-child relation, where that should be tested between them. And the notion of our colleague here being able to use Facebook at 11, even though that is technically illegal, is something that should be left open, to define by design in that sense. So that’s a very important thing for us.

And a lot of things were said about the terms and conditions. We can all be very honest with each other and say that the biggest lie of the century is that we all agree to the terms and conditions. Even those of us who are aware of these things still find it difficult to read all the details. So what we are claiming, and saying that we need to change, is that those terms should be simplified. So you have the gist of the basic things, put simply, so that an 11-, 12-, 50- or 60-year-old can equally understand what is happening; and then there are the other detailed, technical specifics that are maybe more for e-Commerce business approaches and not so relevant for the individual user. So that is a very important demand from our side.

>> SOPHIE KWASNY: Thank you very much. About the terms and conditions, you mentioned it yourself: even us, we know what the risks are, and still we will click the box and agree. So maybe making them shorter or more understandable will change things in the end. It’s a matter of being aware of the risks.

So, Clara, what are you doing in that respect?

>> CLARA GUERRA: I’d like to go back quickly to the current data protection principles. In fact, the principles we have now in the current legislation are correct. The question is that they are not enforceable. That’s the problem that we have. And the new regulation proposed by the Commission is very shy in that respect, not only concerning the Internet environment, but in particular in what concerns children. There are only four recitals mentioning children. There is one article mentioning that the consent of parents is required if the children are below 13 years old, but only when the data processing involved primarily has consent itself as its legal ground.

So contracts are out of this, and even the legitimate interests of the data controller are out of this. And it only applies if we are talking about Information Society services, or other activities in general, addressed directly to children.

So this is a very, very minor mention of children, just to say that children are here. Because today, in the directive, children are not mentioned at all. The new proposal is shy in practical terms, if not meaningless in most of the cases. So it’s to say that there is something here, but clearly not enough.

Of course, there are other dispositions, general dispositions, like privacy by design and privacy by default. That is one of the best ways to proceed, I believe. We can use technology in our favor, not only against us. Technology poses problems but also offers the answers, if it’s developed in a privacy-enhancing way. So that’s a way to go.

I would like to recall that we think, as John has said, the solution is twofold. Regulation, on one hand: appropriate regulation, enforceable and feasible regulation. Because if the regulation is not feasible, it can’t be enforceable.

And a lot of international cooperation, the ultimate goal would be to have an International agreement, binding agreement. But, in the meantime, maybe a memorandum of understanding between the US and European Union would be a huge step and a good way to proceed.

In terms of awareness, that’s the other fold: be preventive, take preventive action. In Portugal, the Portuguese Data Protection Authority started a project addressed to children in 2008. But it was not a campaign of one or two months, not a video or some brochures; it was a structural project to be carried out in schools, which we think is the natural and most privileged environment to learn these things.

Since today’s children are digital natives, it is absolutely normal that they learn how to use technology in a correct way, how to adopt preventive behaviors, as naturally as they learn mathematics or geography. So we developed this project for the classroom, and we convinced the Ministry of Education that this could be a normal step, so that privacy and data protection issues are formally included in the school programmes.

We have succeeded. This year, for the first time – and there is a brochure about it among your materials – the ICT discipline formally includes, for all students, content concerning privacy, data protection and Internet safety. So we are very happy that we pulled this trigger.

>> SOPHIE KWASNY: Yes, from a European perspective, we are very lucky to have EuroDIG in Lisbon, because indeed the Portuguese DPA has been achieving outstanding results in the field. So it’s a very good example of good practice.

John, what are the actions that should be taken and what is needed?

>> JOHN CARR: Well, across the piece, quite a lot needs to be sorted out. I guess my starting point would be something like this. If we had known then what we know now about the way the Internet panned out, I very seriously doubt that we would have invented what we have got. Because nobody anticipated all of these problems that we’re talking about here in relation to kids and illegal content. There was never a great plan: let’s create this new thing to put kids in danger, or let’s create this new technical environment so we can have more child pornography kicking around. That was never the intention.

And at the root of a lot of it – and this gets you right back to the kids – is the authentication of who the users are. And when you authenticate people, one of the key attributes about them would be their age.

And I think in the end that is where we’re going to end up. As you go around the world, or as I go around the world, and you see and listen to the debates taking place, there is less and less patience amongst national Governments and national institutions with the manifest failings of the Internet.

Sometimes when I come to meetings like this, I feel like I have walked into an alternate universe. We are talking about not interfering with the Internet and about self-regulation. And then you go out into the real world and you encounter a totally different reality.

I’ll give you two quick examples of what I’m talking about. The FTC in Washington, D.C., one of the most expert, knowledgeable bodies in this field on the planet, looked twice last year – in February and then in December – at apps specifically aimed at children. We are talking about five or six year olds, so there was no ambiguity about who the apps were targeted at. They were all for very young kids. They looked at 200 apps from the Apple store and 200 apps from the Android store. And the conclusion they came to in February was simple. They said it was the exception rather than the rule that any of these apps explained to anybody how a child’s data was going to be handled and how the whole privacy thing kicked in. In December, when they came back and looked at it again, they said that in the intervening ten months nothing had changed. So that is not really a triumph of self-regulation. That’s not a triumph of hands off. So the FTC is going to regulate. And wherever you look you get the same sort of thing.

Another interesting statistic came out around that – and we are all into apps right now: 76 percent of all of the revenue that app developers got came from free apps. In other words, they gave away the app as a hook to get you to buy something: an extra pink pony or a laser gun or whatever it might be.

We have a case in Britain where a child of 8 or 9, using a free app linked to their parent’s credit card, spent 4,000 Euros in about half an hour on in-app purchases.

These things shouldn’t happen, but they do. And partly because we have this romantic attachment to the founding principles of the Internet, and we are so hooked into the idea of self-regulation, these things keep happening. And in the end it will be the end of the Internet as we know it.

Last point. Today on the Internet there are more child pornographic images, more child abuse images, than there have ever been. In 1995, which I regard as year zero – when you began to see the Internet taking off – there were only 4,000 known images of child pornography. Last year in Britain, five local police forces – and we have 43 altogether – reported that in the previous two years they had seized 26 million images, from five localities. If you extrapolate that to all 43 police forces in the UK, you get a number in excess of 300 million. The police did a study on peer-to-peer networks and the exchange of child abuse images, only in the UK, and they discovered 60,000 people downloading or exchanging child abuse images over peer-to-peer networks.

The largest number of people ever arrested in a single year for child abuse was 2,500. So even if there were no new cases of people accessing child abuse images on the Internet, the last person would be arrested in 2034. On the 12th of June, Interpol said that no police force is on top of this. What a triumph, when no police force can claim that it is coping with the child abuse images on the Internet.

To get an answer to this problem, it has to involve the industry in a much more energetic way.

I’m sorry I missed the tweets, but somebody will catch me up later.

>> SOPHIE KWASNY: Thank you very much. So now it’s time to open the floor to the audience, to all of you, to react to what has been said. I also want the remote participants to know that we have Silvio here with us, our remote moderator, who will be the voice of the remote participants. So please send your questions to Silvio.

>> AUDIENCE: I want to push back against the whole idea that we really need to protect kids from the Internet. Because it always goes to the extremes of blocking them, or saying you can only go on these Web sites, or your parents always have to watch – and so on. And I understand this for small kids. But we’re talking about an extremely large group of people here, and they don’t just end at the age of 13. Youngsters online, up until 25, maybe 35, make a lot of mistakes about privacy and about what they can actually do on the Internet.

So what I really want to focus on is: let’s stop protecting and fighting. Let’s start raising awareness. Let’s start educating all the kids online. I can tell my friends what they should and shouldn’t do, and there are many, many examples out there. And I really believe in trusting each other, in believing that kids can teach themselves. I mean, we’re natives. We’re practically born on the Internet. Our date of birth is on Facebook, if you scroll back in our timeline.

So what I’m talking about is: let’s really start educating, let’s start showing examples of bad things on the Internet, and really trust in education and raising awareness instead of blocking certain sites or running very aggressive campaigns. Really trust in the kids themselves, because they will find a way.

Because in the end, the Internet isn’t just a bad place. There are many, many cool things and opportunities out there. There are kids who earn money on the Internet, who learn how to programme, who learn how to develop stuff. So instead of showing the Internet as a dangerous place and telling kids they shouldn’t be out there, put them on the Internet and let them explore, because there are many things they can learn and teach themselves, and there are many opportunities.

And let’s just start giving everyone the room that they need on the Internet. Thanks.


>> SOPHIE KWASNY: Thank you. We did start by saying that there were different age groups, and you mentioned up to 25. But no provision on child protection would apply to anyone over 18. That is clear.

At the end of the room, there, Bertrand de La Chapelle. And please do ask questions of our panelists rather than making long declarations.

>> BERTRAND de La CHAPELLE: Two questions. One to John. I appreciate the numbers that you provided. I am still struggling to find numbers regarding the people who produce those images, as I mentioned in the session this morning. I think it would be very important in this debate to know exactly how many people we’re trying to track.

The second question is about age ranges. Child abuse images are a very important thing. Protecting kids under the age of 13 is an important thing. But something surprises me, and I want to know if anybody is trying to address it. For the age range above that, there is a tendency that I find extremely worrisome: it’s called revenge porn sites. I do not know if you’re aware of revenge porn sites. But I consider this an extremely important issue of invasion of privacy, something that is abhorrent in terms of human rights. And I have never heard this issue raised in any Internet Governance discussion.

Is there anybody dealing with this? And if not, why?

>> JOHN CARR: Well, nobody knows, Bertrand, how many producers are out there.

>> BERTRAND de La CHAPELLE: Average numbers?

>> JOHN CARR: In Britain we have what is called a prevalence study, done by the NSPCC. And according to the prevalence study, something like 10 percent of all children have been, or claim to have been, sexually abused to some degree at some point in their lifetime. It may be higher than that. Does that mean that 10 percent of the adult population is doing it? It’s very unlikely. But it’s not an insignificant number.

My problem with what you said earlier, Bertrand, was the implication that somehow it’s the police’s responsibility, or the government’s responsibility, or social services’ responsibility to find them. And I agree. But it’s also the responsibility of the Internet companies and the industry to be part of that effort. Because we know that these authorities cannot cope with this on their own. How many police officers, judges and courtrooms do we think we will have to build to deal with this problem?

>> BERTRAND de La CHAPELLE: Sorry. I have to interrupt you here. That’s what I said this morning. We need a multi-stakeholder International task force that brings together Governments, legal authorities, and civil society groups to track those people. I agree with you. It’s about resources, not surveillance.

>> JOHN CARR: One quick final point. I agree with you on that point. But the people downloading and consuming them are another matter. On a conservative estimate, between one in six and one in seven of the people downloading child pornography will go on to commit offenses in the future. The British police said it could be half. So 60,000 people we know about in Britain have been downloading these images. They have not been arrested, because the police can’t cope with that number of people. And we do know that a portion of them will rape or molest children in the future. It is part of our responsibility to do something about that.

>> SOPHIE KWASNY: Silvio, please. We have a question from the remote participants.

>> SILVIO HEINZE: Okay. We have two questions. One is from Renato Ivis, who says that he knows of cases of children of five who start using the Internet to play games, and that they are of course frequently exposed to information that cannot be fully controlled by parents, even under surveillance. He also sees education as the only way of addressing this.

And this is also something the youth IGF is underlining on Twitter at the moment. At its meeting last month, the German youth IGF wrote that they want media literacy and media education instead of any surveillance.

>> SOPHIE KWASNY: It’s as much about educating the parents as the children.

Any other questions from the room?

>> AUDIENCE: We have been talking today about what is a social problem and what is an Internet problem, and I want to ask: if we shut down the Internet tomorrow, would it stop child pornography? Would it stop child abuse? I don’t think that it would.

And that’s what I’m struggling with here: I think we need to stop people engaging in this behavior in the first place. If this many people are accessing it, there is something wrong – something much deeper than how they are sharing it. They shouldn’t be doing it at all. And I’m really concerned that this is being used to block the positive side of the Internet from children. Maybe it isn’t something that can be solved by blocking. But it has to be solved.

>> JOHN CARR: Some Governments in some parts of the world use child pornography as an excuse for political blocking. I’m opposed to that, precisely because it undermines people’s belief in the sincerity of the position I’m putting forward, which focuses on child pornography from a child protection point of view.

Secondly, you are right. Child sexual abuse has always been around. The Internet didn’t create it; it’s neutral on that. But what is absolutely and completely unambiguous is that the Internet opened up new pathways that didn’t previously exist, and we ought to deal with them.

Let’s suppose for a minute – let’s assume – that it’s right that there is less child abuse going on now than before the Internet existed. Does that mean that we should therefore ignore the child abuse that is taking place now on the Internet? No. Because wherever child abuse is taking place, wherever the images are being published, we have a responsibility to deal with it. So whether the Internet created more or less of it is an interesting academic point.

But it really misses the point by a mile.

Because whenever it’s happening, it’s wrong. And we are here talking about the Internet and what the Internet’s position is in all of that. That’s why we are speaking in this way.

But there is absolutely no question about the effect that the Internet has had on multiplying the availability of this material by a humongous factor.

And very quickly: if somebody is an alcoholic, one of the most important things you should do in relation to that person is keep them away from alcohol. It’s the same with these types of images. There are people who have an interest in them, but that doesn’t mean that we should leave the material within their reach.

>> CORNELIA KUTTERER: We do not intend those uses of the technology that then happen.

So a couple of years ago, the Canadian police came to our general counsel, asked for help, and showed Microsoft the massive amount of child sexual abuse images that are circulating. So we partnered with Dartmouth College to be able to detect those images on our services. And I must say that in those cases the notice-and-takedown steps foreseen in European e-commerce law do not help, because when files are shared between communities there is no one to give notice, as there would be on the open web. This technology has been donated to the National Center for Missing and Exploited Children, which holds the known child sexual abuse images. And the way it is restricted is that they give the hashes of the worst of the worst images to Internet Service Providers to use.

So Facebook and Google are certainly the most well-known users of that technology. Facebook has publicly said that it is a game changer for them in getting rid of the problem. And to my knowledge, Google has also recently announced the use of technology that they have been developing to make that happen.

Now, this said, I want to go back to the youth comment. We do think there are interesting discussions to be had, and this is totally separate. I just want to acknowledge that we are also talking about youth, because so far I have only been talking about child protection.

We did research with the Microsoft Research Centre in Cambridge with social scientists, and one of them, danah boyd, is very famous for her studies on how youth use public spaces. I recommend her studies on that usage. And the sad thing is that, mostly, my age group is unable to understand well enough how teenagers are using it.

So I think there are a lot of studies still to be done to really understand what privacy means to them. But she comes to the conclusion that the privacy norms of teenagers have actually not changed dramatically; only the technology has. Kids use those public spaces as they have used other public spaces before. And one comment she has received, for example, is that teenagers seem to feel that privacy is more of an issue with their parents – it is more the parents looking at those sites than anybody else. So maybe it is not the social networks accessing their data that concern them.

>> SOPHIE KWASNY: Last comment on the issue of child abuse images?

>> PETER MATJASIC: I think what was mentioned about the multi-stakeholder approach is very important. What we are missing so far in the multi-stakeholder approach is an understanding of what youth and children want – listening to them directly. And therefore I’m very happy to have these discussions. It’s great to talk to John, because we have different points of view. We are not necessarily opposing each other, but we have different approaches, and to me it’s important that we take this into account. I’m enjoying the tweets, even though it comes across as if we are from two different worlds. And I want to mention that I am no longer as digitally native as some of my younger colleagues. They are the ones who know best what to do and how to do it with the Internet.

What one generation has that can be of value is its experience of the offline world and its dangers. What is needed is a proper dialogue, and a proper dialogue is based on mutual respect, so we have to listen to each other more to be able to get to all of this.

I’m happy that we are having this session. Media literacy is very important, and our colleague from Portugal mentioned what they are doing in formal education. But what we want to talk about is non-formal education: out-of-school work, the peer-education stuff that WAGGGS is doing. That is where young people hang out more, and they will trust a younger person more than teachers telling them about the Internet. So it’s about using that kind of approach that we have out there, and it’s important to bring that into the discussion as well when we talk about media literacy. It’s a whole new world for many of you in the Internet Governance field. So I just wanted to make you aware of that, and to invite you to discuss during the breaks with the youth representatives what they are doing and what sort of problems they see out there. It’s about protecting and empowering.


>> SOPHIE KWASNY: Thank you very much. Peter has his fan club in the world.

Martin Fischer, I want to thank you also, because you’ll be our reporter for this plenary. From the Young European Federalists: Martin, please.

>> MARTIN FISCHER: Yes, from the Young European Federalists.

I have two different questions that I wanted to address. One is something that came up on Twitter: the right to learn. Don’t young people have the right to fail and the right to learn, like every older person? In particular in the age group of, let’s say, 15 to 18, who are still covered by the Convention on the Rights of the Child, parents still have the right to protect children from certain parts of the Internet. Cornelia spoke about white- and blacklisting, most likely for that age. I earlier shared a statistic about what young people search for: 25 percent of it is porn. And I think a lot of parents have, let’s say, moral ambiguities about that topic. So one of the questions I would address is: do parents have the right to prohibit young people from learning, from exploring the different contents online?

The second one brings us back to the child pornography discussion. I think Microsoft PhotoDNA has to be taken into account alongside what John said, that the numbers are increasing. PhotoDNA was not invented yesterday. And even if Google applies it now, the numbers still keep growing. The photos are indexed; we can relate different photos to each other.

Shouldn’t we find new ways to address it? You talked about the different interventions of the police forces. Why are the police always acting only on the national level and not on the International level? Why is there no cooperation?

Mr. Bertrand de La Chapelle also spoke about an International multi-stakeholder task force. But right now there is apparently no interest from States in following up on these issues at the International level. If we think the Internet is an International construct, why do we address it only on the national level? If these issues are so problematic, why don’t the States make a change?

>> SOPHIE KWASNY: Thank you. At the International level, there are also some efforts being made. For instance, the Council of Europe adopted a Convention which tries precisely to tackle the issue and to bring together our Member States – those which ratify it, at least – to cooperate on that.

>> JOHN CARR: The Lanzarote Convention is fantastic. There are fantastic International agreements. But the young friend is right: how little of that is translated into reality? I’m in favor of peace. I’m in favor of finding a cure for cancer. I’m in favor of ending poverty. And I want to see more International cooperation by police services.

I regard each of those as being exceptionally difficult to achieve. I wish it were different. I wish all of the police forces could settle their differences, and we could have one set of laws, and they could get together in a room and say, hey, dude, let’s do it. That has not happened yet. But that is not a reason for walking in the other direction and saying, well, there is nothing we can do until the police get their act together.

And by the way, in Britain it won’t take many more cases like the ones reported in the press a couple of weeks ago for the British Parliament to jump right in with very severe laws about some of these things. So it’s in the interest of the Internet industry to sort this out, because in each individual country it keeps coming back, time after time, and one day it’s going to result in something terrible.

And Google responded well. They deserve to be congratulated. The amounts of money they are putting in are not large by Google’s standards, but compared with what others are doing they are enormous. Google has a great track record, but other companies need to step up. Sooner or later it will come crashing down, because national Governments won’t put up with it any longer.

>> SOPHIE KWASNY: I’m looking.

>> CORNELIA KUTTERER: Maybe just two points. There is a governmental initiative – companies are not part of it. Last year a Global Alliance against child sexual abuse online was announced, and its list of action points names the take-down of child sexual abuse images online as well. But that is not the first item this group of, I think, 46 or more countries has committed to tackle: it is only number four or five on the list, which names prevention and the fight offline as the prime activities.

Now, on the first question that was asked, I’m almost inclined to answer as a tiger mom, as this man next to me calls me. My children are of an age where they start to learn to make their own decisions. But as a mother I do have a legal responsibility to take care of them. And in an open society we also need the space for parents to do that according to their beliefs. We are many different people, and I guess everybody here educates their children slightly differently. And going back to being a representative of a technology company: we do actually believe that parents have a great role to play in that space, and hence we have technologies that help parents to do exactly that.

>> SOPHIE KWASNY: I’m looking at my watch. And I am sorry for the people in the audience who will not have the opportunity to put their questions. But we will be around, each of us, so you can come to us.

I would like to invite you to join me in congratulating our panelists for their work. So thank you very much. And this is the close of the panel.