Cloud and Big Data: Delivering on the promise while safeguarding privacy – WS 04 2014
12 June 2014 | 14:30-16:00
Programme overview 2014
Session subject
- Big data, new behaviour patterns
- Promoters and opportunities of new technologies
- Data protection
Session description
The advent of cloud computing and big data, spurred by the growth and increasing pervasiveness of the Internet in our societies and economies, holds many promises. We are at the start of a ‘smart world’, where our homes, our cities, our energy, can become smarter thanks to better data monitoring and collection; with the potential to drastically improve our consumption and management of energy and the environment. The same online developments have also seen the rise of crowdfunding, and generally of more citizen and public empowerment.
But with these developments – especially the increased use of (personal) data - come challenges, notably for privacy and security. As the technology develops, it is still poorly understood and often not (yet) trusted by many, which could deter or delay its potential.
As policymakers look to the new questions raised by these developments, it is important that we have a debate including all stakeholders, to ensure that the technology delivers all its promises – from empowered citizenry to better energy utilization and economic growth – while making sure all the necessary safeguards are in place, notably when it comes to privacy and security.
People
- Focal point: Jean-Jacques Sahel, ICANN
- Live moderator: Jean-Jacques Sahel
- Rapporteur: Olivier Crepin-Leblond
- Remote participation moderator: Farzaneh Badiei
- Digital facilitator: Tobias
- Panelists/speakers:
- Elena Bratanova and/or Dr Stenzel, Data Protection Reform, German Ministry of Interior
- Cornelia Kutterer, Director of EU Institutional Affairs, Microsoft
- Arman Atoyan, Founder, Director, X-TECH Creative Studio, Armenia
- Karsten Wenzlaf, ikosom / German Crowdfunding Network
Format of this working group at EuroDIG
Workshop
Protocol. Discussions
Further reading
Messages
Reporter: Olivier Crepin-Leblond, ICANN’s At-large Advisory Committee (ALAC)
- Data protection laws like the ’95 directive are useful but ill-suited to big data because they require individual identification of each piece of data in order to protect it;
- There should be ongoing work for the strengthening and improvement of this ’95 directive;
- Europe has stronger data protection laws but there is less ability to impose high fines than in the United States in case of breach. Thus laws have less of a deterrent effect;
- There is a huge potential for cloud and Big Data including gains for consumers and for the economy as a whole. Big data is accepted by consumers when it makes products less expensive or more suited to their use of the product, yet it needs to be kept in check;
- The problem is not Big Data itself, but the ethical use of Big Data.
Live stream / remote participation
Transcript
Provided by: Caption First, Inc., P.O. Box 3066, Monument, CO 80132, Phone: +001-719-481-9835, www.captionfirst.com
This text is being provided in a rough draft format. Communication Access Realtime Translation (CART) is provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings.
>> Does this work? I don’t know if you can hear me? No? Oh, you can hear me here. Okay, yes, I think that works; no? We’ll just have to speak loud. Okay.
Good afternoon, everybody. Thank you so much. I’m very glad to see many of you around the table. So we’ve got a huge topic to talk about. I have been to conferences that lasted several days on actually just a small portion of this workshop. And this workshop is supposed to cover cloud, Internet of Things and “Big Data.”
And you will hear from very different perspectives, I’m very glad to say. I am actually very proud, also, to have a series of key participants who are, if I’m not mistaken, in majority women. This was, of course, pure coincidence.
What I would just like to do is ask the key participants to quickly just say their name and where they come from and then we’ve got at least one presentation, I think, to just kick things off.
>> ARMAN ATOYAN: Hello, everyone, my name is Arman Atoyan. I’m the founder of X-TECH and Citybugs. I come from Armenia.
>> CORNELIA KUTTERER: I work on digital policy for Europe, the Middle East and Africa, and I work for Microsoft.
>> I’m not a panelist. Okay, I’ll introduce myself. My name is Gema Campillos, Deputy Director General of Information Society Services at the Ministry of Industry in Spain.
>> CLAUDIA SELLI: And I’m Director of European Affairs at AT&T.
>> Elena?
>> RAINER STENZEL: I’m head of the section for privacy in the German Ministry of Interior.
>> This is, of course, as at many EuroDIGs, an informal session, in the sense that what we will do is have a short presentation from Arman about Citybugs, which will give us a flavor of what cloud actually is in real life. What can it serve for? What are the main benefits we can derive from it, in a very concrete way? And then I will ask the other key participants to just say a few words about their perspective on cloud and “Big Data.”
And then I would like to move on to basically just open session Q & A, and I would encourage every single one of you to feel free to ask questions, make comments and generally advance discussion.
This is in the context of EuroDIG, so you’re welcome to focus it on Europe. At the end of the session, I will try to encapsulate the session into three bullet points which will then be fed into the plenary sessions.
Maybe to kick things off before Arman takes the floor: I’m always amazed when I read some of the figures. I think it was actually the European Commission that had the most optimistic view of what the cloud could bring to Europe. I think they said that if we realized the benefits of the cloud, by 2020 the cloud economy in Europe could bring 957 billion Euros to our economy, so close to a trillion Euros. So it’s clearly an important aspect of the European economy, and more than that, it’s important for European society and for us as European citizens.
And with that, and with that sort of citizen angle, I will pass on to Arman to tell us more about Citybugs and what cloud means in reality.
>> ARMAN ATOYAN: Thanks, Jean-Jacques. We have developed the Citybugs platform. This is actually one way of using cloud data.
To save time, let’s watch a short video about the project; afterwards we’ll have a discussion about it and I will share some slides. Can we have the video now?
>> Citybugs.am is a social platform developed to raise socially critical issues. Registered users can easily report different social issues persisting in their communities. By simply using a mobile device or PC, users have the ability not only to report bugs but also to give suggestions and participate in online discussions. How does it work? In order to report a bug or make a suggestion, you should register with the system. The easiest way to register is to use your Facebook profile, or you can register by giving your first name, last name and email. You can also include a photo and phone number. Once you are registered and in the database, you can report a bug, adding details like the bug title, the sphere it belongs to and the district, mentioning the exact address of the bug, as well as choosing to make your report open or closed to the public. Afterwards, public bugs are pinned on the dynamic map, making them visible to the public, and they are widely shared through social networks like Facebook and Twitter. All public bugs are visible on the dynamic map of the portal. Specially developed mobile apps for Android and iOS allow you to report a bug via your mobile immediately from the spot of the bug. Citybugs.am also aims to reveal positive developments in the field of construction and to bring in innovations. This public tool improves communication between ordinary citizens and authorities, making the process more transparent and reliable. Citybugs.am. Let’s make our city a better place to live.
[Applause.]
>> ARMAN ATOYAN: Now let’s pass to a more detailed discussion. Just imagine how easy it is to report an issue with your mobile phone when you are somewhere in your city, and how hard it is for the government to look at all those places, and how many workers they would need to go to all the places and find the issues.
So this is the way how the citizens can help the government to work with them and how they can help the government to work for them.
Actually, the platform is based on Google Maps. And it’s very easy, just a few steps. You can download the app from the App Store or Google Play. This is a unique way of showing how cloud data can work and can support the development of cities. We have mentioned cases related to the city, but it’s not just for that. Some people say it’s like other platforms for reporting potholes and that type of issue, but this platform gives the ability to report and discuss other cases too, for example related to healthcare or education.
What we had in mind was to build a platform where discussion and some kind of dialogue can go on, where this is visible and can be reported to the government.
And another good thing is that every bug has its own countdown. Everyone can see how long an issue has been reported and not taken care of. This is something governments do not like, and that’s why they try to minimize the number of days that pass on an issue.
So we developed the platform with the help of USAID; maybe you saw the logo there. They helped us bring the platform to three other cities. So this is working in our capital, and we scaled the platform to another three cities. Then we moved forward and integrated one more city; now we have five cities. And currently we have good friends living in Toronto who asked us to start thinking about developing the platform for Toronto. This is another huge city that, like other cities, has some problems that can be solved with the help of the citizens.
That’s why I wanted to show the video and have more discussion instead of the slides. Thank you.
>> Thank you. I wonder if anyone’s got any question for Arman, any immediate reaction? You might want to have it for your city, for instance.
>> Can you tell us a little about your privacy policy? And could you sign up as a citizen in another name than your real identity?
>> ARMAN ATOYAN: Actually, all the issues you report are related to city issues, and they are open. But if you think something could somehow hurt someone, then you can report it as private. It will be visible just to the government, not to everyone, if it’s somehow related to another person.
>> But the government will know exactly where you are going. They will know all the data about you.
>> ARMAN ATOYAN: Yeah, they will know about the data.
>> Do you trust your government was the question? I think we should ask it from –
>> If you are asking me personally whether I trust my government, then, yes.
>> Do we want to do a round table of all the governments in Europe and ask if you trust them? I’ve read some interesting statistics about it, but maybe in the break I’ll share them with you.
>> Yeah, just a quick question. In which country has this application been developed and implemented?
>> ARMAN ATOYAN: It’s working right now in Armenia. But the reason I am here is that I want to find some partnerships and try to scale it to Europe and other parts.
>> There are similar systems elsewhere in other cities, but not as comprehensive as this one, I think.
>> ARMAN ATOYAN: And one more point on what’s different. There are some governments that have their own tools, and citizens typically do not love those kinds of systems because they consider them governmental tools. The difference with this platform is that it’s a social platform. It’s open. It doesn’t belong to the government; it’s owned by the citizens.
>> MODERATOR: Do you have a question? Okay. Remote participation, I think?
>> OLIVIER CREPIN-LEBLOND: Thank you very much, Jean-Jacques. I am the remote participation moderator. I was going to make the point: could you introduce yourself when you speak, because for the remote participants it’s very difficult to guess.
>> MODERATOR: I am Olivier Crepin-Leblond. I’m sorry: I should have introduced myself.
[Laughter]
Jean-Jacques Sahel. Thank you. We’re all friends. There’s a question just behind Olivier.
>> Karsten Schewe with DENIC. Maybe I haven’t understood it yet. If a city bug is being reported, what about the severity of the report? Maybe a paper bin is full on a certain day, but it might actually be the case that it’s going to be emptied, say, two hours later. So who would check the reports for sanity, let me put it this way? Because otherwise the back office of the system could be dying under a DDoS attack by a plethora of reports, and then the real things can’t be detected any longer.
>> ARMAN ATOYAN: We have special statuses for issues: trusted or not. If a person reports regularly, the moderator assigns a new status to that user, and this means all the issues provided by that user are trusted and approved.
The system also has volunteers: some of them, for example, mark that they live in some part of the city, and they confirm the bugs. It’s not for all of them, so some bugs get confirmed by the volunteers living in that part of the city. The idea is to have something social that belongs to everyone. That’s why people are, let’s say, happy to assist and happy to be part of it, and we mention that this person is helping this district of our city.
And something I want to ask, because there are a lot of policymakers and people involved with privacy and policy here. If we have time, I will tell you the story of how we got the government working with us, but the question is this: imagine, for example, that after two weeks we have the system set up in Berlin, and suppose we have no agreement with the government or the city mayor. If some citizen reports an issue, and the issue appears on our map, and our moderator sends the issue to the government, will it be considered a real request from the citizen? Is there any law that, let’s say, pushes them to solve this issue because a citizen reported it? Does it matter where the report happened, in their email system or on another platform like Facebook or Citybugs?
When we had this discussion with our mayor’s office in our country, we told them: guys, you are solving the issues that people write on Facebook, right? Yes, they said: if someone writes on our mayor’s page, we immediately solve it and mark it as solved, because a lot of people are watching Facebook and they can see that there’s some issue.
So we explained that it’s the same: that platform is a company registered in Ireland or in the U.S., and this one is registered in Armenia; what’s the difference?
So we had an explanation for them. And a month after we started, we got the call and we got official confirmation that the government is watching the system and will answer all the issues written in the system.
So now the question is: if we do the same in Berlin, will it be considered a real request from a citizen? Must the mayor’s office, let’s say, answer that request, or are there no such laws?
>> Well, I’m not part of the city of Berlin and don’t know what the officials of the city of Berlin would say. But I could imagine that maybe some authorities in Berlin might be interested in this project.
First, it might be the data protection authority. And they would raise some questions which were raised here again: Well, what about the privacy policy? Is the data really safe there? What about identification? What about reporting not only bugs in the street but also, well, personal data of others? Maybe also names of civil servants who didn’t react or something like that?
And then I’m pretty sure that the Berlin data protection authority would raise some concerns. I mean, maybe you can handle it and get an agreement with them, but maybe they would raise some concerns. And maybe, imagine, these concerns are so strong that they say: well, you cannot start your service here in Berlin.
But then an interesting thing might happen, because then you might come at it from another side and ask the question: well, what about freedom of expression? Isn’t it the right of citizens to report what’s going on in their city? Maybe not as an official request, but as: we stand up for something which doesn’t work well in the city, and it’s our right of expression and our fundamental right to make these things public. And maybe we can also use electronic means, or the new means of “Big Data,” for doing so. And this brings us to the more general question: for every service, what is the value of the information, and the value of using this information for exercising your fundamental rights to freedom of expression and information on the one side, against privacy, the private sphere and data protection concerns on the other side?
And I think “Big Data” is such a big problem that we as legislators should think about it. Yeah, it’s an opportunity. It’s a chance and a risk. And I’m not sure whether the existing tools we have in our law are really sufficient to handle this problem in a proper way.
>> MODERATOR: I think we should have a proper discussion a bit later. And I think this question of balance is key. Policymaking is about balancing various interests, laws, et cetera, for the better of society. And that’s the challenge. To be a good policymaker is to be good at balancing, and that’s not easy.
So let’s maybe come back to that. And to sort of add to the perspective, I’d like to pass on to Claudia to give us a business perspective on all this.
>> CLAUDIA SELLI: Yeah, well, the mobile Internet, as you know, has brought a global wave of innovation that is transforming our lives, the way we manage our lives, the way we connect with people, and the way we operate businesses. At AT&T, we have seen an increase of 50,000 percent in mobile data traffic since 2007. And certainly the past years have been incredible, but the opportunities ahead are even greater. In the next wave of innovation, basically everything will be connected, from cars to homes, in a smart way. Content will follow us no matter what device we are using. Education will be much easier to access no matter where people are based. And our homes and our cars will become the natural extension of our tablets and smartphones. This will be enabled by ultra-fast mobile communication with access to content located in the cloud. And the impact on our lives will certainly be great. Studies have shown that by 2020 the number of wireless devices in Europe will reach 6 billion, more or less, and on the other side the market for connected living in Europe will be equal to 234 billion Euros, which is about 6 percent of European GDP.
And the Internet of Things and machine-to-machine will allow consumers to remotely control different machines and their functions. Just to give some examples: concerning cars, automobile manufacturers are connecting the car to allow users to control their vehicle, for example to plan journeys, to locate a recharging station close to the car, or to cool the car down or turn on the heating. Or, for example, in the U.S. we are offering Digital Life, a digital home service, so people are able to control their house, check on the security of the kids or the pets, turn the heating on or off or, for example, the alarm as well. So these are just some improvements.
And also in the healthcare system, a doctor can monitor chronic diseases from far away; they don’t need to be present anymore. These technologies are certainly creating a new market.
And at AT&T we also developed, for example, a SIM card which allows enterprises to connect these machines wherever they are globally, using a single provider. So this is certainly opening up a business market, and the next years will continue to be even greater: we will certainly see growth in network usage, we will see growth in investment, and we will certainly see customers continuing to ask for more and more mobile functions and operations.
And certainly this will raise policy questions and policy issues, questions of privacy, of security, of transparency. And as you were saying, it’s important to strike the right balance, and it’s very difficult at the same time to agree on all the nitty-gritty details. But certainly we can set the objective of creating policy frameworks that are flexible and that don’t lock in innovation but allow innovation to continue and to grow, because we cannot really predict the future.
And, you know, sometimes policy is just a little lagging behind. So we need to have high-level principle established that would allow the business to continue to grow. So I will close my comments here.
>> JEAN-JACQUES SAHEL: Thank you very much. I wonder if there are any immediate questions or comments? I feel this is actually a good time to pass on to Mr. Stenzel to give us the policymakers’ view. Tell us: what’s the solution?
>> RAINER STENZEL: Well, first of all, we have to bear in mind that we already have existing law dealing with all these issues. We have data protection law in Europe and in other areas of the world. In Europe, for example, you have the Directive of ’95. And if you apply the Directive and the existing data protection law to all these issues, you sometimes come to very, very complicated questions. Because the whole of data protection law is based on the idea that you can identify a single piece of data and say what the path of this data is, from one entity or one controller to another or to a processor and back, and so on. But this was all created in an environment where we had big databases. They filled a room like this. And we had data sets of name, birthday, where I live and so on.
And in this world, you could separate one data from the other data and give the responsibility of the one or the other to a controller and so on.
With “Big Data,” we have to face the phenomenon that data is all around us. It’s mostly everywhere: in the future, in cars, as you described, in your electricity, in smart meters and so on. Everything is connected to everything else. And with “Big Data” you can identify correlations, aspects, new information on groups of persons, on single persons and so on.
So if you apply the system of the existing law with the tools we have there, and you say, for example, that one or two means in this toolbox are notice and consent, you might ask: okay, how does that work in a “Big Data” environment? If you have a data analysis with, let’s say, the data sets of 100,000 people, you are not really interested in those persons but in what they are doing, where they live and so on, because the new information you would like to get is something like health risks, diseases and so on.
If you take the principle of notice and consent as an element in this toolbox, you have to identify every one of these 100,000 persons. Although you are not interested in them, for data protection purposes you have to give them the information: “you are part of this ‘Big Data’ analysis; I am not really interested in you, but in where you live and what you’re doing,” and so on. And then you have to ask for consent. I mean, it’s pretty obvious that this might be a good tool for some data processing that is going on, but it cannot be the solution for everything. Because if, in the end, everyone has to decide on their own whether it is fair or unfair to have this analysis, it’s getting more and more impossible with more and more data processing in our environment.
So as a policy maker, we should think about new elements in our toolbox. We have to ask maybe about regulating the process of identification, for example. It’s not impossible to identify a person in the “Big Data” analysis, but if you do so, maybe you can regulate this process as a kind of a border. And you can say that this has an effect on a private sphere or personal aspects of a person.
You could also regulate, for example, the degree of discrimination. If a “Big Data” analysis comes to the result that you are living in an area, an environment, which is pretty risky, and your insurance rate rises because you fall into this category, you are discriminated against, in a way. And this might be something you could regulate in one way or another. But this is a new category.
We don’t yet have elements like discrimination in our data protection law; we only have protection of the data itself. And I think we should think about these things, the elements behind the data: what is really the subject of protection, and what would we like to achieve?
And in the end, I think we can only handle these problems if we also see the chances of “Big Data” analysis, of the Internet of Things, of communication, of more information, of reporting which things are not so good in your environment, in the city, in what the government does and so on. We cannot say we will go back. The future cannot be going back and saying “we don’t want to have any data processing at all.” We have to handle this. We have to face the problems. And I think we need some new elements in the toolbox.
What is interesting is that around the world there are discussions on these things. There are ideas for how we could regulate this in a proper way. And for us in Europe, the challenge is to find these solutions as soon as possible and to introduce them into our new law, in the reform of data protection law in Europe, so that in the end we have a more modern and better data protection law than the existing one.
>> JEAN-JACQUES SAHEL: Thank you.
>> Hello? My name is Max Cavenstein. I work as a researcher at a research institute in Berlin, especially on data protection and the principle of purpose limitation. This is a natural contradiction to “Big Data,” where you never know the purpose in advance. And I’m very interested in the last aspect you mentioned, the time pressure: that you have to find the solution now, to find new regulation instruments and put them early enough into this new legislative process. What do you mean specifically? Until when do you need these solutions?
>> RAINER STENZEL: I would say yesterday.
No, the problem is that we have to face many, many issues at the same time on the European level when it comes to data protection. We have to deal with our public authorities, because the public sector is covered by the new regulation. We have to deal with Google and Facebook and Twitter. We have to deal with balancing fundamental rights like freedom of speech and the right to information, but also the right to a private sphere and data protection. Then we have to face new technical developments, all at the same time. And we have a lot of pressure from the outside, from the public, saying “we need the new law.” Sometimes the next sentence is: it doesn’t matter what it looks like, but we need it now.
But we think it is necessary to take the time to improve this regulation. We are not asking for a never-ending story. We work pretty hard with our staff, in Berlin and in other capitals, on proposals which bring this regulation forward to a better one.
And the discussion, not only among policymakers but also in the scientific arena, has just started. And we are getting some input from that side as well.
We would sometimes wish for more input, but we see more input coming. And the only thing I can say is that we work as hard as we can and as fast as we can.
>> JEAN-JACQUES SAHEL: I don’t want to go into the debate too soon because we have at least one other key participant intervention. I would just ask a quick question; you don’t have to answer. Considering how fast technology evolves and how fast uses of data evolve, should we have a new law every, what, two years?
>> RAINER STENZEL: No, definitely not. Four years ago we had this debate on Google Street View in Germany, and a new regulation was proposed regulating the camera on top of the car, as a reaction to this debate. And we thought this was not a good idea, because we cannot have new regulation for every service and every specific situation.
Data protection law or privacy law has to be technology-neutral, that’s true. So if we talk about new elements in our toolbox, we can talk about principles, about what this is about, for example introducing the element of discrimination. This would help not only data subjects and citizens to say: okay, what is my right? Where is my harm in the end? And what is my interest, my fair interest, in this process of data processing?
It also gives companies and others, everyone who processes data, more legal certainty: you know these are the general principles I have to bear in mind when I design my services. And if you say, for example, that purpose limitation has its limitations, maybe it’s good to have an additional principle of, for example, context, or appropriate or reasonable expectations in an environment. Then you can decide that I might have the reasonable expectation that my television is not surveilling me when I switch it on, because I’m in a private sphere, a private context, and so on. And I think in the end, if you have these principles, this would help a lot. And then we need jurisprudence on these things, so that we can develop the measures for really balancing all of this, because as a policymaker it’s absolutely impossible to have in mind all the cases that we are regulating, especially when it comes to data processing, because everyone is processing data every day.
>> JEAN-JACQUES SAHEL: Thank you. That reminds me of a very bad joke we had in France a few years ago, before the fall of the Wall. We used to say: in the Soviet bloc, you don’t watch the television, the television watches you. But with that bad joke over, I would like to hand over to Cornelia. She might have some direction to give us on how we might address these concerns, but also tell us more about the positive perspectives of cloud and “Big Data.”
>> CORNELIA KUTTERER: Okay, thanks. Yeah, we are already right in the middle of the issues and problems, so let me just start. I want to talk a little bit about opportunities and business realities, and then go into the data protection area, which I think is much wider than data protection alone.
On the opportunity side, I’ll make it very personal. I live in Brussels. I’m German. My parents live in Germany and speak mainly German. My kids are raised mainly in French, because I’m a full-time working mama and I can’t really implant my German very well. So we have a real problem: my kids cannot communicate with my parents. They use Skype, and that doesn’t really bring them much more than waving hands. Skype has just in the last couple of weeks announced Skype Translator. Skype Translator allows real-time translation, so as of the end of this year my parents can speak in German to my kids and they will get a translated voice in French, and they will be able to communicate. And that’s the opportunity of “Big Data”: this kind of translation is only possible through data analytics. So this is my personal opportunity. I’m really thrilled about this. And that is just one example.
And there are plenty of others mentioned before – Citybugs, I want to have that in Brussels, definitely. It will be full.
The business reality is the following. IDC released a study in March 2014 looking at over 2,000 middle-sized companies across multiple sectors: retail, finance, health, you name it. They looked at different processes in those companies to see what the data dividend is and how, through the right infrastructure, the right data management, and the right business intelligence and analytics, they can materialize this growth opportunity. And that growth opportunity is around 60 percent: 60 percent in cost efficiency and in better services for customers – at the end of the day, being more competitive. And when it comes to competitiveness, it is the companies that use these tools that will survive. So that is why I’m saying this is a business reality. This is coming. And we have opportunities and realities. Just for clarification: Microsoft, in that space, provides infrastructure and data management in our B2B business – in data management we are certainly one of the leading companies – and then there is analytics and business intelligence. And as we move into the machine learning area, this will increase; I think we are really just at the starting point of that revolution.
And now to the issues we need to think about as we move forward in this data-driven world. I would say, in the context of this workshop, talking about privacy and security only doesn’t really get us there. One of our Microsoft researchers has said that what is really at risk is the social fabric of societies. So we need to be very careful and think about this for the future.
I think one part of that picture has already been mentioned: discrimination. “Big Data” will allow more predictions. It will allow predictions by combining different data collections that independently don’t say much; but once you pull them together, suddenly you can probably predict that you will, with higher probability than your neighbor, pay your insurance fees – and so you will pay a lower price for the insurance, just to give one example.
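To make the combination effect concrete, here is a minimal, hypothetical sketch (all names, attributes and scoring rules are invented for illustration) of how two individually weak datasets, once joined on a shared identifier, can support a risk prediction of the kind described above:

```python
# Hypothetical example: two datasets that each say little on their own.
purchases = {"alice": {"gym_membership": True},  "bob": {"gym_membership": False}}
location  = {"alice": {"late_night_driving": False}, "bob": {"late_night_driving": True}}

def risk_score(person):
    """Naive combined score: each risk-associated attribute adds a point.
    Joining the datasets on the person's name is what makes the
    prediction possible at all."""
    score = 0
    if not purchases[person]["gym_membership"]:
        score += 1
    if location[person]["late_night_driving"]:
        score += 1
    return score

print(risk_score("alice"))  # 0: lower predicted risk, lower premium
print(risk_score("bob"))    # 2: higher predicted risk, higher premium
```

An insurer applying such a score is exactly the discrimination scenario the panel discusses: neither dataset alone reveals much, but the join does.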
So social welfare – the social arrangement whereby we as a society carry the costs for the individual – will come under much more pressure. I think that is where the biggest issues will really have to be talked about.
So that brings me to data protection, but it’s broader than only data protection; it’s data usage. We will have to have an ethical conversation around data usage. There are issues about accountability and liability – algorithm liability. There are questions, of course, about data protection. So it’s really also about the social fabric: even within the European Union alone we have different social agreements in society between Paris and Berlin, or Berlin and London. So I think that is where we are really at the beginning, and I have heard too little about having a serious discussion on ethical data usage.
A couple of last comments on the data protection Directive and the data protection Regulation. The principles of the Directive are, I think, largely just replicated in the Regulation. Consent, under the vote in plenary, is strengthened; we could discuss whether that is actually a useful tool to strengthen data protection. Data limitation, when you have consent, doesn’t really matter that much – it depends on how it is discussed. And again this brings me back to: where are the other boundaries that you need to provide to make it actually socially acceptable? And legitimate processing, where you actually have a higher value for public society – we need to take that into account, for example for research. So plenty to discuss.
>> JEAN-JACQUES SAHEL: I told you this could be a conference over several days.
I’ve got a question here.
>> Hello, my name is Peter Herm answer, I’m associated with business. Anyway, I’m from Germany, and I have a question. I’m going back a little bit in history, maybe to some history that some younger ones don’t remember. I think it was in either the late 70s or early 80s when in Germany we had a census – the basic statistical counting of the citizens. And there was a huge protest at the time in Germany, because people still remembered too much about the history of Nazi Germany and the then still-active Stasi in East Germany. They were very wary of a state that knew too much about its citizens. A similar thing happened when our personal identity cards were made machine-readable: there was huge protest out of concern that the state would collect too much data and have it too automated. And there was also, I think, the principle that whenever state agencies collected data about an individual, they should, on purpose, not cooperate and share data with each other, so as to prevent the state from gaining a full profile of each individual.
Now, today, 30 or 35 years later, times have changed a lot. We voluntarily give up a lot more very personal and sensitive private data to “Big Data” collectors like Google, Facebook, Microsoft, Apple, whatever. And we voluntarily synchronize our cell phones, which know everything about us, with a cloud service where we don’t know who has access to that data. We also hear that retailers have gone even a step further by making use of the technology in an unwanted way. Every Internet-capable device has a MAC identifier for that particular hardware card. Retailers have exploited the fact that most modern smartphones are also Internet-capable and will constantly search for an available WiFi hotspot, broadcasting their MAC address as they do so. They have now gone to the point of combining small micro WiFi hotspots within their shops with cameras. So they can recognise you not only by the MAC address of the smartphone you’re carrying but, because they’re taking a picture at the same time, they know who you are, what gender you are, how old you are, how you dress, and what shelves you have looked at.
Now, I know Apple has just announced that they’re going to randomize MAC addresses in the next generation of iPhones and iPads. But again, it’s not of much use, because modern smartphones not only have a MAC address, they are also tied to an email address, so the problem is not solved. So that’s the private part, where we realise that data is collected about us without anyone asking for our permission. Actually, I don’t know whether this happens with retailers in Germany; I know for a fact it happens in the United States. But the point is that a lot more data is collected about us in the private sphere than we would willingly want to give up. We do it unknowingly and unwittingly.
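The tracking just described works because phones broadcast their MAC address in Wi-Fi probe requests. As an illustration (the MAC values below are made up), this sketch shows how the “locally administered” bit distinguishes a randomized MAC of the kind Apple announced from a burned-in one, and one mitigation a retailer could apply: storing a salted hash instead of the raw address.

```python
import hashlib

def is_locally_administered(mac: str) -> bool:
    """Randomized MACs set the 'locally administered' bit:
    bit 1 (value 0x02) of the first octet of the address."""
    first_octet = int(mac.split(":")[0], 16)
    return bool(first_octet & 0x02)

def pseudonymize(mac: str, salt: str) -> str:
    """Salted hash instead of the raw MAC: rotating the salt (e.g. daily)
    prevents long-term tracking of a returning device."""
    return hashlib.sha256((salt + mac.lower()).encode()).hexdigest()[:12]

print(is_locally_administered("02:00:5e:10:00:01"))  # True: randomized-style
print(is_locally_administered("3c:22:fb:10:00:01"))  # False: burned-in address
```

As the speaker notes, randomization only helps against passive collection; once a user logs in to a service, other identifiers re-link the device.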
And on the public side, the statistical office of course continues to collect data on an aggregate and anonymous basis, and I think that’s extremely valuable – as is a lot of other data collected at a “Big Data”, anonymous level. So when we look at the future, not only do we need to make sure that we balance and protect our privacy rights against that invented super-right of security that our former Interior Minister proclaimed, which does not exist in our constitution; I think we also need to make sure that there is no possibility for governments to access the private data. Because we assume that when we share private data with “Big Data” collectors, it stays anonymous – but we know that governments make use of that private data as well, with warrants and subpoenas.
So, to make things short – which is difficult – I think Mr. Stenzel mentioned that he is seeking input. I would welcome a fairly long, drawn-out public consultation process whereby experts and the concerned public can provide input into the process of finding the right balance and the right principles that would then inform laws and regulations. I think some of these principles were discussed at the NETmundial meeting. But we need full-fledged, ongoing public input on data collection, making use also of the benefits that we are all deriving from it. Thank you.
>> JEAN-JACQUES SAHEL: Balancing privacy and my own security.
Thank you. Actually, maybe just a show of hands: who has got a loyalty card from a local store? All right. They know everything about you, you know that. So you don’t need to look far to online platforms, et cetera. People have been collecting your data and know everything that you’re buying, and can pretty much deduce what you do every day from that. It’s not new. We have been giving that data away for decades. So let’s keep that in mind.
It’s not that I want to play devil’s advocate, I’m just quoting something I heard yesterday at IGF Deutschland from several members of parliament, on a report they have just done in Germany. They were saying that when they get out of conferences like this one and talk to people on the street, people don’t seem to worry so much about privacy. I wonder, if we did actual surveys, what we would find. Do people really worry so much that they will take steps? Clearly some people protest sometimes, but I wonder how widespread that is.
>> I would expect that if you ask people on the street, “Do you care about privacy and data protection, and do you want higher or lower standards?”, a huge majority would say: higher standards, and I care about it.
Then if you ask them how do you act? Is there something you have in mind if you choose your social network, for example? The reality is different.
There were social networks in Germany which were pretty much based on the idea of good data protection and privacy rules. But in the end the services were – to put it very generally – a little less sexy than others, for the users. And the choice was pretty clear for them: those networks all more or less closed down and don’t exist anymore. But the big one, the network we all know and probably use, with a bad reputation on data protection and privacy in Germany – whether that’s fair or unfair – had huge success. So this is totally different.
In the end, I would say: if I get something for a better price in my supermarket because I have this paper card or something like that, I say, yeah, where can I sign? I will get it – even knowing that this is bad for privacy reasons and that the data is sold on and so on. So this is a very interesting phenomenon.
On a general, theoretical level, everybody would agree that we need stricter rules on data privacy and so on. But not that many people really act like that.
>> JEAN-JACQUES SAHEL: I studied marketing, so I don’t have a loyalty card for my store.
You had a question.
>> The third question could then be: Do you mind paying twice the price of your neighbor because you’re much richer and you have a craving for that product? Did you get that?
>> This is the question of discrimination – or not – because of your profile and your data. I think it’s a very fair point. Is it fair or unfair? As the user of a specific computer or something like that, I probably have a better income, for example, and I get a higher price online. Is this fair or unfair? I mean, this could be regulated. You could say it’s fair, or not fair.
>> You could regulate it but could you control it and stop it?
>> RAINER: Well, if you have a principle of non-discrimination, for example, and you say this is an unfair or unjustified discrimination, then you can have a reaction to it – this would otherwise be a competition advantage and so on. So you could react with sanctions, whether they sit in privacy law, data protection law or competition law.
But the first question is: what is the answer of society? Is it fair or unfair? And this could be decided one way or the other. Because you have the same with insurance rates: my car insurance here in Berlin is higher than in Brandenburg – I don’t have a garage here. Is that fair or unfair? Society finds rules on these things. But the principle of non-discrimination – I agree, I think it’s not the only one. There are others. And as soon as possible we need some thinking, some debate on this. I think it’s good to initiate the public debate on these things as soon as possible, so that we have something more in our toolbox. And the ideal solution would be that the tools in that toolbox are similar all over the world.
>> I would actually, just on that last piece, say we need to go the other way, in the other direction: we shouldn’t even think about doing this at a global level, or even a regional level but, rather, in specific sectors, because the situations differ. First of all, discrimination is, from an economic perspective, perceived as neither bad nor good. There are price discriminations which are really good: if you think about the European market itself, some goods are offered at a lower price in some of the lower-income countries within the European Union than in the higher-income ones, and that’s really good. So that is, I think, not the real risk. The question that will be really hard for governments to decide is: how much will their role be in defining where society will cover the costs for individuals who have higher risks associated with them – risks which we can now, with all this data analytics, define precisely – and how much we will lose a certain acceptance from society for covering those risks associated with specific behaviours, specific DNA, whatever it will be in the future. That is where we will see big differences between regions, and possibly big differences within the European Union, in how these things develop. And the second piece is that this could also develop differently in different sectors: if you think about the financial sector specifically, the debate in Germany about Schufa and Schufa data has been ongoing for many, many years. Multiply that across many different sectors. I think these are areas where you will not even be able to think about implementing this in the data protection Regulation. Even so, I agree a discrimination clause might be interesting to think about.
>> JEAN-JACQUES SAHEL: Thank you. I open for comments and questions from the floor, please.
>> I think there is no question of whether to share the data or not. We can understand that amazing things can happen if there is a lot of connected data and the data can make the right suggestion – it can save people’s lives. For example, say you are sharing data with your car about the roads. Just imagine that is connected with weather data and with statistics about how far you are from places where accidents usually happen. You could get a notification, a message on your mobile: be careful, this is a place where, in this weather, more accidents happen. And you can drive at a lower speed. I mean, if we get the right data connected in the right way, and we have the right laws that do not allow the companies who collect the data to share it with third parties, everything will be nice. So we do not need to be afraid of sharing the data. We just need to be sure there are the right laws that do not allow them to sell it or share it with third parties.
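The speaker's example can be sketched in a few lines. Everything here is hypothetical (road segments, weather categories and accident counts are invented), but it shows the shape of such a combined-data warning service:

```python
# Hypothetical accident statistics, keyed by (road segment, weather):
# accidents per year on that stretch under those conditions.
accident_stats = {
    ("A1-km42", "rain"):  12,
    ("A1-km42", "clear"):  2,
}

def warning(segment, weather, threshold=10):
    """Combine location, weather and accident statistics into a driver
    alert; return None when the segment is below the risk threshold."""
    count = accident_stats.get((segment, weather), 0)
    if count >= threshold:
        return ("Caution: %d accidents/year on %s in %s. Reduce speed."
                % (count, segment, weather))
    return None

print(warning("A1-km42", "rain"))   # alert string
print(warning("A1-km42", "clear"))  # None: no alert needed
```

The privacy point follows directly: the same join of location, weather and historical data that produces the alert also reveals where the driver is, which is why the speaker ties the benefit to rules against onward sharing.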
>> JEAN-JACQUES SAHEL: Thank you. I’m going to throw this over.
>> Okay, thank you very much. Jan Krancke, Deutsche Telekom. I just want to add a different perspective to the debate, and that’s the question of international comparison: how different countries treat this. In general, on data analytics, we talk a lot about taking care of data security in principle, but for some services – for example, if you use Google services – you have to agree that data analytics takes place, in fact.
So, in fact, if you use an Android smartphone, you cannot use it fully if you don’t agree to Google’s terms and conditions. This is implicitly connected to the service itself: the service is unusable without agreeing to these data measures.
So the first question, just for debate: should there be some kind of separation between using a service and explicitly agreeing to data usage or not – while nevertheless being able to use the service? This is more the European and German perspective, with an implicit opt-in.
And on the international side, especially in the United States, you have directly, implicitly agreed to use of all the data. So this is a different model, which really calls for a discussion at the global level: what is the right level?
And then to add a further perspective of the question of the level playing field between the companies. Of course it’s extremely attractive for Google, Apple and all the other players to analyze data and make money out of it.
For a European company, or for us as a telecoms operator, we are – based on German and European law – only very restrictively allowed to do it.
But in the international dimension we are competing with each other. So there is an imbalance. I don’t say what the right balance is, but I just want to highlight that this is certainly a big issue which needs to be tackled.
>> Can I? Do you want? Okay. A couple of things. First of all –
>> CORNELIA KUTTERER: Microsoft’s advertising revenue is around 4 percent – not the roughly 95 percent it is at Google – when it comes to analytics. As a user, from a personal perspective, I would be more worried about the use of Google Analytics on all the websites I’m visiting, where I’m not even aware it is used. So, on that specific point: Microsoft complies with European data protection laws. The area where data protection laws differ might be specific to your particular sector, which is a different issue; we can discuss this.
The telecommunications directive has some other rules than the ’95 Directive. So there is network regulation and services regulation, if you try to translate those different rules into today’s terms.
There is also the missing comprehensive data protection law in the United States, which at least Microsoft – and I guess a lot of Internet companies – have actually called for. In specific sectors, US data protection is pretty advanced. And now I come back to something I was advocating in my former life, a long time ago, as a representative of a consumer organisation. What I always said in this comparison is: at least in the United States there is enforcement. Once the FTC does enforce, companies are under obligations that often last for many, many years – compared to our ’95 Directive and the fines you get here, which are not really impactful, to say the least. So, level playing field? I think when we go into the details the picture is more blurred, more complex. But I would like to point out that at least – and I cannot speak for every company or every Internet company – Microsoft applies European data protection laws in Europe.
>> CLAUDIA: Yeah, I want to come in. I might echo some of the points that Cornelia has made, but certainly AT&T complies with European data protection law. The type of business we have in Europe is certainly different from what we have in the U.S., because we are connecting big multinationals – it’s business to business. But of course we do respect the law that is in place, first of all.
Secondly, I wanted to stress that we care about our customers. Because ultimately they allow our business to grow. So for us, the trust of our customers is key.
And in the U.S., what we try to do, of course, is to be very transparent with our customers about what we do with their data. And certainly we won’t use or sell the information of our customers unless they ask us to do so. For example, we had a service called AT&T Alerts, which can alert customers about discounts or commercial advertisements. But we won’t, of course, target customers unless they ask, or really want us to do so. And that is explicitly highlighted in our privacy policy. So, yeah, I just wanted to make these two comments.
>> JEAN-JACQUES SAHEL: You will soon have the app for Windows Phone, no doubt, Arman.
Other questions, maybe? Jan, do you want to come back?
>> JAN: I think the difference possibly depends on the persons sitting around the table. I totally agree that AT&T and Deutsche Telekom will behave similarly. But what I mean is more those companies whose core benefit is data analytics – whose whole business model is data analytics and selling data and making money. That’s all. And we all know that most of the pure OTT companies do nothing else but selling data, analyzing data and making money out of it. So there is a difference at this level. And they are, in fact, competing with classical telecoms operators in some instances. So this is a level-playing-field question: you can ask whether it is adequate or not.
>> I’d like to hear about this: we’re seeing some companies, a few companies, competing on privacy today. Can you give a perspective on that? Are we going to see much more of it? And in Europe, don’t we have more of an advantage on that issue than, for example, American companies?
And the last thing: the Foreign Minister of Germany said what Snowden did was horrible, but haven’t his revelations cost U.S. companies a lot of money?
>> CORNELIA KUTTERER: The first thing, on competing on privacy: it’s a constant debate in our business. If only users would choose their browser based on privacy, that would help. So, yes, we do compete on it, but it cannot be [Inaudible] the last years, which sometimes brought very odd reactions. Do Not Track is a good example. I was extremely involved in trying to convince European consumer groups, or independent groups, to develop TPLs – tracking protection lists, a different and actually more effective way of blocking third-party tracking. And it was so hard. Then at one point we decided to turn Do Not Track on by default – I think it was when IE9 came out. And that triggered a whole revolution in the standards process, where we actually tried to agree across the board on standards, including the advertising industry as well as the consumer groups – and to a certain extent we got really bashed by both for doing, in the end, the right thing. The standard is still not finished. And the FTC, which was a big supporter at the time, has rather changed – in the end these are all political players, so they’ve changed focus; they focus on other things. So I don’t know exactly where this is going. It’s a little bit related to the privacy-by-default and privacy-by-design discussions.
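For context on the mechanism discussed here: Do Not Track is a simple HTTP request header ("DNT: 1"), and honouring it on the server side is a one-line check. A minimal sketch, with a hypothetical `should_track` helper:

```python
def should_track(headers):
    """Honour the Do Not Track preference: skip third-party tracking
    when the request carries the header 'DNT: 1'."""
    return headers.get("DNT") != "1"

print(should_track({"DNT": "1"}))  # False: user opted out of tracking
print(should_track({}))            # True: no preference expressed
```

The controversy the speaker describes was never about the mechanism, which is trivial, but about defaults (browser-set versus user-set) and about whether recipients are obliged to honour the signal at all.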
I personally believe – and this is very European-law-related – that the opinion recently published by Peter Hustinx’s team, on privacy in the context of consumer protection and competition law, is the right way to think about these things. We actually have principles under European law that are helpful, and there are a couple of court cases currently on whether those laws are applicable to data privacy policies. If so, they could be very helpful to data protection authorities. The fairness and legitimate-expectations limits in that area are actually more consumer-friendly and do not rely so much on consent.
So there are a couple of things you can think about on legal frameworks. And then there are commercial practices that can eventually be used at some point. Microsoft has made a commitment that we do not read Hotmail accounts to serve advertising; other companies do that. And it’s for the user to choose what they want. So you can compete on privacy, but at the end of the day it is the user who makes that choice.
>> JEAN-JACQUES SAHEL: Thanks. Can I just complain here about defamation? There is a fake quote of me on Twitter. So, just by the by.
So my rights have been breached somehow. Olivier, yeah?
>> OLIVIER CREPIN-LEBLOND: Thank you, it’s Olivier, the remote participation moderator. We have a question from Casper Bowden on Twitter. You may have seen it pop up on the screen. What is the EU legal protection for Microsoft or other U.S. corporations to hand over cloud data to the United States under the prism programme?
>> CORNELIA KUTTERER: Tell Casper that he can wait for the next panel. I’ll speak there, too.
>> JEAN-JACQUES SAHEL: Great. We’ll have at least one remote follower for the next event. Casper, please keep the questions coming. We love your questions. Yes, please.
>> Hello, my name is George, I’m from Cyprus, and I’m representing the New Media Summer School.
My question concerns privacy. I remember a U.S. court of law in December 2013 asking for the content of emails from the Microsoft platform – Azure, to be precise – concerning a U.S. customer, but with the data held in Dublin. And the data was, of course, given to the U.S. court of law. How does privacy legislation in the EU protect EU data that is stored in Dublin?
>> CORNELIA KUTTERER: I missed a little bit at the very beginning because I was distracted with Casper’s Twitter.
[Laughter]
>> Do you want me to rephrase that?
>> CORNELIA KUTTERER: I think I understand where you’re aiming. Microsoft moved European customer – Hotmail customer – data to our data center in Dublin around 2010, more or less; don’t quote me on that. We obviously have business, B2B, data there and elsewhere as well. You wanted to know how extraterritorial data access requests by governments can be reconciled with European data protection laws.
Let me start with what we are trying to do currently – it really goes a little bit into the panel discussion we are having in the next plenary, but let me say this much. There’s currently a court case going on at the New York District Court; we filed a submission last Friday. I’m not a U.S. lawyer, so forgive me for not being able to give you all the details. But the way we believe we can restore trust with customers has different pillars. One of them is forcing governments to act according to the rule of law. When we believe we have reason to think a request does not fulfill that requirement, we will fight it in court. And that is what we are currently doing.
The background to that case is precisely that the court issued a warrant on data which is hosted in our data center in Dublin. We have an amicus brief from one of the core negotiators of the MLAT between Ireland and the U.S., and an amicus brief from Verizon. We are hoping to get further amicus briefs from others, particularly in the EU, to force governments to use the proper legal procedures to get to data which they might actually have legitimate reasons to obtain.
>> RAINER STENZEL: On your question: Germany, as a reaction to the debate in the summer, made a proposal for the EU data protection Regulation for a new rule that if there is a request from a public authority in a third country to a company in Europe, the company needs the authorization of the data protection authority to transmit the data. So Germany made this proposal, and then the Parliament made the same proposal. That was one of the reactions.
So, when it comes to those questions, the scope of the data protection Regulation is limited. The Regulation is not about intelligence services and so on, because the EU has no competence in these fields. But what you can regulate is a restriction on companies in Europe transmitting data on request to third-country public authorities.
If I have the microphone in my hand, I just want to add one other point from the discussion a few minutes ago. The question was: how can we find more specific rules for specific sectors and specific services? And I think we agree in this room that it is more or less impossible for a legislator to find, at this moment, solutions at the same time for Google Street View and its cameras, for “Big Data” for insurance companies and the boundary of discrimination, and so on. So the only thing we can do is have more or less general rules. And this is what we already have in the ’95 Directive and what we will have in the new Regulation – maybe better, or more, principles than we have at the moment, something like non-discrimination, reasonable expectations and so on.
And then the crucial question is: how can we break this down to specific services or specific areas of data processing? How can we make tailor-made rules for specific services? And who should decide on them?
The proposal of the Commission was that it should be decided by the Commission, with delegated acts and so on. We thought this was not sufficient, because it takes a while before you have such a delegated act, and you cannot have one for every sector and every service. So our idea was some kind of bottom-up regulation on a second level, to specify and to have more detailed rules for specific situations and problems. You can call it a kind of self-regulation, but our proposal is not the old self-regulation we had in the existing ’95 Directive, or which is now in the proposal of the Parliament. It is a more sophisticated, more developed kind of self-regulation, with a procedure where the other interest groups are involved in the process of finding these rules for specific areas. I think this could be a way to react flexibly to different situations. And maybe, in the end, it’s a win/win situation – in a perfect world. Because there is an incentive for companies to have more legal certainty; they have an interest in the self-regulation, the codes of conduct. And citizens, users, consumers have an interest in bringing in their interests via, for example, consumer associations, technical expertise and so on.
So our idea was to have strong principles, strong basic elements, in the Regulation itself. But since it is impossible for a legislator at the European level to spell everything out in detail for every service, we need a second layer – and then we need to involve the data protection authorities in those processes and make it transparent; that’s another point. This was our answer to the challenge of getting tailor-made rules for specific situations. With those, for example, you could answer the question of whether you can be charged higher prices when using a different device for your online shopping.
>> JEAN-JACQUES SAHEL: Thank you very much. We are getting close to the end, and I actually really enjoyed that intervention just now, because I’d like to come to some sort of recommendations, or key take-aways, from the session. So I would like everyone in the room to think about one or two. If there are any final questions, please raise your hand now. Olivier, you’ve got one from the remote. You’ve got two. Are they from the same person?
>> OLIVIER CREPIN-LEBLOND: Two sizzling questions from Casper Bowden to Cornelia. So you might wish to punt some of them to the next session. [Inaudible] okay. So, Casper, if you’re hearing this, you’re loved.
[Laughter]
By Cornelia, at least.
So the first question is: does Microsoft, by design, have a key to any Azure or Office 365 stored data which can be used to hand over plain text to LEAs?
>> CORNELIA KUTTERER: No.
>> OLIVIER CREPIN-LEBLOND: No, okay.
And then the second one is: has Microsoft already handed over to the U.S. any data extracted remotely from EU data centers under FISA 702 or ECPA SCA? Microphone, please, yeah.
>> CORNELIA KUTTERER: We can say as much as we have disclosed in our transparency reports. So he can have a look at these.
And what we can say is that no data has been disclosed under those rules on our business partners. Have a look at our transparency report, where we are as clear as we can be under the current rules.
>> OLIVIER CREPIN-LEBLOND: There's more coming from Caspar. He could probably keep us here for the rest of the afternoon. So here's the next one: there is a WP29 opinion which says DPAs do have the competence to stop third-country spying. And he's put a link to it. I guess it's just to feed into the –
>> JEAN-JACQUES SAHAL: It’s a comment.
>> OLIVIER CREPIN-LEBLOND: No, it’s not to you anymore, Cornelia. You’re safe.
>> CORNELIA KUTTERER: I might, since it's the – we are open to discussing these things. For us, this is important as part of restoring trust. But I would at least note that in the revelation documents there was one singular piece that has actually proven that Microsoft has not disclosed data voluntarily at any point. And the reason we can prove that is because in the document in question the company names were struck out, and at that very time there was only one company which had the LCA (legal and corporate affairs) structure, and that was Microsoft. So we have not – we are a service company. We don't run networks. We have never, in any event, voluntarily handed over data, nor do we have back doors in our software to allow any such access. You can hear more about that in the next panel.
>> JEAN-JACQUES SAHEL: Thank you. I don't see any more hands raised. So at the next EuroDIG, for the follow-up session, I will invite Google, and Cornelia, you can stand in the audience.
What I’d like to do, just to close off –
>> Yeah, just one thing to close off. We had two hubs connected to us, the Albanian hub and the Romanian hub, and there were three other individual remote participants. We worked out the technical problems we had at the beginning.
>> JEAN-JACQUES SAHEL: Thank you very much for joining us. Thank you, Albania, Romania. What I'd like to do to close off this session is to ask each of the key participants for their key take-aways or key recommendations. We've had some already. There is a huge potential for cloud and "Big Data." And, as we've tried to explain, it is not just about economic gains for certain businesses; it's about gains for, yes, our economies, but also very much our societies and our daily lives. And we shouldn't lose sight of that. So how do we reconcile the fact that we've got huge benefits to gain as individuals and as societies, whilst we want to make sure that our rights are protected and that we have the right ethical constructs, if I can call them that, or ethical data usage? So maybe just a final word from each of the key participants. We will start with Arman, and then we will go around.
>> ARMAN ATOYAN: Well, thank you. Some closing words from my side, just elements I took from this discussion. When I hear discussions like this, I feel more engaged in keeping up the search for solutions to these problems. Because the point is that we need a new and modernized data protection or privacy law for Europe, and also globally, I would say. You need some key principles, because data processing is a global phenomenon at the moment. And I take some specific points from this discussion: the need for a more public or general debate on the opportunities of "Big Data" on the one side and the risks on the other, the reaction to these risks, and how we could develop these elements for the regulation. And there are other elements I haven't reacted to, such as what was said on consent and the different approaches in the U.S. and in Europe. That is a remarkable point as well.
So we feel engaged and will work even harder than we already do, my colleagues who are sitting here; they are looking forward to doing so, I guess. And, yeah, I think we have to see not only the risks but also the opportunities. But we have to focus on the risks when it comes to data protection, and we have to find a way to solve them.
>> JEAN-JACQUES SAHEL: Thank you very much.
>> CLAUDIA: Yeah, I will just reiterate it. One of the comments was to create flexible policy frameworks that allow innovation to continue, so that we can keep benefiting from the evolution and the next wave of innovation.
>> Thanks for attending. And it was really useful, because before we had not really focused on data privacy, and we will think about it more. And I think, again, I should repeat: this has a lot of advantages. We just need to focus on the rights, and we just need to have a clear definition of everything. But we do not need to be afraid of sharing and collecting data. Thanks.
>> CORNELIA KUTTERER: We should build on the '95 privacy principles, which have proven quite useful, and develop them accordingly. I agree with Claudia: it needs to be flexible so as not to hinder innovation, but we also need strong protections for society. And let's discuss this in more innovative ways and think about the ethical issues, as I mentioned before.
>> JEAN-JACQUES SAHEL: Thank you. I don't have a microphone anymore. Well, okay. I think these were excellent concluding remarks, so I'm not going to add anything to that except to thank the key participants very much. Clearly we could be discussing this issue for far longer. There's much more information I could share, for example about the EU's cloud strategy, which is looking at codes of conduct, and about principles-based regulation. And that's where we need to continue the dialogue, at EuroDIG, at national IGFs, and generally any time we can. Thank you very much to all the participants, and I'll see you shortly at our next workshop. Thank you.
[Applause.]
Pictures from working group
Session twitter hashtag
Hashtag: #eurodig_ws4