Freemium Dating Apps: Risks and Opportunities for Dating in the Digital Age – Bigstage 04 2022


22 June 2022 | Start 13:15 CEST | SISSA Main Auditorium | Video recording | Transcript
BigStage 2022 overview

Session teaser

Everyone has heard about GAFAM, but online dating apps like Tinder, Badoo, Lovoo, etc. often play a much more important role in the everyday lives of individuals. Yet this seemingly private topic is rarely discussed in political fora. We are here to change that by inviting youth representatives, free dating platform spokespeople and academics to our Big Stage session at this year’s EuroDIG*. Topics include security, privacy, algorithms, and mental health.

Session description

As of now, the session centres around four key areas:

Security: Many users are concerned about their security when meeting people in real life, and they would like platforms to provide them with more tools to ensure their safety. For example, passport verification and recommendations on how to stay safe would be welcome.

Privacy: Often, privacy policies on dating platforms are not very transparent. Providing users with more options for handling their own data would be welcome, as would knowing with whom it is shared. For instance, this could be part of a paid subscription model.

Algorithms: Unfortunately, the way the dating platform algorithms work is not very transparent either. Tinder used the Elo rating system in the past but now uses a different one (a minimal sketch of an Elo-style update follows this list of key areas). Since online dating has real-life consequences, many users would like to know more about why a specific pool of people has been suggested to them.

Mental health: Some users are not very popular on these platforms. This can be a problem for the self-perception and the self-confidence of young people. It can also create unhealthy beauty standards. Users are nudged into paid schemes to increase their popularity.
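
For background on the Algorithms point above: Tinder is reported to have used an Elo-style score in the past, and the sketch below shows only the standard Elo update formula as general context. The K-factor of 32, the starting scores and the function names are illustrative assumptions, not a description of any dating platform's actual system.

    def expected_score(rating_a: float, rating_b: float) -> float:
        """Probability that A 'wins' the interaction (e.g. receives a right swipe),
        per the standard Elo formula."""
        return 1.0 / (1.0 + 10 ** ((rating_b - rating_a) / 400))

    def elo_update(rating_a: float, rating_b: float, a_won: bool, k: float = 32.0):
        """Return both updated ratings after one interaction; k controls how fast scores move."""
        exp_a = expected_score(rating_a, rating_b)
        score_a = 1.0 if a_won else 0.0
        new_a = rating_a + k * (score_a - exp_a)
        new_b = rating_b + k * ((1.0 - score_a) - (1.0 - exp_a))
        return new_a, new_b

    # Example: an "upset" (the lower-rated profile gets the like) moves scores
    # more than an expected outcome would.
    print(elo_update(1400, 1600, a_won=True))  # roughly (1424.3, 1575.7)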

Overall, the challenge is to find a good business model that minimises negative impacts on individuals. The hope is to introduce consumer-friendly competition into the free online dating market. This would also pre-empt regulation from governments and give such companies a competitive advantage. Mentimeter will be used during the session to encourage audience input.

Format

Big Stage: A fireside chat with audience involvement via Mentimeter.

Further reading


People

Key participants:

Moderator/Coordinator:

Video record

https://youtu.be/qFEpZUpEML8?t=12125

Transcript

Provided by: Caption First, Inc., P.O. Box 3066, Monument, CO 80132, Phone: +001-719-482-9835, www.captionfirst.com


This text, document, or file is based on live transcription. Communication Access Realtime Translation (CART), captioning, and/or live transcription are provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings. This text, document, or file is not to be distributed or used in any way that may violate copyright law.


>> NADIA TJAHJA: Next we’re going to the BigStage discussion, Freemium Dating Apps: Risks and Opportunities for Dating in the Digital Age. I ask Fabio Monnet to come to the stage. Hello. Great. This works.

>> FABIO MONNET: Good afternoon, everyone.

I would like to ask the people that are in the room here, since we’re quite a small group to maybe get a bit closer to the middle.

I know some people from EuroDIG – the YOUthDIG participants – will join us later; they’re taking a photo outside right now.

Are we ready to go? Great, okay.

I see Nicolas. Hey.

Maybe we’ll wait for Jessica who will join us soon as well.

>> I’m already here.

>> FABIO MONNET: Okay.

Can we get Jessica back on screen? Perfect, now we have everyone.

Great, nice to see you.

So I’m actually excited about this session. I think the topic of freemium dating apps is something that is unfortunately not discussed very much at Internet governance forums. The focus is usually more on big tech corporations like Google, Facebook, Amazon and so on. I think it is important that we raise this issue, especially since online dating has become very popular. It is growing in popularity, even here in Europe, and during the coronavirus period, when meeting people in person was more difficult, it has of course grown even more in importance.

I’m excited to welcome our speakers here. Going from top to bottom on the screen: first of all we have Nicolas, who has developed a dating application without profile pictures.

We have Jessica, a well-known researcher in the field of dating applications and dating algorithms.

Yeah. Very cool to have you here.

Finally we have Amanda, joining us from South Africa, if I’m not mistaken. She is part of the NGO Equality Now, which promotes women’s rights, also in the online space.

Maybe you can each say a few things about yourselves in a few words. Let’s start with Nicolas.

>> NICOLAS: I’m from Switzerland. I founded my dating app in 2019. It is a dating app without images. The reason I created it is that I had another social media platform before, where you could chat anonymously, and a lot of people were chatting about dating – it was like a public forum where you ask questions, and people would ask about a date, things like that. So if people are interested in dating without images even on a platform that isn’t made for that, why not create a platform that is meant just for dating without images? That’s why I created this and that’s the reason I’m working in this sector.

>> FABIO MONNET: Thank you very much. I’m excited to hear more about it. I think it is really a great idea, since most dating applications are based a bit on superficiality, I would say; maybe this goes in a different direction, not based just on physical appearance.

Jessica, maybe you want to say a few words about you.

>> JESSICA: Thank you for organizing this, Fabio. I followed the rest of the conference and it was really interesting.

I’m Jessica Pidoux, a post-doctoral researcher in Paris, where I am working on a citizen science project studying cooperation practices between citizens and researchers. I’m also Director of PersonalData.IO, an NGO fighting for personal data rights; we build data governance models with engaged, concerned communities around specific themes like dating, mobility and gig work. My doctoral thesis was on dating apps and dating app algorithms, where I focused not just on the matching algorithmic system but on how the matching process is produced between multiple stakeholders: the developers, the users, the algorithmic systems, and also the graphical user interfaces.

>> FABIO MONNET: It is nice that you also mention the graphical interfaces, because I think the interface design nudges users into certain kinds of behaviour – we will see that – for example, it is built to keep people swiping, to continue to swipe.

Then we have finally Amanda.

Welcome.

>> AMANDA: Thank you so much for having me here today.

I am Amanda and I’m a digital law and rights advisor at Equality Now, based in South Africa. Equality Now is an international human rights organization with the mission to achieve legal and systemic change that addresses violence and discrimination against all women and girls around the world. I work at the intersection of tech and law, and I provide digital rights and legal expertise for campaigns to end online sexual exploitation and abuse and to make the digital world a safe, secure and equal place for women and girls.

I look forward to today’s discussion.

>> FABIO MONNET: For the audience, here is the basic plan for how we’ll proceed. The idea is to focus on four big topics, and we’ll start with cybersecurity. I will ask a question to one of the input speakers and then we’ll have a more open discussion where people here in the room, and online as well, are explicitly invited to raise their voices.

The first topic is cybersecurity – the cybersecurity of users of online dating apps – followed by questions of privacy, then a question about how the algorithms work, and finally equality on these platforms.

Yeah, I would like to start with this: we know that sexual violence is a big problem on these dating platforms. What do you think platforms like Tinder, Bumble and other stakeholders could do to improve the cybersecurity of the apps, especially for women but also for any user?

>> AMANDA: Some of these things can be done, and some are already being done; I will highlight what the platforms are doing and some of the challenges that we’re seeing.

The first thing dating apps could do is background checks: simply matching a new profile against any register that exists in the country. This is usually done on a lot of the paid sites but is not really offered on the free apps. The challenge here is that quite a lot of countries don’t keep a sex offender register. The other thing platforms could do is to work together with gender-based violence protection groups, or any groups working with survivors of any form of violence that has occurred on these or similar platforms.

That would help them see what some of the pathways are, what challenges users are facing, and how ready they are with the features. Features such as ‘Are You Sure?’, which simply asks, are you sure you want to send this message – having that feature is great. Tinder has it, but even Tinder has found that it is only 10% effective: 90% of the time, someone will still send the content they want to send anyway. There is also the ‘Does This Bother You?’ feature: if you want to unmatch with someone, you’re asked a series of questions, such as did they do anything to offend you, did they send any offensive or inappropriate content or message to you – all of these questions to make sure there wasn’t any harm to you in any way before you actually say, no, I’m not interested in this particular match.

Having the ability to block a user is another way platforms can assist. Most important, these features and measures have to be easy and accessible. As a user, you need to be able to find the features easily and to report any form of abuse on the platform; reporting has to be a lot easier, and the apps should be proactive when they respond – not just sending a generic response, but being quick and making sure they actually read what the harm is, what the report is, and then getting back to you, rather than saying ‘we have received it, we’ll contact you within 48 hours’ and never doing so. These are some of the things the apps can technically do.

>> FABIO MONNET: Okay. That’s already a lot of ideas. I especially like the one you mentioned at the beginning: the apps could work more with a trust system and with countries to see where the repeat offenders are. Also, as you said, making it much easier for users to report things and to be sure that there is a follow-up. A lot of the time it is just a report, you don’t get any explanation of what happened, and it is not very clear on these platforms.

Maybe I can pass this on to Nicolas. You have a dating app. How do you perceive this problem on your platform? Do you have something to add on this?

Nicolas.

(Speaker on mute).

>> FABIO MONNET: You are muted, I think.

>> NICOLAS: Well, actually I think the first point that Amanda mentioned, concerning the list of dangerous people, is interesting, but I don’t think companies will get access to these lists. They’re not public. So it would be very difficult to really have this feature.

Concerning my app, of course there are features to ban users, to report users and also to block users. We really look at reports in a timely manner to check what happened and what exactly was reported, look at the profile or the incident that was reported, and then directly block the user from the platform so things like this can’t happen anymore.

What we also do is keep a list of dangerous words and things like that.

For example, if users put words like this on their profile, the profiles are flagged and looked at in advance, so that we can try to prevent as much as possible before anything happens. These are important points. You cannot prevent everything, even if you try to, but I think it is the task of dating app providers to do as much as they can.

Yeah. I think that’s important.
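
[Editor’s note: to make the point about word lists a bit more concrete, here is a minimal sketch of keyword-based pre-moderation, in which profiles whose text contains entries from a configurable watch list are held for manual review before publication. The word list, function name and review queue are hypothetical illustrations, not a description of the app’s actual implementation.]

    import re

    # Hypothetical watch list; a real deployment would maintain and localise this.
    DANGEROUS_WORDS = {"example_threat", "example_slur"}

    def flag_profile_for_review(profile_text: str, watch_list=DANGEROUS_WORDS) -> bool:
        """Return True if the profile text contains any watched word and should
        be queued for manual review before it goes live."""
        tokens = re.findall(r"[\w'-]+", profile_text.lower())
        return any(token in watch_list for token in tokens)

    # Example: flagged profiles go to moderators instead of being published directly.
    new_profiles = ["Hi, I like hiking and cooking", "something with example_slur in it"]
    review_queue = [p for p in new_profiles if flag_profile_for_review(p)]
    published = [p for p in new_profiles if not flag_profile_for_review(p)]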

>> JESSICA: Can I comment on that?

>> FABIO MONNET: Of course.

>> JESSICA: Yes. I think it is very important that we bring dating apps into political discussion, because dating apps are political – also because of the data that they collect. Dating apps collect sensitive data: sexual orientation, ethnicity, political orientation, a granular description of your body, which is at the source of aggressions. Not only the picture, but how you define yourself, and it is very sexualised in some apps; you can even describe your underwear.

What we have seen in a data collective that I lead, called Dating Privacy, where we work with users of dating apps and victims of sexual violence, is that one of the problems is that the algorithmic system reinforces the exposure of these victims to sexual predators.

The algorithm learns what kind of women this type of aggressor – we have identified mainly men – likes, and the women, without knowing it, keep liking these types of men, who are continuously presented to them on the app.

I’m a sociologist with an interdisciplinary background in engineering and data science, and I can explain how the algorithms, the apps and the data that is collected are reinforcing this kind of aggression. But I have a question for Amanda. The problem we saw in Dating Privacy with some victims who contacted us is that they were raped at a face-to-face meeting and didn’t keep a trace of the contact; the match was deleted, so the victim didn’t have any trace at all. We advised them: you have to recover your data, you have data access rights. Tinder didn’t give anything. Tinder is even involved in these decisions, in its dialogue with the justice system and the police, and what happens is that the police do not defend the women. It is a very, very sensitive, touchy context.

Two people meet, to have sex or to establish a relationship, and it is very hard to draw the line – at least for the police, it is very hard to judge when it is really harassment or violence and when it is not.

Then the second problem we saw is that, from a legal aspect at least, it is very hard to make a digital case.

What are your thoughts on that?

>> AMANDA: I’m glad you raised those questions.

One of the big challenges, not only around dating apps but around online platforms in general, is liability. In a lot of countries, and in the EU, the stance that is applied is that as a platform you are not necessarily required to actively moderate all content, but you do need to look out for content that is against the law – content that would constitute child sexual abuse material or terrorism, or other defined categories of content you should be on the lookout for.

Apart from that, they are usually only liable in instances where they were made aware of the harm or should reasonably have been aware of it. ‘Reasonably aware’ from a legal perspective usually means either that the content went viral – so many people became aware of it that there is no way the platform wouldn’t have known – or that it was reported, and there are issues around reporting when it comes to sexual violations. As mentioned, it could start off with an image being shared that was inappropriate and that you didn’t ask for; sometimes it ends up with you meeting this particular person and them violating or harming you in some way in the physical space. And the question is what laws apply, and this is where the challenges come up. If you want to make a request for the data, for the chats – a lot of this usually happens in the chats, not necessarily on the person’s profile – you make what a lot of platforms call a data request, and many platforms will ask that this request is made through law enforcement.

One thing we are asking for at Equality Now is that this be expanded: not just coming from the police saying ‘we have a report, can you please share this particular information with us’, but also from an interested party, or from the victim themselves. That’s one.

Two, it is still very difficult to get this data. It is much easier when children are involved; the moment it concerns an adult woman, there are challenges around getting the data you would need to obtain the evidence you would need to bring to court. There are a number of steps a victim or survivor would have to go through. It also goes back to our laws.

One thing the platforms could do: if I, as a user, delete messages or other things, that information could still be kept, and it could be made easier for me to access it when I need it; that information should be kept for a reasonable period.

You’re right, there are challenges with that, but victims sometimes don’t come forward within a year, sometimes two years, and you need to bear that in mind. And messages may be deleted for a number of reasons. On my side as a user, it is easier if, when I make a data request, I am able to get that particular message, because sometimes without that information I can’t make a case against the perpetrator, even with a law in my country that says this person was harassing me online and that I should be able to find out who that individual is, even if I have never met them in person. I need a lot of information from the dating app itself.

Thank you.

>> NICOLAS: Keeping the data – that is the problem, because you cannot simply not delete it if someone requests that their data be deleted. There are privacy laws, policies and things like that which say the user has a right to have all their data deleted. If a user says, for example, ‘delete my profile, delete everything’, you cannot just not delete the data, because that would not be allowed for privacy reasons. So I think simply not deleting chats when someone wants them deleted wouldn’t work. How do you see that, Amanda?

>> AMANDA: That’s an interesting dimension you have raised: an individual requests that you delete their data, which is well within their rights. This is where you would need to balance the right to privacy against the right to protection online. In the instance where someone has come to you as the app and said ‘I want to delete this’, you’re not aware that they have done something wrong; they have just made a request.

I cannot give an answer to that. I suppose this is where different minds come together and find a way to make sure we’re still able to keep people safe online. Again, the online space is just mimicking what happens in the real world, in real life.

It is quite challenging. I’m glad we’re having this discussion and would love to carry it on.

>> JESSICA: I have one proposal.

>> FABIO MONNET: We have to keep an eye on the time as well. Maybe one last point on this one.

>> JESSICA: Another value for new apps is to think about reciprocal systems, not just designing an app one way, like Amazon or Netflix. If you do it for matching, you must be able to do it for all of the decisions: should I delete this chat or not, should I take this into account or not. You take the decisions of both parties into account. Another proposal is to create a real moderation chain in the relationship process that is established online.
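
[Editor’s note: one way of reading the proposal about reciprocal systems is sketched below, as a decision – such as deleting a shared chat – that only takes effect once both parties have approved it. The class and field names are hypothetical; the remarks above do not specify any implementation.]

    from dataclasses import dataclass, field

    @dataclass
    class ReciprocalDecision:
        """A decision (e.g. deleting a shared chat) that only takes effect
        once both parties in the match have approved it."""
        action: str                      # e.g. "delete_chat"
        approvals: set = field(default_factory=set)

        def approve(self, user_id: str) -> None:
            self.approvals.add(user_id)

        def is_effective(self, party_a: str, party_b: str) -> bool:
            return {party_a, party_b} <= self.approvals

    # Example: the chat is only removed for both sides once both users agree.
    decision = ReciprocalDecision(action="delete_chat")
    decision.approve("user_a")
    print(decision.is_effective("user_a", "user_b"))  # False: user_b has not approved
    decision.approve("user_b")
    print(decision.is_effective("user_a", "user_b"))  # True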

>> FABIO MONNET: Thank you for this. Maybe we’ll go for one more topic, and maybe it is better to ask a broader question: what would the perfect swipe-based dating application look like in terms of taking into account the cybersecurity and privacy of users – where, as we have already seen, there is tension – while also making sure that the algorithms don’t show you results that could reinforce violence or reinforce demographic patterns in how people meet?

>> JESSICA: Can I answer? I think it goes with the two statements I made before: thinking about reciprocity systems and about the cybersecurity of the relationship throughout the whole process – when creating a profile, when having conversations online, when deciding to meet offline, which some apps even ask about: did you meet offline or not. The apps are actually configuring relationships online and offline; this is important. I think another way of creating better dating apps is to engage with real users. What I have seen is that developers and founders create their apps based on their own experience and disconnect from the users’ interests. You discussed earlier in the conference how to create multistakeholder approaches in the plural. This is why considering users’ interests matters. This is what we do with Dating Privacy: we give users a voice and a space to say what the problems are, what the concerns are, and what they can do about the dating apps. This should not only be done with NGOs like ours, but with the dating companies themselves directly.

>> FABIO MONNET: That’s an interesting idea: to bring the multistakeholder approach to dating apps and implement it in a way where people can give feedback, and where this is then really followed up. Yeah.

You wanted to say something?

>> NICOLAS: Yes, I would also like to say something.

This is not only the task of the dating app companies. Most dating apps are distributed through the App Store from Apple and Google Play from Google, which have their App Store review guidelines and Google Play review guidelines. I think companies like Apple and Google – because they already have long guidelines, terms and things like that – could, for example, add more requirements for dating apps, so that there is a general standard to enforce. If you are a dating app company, it is very difficult and takes a lot of time and monitoring to implement new features. It would be easier if there were standards that every dating app has to meet to improve these things. So, to conclude, Apple and Google should enforce this a bit more.

>> FABIO MONNET: Okay. Excellent point. I guess Google and Apple, through their stores, could basically set the standards. Also, in terms of search results, maybe they should not directly favour applications that are already very popular – and I guess there are some connected interests there.

Amanda, what would your vision of an ideal dating app be?

>> AMANDA: Well, I have to agree with the points that have been raised. Definitely, it means applying a human rights-based approach in the policies adopted by the dating apps. I think this is a situation where you have to balance the different rights and the different interests of all users, and it all goes back to having the user in mind when developing the app. It is useful; as users, we do want to use it, and we want to make sure it is safe and secure.

>> FABIO MONNET: We have heard about critical aspects of online dating. Do you think there are also a lot of chances or opportunities that the online dating space opens up?

>> NICOLAS: Of course there are.

The users of the apps want to find love, so there is a good opportunity to find new people, to get to know new people. People are not using the apps just because they are bad and cannot provide many advantages, even though, of course, there are risks.

I think if the advantages weren’t there, people would just not use them and the apps wouldn’t be that popular. So there are also very many good reasons to use the apps.

>> FABIO MONNET: Thank you. We’re running out of time. Maybe there is an audience question from online? There is no question there.

Well then, thank you very much for the discussion. I would love to go on, but unfortunately we don’t have much time here. Everybody, let’s stay connected and continue the discussion, maybe via email.

What I also wanted is to give you all the floor for a one-minute statement summarising what you think is the most important message the audience here should take away. Maybe Jessica, you can start.

>> JESSICA: I mean, that’s hard in such a short time.

I think there was this comment about Apple and Google that we have to take seriously. They now dominate the market for mobile app development. They set the standards on pricing – by the way, there is new regulation for dating apps in the stores. This means that innovation is limited: companies that create dating apps are subject to the power of Google and Apple to dictate how to create dating apps and, in general, how to innovate.

Unless we create a new power – a collective power with society and with new business models out of this power – we are not going to find the solutions.

>> FABIO MONNET: Thank you very much.

>> NICOLAS: Today’s discussion has shown many different aspects of the problems with dating apps, and it is important that dating companies know their responsibilities and try to prevent as much as possible. I think it is also the task of every user to be cautious when using dating apps and when going on dates. In the end, I would say there are many good reasons to use dating apps. Don’t fear them or think that they are something bad. If you use them, don’t take them too seriously and remember that it is just a game.

Of course, never forget to go through the real world with open eyes; in fact, the place you’re most likely to find your perfect match is in the real world.

>> FABIO MONNET: Thank you very much.

Amanda.

>> AMANDA: I want to say that the platforms – the dating apps, the social media platforms – have a responsibility to make sure that people are safe. It is the same as a town square you bring people to: you are asking users to use your specific platform, so it is always good to make sure you are keeping them safe, that you have some form of reporting mechanism that is easy to use, and that you cooperate with both law enforcement and victims who report incidents.

>> FABIO MONNET: Great.

This session was very interesting. Thank you for coming, for joining me here.

Yeah. I wish you the best of success in all of your endeavors.

>> NADIA TJAHJA: Thank you. Thank you very much, Amanda, Jessica, Nicolas, and thank you, Fabio, for the moderation. It was a very interesting session.