How does Tinder develop the features that keep you safe?
Mashable

Who decides what we need to stay safe online? And how do they know what features we'd benefit from?

At Tinder, one person playing an integral role in the dating app's safety features is Rory Kozoll, Tinder's senior vice president of product integrity. Kozoll leads the team that develops in-app tools and resources which aim to keep users' interactions respectful and safe.

Tinder has launched a slew of new safety updates and features, most notably a long-press reporting function that lets you tap and hold chat messages to start the reporting process directly. This means it's now easier to flag harassment, hate speech, or any other offensive messages that violate the app's Community Guidelines.


According to a recent survey conducted by Opinium on behalf of Tinder, 72 percent of 18- to 25-year-olds are as concerned for their emotional safety as they are for their physical safety. The survey, which looks broadly at online interactions, also found that 40 percent of 18- to 25-year-olds have witnessed hate speech online, and that 30 percent of people admit to sending harmful messages online that they later come to regret. Alongside the new reporting tool, Tinder is expanding its existing 'Does This Bother You?' and 'Are You Sure?' features to broaden its categorisation of hate speech, harassment, and sexual exploitation.

For women and marginalised genders, being on dating apps or social media, or simply existing online, can come hand in hand with sexual harassment: receiving non-consensual, unwanted sexual messages and experiencing violations such as cyberflashing.

How does Tinder know which safety features users need?

Kozoll spoke to Mashable about how Tinder's safety tools are developed and the four main sources of information that feed into the process.

"Our members will tell us something has bothered them and that will give us the signal that we need to unpack and try to understand what the offence may be, and how we can be a part of diminishing that offence," he says. "The second source is the things we can see very clearly in our data. And the third is we work with a lot of outside partners, both in the gender safety space and in the LGBTQIA space and other underrepresented groups, to inform us."

The fourth source is "a little bit more art than science," Kozoll says, referring to product intuition. Tinder's own employees use the app themselves, then report back and discuss their experiences to inform what they think needs to change on the platform.

Tinder's 'Does This Bother You?' feature came from a real-life experience.

In the case of Tinder's 'Does This Bother You?' feature, a real-life incident led to this tool being introduced on the app. The tool uses machine learning to flag potentially offensive messages, prompting an automated message to appear for message recipients when harmful language enters a conversation. With this prompt, users have the instant option to report the bad behaviour should they wish to.
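The described flow, a scorer checks each incoming message, and a flagged message triggers a prompt for the recipient, can be sketched in a few lines. Tinder has not published its implementation, so the sketch below is purely illustrative: a toy keyword check stands in for the real machine-learning classifier, and all names and terms are hypothetical.

```python
# Illustrative sketch only -- a toy keyword scorer stands in for
# Tinder's actual (unpublished) machine-learning model.

OFFENSIVE_TERMS = {"stupid", "ugly", "hate you"}  # hypothetical list

def flag_message(text: str) -> bool:
    """Return True if the message looks potentially offensive."""
    lowered = text.lower()
    return any(term in lowered for term in OFFENSIVE_TERMS)

def deliver(text: str) -> dict:
    """Deliver a message; attach a 'Does This Bother You?' prompt if flagged."""
    event = {"message": text, "prompt": None}
    if flag_message(text):
        # The recipient sees the prompt and can choose to report from it.
        event["prompt"] = "Does this bother you?"
    return event
```

The key design point the article describes is that the prompt goes to the recipient rather than blocking the sender, leaving the decision to report in the hands of the person affected.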


Prior to this feature being released, Kozoll and his team had been looking into categories of offensive messages. When it comes to what Kozoll describes as more forward talk (read: sexually explicit messages), the key factor to consider is consent.

"People may open the door to, let's say, more forward talk. We want to make sure that we're always toeing the line between keeping everybody safe and making sure everybody's comfortable, and also not imposing ourselves and our own values upon our members," he says.

Kozoll says he and his team are constantly observing real-life examples of the problems people may encounter on the app.

"I was out to dinner with my wife, walking to a restaurant in Santa Monica. This car drives by with these young guys and one of them leans out a window and catcalls. When I turned around, I could see there was a young woman by herself walking behind us. You could just see her visibly become uncomfortable with the guys catcalling," he explains. "They kept driving, and out of instinct I just turned around and said, 'Hey, are you OK? You want to walk with us?' Turned out she was walking to the same restaurant." In that moment, Kozoll's wife told him, "You don't know how rare it is for somebody to actually just ask 'are you OK?'"

"That was the seed. Just because we don't know for sure that these messages are problematic for this person, it never hurts to just ask them if they're OK. And that's where 'Does This Bother You?' came from," he adds.

What actual role does Tinder want to play here?

When it comes to the challenges Tinder's team faces when weighing safety needs, Kozoll says it's about figuring out "where the right line is" between keeping everybody comfortable and giving them the freedom to express themselves and have the kind of conversation they want to have.

"We see ourselves as the host of a party and we've invited all of these guests. We hope that people will hit it off and that they'll meet somebody exciting and new. We're not there to tell people how to talk to each other. But if somebody looks across the room and gives us the look to say, 'Hey, I'm really uncomfortable here,' we have to step in and help resolve the situation. Sometimes that means asking somebody to leave the party, and that's the role we try to play," he says.


So, why has Tinder widened the scope when it comes to hate speech? Kozoll says it has to do with the ways in which language evolves in society.

"Language is constantly evolving, emoji are constantly evolving, people are getting more and more creative. They're not trying to evade anything we're doing, but the language is changing all the time, and so we're having to adapt really rapidly to that," he says.

"As we evolve our understanding, we're going to be constantly updating these models," Kozoll adds. "This is a forever stream of work, evolving these machine learning models and keyword lists to make them better at identifying the context that these words are showing up in, and the new words that are showing up in the lexicon as well."
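Kozoll's point about context can be illustrated with a toy example: the same word may be benign in one conversation and harmful in another, so a bare keyword list over-flags. One simple mitigation, checking nearby words before flagging, might look like the hypothetical sketch below (the keyword, the "safe context" words, and the window size are all invented for illustration and are not Tinder's actual lists or logic).

```python
import re

KEYWORDS = {"dog"}  # hypothetical term that is only sometimes an insult
SAFE_CONTEXT = {"my", "pet", "walk", "cute"}  # hypothetical benign signals

def flag_with_context(text: str) -> bool:
    """Flag a keyword only when no nearby word suggests a benign reading."""
    words = re.findall(r"[a-z']+", text.lower())
    for i, word in enumerate(words):
        if word in KEYWORDS:
            window = words[max(0, i - 2): i + 3]  # two words either side
            if not any(w in SAFE_CONTEXT for w in window):
                return True
    return False
```

So "you absolute dog" would be flagged while "walking my dog today" would not, which is the kind of context-sensitivity Kozoll says the models are continually being tuned toward.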
