Dating App Filteroff matches scammers with bots

Tinder is reaching its 10th anniversary, which also marks nearly a decade of constant attempts to fend off bots and scammers on the platform.
Photo: KIRILL KUDRYAVTSEV/AFP (Getty Images)

It’s been almost 10 years since swiping left became the gesture of choice for millions of daters with the advent of Tinder. Ever since the app first appeared on September 12, 2012, it seems every dating app has tried to find the best way for users to browse potential dates at a pace that can set even the best-intentioned thumbs on fire.

Although it’s been a decade, what so many of these apps still struggle with is managing bot, spam, and scam accounts. Gizmodo has already reported on the thousands of people who told the FTC they were scammed on Tinder during the pandemic. Innocent people looking for love have reported being scammed out of tens of thousands of dollars, tricked into handing over credit card information, or even threatened after refusing to pay. It took Tinder until last year to introduce identity verification for most of the app’s global user base, and it remains voluntary for the majority of users. The idea of being conned on a dating app even got its own world premiere thanks to the popular Netflix documentary The Tinder Swindler.

Bots and scammers are rampant on dating apps. A 2017 study published by researchers at the University of Southern California pointed out that determining whether a user is a bot is particularly difficult because there are few ways to view users’ profiles without interacting with them. These accounts often look more legitimate than not, complete with original photos and linked social media accounts. Scammers are even tougher to stamp out, because even when a predatory account is deleted, they can easily return to the platform under an entirely different identity.

Well, one dating app has a new approach to fighting scammers and bots on its platform: turning them against each other. In an August blog post, Brian Weinreich, co-founder of the video-centric dating startup Filter Off, said that when a suspected scammer signs up for the site, they’re placed in a so-called “Dark Dating Pool” away from other users. The developer said his small team flooded that pool with GPT-3-based chatbots and collected the most hilarious examples of scammers attempting to rip off a being incapable of compassion or love (sorry, but AI just isn’t there yet).
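Weinreich’s post doesn’t describe how the decoy bots are wired up, but the general shape of a GPT-3-backed decoy is easy to sketch. The snippet below is a hypothetical illustration only, using the legacy openai Python client’s Completion endpoint from the GPT-3 era; the persona prompt, model name, and parameters are assumptions for illustration, not Filter Off’s actual setup.

```python
# Hypothetical sketch of a "dark pool" decoy bot that answers a scammer's
# message with a GPT-3 completion. This is not Filter Off's implementation;
# it only illustrates the idea, using the legacy openai Completion API.
import openai

openai.api_key = "sk-..."  # placeholder

PERSONA = (
    "You are a friendly, slightly scatterbrained person on a video dating app. "
    "You never send money, gift cards, or personal details, and you keep the "
    "conversation going with vague, cheerful small talk.\n\n"
)

def bot_reply(scammer_message: str) -> str:
    """Generate a stalling reply that keeps a suspected scammer busy."""
    prompt = PERSONA + f"Them: {scammer_message}\nYou:"
    response = openai.Completion.create(
        engine="text-davinci-002",  # a GPT-3-era model; the exact model is an assumption
        prompt=prompt,
        max_tokens=60,
        temperature=0.9,            # higher temperature makes replies more rambling and human-ish
        stop=["Them:"],
    )
    return response.choices[0].text.strip()

print(bot_reply("Are you on WhatsApp? Send me a $40 gift card and I'll call you."))
```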

Weinreich wrote that all chats are encrypted and that the team “err[s] on the side of caution” when placing users in the Dark Dating Pool, which could mean some potential scammers slip through to the regular one. In an interview with TechCrunch on Wednesday, Weinreich said he uses algorithms that flag accounts based on the patterns fraudulent users most often follow when signing up for the app. Oddly enough, these scammers will apparently even try to scam each other, arguing among themselves over who should send whom a $40 gift card.
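Neither the blog post nor the interview spells out those sign-up signals, so here is a purely hypothetical sketch of what routing suspicious sign-ups into a shadow pool could look like. Every signal, field name, and threshold below is an assumption made for illustration; none of it comes from Filter Off.

```python
# Hypothetical sketch of routing suspicious sign-ups into a "dark dating pool".
# The signals and thresholds are invented examples of common fraud heuristics.
from dataclasses import dataclass

DISPOSABLE_DOMAINS = {"mailinator.com", "tempmail.net"}  # assumed example list

@dataclass
class SignUp:
    email: str
    bio: str
    ip_country: str
    profile_country: str

def risk_score(s: SignUp) -> int:
    """Count crude red flags on a new account; higher means more suspicious."""
    score = 0
    if s.email.split("@")[-1].lower() in DISPOSABLE_DOMAINS:
        score += 2                      # throwaway email provider
    if s.ip_country != s.profile_country:
        score += 1                      # claimed location doesn't match the IP
    if "gift card" in s.bio.lower() or "whatsapp" in s.bio.lower():
        score += 2                      # classic romance-scam phrasing
    return score

def route(s: SignUp) -> str:
    """A high threshold keeps borderline accounts in the real pool (erring on caution)."""
    return "dark_pool" if risk_score(s) >= 3 else "real_pool"

print(route(SignUp("love4u@mailinator.com", "msg me on WhatsApp", "NG", "US")))
```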

“We probably have over 1,000 scammers that I know of who are only actively talking to bots,” Weinreich told TechCrunch.

Although Gizmodo could not independently verify much of what the developer claims, reading the posted chat logs between bot and scammer is like watching the Aliens vs. Predator of the scummy dating scene. The bots are so stuffed with canned auto-answers that even a basic question can produce repeated or outright contradictory responses. How does a bot respond to “Are you on WhatsApp?” Well, first it says “no,” then “no,” then “no,” and finally “yes.”

Here are some of the best clips we found of a scummy scammer meeting their match in the most stubborn adversary imaginable: an unfeeling AI chatbot.
