
I gave my Tinder lover $22,000 to invest in crypto – then I discovered the truth


A TINDER scammer has defrauded a victim of thousands of pounds by posing as a young, beautiful and wealthy woman from New York.

This dream girl was part of a new wave of AI chatbots, plaguing dating apps with fraudulent money-making schemes that have left a trail of heartbreak in their wake.

AI-generated images, videos and chats are becoming more sophisticated (stock image)

Peter first came into contact with the AI fraud when he used Tinder. Credit: Alamy

Peter, 43, who did not want to give his surname, told The Sun how he was scammed out of $22,000 (£17,337) in a tragic rom-con.

He revealed how he was wooed by a beautiful woman, claiming to be 24, when she first messaged him on Tinder.

The couple spoke via the app, on the phone and even on video calls, with him none the wiser.

But after being lured into a scam trap, he realized the whole thing was AI-generated.

It was a quick Google search of the images that made him realize that the photos used by his ‘girlfriend’ were fake.

Experts told The Sun that Peter’s story is a terrifying cautionary tale as AI becomes increasingly powerful.

And most importantly, AI creations are becoming increasingly difficult to distinguish from reality, making it even easier for them to ensnare their victims.

Looking back, Peter says the first red flag was that his "lover" messaged him first.

“No 20-year-old woman would first message a middle-aged man on a dating app,” Peter told The Sun.

He added that he has been single since his divorce ten years ago, and had desperately used dating apps in the hope of finding someone.

Still, Peter said everything seemed fine when the pair first started talking.

I’m not sure I’ll ever find anyone now

Peter

They spoke on Tinder for a few days before quickly switching to WhatsApp.

Tinder says it has a dedicated fraud team that scans for fake profiles and suspicious messages – and urges users never to send money to people they haven’t met in person.

The couple then started talking on the phone and even making video calls a few times.

“I fell for it,” Peter said.

“It seemed perfect. She was beautiful, well educated and very easy to talk to.”

The AI chatbot had told Peter that she was a wealthy Asian woman living in New York.

“She said she owned several properties and she was an investor.

“She told me she had made her money in cryptocurrency and that she could help me do the same.”

After a month of talking, Peter said he thought he had found a real connection and that he trusted her enough to invest his money.

He said he initially sent the bot a $10,000 (£7,880) instalment, and after receiving “proof” of its success a few weeks later, he sent another $12,000 (£9,457).

It was only after sending the second large sum that Peter began to suspect something was amiss.

He said he started noticing how many of her messages came in twice on WhatsApp.

Out of curiosity, Peter proceeded to reverse-search some of the images she had sent him on Google – soon wishing he hadn’t done so.

“When I found her images online, it said she was an actress from China. My heart sank into my chest.

“I knew then that it was all deepfake AI. The video chats, the phone calls, everything was fake.”

Peter says he has been in contact with his bank about the scam, but it is currently unclear whether the money will be returned to him.

“I’m not trying to beat myself up, but I think this is a sign for me to stop using dating apps and stop dating in general,” he said.

“I deleted Tinder and WhatsApp and blocked and reported the account, but it was too late.

“I don’t feel like dating at all. I’m not sure I’ll ever find someone again now.”

Really convincing deepfakes can be used relatively easily for highly sophisticated social engineering attacks

Chris Dyer, technical consultant at cybersecurity company Ultima

According to cybersecurity technical consultant Chris Dyer, the use of AI in dating fraud is becoming increasingly common.

He told The Sun: “Scammers use AI to set up convincing fake profiles and conduct automated chats without having to spend hours themselves to get a return on their investment.

“Really convincing deepfakes can be used relatively easily for highly sophisticated social engineering attacks.

“Today you can use AI and deepfake technology to create a completely fake online identity that even works with live video calls.”

Dyer warned that AI technology is becoming so advanced that validating someone’s identity over a video call can no longer be trusted.

He says the technology has become so user-friendly that it’s incredibly easy to mimic a seemingly real person through live conversations.

He fears this will add another layer to trust issues.

“It used to be that we couldn’t trust everything we read online without supporting evidence, but now that we have AI models that can create realistic and imaginative scenes purely from text input, even that confirmation can be easily spoofed,” he says.

“My biggest concern for the general public is that not enough is being done to raise awareness of this potential problem.

“I foresee many scam victims being presented with so much plausible, believable content that they are prompted to send money to someone they think is a loved one.”

How to protect yourself from AI scammers

Be critical of everything you see online

Dyer warns that fake images and videos are becoming increasingly widespread.

That’s why it’s important to stay alert and not take everything you see online at face value.

Never transfer money without research

Generating heartbreaking and compelling stories or images is easier than ever before.

Scammers can do this at the touch of a button and ask you to send money through channels that are difficult to trace, such as crypto.

If you are asked to send a significant sum of money, stop and think carefully.

Always independently verify someone’s identity before acting.

Check unexpected calls

69 percent of people have difficulty distinguishing between human and AI-generated voices.

Be wary if you receive a call from an unknown number.

Even if the voice claims to be a friend or family member, be sure to verify the caller’s identity.

You can do this by asking specific questions only they would know the answer to.

Experts also recommend paying attention to:

Odd body parts

Look closely at any people or animals in an image.

AI is known to struggle with the details of living things, especially hands.

It’s not uncommon to see AI-generated images with abnormally long or short fingers, missing fingers, or extra fingers.

Oddities in ears, eyes and body proportions are another sign of AI involvement.

Absurd details

AI has also been known to mess up the rendering of everyday objects.

Glasses, jewelry, and wearable items are just some of the things it struggles with.

Some AI-generated images have pens placed upside down in the hands.

AI often forgets to match earrings, or to ensure that rings go all the way around the fingers.

Strange lighting or shadows

Watch out for missing or inconsistent shadows and lighting.

Sometimes AI can create a shadow that points in the wrong direction, or provide lighting that doesn’t make sense given the setting.

AI also tends to smooth skin, stripping away the blemishes found on real skin.

Weird backgrounds

There are some subtle nuances in AI-generated backgrounds that you should pay attention to.

Repeating patterns on walls or floors can give AI away, especially when the pattern changes abruptly.

Dyer added that scammers are using AI to enhance their phishing attacks, many of which revolve around cryptocurrencies.

He explained that these types of fraudulent attacks are perfect for AI due to the “semi-lawless nature” of the digital currency world.

It is much easier to get victims to funnel money into channels that are difficult to trace, making this an excellent opportunity for scammers.

He said: “Scammers are gradually creating a false ‘buzz’ around crypto scams, which can start with market manipulation and compromising trusted sources.

“AI can then follow up to create feelings of FOMO and a sense of urgency to trigger actions without their targets having time to think through the consequences.”

But Dyer insists it’s not all doom and gloom.

He said: “Although AI is used maliciously in the dating world, it is not all scams and fear.

“AI does some really useful things in the background, like refining the way we find matches and keeping the creeps at bay.

“It also helps to eradicate those fake profiles: AI fights AI.

“There is a lot of potential for AI to make finding love smoother and easier, if used wisely by the right people.”

Dyer said it is important to remain vigilant and aware of the potential problems with AI. But he has hope for the future.

He concluded: “It’s a bit of a tightrope walk, but with some smart regulations and smart implementations of AI technology, there is hope for a future where AI helps more than it hinders.”
