AI voice cloning fraud is on the rise. Here’s how to stay safe, according to security experts
- According to security experts, AI voice clone scams are on the rise
- Voice-controlled AI models can be used to imitate loved ones
- Experts recommend agreeing on a safe phrase with friends and family
The next spam call you receive may not be a real person, and your ear won’t be able to tell the difference. Scammers are using voice-enabled AI models to automate their fraudulent schemes, tricking victims by imitating real human callers, including family members.
What are AI voice cloning scams?
Scam calls aren’t new, but AI-powered calls are a new, dangerous breed. They use generative AI to impersonate not only authorities or celebrities, but also friends and family.
The advent of AI models trained on human voices has opened up a new area of risk when it comes to phone scams. These tools, such as OpenAI’s speech API, support real-time conversations between a human and the AI model. With a small amount of code, these models can be programmed to automatically execute phone scams, encouraging victims to reveal sensitive information.
So how can you stay safe? What makes the threat so problematic is not only how easily and cheaply it can be deployed, but also how persuasive the AI voices have become.
OpenAI faced backlash earlier this year for its Sky voice option, which sounded eerily like Scarlett Johansson, while Sir David Attenborough has described himself as “deeply disturbed” by an AI voice clone that was indistinguishable from his real speech.
Even tools designed to defeat scammers show how blurred the lines have become. British network O2 recently launched Daisy, an AI grandma designed to catch phone scammers in a time-wasting conversation they think is with a real elderly person. It’s a smart use of the technology, but also one that shows how well AI can simulate human interactions.
Worryingly, fraudsters can train AI voices based on very small audio samples. According to F-Secure, a cybersecurity company, just a few seconds of audio is enough to simulate the voice of a loved one. This can easily be obtained from a video shared on social media.
How AI voice cloning scams work
The basic concept of voice cloning scams is similar to standard phone scams: Cybercriminals pose as someone to gain the victim’s trust and then create a sense of urgency that prompts them to reveal sensitive information or transfer money to the fraudster.
The difference with voice cloning scams is twofold. First, the criminals can automate the process with code, allowing them to target more people, faster, and for less money. Second, they can imitate not only authorities and celebrities, but also people you know directly.
All it takes is an audio clip, which is usually taken from an online video. This is then analyzed and simulated by the AI model, allowing it to be used in deceptive interactions. An increasingly common technique is for the AI model to imitate a family member asking for money in an emergency.
The technology can also be used to simulate the voices of high-profile individuals to manipulate victims. Scammers recently used an AI voice clone of Queensland Premier Steven Miles to try to carry out an investment scam.
How to stay safe from AI voice cloning fraud
According to Starling Bank, a digital lender, 28% of British adults say they have been targeted by AI voice clone fraud, yet only 30% are confident they would know how to spot it. That’s why Starling launched its Safe Phrases campaign, which encourages friends and family to agree on a secret phrase they can use to confirm each other’s identities – and it’s a sensible tactic.
TL;DR How do you stay safe?
1. Agree on a safe phrase with friends and family
2. Ask the caller to confirm some recent private information
3. Listen for uneven emphasis on words or flat, emotionless delivery
4. Hang up and call the person back
5. Be wary of unusual requests, such as requests for banking information
Even without a prearranged safe phrase, you can use a similar tactic if you ever doubt the veracity of a caller’s identity. AI voice clones can imitate a person’s speech pattern, but they don’t necessarily have access to private information. Asking the caller to confirm something only they know, such as information you shared during the last call you made, takes you one step closer to certainty.
Also trust your ear. Although AI voice clones are very convincing, they are not 100% accurate. Listen for telltale signs, such as uneven stress on certain words, emotionless expression, or slurring.
Scammers have the ability to mask the number they are calling from and can even appear as if they are calling from your friend’s number. If you are ever in doubt, the safest thing to do is to hang up and call the person back at the usual number you have for them.
Voice cloning scams also rely on the same tactics as traditional phone scams. These tactics are designed to apply emotional pressure and create a sense of urgency, forcing you to take an action you would not otherwise take. Be alert to this and be wary of unusual requests, especially when it comes to making a money transfer.
The same red flags apply to callers claiming to be from your bank or other authority. It pays to be familiar with the procedures your bank uses when it contacts you. For example, Starling has a call status indicator in its app, where you can check at any time whether the bank is genuinely calling you.