AI Voice Scams: Combining AI and Deepfakes
Deepfake technology can create fake voices that sound like someone you know. Scammers have been using this technology to make it seem like a friend or family member is asking for money or personal information. They might send you a fake voicemail, email, or text message that sounds like it’s from a person you trust. These messages usually try to get you to act quickly without thinking twice, simply because the voice sounds like someone close to you.
It’s getting easier for scammers to make convincing deepfakes, thanks to the help of AI. AI voice scams are already happening, but so few are reported to police that they rarely make major news. Because of this lack of awareness, many innocent people have fallen into the trap of these scams, believing their loved ones were in danger when they were not. Debbie Shelton Moore and Eddie are two victims who fortunately escaped such scams.
The Cases of AI Voice Scams
Debbie Shelton Moore, a mother, received a phone call from someone saying they had kidnapped her 22-year-old daughter Lauren and wanted $50,000 for her safe return. The caller made Lauren cry out for her mom in the background to make it sound real. But Debbie’s husband, who works in cybersecurity, heard the conversation and immediately called Lauren, who was safe and sound. The police said it was probably a scam using artificial intelligence to imitate Lauren’s voice. Debbie shared her story to warn others about this kind of AI voice scam.
Eddie, a TikTok content creator, shared a frightening experience his grandparents went through: they received a phone call from someone claiming to be Eddie, asking for money to cover the cost of a car accident he was supposedly involved in. The caller sounded exactly like Eddie, leading his grandparents to believe the call was genuine. However, Eddie’s father intervened and asked for proof that the call was actually coming from Eddie, revealing it to be a scam. Eddie shared this story on TikTok to warn his followers to be cautious when receiving unexpected calls or messages, especially those asking for personal information or money.
Protection and Prevention
One in four people have either been targeted by an AI phone clone or know someone who has, according to McAfee, a computer security company. Law enforcement suggests ways to avoid falling victim to these scams, such as verifying the location of loved ones, contacting police, and creating a secret code or phrase. Scammers often pose as someone you trust to steal your money, and this type of scam is very common in the United States. In 2022, Americans lost $2.6 billion due to these imposter scams, according to the Federal Trade Commission.
There are tools that can help detect fake audio, but they don’t always work perfectly. To stay safe, you should be sceptical of unexpected or suspicious messages, even if they seem to come from someone you trust. Verify the sender’s identity by calling them directly or checking their social media accounts. Educating yourself about deepfake technology can also help you avoid falling victim to these scams.
Frequently Asked Questions
What Is an AI Voice Scam?
An AI voice scam is the criminal use of AI technology to clone the voice of a person you trust, like a friend or family member, to trick you into giving up money or personal information. The goal is to get you to do something that benefits the scammer, like revealing your password or sending them money.
How Long Have AI Voice Scams Been Around?
AI voice scams, a form of “voice phishing” (vishing), have been around for several years. The first reported cases date back to 2016, but it wasn’t until 2018 that AI voice scams became more sophisticated and began to gain widespread attention.
What Is the Best Way to Verify the Identity of a Caller?
To verify someone’s identity over the phone, consider the following methods:
- Ask for personal information, such as full name, address, or date of birth.
- Ask a series of questions that only the genuine caller would know the answers to.
- Use two-factor authentication, such as a unique code sent to their email or phone, or biometric data.
- Trust your instincts – if something feels off, end the call.
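The “secret code or phrase” idea above can be made concrete. Below is a minimal sketch (the phrase, function names, and normalization rules are all hypothetical, chosen for illustration) of how a family could check a pre-agreed code phrase against what a caller says, using a constant-time comparison so the check doesn’t leak partial matches:

```python
import hmac

# Hypothetical pre-agreed family code phrase (an assumption for illustration).
FAMILY_CODE = "purple elephant picnic"

def normalize(phrase: str) -> str:
    """Lowercase and collapse whitespace so minor differences don't matter."""
    return " ".join(phrase.lower().split())

def verify_caller(spoken_phrase: str) -> bool:
    """Compare the caller's phrase to the agreed code using a
    constant-time comparison (hmac.compare_digest)."""
    return hmac.compare_digest(
        normalize(spoken_phrase).encode(),
        normalize(FAMILY_CODE).encode(),
    )

print(verify_caller("Purple  Elephant Picnic"))  # True
print(verify_caller("pink elephant picnic"))     # False
```

In practice, of course, the check happens in your head rather than in code; the point is simply that the phrase must be agreed in advance, known only to the family, and never shared over an unverified channel.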