Scammers are using AI voice generators to sound like your loved ones. Here’s what to watch for
Imagine getting a phone call that a loved one is in distress. In that moment, your instinct would most likely be to do anything to get them out of harm's way, including wiring money.
Scammers are aware of this Achilles’ heel and are now using AI to exploit it.
A report from The Washington Post featured an elderly couple, Ruth and Greg Card, who fell victim to an impersonation phone call scam.
Also: These experts are racing to protect AI from hackers. Time is running out
Ruth, 73, got a phone call from someone she thought was her grandson. He told her he was in jail, with no wallet or cell phone, and needed cash fast. Like any concerned grandparents would, Ruth and her husband Greg, 75, rushed to the bank to get the money.
It was only at the second bank they visited that a manager warned them he had seen a similar case before that turned out to be a scam, and that this one was likely a scam, too.
This scam isn’t an isolated incident. According to the report, impostor scams were the second most common racket in America in 2022, with over 36,000 reports of people being swindled by someone impersonating a friend or family member. Of those, more than 5,100 happened over the phone, costing victims over $11 million, according to FTC officials.
Also: The best AI chatbots: ChatGPT and other alternatives to try
Generative AI has been making quite a buzz lately thanks to the rising popularity of programs such as OpenAI’s ChatGPT and DALL-E, which are best known for advanced capabilities that can boost users’ productivity.
However, the same techniques used to train those helpful language models can also be used to train more harmful programs, such as AI voice generators.
These programs analyze a person’s voice for the patterns that make up their unique sound, such as pitch and accent, and then recreate it. Many of these tools work within seconds and can produce audio that is virtually indistinguishable from the original source.
Also: The looming horror of AI voice replication
What you can do to protect yourself
So what can you do to avoid falling for this scam? The first step is being aware that this type of call is possible.
If you get a call for help from one of your loved ones, remember that the voice on the line could very well be a machine. To make sure it’s actually your loved one, try to verify the source.
Try asking the caller a personal question only your loved one would know the answer to. This can be as simple as the name of a pet, a family member’s nickname, or another detail a stranger couldn’t guess.
You can also check your loved one’s location to see if it matches up with where they say they are. Today, it’s common to share your location with friends and family, and in this scenario, it can come in extra handy.
You can also try calling or texting your loved one from another phone to verify the caller’s identity. If your loved one picks up or texts back and doesn’t know what you’re talking about, you’ve got your answer.
Lastly, before making any big monetary decisions, consider reaching out to authorities first to get some guidance on the best way to proceed.