Do you know who is calling? New generative artificial intelligence (AI) can be used to imitate anyone’s voice or face. Learn to protect yourself from scams that use AI clones.
What is an AI voice clone scam?
New generative artificial intelligence (AI) tools can imitate anyone’s voice or appearance, and they are now widely available to the public. And scammers are using that technology to fool people.
There are many types of scams using AI voice-cloning software. One Ontario man thought he was bailing out a fishing buddy. He got a call from his friend, who said he had been arrested for texting while driving and causing an accident. The man provided $8,000, only to discover later that it was a scam and not his friend at all.
Actress Scarlett Johansson is currently taking legal action over an AI clone of her voice. It’s reported that an app developer used an AI-generated clone of her voice to make it sound like she was pitching a product.
Imagine you pick up your phone. You are sure it’s your sister calling. The caller sounds exactly like her. She’s very excited about a new house she is about to buy. But she needs a little help with the down payment. She’s hoping you could transfer her some money and she’ll pay you back within the month. You’re happy for her. And you want to help. But before you do anything, try to verify that it’s actually her calling.
What are deepfakes?
A deepfake is when AI is used to make you think what you’re seeing is real — when it’s not.
A woman contacted the Contact Centre at the OSC. She was convinced she had spoken to billionaire Elon Musk. She had watched a video online of Musk telling her about a great investment. She put her money into it, convinced she was dealing with a financial genius.
But it was all a scam, likely based on a deepfake impersonation of Elon Musk.
AI can be used to create a video of anyone you know, including you. It may look exactly like them. Their mouth is moving, and the voice sounds like theirs. But it’s all an illusion. It means you can’t trust your eyes or your ears. So you need to be extra careful.
How can you protect yourself from fake voice scams?
There are some simple ways you can protect yourself from AI voice scams and deepfakes.
First, take a breath and pause. When people get a call about an emergency, they naturally get stressed and want to act fast. But take a moment and ask questions to verify who the caller is.
If someone says they’re your sister, ask a question only your sister would know the answer to.
Some families have a special word or phrase they share with each other. A word they can use to confirm the identity of family members when they call. For example, you might agree with your family to make your special word your grandma’s middle name. So if a family member calls, you would ask them for that name before sharing any personal details.
And never forget: when in doubt, hang up. Then call the person back using a phone number you know is theirs.
Be on alert if you get a call, text or email that rushes you into making a decision — or asks you to share personal information.
- Learn to spot phishing scams.
- Review the Checklist: Protecting your financial information online.
Sometimes a family member truly needs help. Even then, it’s wise to be cautious. It’s never easy to see a friend or family member struggling financially. But before you provide support, read more about steps to offering a financial lifeline.
Summary
If you get a call from a family member asking urgently for money, take a moment to confirm that they are who they say they are. Consider these tips:
- Ask them a question only the real person would know the answer to.
- Have a special word your family shares to verify identity.
- When in doubt, hang up and call the person back using a phone number you know is theirs.