Better Business Bureau warns public about AI-powered scams that clone voices to steal from unsuspecting victims

A new scam is on the rise in Canada and the BBB wants to prepare Calgarians

New technology has prompted warnings about scammers who are now using AI to clone voices and scam unsuspecting victims. // Shutterstock

As technology continues to advance, so do the methods used by scammers to steal from innocent victims, adding a chilling new dimension to the meaning of “artificially intelligent.”

The latest development in the world of scams is the adoption of artificial intelligence (AI) by criminals, says Wes Lafortune, media and communications specialist for the Better Business Bureau.

Lafortune says scammers have previously used a technique known as the “grandparent scam,” where they pose as a grandchild in distress and ask for money from their grandparents.

“Typically what happens in a normal scam is there’s a representative that calls, usually a grandparent, and says, ‘I'm a lawyer’ or ‘I'm somebody helping out your grandchild, they've been hurt, they're in hospital, they need cash’ or ‘they're in jail, they need bail money,’” he explains.

A double-edged sword

New technology has prompted the Better Business Bureau (BBB) to issue an alert warning the public about scammers who are now using AI to clone voices and scam unsuspecting victims.

“Now, the scammers, who are from sophisticated criminal organizations that are set up to steal people’s money, are using artificial intelligence,” Lafortune says.

This scam has been successful because seniors are often inclined to help their family members, even when they suspect that something might be amiss.

By harvesting voices from social media accounts, the scammers can mimic the voice of a family member and use AI tools to carry on a conversation with their victim.

“Wherever that sample exists, they capture it. They then use tools to mimic that voice and contact the would-be victim and actually pretend that they're the grandchild asking for assistance,” Lafortune says, adding scams have the potential to cause serious financial harm to victims, but they can also be physically dangerous.

“In some cases, the scammers are arriving at people’s homes … to collect money, which of course, makes it even more dangerous.”

Things to watch for

While seniors tend to be the most heavily targeted group, anyone can fall victim to scams, especially as technology continues to advance.

The BBB has issued several tips to help people spot this type of scam.

First, resist the urge to act immediately, no matter how dramatic the story is.

“Validate the information, if somebody calls you out of the blue and you think it's a loved one or your grandchild, still validate that information. Say ‘I want to help you, but I'm going to hang up and I'm going to call you on your phone, or I'm going to call your parents,’” Lafortune says.

“So don’t be in a big hurry. That’s what the scammer wants. They’re trying to create a sense of urgency.”

It’s important to check out the story with other family and friends, and to hang up or close the message and call your loved one directly. Don’t call the phone number provided by the caller or the number that appears on caller ID.

Online savvy

Second, know what your family members are sharing online and be careful about what you share as well.

“Be aware of what you're putting on your social media accounts because scammers harvest that information for all kinds of reasons, and they want to build profiles so they can be effective in mimicking or pretending that they're the loved one in question,” Lafortune says.

It is also important to never wire money if you have any doubts about the validity of the phone call.

“Don't send money by Bitcoin or transfer big, large sums of money until you get a hold of someone that you know and trust.”

Another tip is to listen closely to the audio. Signs of fake audio include choppy sentences, unnatural or out-of-place inflection, odd phrasing, or background sounds that don’t match the speaker’s location.

Lafortune says artificial intelligence is “not at a perfect point in this stage” so there may be mistakes that victims can pick up on.

Many families have adopted a code word as a safety measure, giving loved ones a way to verify their identity over the phone.

Getting ahead of the game

It's important to know who you are talking to. As deepfake technology progresses, you’ll need to confirm the identity of the person you are speaking with, even if you think you know and trust them.

However, people do fall victim to scams and these crimes are not going anywhere anytime soon.

“Scams across Canada are happening at an alarming rate and people lose millions of dollars every year,” Lafortune says.

While AI-related scams have not yet been reported in Calgary, many have been reported to police across the country.

“We really want to get ahead of these scams and get the message out before [more] people become victimized,” Lafortune adds.

If you fall victim to a scam, BBB stresses that you report the scam to the police non-emergency line or call 911 if someone comes to your house.

To minimize the number of people who fall victim to scammers, it is also important to report incidents to the BBB’s Scam Tracker website.
