A Saskatchewan expert in artificial intelligence says deepfake scams are becoming so realistic, even trained professionals struggle to spot them, and the public should be extremely cautious about what they see or hear online.
This warning comes after an impersonation scam using Premier Scott Moe’s image circulated on social media.
“Even for me who tracks this very closely, the level of sophistication just in the last couple of weeks would truly blow your mind,” said Brian McQuinn, co-director of the Centre for Artificial Intelligence, Data and Conflict at the University of Regina.
Please watch out for AI deepfakes of myself or others endorsing online investment platforms. pic.twitter.com/ddFviX4LNj
— Scott Moe (@PremierScottMoe) July 31, 2025
McQuinn said we’re quickly entering a phase where it will be nearly impossible to tell whether a video, photo or voice recording is authentic.
“In the next year, there will be no easy way of discerning real from fake,” he said.
And while videos of politicians like Scott Moe get a lot of attention, McQuinn said more personal scams, the kind that target individual people, are even more dangerous.
He said voice cloning is now being used to impersonate family members, especially in emergency scams that target seniors.
“We’re seeing personalized attacks using voices that sound exactly like someone’s child or grandchild,” he said.
“It creates panic, and people send money without thinking.”
Scammers often gather voice samples from social media or public videos, then use AI to generate convincing audio of someone in distress. In one common scam, a grandparent receives a call from what sounds like their grandchild, claiming they’ve been arrested while travelling and urgently need bail money.
The voice sounds real and the story is urgent, and that’s exactly the point.
“All of these scams rely on urgency, trust, and emotion,” he said. “They’re designed to short-circuit your critical thinking.”
McQuinn said the strategy hasn’t changed, but the tools have, and AI has made those tools far more convincing than ever before.
In addition to public figure deepfakes and emergency voice scams, McQuinn warns that cryptocurrency fraud is also growing fast, and AI will make it worse.
“If you hear the word ‘crypto,’ shut it down immediately,” he said. “There is no quick and easy way to make money online, and AI just makes those lies look more real.”
McQuinn said the most important thing people can do is pause when something feels urgent.
“If you’re being asked to send money or share personal information, stop. It doesn’t matter how convincing the voice or video is,” he said. “Check with someone you trust. Take five minutes. That’s all it takes to avoid being scammed.”
— With files from 980 CJME’s Lisa Schick