Deepfake AI: How scammers are using technology to dupe Canadians

On Jan. 16, CTV News Ottawa aired a story about a retired Ottawa couple who were scammed into using their line of credit to purchase cryptocurrency through a ‘financial advisor’.

Two weeks later, a fraudulent rendition of that story, made with deepfake AI, surfaced on Facebook.

The falsified video shows CTV’s Graham Richardson and Patricia Boal, as well as the retired couple, talking about a “program that helps Canadians achieve financial independence.”

Scammers had taken the original cautionary tale and, in a deeply ironic twist, used it in an attempt to dupe more people into falling for investment scams similar to the one that had tricked Doug and Victoria Lloyd.

“These tools are becoming more powerful, much higher fidelity, much more affordable, much easier to use, and they are increasingly finding themselves in the hands of opportunistic cybercriminals,” said technology analyst Carmi Levy.

“This, unfortunately, is very rapidly becoming the technological scourge of our era.”

A spokesperson for Meta, the parent company of Facebook, Instagram, Threads, and WhatsApp, tells CTV News the company is investing money to tackle this type of issue.

The page that posted the video has also been removed from the platform.

“In an effort to prevent fraudulent activity that can harm people or businesses, we remove pages and content that purposefully deceives, willfully misrepresents or otherwise defrauds or exploits others for money or property,” read a portion of the statement.

“It’s important to note that this isn’t new, nor limited to Meta’s platforms: It’s plagued the online ad world for years and we don’t expect this fight to be over any time soon. The people behind these ads use various methods and channels to get at victims, across the internet, but we are investing significantly to tackle this issue.”

Data provided to CTV News by the Canadian Anti-Fraud Centre (CAFC) shows that investment scams — which often rely on deepfake AI technology — accounted for more than half of all money lost to scams across Canada in 2023.

The CAFC says Canadians lost more than $553 million to scammers last year. Investment scams were responsible for more than $309 million of that total.

Levy says it is not just scammers looking for money who are taking advantage of the rapid development of AI.

“This is the kind of technology that can not only dupe people into giving up their life savings, it can turn the corner on an election,” he said.

“It can prompt people to vote for someone they didn’t intend to vote for, or not vote at all. The very future of democracy hangs in the balance.”

In June 2022, the federal government introduced Bill C-27, the Digital Charter Implementation Act, which includes three proposed acts: the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act, and the Artificial Intelligence and Data Act.

The feds say the AI and Data Act will outline “clear criminal prohibitions and penalties regarding the use of data obtained unlawfully for AI development or where the reckless deployment of AI poses serious harm and where there is fraudulent intent to cause substantial economic loss through its deployment.”

However, implementing the act is a slow-moving process and, after nearly two years, it is still before the House of Commons.

There are also questions from industry experts as to how effective it will be in reducing the spread of AI-generated scam videos.

“Knowing what lies on the horizon, where AI is going to get better and we still don’t have appropriate laws on the books and we are led by governments who really don’t understand the implications of artificial intelligence, is terrifying,” said Levy.

“More national-level discussion is required. Better laws are required and, quite frankly, we should be electing officials who are savvy in these technologies so that they can prioritize the creation of legislation that forces technology companies to behave appropriately.”
