A short clip of what appears to be popular Indian star Rashmika Mandanna entering an elevator has blown up in India and received condemnation across the world.
At first glance, the video looks to be a harmless clip of the 27-year-old Bollywood star – who has 39 million Instagram followers – entering the lift in activewear.
But despite looking painfully realistic, the woman in the video isn’t Mandanna at all.
She is actually a British-Indian influencer named Zara Patel, whose real face is visible in the first frame of the six-second clip.
Deepfakes are false images or videos created using artificial intelligence.
The phenomenon is nothing new, but recent advancements in technology have led to creepily convincing videos being posted online every day.
The star herself is now calling for greater regulation of AI technology, describing the clip as “extremely scary” and saying it shows how easily the technology can be misused.
Abhishek Kumar, a journalist from India, tracked down the fake video’s origins and called for new “legal and regulatory” measures to tackle the phenomenon, as thousands condemned the video for using Mandanna’s likeness without her permission.
The incident has sparked further discussions in Indian media publications about exactly how to combat deepfake technology as artificial intelligence continues to be developed at breakneck speed.
“There is an urgent need for a legal and regulatory framework to deal with deepfake in India. You might have seen this viral video of actor Rashmika Mandanna on Instagram. But wait, this is a deepfake video of Zara Patel,” Kumar posted.
Mandanna took a stand against deepfake technology on Monday and thanked her fans for their support.
“I feel really hurt to share this and have to talk about the deepfake video of me being spread online. Something like this is honestly extremely scary, not only for me, but also for each one of us who today is vulnerable to so much harm because of how technology is being misused,” she wrote.
“But if this happened to me when I was in school or college, I genuinely can’t imagine how I could ever tackle this. We need to address this as a community and with urgency before more of us are affected by identity theft.”
Several celebrities showed support for Mandanna and expressed their shock at the deceptive use of the technology.
Bollywood star Amitabh Bachchan supported Mandanna and called for legal action against the creators of the deepfake video.
Other celebrities, including singer Chinmayi Sripaada, also voiced their concerns about the misuse of technology and the need for legal protection.
“It’s truly disheartening to see how technology is being misused and the thought of what this can progress to in the future is even scarier,” Sripaada posted online.
“Action has to be taken and some kind of law has to be enforced to protect people who have and will be a victim to this. Strength to you.”
Fans defended Mandanna and demanded strict laws be brought in to combat the fakes.
The deepfake phenomenon has made headlines in recent weeks, with Australia’s own Hamish Blake being caught up in a “scary” video scam.
An advertisement running on Instagram features a somewhat convincing video of the comedian and broadcaster appearing to promote weight loss gummies.
“Two months ago, I saw an advertisement for gummies and the website claimed that with the help of this product, you can lose weight by 12 kilos in four weeks,” the fake Blake says in the ad.
“I decided to order four bottles and in the first few days, nothing changed. I was sceptical about this. But what was my surprise when my weight started to evaporate.
“After only two weeks, I had lost six kilos. At the end of the course, I had lost 13 kilos.”
The fake Blake sounds alarmingly like the real one, and the vision, although low resolution, animates his face and shows his mouth moving.
On air this morning, 2GB Breakfast host Ben Fordham said he knows Blake well and was shocked when he spotted the Instagram ad at the weekend.
“That sounds like Hamish Blake,” Fordham said, before introducing the real-life star.
“I promise this is the real Hamish,” Blake said. “This one won’t sell you magic beans in the form of weight loss gummies.”
He said that with some two decades of recorded examples of his voice available online, thanks to his prolific career in radio and TV, AI technology has plenty of material to work with.
“I guess there’s enough words out there to effectively make me say anything,” he said.
Authorities around the world are scrambling to set up guardrails for AI, with several US states such as Minnesota passing legislation to criminalise deepfakes aimed at hurting political candidates or influencing elections.
On Monday, US President Joe Biden signed an ambitious executive order to promote the “safe, secure and trustworthy” use of AI.
“Deep fakes use AI-generated audio and video to smear reputations… spread fake news, and commit fraud,” Biden said at the signing of the order.
He voiced concern that fraudsters could take a three-second recording of someone’s voice to generate an audio deepfake.
“I’ve watched one of me,” he said.
“I said, ‘When the hell did I say that?’”