How Crypto Scammers Use AI Deepfakes to Phish Victims
Crypto scammers and hackers keep finding new ways to penetrate security measures even as the crypto industry races to add advanced layers of protection to its platforms. Hackers and scammers are now exploiting AI deepfakes to breach the security of crypto exchanges and Web3-related businesses. By using deepfakes, bad actors aim to circumvent the identity-verification requirements established by these platforms, Binance Chief Security Officer Jimmy Su said in a recent interview.
Deepfakes are artificially generated photos or videos designed to convincingly reproduce the voice, facial features, and expressions of an individual, living or deceased. Artificial intelligence (AI) and machine learning (ML) tools are used to create deepfakes with highly realistic results.
If scammers succeed in creating deepfakes of crypto investors, their chances of bypassing a crypto platform's security and stealing user funds increase. "The hacker will look somewhere online for a normal picture of the victim. Based on that, using deepfake tools, they are able to produce videos to do the bypass. Part of the verification requires the user to, for example, blink the left eye or look left or right, look up or down. The deepfakes are advanced enough today that they can actually execute these commands," Su told Cointelegraph.
For the past few months, crypto industry players have been warning about the growing threat that AI-generated deepfakes pose to uninformed and unsuspecting victims. In February 2023, a deepfake video of Binance CEO Changpeng Zhao surfaced on social media. In the clip, an artificially generated Zhao can be heard urging people to trade crypto with him.
Deepfake AI poses a serious threat to humanity, and it's not just a far-fetched idea anymore. I recently came across a video featuring a deep fake of @cz_binance and it is terribly convincing. pic.twitter.com/BRCN7KaDgq
—DigitalMicropreneur.eth (@rbkasr) February 24, 2023
A similar deepfake video of Elon Musk sharing misleading crypto investing advice was also spotted on social media earlier this month.
Because these deepfake videos are so convincing, many people may be unable to spot the warning signs that give them away. Su predicts that, in the future, AI tools will even be able to smooth out the uneven parts of deepfakes and improve their quality.
"When we watch these videos, there are certain parts of them we can detect with the human eye. For example, when the user is required to turn their head to the side. AI will overcome [those tells] over time. So it's not something that we can always rely on. Even if we can control our own videos, there are videos out there that are not ours. So one thing, again, is user education," Su said in the interview.
A recent report by blockchain research firm CertiK estimates that a whopping $103 million (roughly Rs. 840 crore) was stolen in crypto exploits in April this year. Exit scams and flash loan attacks have become the main sources of stolen funds in crypto crimes. In the first four months of 2023, CertiK estimates, $429.7 million (roughly Rs. 3,510 crore) was stolen by scammers and hackers.