Sunday, May 19, 2024

Man Loses Rs. 5 Cr to a Scam Deepfake AI Call Posing as His Close Friend

AI can cause serious harm as well as good. Recently, an individual from northern China fell victim to a scam involving highly sophisticated AI deepfake technology.

The incident shows that AI tools can undoubtedly be used to commit financial crimes. The question now is how far such scams could, and would, go. The public and authorities are on high alert after this incident, as they should be in any case.

What happened?

The scammer made the victim believe that he was a close friend in dire need of money. Using AI-powered face-swapping technology, the perpetrator made a video call to the man and persuaded him to transfer 4.3 million yuan.


The victim believed his friend urgently needed the money as a deposit for a bidding process, so he complied. He discovered he had been duped only when his real friend expressed complete ignorance of the whole affair.

Luckily, according to a Reuters report citing local police on Saturday, most of the stolen money has been recovered. The police are working diligently to trace and recover the remaining funds as well.

China recognized the threat posed by AI early on and has been aggressively tightening its scrutiny of such applications and technology. There has been a rise in AI-driven scams and frauds, primarily involving the manipulation of facial and voice data. In response, China implemented new rules in January to provide legal protection to victims.


Another case

Another case was reported last month in the USA. Jennifer DeStefano received a call from an unknown number demanding ransom for the release of her supposedly kidnapped daughter. DeStefano said her 15-year-old was away on a skiing trip when the call came.

She heard her daughter saying “Mom” and sobbing on the phone. Then a man warned her not to call the authorities, while she could hear her child’s voice calling for help in the background. The man demanded $1 million for the release.

“It was never a question of who is this? It was completely her voice. It was her inflection. It was the way she would have cried,” DeStefano told local news media. “I never doubted for one second it was her. That’s the freaky part that really got me to my core.” In reality, her daughter had never been kidnapped; it was all a scam, though a dreadful one.

Be vigilant and cautious

Deepfakes have long been a problem, mainly as a tool for spreading misinformation. But as AI develops, deepfake technology is becoming more sophisticated, and the scope for misusing it is growing accordingly.

With India also seeing a rise in scams of various kinds, people are advised to remain vigilant and cautious during online interactions.

Dipanita Bhowmick
Dipanita Bhowmick: I am a content writer with 13+ years of experience in various genres, allowing me to adapt my writing style to diverse topics and audiences. Alongside my passion for creating engaging content, I have a deep interest in esoteric knowledge, constantly exploring the mystical and unconventional realms for inspiration along with spiritual and personal growth.
