
According to a report by The Times of India, a 59-year-old woman from Hyderabad has fallen victim to an AI voice scam, losing around Rs 1.4 lakh. The scammer reportedly sounded exactly like her nephew, who lives in Canada, and claimed to be in distress and in urgent need of money.

The woman received a call late at night from the scammer, who posed as her nephew, told her he had been in an accident and was at risk of being jailed, and asked her to send money discreetly. She recounted her experience: “He sounded just like my nephew and spoke exactly in the Punjabi we speak at home with all the nuances. He called me late in the night and said he had had an accident and was about to be jailed. He requested me to transfer money and keep this conversation a secret.”

The woman transferred the requested amount to the scammer’s account, only realising later that she had been defrauded. City police officials acknowledged that AI voice scams are still rare and advised residents to be cautious. Cyber experts noted that people with family members in countries such as Canada and Israel have been among the recent targets of these scams.

Prasad Patibandla, Director of Operations at the Delhi-based Centre for Research on Cyber Intelligence and Digital Forensics, said, “AI voice imitating tools can mimic a person’s voice precisely by utilising data available in the public domain, such as social media recordings or even sales calls made by fraudsters. Creating a sense of urgency by fabricating a distressed situation in a foreign country adds to the effectiveness of these scams.”
