Updated News Around the World

AI-powered scams on the rise in India, 83% victims lose money: Report

The increasing use and acceptance of artificial intelligence (AI) tools have made it simpler to manipulate images, videos, and voices of acquaintances and relatives. Recently, news surfaced that cyber attackers are exploiting AI-based voice technology to deceive individuals. A recent study indicates that India has the highest share of victims, with 83% of Indian victims losing money to such fraudulent activities.

Fraudsters are taking advantage of AI to mimic the voices of distressed family members, and a considerable number of Indians are becoming victims of such scams. A McAfee report suggests that a majority (69%) of Indians are unable to distinguish between a genuine human voice and an AI-generated voice.

In addition, the report titled ‘The Artificial Imposter’ reveals that nearly half (47%) of Indian adults have either been a victim of or know someone who has fallen prey to some form of AI voice scam. This percentage is almost twice the global average (25%).

“AI technology is fueling a rise in online voice scams, with just three seconds of audio required to clone a person’s voice,” the report highlighted. The survey covered 7,054 people across seven countries, including India.

As per the findings of the McAfee study, a staggering 83% of Indian victims of AI voice scams reported monetary losses, with nearly half of them (48%) losing more than ₹50,000. McAfee CTO Steve Grobman highlighted that while artificial intelligence offers enormous possibilities, it can also be exploited by cybercriminals for malicious purposes. He added that the simplicity and accessibility of AI tools have enabled fraudsters to amplify their efforts and make their scams even more convincing.

Since each person’s voice is distinctive, it can be considered a biometric fingerprint that establishes credibility. However, 86% of Indian adults share their voice data online or through recorded notes at least once a week (on social media, in voice notes, etc.), a habit that has made voice cloning a potent weapon for cybercriminals.

According to McAfee, a majority (66%) of Indian survey participants admitted they would respond to a voicemail or voice note that appeared to come from a friend or family member in urgent need of money, especially if it supposedly came from their parent (46%), partner or spouse (34%), or child (12%). The report further disclosed that messages claiming the sender had been robbed (70%), been involved in a car accident (69%), lost their phone or wallet (65%), or needed assistance while travelling overseas (62%) were the most likely to provoke a response.

The proliferation of deepfakes and fake news has made people more cautious about the veracity of online content. The report indicated that 27% of Indian adults have lost faith in social media platforms, while 43% are apprehensive about the growing prevalence of disinformation or misinformation.

 
