Monday, December 23, 2024

Social media videos at risk: Scammers using AI to clone voices and defraud


Research reveals that 28% of people were targeted by AI voice cloning scams in the past year; experts advise using safe phrases to combat the rising threat

Consumers are being warned about a growing fraud tactic where scammers exploit social media videos to clone voices using artificial intelligence (AI). This technique allows fraudsters to mimic an individual’s voice and use it to deceive their friends and family into sending money under false pretences.

According to research released by Starling Bank, 28% of individuals have encountered AI voice cloning scams in the past year. Despite this, 46% of people are unaware of such scams, and 8% admitted they would likely send money if requested by someone they believed to be a loved one, even if the request seemed unusual.


Lisa Grahame, Chief Information Security Officer at Starling Bank, highlighted the vulnerability posed by the widespread sharing of personal content online. “People regularly post content online which includes their voice recordings, without considering that this makes them more susceptible to fraud,” Grahame said.

Scammers need only a few seconds of audio from a person’s social media videos to replicate their voice. They then use this cloned voice to make phone calls or leave voicemails asking for urgent financial assistance. This alarming trend has prompted Starling Bank to recommend that individuals establish a “safe phrase” with their close contacts. This simple precaution could help verify the authenticity of unexpected requests for money.

“Scammers only need three seconds of audio to clone your voice, but it would take only a few minutes to set up a safe phrase with your family and friends to protect yourself,” Grahame advised. She emphasised the importance of being aware of such scams and taking steps to safeguard oneself and loved ones.

However, safe phrases alone may not always be foolproof. To further protect against fraud, individuals are encouraged to verify suspicious calls or messages by directly contacting the purported sender using known and trusted communication channels or by calling 159 to speak with their bank.

In January, the UK’s cybersecurity agency noted that AI’s advancements are making it increasingly difficult to identify phishing messages, where users are tricked into revealing sensitive information. These sophisticated scams have even managed to deceive large corporations. For instance, in February, Hong Kong police investigated a case where an employee at a company was tricked into transferring HK$200 million (£20 million) to fraudsters who used deepfake technology to impersonate senior executives in a video conference.

Lord Hanson, the Home Office Minister responsible for fraud, underscored the dual nature of AI’s impact. “AI presents incredible opportunities for industry, society, and governments, but we must remain vigilant against its potential misuse, including in AI-enabled fraud,” he said.

Analysis:

Political:

The emergence of AI-enabled scams highlights the urgent need for updated regulatory frameworks to address new forms of cybercrime. Governments and policymakers must adapt to the evolving threats posed by AI technologies. The focus on AI fraud underscores the necessity for political action to enhance cybersecurity measures and protect both individuals and businesses from sophisticated scams.

Social:

The social implications of AI voice cloning scams are significant, affecting how people interact and communicate online. The ease with which scammers can exploit social media content reflects a broader issue of digital privacy and security. Public awareness campaigns and educational efforts are crucial in helping individuals understand and mitigate the risks associated with sharing personal information online.

Economic:

Economically, the rise of AI-driven scams poses a threat to both individual finances and business operations. The substantial financial losses reported in high-profile cases, such as the HK$200 million fraud, illustrate the significant economic impact of such scams. The growing sophistication of these frauds necessitates investment in advanced security technologies and preventive measures to protect financial assets and maintain economic stability.
