AI is fuelling a rise in online voice scams – with just three seconds of audio required to clone a person’s voice, study warns

AI technology is fuelling an explosion in voice cloning scams, experts have warned.

Fraudsters can now mimic a victim’s voice using just a three-second snippet of audio, often stolen from social media profiles.

The cloned voice is then used to phone a friend or family member, convincing them the victim is in trouble and urgently needs money.

One in four Britons say they or someone they know has been targeted by the scam, according to cybersecurity specialists McAfee.

The scam is so believable that the majority of those targeted admitted they lost money as a result, with around a third of victims losing more than £1,000.


A report by the firm said AI had already ‘changed the game for cybercriminals’, with the tools needed to carry out the scam freely available across the internet.

Experts, academics and bosses from across the tech industry are leading calls for tighter regulation over AI as they fear the sector is getting out of control.

US Vice President Kamala Harris is meeting today (Wednesday) with the chief executives of Google, Microsoft, and OpenAI, the firm behind ChatGPT, to discuss how to responsibly develop AI.

They will address the need for safeguards that can mitigate potential risks and emphasise the importance of ethical and trustworthy innovation, the White House said.

McAfee’s report, The Artificial Imposter, said cloning how somebody sounds had become a ‘powerful tool in the arsenal of a cybercriminal’ – and it’s not hard to find victims.

A survey of more than 1,000 UK adults found half shared their voice data online at least once a week, via social media or voice notes.

The investigation revealed more than a dozen AI voice-cloning tools openly available on the internet, with many free and only needing a basic level of expertise to use.

In one instance, just three seconds of audio was enough to produce an 85 per cent voice match, and the tools had no trouble replicating accents from around the world.

With everybody’s voice the spoken equivalent of a biometric fingerprint, 65 per cent of respondents admitted they were not confident they could tell a cloned voice from the real thing.
