Cybercriminals can steal your voice in three seconds


Monday, 08 May, 2023


A new report from McAfee illustrates how AI is fuelling a rise in online voice scams. The company surveyed 1004 Australians and performed in-depth analysis on the rise of AI voice-cloning technology and its use by cybercriminals.

The Artificial Imposter report found that 61% of Australians lack the confidence to tell the difference between an AI voice and a real voice. This is a major concern as scammers are increasingly using AI to replicate the voices of friends and family members to steal money from everyday Aussies.

The ACCC’s recent 2022 Targeting Scams Report found that scams in general are on the rise, with Australians losing at least $3.1 billion in 2022, an 80% increase on total losses recorded in 2021.

With the rise in popularity and adoption of AI tools, it is easier than ever to manipulate images, videos and, perhaps most disturbingly, the voices of friends and family members. McAfee’s research reveals scammers are using AI technology to clone voices and then send a fake voicemail or voice note to the victim’s contacts pretending to be in distress.

As the technology becomes more sophisticated, all it now takes is a three-second voice snippet to mimic the voice of a family member or friend and produce a believable scam.

According to the report, 43% of adults share their voice data online or in recorded notes at least once a week (via social media, voice notes and more), which is making cloning someone’s voice a powerful tool in the arsenal of a cybercriminal.

As social media consumption soars, so does the likelihood that Australians are unknowingly handing out ammunition to cybercriminals. This puts not only themselves but family and friends at risk, and the financial impact is starting to show, with 62% of Aussies having either lost, or known someone who has lost, between $500 and $5000 because of an AI voice scam.

“It’s clear that advanced artificial intelligence tools have already changed the game for cybercriminals," said Tyler McGee, Head of APAC at McAfee.

“Now, with very little effort, they can clone a person’s voice and deceive a close contact into sending money.”

Young people most at risk of being cloned

Despite being typically seen as the most technologically savvy generation, McAfee’s report found that Gen Zers are the least responsible and most at risk when it comes to their voice being cloned. 62% of those aged 18–29 admitted to sharing their voice online at least once per week, ahead of all other age groups. To make matters worse, 66% of 18- to 29-year-olds said they are not concerned about the rise of misinformation or disinformation found online.

Perhaps unsurprisingly, young people are most likely to fall for scams from a parent (46%) and in the following circumstances:

  • A loved one or friend has car issues, their phone battery died and they need money to get home (57%).
  • A family member or friend has been robbed and asks you to transfer money to help (57%).
  • A friend or family member has lost their bank card and asks for money (47%).
  • A family member or friend needs money while travelling abroad (49%).

Overall, 41% of all Australians said they would respond and share money based on a voicemail or voice note purporting to be from a friend or loved one in distress.

Older generations still unaware of AI threat

Despite being more responsible with their online activity, Australians aged 50–65 remain a key target for scammers. Almost half (47%) of those aged between 50 and 65 are completely unaware of the existence of these types of voice-cloning or AI scams.

Despite this lack of awareness of new cybersecurity threats on the horizon, the ACCC’s 2022 Targeting Scams Report revealed that almost $100 million was lost to cyber scams in that age group.

Criminals need only limited expertise and just seconds of audio

As part of its review and assessment of this new trend, McAfee Labs’ security researchers investigated the accessibility, ease of use and efficacy of AI voice-cloning tools, with the team finding more than a dozen freely available on the internet.

Both free and paid tools are available, with many requiring only a basic level of experience and expertise to use. In one instance, just three seconds of audio was enough to produce an 85% match*, but with more investment and effort it’s possible to increase the accuracy. By training the data models, McAfee researchers were able to achieve a 95% voice match based on just a small number of video files.

The researchers produced multiple AI-generated voice clones based on an Australian accent and achieved a high confidence rating by using freely available tools to refine the accent and make it more believable.

The more accurate the clone, the better chance a cybercriminal has of duping somebody into handing over their money, and with these hoaxes based on exploiting the emotional vulnerabilities inherent in close relationships, a scammer could net thousands of dollars in just a few hours.

The overriding feeling among the research team, though, was that artificial intelligence has already changed the game for cybercriminals. The barrier to entry has never been lower, which means it has never been easier to commit cybercrime.

McGee continued, “It’s important to remain vigilant and to take proactive steps to keep you and your loved ones safe. Should you receive a call from your spouse or a family member in distress and asking for money, verify the caller — use a codeword, or ask a question only they would know. Identity and privacy protection services will also help limit the digital footprint of personal information that a criminal can use to develop a compelling narrative when creating a voice clone.”

How to protect yourself from AI voice cloning

  • Set a verbal ‘codeword’ with kids, family members or trusted close friends that only they could know. Make a plan to always ask for it if they call, text or email to ask for help, particularly if they’re older or more vulnerable.
  • Always question the source — If it’s a call, text or email from an unknown sender, or even if it’s from a number you recognise, stop, pause and think. Does that really sound like them and something they might ask? Hang up and call the person directly or try to verify the information before responding and certainly before sending money.
  • Think before you click and share — Who is in your social media network? Do you really know and trust them? Be thoughtful about the friends and connections you have online. The wider your connections and the more you share, the more risk you may be opening yourself up to that your identity may be cloned for malicious purposes.
  • Identity theft protection services can help make sure your personally identifiable information is not accessible or notify you if it makes its way to the Dark Web. Take control of your personal data to avoid a cybercriminal being able to pose as you.

The full survey data, including results broken down by country, is available in McAfee’s full report.

*Based on the benchmarking and assessment of McAfee security researchers
