As artificial intelligence (AI) voice cloning scams become more prevalent, cybersecurity experts are warning cell phone users to change their voicemail greetings.
AI is on the rise, and scammers have taken advantage of new tools, finding ways to clone voices to run convincing scams or even gain access to accounts protected by voice authentication.
Why It Matters
Americans lose billions of dollars each year to scammers. According to the Federal Trade Commission, roughly $8.8 billion was lost through fraud in 2022 alone.
AI-fueled voice cloning scams have skyrocketed in recent years, with fraudsters using voice-mimicking technology to gain access to users’ accounts or convince others to hand over money.
What To Know
Just a few minutes of voice audio is all some scammers need to clone your voice and create fake audio with the help of generative AI.
Generative AI can clone both human writing and voices, letting scammers unlock access to some users’ accounts or trick victims into giving money while posing as someone else.
To steal from unsuspecting victims, a criminal will sometimes place a phone call using a cloned family member’s voice and claim that relative has been kidnapped or arrested and urgently needs money.
While former President Joe Biden placed some restrictions on AI through a 2023 executive order, President Donald Trump revoked that order upon taking office.
To reduce your risk, cybersecurity experts recommend deleting any voicemail greeting recorded in your real voice and replacing it with an automated message.
What People Are Saying
Lucas Hansen, co-founder of the non-profit CivAI, told Newsweek: “Voice authentication is, and always has been, a deeply flawed security practice. It is a useful supplement to other authentication methods but should never be solely relied upon.
“AI voice cloning is just the final nail in the coffin. While removing personalized voicemail greetings is a good precaution, it is still important to contact your bank and request that they require stronger multi-factor authentication, such as app-based codes or long unique PINs, rather than voice authentication alone.”
Truman Kain, offensive security researcher at Huntress, told Newsweek: “Most accounts aren’t being accessed by voice verification directly, but voice cloning is absolutely being used in scams. We’re seeing it in business, where attackers impersonate CEOs or other executives to reroute wire transfers or request gift cards, and outside of work in, for example, the classic grandparent scam.”
Nati Tal, Head of Guardio Labs at Guardio, told Newsweek: “With this technology, scammers can impersonate you or loved ones to trick people into divulging sensitive information or transferring money. As AI advances, it’s becoming easier to replicate voices with high accuracy, which makes traditional methods of voice-based authentication risky. Always be cautious if a familiar voice asks for unusual requests or personal details, and verify the call independently if in doubt.”
What Happens Next
Experts are encouraging people to change their voicemail greetings, which could otherwise give scammers enough audio of their voices to clone.
“Any images, videos, or audio of you out there should be considered ammunition for attackers,” Kain said. “The smart move here is to limit what you’re sharing publicly and lock down those privacy settings on social media. That way, when a suspicious call or message comes in, your loved ones aren’t left guessing. Remember, phone numbers can be spoofed! The call will look like it came from your phone, and the caller will sound like you.”