AI voice cloning escalates kidnapping scams in US

Recent advances in artificial intelligence have given rise to a troubling new phone scam: fraudsters use AI to replicate the voices of victims' family members in fake kidnapping schemes.

This tactic has been particularly effective in convincing individuals to send money for the supposed safe return of their loved ones.

Highline Public Schools in Burien, Washington, alerted the community on September 25 that two individuals had been targeted by "scammers falsely claiming they kidnapped a family member."

"The scammers played [AI-generated] audio recording of the family member, then demanded a ransom," the school district stated.

The FBI has reported a nationwide increase in these scams, particularly affecting families who speak languages other than English.

"The scammers played [AI-generated] audio recording of the family member, then demanded a ransom," the school district stated.

In harrowing testimony before Congress, Arizona mother Jennifer DeStefano recounted how scammers used AI to convince her that her daughter had been kidnapped.

DeStefano described answering a call from an unknown number on a Friday afternoon and hearing her daughter Briana sobbing on the line. "At first, I thought nothing of it. … I casually asked her what happened," DeStefano recalled.

However, the call quickly escalated when a threatening man took over, demanding a $1 million ransom. "Listen here, I have your daughter... you call the cops, I am going to pump her stomach so full of drugs," he threatened.

The man continued to manipulate the situation while Briana's voice pleaded in the background.

Amid the chaos, DeStefano's husband found Briana safe at home, unaware that any scam was unfolding. "How could she be safe with her father and yet be in the possession of kidnappers? It was not making any sense," DeStefano wrote.

The ease with which scammers can replicate voices has raised concerns among experts.

Scammers employ two primary methods to obtain voice samples: harvesting voice data from unsolicited calls and extracting audio from videos posted publicly on social media.

Beenu Arora, CEO of cybersecurity firm Cyble, explained, "The intent is to get the right data through your voice... and this is becoming a lot more prominent now."

The National Institutes of Health (NIH) has issued guidelines for potential victims.

The agency advises treating ransom demands from unfamiliar numbers with caution and verifying any claims by contacting the supposed victim directly.

As AI technology evolves, the line between reality and manipulation blurs. Arora warns that as society becomes increasingly dependent on technology, identifying real versus fake will become more challenging.

"My humble advice … is that when you get such kind[s] of alarming messages, it's always better to stop and think before you go," he said.

Authorities urge anyone who believes they have fallen victim to these scams to contact their local police department. The rapid pace of AI development poses ongoing challenges for those working to prevent exploitation by bad actors.
