
Mother got a scam call saying her daughter had been kidnapped and she'd have to pay a ransom. The 'kidnapper' cloned the daughter's voice using AI


  • A scammer reportedly used AI to clone a girl's voice in an attempt to get money from her mother.
  • The scammer pretended that he had kidnapped the 15-year-old using fake audio, Arizona's Family reported.
  • An FBI agent told the outlet that scammers often get details of their victims from their public social media accounts.
A mother reportedly got a call from a scammer who cloned her daughter's voice to pretend he'd kidnapped her.

Jennifer DeStefano from Scottsdale, Arizona, told Arizona's Family – a local news station affiliated with CBS – that she received a call from a number listed as unknown. When she answered it, she could hear her 15-year-old daughter crying and saying things like, "mom, I messed up." Her daughter was out of town on a ski trip at the time, she said.



DeStefano said that a man's voice then came on the line, telling her daughter to put her head back and lie down.

"Listen here, I've got your daughter," the scammer reportedly said, according to DeStefano's account of the call. In the background, her daughter pleaded for DeStefano's help.

The voice "100%" belonged to her daughter, she told the local news site. "It was completely her voice. It was her inflection. It was the way she would have cried. I never doubted for one second it was her," she added.



DeStefano said she was asked for money. The scammer first demanded $1 million; after she said she didn't have that much, he lowered the demand to $50,000.

In a Facebook post in January detailing the call, DeStefano wrote: "The 'kidnapper' was demanding $50k in cash and was arranging how he was going to pick me up with the money and bag my head ... I was warned I better have the money or we would both be dead."

Insider reached out to DeStefano on Facebook but did not immediately hear back.

At the time, DeStefano was on her way to her other daughter's dance studio. One of the other moms there called DeStefano's husband, who said that their 15-year-old daughter was actually safe and with him, DeStefano wrote on Facebook.

Companies have been developing AI software that can clone people's voices with various use cases.

A platform developed by AI startup ElevenLabs, for example, allows users to create text-to-speech voiceovers and foreign-language audio dubs that can be used for content like YouTube videos and audiobooks. But trolls quickly started to use the technology to mimic the voices of celebrities including Joe Rogan, Ben Shapiro, and Emma Watson. AI-generated audio of President Joe Biden and former President Donald Trump playing video games has gone viral on TikTok, too.

Though voice-cloning software is often created with good intentions, some malicious actors have used it in attempts to scam people by mimicking the voices of their relatives. A man told The Washington Post that someone claiming to be a lawyer had scammed his parents out of $21,000 by claiming he'd been jailed for killing a person in a car accident and playing them an AI-generated clone of his voice.

Speaking about scams made using AI voice clones, Dan Mayo, the assistant special agent in charge of the FBI's Phoenix office, told Arizona's Family that scammers often get details about their victims from their social media accounts and often call from unfamiliar area codes or even from overseas. Scams of this sort happen on a daily basis, he said.

DeStefano clarified on Facebook this week that her daughter does not have any public social media accounts but said there are public interviews from her school that include a sampling of her voice.




Another article: Terrifying new AI kidnapping scam used teen girl's voice to demand $1m

Source: https://www.businessinsider.com/ai-scam-voice-clone-fake-kidnap-call-mother-money-ransom-2023-4