Inside the Rise of AI Voice-Cloning Scams Targeting Families Across America
Robin and Steve, a couple in Brooklyn, experienced a terrifying scam when they received a call from a supposed kidnapper claiming to have Robin’s mother-in-law at gunpoint. As reported at newyorker.com, the scammer demanded money via Venmo and threatened violence if they didn’t comply. Panicked, they sent the requested funds, only to learn later that it was a hoax. The incident highlights the rise of AI-driven voice-cloning technology, which lets scammers impersonate loved ones convincingly and extract money through fear.
The scam began with a late-night call that seemed to come from Robin’s mother-in-law, Mona, who sounded distressed and said she was in danger. The phone was then handed to a man who claimed to be holding her at gunpoint, demanded payment, and threatened violence if they contacted the authorities. Steve, a law enforcement officer, tried to negotiate while reaching out to a colleague for advice, buying time in the hope of confirming Mona’s safety.
Despite their efforts to negotiate, the scammer insisted on payment and even patched in a female voice to heighten the pressure. Fearing for Mona’s safety, Robin and Steve eventually sent the money through Venmo. When they finally reached Mona, they discovered she was safe and that they had fallen victim to a sophisticated scam built on an AI-generated voice.
The incident underscores the dangers of voice-cloning technology, which has advanced significantly in recent years, making it difficult to distinguish real voices from synthetic ones. While voice cloning has positive applications, such as preserving the voices of people with speech impairments, it has also become a tool for fraudsters who exploit emotions to extract money through deception.
As authorities grapple with how to regulate voice-cloning technology and combat these scams, victims like Robin and Steve stress the importance of vigilance and of safeguards, such as a family password, to verify the authenticity of calls. Even so, the prevalence of voice-cloning scams continues to challenge law enforcement and regulators seeking to protect the public from financial exploitation and emotional distress.
read more at newyorker.com