“A reasonably good clone can be created with under a minute of audio and some are claiming that even a few seconds may be enough.” Mom is wary of answering calls for fear her voice will be cloned for a future virtual kidnapping.
nice idea - but improbable
It’s much easier to take just about any girl’s voice and call hundreds of people than to spend the time making a single convincing call. Someone will “recognize” that voice.
Did this happen already?
The title is already about “improving kidnapping scam,”
so yes, that “kidnapping scam” seems to be a thing,
and scams generally work best when done low effort.
Fun fact: most of the time you don’t want to waste time on less-gullible people, so making your scam less obvious isn’t that useful.
Yes, the article is about a specific instance of it happening.
I think this might be a case where the generic “scams generally work best if done low effort” doesn’t apply, since to be successful, this sort of scam requires some specifics. The not-kidnapped daughter was away training for a ski race. Blasting “we kidnapped your daughter” to people whose daughter is sitting on the couch next to them or people without daughters doesn’t work at all.
The article mentions that victims lose an average of $11k in these scams, which suggests they work best when targeting people with some savings.
The simple solution here is to try to contact the supposedly kidnapped person before assuming they actually have been. Of course, the goal of scammers is to make you panic and stop thinking rationally. A deepfake could definitely help with that.