Voice deepfakes: beware of AI software that mimics the voices of your loved ones
With the help of artificial intelligence software, criminals can now clone a voice from a three-second sound clip. In Great Britain, according to a survey, 28% of respondents say they have been victims of a voice deepfake. We give you some tips to protect yourself.
We never stop talking about artificial intelligence, exploited with both benevolent and malicious intentions. A few months ago, we talked about romance scams and in particular the case of the fake Brad Pitt.
This time, we look specifically at scams that imitate the voice of a loved one using technology. By retrieving video or voice recordings of three seconds or more from social networks, scammers can now reproduce a voice using artificial intelligence tools. They then imitate that person's voice in a call or voice message, pretending to be in danger and asking their loved ones for money.
Starling Bank revealed the results of a survey last year: 28% of the 3,010 Britons surveyed said they had been the target of a voice cloning scam at least once in the past year. What's more, 46% of respondents had never even heard of such scams... Even more worrying: 8% of those surveyed said that even if they had doubts about the origin of the call, they would still send money in such a situation.
To deal with voice deepfakes, Starling Bank has launched the Safe Phrases campaign, which invites users to agree on a "security phrase" with their loved ones so that, if in doubt, they can verify whether the caller is who they claim to be.
Tips for recognizing a voice deepfake
There are, however, some vocal characteristics typical of artificial intelligence that can help you spot a fraud: flat or jerky intonation, abrupt pauses in the middle of a sentence, or irregular breathing noises. So listen carefully!
If the sound is accompanied by an image, check the coherence between the two, in particular the movement of the lips. Also look at details such as the person's fingers (AI often forgets or adds some...) or odd elements in the background...
If you receive an urgent call asking for money, hang up and call the person back on their usual number to check directly whether it is really them.
And as always, don't rush! Fear, stress and anger push you to act too quickly, without thinking, and scammers know it...