Artificial intelligence is transforming the way we communicate, work and create content. But it is also generating new risks to privacy and personal safety. One of the most alarming is AI voice cloning, a technique that can precisely replicate the way a person speaks from just a few seconds of recording.
Pau Garcia-Milà, a technology entrepreneur and science communicator, explains this in a video posted on his Instagram account, where he warns how easy it is today to clone a voice without the victim knowing: “With just a few seconds of your voice, anyone can make you say whatever they want thanks to artificial intelligence,” he says.
This type of technology, known as ‘voice cloning’, has already been used in telephone scams, bank fraud and manipulated audio montages. The risks soar if you are a person with high public exposure, if you have a podcast, create videos or even if you frequently send voice notes on social networks.
Three keys to protect your vocal identity
Given this growing threat, Garcia-Milà offers three simple recommendations to reduce the risk:
- Avoid exposing your voice unnecessarily: Do not share long or public audio clips on social networks unless it is essential. Every second of audio helps anyone trying to replicate your voice.
- Distrust unexpected calls: If you receive a call from an unknown number claiming to be someone you know, verify through another channel before acting.
- If you have visibility, protect yourself better: If you are a content creator or have significant media exposure, consider using mild voice-distortion tools or even secure synthesizers that prevent direct cloning.
What if your voice has already been cloned?
One of this expert’s most important recommendations is to establish a secret word with your loved ones or trusted people. The objective: use it as a private code whenever a sensitive action must be authorized, such as a money transfer or an urgent request made over the phone. If someone who sounds like you calls and cannot say the correct key word, they will know it is not you.
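The secret-word idea is, in essence, a shared-secret check like those used in software authentication. As a purely illustrative sketch (all names and values here are hypothetical, not part of any product mentioned in the article), this is how such a verification could be implemented so the word itself is never stored in plain text:

```python
import hmac
import hashlib

# Illustrative sketch: the "secret word" agreed with family acts as a
# shared secret. Store only a salted hash, not the word itself, so a
# leaked note or device does not reveal it.

def hash_word(word: str, salt: bytes) -> bytes:
    # Normalize so "Mango " and "mango" are treated the same.
    normalized = word.strip().lower().encode("utf-8")
    return hashlib.pbkdf2_hmac("sha256", normalized, salt, 100_000)

SALT = b"demo-salt"                    # hypothetical fixed salt for the example
STORED = hash_word("mango", SALT)      # word agreed in advance ("mango" is made up)

def caller_is_verified(spoken_word: str) -> bool:
    # hmac.compare_digest performs a constant-time comparison.
    return hmac.compare_digest(hash_word(spoken_word, SALT), STORED)
```

The same principle applies when the check is done verbally over the phone: the word is only useful if it was agreed privately in advance and is never said in public recordings.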
“We are used to protecting our passwords, but we still don’t understand that our voice can also be stolen,” says Garcia-Milà.