Dead relatives, healers and AI: how the Speechify app is being used to scam people in South Africa

Speechify is an application originally designed to help people consume written content by listening to it. Launched in 2016 by Cliff Weitzman while he was still a university student, it has gained features ever since, many of them recently tied to generative artificial intelligence. One of these is the ability to clone a voice from an audio clip and make it sound natural, a feature that supposed healers in South Africa are now exploiting to deceive their victims.

According to El País, South African police have warned of the growing use of Speechify by scammers posing as healers, some with a significant presence on social media, who promise vulnerable people the chance to talk to a deceased loved one in exchange for money.

The newspaper reports an archetypal case of this deception: that of a Johannesburg man who had been out of work for ten years and whose desperation led him into the trap. Pearce Banjolo consulted one of these false healers, who assured him that he had to invoke the spirit of his grandmother if he did not want to reach 60 without a job or a family.

After gathering the equivalent of 500 euros and transferring it to the supposed healer, he went to a hut where the healer sat behind a curtain, since the ritual demanded that Banjolo not see him.

The scammer appeared to go into a trance, and then the voice of the victim's deceased grandmother began to sound, telling him how he could appease her spirit. “The voice had been reproduced with an AI program and played through a speaker. There was no sangoma (healer) behind the curtain. I cried uncontrollably. I had been robbed, I was in debt to my brothers and I was still unemployed,” Banjolo told El País.

According to Colonel Brian Malope, a forensic detective with the South African Police Service's serious fraud squad, these scams have become popular in South Africa over the last couple of years. To clone the voice, the victim is asked to send an audio clip in which the deceased person can be heard; if that is not possible, the scammers record the victim's own voice and then modify it with Speechify to make them believe it is that of the deceased.

Speechify is not the only AI tool capable of cloning voices, but it is one of the most accessible and can be used from a computer or an Android or iOS phone. Last year, the Southern African Fraud Prevention Service (SAFPS) reported that phishing attacks increased by 264% during the first five months of 2023 compared with 2021. The Speechify cases are another example of this problem.