new methods for fraudsters to deceive people. Scammers might use AI chatbots such as TerifAI, which enables voice cloning, to target unsuspecting victims. They may impersonate family members in a crisis, bosses requesting money transfers, or celebrities endorsing particular investments. The goal is to pressure the victim into acting, and AI voice cloning apps are readily available and simple to use. They need only a few seconds of audio to replicate someone's voice, and some lack adequate safeguards to prevent misuse. You should take voice cloning threats seriously and think twice about any suspicious voice call you receive.
A Consumer Reports investigation from March 2025 underscored the dangers of voice cloning apps. The nonprofit reviewed six apps offering AI voice cloning features and found that four of them do not adequately ensure that the person performing the cloning has the speaker's consent. The report identifies ElevenLabs, Speechify, PlayAI, and Lovo as the four apps that, as of March 2025, lacked a technical means of verifying speaker consent. Descript and Resemble AI were the two apps that made misuse harder, though Consumer Reports still found ways to circumvent their protections.
To evaluate these apps, Consumer Reports attempted to create voice clones from publicly available audio, the same scenario a scammer might exploit. A bad actor targeting someone could pull voice samples from their social media posts and feed that audio into a voice cloning service to produce a convincing replica. Some of the companies Consumer Reports examined said they implement protective measures, such as watermarking recordings or maintaining a database of known deepfakes, but those measures may prove inadequate.

