Audio deepfakes: What they are, and the risks they present

Audio deepfakes are becoming a serious problem. Recent cybercriminal campaigns use voice cloning technology to replicate the speech tone and patterns of celebrities such as Elon Musk, Mr. Beast, Tiger Woods, and others, and use these cloned voices to endorse fake contests, gambling schemes, and investment opportunities.

In this Help Net Security video, Bogdan Botezatu, Director of Threat Research and Reporting at Bitdefender, discusses the growing trend of celebrity audio deepfakes.
