From the course: The Cybersecurity Threat Landscape

Explore the threat of deepfakes

- You've probably seen some pretty convincing CGI and special effects in recent science fiction shows and movies. Movie magic has been used to bring back actors who are no longer with us or to make older actors look much younger. Welcome to the world of deepfakes. Deepfakes are created using AI-based software to mimic people's voices and even their faces. Outside of Hollywood, criminals are starting to use deepfakes to make social engineering cyberattacks more convincing.

Deepfake voice attacks have been reported in which the cloned voice of a company's CEO or another executive was used to commit payment fraud. The cloned voice tricks employees into making large payments or into changing the payment process so that funds are sent to a scammer's bank account. These attacks can be used in conjunction with business email compromise, or BEC, attacks, which I describe in another video. For instance, an email spoofed to look like it came from your CEO is sent to someone in your finance department, requesting that an urgent payment be processed for an important business transaction. That's followed up with a phone call using deepfake audio of your CEO's voice, asking the finance person to quickly make the payment referred to in the email. As you can imagine, this combination attack can be very persuasive.

Deepfake audio applications that can do this type of voice cloning are easy to find, and the results can be hard to detect as fake, even for people who know the deepfaked person well. All the attackers need is a good amount of audio of the target person speaking, and in the age of podcasting, it's not hard to find interviews with high-profile people like your company's CEO. As deepfake technology rapidly develops, we can expect it to grow into an increasingly dangerous threat in the cybersecurity landscape.