Fake video and audio streams that appear to be real can ruin the reputations of your executives and your company. They can cost you money. They can even cost you your job. Fortunately, there are steps you can take to protect yourself.
If you’re at all familiar with the term “deep fake,” you probably think of fake videos of celebrities or politicians, which are already being used in disinformation campaigns. In some cases these fake videos are part of extortion campaigns; in others they’re produced to discredit government leaders. But such fakes are also showing up in attempts to extract money from companies.
A deep fake is a media stream created with artificial intelligence by modifying existing material to produce a new stream that purports to show someone saying or doing something they didn’t actually say or do. The technology has been around for a while, but what’s changed is that these fakes are getting so good that it’s nearly impossible to tell the difference between something that’s real and something that’s not.
[Image: a frame from a fake video featuring former President Barack Obama]
A fake video is created by feeding the AI software a series of images, either still photos or real video, along with voice clips. The software uses those images, which show the subject in a variety of positions, to create a moving image of them that seems to play in real time. It also analyzes the subject’s voice recordings, allowing it to synthesize a new voice that sounds like the subject, including the subtle characteristics that lend credence to the fake.
In the past, the voices were supplied by actors impersonating the subject; now the AI generates a voice that’s far more realistic. A good example of a fake that uses a voice actor is the video of former President Barack Obama seemingly discussing such fakes.
“As AI gets more sophisticated, fakes will get more sophisticated,” said Ben Goodman, senior vice president of global business and corporate development at ForgeRock. Goodman said that AI systems are learning speech cadences and even a person’s gait.
“Audio deep fakes are a big threat,” said Adam Kujawa, director of Malwarebytes Labs. “They can be used for CEO fraud. If your company doesn’t have some confirmation of expenditures, you can fall victim.”
Voices and Fraud
By now most executives at large companies are aware of the type of CEO fraud in which a senior executive, frequently the CEO, appears to send an email to another person in the company requesting the transfer of a large sum of money. In the past, that money was transferred without question, resulting in a big loss when the request turned out to be fraudulent.
Because of this fraud, companies usually require a second means of authentication, typically a phone call with the person requesting the transfer. With voice fakes, the person transferring the money may find a message in their voicemail, perhaps showing the CEO’s Caller ID, verbally authorizing the transfer. In reality, the voice is a fake and the phone number is spoofed.
“Those kinds of things can put a company out of business through reputation damage,” said Chris Kennedy, CISO and VP of customer success at AttackIQ. “We’re hitting the tipping point in which technology is taking advantage of the biggest human weakness: we’re over-trusting.”
Audio deep fakes can be used in a variety of fraudulent activities beyond CEO fraud. Such fakes can be used to subvert the procurement process and even to disrupt business relationships.
But video deep fakes can also be used to disrupt business in another way. Suppose, for example, your CEO shows up on a video saying something that can cause your company’s stock value to swing wildly. By the time your PR staff can put out the word that the video is a fake, a ruthless investor could have used your temporary misfortune to cash in on those wild stock swings or cause other damage to your company’s reputation.
Dealing with Deep Fakes
How you handle these fakes depends on exactly what type of activity you’re encountering. For attempted fraud, you need a means of authentication that you actually control, which means creating a company policy intended to prevent CEO or procurement fraud.
One example of how this might work is to require that the person making the transfer initiate the authentication, calling the CEO’s personal cell phone at a number already on file in the company records. Because Caller ID can be spoofed, and because you can’t trust someone who calls you, you can’t accept any inbound call as a means of authentication.
This policy must be accompanied by one that protects the person who insists on authentication: your employee manual must state explicitly that employees can’t be punished for following the rule in good faith.
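The callback rule above can be expressed as a simple check. The following is a minimal sketch; the directory, role names, and function names are all hypothetical, not part of any real system:

```python
# Hypothetical directory of phone numbers recorded in company records.
# These are looked up internally, never taken from the request itself.
ON_FILE_NUMBERS = {
    "ceo": "+1-555-0100",
}

def verification_number(requester_role, inbound_caller_id=None):
    """Return the number to dial to verify a transfer request.

    The inbound Caller ID is deliberately ignored: Caller ID can be
    spoofed, so only the number already on file counts.
    """
    if requester_role not in ON_FILE_NUMBERS:
        raise ValueError("no on-file number for role: " + requester_role)
    return ON_FILE_NUMBERS[requester_role]
```

The point of the sketch is that the verifier always places an outbound call to the on-file number, no matter what number the inbound message appears to come from.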
The same is true with procurement fraud. There, the most effective policy is to simply not accept orders over the phone. If you can’t do that, then set a limit above which you must make an authentication call using a number you already have.
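That threshold rule can be sketched in a few lines. The limit value and names here are illustrative assumptions, chosen only to show the shape of the policy:

```python
# Illustrative limit; pick a value that fits your own risk tolerance.
VERIFICATION_LIMIT = 10_000  # dollars

def requires_callback(order_amount):
    """True if a phone order must be confirmed by an outbound call
    to a number already on file before it can be accepted."""
    return order_amount > VERIFICATION_LIMIT
```

Orders at or below the limit proceed normally; anything above it is held until the callback is completed.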
Unfortunately, there’s not much anyone can do right now to prevent someone, perhaps a disgruntled former employee, from creating a fake video and then releasing it on the internet. For example, such a video could have your company CEO announcing a major financial loss, or perhaps a termination of a line of business. Such an announcement could have a significant effect on your stock prices.
Likewise, a fake video announcing the termination of a partnership with another company could result in real damage unless the partner company could be convinced that the announcement was a fake.
The only way to get a handle on this is first to be able to prove the video is a fake, and second to have your communications department spread the word as quickly as possible.
One way to determine that such a video is a fake is to use the services of Deeptrace, a company with the tools required to show that a video is a fake.
But once you’ve got the proof, your communications department needs to already have a plan in place for exposing the false nature of the video.
Don’t Trust Anyone
Meanwhile, you have to teach your staff that they can’t trust anyone without authentication. So while you may be able to trust instructions given in person, that trust must end when it comes to emails and phone calls.
More About Deep Fakes
For a look at some deep fakes, here are a few links to YouTube videos that show how good such fakes can be:
Hillary Clinton’s face placed on Saturday Night Live’s Kate McKinnon
Sylvester Stallone’s face on Arnold Schwarzenegger’s body in a Terminator 2 clip
A detailed explanation of the technology from Australia’s ABC network, “Behind the News”