Deep Fakes are a serious problem. If you’re not familiar with the term, a Deep Fake is a photo or video manipulated with advanced techniques so that one person’s likeness digitally replaces another’s.
Want to make it look like a wholesome actress has a porn video on the web? A Deep Fake photo or video will get the job done.
Want to make it look like a politician you can’t stand said something completely hypocritical? Commission a Deep Fake photo or video to make it look like the politician in question said whatever you want him or her to say.
Or, on the financial front, if you want to tank a rival company’s stock price, you can make a Deep Fake video of the company’s CEO announcing a disastrous course of action for the company.
These are just a few of the problems of Deep Fakes, and the underground industry that’s producing this content is still in its infancy. So in the months and years ahead, we can expect to see much more of this type of thing, and in increasingly advanced forms.
Worst of all, people tend to believe the evidence in front of their eyes, so once a Deep Fake photo or video starts making the rounds, it spreads like wildfire and is quickly accepted as the truth. After all, what could be more damning than actual video footage of a given event?
Except, of course, that Deep Fake specialists make a trade of inventing fictions out of thin air and then building photos or videos to support whatever story they want to push. It’s incredibly dangerous, and several companies, Adobe included, are working hard to come up with ways to spot Deep Fakes.
Adobe Combats Deep Fake Photos and Video
Adobe recently rolled out a new content attribution tool in Photoshop, designed specifically to combat Deep Fake photos and the damage they can do. While it’s still in beta, it represents one of the first tangible steps big tech companies are taking to fight a war they didn’t even know they were in. Kudos to Adobe for their work so far.
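The core idea behind content attribution is tamper evidence: a creator attaches cryptographically verifiable data to a piece of media when it’s published, so any later alteration can be detected. As a rough illustration only (this is not Adobe’s actual implementation, and real systems use signed provenance metadata rather than a shared secret), here is a minimal sketch of the concept using a keyed hash:

```python
import hashlib
import hmac

# Hypothetical demo key. A real content-attribution system would use
# public-key signatures, not a shared secret like this.
SECRET_KEY = b"publisher-private-secret"

def sign_content(content: bytes) -> str:
    """Produce a tamper-evident tag for a piece of media."""
    return hmac.new(SECRET_KEY, content, hashlib.sha256).hexdigest()

def verify_content(content: bytes, tag: str) -> bool:
    """Check that the media has not been altered since it was signed."""
    expected = sign_content(content)
    return hmac.compare_digest(expected, tag)

original = b"\x89PNG...original image bytes..."
tag = sign_content(original)

print(verify_content(original, tag))         # unmodified content verifies: True
print(verify_content(original + b"x", tag))  # any alteration fails: False
```

Even a one-byte change to the media produces a completely different hash, which is why this kind of scheme can flag a doctored image even when the edit is invisible to the human eye.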