Fake News: The Worst Is Yet To Come?

If you’re concerned about fake news, it may only get worse.

We’ve already had examples of what clever technology can do — a video appeared on the Internet in May in which Donald Trump apparently offers advice to the people of Belgium on climate change.  (It looks pretty artificial to me, but many took it seriously.)

Cheesy or not, such video manipulations have become simple to create.  In 2014, a graduate student named Ian Goodfellow invented a machine-learning technique called a “generative adversarial network,” or GAN.  Basically, it pits two neural networks against each other to generate new data out of an existing data set — for example, examining a thousand photos, then producing a new one that approximates the thousand without being an exact copy of any of them.  GANs can do the same with audio and text data sets.
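To make the “adversarial” part concrete, here is a toy sketch of the idea — my own illustration in plain Python, not code from Goodfellow’s paper.  A tiny “generator” (an affine map of random noise) tries to produce numbers that look like they came from a real data set (samples from a normal distribution centered at 4), while a tiny logistic “discriminator” tries to tell real from fake; each nudges its parameters against the other.

```python
import math
import random

random.seed(0)

def sigmoid(s):
    return 1.0 / (1.0 + math.exp(-s))

# Toy 1-D "GAN": real data ~ N(4, 1); generator g(z) = a*z + b with
# noise z ~ N(0, 1); discriminator D(x) = sigmoid(w*x + c).
# Both sides are updated adversarially with hand-derived gradients.
a, b = 1.0, 0.0      # generator parameters
w, c = 0.0, 0.0      # discriminator parameters
lr = 0.05

for step in range(2000):
    x_real = random.gauss(4.0, 1.0)
    z = random.gauss(0.0, 1.0)
    x_fake = a * z + b

    # Discriminator step: push D(real) toward 1 and D(fake) toward 0.
    g_real = sigmoid(w * x_real + c) - 1.0   # d(-log D)/ds at the real sample
    g_fake = sigmoid(w * x_fake + c)         # d(-log(1-D))/ds at the fake sample
    w -= lr * (g_real * x_real + g_fake * x_fake)
    c -= lr * (g_real + g_fake)

    # Generator step: adjust (a, b) so the fake sample fools D.
    g_gen = sigmoid(w * x_fake + c) - 1.0    # d(-log D)/ds at the fake sample
    a -= lr * g_gen * w * z
    b -= lr * g_gen * w

samples = [a * random.gauss(0.0, 1.0) + b for _ in range(1000)]
mean = sum(samples) / len(samples)
print(f"mean of generated samples: {mean:.2f} (real data mean is 4.0)")
```

Real GANs replace these two one-parameter-deep players with deep neural networks trained on images, audio, or text, but the tug-of-war is the same: the forger improves exactly as fast as the detective does.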

This technology seemed relatively benign until 2017, when a Reddit user calling himself “Deepfakes” started posting pornographic videos with celebrities’ faces superimposed onto the bodies of porn actresses.  The practice quickly became known as “deep fakes.”  Reddit banned such videos, of course, but the secret was out.

Meanwhile, technology marches on — a team of researchers affiliated with Germany’s Max Planck Institute for Informatics recently unveiled a method of producing “deep video portraits,” in which one person controls another person’s face and speech.  Granted, this might be a very useful tool for dubbing movies into foreign languages, but still…  The possibilities for misuse are limited only by imagination.

So we, the public, must be more skeptical than ever.  Remember what Abraham Lincoln reportedly said — “Don’t believe everything you read on the Internet.”  (If you don’t believe me, I’m sure the video is around somewhere.)

For a more detailed explanation, see “You thought fake news was bad?  Deep fakes are where truth goes to die” by Oscar Schwartz at https://www.theguardian.com/technology/2018/nov/12/deep-fakes-fake-news-truth.  The photo came from that site.

