Think You Can Believe Your Eyes? Think Again.
By MS. BETTY NYLUND BARR, STAFF WRITER
We live in an age when we are inundated with news from various media. Most of us are likely so busy with our own lives and responsibilities that we are lucky if we can keep up with what is going on in the world. Now, thanks to the rapidly growing field of artificial intelligence, or AI, we cannot even take the videos and photos we see at face value.
Enter deepfakes.
According to the Merriam-Webster online dictionary, a deepfake is “an image or recording that has been convincingly altered and manipulated to misrepresent someone as doing or saying something that was not actually done or said.” Although you could use the technology to send a birthday greeting to your friend from “President Biden” or to try on clothes virtually before you buy them, bad actors are using deepfakes for fraud, blackmail, theft, and defamation.
Deepfakes appeared on the scene in 2017. Reputable news sources such as the BBC and The New York Times warned that such misinformation threatened to destabilize society. That threat did not materialize immediately, but the proliferation of deepfakes is now well underway.
How are people making deepfake videos? According to Matt Groh of the Massachusetts Institute of Technology (MIT) Media Lab, the creator uses a facial recognition algorithm and a variational autoencoder (VAE) to analyze the structure of someone’s face.1 Using machine learning, the creator then trains a generative adversarial network (GAN), in which two networks compete with each other. The first network, the generator, creates the counterfeit photo, video, or audio. The second network, the discriminator, tries to identify the counterfeit, and its feedback is used to adjust the forgery. Over what may be thousands or millions of training iterations, the forgery becomes more and more refined until it is virtually indistinguishable from the real thing.2
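To make the generator-versus-discriminator competition concrete, here is a minimal sketch of a GAN training loop. It is an illustration only, not the production pipeline Groh describes: it assumes the PyTorch library, uses tiny fully connected networks, and stands in random tensors for real training images.

```python
# Minimal GAN sketch (illustrative assumptions: PyTorch, tiny networks, fake "real" data).
import torch
import torch.nn as nn

image_dim = 28 * 28   # flattened image size (illustrative)
noise_dim = 64        # size of the random input to the generator

# Generator: turns random noise into a counterfeit image.
generator = nn.Sequential(
    nn.Linear(noise_dim, 256), nn.ReLU(),
    nn.Linear(256, image_dim), nn.Tanh(),
)

# Discriminator: scores an image as real (1) or counterfeit (0).
discriminator = nn.Sequential(
    nn.Linear(image_dim, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1), nn.Sigmoid(),
)

loss_fn = nn.BCELoss()
g_opt = torch.optim.Adam(generator.parameters(), lr=2e-4)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=2e-4)

def training_step(real_images: torch.Tensor) -> None:
    batch = real_images.size(0)
    real_labels = torch.ones(batch, 1)
    fake_labels = torch.zeros(batch, 1)

    # 1. Train the discriminator to tell real images from forgeries.
    fakes = generator(torch.randn(batch, noise_dim)).detach()
    d_loss = loss_fn(discriminator(real_images), real_labels) + \
             loss_fn(discriminator(fakes), fake_labels)
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # 2. Train the generator to fool the discriminator.
    g_loss = loss_fn(discriminator(generator(torch.randn(batch, noise_dim))), real_labels)
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()

# Repeating this step over many iterations refines the forgeries until the
# discriminator can no longer reliably tell them from the real data.
for _ in range(3):  # a real training run would loop over a large image dataset
    training_step(torch.randn(16, image_dim))
```

In a real deepfake pipeline, the “real images” would be footage of the person being impersonated, and the generator would be far larger, but the adversarial back-and-forth shown above is the core idea.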
The following are deepfakes that actually happened.
- Former President Barack Obama’s face was superimposed on a figure and his voice was forged to create a video of him using derogatory language to describe former President Donald Trump.
- A fake video showed Ukrainian President Volodymyr Zelenskyy ordering Ukrainian soldiers to surrender to Russian forces.
- The U.S. intelligence community determined that Russia engaged in extensive influence operations during the 2016 presidential election to “undermine public faith in the U.S. democratic process, denigrate Secretary Clinton, and harm her electability and potential presidency.”
- The CEO of a U.K.-based energy company followed the phone instructions from someone he was convinced was his boss to transfer a large sum of money to a foreign supplier.
“Like many other cyberattack methods, we predict that threat actors will look to monetize the use of deepfakes by starting to offer deep-fake-as-a-service, providing less skilled or knowledgeable hackers with the tools to leverage these attacks through just the click of a button and a small payment,” warns Alon Arvatz, Senior Director of Product Management at IntSights.
Think of the implications and complications this technology presents to the Air Mobility Command and the rest of the armed forces.
A research report by the Brookings Institution suggests, “Deepfakes can be leveraged for a wide range of purposes, including falsifying orders from military leaders, sowing confusion among the public and armed forces, and lending legitimacy to wars and uprisings.”3
“Deepfakes do pose a risk to politics in terms of fake media appearing to be real, but right now the more tangible threat is how the idea of deepfakes can be invoked to make the real appear fake,” says Henry Ajder, an expert on synthetic media and AI.4 A 2022 article in The Atlantic reported, “Law professors Danielle Citron and Robert Chesney call this the ‘liar’s dividend’: Awareness of synthetic media breeds skepticism of all media, which benefits liars who can brush off accusations or disparage opponents with cries of ‘fake news.’”5
How can we feel confident that what we see is the real deal? MIT’s Groh says to pay attention to the following:
- Face—Is someone blinking too much or too little? (One simple way to check blink rate is sketched after this list.) Do their eyebrows fit their face? Is someone’s hair in the wrong spot? Does their skin look airbrushed or, conversely, are there too many wrinkles?
- Audio—Does someone’s voice not match their appearance (for example, a heavyset man with a higher-pitched, feminine voice)?
- Lighting—What sort of reflection, if any, are a person’s glasses giving under a light? (Deepfakes often fail to fully represent the natural physics of lighting.)6
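The blink-rate cue from the face item above can also be checked numerically. The sketch below is not a tool from this article: it assumes you already have per-frame “eye openness” scores from some facial-landmark detector, and the sample data and the 8–30 blinks-per-minute range are illustrative assumptions, not validated thresholds.

```python
# Hedged illustration: flag clips whose blink rate looks inhuman.
# Eye-openness scores and the "normal" range below are made-up examples.

def count_blinks(eye_openness, closed_threshold=0.2):
    """Count blinks as transitions from open to closed eyes."""
    blinks = 0
    eyes_closed = False
    for score in eye_openness:
        if score < closed_threshold and not eyes_closed:
            blinks += 1          # eye just closed: one blink begins
            eyes_closed = True
        elif score >= closed_threshold:
            eyes_closed = False  # eye reopened
    return blinks

def blink_rate_suspicious(eye_openness, fps=30, normal_range=(8, 30)):
    """Flag clips whose blinks per minute fall outside a rough human range."""
    minutes = len(eye_openness) / fps / 60
    rate = count_blinks(eye_openness) / minutes if minutes > 0 else 0
    return not (normal_range[0] <= rate <= normal_range[1]), rate

# Example with synthetic scores: a 10-second clip containing only one blink.
scores = [1.0] * 150 + [0.1] * 5 + [1.0] * 145   # 300 frames at 30 fps
suspicious, rate = blink_rate_suspicious(scores)
print(f"{rate:.1f} blinks/minute, suspicious: {suspicious}")
```

A single heuristic like this proves nothing on its own; it is simply a way to turn one of the “does this look off?” questions above into a number you can compare against normal human behavior.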
If you receive a phone call, text, or email instructing you to take an unusual action, contact the supposed source to verify the instruction.
Airmen, to become more informed and aware of deepfakes, please check out the resources in the footnotes. They also contain links to deepfake examples and additional sources.
1. Meredith Somers, “Deepfakes, Explained,” MIT Sloan School of Management, July 21, 2020. https://mitsloan.mit.edu/ideas-made-to-matter/deepfakes-explained.
2. Kelley M. Sayler and Laurie A. Harris, “Deep Fakes and National Security,” In Focus, Congressional Research Service, April 17, 2023. https://crsreports.congress.gov/product/pdf/IF/IF11333.
3. Daniel L. Byman, Chongyang Gao, Chris Meserole, and V.S. Subrahmanian, “Deepfakes and International Conflict,” The Brookings Institution, January 2023. https://www.brookings.edu/research/deepfakes-and-international-conflict/.
4. Karen Hao, “The Biggest Threat of Deepfakes Isn’t the Deepfakes Themselves,” MIT Technology Review, October 10, 2019. https://www.technologyreview.com/2019/10/10/132667/the-biggest-threat-of-deepfakes-isnt-the-deepfakes-themselves/.
5. Matteo Wong, “We Haven’t Seen the Worst of Fake News,” The Atlantic, December 20, 2022. https://www.theatlantic.com/technology/archive/2022/12/deepfake-synthetic-media-technology-rise-disinformation/672519/.
6. Somers, “Deepfakes, Explained.” Also see DHS, Increasing Threats of Deepfake Identities, pp. 33–34.