
Digital-age tools and technology give rise to fake videos

Cronkite School's Dan Gillmor and Eric Newton weigh in on where fake video is headed and what individuals and tech companies need to do


March 01, 2018

About the only thing more dangerous than a fake news story is a fake news video.

Fake news videos aren’t new, but they are on the rise and more realistic than ever thanks to technological advances. What once required a fairly big production and thousands of dollars can now be achieved with a selfie stick and a smartphone. That may not sound like a big deal, but when politics, propaganda and bad intentions enter the fray, the potential for harm is staggering and the damage possibly irreparable.

ASU Now spoke to Dan Gillmor, an internationally recognized author and leader in new media and citizen-based journalism who teaches digital media literacy, and Eric Newton, a global leader in the digital transformation of news who, as innovation chief, drives change and experimentation at Cronkite News, the news division for Arizona PBS. In October, the two launched News Co/Lab, a collaborative lab inside the Walter Cronkite School of Journalism and Mass Communication that works with journalists, teachers, librarians, technologists and others to help the public find new ways of understanding and engaging with news and information. Gillmor directs the lab; Newton also serves the Knight Foundation as a consultant on special projects and endowment grants. They believe fake videos soon will be “trivially easy, inexpensive, and all too believable.”


Dan Gillmor

Question: The Los Angeles Times recently reported that false videos will become so accurate “they will defy reality.” How long have fake videos been around, and what’s the usual tone and nature of them?

Dan Gillmor: Media hoaxes aren’t new. What’s new in the digital age is the advent of tools that make fraudulent photos, audio and video easy to make and easy to believe — putting words in people’s mouths that they didn’t say, and showing them doing things they didn’t do. Putting Tom Hanks’ Forrest Gump character into film scenes with presidents took time and money. Soon it’ll be trivially easy, inexpensive and all too believable.

Eric Newton: Of course, Hollywood and "War of the Worlds" are entertainment, not news. When believable fake video becomes common in the world’s news stream, we will be breaking new ground — and unless we have figured out what to do, we will be in real trouble. As the power of artificial intelligence increases, counterfeit audio and video files will become harder and harder for people to recognize. In time, people (including journalists) will simply not be able to tell a fake video from a real one. That’s a huge problem.

Q: Is there inherently more danger with a fake video than a fake news story, and if yes, why?

DG: Yes — for now, anyway, because people seem to think it shows something that really happened, such as videos of police shooting people and the countless scenes captured by witnesses with cameras during and after disasters. But people will have to adjust their thinking, and additional verification, beyond what’s apparently shown in the video, will become even more important.

EN: That’s right: Seeing is believing, until it isn’t. Fake news video could start a panic. Or worse. Example: Warren Buffett seems to say he is selling all his stocks (when he isn’t) and advises everyone to get out of the stock market right now (which he didn’t). A political leader calls on supporters to take up arms and occupy local police stations (when he or she didn’t). Worse still, in this confusing environment, an authentic video showing a police shooting can easily be dismissed as “fake” by politicians when it really happened. If no one believes anything, the worst of humanity can hide in plain sight. In the long run, that’s far worse than a subgroup believing in a conspiracy theory.

Q: The article speaks to a scenario where someone like North Korean leader Kim Jong Un could announce a missile strike or make a doomsday proclamation. What safeguards are in place right now to determine whether such a video is fake?

DG: The U.S. has all kinds of technologies that watch for missile launches, and Kim has already made plenty of belligerent statements. But even if the danger of a nuclear war isn’t huge in this scenario, a video of this kind — if spread and believed widely before debunking — could destabilize markets and cause other kinds of trouble. The safeguards are fundamentally our insistence (and especially our leaders’ insistence) on wanting proof before trusting such things.

EN: Military technology does not directly help the average person. And do we really want government employees dictating what’s true? We need detection software, available to everyone. If AI can create fake video, AI can detect the fakes. In the future, software that detects digital misinformation may be as common as anti-virus software, spam blockers, ad blockers, fraud blockers and the like. We can’t wait until the fakes walk among us; this needs development now.

Q: Platforms like Facebook, Google and Twitter have promised to police themselves when it comes to fake news. Have you seen evidence of this, and what advice should they heed in regard to fake videos?

DG: They’re trying, and not always succeeding. We have to ask whether we want the tech platforms to be arbiters of truth, however. For sure, the platforms should participate in, and maybe lead, a global initiative to develop better detection and verification tools. This is easy to say and almost certainly difficult to do. They should be helping their users (and third-party developers) create add-on tools to help us be the arbiters of what we see. And they should be much more transparent about what they do, and how they do it.


Eric Newton

EN: Tech companies can make a big difference. We wouldn’t partner with them or accept their funding if we thought otherwise. Tech companies have the capacity to lead the way in developing filtering tools that prevent their own products from being used for evil purposes. People should be given choices about how much filtering they want, just as people should have the right to examine and change the data that private companies have acquired from them. For consumer choices to be effective, however, a lot of other folks need to step up: journalists can become more transparent and community-engaged, educators and librarians can make it their business to know and teach the fundamentals of all modern literacies, and each of us can learn to share news with more care and to push back against misinformation.

Q: How can the public better educate themselves on fake videos so they don’t get duped?

DG: Start by understanding that malicious actors are trying to deceive you — that they are talented and have time and resources. Be relentlessly skeptical of just about everything. The more sensational it is, the more skepticism is required. The more you want to believe something bad about someone or something you dislike, the more skeptical you should be. Wait for secondary evidence. Society needs to put critical thinking at the core of education, as a lifelong skill we constantly develop and improve.

EN: We should find and use news sources not because someone shared them with us, but because we know them and agree with how they verify and clarify news. Those sources should win and keep our trust by being clear about how and why they do what they do. When we hear or see a story that is “too good to be true” or in other ways makes us wary, we need to check it with one of our trusted sources.

Q: Are we doomed to live in a world where we can believe nothing, trust nothing?

DG: No. We have to trust someone along the way. We’re going to have to learn who’s more trustworthy and believable than not, recognizing that everyone makes mistakes. We have to demand better from institutions that want our trust, and we have to recognize our own responsibility in this changing information ecosystem.

EN: A good place to start is the News Co/Lab’s website, newscollab.org. Check out some of the best practices from newsrooms wanting to earn your trust — and ask your local news organizations to try them. Look at the best practices in education — and ask your local schools to teach the fundamentals of news and media literacy, civics literacy and digital literacy.
