DEEPFAKERY: A new web series of critical conversations about deepfakes, satire, human rights, documentary and journalism

By Katerina Cizek | Originally posted on the Co-Creation Studio blog | August 20, 2020

Deepfakes first arrived on the scene in the form of celebrity pornographic videos, revenge porn, fake news, hoaxes, and financial fraud. Now, artists, documentarians and activists have also gotten their hands on deepfake tech — or synthetic media — to expose, challenge, enlighten, provoke and even protect.

Deepfakes are a suite of AI technologies that enable video face-swapping, puppeteering and voice manipulation. Their malicious uses have triggered moral uproar and confusion in the media. But our colleagues at witness.org have recommended in an important study that, instead of panicking, we might do better to prepare.

“Rhetoric about the ‘end of truth’ plays into the hands of people who already are saying you can’t believe anything – and that is neither true of most audiovisual material, nor true yet for deepfakes. We should not panic but prepare instead,” says Sam Gregory, Program Director at WITNESS.

So at Co-Creation Studio, we have partnered with WITNESS to present a new weekly web talk series that examines the potential and perils of this tech for documentary, art and journalism, especially within the context of dis- and misinformation.

(I worked with Sam on a film twenty years ago called Seeing is Believing, which traced the use of handicams and emergent tech to provide evidence in human rights activism. We have been calling this new project “Seeing isn’t Believing.”)

Throughout the series, we will explore the boundaries and importance of satire as the tech becomes available to political commentators and activists. We will situate deepfakes within a larger spectrum of media manipulations, now called shallow fakes. We’ll take a deep dive into an incredible documentary project, the first to use deepfakes “for good,” protecting the identities of vulnerable subjects. And we’ll also look at deepfakes that are meant to discredit. Our conversations will feature some of the brightest technologists, artists, activists, journalists and scholars from around the world.

SCHEDULE

Faking the powerful
Thursday, August 27th, 12:00-12:45 pm EDT
Bill Posters and Daniel Howe (Spectre Project) and Stephanie Lepp (Deep Reckonings) in conversation with Sam Gregory (WITNESS)

Not funny anymore: Deepfakes, manipulated media, and mis/disinformation
Thursday, September 3rd, 12:00-12:45 pm EDT
Jane Lytvynenko (BuzzFeed News) and Karen Hao (MIT Technology Review) in conversation with Corin Faife (WITNESS)

Using AI-generated Face Doubles in Documentary: Welcome to Chechnya
Tuesday, September 8th, 12:00-1:30 pm EDT (NOTE: different day of week and longer slot, as part of the MIT Open Documentary Lab public lecture series)
David France (Welcome to Chechnya) in conversation with Kat Cizek (MIT Co-Creation Studio)

Boundary lines? Deepfakes weaponized against journalists and activists
Thursday, September 17th, 12:00-12:45 pm EDT
Samantha Cole (Vice), other participants TBC

Manipulating memories: Archives, history and deepfakes
Thursday, September 24th, 12:00-12:45 pm EDT
Francesca Panetta and Halsey Burgund (In Event of Moon Disaster), James Coupe (Thoughtworks Synthetic Media Resident 2020), Yvonne Ng (WITNESS) in conversation with William Uricchio (MIT)

Still funny?: Satire, deepfakes, and human rights globally
Thursday, October 1st, 12:00-12:45 pm EDT
Julie Owono (Internet Sans Frontieres) and Evelyn Aswad (University of Oklahoma) in conversation with Sam Gregory (WITNESS)

Throughout the series, we will also explore how we all might prepare, drawing on WITNESS’s list of recommendations, which offers realistic and wise considerations across technological, legal and policy solutions and, most importantly, media literacy.