ODL Fellow Halsey Burgund to Premiere Deepfake Installation at IDFA DocLab

November 22nd, 2019


In Event of Moon Disaster, an immersive installation co-created by Halsey Burgund and Francesca Panetta that reimagines the story of the moon landing, is having its worldwide premiere today at the International Documentary Festival Amsterdam (IDFA). Burgund is a Fellow at the MIT Open Documentary Lab, and Panetta is Creative Director at the MIT Center for Advanced Virtuality. The project was selected for the IDFA DocLab Competition for Digital Storytelling, and is also part of DocLab’s “Domesticating Reality” program, which explores the nature of physical space in computational times.

Halsey Burgund, Lewis D. Wheeler (actor), and Fran Panetta in an MIT studio recording voice clips for the synthetic Nixon voice production.


In Event of Moon Disaster illustrates the possibilities of deepfake technologies by reimagining the Apollo 11 mission. What if it had gone wrong and the astronauts had not been able to return home? A contingency speech for this possibility was prepared for, but never delivered by, President Nixon – until now. 

To create a moving elegy for an event that never happened, the team worked with the company Respeecher to build a synthetic voice of Richard Nixon, using deep learning techniques trained on recordings of a voice actor. They also worked with the company Canny AI, using video dialogue replacement techniques to produce a highly believable video of Nixon reading this speech from inside the Oval Office. The resulting video invites viewers into this alternative history and asks us all to consider how new technologies can bend, redirect and obfuscate the truth around us.
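For readers curious about what this kind of voice synthesis involves at a conceptual level, the toy sketch below shows one common framing: learning a mapping from a source speaker’s spectrogram frames (the voice actor) to a target speaker’s frames. This is a minimal illustration only; Respeecher’s actual production system is proprietary, and every name, shape, and data tensor here is a hypothetical stand-in.

```python
# Illustrative sketch only: a toy spectrogram-to-spectrogram network, loosely in
# the spirit of neural voice conversion. All shapes, names, and data are
# hypothetical placeholders, not the pipeline used for this project.
import torch
import torch.nn as nn


class ToyVoiceConverter(nn.Module):
    """Maps source-speaker mel-spectrogram frames to target-speaker frames."""

    def __init__(self, n_mels: int = 80, hidden: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(n_mels, hidden, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(hidden, hidden, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(hidden, n_mels, kernel_size=5, padding=2),
        )

    def forward(self, mel: torch.Tensor) -> torch.Tensor:
        # mel: (batch, n_mels, time) -> converted spectrogram of the same shape
        return self.net(mel)


if __name__ == "__main__":
    model = ToyVoiceConverter()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.L1Loss()

    # Random tensors stand in for time-aligned source/target spectrogram pairs
    # (e.g., a voice actor's reading paired with target-speaker renditions).
    source = torch.randn(8, 80, 200)
    target = torch.randn(8, 80, 200)

    for step in range(100):
        optimizer.zero_grad()
        loss = loss_fn(model(source), target)
        loss.backward()
        optimizer.step()
```

In a real system, the converted spectrogram would then be passed to a neural vocoder to produce audio, and far more data, alignment, and modeling care would be needed; the sketch only conveys the general shape of the approach.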

Art installation in Amsterdam


This deepfake video will be an integral part of the physical installation that opens today in Amsterdam. In a 1960s-era living room, audiences will be invited to sit on vintage furniture surrounded by three screens, including a vintage television set. The screens will play an edited array of vintage NASA footage, taking the audience on a journey from takeoff into space and on to the moon. Then, on the center television, Richard Nixon will read the contingency speech titled “In Event of Moon Disaster,” written for him by speechwriter Bill Safire, which he was to deliver if the Apollo 11 astronauts had been unable to return to Earth.

Ad created for deepfakes newspaper in the installation


The researchers chose to create a deepfake of this historical moment for a number of reasons: space is a widely loved topic, so the piece can engage a broad audience; it is apolitical, making it less likely to alienate viewers than much misinformation; and because the 1969 moon landing is an event widely known to the general public, the deepfake elements are starkly obvious.

Rounding out the educational experience, In Event of Moon Disaster will include newspapers written specifically for the exhibit, which detail the making of the installation, how to spot a deepfake, and the most current work being done in algorithmic detection. The goal is to increase public awareness of the capabilities of these AI technologies and to improve people’s ability to identify deepfakes. Visitors will be encouraged to take a newspaper with them.

About his inspiration for the project, Burgund states, “In the past, my work has been largely about using the authenticity of the spoken human voice for aesthetic and conceptual purposes. Now that we live in a time when synthetic voices can be produced using artificial intelligence, I am exploring how this lack of authenticity can affect democracy and society as a whole.”

While the physical installation opens today in Amsterdam, the team is building a web-based version that is expected to go live in spring 2020. In the meantime, you can view the In Event of Moon Disaster website here for more information and to keep up with further project developments.

In Event of Moon Disaster is an MIT Center for Advanced Virtuality production, supported by the Mozilla Foundation and the MIT Open Documentary Lab.