Limbic Lab, Narrative Engine & Media Machine by Heidi Boisvert

Category
Artificial Intelligence, Documentaries, Performance

The Narrative Engine & Media Machine is an embodied, multi-modal, cross-media experience that combines scientific rigor, creative expression and emerging technology to offer new insights into data science and media effects research through generative art, electronic music, and 3D-printed sculptures. A series of assays is transformed into unique instantiations of the embodied mind's reception of diverse media forms. These instantiations, media imprints of users' biophysical and EEG signals (as well as eye-tracking and facial recognition), are captured and stored to build a database of signatures for playback and analysis, and to drive a feedback loop that automates media production processes using predictive modeling and machine learning algorithms.

Long Description:

Most assessment research on the impact of games, web-based interactives, virtual reality and other digital pop culture draws upon outdated media effects and edutainment research from communication studies, which focuses on the false linearity of KAP (knowledge, attitude, and practice) and does not consider the social context of emerging technology or the psychographics of the participant. Arvind Singhal observes that “there’s no formula for social change, there are no foolproof recipes that lead to knowable, agreeable and predictable outcomes.” Instead, we have “hunches” about the role that communication and mass media strategies play in social change processes. By creating a Limbic Lab to develop tools that measure, predict and iteratively prototype experiences through the real-time capture, storage and playback of multi-sensory data sources (EEG, biometrics, facial recognition, eye-tracking, and others), I can map the activation areas triggered by cognitive and affective cues during engaged interactive experiences with various intelligent technologies, isolate common variables and predict which affordances will best communicate with target audiences. Such creative research will translate those hunches into quantifiable insights and a new media effects model grounded in bio-data-driven inquiry. The data would then function as a real-time feedback loop, enabling more efficaciously designed multimedia experiences that employ emerging technology to stimulate and positively reshape our brain wiring diagrams.

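As a rough, hedged illustration of that capture-store-playback loop, the sketch below records a short multi-sensory “imprint” and stores it as a signature document. It is a minimal sketch rather than the project’s actual pipeline: BrainFlow’s synthetic board stands in for a real EEG headset, the biometric feed is stubbed with a random value, and the `limbic_lab.signatures` database and collection names are hypothetical.

```python
# Minimal sketch of the real-time capture-and-store loop described above.
# Assumptions: BrainFlow's synthetic board stands in for an EEG headset,
# the biometric feed is stubbed, and limbic_lab.signatures is a
# hypothetical schema, not the project's actual database layout.
import time

import numpy as np
from brainflow.board_shim import BoardIds, BoardShim, BrainFlowInputParams
from pymongo import MongoClient


def capture_imprint(seconds: float = 5.0) -> dict:
    """Record a short multi-sensory 'imprint' and package it for storage."""
    board_id = BoardIds.SYNTHETIC_BOARD.value  # swap in a real board ID with hardware
    board = BoardShim(board_id, BrainFlowInputParams())
    board.prepare_session()
    board.start_stream()
    time.sleep(seconds)            # let samples accumulate in the ring buffer
    data = board.get_board_data()  # 2D array: one row per channel
    board.stop_stream()
    board.release_session()

    eeg_rows = BoardShim.get_eeg_channels(board_id)
    return {
        "timestamp": time.time(),
        "eeg": data[eeg_rows].tolist(),   # raw EEG, one list per channel
        "biophysical": {                  # placeholder for a biometric feed
            "heart_rate": float(np.random.default_rng().normal(72, 5)),
        },
    }


if __name__ == "__main__":
    client = MongoClient("mongodb://localhost:27017")
    imprint = capture_imprint()
    client.limbic_lab.signatures.insert_one(imprint)  # store for playback/analysis
```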

Inspired by British cyberneticists who saw the brain as an embodied organ tied to bodily performance, and spurred by the rhetorical hype and ungrounded assumptions made about virtual reality as an “empathy machine,” I decided to create a prototype of a Narrative Engine (formerly Empathy Engine) to experiment with an alternative, multi-modal approach to gathering, analyzing and displaying empirical media effects data, in order to better understand the underlying mechanisms that drive empathic engagement. As described above, a series of assays is transformed into unique instantiations of the embodied mind’s reception of diverse media forms, and these media imprints of users’ biophysical and EEG signals (as well as eye-tracking and facial recognition) are captured and stored to build a database of signatures for playback, analysis and experiential engagement. The engine consists of a MongoDB database; static affordances stored as .json; the XTH Sense, a biophysical sensor that captures, amplifies and parses heartbeat, body temperature, blood flow, muscle contraction and spatial data; and OpenBCI’s UltraCortex Mark IV, a 16-channel electroencephalogram (EEG) headset. Sensor streams are mapped to event nodes within the narrative arc, representing participants’ rhythms and physiological response patterns through real-time sonification and visualization as they experience a media form. This creative research seeks to:

  • Map psychophysiological measures to cognitive processing by routing XTH Sense biophysical data into MATLAB
  • Patch variables into predictive modeling scripts to identify effective pathways that lead to the threshold of empathic conversion
  • Determine common variables across individual, project & media types to feed into the Media Machine for iterative prototyping
  • Correlate EEG and biometric measures for richer analysis of media effects on the level of an individual imprint or group (a sketch of these analysis steps follows this list)
  • Translate empirical findings into multi-modal representations through generative sound & image for broader public understanding of data science
  • Design real-time systems to bio-adapt, personalize and automate media production processes through predictive modeling & machine learning

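As a minimal sketch of the correlation and prediction items above, the snippet below correlates a stand-in EEG feature with a stand-in biometric feature and cross-validates a simple classifier against a hypothetical “empathic conversion” label. scikit-learn is used here purely for illustration in place of the MATLAB and predictive-modeling scripts the list names, and every feature and label is synthetic.

```python
# Hedged sketch of the correlation/prediction steps in the list above.
# scikit-learn stands in for the project's MATLAB/predictive-modeling
# scripts; the features and the "empathic conversion" label are synthetic.
import numpy as np
from scipy.stats import pearsonr
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 200  # number of recorded imprints

# Stand-in per-imprint features: EEG alpha-band power and heart-rate variability.
alpha_power = rng.normal(10.0, 2.0, n)
hrv = 0.5 * alpha_power + rng.normal(0.0, 1.0, n)

# Correlate EEG and biometric measures across imprints.
r, p = pearsonr(alpha_power, hrv)
print(f"alpha power vs. HRV: r={r:.2f}, p={p:.3g}")

# Predict a hypothetical binary "empathic conversion" outcome from both measures.
X = np.column_stack([alpha_power, hrv])
y = (alpha_power + hrv + rng.normal(0.0, 2.0, n) > 15.0).astype(int)
scores = cross_val_score(LogisticRegression(), X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```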

The Limbic Lab, a fully integrated, open-source and mobile biometric platform for studying the body’s intelligence, will further expand upon and refine this research and explore: How can large-scale change happen in our current media ecology of constant interruption? Are behavioral and attitudinal shifts still possible in an increasingly affectless society, even if we take up the commercial tools that surreptitiously shape the public imagination? And if we, as change agents, appropriate intelligent technology as value-neutral and instrumental, aren’t we, too, reinforcing the systems of oppression caused by our dependence on technology?


This lab will also support developing the full vision of the Narrative Engine and Media Machine into a scalable product, as well as other open-source tools for cultivating and measuring the art and science of influence.

Funding: Pop Culture Collaborative – $30,000 (Measuring the Narrative Ingredients of Episodic Television)

Additional Partner: USC Norman Lear Center – Media Impact Project