Spring 2022

MIT Media Lab, MAS S.63 Engineering Sleep and Dreams

 
Collaborators: Davide Zhang, Alejandra

Instructors: Pattie Maes and Adam Haar

#EEG
#BCI
#Unity
#Procedural Mesh
#Brain fingerprint
#Sleep
#Interactive art installation



Sleep Artifact


We propose an interactive art installation that materializes sleep into spatial artifacts and invites the audience to experience the process. EEG data from a sleeping participant is visualized, and the trace of that visualization is materialized into a procedural mesh. The spatial artifacts then form part of a larger archive in which the visual language of the artifacts speaks to the sleep identity of each person. As the archive grows, these complex abstractions have the potential to become more legible.




Research

There are five main aspects to this technical pipeline: EEG data collection and mapping, procedural mesh generation, real-time visual effects, visualization in virtual reality, and 3D printing.
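To make the hand-offs between these stages explicit, the hypothetical TypeScript sketch below outlines the pipeline as a chain of data types; none of the names come from the project code.

```typescript
// Hypothetical outline of the pipeline as a chain of data hand-offs.
// Stage and type names are illustrative, not taken from the project code.
interface EEGFrame { deltaPerChannel: number[] }                        // 1. EEG collection and mapping
interface ProceduralMesh { vertices: number[][]; triangles: number[] }  // 2. procedural mesh generation
interface RenderedFrame { mesh: ProceduralMesh; effectParams: Record<string, number> } // 3. real-time visual effects
interface VRScene { frame: RenderedFrame }                              // 4. visualization in virtual reality
interface PrintableArtifact { mesh: ProceduralMesh; stlPath: string }   // 5. 3D printing

declare function collectEEG(): EEGFrame;
declare function generateMesh(eeg: EEGFrame): ProceduralMesh;
declare function applyEffects(mesh: ProceduralMesh): RenderedFrame;
declare function presentInVR(frame: RenderedFrame): VRScene;
declare function exportForPrint(mesh: ProceduralMesh): PrintableArtifact;
```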

The Neurosity Crown EEG headset collects data and streams it over WiFi to a computer in real time. Through a modified Neurosity SDK (the “Notion Unity SDK”), the real-time EEG data is fed into Unity. The delta power band values are then accessed through the SDK, and each of the eight channel streams maps to a starting point in the mesh generation process. Each participant receives a unique set of starting points through a randomized selection process, so the base shapes of the meshes differ between people but remain legibly similar for the same person. It is worth noting that we took creative control in individualizing the form of the meshes; in an ideal scenario, the EEG “fingerprint” itself would drive the mesh generation so that the correlation between form and data is less arbitrary.
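In the installation this data access happens inside Unity through the modified SDK, but the general pattern can be sketched with the official Neurosity JavaScript SDK (@neurosity/sdk). The sketch below is illustrative only: the device id, the participant seed, and the startingPoints layout are hypothetical, and it assumes the SDK's powerByBand metric delivers one delta value per channel.

```typescript
// Illustrative sketch (not project code): read delta-band power per channel
// from a Neurosity Crown and derive a repeatable set of mesh starting points.
import { Neurosity } from "@neurosity/sdk";

const neurosity = new Neurosity({ deviceId: "YOUR_DEVICE_ID" }); // placeholder id

// Seeded PRNG (mulberry32) so a given participant seed always yields the
// same base layout, while different seeds yield visibly different ones.
function mulberry32(seed: number): () => number {
  let a = seed >>> 0;
  return () => {
    a = (a + 0x6d2b79f5) | 0;
    let t = Math.imul(a ^ (a >>> 15), a | 1);
    t ^= t + Math.imul(t ^ (t >>> 7), t | 61);
    return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
  };
}

type Vec3 = { x: number; y: number; z: number };

// One starting point per EEG channel: the participant seed randomizes the
// base layout, the live delta power nudges the radius.
function startingPoints(delta: number[], participantSeed: number): Vec3[] {
  const rand = mulberry32(participantSeed);
  return delta.map((power, channel) => {
    const angle = (channel / delta.length) * 2 * Math.PI + rand();
    const radius = 1 + 0.01 * power; // scaling factor chosen arbitrarily
    return {
      x: radius * Math.cos(angle),
      y: rand() * 2 - 1,
      z: radius * Math.sin(angle),
    };
  });
}

async function main() {
  await neurosity.login({ email: "...", password: "..." }); // credentials elided

  // powerByBand publishes per-band power for all eight Crown channels.
  neurosity.brainwaves("powerByBand").subscribe((brainwaves: any) => {
    const delta: number[] = brainwaves.data.delta; // eight values, one per channel
    const points = startingPoints(delta, 42 /* per-participant seed */);
    console.log(points); // in the installation, these would seed the Unity mesh
  });
}

main().catch(console.error);
```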



Exhibition