This is an experiment in visualizing and storing data from an Emotiv Epoc neuroheadset using Processing and OSC.
The project was developed over two weeks during the X | Y research lab held in Castrignano De' Greci in July 2014.
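To give a concrete sense of the OSC side of the setup described above, here is a minimal Processing sketch that listens for incoming sensor messages with the oscP5 library. The port (7000) and the /epoc/<sensorName> address pattern are assumptions made purely for illustration; adapt them to whatever the emokit-to-OSC bridge actually sends.

```processing
// Minimal sketch: receive sensor values over OSC using the oscP5 library.
// Port 7000 and the "/epoc/<sensorName>" address pattern are assumptions
// for illustration; the actual bridge may use different ones.
import oscP5.*;
import netP5.*;
import java.util.concurrent.ConcurrentHashMap;

OscP5 oscP5;
ConcurrentHashMap<String, Float> latest = new ConcurrentHashMap<String, Float>();

void setup() {
  size(400, 300);
  oscP5 = new OscP5(this, 7000);   // listen on port 7000
}

// oscP5 calls this for every incoming message (from its own thread,
// hence the concurrent map above).
void oscEvent(OscMessage msg) {
  String pattern = msg.addrPattern();          // e.g. "/epoc/F3"
  if (pattern.startsWith("/epoc/")) {
    String sensor = pattern.substring("/epoc/".length());
    latest.put(sensor, msg.get(0).floatValue());
  }
}

void draw() {
  background(0);
  fill(255);
  int y = 20;
  for (String sensor : latest.keySet()) {
    text(sensor + ": " + nf(latest.get(sensor), 1, 2), 20, y);
    y += 16;
  }
}
```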
Before starting, make sure the headset is connected and its sensors are placed correctly. We found that moistening the sensors with a saline solution improved signal quality.
Enter a name to identify the observation, then start the visualization.
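For storing an observation, one plausible scheme is to append each sample to a Processing Table and save it as a CSV named after the observation. The column layout and the keyboard shortcut below are assumptions for illustration, not necessarily the format the project actually uses.

```processing
// Sketch of one possible storage scheme: collect samples in a Table and
// save them as data/<observationName>.csv. Column names and the keyboard
// shortcut are illustrative assumptions.
String observationName = "observation-01";  // hypothetical name entered by the user
Table log = new Table();

void setup() {
  log.addColumn("millis");
  log.addColumn("sensor");
  log.addColumn("value");
}

void draw() { }  // keep the sketch running so key events are processed

// Call this for every incoming sample (e.g. from oscEvent()).
void recordSample(String sensor, float value) {
  TableRow row = log.addRow();
  row.setInt("millis", millis());
  row.setString("sensor", sensor);
  row.setFloat("value", value);
}

void keyPressed() {
  if (key == 's') {                                 // press 's' to save
    saveTable(log, "data/" + observationName + ".csv");
  }
}
```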
Each blob on the right represents a sensor, with its values recorded over a predefined time interval. On the left, from top to bottom, are the incoming values from the OSC library, the combined values over time, and the average values for each observation.
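As a rough illustration of how those averages might be computed, a simple running mean per sensor could look like the sketch below; the project may well average over fixed windows instead, so treat this as an assumption.

```processing
// Running mean per sensor: one plausible way to obtain the "average values
// for each observation". The project may compute this differently; this is
// only an illustrative sketch.
HashMap<String, Float> sums = new HashMap<String, Float>();
HashMap<String, Integer> counts = new HashMap<String, Integer>();

void accumulate(String sensor, float value) {
  float sum = sums.containsKey(sensor) ? sums.get(sensor) : 0;
  int n = counts.containsKey(sensor) ? counts.get(sensor) : 0;
  sums.put(sensor, sum + value);
  counts.put(sensor, n + 1);
}

float averageOf(String sensor) {
  if (!counts.containsKey(sensor)) return 0;
  return sums.get(sensor) / counts.get(sensor);
}
```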
You can click a sensor to keep it open, and hover over it to see its name and real-time value.
The circle in the middle of each blob represents the sensor quality, the blue lines show the values recorded over a predefined time interval, and the white circle marks the current value read from the sensor.
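To make the blob description above more concrete, here is a hedged sketch of how a single sensor blob could be drawn in Processing. The geometry, scaling factors, and value ranges are assumptions made for illustration and do not come from the project's actual code; the click-to-pin behaviour is omitted for brevity.

```processing
// Hedged sketch of a single sensor "blob": an inner circle scaled by the
// sensor quality, blue lines for the recorded history, a white circle for
// the current value, and the sensor name shown on hover. All geometry,
// scaling factors and value ranges (assumed 0..1) are illustrative only.
class SensorBlob {
  String name;
  float x, y;                           // centre of the blob
  float quality = 0;                    // assumed 0..1 sensor quality
  FloatList history = new FloatList();  // recent values, assumed 0..1
  int maxSamples = 120;                 // the "predefined interval" of time

  SensorBlob(String name, float x, float y) {
    this.name = name;
    this.x = x;
    this.y = y;
  }

  void addValue(float v) {
    history.append(v);
    if (history.size() > maxSamples) history.remove(0);  // keep a fixed window
  }

  void display() {
    // inner circle: sensor quality
    noStroke();
    fill(120);
    ellipse(x, y, 10 + quality * 30, 10 + quality * 30);

    // blue lines: values recorded over the predefined interval,
    // drawn as radial spokes around the centre
    stroke(0, 120, 255);
    for (int i = 0; i < history.size(); i++) {
      float angle = TWO_PI * i / maxSamples;
      float r = 20 + history.get(i) * 40;
      line(x, y, x + cos(angle) * r, y + sin(angle) * r);
    }

    // white circle: the current value
    if (history.size() > 0) {
      float current = history.get(history.size() - 1);
      noFill();
      stroke(255);
      ellipse(x, y, 20 + current * 80, 20 + current * 80);

      // hover: show name and realtime value next to the blob
      if (dist(mouseX, mouseY, x, y) < 60) {
        fill(255);
        text(name + "  " + nf(current, 1, 2), x + 65, y);
      }
    }
  }
}
```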
Eugenio Battaglia, Alessio Erioli, Leonardo Romei, Danilo Di Cuia, Giulia Marzin, Michele Pastore, Antonio Vergari.
Code by Danilo Di Cuia, Alessio Erioli, Antonio Vergari.
Based on emokit-java by Samuel Halliday (https://github.com/fommil/emokit-java), a Java port of the open-source Emokit library originally written in C and developed by several brave people credited in the original repository: https://github.com/openyou/emokit.
Additional resources and unorganized mental snapshots from the process can be found at https://xylabopeneeg.wordpress.com