An experimental generative art work that integrates electroencephalography (EEG) activity with a virtual environment.
Supplemental Research & Writing
artist-statment.mp4
Combining the Emotiv Insight 2 (a five-channel EEG brainware device), the Meta Quest 2 virtual reality headset, and the Unity development platform, we integrate streaming neurological data with real-time virtual interactions. Accessing six performance metrics (engagement, excitement, stress, relaxation, interest, and focus) alongside nine-axis motion sensors, we generate programmatic visuals that let the user interact with and influence the immersive effects of the program. Users can either focus on one metric at a time or chain multiple experiences together, creating a multitude of unique generative worlds.
This technology offers a personalized experience that lets users engage with their own cognitive states. By using internalized emotional focus, they can shape visuals and interact with the program on a subconscious level. This bridge between neuroscience, virtual reality, and interactive programming opens applications in mental wellness and technology research through expressive sensibilities, aesthetics, and creative expression.
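As a rough illustration of how those streams might drive visuals in Unity, the sketch below maps a few of the six performance metrics onto scene parameters. The PerformanceMetrics struct, its field names, and the normalized 0..1 range are assumptions for this example; in the project the values arrive through the Emotiv data stream.

```csharp
using UnityEngine;

// Illustrative container for the six performance metrics; the struct and
// field names are assumptions, with each value normalized to 0..1.
public struct PerformanceMetrics
{
    public float Engagement, Excitement, Stress, Relaxation, Interest, Focus;
}

public class MetricVisualDriver : MonoBehaviour
{
    [SerializeField] private ParticleSystem particles;
    [SerializeField] private Light sceneLight;

    // Called whenever a new metric sample arrives from the data stream.
    public void OnMetricsUpdated(PerformanceMetrics m)
    {
        // Excitement drives particle emission density.
        var emission = particles.emission;
        emission.rateOverTime = Mathf.Lerp(10f, 500f, m.Excitement);

        // Stress warms the scene light; relaxation leaves it cool.
        sceneLight.color = Color.Lerp(Color.cyan, Color.red, m.Stress);

        // Focus tightens the particle spread.
        var shape = particles.shape;
        shape.angle = Mathf.Lerp(45f, 5f, m.Focus);
    }
}
```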
- The High Definition Render Pipeline (HDRP) is a high-fidelity Scriptable Render Pipeline built by Unity to target modern (Compute Shader compatible) platforms. HDRP utilizes physically based lighting techniques, linear lighting, HDR lighting, and a configurable hybrid Tile/Cluster deferred/Forward lighting architecture. (source)
- The graphic menu searches for available devices via Bluetooth and allows the user to select which device to integrate with the session.
- When the channels for emotional response report a connectivity value greater than 80%, interactive elements instantiate into the scene (see the sketch after this list). Here, each of these boxes represents a different emotional data point.
- These data points are then passed through to generative visual effects, like this particle ray system.
- In some cases we work to push the graphical limits. In this scene we are baking 500,000+ animated hair material frames. Eventually we will port this work to the DOTS programming model to push these limits even further.
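As referenced in the list above, the 80% connectivity gate might look roughly like this sketch. The class, method name, and normalized 0..1 connectivity scale are illustrative assumptions; the real values come from the EEG contact-quality readings.

```csharp
using UnityEngine;

// Sketch of the connectivity gate: interactive elements instantiate once
// the emotional-response channels exceed the threshold.
public class EmotionPointSpawner : MonoBehaviour
{
    [SerializeField] private GameObject emotionBoxPrefab; // one box per emotional data point
    [SerializeField, Range(0f, 1f)] private float connectivityThreshold = 0.8f;

    private bool spawned;

    // Called with the latest connectivity value for the emotional-response channels.
    public void OnConnectivityUpdated(float connectivity)
    {
        if (!spawned && connectivity > connectivityThreshold)
        {
            Instantiate(emotionBoxPrefab, transform.position, Quaternion.identity);
            spawned = true;
        }
    }
}
```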
There are different builds and branches in active development, each with its own system and hardware requirements.
Default System Requirements
- PC with a graphics-capable GPU. I'm currently using: Direct3D 11.0 [level 11.1], Renderer: NVIDIA GeForce RTX 3070 Ti (ID=0x2482), Vendor: NVIDIA, VRAM: 8031 MB, Driver: 31.0.15.2824
- Windows 10+ (the Emotiv Launcher does not yet support Windows 11)
- Unity 2022+ (current version is 2022.3.2f1), via the Unity Hub
This build package is for demonstrating the interactions without the brainware dependency. We use this configuration to work with virtualized hardware from the Emotiv Launcher.
Additional Dependencies
- in the DataSubscriber, set Use Virtual Device to active
- in the Scene, activate the UsePhysicalDevcice GameObject
- in the Scene, set the FreePlay GameObject to inactive
This boolean enables the data stream but not the motion data.
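Interpreted loosely, these toggles suggest a component like the following minimal sketch; the class body is illustrative, and only the Inspector labels are taken from the steps above.

```csharp
using UnityEngine;

// Illustrative sketch of the toggles described above; not the project's
// actual DataSubscriber implementation.
public class DataSubscriber : MonoBehaviour
{
    [SerializeField] private bool useVirtualDevice = true;   // "Use Virtual Device"
    [SerializeField] private bool usePhysicalDevice = false; // "Use Physical Device"

    private void Start()
    {
        if (useVirtualDevice)
        {
            Debug.Log("Subscribing to the virtualized headset from the Emotiv Launcher " +
                      "(performance metrics only, no motion data).");
        }
        else if (usePhysicalDevice)
        {
            Debug.Log("Subscribing to a physical Insight 2 over Bluetooth " +
                      "(adds quaternion motion data).");
        }
    }
}
```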
We use this branch to work with physical brainware.
Additional Dependencies
- in the DataSubscriber, set Use Physical Device to active
- in the Scene, activate the UsePhysicalDevcice GameObject
- in the Scene, set the FreePlay GameObject to inactive
Same as above, but with a physical device connected via Bluetooth. This adds some additional quaternion data points that are only available through the physical device.
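A sketch of how those quaternion samples might drive a scene object's rotation; the method name and the (w, x, y, z) component order of the motion stream are assumptions for illustration.

```csharp
using UnityEngine;

// Applies the headset's quaternion motion data to this object's transform.
public class HeadsetMotionFollower : MonoBehaviour
{
    // Called with each motion sample from the physical device.
    public void OnMotionSample(float w, float x, float y, float z)
    {
        // Unity's Quaternion constructor takes components in (x, y, z, w) order.
        transform.rotation = new Quaternion(x, y, z, w);
    }
}
```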
This project was initiated by leveraging the Emotiv-Unity-Plugin, which is included with this project as a submodule.
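When cloning the repository fresh, the submodule typically needs to be initialized, e.g. with git submodule update --init --recursive.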