gscottqueen/emotiv-bci-hdrp-vr

Integrating Human Data into Virtual Environments

An experimental generative artwork that integrates electroencephalography (EEG) activity with a virtual environment.

Supplemental Research & Writing

artist-statment.mp4

Combining the Emotiv Insight 2, a five-channel EEG brainware device, the Meta Quest 2 virtual reality headset, and the Unity development platform, we integrate streaming neurological data with real-time virtual interactions. Accessing six performance metrics (engagement, excitement, stress, relaxation, interest, and focus) alongside nine-axis motion sensors, we generate programmatic visuals that let the user interact with and influence the immersive effects of the program. Users can focus on one metric at a time or chain multiple experiences together, creating a multitude of unique generative worlds.
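As a rough illustration of how a normalized performance metric can drive a visual parameter in Unity, here is a minimal sketch; all class and field names are hypothetical and do not reflect the project's actual code:

```csharp
// Hypothetical sketch: maps two normalized performance metrics
// onto particle-system parameters each frame.
using UnityEngine;

public class MetricVisualizer : MonoBehaviour
{
    [Range(0f, 1f)] public float engagement; // fed from the metric stream
    [Range(0f, 1f)] public float relaxation;

    public ParticleSystem particles;

    void Update()
    {
        // Higher engagement -> denser particle emission.
        var emission = particles.emission;
        emission.rateOverTime = Mathf.Lerp(5f, 100f, engagement);

        // Relaxation blends the particle color from red toward blue.
        var main = particles.main;
        main.startColor = Color.Lerp(Color.red, Color.blue, relaxation);
    }
}
```

In a setup like this, each of the six metrics could drive a different visual property, and chaining experiences amounts to swapping which mappings are active.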

This technology offers a personalized experience that allows users to engage with their cognitive states. By focusing internally on their emotional state, users can shape visuals and interact with the program on a subconscious level. This bridge between neuroscience, virtual reality, and interactive programming opens applications for research into mental wellness and technology through expressive sensibilities, aesthetics, and creative expression.

Screen Captures


  • The High Definition Render Pipeline (HDRP) is a high-fidelity Scriptable Render Pipeline built by Unity to target modern (Compute Shader compatible) platforms. HDRP utilizes physically based Lighting techniques, linear lighting, HDR lighting, and a configurable hybrid Tile/Cluster deferred/Forward lighting architecture. (source)


  • The graphic menu searches for available devices via Bluetooth and allows a user to select which device to integrate with the session.


  • When the channels for emotional response report a connectivity value greater than 80%, interactive elements instantiate into the scene. Here, each box represents a different emotional data point.
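The threshold behavior described above could be sketched in Unity as follows; the class, prefab, and callback names are illustrative assumptions, not the project's actual implementation:

```csharp
// Hypothetical sketch: spawn an interactive element once a channel's
// connectivity crosses the 80% threshold.
using UnityEngine;

public class MetricSpawner : MonoBehaviour
{
    public GameObject boxPrefab;              // one box per emotional data point
    const float ConnectivityThreshold = 0.8f; // 80%

    bool spawned;

    // Assumed to be called whenever a new connectivity reading arrives.
    public void OnConnectivityUpdate(float connectivity, Vector3 spawnPoint)
    {
        if (!spawned && connectivity > ConnectivityThreshold)
        {
            Instantiate(boxPrefab, spawnPoint, Quaternion.identity);
            spawned = true; // instantiate once per session
        }
    }
}
```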


  • These data points are then passed through to generative visual effects, like this particle ray system.


  • In some cases we work to push the graphical limits. In this scene we are baking 500,000+ animated hair material frames. Eventually we will port this work to the DOTS programming design pattern to push these limits even further.

Active Development

There are several builds and branches in active development, each with its own system and hardware requirements.

Default System Requirements

1. Open Explore

This build package is for demonstration of interactions without the brainware dependency.

2. Configuration for Virtual Devices

We use this configuration to work with virtualized hardware from the Emotiv Launcher.

Additional Dependencies

  1. In the DataSubscriber, set Use Virtual Device to active
  2. In the Scene, activate the UsePhysicalDevice GameObject
  3. In the Scene, set the FreePlay GameObject to inactive

This boolean enables the data stream but not the motion data.

3. Configuration for Physical Device

We use this branch to work with physical brainware.

Additional Dependencies

  1. In the DataSubscriber, set Use Physical Device to active
  2. In the Scene, activate the UsePhysicalDevice GameObject
  3. In the Scene, set the FreePlay GameObject to inactive

Same as above, but with a physical device connected via Bluetooth. This configuration exposes some additional quaternion data points that are only available through the physical device.
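A quaternion sample from the headset's motion stream could be applied to a scene object roughly as sketched below; the callback name and component ordering are assumptions, and the Emotiv plugin's actual motion API may differ:

```csharp
// Hypothetical sketch: apply a headset quaternion (q0..q3) to a
// Transform, smoothing between samples to hide stream jitter.
using UnityEngine;

public class HeadsetOrientation : MonoBehaviour
{
    // Assumed callback for each motion sample from the data stream.
    public void OnMotionSample(float q0, float q1, float q2, float q3)
    {
        // Many IMU streams send w first; Unity's constructor is (x, y, z, w).
        var target = new Quaternion(q1, q2, q3, q0);
        transform.rotation = Quaternion.Slerp(transform.rotation, target, 0.1f);
    }
}
```

The Slerp factor trades responsiveness for smoothness and would need tuning against the device's actual sample rate.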

Dependencies

This project was initiated by leveraging the Emotiv-Unity-Plugin, which is included in this project as a submodule.
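Because the plugin is a submodule, a plain clone will leave its directory empty. The standard git commands for pulling it in:

```shell
# Clone the repository with the Emotiv-Unity-Plugin submodule included
git clone --recurse-submodules https://github.com/gscottqueen/emotiv-bci-hdrp-vr.git

# Or, if the repository is already cloned without submodules:
git submodule update --init --recursive
```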
