BRAAHMS: A Novel Adaptive Musical Interface Based on Users’ Cognitive State
- Beste Filiz Yuksel, Tufts University
- Daniel Afergan, Tufts University
- Evan Peck, Tufts University, Bucknell University
- Garth Griffin, Tufts University
- Lane Harrison, Tufts University
- Nick Chen, Tufts University
- Remco Chang, Tufts University
- Robert Jacob, Tufts University
We present a novel brain-computer interface (BCI) integrated with a musical instrument that adapts implicitly (i.e., with no extra effort from the user) to the user's changing cognitive state during musical improvisation.
Most previous musical BCI systems either map brainwaves directly to audio signals or use explicit brain signals to control some aspect of the music. Such systems do not take advantage of higher-level, semantically meaningful brain data that could be used in adaptive systems without detracting from the user's attention.
We present a new type of real-time BCI that assists users in musical improvisation by adapting implicitly to their measured cognitive workload. Our system advances the state of the art in this area in three ways: 1) We demonstrate that cognitive workload can be classified in real time, using functional near-infrared spectroscopy, while users play the piano. 2) We build a real-time, implicit system that uses this brain signal to adapt musically to what users are playing. 3)
We demonstrate that users prefer this novel musical instrument over other conditions and report that they feel more creative.
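To make the contribution list concrete, the sketch below shows the kind of closed loop such a system implies: windows of fNIRS data are reduced to features, a pre-trained classifier estimates cognitive workload, and that estimate implicitly drives the harmonization. Everything here (the window shape, the thresholds, the `Harmonizer` stand-in, and the direction of the adaptation policy) is an illustrative assumption, not the paper's actual implementation.

```python
# Minimal sketch, not the authors' code: fNIRS windows -> workload
# classifier -> implicit musical adaptation. All names, shapes, and
# thresholds below are assumptions made for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

N_CHANNELS, N_SAMPLES = 16, 20   # assumed fNIRS window shape
HIGH, LOW = 0.7, 0.3             # assumed workload thresholds

def extract_features(window: np.ndarray) -> np.ndarray:
    """Summarize a (channels x samples) fNIRS window as simple statistics."""
    return np.concatenate([window.mean(axis=1), window.std(axis=1)])

class Harmonizer:
    """Stand-in for the musical back end; here it only logs its actions."""
    def enrich(self):   print("adding harmonies to the user's playing")
    def simplify(self): print("removing added harmonies")

def adaptive_step(window, clf, harmonizer):
    """Classify one window and adapt the music without explicit user input."""
    p_high = clf.predict_proba(extract_features(window).reshape(1, -1))[0, 1]
    if p_high > HIGH:
        harmonizer.simplify()   # one possible policy, chosen only for illustration
    elif p_high < LOW:
        harmonizer.enrich()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Train a toy classifier on random "calibration" windows; labels are
    # fabricated here only so the example runs end to end.
    X = np.stack([extract_features(rng.normal(size=(N_CHANNELS, N_SAMPLES)))
                  for _ in range(40)])
    y = rng.integers(0, 2, size=40)
    clf = LogisticRegression().fit(X, y)
    for _ in range(3):
        adaptive_step(rng.normal(size=(N_CHANNELS, N_SAMPLES)), clf, Harmonizer())
```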