Getting to the Birds
Virtual reality for your ears? Maya Ackerman and student researchers are using technology to enhance meditation.
If trying to meditate has you feeling stressed, you’re not alone. It’s hard. That’s why apps like Headspace, Personal Zen, Happify, and The Stress Doctor exist. But apps aren’t the only tech tools people are using to shut off the noise in their heads. There’s hardware, too.
Enter Muse—a headset paired with an app that gives real-time feedback on what’s happening in your brain when you meditate. It detects brainwave activity and uses this data to determine the user’s state of mind: active, neutral, or calm. It takes the guesswork out of reaching Zen—if you’re there, you literally hear birds chirping in your ears.
The problem is, it’s really hard to get to the birds, says Maya Ackerman, assistant professor in the computer engineering department. Ackerman bought the Muse and, after trying it out, identified a design flaw: it created something she calls biofeedback anxiety.
The app alerts the user, through various sounds of nature, to whether the brain is in a relaxed state, based on brainwave data received from the headset. The user works through sounds of stormy weather to reach a pleasant sunny day with birds chirping. Throughout this process, there’s constant feedback from the brainwave sensors in the device. If the nature sounds get louder—for example, if the ocean sounds more turbulent—it becomes immediately obvious that you’re still stressed, which is counterproductive.
“It got me stressed out about being stressed out,” says Ackerman. “And I knew we could design something to make it better.”
For their senior design project, Jason Capili ’18, Mark Hattori ’18, and Maile Naito ’18 worked to resolve the biofeedback issue by integrating a feedforward methodology instead. The program combines binaural beats—which they generated themselves—laid over ambient meditation music to take the place of the nature sounds currently available on the Muse.
If you’re unfamiliar with the concept, binaural beats are what the brain perceives when each ear hears a tone at a slightly different frequency. For example, if one ear hears a tone at 710Hz while the other hears 720Hz, the brain perceives a third “beat” pulsing at the 10Hz difference between them. That 10Hz pulse is called a binaural beat.
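The arithmetic above is simple enough to demonstrate. The students’ actual code isn’t published; the sketch below is an illustration using only Python’s standard library, generating a stereo audio file with a 710Hz tone in the left ear and a 720Hz tone in the right—the 10Hz difference is the binaural beat. The file name and amplitude are arbitrary choices.

```python
import math
import struct
import wave

def binaural_beat(left_hz=710.0, right_hz=720.0, seconds=2.0, rate=44100):
    """Build raw 16-bit stereo frames: left_hz in the left ear,
    right_hz in the right. The brain perceives the difference
    (here 720 - 710 = 10 Hz) as the 'beat'."""
    frames = bytearray()
    for n in range(int(seconds * rate)):
        t = n / rate
        left = int(32767 * 0.5 * math.sin(2 * math.pi * left_hz * t))
        right = int(32767 * 0.5 * math.sin(2 * math.pi * right_hz * t))
        frames += struct.pack("<hh", left, right)  # one stereo frame
    return bytes(frames)

def write_wav(path, frames, rate=44100):
    with wave.open(path, "wb") as w:
        w.setnchannels(2)   # stereo: one tone per ear is essential
        w.setsampwidth(2)   # 16-bit samples
        w.setframerate(rate)
        w.writeframes(frames)

# A 10 Hz beat (720 - 710), in the low "calming" range
write_wav("beat_10hz.wav", binaural_beat(710.0, 720.0))
```

Note that the stereo separation is the whole point: if the two tones were mixed into one channel, you would get an ordinary acoustic beat in the air rather than a binaural one constructed by the brain.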
Research has shown that binaural beats can have an impact on the brain: low-frequency beats can calm it, while higher-frequency beats can make you feel more active and alert.
The student researchers are using lower-frequency binaural beats to relax users. When combined with meditative music, the binaural beats drive the user to a relaxed state faster, without the stress of any negative biofeedback.
Hattori has a musical background and focused on generating the binaural beats for the project. He coded them to be in the key of the song, so the beat is subtle enough that the user doesn’t notice it in the music.
“We felt that if we tell the user that the binaural beat is here, it might impact the data—we wanted it to blend in with the music,” Hattori says.
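How Hattori’s code keeps the beat in the song’s key isn’t detailed here, but one straightforward approach is to place the carrier tone on the key’s root note and offset the second ear by the desired beat frequency. The sketch below is an assumption, not the team’s implementation; it uses the standard equal-temperament formula to convert a MIDI note number to a frequency.

```python
A4_HZ = 440.0   # standard concert pitch
A4_MIDI = 69    # MIDI note number for A4

def midi_to_hz(note):
    """Equal-temperament frequency for a MIDI note number."""
    return A4_HZ * 2 ** ((note - A4_MIDI) / 12)

def key_matched_beat(root_note, beat_hz):
    """Pick carrier tones so the beat can hide inside the song's key:
    the left tone sits on the key's root, the right is offset by beat_hz."""
    carrier = midi_to_hz(root_note)
    return carrier, carrier + beat_hz

# Hypothetical example: a song in A, using A3 (220 Hz) as the carrier
# and a 10 Hz calming beat
left, right = key_matched_beat(57, 10.0)
```

Because the carrier lands on a pitch already present in the music, the added tones blend in—consistent with Hattori’s goal that listeners not notice the beat at all.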
Capili set up the Graphical User Interface (GUI)—the part of the system the user interacts with. He also set up the entire system to make sure that their computers could connect to the Muse headband and to check that the incoming data was accurate.
The team tested users with music that included binaural beats and ambient music that didn't have them. The music with the binaural beats resulted in a faster path to relaxation.
Naito, a computer science and engineering major, had to step outside of her comfort zone and focus on biology for this project.
“Since we’re working on brain waves and none of us had the background, I did a lot of research on that and trying to help fine tune how we go about using that brain data to the best of our ability to help our system—to figure out what waves we needed to create and to read,” Naito says.
The program prototype the students developed uses basic feedforward techniques but the goal is to eventually utilize machine learning to personalize the experience by determining which patterns of sound or binaural beats are the most effective at relaxing each user.
The project will continue, and Ackerman has a team of three graduate students who will test the system this summer. Julia Scott, adjunct lecturer in the bioengineering department, will work to add a virtual reality element to the app, where the user will be led to relaxation not only through audio but visually as well.
“I’m really proud of all that this team has achieved. A computer science education is to teach you the basics and to give you enough insight into how things work so you can figure it out after banging your head against the wall,” Ackerman says. “This was a perfect example of how they broke through that wall.”