Sunday, June 8, 2014

Controlling a Hex Bug with my Brain Waves

Ever since my effort with OpenBCI began, I've been looking to control something with my brain.  Sure, a while back, I was successful in lighting an LED with my brain waves, but that's pretty simple.  I wanted something more.  And now I can do it.  I can control a robot with my mind!  Yes!


Approach:  My robot has just a few actions that it can do...turn left, turn right, walk forward, and fire.  To make this brain-controlled, I need a way to invoke these commands using signals from my brain.  Ideally, I'd just think the word "Fire!" and the robot would respond.  Unfortunately, those kinds of brain waves are too hard to detect.  Instead, I need to use brain waves that are easy to detect.  For me, "easy" brain waves include the Alpha waves (10 Hz oscillations) that occur when I close my eyes, as well as the brain waves that occur when I watch my blinking movies (a.k.a. visual entrainment).  So, my approach is to use OpenBCI to record my brainwaves, to write software to detect these specific types of brain waves, and to issue commands to the robot based on which brain waves are detected.

Here are all the pieces that you see in the video.
Hardware Setup:  The core hardware for this hack is similar to my usual OpenBCI setup: EEG electrodes, an OpenBCI board, an Arduino Uno, and my computer. Added to this setup is the Hex Bug itself and its remote control, which I hacked so that the remote can be controlled by an Arduino.  So, as shown below, my brain wave signals go from my head all the way to the PC.  The PC processes the EEG data looking for the Alpha waves or the visually-entrained waves.  If any are detected, it decides what commands to give the robot.  The commands are conveyed back to the Arduino, which then drives the remote control, which the Hex Bug receives over its usual IR link.
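
For the PC-to-Arduino link, a command can be as simple as a single serial byte.  Here's a minimal sketch of that link (in Python with the pyserial library, rather than my actual Processing code; the port name and the one-letter command codes are illustrative assumptions, echoing the 'L'/'R'/'F' letters described in the comments below):

```python
# Minimal sketch of the PC -> Arduino command link.  This is NOT my
# actual software (which is a Processing sketch); the port name and
# the command letters are assumptions for illustration.
import serial  # the pyserial library

# Open the serial port that the command Arduino is attached to
# (e.g., 'COM4' on Windows or '/dev/ttyUSB0' on Linux).
arduino = serial.Serial('/dev/ttyUSB0', 115200, timeout=1)

def send_command(cmd):
    """Send a one-letter command, e.g. 'L', 'R', or 'F'."""
    arduino.write(cmd.encode('ascii'))

send_command('F')  # ask the Hex Bug to walk forward
```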

Here is the schematic of how the pieces work together.

EEG Setup:  I'm going to be measuring my Alpha waves and the brain waves induced through visual entrainment.  Based on my previous experience, I know that both are best recorded using an electrode on the back of the head (at the "O1" position, if you're into the 10-20 electrode placement standard).  I do not need electrodes all over my head; that one electrode at O1 is my only sensing electrode.  Of course, EEG also requires a reference electrode, which I put on my left earlobe.  And, finally, EEG often has a third electrode ("bias" or "driven ground"), which I placed on my right earlobe.

Looking at the Frequency of my Brain Waves:  As mentioned above, my approach is to control my robot by detecting Alpha waves and by detecting visually-entrained brain waves.  These are easily detectable because they occur at specific frequencies.  Alpha waves occur around 10 Hz, and the visually-entrained brain waves occur at the blink rate(s) of whatever movies I use (my best results were from 5 Hz and 7.5 Hz movies).  So, to control my robot, I will be looking for EEG signals at these frequencies: 5 Hz, 7.5 Hz, and 10 Hz.  I'll find them by writing EEG processing software that examines the frequency content of my EEG signal to see if these frequencies are present.


EEG Processing:  The flow chart above shows the steps that I use to process the EEG signal (my software is here).  Once the PC gets EEG data from the OpenBCI board, the first step is to compute the spectrum of the signal, which tells me the content of the EEG signal as a function of frequency.  I then search through the relevant part of the spectrum (4-15 Hz) to find the peak value, noting both its frequency and its amplitude.  In parallel, I also compute the average EEG amplitude across the 4-15 Hz frequency band.  This average value is my baseline for deciding whether my peak is tall (strong) or short (weak).  By dividing the amplitude of my peak by this baseline value, I get the signal-to-noise ratio (SNR) of the peak.  The SNR is my measure of the strength of the peak.  The outputs of the EEG processing, therefore, are two values: the frequency of the peak and the SNR of the peak.
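
If you're curious, here's a minimal sketch of that spectrum-peak-and-SNR computation (in Python with NumPy rather than my actual Processing code; the 250 Hz sample rate, the Hann window, and the choice to include the peak bin in the baseline average are my assumptions):

```python
# Minimal sketch of the EEG processing step: spectrum -> peak -> SNR.
# Assumptions: 250 Hz sample rate, Hann window, amplitude (not power)
# SNR in dB, and a baseline average that includes the peak bin.
import numpy as np

FS_HZ = 250.0           # assumed OpenBCI sample rate
F_LO, F_HI = 4.0, 15.0  # frequency band to search

def find_peak_and_snr(eeg):
    """Return (peak frequency [Hz], peak SNR [dB]) for one EEG buffer."""
    # Compute the amplitude spectrum of the windowed EEG buffer.
    spectrum = np.abs(np.fft.rfft(eeg * np.hanning(len(eeg))))
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / FS_HZ)

    # Restrict attention to the 4-15 Hz band.
    in_band = (freqs >= F_LO) & (freqs <= F_HI)
    band_amps = spectrum[in_band]
    band_freqs = freqs[in_band]

    # Find the peak amplitude (and its frequency) within the band.
    i_peak = np.argmax(band_amps)

    # Baseline: the average amplitude across the whole band.
    baseline = np.mean(band_amps)

    # SNR of the peak, in dB, relative to the baseline.
    snr_db = 20.0 * np.log10(band_amps[i_peak] / baseline)
    return band_freqs[i_peak], snr_db
```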


Deciding My Robot's Action:  Once my EEG processing finds the frequency and SNR of the peak in my EEG spectrum, I have to decide how to act on that information.  After some trial and error, I settled on the algorithm shown in the flow chart above.  It's got three steps (a code sketch follows the list):
  • SNR Check:  First, I decide whether the current peak in the spectrum is legitimate, or whether it is likely to be just noise.  I don't want to issue a command on noise, because then my robot would take all sorts of actions that I didn't intend.  So, to decide if the peak is likely to be legitimate, I look at the SNR of the peak.  If it has a big SNR, I accept it as a legitimate peak.  If it is too small, I take no further action.  Right now, my threshold for this decision is 6 dB.  Setting a higher threshold results in fewer false commands (good), but it also makes the system less sensitive to legitimate commands (bad).  This 6 dB threshold resulted in an OK (but not great) balance.
  • Frequency Check:  If the peak seems legitimate, I decide how to command the robot based on the frequency of the peak.  If the peak is between 4.5-6.5 Hz, I must be looking at the right side of my 2-speed blinking movie (i.e., the portion that blinks at 5 Hz), so the computer prepares the "Turn Right" command.  Alternatively, if the EEG peak is 6.5-8.5 Hz, I must be looking at the left side of my 2-speed blinking movie (i.e., the portion that blinks at 7.5 Hz), so it prepares the "Turn Left" command.  Finally, if the EEG peak is 8.5-12 Hz, it must be my eyes-closed Alpha waves, so the computer prepares the "Move Forward" command.
  • New Command Check:  Before issuing the command, I check to see whether this command is the same as the last command that was extracted from my brain waves.  If the latest command is different, I hijack the command and, instead, issue the "Fire!" command.  If the latest command is the same, I go ahead and issue the left / right / forward command like normal.  The reason for this hijack is that I have no other type of easily-detected brain wave that I can use for commanding the robot to fire.  This approach of issuing "Fire!" on every change in command seemed like a decent way of getting a 4th command out of 3 types of brain waves.
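
Here's a compact sketch of that whole decision chain (Python again, for illustration; the one-letter command codes and exactly what gets remembered after a "Fire!" hijack are my guesses, not necessarily what my real code does):

```python
# Sketch of the three-step decision logic described above.  The
# command codes ('R', 'L', 'F', 'FIRE') are illustrative assumptions.
last_command = None  # most recent command decoded from my brain waves

def decide_action(peak_freq, snr_db):
    """Map (peak frequency [Hz], SNR [dB]) to a robot command, or None."""
    global last_command

    # Step 1: SNR check -- ignore weak peaks as likely noise.
    if snr_db < 6.0:
        return None

    # Step 2: frequency check -- map the peak to a candidate command.
    if 4.5 <= peak_freq < 6.5:
        command = 'R'    # 5 Hz entrainment -> "Turn Right"
    elif 6.5 <= peak_freq < 8.5:
        command = 'L'    # 7.5 Hz entrainment -> "Turn Left"
    elif 8.5 <= peak_freq <= 12.0:
        command = 'F'    # ~10 Hz Alpha -> "Move Forward"
    else:
        return None

    # Step 3: new-command check -- a *change* of command fires instead.
    # (I'm assuming the newly decoded command is remembered either way.)
    changed = (last_command is not None) and (command != last_command)
    last_command = command
    return 'FIRE' if changed else command
```
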
Putting It All Together:  As you can see in the movie, I was eventually able to get all of these pieces working together, allowing me to command the Hex Bug using just my brain waves.  Of course, it didn't work the first time.  Even once I got all the hardware working, I still needed to tune a bunch of the software parameters (FFT parameters and the detection threshold) until I got something that worked somewhat reliably.  To help with this tuning process, I used the spectrum display in my Processing GUI.  Some screen shots are below.

Example EEG spectrum when I stared at the right side of my two-speed blinking
movie.  It induced 5 Hz brain waves.  I programmed 5 Hz to mean "Turn Right".
The SNR here is between 6 and 7 dB.

Here's an example EEG spectrum when I stared at the left side of my two-speed
blinking movie.  It induced 7.5 Hz brain waves.  When the GUI detected 7.5 Hz,
it issued a "Turn Left" command to the Hex Bug.  The SNR is only 6-7 dB.

Finally, here's an example EEG spectrum with my eyes closed so that I was
exhibiting Alpha waves, which are near 10 Hz.  When it detected 10 Hz, I
programmed it to issue a "Forward" command.  The SNR is > 8 dB.

Weak Peaks:  In the screenshots above, the red line shows the current EEG spectrum.  The heavy black circle shows the spectral peak that my software algorithms have detected.  The black dashed line is the "background noise" from which the SNR is computed.  To be declared a legitimate detection, the peak must be 6 dB higher than the black dashed line (unfortunately, I don't show this on the plot...sorry!).  As can be seen, the 5 Hz and 7.5 Hz examples are not very strong (the SNR is only 6-7 dB).  Other peaks within the plots are very close to being the same size, which would cause false commands to be sent to the robot.  In my movie at the top of this post, there were several false commands.

Balancing Sensitivity with False Commands:  To reduce the number of false commands, I could raise my detection threshold above 6 dB.  Unfortunately, as seen in the first two spectrum plots above, my 5 Hz and 7.5 Hz peaks are usually pretty weak (< 7 dB).  Therefore, any attempt to raise my detection threshold above 6 dB would cause me to no longer detect my legitimate brain waves.  I know because this is exactly the tuning process that I tried.  Bummer!  So, if I want more reliable performance, I'll need to develop fancier signal processing beyond this simple FFT-plus-threshold approach.  Future challenges!

Wrapping Up:  Even with the false commands seen in my movie, I was still able to command the robot to move around the table.  I could get it to go (roughly) where I wanted it to go.  And I did it all with just my brain waves.  I think that this is pretty exciting!  Yay!  What are the next steps?  Well, now that I have this under my belt, maybe I can move on to controlling a flying fish, or maybe a quadcopter!  Do you have any other cool ideas for things I can control with my brain?

Coolness:  This hack got picked up by IEEE Spectrum as part of an article on OpenBCI.  Cool!  Check it out here.

More Coolness:  This hack also got picked up by Wired.  Fun!

Follow-Up: I got to share this hack with Joel and Conor of OpenBCI.  You can see their luck with controlling the robot here.

Another Follow-Up:  We used a similar approach to get a 5-person team to brain-control a swimming shark balloon.  It's cool.  Check it out here.

46 comments:

  1. Hi Chip,
    It's a great demo. You are getting there. I'm impressed by your work putting the Processing sketch out there for others to use. I'll probably try your approach with Matlab to see how it turns out.
    Also, how about reducing the detection threshold and adding a confirmation method (feedback) to the system before issuing a command? For example, the system could ask for a double eye blink within a certain interval before issuing the command. It wouldn't be a purely EEG system, though.

    1. Hey, thanks for reading! I appreciate your feedback!

      Regarding the use of a confirmation method, I think that's a great idea. Unfortunately, my experience with lowering the detection threshold below 6 dB is that there are SO MANY detections that you'd have to continually confirm or reject candidate actions. That would be such a pain.

      The best approach is to improve the SNR of the signal that you want to detect...in this case, it is the SNR of my visually-entrained brain waves (the ones at 5 Hz and 7.5 Hz) that needs to be made stronger. One can always improve the SNR through additional averaging, but that slows down the response time. Other ideas that I've had include:

      * Brighter blinking lights
      * Blinking lights with more light-dark contrast
      * Blinking lights closer to my eyes, such as on eye glasses

      Or, maybe the SNR could be improved through using multiple EEG electrodes. If I used multiple electrodes on the back of my head (say O1 and O2), perhaps the visually-entrained signal would be coherent between the two electrodes, whereas the "noise" (background activity) might not be coherent. If I exploit the coherence, I might get another dB or two (best case for two electrodes would be +3 dB on SNR).

      So, those are the ideas that I have for improving the reliability of the commands. Keep sending me your ideas!

    2. I just read an IEEE paper on steady-state visual evoked potentials (SSVEP), which is what I'm doing here with my blinking movies. Like I mentioned in my own comment above, they suggested that multiple EEG electrodes could improve performance.

      In my suggestion above, I said that I could seek to use additional channels where the SSVEP signal might be coherent (i.e., common) between the channels. By adding the channels together, I could boost the amplitude of the SSVEP signal, which would increase its SNR and make its detection more reliable.

      Conversely, in the IEEE paper, they suggested using channels where the SSVEP signal is NOT common but where the background EEG activity (i.e., the "noise") might be common. They suggest pairing O1 with PO1, Oz with POz, or O2 with PO2. By differencing these pairs, you can maybe cut the noise amplitude, which would be an equally effective way to achieve my goal of increasing the SNR.

      Cool idea!
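
      To make the arithmetic concrete, here's a toy sketch of both ideas on synthetic data (Python/NumPy; all of the signal levels, including the 20% SSVEP leakage into PO1, are made-up numbers for illustration):

      ```python
      # Toy demo: two ways to raise SSVEP SNR with a second electrode.
      # All amplitudes here are invented for illustration.
      import numpy as np

      FS = 250.0                     # assumed sample rate [Hz]
      t = np.arange(0, 4, 1.0 / FS)  # 4 seconds of data
      rng = np.random.default_rng(0)
      ssvep = 0.5 * np.sin(2 * np.pi * 7.5 * t)  # 7.5 Hz entrained signal

      # Idea 1 (my comment above): SSVEP is common to O1 and O2, but the
      # background noise is independent.  Summing doubles the signal
      # (+6 dB) while the noise only grows ~sqrt(2) (+3 dB): net +3 dB.
      o1 = ssvep + rng.standard_normal(len(t))
      o2 = ssvep + rng.standard_normal(len(t))
      summed = o1 + o2

      # Idea 2 (the IEEE paper): background noise is common to O1 and
      # PO1, but the SSVEP is mostly at O1.  Differencing cancels most
      # of the shared background, which also raises the SNR.
      background = rng.standard_normal(len(t))
      o1_b = ssvep + background
      po1 = 0.2 * ssvep + background
      diffed = o1_b - po1
      ```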

    3. What is the paper you're referring to?
      Thx,
      David

  2. Fantastic, Chip: as soon as I can, I will try for a dancing robot!
    All the best!
    Giorgio

    1. Thanks for the kind words. If you do make a dancing robot, be sure to share a video! I can't wait!

      Chip

  3. Chip, hey cool. Nice going.

    Check out this video posted by the g.tec guys. They use flashing checkerboard icons in the 4 corners, and it looks like a single sensor at POz or Pz (not Oz!). Not sure where their reference is; the video said ground was approx. Fz.

    http://vimeo.com/channels/miranda/88151780

    1. Great link!

      Clearly, we're both using SSVEP as our input method. I like how they're doing 4 blink rates and not just 2, like I'm using.

      I also like how they're using the BCI to select whole musical phrases. That's the only approach that makes sense, given the speed (slowness) of BCIs, but it is an approach that had never occurred to me. Very smart.

      Finally, I like how they had the BCI-decided musical phrases go to human players. They could have easily had the computer play the musical phrases, but the human players make the whole endeavour more compelling.

      Thanks for sharing!

      Chip

  4. I wonder what it is about the complementary checkerboard patterns that they find effective. Could it be that they are more "attention grabbing" as far as the various levels of visual processing go? Or less subject to interference from the neighboring patterns? The checkerboard matrix dimensions may also be an important variable.

    They may be sampling farther from Oz to tap into these higher-level abstraction effects.

    re: human players, yeah, I bet the resultant mix is ten times better than with strictly mechanical playback, because of all the nuances quartet players would add to segue, blend, and harmonize with each other. In other words, they listen and respond to the other players, which is huge. I'm a big advocate of music improv; see lightfield.com/iii.htm.

    1. My understanding is that the checkerboard, with all of its horizontal and vertical lines, stimulates the edge-detection portion of the brain's circuitry in addition to the simple light-dark contrast-detection circuits. As a result, people think that they get a stronger overall response.

      I did some testing of my own using a checkerboard, but I didn't see much difference. The large-area light-dark blink worked better for me. But, maybe my checkerboard sizing wasn't ideal.

      Re: human players, I totally agree that the resultant music was far more engaging than it would have been with computerized playback. For us mere mortals without g.tec funding to pay the musicians, it is fun to think how we all could hack together something similar, as long as we're OK using computerized playback of the EEG-directed music.

      Chip

  5. Hey Chip. Really great stuff man! Check your SoundCloud...or email me at mike.tate@me.com
    I have something I would like to talk with you about.

  6. "Do you have any other cool ideas for things I can control with my brain?" - Yes I do. It has not been done yet and is waiting for someone like you to help develop it! Get in touch with me.
    Mike Tate

  7. Hi Chip,

    First off, thank you so much for the reply earlier! So, I am using your EOG setup to measure eye movements and am still having a little trouble with your MATLAB processing code. I have access to MATLAB, as I am at a university. While plotting the data will be really great, what I ultimately want to do is use the EOG input to drive an Arduino that controls a robotic eye, so that I can essentially make the eye mimic human eye movements in real time (like what you have done with the LED/Hex Bug). I have been reading those posts and that code and trying to figure it out, but as I am at a very basic level with all of this (no prior programming experience, except for some very simple things with Arduino), I am having trouble translating. I am wondering if there is a way to very simply get the EOG data input and then transform it into simple if-else statements in Arduino. For instance, if one electrode above the human eye reaches some threshold voltage, then the Arduino will activate the eye to move upward, and the same with an electrode below the human eye. I apologize if this is a somewhat simplistic view of the scenario, but as I said before, I am just learning!

    Thank you, I appreciate all of your help!

    Best,
    Johanping

  8. Hey guys, keep up the good work!! I'm also trying to use the OpenBCI kit. I'm new to it. Can you please let me know how to interface OpenBCI with Arduino? As in, once the data is collected, what to do next?

    1. I am having the same difficulties - Aishwarya, let me know what you find and I'll do the same!

  9. Yeah, sure, with pleasure.
    Everywhere, it's just mentioned what has been done, not how to do it.

    1. Hi Aishwarya,

      I am trying this OpenBCI toolkit for LabVIEW to do the interfacing (currently downloading):
      https://decibel.ni.com/content/docs/DOC-40837
      It should be easier with LabVIEW to look at the data and then even make controlled outputs.

      Here's the link to download LabVIEW itself:
      http://www.ni.com/academic/download.htm

      Good luck!

  10. Hi johanping,
    thanks for that link!!!
    Yes, as you said, LabVIEW helps a lot, both for accepting the signals and for interfacing easily with Arduino. I'm trying it out. I'll let you know further details.

    1. And I have 2 queries:
      1) The LabVIEW OpenBCI toolkit link - is it the one that is given in the attachment files on this page?
      https://decibel.ni.com/content/docs/DOC-40837

      2) Are there any NI drivers that have to be downloaded? (I found so in certain links, but none were mentioned in this link.)

  11. Currently looking at these guys and trying to figure out the code they used -- I didn't end up using that one because I couldn't figure it out either!

  12. Oh!
    I did download the files that you mentioned in the links, and I'm trying to figure them out; it's a little confusing.
    I'll keep in touch, johanping, and let you know if I succeed :D

  13. Hi...
    Did you use the 32-bit OpenBCI?
    What is the price of the OpenBCI board, and how can I buy it?

    1. This hack was done with a prototype OpenBCI, not the one that they've had for sale in 2014/2015. Since all of the EEG processing is done on the PC, this hack could be done with either the 8-bit or the 32-bit OpenBCI board.

      I think that you can buy the OpenBCI boards from their site. www.openbci.com.

      Chip

  14. Hi Chip,

    I like the way of making a connection "OpenBCI board --> Arduino Uno --> PC". Could you show me how to make it? I couldn't find it in your blog.

    Does making this connection mean that the data is transferred to the PC without needing the onboard RFduino built into the board?

    1. The OpenBCI boards that are for sale all have a microcontroller built in, along with the wireless unit. While it may be feasible to bypass the built-in microcontroller and wireless unit, it is not recommended. The wireless connection is critical to keeping this device safe. Connecting this device to an Arduino and then to your PC (via USB) means that you could get a nasty shock if your PC is connected to mains power.

      In the post above, I'm using an early prototype of an OpenBCI board. It had to be plugged into an Arduino and it had no wireless unit. So, I had to be careful to ensure that my PC was not plugged into mains power. It always made me nervous.

      All of the new OpenBCI boards avoid this problem. No Arduino needed. No wired connection needed. Much safer.

  15. Hi Chip and all,
    I successfully created an SSVEP-based BCI game with an OpenBCI hardware derivative. Check it out here if you are curious.
    https://eegexplore.blogspot.com/2016/07/ssvep-based-bci-game-with-my-hardware.html

    1. Very nice work! Getting stable flickering frequencies from a computer monitor has been a challenge for me. I'm glad that you have a system that seems to do it well!

      Chip

  16. Hey Chip, thanks for the video. Can you tell me how many channels the board you used for this project had? Can a 4-channel board be used to do the same thing you did in the video?

    1. This hack is purely about the visual cortex. So, I only needed one channel. As described in the "EEG Setup" section of this post, the single active electrode was approximately at the "O1" position on the back of my head. I then had a reference electrode on one ear lobe and the bias electrode on the other ear lobe (these other two electrodes don't count as "channels"...they're sorta like electrical grounding). While I don't know much about the 4-channel OpenBCI board, I would expect it to work just fine for this hack.

  17. Love this project. I'm going to try it myself this year. I will be using the newer OpenBCI controllers, but I have no idea how to wire up the controller to the remote for the Hex Bug. Should be fun trying to figure it out!

  18. Hey,
    very interesting project!
    I want to ask: if I have the new OpenBCI that does not require an Arduino board (it has a built-in RFduino compatible with Arduino), how can I get my brain signal (viewed in the OpenBCI Processing GUI) and connect its result to an Arduino that I can program to do things based on the brain signal?

    1. I am facing the same issue. First of all, great work, Chip. Your work is an inspiration to us to work on OpenBCI. Could you kindly help us with the connections, software, and setup for the new V3 board so that life becomes easier for newbies like us? Thanks

    2. Dear Chip, you did an amazing job! I would like to ask you for some suggestions...

      For my PhD study I am using the OpenBCI Cyton + Daisy biosensing board. The idea is to record alpha oscillations from different brain sites and to display alpha waves during the performance of a memory task (which I programmed using Psychtoolbox in Matlab). During the performance of the memory task (a delayed match-to-sample task), participants should learn to increase alpha suppression, which is supposed to be related to increased attention (expectation: more alpha suppression, better attention, and better performance on the memory task). For this purpose I would like to build a real-time EEG neurofeedback (connecting OpenBCI and Matlab)... It would be great if the neurofeedback were displayed as a fixation cross that changes color depending on increased or decreased alpha suppression. Any suggestions?

      I hope I made the point clear enough. Thank you for your time and patience. :)

      Best,
      Beatrice

  19. Hey Chip! I just came across your blog post and am very fascinated by it. I am interested in using the OpenBCI UltraCortex IV with a Cyton board to control a toy car as a proof of concept. Would you be kind enough to point me in the right direction to begin? What programming software should be used, and how do I convert OpenBCI signals into an input like you did above?

  20. Nice work... how about running the software on a portable microcontroller?

  21. @zmzm, some microcontrollers, such as the Raspberry Pi, have USB ports that will accept the Cyton dongle. In that case, the Cyton just appears as a serial port on the Raspberry Pi.

  22. Hi guys, what is the wiring diagram to connect the OpenBCI board to an Arduino board?

  23. @imifranki, hi. The original setup mentioned in Chip's article used an Arduino-based V1 OpenBCI shield. The current OpenBCI board uses a wireless USB dongle to communicate with the board; the EEG data streams into your app on the laptop via a USB serial port. Then, to control the Hex Bug, a SEPARATE Arduino is used, which has its own USB serial port. Your app sends serial 'commands' to this Arduino, which is wired to the Hex Bug controller -- commands like the single letters 'L', 'R', 'F', standing for left, right, forward.

  24. Hello, I'm using a NeuroSky Mobile 2 to control a drone, and I'm using Processing to write the code. I want to convert the raw EEG data to the frequency domain so I can use the frequency and the power spectrum. How do I do that?
