Brain-Controlled Shark Attack! (EEG Hacker blog, by Chip, posted 2015-03-03)<br />
<br />
Visiting my friends at OpenBCI HQ, we got together to do some hacking. Since I'm always looking to control new things with my brain, I was really excited to see that someone had brought a remote-controlled shark-shaped balloon (an "<a href="http://www.wmctoys.com/products/air-swimmers">Air Swimmers</a>"). This is a very cool toy -- it swims through the air in a wondrous way. But, I can't just leave a good thing alone. So, after a few hours of hacking, Joel and I were able to turn this simple toy into a 5-person, brain-controlled, SHARK ATTACK!<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
</div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjd0eRgCFdiOnNzgN8McJLFHCPjkFAsecOotJ3bBm6nINPzVmBnS8w6CbyW1dezZnGHDu3Zs5AcuVMz6v4JA3CNbuIhu02QCNUwUi4Lu7SAYQJDB87pZPnBGoGb435T_-vMaFsOIEqrcjk/s1600/SharkAttack!.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="266" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjd0eRgCFdiOnNzgN8McJLFHCPjkFAsecOotJ3bBm6nINPzVmBnS8w6CbyW1dezZnGHDu3Zs5AcuVMz6v4JA3CNbuIhu02QCNUwUi4Lu7SAYQJDB87pZPnBGoGb435T_-vMaFsOIEqrcjk/s1600/SharkAttack!.png" width="400" /></a></div>
<br />
<u><br /></u>
<u>Approach:</u> Our approach to this hack is extremely similar to the approach that we used for our <a href="http://eeghacker.blogspot.com/2014/11/two-brains-one-robot.html">multi-person control of a toy robot</a>. As shown in the figure below, the idea is that you get multiple players hooked up to a single EEG system (OpenBCI, in my case). The computer processes the EEG data looking for each person's <a href="http://eeghacker.blogspot.com/2013/11/openbci-alpha-wave-detector.html">eyes-closed Alpha waves</a>. Depending upon which person's Alpha waves are detected, the computer sends commands to the shark. The commands are conveyed to the shark via an <a href="http://www.arduino.cc/">Arduino</a>, which is driving the shark's remote control. The end result is that the shark swims because of one player's brain waves. I think that's pretty cool.<br />
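The "depending upon which person's Alpha is detected, send a command" idea can be sketched in a few lines of Python. This is a hypothetical simplification of what the actual Processing GUI does, and the player-to-command mapping shown here is my own illustration, not necessarily the assignment we used:<br />
<br />

```python
# Map each EEG channel (one per player) to a shark command.
# The mapping here is illustrative; the real assignments live in the GUI code.
PLAYER_COMMANDS = {0: "left", 1: "right", 2: "up", 3: "down", 4: "forward"}

def dispatch(alpha_uV_per_channel, threshold_uV=10.0):
    """Return the command for the player with the strongest Alpha,
    or None if nobody's Alpha amplitude exceeds the threshold."""
    best = max(range(len(alpha_uV_per_channel)),
               key=lambda ch: alpha_uV_per_channel[ch])
    if alpha_uV_per_channel[best] < threshold_uV:
        return None  # nobody has eyes-closed Alpha right now
    return PLAYER_COMMANDS[best]
```

So, for example, if player 2's channel shows 15 uV of Alpha while everyone else is quiet, the shark gets the "right" command.<br />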
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjsQKSk1Bd4A_IG0W7-ug7V7_qkaCr_QuOtwEwb5T_Tvy3TzASJ7f5tEu0Wnkn5IMieET4CxI-WE33RRKEm6LWfCFCpLqPmPyAZXmTaYP5daLW9RZSc1vrUMOL4qhkG4aU2L_u50SnYo3c/s1600/MultiPersonSharkSchematic.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="422" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjsQKSk1Bd4A_IG0W7-ug7V7_qkaCr_QuOtwEwb5T_Tvy3TzASJ7f5tEu0Wnkn5IMieET4CxI-WE33RRKEm6LWfCFCpLqPmPyAZXmTaYP5daLW9RZSc1vrUMOL4qhkG4aU2L_u50SnYo3c/s1600/MultiPersonSharkSchematic.png" width="560" /></a></div>
<br />
<u>Two-Person Demo:</u> As Joel and I were pulling this hack together, we started to test it using just the two of us. Being just two people, we could only do two shark commands, not all five. It was still pretty fun, though. I love the sense of excitement that happens when a hack first starts to work.<br />
<br />
<div style="text-align: center;">
<iframe allowfullscreen="" frameborder="0" height="315" src="https://www.youtube.com/embed/Qnr3YWWAZL0" width="420"></iframe></div>
<br />
<u>Hacking the Remote Control:</u> To make this shark controllable from the computer, we needed to hack into the shark's remote control. Like when <a href="http://eeghacker.blogspot.com/2014_05_01_archive.html">I hacked the remote for the toy robot</a>, Joel found that the remote for the shark was simply a few push buttons that were wired to pull one side of the switch down to ground whenever the button was pushed. So, to make this controllable from my computer, Joel soldered some wires to the circuit board (to the high side of each switch) to allow an Arduino to pull it down to ground instead of having to push it with your finger. As a result, we can now send a command to the Arduino and cause the shark to move. Our Arduino code for this hack is on GitHub <a href="https://github.com/OpenBCI/OpenBCI_Processing/tree/variant_sharkSwimmer">here</a>.<br />
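On the PC side, "pushing a button" then just becomes a matter of writing a byte to the Arduino's serial port. Here's a minimal Python sketch of that idea. To be clear, the one-character command codes below are placeholders I made up for illustration; the real protocol is defined in the GitHub code linked above:<br />
<br />

```python
# Hypothetical one-byte command codes for the hacked remote; on receipt,
# the Arduino would pull the matching button line down to ground.
BUTTON_CODES = {"left": b"L", "right": b"R", "up": b"U", "down": b"D", "tail": b"T"}

def press_button(port, name):
    """Send the command byte for `name` to a serial-port-like object.
    `port` just needs a .write() method (e.g. serial.Serial, io.BytesIO)."""
    code = BUTTON_CODES[name]
    port.write(code)
    return code

# Real usage might look like this (assuming pyserial and a port name):
#   import serial
#   port = serial.Serial("COM3", 57600)
#   press_button(port, "left")
```
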
<div style="text-align: center;">
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiUacN4CotUpuQruWOaP9LgX92a7JOVxruBCF-6qhvgG4FSiHRGtTS-tx15LMxCN_rhRrzAZZ76sxLWQ9TehAdqBe_0wkqlt7Iw5bh1OJ_Xorq1MAXpmVm3HC6f0eJobKsl1GfFrx5P8Ac/s1600/HackedRemoteControl.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="212" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiUacN4CotUpuQruWOaP9LgX92a7JOVxruBCF-6qhvgG4FSiHRGtTS-tx15LMxCN_rhRrzAZZ76sxLWQ9TehAdqBe_0wkqlt7Iw5bh1OJ_Xorq1MAXpmVm3HC6f0eJobKsl1GfFrx5P8Ac/s1600/HackedRemoteControl.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">We modified the shark's remote control by adding a wire to the non-grounded side of each push button.<br />
We brought the wires out and connected them to an Arduino.</td></tr>
</tbody></table>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiHpMX3bgjcjTB7RpOQWms169BBthma4ApEEHu6CAJkHq6Pe6RkORAdoB9Sr8t-vP0Y8uoqwXx1GqYkkDPqqb7fBSiVEz5utjyLicEutQbngiGoaOQcy9MVVImN1ISl6QvzZTWxMrXVchw/s1600/RemotePlusArduino.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="213" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiHpMX3bgjcjTB7RpOQWms169BBthma4ApEEHu6CAJkHq6Pe6RkORAdoB9Sr8t-vP0Y8uoqwXx1GqYkkDPqqb7fBSiVEz5utjyLicEutQbngiGoaOQcy9MVVImN1ISl6QvzZTWxMrXVchw/s1600/RemotePlusArduino.png" width="320" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">An Arduino drives the shark's remote control.</td></tr>
</tbody></table>
<div>
<br />
<u>EEG Electrode Setup:</u> It's quite easy to record a person's eyes-closed Alpha waves. You need three electrodes. Put one electrode on the back of your head (O1 or O2, if you know <a href="http://en.wikipedia.org/wiki/10-20_system_%28EEG%29">the 10-20 system</a>), put the EEG reference electrode onto your earlobe, and put the EEG bias electrode on your other earlobe. You can see some examples in the photo below, where we had three people controlling the shark. We used the gold cup electrodes and Ten20 electrode paste that came with the OpenBCI kit.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjyAaf02Cj0ut4ndwb_eXXdTleYHvvGAbntA8Vb4DfRc1NQgMcpWbIXWHvoWgfDsdNuo3LwysHibppZuZfnWs8hX4Co6WdNZwOskOrf42ClF35yAGz29g1jHf9jEwxHwb3TJahBaGpyDVI/s1600/IMG_3989.JPG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="266" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjyAaf02Cj0ut4ndwb_eXXdTleYHvvGAbntA8Vb4DfRc1NQgMcpWbIXWHvoWgfDsdNuo3LwysHibppZuZfnWs8hX4Co6WdNZwOskOrf42ClF35yAGz29g1jHf9jEwxHwb3TJahBaGpyDVI/s1600/IMG_3989.JPG" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Three-Man "Team Alpha!" Controlling the Shark. You can also see<br />
where we put the electrodes -- back of head and both earlobes.</td></tr>
</tbody></table>
<br />
<u>OpenBCI Setup:</u> We are going to wire up multiple people to control this shark. And to be clear, it is not normal to hook multiple people to one EEG system. But that is what we are going to do. This is definitely using EEG in a non-traditional way. That's why this is called "hacking". The trick to making it work is to tell the EEG system (in this case, OpenBCI) that each person has his own EEG reference electrode. OpenBCI enables this by allowing each EEG channel to be run in "differential mode", where you use each channel's "P" and "N" inputs as a differential pair. This is in contrast to the more-usual "common reference mode", where we use one of the SRB inputs as a common EEG reference for all EEG channels. To change OpenBCI to differential mode, you use the OpenBCI GUI, via the "Chan Set" tab, to change each channel's "SRB1" and "SRB2" setting to "off". Then, in the main window, turn off all of the other channels that you are not using.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjo934oV8Kmj3VivCmB6VBIwdCvlkfDeQyzJHNeDmZm_pFh-iMP5EIBJ0vjMM5bZQzlGRteKe08I6DJ7W6SA4DSJS2AdaIkRpScjNjuxJ-QuPW20HI7JBj_h2HIVDve-sZBfmqNv4AVhzs/s1600/ScreenShot.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="300" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjo934oV8Kmj3VivCmB6VBIwdCvlkfDeQyzJHNeDmZm_pFh-iMP5EIBJ0vjMM5bZQzlGRteKe08I6DJ7W6SA4DSJS2AdaIkRpScjNjuxJ-QuPW20HI7JBj_h2HIVDve-sZBfmqNv4AVhzs/s1600/ScreenShot.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Screenshot Showing How to Configure for Five Channels in<br />
Differential Mode...simply Turn Off SRB2.</td></tr>
</tbody></table>
<br />
<u>Plugging Into OpenBCI:</u> Once you've got the EEG electrodes on the individual players, you've got to hook them into the OpenBCI board. Because we're in "differential mode", each player gets one "P" input and one "N" input. For this hack, we put the electrode from the back of the head into the "N" input. We then put the left earlobe into the corresponding "P" input. Finally, the right earlobe was connected to a bias pin. Because we had five players, there weren't enough bias pins available. I used a 16-channel OpenBCI board because it has 4 bias pins, so that covered four players. The fifth player simply plugged into the analog ground pin ("AGND"), which is not as good as using a bias pin, but it worked well enough.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEipbuky03RxUi4RWPQQL53idsmFwSaXg10IA8njN5cY5nhBviycL0QukSyUvvE8esQ4N5OYNdOz04hLivpDwKzX22pG8bYM_JSkb6HfFHHar33zQi84Q2hL_laH91-MGD5qs-Hu7U-B2m8/s1600/OpenBCI_Wiring.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="213" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEipbuky03RxUi4RWPQQL53idsmFwSaXg10IA8njN5cY5nhBviycL0QukSyUvvE8esQ4N5OYNdOz04hLivpDwKzX22pG8bYM_JSkb6HfFHHar33zQi84Q2hL_laH91-MGD5qs-Hu7U-B2m8/s1600/OpenBCI_Wiring.png" width="320" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Wiring Electrodes to the OpenBCI Board for Five Players. Note that the board only<br />
has 4 bias connections, so one player is attached to AGND instead.</td></tr>
</tbody></table>
<br />
<u>EEG Processing Algorithms:</u> As mentioned earlier, the PC does all of the EEG processing...no processing occurs on the OpenBCI board itself. For the software on the PC, we started with the stock OpenBCI GUI. Then, I extended it by (1) adding the <a href="https://github.com/OpenBCI/OpenBCI_Processing/blob/variant_sharkSwimmer/OpenBCI_GUI/EEG_Processing.pde">Alpha detection algorithms</a> and by (2) adding code to <a href="https://github.com/OpenBCI/OpenBCI_Processing/blob/variant_sharkSwimmer/OpenBCI_GUI/HexBug.pde">send shark commands</a> to the Arduino. This variant of the OpenBCI GUI is currently saved <a href="https://github.com/OpenBCI/OpenBCI_Processing/tree/variant_sharkSwimmer">here</a> on GitHub as a branch of the main repository. As you can tell from the class names shown in the code, the code is based heavily upon the previous work with the HexBug robot...which you may find confusing since we're controlling a shark and not a HexBug. Sorry for the confusion!<br />
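At its heart, the Alpha detection is just a band-power measurement: take the latest chunk of EEG, measure the amplitude in the Alpha band (roughly 8-12 Hz), and compare it against a threshold. Here is a rough numpy sketch of that idea; my actual Processing code differs in its details, and the threshold value here is an arbitrary placeholder:<br />
<br />

```python
import numpy as np

def alpha_band_uV(eeg_uV, fs_Hz, band=(8.0, 12.0)):
    """Amplitude-like measure of the EEG in the Alpha band via FFT."""
    eeg_uV = np.asarray(eeg_uV, dtype=float)
    eeg_uV = eeg_uV - eeg_uV.mean()                   # remove DC offset
    spec = np.fft.rfft(eeg_uV * np.hanning(len(eeg_uV)))
    freqs = np.fft.rfftfreq(len(eeg_uV), 1.0 / fs_Hz)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    # Sum the power in the band, convert back to an amplitude-like number
    return np.sqrt(np.sum(np.abs(spec[in_band]) ** 2)) / len(eeg_uV)

def is_alpha(eeg_uV, fs_Hz, threshold_uV=2.0):
    """True if the Alpha-band amplitude exceeds the (placeholder) threshold."""
    return alpha_band_uV(eeg_uV, fs_Hz) > threshold_uV
```

Feed it one second of data at a time (OpenBCI samples at 250 Hz per channel) and a strong eyes-closed 10 Hz rhythm lights it up, while ordinary eyes-open EEG should stay below threshold.<br />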
<br />
<u>Three-Person Testing:</u> After Joel and I did our two-person testing, we roped in a couple of other players to join Joel. That got us to a three-man shark attack!</div>
<div>
<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj-aP6Phmij_RXVwvV45k7g0-eqtosACWpMg7pMDP-Iou4bIqkAGnk0i8rNFiSu_x0lHSMjvcyKjlH_dnQ-QsmNrVspqKoJ7Q2G53yAIhuiFu7nuPWmsTnMYGCmsbQ_r_FNB6Yoga9Js4c/s1600/IMG_3981-001.JPG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="266" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj-aP6Phmij_RXVwvV45k7g0-eqtosACWpMg7pMDP-Iou4bIqkAGnk0i8rNFiSu_x0lHSMjvcyKjlH_dnQ-QsmNrVspqKoJ7Q2G53yAIhuiFu7nuPWmsTnMYGCmsbQ_r_FNB6Yoga9Js4c/s1600/IMG_3981-001.JPG" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Three-Person Brain-Controlled Shark</td></tr>
</tbody></table>
<br />
<u>Shark Food:</u> While we were all hacking the shark to make it brain-controlled, Conor was busy doing his own hacking. Once we finally got our brain-controlled shark into fighting condition, we couldn't resist swimming it over to harass Conor. Conor was pretty sure that his teeth were sharper than the shark's, so he wasn't much afraid.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjrZW-99BeQE689OIVX4SFWeZ5HfF9dVAQSM1kCB54hOW00LvLA491EpWtU7_exU5-x1Iv6VD96sSr5BIXsTLzBLTjvEaYoks1eVlBqe507efRIeMTTrdfif9uHVdJVmdP0-LbeIjPejgE/s1600/IMG_3992.JPG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="266" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjrZW-99BeQE689OIVX4SFWeZ5HfF9dVAQSM1kCB54hOW00LvLA491EpWtU7_exU5-x1Iv6VD96sSr5BIXsTLzBLTjvEaYoks1eVlBqe507efRIeMTTrdfif9uHVdJVmdP0-LbeIjPejgE/s1600/IMG_3992.JPG" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Conor Faces Off Against the Shark.</td></tr>
</tbody></table>
<br />
<u>Four-Person Shark Control:</u> Having successfully used three people to control the shark, we wired up a fourth person. Have you ever tried to get four people doing anything in a smooth and coordinated fashion? It's hard! But, we did have success...<br />
<br />
<div style="text-align: center;">
<iframe allowfullscreen="" frameborder="0" height="315" src="https://www.youtube.com/embed/13lnTEcIhSw" width="420"></iframe>
</div>
<br />
<u>Five-Person Shark Attack:</u> OK, if four people working together is hard, five people is just chaos. In case you can't see him, notice below that the fifth guy is in the center of the crowd, kneeling so that you just see his head popping above the bench. The wiring on the OpenBCI electrodes seems generously long when you're just attaching one person. With 5 people, though, you really need longer wires...or you simply need a little creativity on how you pack the people together.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgMmtoRw9IwaRmhzrxMi80Vnme9vh_vTRj-CfeZtfdACt6KrQceGfroyxEBqs2bHSLTwAv1-ENEFHbAXeO78xML4QJyUFhKyKztv5lYTy5gznW6jCY6tpQT7pmAnGhJoAP0SWpM5R2lYjU/s1600/IMG_4003.JPG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="266" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgMmtoRw9IwaRmhzrxMi80Vnme9vh_vTRj-CfeZtfdACt6KrQceGfroyxEBqs2bHSLTwAv1-ENEFHbAXeO78xML4QJyUFhKyKztv5lYTy5gznW6jCY6tpQT7pmAnGhJoAP0SWpM5R2lYjU/s1600/IMG_4003.JPG" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Five-Person Brain-Control of the Swimming Shark. The fifth person is kneeling<br />
and you can only see his head. We need longer wires!</td></tr>
</tbody></table>
<br />
<u>Swimming Away:</u> By the time that we got this all working, it was really late at night. The time stamps on the pictures show that it was about midnight, and we'd been at OpenBCI HQ since about 10AM. So, between the fatigue and all the caffeine, I think that we were getting some funny brain wave behavior. For example, one guy was making some weak Alpha even with his eyes open. Wacky! Regardless, we were able to get the shark to swim around...until we swam the shark too far to one end of the OpenBCI HQ...<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiMsqD-Z3N6-ojJmtEdEa51I0saQAAF-mjbAQU_pMNooCLPSJGUNgV3KUZvPPppLtmdg3LnYY75okL7K8DZLahrAwxVdmuKLyhJSbJFFMLwGkTxVsRxfQARUmlxOqo2U3iqx0LzFAnNmXg/s1600/IMG_4005.JPG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="266" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiMsqD-Z3N6-ojJmtEdEa51I0saQAAF-mjbAQU_pMNooCLPSJGUNgV3KUZvPPppLtmdg3LnYY75okL7K8DZLahrAwxVdmuKLyhJSbJFFMLwGkTxVsRxfQARUmlxOqo2U3iqx0LzFAnNmXg/s1600/IMG_4005.JPG" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Swimming the Shark Off Into the Sunset.</td></tr>
</tbody></table>
<br />
...at which point the shark's IR remote control could no longer communicate with the shark. Stranded Shark! And so our night of EEG hacking ended. Still, it was mighty fine work. Go Team Alpha!<br />
<br />
Update 2015-03-09: I just saw that someone has already made a BCI for controlling this very same swimming shark (Chen <i>et al. </i>"<a href="https://books.google.com/books?id=ZTnMBQAAQBAJ&pg=PA176&lpg=PA176&dq=eeg+air+swimmers&source=bl&ots=vdU9wweG-M&sig=alNy5AKmgaQ3Bi269DJrRlWfqgY&hl=en&sa=X&ei=D879VODmJLDisASkoYCYDQ&ved=0CCsQ6AEwAg#v=onepage&q=BCI&f=false">Recreational devices controlled using an SSVEP-based Brain Computer Interface (BCI)</a>"). Note that they used one person to control the shark via SSVEP, which is exactly what I did with <a href="http://eeghacker.blogspot.com/2014/06/controlling-hex-bug-with-my-brain-waves.html">my brain-controlled Hex Bug</a>!<br />
<br />
Update 2015-09-24: Wow! I was given an opportunity to write an article for IEEE Spectrum for their Oct 2015 issue. How cool is that? You can check it out here: "<a href="http://spectrum.ieee.org/geek-life/hands-on/openbci-control-an-air-shark-with-your-mind">OpenBCI: Control An Air Shark With Your Mind</a>".<br />
<br />
Update 2015-11-08: I see that Wired (magazine) posted <a href="http://www.wired.com/2015/11/watch-these-guys-make-a-shark-swim-with-their-minds/">their nicely-done video</a> on our shark hacking. It's quite an enjoyable piece. Good work, Wired!<br />
<br /></div>
Brain Got Beats -- Not Yet (EEG Hacker blog, by Chip, posted 2015-01-25)<br />
<br />
I like controlling things with my mind. That's why I do this brain-computer interface (BCI) thing. The tough part of BCIs, though, is finding brain signals that are simple enough for the computer to detect, yet are also something that I can consciously control. So far, I can do <a href="http://eeghacker.blogspot.com/2013/11/openbci-alpha-wave-detector.html">eyes-closed Alpha waves</a>, <a href="http://eeghacker.blogspot.com/2014/04/detecting-concentration.html">concentration-controlled Beta/Gamma</a>, and <a href="http://eeghacker.blogspot.com/2014/06/controlling-hex-bug-with-my-brain-waves.html">steady-state visual evoked potential</a> (SSVEP). I need more options. Today, I'm going to try to do auditory steady-state response (ASSR). Or, more colloquially, does my brain got beats?<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhY3zKYVFBUPx7FVNzFx5CEtW7wVK2ZzGeFhcqxiQLdN3zorEVw4KtZRJlunKXalkl4KXTOZb-6jQdhlQm75yb4s_NYZEfFR2HWM2gedBeNsI0MZtE-zFToSFVvWR2xWnbjOoX57P9YUz4/s1600/BrainGotBeats.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="266" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhY3zKYVFBUPx7FVNzFx5CEtW7wVK2ZzGeFhcqxiQLdN3zorEVw4KtZRJlunKXalkl4KXTOZb-6jQdhlQm75yb4s_NYZEfFR2HWM2gedBeNsI0MZtE-zFToSFVvWR2xWnbjOoX57P9YUz4/s1600/BrainGotBeats.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Can I use beating tones to entrain brainwaves?</td></tr>
</tbody></table>
<br />
<h3>
Auditory Steady-State Response (ASSR)</h3>
<div>
<br />
The idea with ASSR is that we are looking for EEG signals from my brain that are driven by sounds presented to my ears. When doing an ASSR, you use an audio tone whose amplitude is varied ("modulated") at a fixed rate such as 40 Hz. Then, when you play that sound in your ears, you look in the EEG signals for a strong 40 Hz component. Easy, eh?<br />
<br />
Note that this is very similar to the steady-state visual evoked potential (SSVEP) that I used previously, where I'd make my computer screen blink at 8 Hz and 8 Hz signals would <a href="http://eeghacker.blogspot.com/2014/05/visual-entrainment-blinking-screen.html">appear in my EEG</a>.<br />
<br />
<h3>
Attention-Based ASSR?</h3>
<br />
If I want to use ASSR for a brain-computer interface (ie, for controlling robots!), there needs to be some way to consciously control my response to the sound. For the SSVEP, where the stimulation was my blinking computer screen, my response was much stronger if I consciously paid attention to the blinking screen. This attention-based response was the key to being able to exploit it for a BCI.<br />
<br />
Does ASSR have a similar attention-based component? Until yesterday morning, I didn't know. But then I came across this paper: <span style="font-family: inherit;">Do-Won Kim <i>et al.</i> "Classification of selective attention to auditory stimuli: Toward vision-free brain–computer interfacing". Journal of Neuroscience Methods 197 (2011) 180–185. <a href="http://h2j.info/file/3.pdf">PDF here</a></span>.<br />
<br />
<h3>
Kim's ASSR Setup</h3>
</div>
<div>
<br /></div>
<div>
In the paper by Kim, they used two loudspeakers to present tones to the test subject. The setup is shown below. The subjects were sitting down in a comfy chair listening to the tones while wearing a small montage of EEG electrodes (Cz, Oz, T7, T8, ref at left mastoid, ground at right mastoid).</div>
<div>
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiBfx4tXsKcHRMR41G8nqglVVpXhW2tOUStoZA4GkUXuExn7EVIXXXoMNR9eL4TaeS_2DhmXrv_mnKXwr4Mm2xb5pBNushwQeTDc3JsiYXvtfVK3kUnsGUDcR28ZzNuqTE5EVVAtP9a6Vg/s1600/Kim-Setup.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="177" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiBfx4tXsKcHRMR41G8nqglVVpXhW2tOUStoZA4GkUXuExn7EVIXXXoMNR9eL4TaeS_2DhmXrv_mnKXwr4Mm2xb5pBNushwQeTDc3JsiYXvtfVK3kUnsGUDcR28ZzNuqTE5EVVAtP9a6Vg/s1600/Kim-Setup.png" width="480" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Test Setup as used by Kim (2011) for Evoking Auditory Steady-State Response (ASSR)</td></tr>
</tbody></table>
<br />
For the audio tones, they used a 2500 Hz tone from one speaker and a 1000 Hz tone from another speaker. The key feature of ASSR, though, is the modulation of these tones. For one of the tones, they varied the amplitude of the tone (ie, they alternately made it quiet and loud) at a rate of 37 Hz, while they modulated the other tone at a rate of 43 Hz. These frequencies are the "beat rates" for the audio. It is the 37 Hz or 43 Hz beat rate that they are looking for in the EEG (hence, "brain got beats?").<br />
<br />
Below is what they saw in the EEG signals (Cz) for one of their subjects when the subject gave their attention to the 37 Hz modulated signal (red) or the 43 Hz modulated signal (blue). There is clearly a difference. This makes me happy. This is what I want to recreate with my own testing.</div>
<div>
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjwbxZhKcUatadtj3uKLmiw9_be-GTDXXGcnV5STOl1q3zcWQ50VJ56vBhbk56Ycs_1UGp7AeGeZmXFBzR9Wd0lBj6P8cq58G7ofXbP22cqLdcYyqpflQC_Pi1deUaGNSE4RTe7Z3gWDTM/s1600/Kim-Results.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="245" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjwbxZhKcUatadtj3uKLmiw9_be-GTDXXGcnV5STOl1q3zcWQ50VJ56vBhbk56Ycs_1UGp7AeGeZmXFBzR9Wd0lBj6P8cq58G7ofXbP22cqLdcYyqpflQC_Pi1deUaGNSE4RTe7Z3gWDTM/s1600/Kim-Results.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Spectral Results for One Subject from Kim (2011) In Response to Steady-Pitch<br />
Tones that were Amplitude Modulated at 37 Hz or 43 Hz.</td></tr>
</tbody></table>
<br />
<h3>
My Test Setup</h3>
<br />
I want to recreate their results. I'm going to create some audio files with the amplitude modulated signals, I'm going to play them into my ears via headphones, and I'm going to record my EEG signals (<a href="http://openbci.com/">OpenBCI</a>!) to look for my ASSR. <br />
<br />
<u>EEG Setup:</u> Reading more details from the paper, they said that they got the strongest response from the electrode at Cz, so I decided to start there. I put one electrode at the top of my head (Cz) with the reference on my left ear lobe and the OpenBCI "bias" on my right ear lobe. I used the gold electrodes and the Ten20 EEG paste that came with the OpenBCI kit. Without really trying, I happened to get an electrode impedance of 20-30 kOhm at both Cz and at the reference, which are probably good enough.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg2e79owIuhDiAFRanyt6YQQ8RAHiUuMfDcRkI3OSmg0nfScRhHED3lIPIceloTP0CraAYb3lMe3jax6CvPCkaSCY-7_7nUTflwhSWkh07ER7WypWxSOxvniAM7Ptc2bICuYz6BBDuh5-g/s1600/EEG-Seutp.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="248" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg2e79owIuhDiAFRanyt6YQQ8RAHiUuMfDcRkI3OSmg0nfScRhHED3lIPIceloTP0CraAYb3lMe3jax6CvPCkaSCY-7_7nUTflwhSWkh07ER7WypWxSOxvniAM7Ptc2bICuYz6BBDuh5-g/s1600/EEG-Seutp.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">My EEG Setup, Cz Only. Also, unlike Kim, I used ear buds (headphones)<br />
instead of loudspeakers to present my tones.</td></tr>
</tbody></table>
<br />
<u>OpenBCI EEG System:</u> For this test, I happened to use my <a href="https://openbci.myshopify.com/collections/frontpage/products/openbci-16-channel-r-d-kit">16-channel </a>OpenBCI system. I'm only using one channel of EEG data, though, so I could have used an 8-channel system (or even another system, like <a href="https://www.olimex.com/Products/EEG/OpenEEG/">OpenEEG</a>) just as well. I wired up my OpenBCI unit as shown below. Starting from the left, the white wire is the "bias" (aka, driven ground) going to my right ear lobe, the brown wire is the electrode at the top of my head, and the black wire is the reference electrode on my left ear lobe. Note that they are all plugged into the lower row of pins (the "N" inputs) on the lower board. The system is being powered by four AA batteries and is sending its data wirelessly back to the PC. I'm using the <a href="https://github.com/OpenBCI/OpenBCI_Processing">OpenBCI GUI</a> in Processing.<br />
<div>
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj7X2y1TQoaxXhQdkSL76PKmhJJ7jZkVzcpEMeDIknViS4qkq-J0WA3TbSumznqtaP-OBlhJhyLG6YaaXkar7rnGoVqOUTMcG1OM5Jrdf-0cTxJoR-S-Wkt1hkJO0sxk0ziwxv9YYpDPFo/s1600/IMG_3868.JPG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="213" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj7X2y1TQoaxXhQdkSL76PKmhJJ7jZkVzcpEMeDIknViS4qkq-J0WA3TbSumznqtaP-OBlhJhyLG6YaaXkar7rnGoVqOUTMcG1OM5Jrdf-0cTxJoR-S-Wkt1hkJO0sxk0ziwxv9YYpDPFo/s1600/IMG_3868.JPG" width="320" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Here's How I Plugged into the OpenBCI Board.</td></tr>
</tbody></table>
<br />
<u>Audio Files:</u> I created my audio files in <a href="http://audacity.sourceforge.net/">Audacity</a>. I created two sets of files, based on the frequencies used in the Kim paper: one set of files using a 1000 Hz tone and the other set using a 2500 Hz tone. The Kim paper said that the strongest ASSR generally occurs for a beat frequency of 40 Hz. I wanted to see my response at different beat frequencies, so for each tone I created three versions: one beating at 38 Hz, one at 40 Hz, and one at 42 Hz. I made each version 20 seconds long. I used square-wave (ie, on/off) amplitude modulation, though next time I might try sine-wave modulation instead.<br />
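If you'd rather script the tones than click through Audacity, a few lines of numpy will generate the same kind of signal. This is a sketch under my own assumptions (44.1 kHz sample rate, full-depth on/off modulation), not a byte-for-byte match of my Audacity files:<br />
<br />

```python
import numpy as np

def am_tone(carrier_Hz, beat_Hz, dur_sec=20.0, fs_Hz=44100):
    """Sine carrier with square-wave (on/off) amplitude modulation."""
    t = np.arange(int(dur_sec * fs_Hz)) / fs_Hz
    carrier = np.sin(2 * np.pi * carrier_Hz * t)
    gate = (np.sin(2 * np.pi * beat_Hz * t) >= 0).astype(float)  # 0/1 square wave
    return carrier * gate

# The six test tones: two carriers x three beat rates
tones = {(f0, fb): am_tone(f0, fb)
         for f0 in (1000, 2500) for fb in (38, 40, 42)}
```

From there, something like scipy.io.wavfile.write could save each array out as a WAV file for playback.<br />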
<div>
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi7Mdg6A5gDn1TPW2KouUTmTQpcuZqt3rgoRDPOa78C3j2fuMhoU2GcUVaRIXtGLq5Sv4JFrqDm_4AmSZNSMwzU4QDRBV6CcdGY3qaG8ElldYRsLM301XO_rwVycK0ffocGZ5Wd6g5huoI/s1600/AudacityScreenshot.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="210" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi7Mdg6A5gDn1TPW2KouUTmTQpcuZqt3rgoRDPOa78C3j2fuMhoU2GcUVaRIXtGLq5Sv4JFrqDm_4AmSZNSMwzU4QDRBV6CcdGY3qaG8ElldYRsLM301XO_rwVycK0ffocGZ5Wd6g5huoI/s1600/AudacityScreenshot.png" width="480" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">I Created My Amplitude-Modulated (AM) Test Tones in Audacity. First, "generate" the<br />
tone. Then, to do the AM, go under "Effect" and select "Tremolo". </td></tr>
</tbody></table>
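For anyone who would rather script the stimulus generation than click through Audacity, here is a minimal NumPy sketch of one of these test tones (1000 Hz carrier, 40 Hz square-wave on/off modulation, 20 seconds). The 48 kHz sample rate and the output filename are choices I made for this example, not values from the post:

```python
import wave
import numpy as np

fs = 48000          # sample rate (Hz) -- an assumption; any standard rate works
dur = 20.0          # each test file was 20 seconds long
f_carrier = 1000.0  # carrier tone (this set; the other set used 2500 Hz)
f_beat = 40.0       # AM "beat" frequency (38, 40, or 42 Hz)

t = np.arange(int(fs * dur)) / fs
carrier = np.sin(2 * np.pi * f_carrier * t)
# square-wave (on/off) amplitude modulation at the beat frequency, 50% duty cycle
envelope = (np.sin(2 * np.pi * f_beat * t) > 0).astype(float)
tone = 0.8 * carrier * envelope  # scale down to leave a little headroom

# write a 16-bit mono WAV using only the standard library
with wave.open('am_1000Hz_40Hz.wav', 'wb') as w:
    w.setnchannels(1)
    w.setsampwidth(2)
    w.setframerate(fs)
    w.writeframes((tone * 32767).astype(np.int16).tobytes())
```

Swapping the envelope for `0.5 * (1 + np.sin(2 * np.pi * f_beat * t))` would give the sine-wave modulation mentioned at the end of this post.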
<u><br /></u>
<u>Data and Analysis Files:</u> My audio files, my data files, and my analysis files are all on my GitHub <a href="https://github.com/chipaudette/EEGHacker/tree/master/Data/2015-01-24%20Auditory%20Steady%20State">here</a>. Note that I did my analysis using an IPython Notebook (see it <a href="http://nbviewer.ipython.org/github/chipaudette/EEGHacker/blob/master/Data/2015-01-24%20Auditory%20Steady%20State/exploreData.ipynb">here</a>). My specific Python installation is described <a href="http://eeghacker.blogspot.com/2014/10/moving-from-matlab-to-python.html">here</a>.<br />
<br />
<h3>
My ASSR Response</h3>
<br />
<div>
My goal is to see if I exhibit the ASSR response with this test setup. To do the test, I wired myself up as discussed above, queued up all six audio files (the three at 1000 Hz followed by the three at 2500 Hz), put in my ear buds, and started recording.<br />
<br />
<u>Eyes Closed:</u> The spectrogram below shows my Cz EEG signal when I did this test with my eyes closed. That strong red stripe at 10 Hz is my Alpha response simply due to having my eyes closed. What I do not see here are horizontal stripes of energy at 38, 40, or 42 Hz. In other words, I do not see any brain waves entraining with the audio stimulation. This is disappointing.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh755KChuuDFS327NsdQjEyRH15xC8NyCApswqGXasLPPkKOvsP-YQgwut1IukykCG3P3lfVPd_INpU6koVvCHYpa9962FbK5lnWUGwL9wi-70oBduAuSxE-aTT5d9RS8ZxN1eD78NZiiE/s1600/Results-EyesClosed.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="260" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh755KChuuDFS327NsdQjEyRH15xC8NyCApswqGXasLPPkKOvsP-YQgwut1IukykCG3P3lfVPd_INpU6koVvCHYpa9962FbK5lnWUGwL9wi-70oBduAuSxE-aTT5d9RS8ZxN1eD78NZiiE/s1600/Results-EyesClosed.png" width="480" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;"><div style="font-size: 12.8000001907349px;">
Spectrogram of EEG Signal from Cz with AM Auditory Stimulation Near 40 Hz.</div>
<div style="font-size: 12.8000001907349px;">
My eyes were closed, hence the strong response at 10 Hz.</div>
<div style="font-size: 12.8000001907349px;">
There is no signature of the 38-42 Hz AM Audio Stimulation.</div>
</td></tr>
</tbody></table>
<br /></div>
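The "stripe hunting" in these spectrograms can also be done numerically. Below is a sketch of the kind of spectrogram computation involved, using plain NumPy. The 250 Hz sample rate matches OpenBCI's default, but the `eeg` array here is synthetic stand-in data (a fake 10 Hz alpha rhythm plus noise), not my actual recording:

```python
import numpy as np

fs = 250  # OpenBCI sample rate (Hz)
# stand-in for real Cz data: a 10 Hz "alpha" sine plus broadband noise (uV)
rng = np.random.default_rng(0)
t = np.arange(fs * 60) / fs
eeg = 10 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 2, t.size)

def spectrogram_db(x, fs, nfft=256):
    """Hann-windowed, 50%-overlap spectrogram, returned in dB re: 1 uV^2/Hz."""
    hop = nfft // 2
    win = np.hanning(nfft)
    frames = np.array([x[i:i + nfft] * win for i in range(0, len(x) - nfft, hop)])
    psd = np.abs(np.fft.rfft(frames, axis=1)) ** 2 / (fs * np.sum(win ** 2))
    freqs = np.fft.rfftfreq(nfft, d=1.0 / fs)
    return freqs, 10 * np.log10(psd + 1e-20)

freqs, S_db = spectrogram_db(eeg, fs)
# an ASSR would appear as a persistent horizontal ridge at the beat frequency
band = (freqs >= 38) & (freqs <= 42)
print('mean 38-42 Hz level: %.1f dB' % S_db[:, band].mean())
```

With this fake data, the 10 Hz bin stands well above the 38-42 Hz band, which is exactly the alpha-versus-ASSR contrast visible in the plots above.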
<u>Eyes Open:</u> I also performed this test with my eyes open. A spectrogram of my EEG signal at Cz is shown below. I started and ended the test with my eyes closed for 10 seconds, which you can see as 10 Hz Alpha waves at the start and end. What I really want to see, though, is something corresponding to the audio stimulation at 38 Hz, 40 Hz, or 42 Hz. Again, I see nothing.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhoGDjNtzH9sARSMyeUQ9GjHBiYqzomaGt0hJMbQpsTwCUZKH2qEJEJXTvJ-ABnUfO8vXVeXB1JdtabHaKv5A6MHhhddAW2kbxW6q0bO7dXAo2jNwNpOGy6Twq0Rz0NLtVYtXzOjkYZhxQ/s1600/Results-EyesOpen.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="260" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhoGDjNtzH9sARSMyeUQ9GjHBiYqzomaGt0hJMbQpsTwCUZKH2qEJEJXTvJ-ABnUfO8vXVeXB1JdtabHaKv5A6MHhhddAW2kbxW6q0bO7dXAo2jNwNpOGy6Twq0Rz0NLtVYtXzOjkYZhxQ/s1600/Results-EyesOpen.png" width="480" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;"><div style="font-size: 12.8000001907349px;">
Spectrogram of EEG Signal from Cz with AM Auditory Stimulation Near 40 Hz.</div>
<div style="font-size: 12.8000001907349px;">
My eyes were open, except at the beginning and end.</div>
<div style="font-size: 12.8000001907349px;">
There is no signature of the 38-42 Hz AM Audio Stimulation.</div>
</td></tr>
</tbody></table>
<br />
<div>
<u>Average Spectrum:</u> To most closely mimic the plot from the Kim paper (ie, the graph that I copied earlier), I plotted the average spectrum. In the Kim plot, there were clear peaks at his two beat frequencies (37 and 43 Hz). In my equivalent plot below, there are no peaks at the three beat frequencies that I studied (38, 40, and 42 Hz).<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi8cNPfuYzFKL9x1a2odNaAhpr5oWvkE4CxvtbHayVivi3nQPDmXNrVcbv5Mf0ZBhT2NcgvIuJ73g3RtE9nlJU-EYSTKy5LVQ7qI_fgXnEIW5n6L7keF2yOWWfF-lnlsGbyTDHukppERUU/s1600/Spectrum-EyesOpen.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="260" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi8cNPfuYzFKL9x1a2odNaAhpr5oWvkE4CxvtbHayVivi3nQPDmXNrVcbv5Mf0ZBhT2NcgvIuJ73g3RtE9nlJU-EYSTKy5LVQ7qI_fgXnEIW5n6L7keF2yOWWfF-lnlsGbyTDHukppERUU/s1600/Spectrum-EyesOpen.png" width="480" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;"><div style="font-size: 12.8000001907349px;">
Mean Spectrum During the Test Period. There is no evidence of my brain waves entraining</div>
<div style="font-size: 12.8000001907349px;">
with the 38, 40, and 42 Hz AM auditory signals. Bummer.</div>
</td></tr>
</tbody></table>
<br />
<u>Conclusion:</u> So, it is clear that I did not see any ASSR in my EEG recordings. This is very disappointing to me.<br />
<br />
<h3>
Comparison to Kim</h3>
<div>
<br /></div>
<div>
Why did Kim see ASSR and I did not? I'm not sure. Maybe my test setup or my audio files were sufficiently different to prevent the response. Or, maybe I'm reading too much into his results...</div>
<div>
<br /></div>
<div>
In looking back at his plot with the spectrum from one of his subjects (copied earlier in this post), I see that the y-axis is a linear axis, whereas I always plot in dB. What might his values look like when converted to dB?</div>
<div>
<br /></div>
<div>
As an example, I see that his first peak is 0.40 uV^2, relative to a baseline of about 0.30 uV^2. Converted to dB (re: 1 uV^2), this would be -4.0 dB and -5.2 dB. Comparing to my own spectrum plot above, where my baseline is about -10 dB, any peak at -4.0 dB should be easily seen. Therefore, if my own response were as strong as Kim's subject's response, I would think that I would see the response in my plots. I don't see the peak, so I guess that I didn't have the response as strongly as Kim's subject.</div>
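Since this conversion is just 10·log10 of the power values, the numbers are easy to check:

```python
import math

peak_uV2 = 0.40      # Kim's ASSR peak (uV^2)
baseline_uV2 = 0.30  # Kim's baseline (uV^2)

peak_db = 10 * math.log10(peak_uV2)      # about -4.0 dB re: 1 uV^2
base_db = 10 * math.log10(baseline_uV2)  # about -5.2 dB
print('peak: %.1f dB, baseline: %.1f dB, difference: %.1f dB'
      % (peak_db, base_db, peak_db - base_db))
```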
<div>
<br /></div>
<div>
Perhaps the "gotcha" here is that the difference in Kim's data between the peak (-4.0 dB) and the baseline (-5.2 dB) is only 1.2 dB. That is a really small difference. For reliable detection, I generally like to see 6-10 dB of difference. It might be too much to hope to reliably see only a 1.2 dB difference.</div>
<div>
<br /></div>
<h3>
Next Steps</h3>
<div>
<br /></div>
<div>
I'm not going to give up yet. I'm going to try again. I'm going to try the additional EEG electrodes used by Kim, and I'm going to try sine-wave modulation instead of square-wave modulation. I want to see this response!</div>
<div>
<br /></div>
</div>
Chiphttp://www.blogger.com/profile/10352943033779293161noreply@blogger.com11tag:blogger.com,1999:blog-7276377053120174333.post-56766206918236425782015-01-11T22:50:00.001-05:002017-04-14T16:41:49.733-04:00Estimating OpenBCI Battery LifeIn OpenBCI's twitter feed a few weeks ago, I saw that someone <a href="https://twitter.com/memowaveB/status/545648168860647424">3D printed a belt holster</a> for their OpenBCI system, including the battery pack. I thought that was pretty sweet. It also put in my mind the question of "How long might the system run on its batteries?". So, after <a href="http://eeghacker.blogspot.com/2015/01/soldering-16-chan-openbci.html">assembling the 16-channel OpenBCI system</a>, I decided to measure the power draw.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
</div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhaSTPBFiWqDkrCRkQYveguTJCmvMikPI0UgHBNTqUofxwnXlHsZgqfOeTINjSsSNggsSmMY3YkzIZjTi7lxnG3wlio7KIDvXUs6PwYWhlYf9x8zr6Qbz8M2EE27QGBc1KJHQ8edVXYpcg/s1600/IMG_3804.JPG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="266" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhaSTPBFiWqDkrCRkQYveguTJCmvMikPI0UgHBNTqUofxwnXlHsZgqfOeTINjSsSNggsSmMY3YkzIZjTi7lxnG3wlio7KIDvXUs6PwYWhlYf9x8zr6Qbz8M2EE27QGBc1KJHQ8edVXYpcg/s1600/IMG_3804.JPG" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Current Draw from 16-Channel OpenBCI System During Operation (no SD)</td></tr>
</tbody></table>
<br />
As you can see above, I used my digital multi-meter (DMM) in its "mA" setting to measure the current flow. It doesn't matter where you measure the current; you just have to break into the power circuit somewhere and bridge the gap with the DMM. I did it at the battery pack, because there you just have to pop out one battery. Easy.<br />
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiHDcpx2OEStlHq9ZtlVDmISnsBLL_GOU3SN3bN9vX65O7s53l069YBfCm2VT0kV4uhMv8hjpRXB2soo5B0xXt_r_eEceKJjWdriH5h1NtBEkB00yBQ9s7T6ZlDXaR7XaYu6LOJOIfd8ec/s1600/IMG_3803.JPG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="266" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiHDcpx2OEStlHq9ZtlVDmISnsBLL_GOU3SN3bN9vX65O7s53l069YBfCm2VT0kV4uhMv8hjpRXB2soo5B0xXt_r_eEceKJjWdriH5h1NtBEkB00yBQ9s7T6ZlDXaR7XaYu6LOJOIfd8ec/s1600/IMG_3803.JPG" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Touch the Red Lead (Positive) to the Battery</td></tr>
</tbody></table>
<div>
<br />
I measured the current draw of two of the three OpenBCI versions. As shown in the picture at the top, the 32-bit board with daisy module (ie, the 16-channel version of OpenBCI) draws about 62 mA. The picture below shows that the 8-bit board was drawing about 40 mA. In both cases, the boards were actively streaming their EEG data via their RFDuino BT module. Neither was saving data to its SD card (SD writing can be *very* power hungry). <br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhCNcph8-8tujthcznG1dwW5gcQgeUSzAndktlPOyBM3yQ_pDkm6RGa0oGEo3OJtL-kthILCylCYsCIPk5_BCT4qKN3j-KRVW0frIQp8-L5C4_KlrTMBtZbqMMjAuUEE3KV6UxRWWi20I0/s1600/IMG_3815.JPG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="266" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhCNcph8-8tujthcznG1dwW5gcQgeUSzAndktlPOyBM3yQ_pDkm6RGa0oGEo3OJtL-kthILCylCYsCIPk5_BCT4qKN3j-KRVW0frIQp8-L5C4_KlrTMBtZbqMMjAuUEE3KV6UxRWWi20I0/s1600/IMG_3815.JPG" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Measuring the Current Draw for the 8-Channel 8-Bit OpenBCI Board.</td></tr>
</tbody></table>
<br />
So, how long might the OpenBCI system run from a set of batteries? Well, that can be a complicated question. I started by looking at the datasheet for Energizer AA batteries (<a href="http://data.energizer.com/PDFs/E91.pdf">here</a>). The graph below is copied from near the end of the datasheet. It shows how the battery voltage will change as a function of time for two different loads...one called "remote" and one called "radio". Which might be similar to OpenBCI? Well, if the 16-channel board is pulling 62 mA, and the nominal battery voltage is 1.5V, then the effective load is (1.5/0.062) = 24 ohms. Hey, the graph below says that the "remote" is also a load of 24 ohms! So we can read that line directly.<br />
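The equivalent-resistance arithmetic above, spelled out:

```python
V_cell = 1.5    # nominal AA cell voltage (V)
I_draw = 0.062  # measured 16-channel current draw (A)

R_equiv = V_cell / I_draw  # effective load resistance seen by each cell
print('equivalent load: %.0f ohms' % R_equiv)  # about 24 ohms
```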
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh9tBQVAC5NAgwRcy2S-3uHzmFqveuPyYAdE03bM_kPwDUa9IA5hpIz8_QZDhOX7plCE0SyboR_a45qUDbz-ClPbckAcDAtjuXBiNHiFoscIDH4vaRsLRwbJCH4Fib78umeW7pVxkSGghk/s1600/BatteryDischarge.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="178" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh9tBQVAC5NAgwRcy2S-3uHzmFqveuPyYAdE03bM_kPwDUa9IA5hpIz8_QZDhOX7plCE0SyboR_a45qUDbz-ClPbckAcDAtjuXBiNHiFoscIDH4vaRsLRwbJCH4Fib78umeW7pVxkSGghk/s1600/BatteryDischarge.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Discharge Curve for Energizer AA Batteries. The 16-channel OpenBCI board might last 26 hours.</td></tr>
</tbody></table>
<br />
Looking at the graph, we need to know when the batteries will no longer be able to power the OpenBCI system...when can we call the batteries "dead"? Often, AA cells are considered dead at 1.0 or even 0.8 V. Unfortunately, I think (I'm not sure) that OpenBCI can't run that low. I think that it needs a 5V supply to run (though I could totally be wrong, especially if it uses a buck-boost converter). If we assume that it needs 5V, and if we've got 4 AA cells, then each cell needs to supply at least 1.25V. That's our threshold.<br />
<br />
Looking at the graph above, I focus on the blue line labeled "remote" and I see when it crosses our hypothetical 1.25V threshold. It says that it could live for 26 hours. Wow. That's a pretty long time. Cool.<br />
<br />
Remember that this lifetime is for a 62 mA current draw (ie, for the 16-channel OpenBCI system). Pulling 62 mA for 26 hours means that we are utilizing 62 mA * 26 hrs = 1612 mA-hours of battery capacity. For the 8-channel board, which only pulls 40 mA, that same battery capacity might allow us to run for 1612 mA-hrs / 40 mA = 40 hours. Not bad at all!<br />
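That capacity calculation as a quick sanity check:

```python
draw_16ch_mA = 62   # measured 16-channel draw
life_16ch_hr = 26   # lifetime read from the discharge curve
draw_8ch_mA = 40    # measured 8-channel draw

capacity_mAh = draw_16ch_mA * life_16ch_hr  # 1612 mA-hours
life_8ch_hr = capacity_mAh / draw_8ch_mA    # about 40 hours
print('%d mA-hr -> about %.0f hours on the 8-channel board'
      % (capacity_mAh, life_8ch_hr))
```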
<br />
Since the battery life looks pretty good, it means that we should be able to come up with some pretty good mobile EEG hacks. No need to stay indoors, people! Let's get outside and freak some people out with our silly EEG headgear!<br />
<br />
UPDATE 2015-07-10: In the comments section, there's been some discussion regarding my "equivalent resistance" approach to estimating battery life. As an alternative, it might be better to assume that OpenBCI board is actually a constant current load, rather than a constant resistance load. So, let's estimate the battery life using that approach. Below is the graph from the datasheet for battery life as a function of constant current draw.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhkbZS8qFBvwhhJrAYcl1xq-fSvestFz-QJ4Gs1MEQNAM-WgSiPg99pFVxWR2GoQk4aqO8yxxf6cZH14h3tw4RW8n6j8FPEJk2qtKfH_7KiqQ-rkcGB7Xi5BrBTfOsK7DjCJjWbzLvgxTQ/s1600/ConstantCurrent.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="271" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhkbZS8qFBvwhhJrAYcl1xq-fSvestFz-QJ4Gs1MEQNAM-WgSiPg99pFVxWR2GoQk4aqO8yxxf6cZH14h3tw4RW8n6j8FPEJk2qtKfH_7KiqQ-rkcGB7Xi5BrBTfOsK7DjCJjWbzLvgxTQ/s400/ConstantCurrent.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Another Method of Estimating Battery Life for the 16-Channel OpenBCI Board.</td></tr>
</tbody></table>
<br />
To use this graph, I start with the knowledge that the OpenBCI board draws 62 mA. This locates me on the x-axis. I then read up to the line corresponding to the battery voltage where my device will die. In this case, I think that OpenBCI will die at 1.25V. There's a curve for 1.2V. Let's use that. From that point, I read off the service life from the y-axis. Allowing for some uncertainty in reading a value from a logarithmic scale, it looks like the battery life would be about 23 hours. This value agrees decently well with the 26 hour value that I found based on my "equivalent resistance" method. Such agreement is always satisfying. It doesn't always work out that way. :)</div>
<div>
<br /></div>
Chiphttp://www.blogger.com/profile/10352943033779293161noreply@blogger.com7tag:blogger.com,1999:blog-7276377053120174333.post-9039508393581613942015-01-03T10:57:00.001-05:002017-04-14T16:41:55.696-04:00Soldering 16-chan OpenBCI<div class="separator" style="clear: both; text-align: center;">
</div>
<div class="separator" style="clear: both; text-align: center;">
</div>
For a while now, I've been using the <a href="https://openbci.myshopify.com/collections/frontpage/products/openbci-8-bit-board-kit">8-channel version</a> of OpenBCI. You can see some of my EEG data <a href="http://eeghacker.blogspot.com/2014/10/first-alpha-with-openbci-v3.html">here </a>and some of my accelerometer data <a href="http://eeghacker.blogspot.com/2014/12/openbci-accelerometer-data.html">here</a>. Recently, I've been interested in getting more EEG channels, which means that I have turned my attention to the <a href="https://openbci.myshopify.com/collections/frontpage/products/openbci-16-channel-r-d-kit">16-channel version</a> of OpenBCI. The 16-channel version consists of a single 8-channel OpenBCI board with an additional "Daisy Module" to provide the additional 8 channels. Today, I'm going to show a few pictures of the soldering necessary to assemble these two boards into a working unit.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgOXoWiEYfnBb23fmtqLImryDvsxi_ktsHfjNyVaevuZEYGM0ETqh7GPJdJn-X_Mbp8zTYzVOMmKgDB9YTC0_wDbwbdwai2UBkgOoV9f8j_JBNuYCEHRn-y-jl1d-GQp_1-3voX1IQss14/s1600/01-IMG_3711.JPG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="266" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgOXoWiEYfnBb23fmtqLImryDvsxi_ktsHfjNyVaevuZEYGM0ETqh7GPJdJn-X_Mbp8zTYzVOMmKgDB9YTC0_wDbwbdwai2UBkgOoV9f8j_JBNuYCEHRn-y-jl1d-GQp_1-3voX1IQss14/s1600/01-IMG_3711.JPG" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">OpenBCI Daisy Module (Left) with OpenBCI 32-Bit Board (Right) along<br />
with their male and female headers (bottom).</td></tr>
</tbody></table>
<br />
I started the assembly process by reading through <a href="http://docs.openbci.com/hardware/02-OpenBCI_Daisy_Module">the assembly instructions</a> (with its pictures) as provided on the OpenBCI website. Those instructions were good, though I thought that some additional illustration would be helpful to others. Hence, the reason for today's post.<br />
<br />
<u>Parts and Components:</u> As you can see in the picture above, the OpenBCI boards themselves are fully assembled. But, like many Arduino-style kits, you do need to solder on some pin headers in order to connect the boards together. To make this easy, I found that the OpenBCI kit comes with the correct male pin header for the Daisy module as well as the correct collection of female headers for the base OpenBCI board. Great!<br />
<br />
<u>A Trick for Soldering Headers:</u> I started by looking to solder the female headers to the base OpenBCI board. Based on my experience soldering headers to various Arduino kits, I know that soldering the female headers can be annoying because it is hard to hold the header in place while your two hands are already busy holding the soldering iron and the solder. To overcome this problem, I used a trick that I saw a while ago where you use a solderless breadboard to hold your female headers vertically in place, hands-free. It's a pretty sweet trick. In addition to a solderless breadboard, you need some cheap double-ended pins (see below left). I got mine from <a href="http://www.adafruit.com/products/400">Adafruit</a>, but they are standard items available from a number of vendors. I only ever use these pins for this soldering trick, so I bought them once and they've lived in my toolbox ever since.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEih69yJgFw84M_50N7Ao6XqPgvMfMdKyH2dpi1ZdI5JTqkBrFPO6j-131LbmG0ijWDUkILH1vnIAkdYORVrNzlJ7ITTX7loE40S_PNlKREv7Je6J6VhzpNXifUkMwdOymxYFmOc0IikHF8/s1600/02-UsingDoubleEndedPinHeaders.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="238" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEih69yJgFw84M_50N7Ao6XqPgvMfMdKyH2dpi1ZdI5JTqkBrFPO6j-131LbmG0ijWDUkILH1vnIAkdYORVrNzlJ7ITTX7loE40S_PNlKREv7Je6J6VhzpNXifUkMwdOymxYFmOc0IikHF8/s1600/02-UsingDoubleEndedPinHeaders.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Use double-ended extra-long pin headers along with a solderless breadboard as a trick to hold<br />
the female headers in place. The long pins will go into the bread board. The short stubs off<br />
the top of the female header get soldered into the OpenBCI board.</td></tr>
</tbody></table>
<br />
As seen in the picture on the right, above, you stick the extra-long headers into the female header that you are looking to solder to the OpenBCI board. Then, as shown in the picture below, left, you stick the extra-long headers into the solderless breadboard, which leaves the short pins (which are the solderable part of the female header) sticking up in the air.<br />
<div>
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiDCZg4rQusO4XDZPNlhO8CyGec7n05W3NEsbAS4xLgVRVlaitQRwHyUMRxPaNdi0rz9oHZtLgde9WLj8RsFR0pj77Hu67Yw7tLIoMFWzfrl3swzUct-GE65Utqj6VC18n7wC6chmE5dcE/s1600/03-AlignWithPinHeaders.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="246" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiDCZg4rQusO4XDZPNlhO8CyGec7n05W3NEsbAS4xLgVRVlaitQRwHyUMRxPaNdi0rz9oHZtLgde9WLj8RsFR0pj77Hu67Yw7tLIoMFWzfrl3swzUct-GE65Utqj6VC18n7wC6chmE5dcE/s1600/03-AlignWithPinHeaders.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Use the extra-long pin header to hold the female header onto the solderless breadboard.<br />
Then flip the OpenBCI board upside-down and place the OpenBCI board onto the<br />
solder pins of the female header. You're ready to solder those pins!</td></tr>
</tbody></table>
<br />
<div>
<u>Soldering the Female Headers:</u> Now, you can place the OpenBCI board over those solder pins (see the right picture, above). Note that the OpenBCI board has been flipped over so that it is face-down. It's important that you solder the header onto the correct side of the board! Once you have confirmed that everything is sitting correctly, you can start soldering.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj8Jh2m_RgS-w3gU3mKL3caTzew1W6TZ71kOSxwoAffV8GB-drgY0Pal0AmF2EOpqSC06osaGiljQ8oheFPgUQf4Kltm4DFvydEo-S42vITKh9Xi1E0grU_mXIWgFumoPQy3lyokSyW5c0/s1600/04-IMG_3773.JPG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="266" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj8Jh2m_RgS-w3gU3mKL3caTzew1W6TZ71kOSxwoAffV8GB-drgY0Pal0AmF2EOpqSC06osaGiljQ8oheFPgUQf4Kltm4DFvydEo-S42vITKh9Xi1E0grU_mXIWgFumoPQy3lyokSyW5c0/s1600/04-IMG_3773.JPG" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Soldering each one of the pins in this header.<br />
Then repeat for all of the other headers.</td></tr>
</tbody></table>
<br />
After repeating this process for all of the other female headers, the base OpenBCI board is fully prepared.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi4RKjGREtTp3hyphenhyphenVqGTiv-Q9Z0WgxfAjPE_yMg8iX-8tyXPT-RNJUqksjaMGw5FcM8BBKuf3kZK2hJrrewU2SF-VQVvNxvSfgpISxvNpewqg_i1r3vAEY1hY4bBhEHYCvEkQtuoasUCyR8/s1600/05-BaseBoardSoldered.JPG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="266" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi4RKjGREtTp3hyphenhyphenVqGTiv-Q9Z0WgxfAjPE_yMg8iX-8tyXPT-RNJUqksjaMGw5FcM8BBKuf3kZK2hJrrewU2SF-VQVvNxvSfgpISxvNpewqg_i1r3vAEY1hY4bBhEHYCvEkQtuoasUCyR8/s1600/05-BaseBoardSoldered.JPG" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">The base OpenBCI 32-bit board is finished.</td></tr>
</tbody></table>
<br />
<u>Preparing the Daisy Module:</u> With the base OpenBCI board finished, I turned to the Daisy module. Here, you start by using your pliers to snap apart the single, long, male pin header into the smaller pieces needed to fit into the different spots of the Daisy module.</div>
<div>
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiJCWZ2XmALeOCKpdlsYsF7vNMurJXvJ96g2VFPJipJzZvA369VeygMJJVkqU37EGhvU9GKDmLsSMEZ1NON1GzFzf57bD7-dGPi1RJnwcdJ4bkIKQiwsg6bHVG7momH_qmmdoHYDQGA6tk/s1600/06-PinHeadersIMG_3716.JPG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="266" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiJCWZ2XmALeOCKpdlsYsF7vNMurJXvJ96g2VFPJipJzZvA369VeygMJJVkqU37EGhvU9GKDmLsSMEZ1NON1GzFzf57bD7-dGPi1RJnwcdJ4bkIKQiwsg6bHVG7momH_qmmdoHYDQGA6tk/s1600/06-PinHeadersIMG_3716.JPG" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Prepare the male pin headers for the Daisy board. Snap the long pin header<br />
into the correct number of pieces.</td></tr>
</tbody></table>
<div>
<br /></div>
<div>
<u>Use the Base Board as Your Fixture:</u> Then, as before, it can be tricky to solder these headers when your hands are full with the soldering iron and solder. The trick this time is to use the base OpenBCI board itself as your fixture. This is a classic trick for soldering Arduino shields. As shown in the picture below, left, insert the male pin headers into the base board's female headers. Do this for all of the male headers that you will solder to the Daisy module. Once they're in place, you can simply place the Daisy module onto the pins (see below, right) and everything will be nicely aligned and ready to solder.<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhq-KbN_uNlzZ727Yr3qgVqCqXcK1MPsmrBTbtI-8h5xUYVN8btzWBSThVc0egKQdPGX69EXD6Ld8YRERNuuaUYAnPSeGpy-UvtMZX3ZbRJRl9qJqvopDsQ6v5KeNt7MFhA4TSw1Uw-6AI/s1600/07-MountDaisy.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="242" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhq-KbN_uNlzZ727Yr3qgVqCqXcK1MPsmrBTbtI-8h5xUYVN8btzWBSThVc0egKQdPGX69EXD6Ld8YRERNuuaUYAnPSeGpy-UvtMZX3ZbRJRl9qJqvopDsQ6v5KeNt7MFhA4TSw1Uw-6AI/s1600/07-MountDaisy.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">(Left) Insert the male pin headers into the female headers that were just soldered into<br />
the base OpenBCI board. This holds them in the right place. Then, place the Daisy board<br />
on top so that you can solder the pins into the Daisy board.</td></tr>
</tbody></table>
<br />
<u>Solder the Daisy Module:</u> With the pins all in place, solder the headers into place. With everything so nicely held, this part is fast! For me, it went so quickly that I forgot to solder one of the headers into place. Ooops! So, I went back and soldered the remaining pins. No problem.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjdS8db7IgFY8czVovWvVx3r-xjyxF5H8GmcB2q5G8QeRUJ5gZ2ayYphLId_NFovy0GVUBvhpJrNZNtiXOBxswtgv_kBlsclQ0EEiW7nfVdZt_7Tb4Fo7hZJs9yazBiaLBHZdK7I-aS9Bw/s1600/08-SolderingDaisy.JPG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="266" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjdS8db7IgFY8czVovWvVx3r-xjyxF5H8GmcB2q5G8QeRUJ5gZ2ayYphLId_NFovy0GVUBvhpJrNZNtiXOBxswtgv_kBlsclQ0EEiW7nfVdZt_7Tb4Fo7hZJs9yazBiaLBHZdK7I-aS9Bw/s1600/08-SolderingDaisy.JPG" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Soldering the pins to the Daisy board. Be sure to solder all of the pins on all<br />
of the new headers (I forgot one header when I did it).</td></tr>
</tbody></table>
<br />
<u>Ready for EEG:</u> With the last soldering complete, the two boards are mated and I'm ready to collect 16 channels of EEG. This is gonna be fun!<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiH9wcwKbuT4ZBT2SvuSC82h5ZstnBfwCz3xY9ZWwVDmT2ybMUULyyQaNBfFax-5qPjqtbR41IrmrqLR3Cd90TAHod7ca1c7Y8kv7w9J8oheTB34AnvcbFKvM_z6P7V2K_cVdHzgHc8ovw/s1600/09-Finished.JPG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="266" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiH9wcwKbuT4ZBT2SvuSC82h5ZstnBfwCz3xY9ZWwVDmT2ybMUULyyQaNBfFax-5qPjqtbR41IrmrqLR3Cd90TAHod7ca1c7Y8kv7w9J8oheTB34AnvcbFKvM_z6P7V2K_cVdHzgHc8ovw/s1600/09-Finished.JPG" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">I'm finished!</td></tr>
</tbody></table>
<u><br /></u>
<u>Follow-Up:</u> I measured the power draw of the system <a href="http://eeghacker.blogspot.com/2015/01/estimating-openbci-battery-life.html">here</a>.</div>
Chiphttp://www.blogger.com/profile/10352943033779293161noreply@blogger.com2tag:blogger.com,1999:blog-7276377053120174333.post-28033749425340561992014-12-02T07:59:00.002-05:002017-04-14T16:42:03.638-04:00OpenBCI Accelerometer DataThe OpenBCI V3 board does more than just EEG. Yes, I've already shown examples of doing <a href="http://eeghacker.blogspot.com/search/label/ECG">ECG </a>and <a href="http://eeghacker.blogspot.com/search/label/EOG">EOG </a>with my old V1 and V2 boards, but the new V3 board includes an <a href="http://www.st.com/web/catalog/sense_power/FM89/SC444/PF250725">accelerometer</a>, which the old boards did not have. How could an accelerometer be useful? Well, you could use it to sense orientation (or change in orientation) of the head as part of your BCI. Or, you could use it to sense rough motion, which might suggest that you'll have motion artifacts in your EEG data. Or, you could sense yourself tapping on the board as a way to introduce markers during your data collection. There are many possibilities! Today, I'm going to look at the accelerometer data for the first time.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhLdiy4MgmVQzx1epsDfAdQEb-qH097TnMxNThI0TeJN1tMuP3oCYRbGjxdbqu3OCkJY7JtSYRA86ltJIGEwy7Aa71_x_GO1sKw8U8ltXO4SJYpPsjr6aK3qJrSnbCDHKpaWwWr4Y3gY5Q/s1600/BoardWithBatteries.JPG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="267" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhLdiy4MgmVQzx1epsDfAdQEb-qH097TnMxNThI0TeJN1tMuP3oCYRbGjxdbqu3OCkJY7JtSYRA86ltJIGEwy7Aa71_x_GO1sKw8U8ltXO4SJYpPsjr6aK3qJrSnbCDHKpaWwWr4Y3gY5Q/s1600/BoardWithBatteries.JPG" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">OpenBCI V3 Board with Batteries</td></tr>
</tbody></table>
<div>
<b><u>Goal</u>:</b> My goal is to record accelerometer data during known motions so that I can confirm that the data matches the motions.<br />
<br />
<b><u>Setup</u>:</b> I used my OpenBCI V3 Board (see picture above) as delivered from OpenBCI. On the OpenBCI board, I was running the same software that OpenBCI shipped in November 2014. The OpenBCI board used its wireless link to the PC. On the PC, I ran the OpenBCI GUI in Processing. The GUI logged the data to a file.<br />
<br />
<b><u>Procedure</u>: </b>I inserted the batteries into my OpenBCI board to give it power. I started the OpenBCI GUI to begin recording data. Holding the board in my hand, I completed the following maneuvers:<br />
<ol>
<li>Start with board flat and level (z-axis points up, like in the picture at the top)</li>
<li>Roll it 90 deg to the right (x-axis points down) and 90 deg to the left (x-axis points up)</li>
<li>Tip it nose down (y-axis points down) and nose up (y-axis points up)</li>
<li>Flip it upside down (z-axis points down)</li>
</ol>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjmNl1MszebF3Sq7RZJrp99hEKIX0L3ob3ORsxu6jn-_QHrb8px1t9LYjUTyyLZjedjzkDEzwhBsUtk1dv_rYxy35K8QKaI363gFz2oEuLOjU7MSubihn7XF4SUUqbsPdEu2zdGrNf21-s/s1600/CollageOfPositions.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="393" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjmNl1MszebF3Sq7RZJrp99hEKIX0L3ob3ORsxu6jn-_QHrb8px1t9LYjUTyyLZjedjzkDEzwhBsUtk1dv_rYxy35K8QKaI363gFz2oEuLOjU7MSubihn7XF4SUUqbsPdEu2zdGrNf21-s/s1600/CollageOfPositions.png" width="520" /></a></div>
<br />
Notice the markings on the OpenBCI board (zoomed picture below) that indicate the direction of the accelerometer's axes.<br />
<br /></div>
<div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgLVL-D653FTE3E9ZPKJQVzx9Fxqf1o4LSqLv1aXSmQzNSIxMOhnvQODJzeqFzdvmnggimopLZdWGq0mF-Ih7ryUprfsZ5aSEA5VfwlA31pjSHntlikU4eSgI00sof5pTaW8HFffavBhyM/s1600/BoardZoom.JPG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="266" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgLVL-D653FTE3E9ZPKJQVzx9Fxqf1o4LSqLv1aXSmQzNSIxMOhnvQODJzeqFzdvmnggimopLZdWGq0mF-Ih7ryUprfsZ5aSEA5VfwlA31pjSHntlikU4eSgI00sof5pTaW8HFffavBhyM/s1600/BoardZoom.JPG" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">The accelerometer is the small black square towards the bottom.<br />
Note that "X" points right, "Y" points forward, and "Z" comes up out of the board.</td></tr>
</tbody></table>
<b><br /></b>
<b><u>Data Files</u>: </b>The 3-axis accelerometer data was saved to a text file by the Processing GUI. I analyzed the data using Python. The data and analysis files are available on my <a href="https://github.com/chipaudette/EEGHacker/tree/master/Data/2014-11-23%20Accelerometer">EEGHacker repo on GitHub</a>. If you use this data, be sure to unzip the ZIP file in the <span style="font-family: "courier new" , "courier" , monospace; font-size: x-small;">SavedData</span> directory!<br />
<br />
<b><u>Analysis</u>: </b>The specific goals of this analysis are to confirm that the data is well behaved, that the correct axes are responding to the known motions, that the units are correct, and that the scale factors are correct. I used an IPython Notebook to step through each one of these analyses. You can see the IPython Notebook <a href="http://nbviewer.ipython.org/github/chipaudette/EEGHacker/blob/master/Data/2014-11-23%20Accelerometer/exploreAccelData.ipynb">here</a>.<br />
<br />
<u style="font-weight: bold;">Results, Data Continuity</u>: The first thing I did was to look at the data to make sure that it was well behaved. The most important part of being well behaved is that the data is continuous. Looking at the packet counter in the data file (a counter that is transmitted by the OpenBCI board), there were no missing data packets. Excellent. I did see, however, that accelerometer data is only included in every 10th or 11th data packet. Why? Well, looking at the code on the OpenBCI board, it configures the accelerometer to produce data at only 25 Hz. So, compared to the 250 Hz sample rate for the EEG data (which then drives a 250 Hz rate for data packets), we see why we only get acceleration values every 10th or 11th packet. It makes sense. Good.<br />
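If you want to run the same continuity check on your own recording, here is a minimal Python sketch (the helper name is mine, and the exact log-file layout is whatever the Processing GUI writes, so treat the data-loading side as an assumption):

```python
import numpy as np

def count_dropped_packets(packet_counter):
    """Count gaps in OpenBCI's 8-bit packet counter.
    The counter wraps at 256, so consecutive differences should
    always be 1 (mod 256); anything else indicates a dropped packet."""
    diffs = np.diff(np.asarray(packet_counter, dtype=np.int64)) % 256
    return int(np.sum(diffs != 1))

# A clean counter that wraps 255 -> 0 shows zero drops
clean = np.arange(1000) % 256
print(count_dropped_packets(clean))  # -> 0

# EEG packets arrive at 250 Hz but the accelerometer runs at 25 Hz,
# so only about one packet in ten carries a fresh accelerometer sample
print(250 // 25)  # -> 10
```

Running this on the logged packet-counter column is a quick sanity check before trusting any of the downstream analysis.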
<br />
<b><u>Results, Individual Axes</u>:</b> After ensuring that the data was continuous, I looked at the data values themselves. I plotted the acceleration values as a function of time. The plots below show the values recorded from each of the accelerometer's three axes. As can be seen, the signals clearly reflect the maneuvers defined in my procedure. Additionally, from these plots, we learn that negative acceleration values result when the accelerometer's axis is pointing down (relative to gravity) and positive values result when the axis is pointing up. This polarity information is important if you wish to use the accelerometer data to estimate the orientation of the OpenBCI board.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
</div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh0hKHYE_wJnDIUsV5qCX-mB7wNoY5RC8qqXLJ1d7_YDhIP9Q-k3GvWvDZruQKPzpo9vEKW9X1WxVu4CbBgGm2UaUnzMGTrnfeH734VZ2qRQJHQ8hpIrmJwaj6iWatBEYh52lVuix3Mf9w/s1600/AccelAxes.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="492" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh0hKHYE_wJnDIUsV5qCX-mB7wNoY5RC8qqXLJ1d7_YDhIP9Q-k3GvWvDZruQKPzpo9vEKW9X1WxVu4CbBgGm2UaUnzMGTrnfeH734VZ2qRQJHQ8hpIrmJwaj6iWatBEYh52lVuix3Mf9w/s1600/AccelAxes.png" width="480" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Acceleration Values for the Accelerometer's Three Axes.<br />
The three channels correspond to the X, Y, and Z axes.</td></tr>
</tbody></table>
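That polarity convention (negative when an axis points down, positive when it points up) is exactly what you need for tilt sensing. As a hedged illustration, here is the standard static tilt-sensing math in Python; this is not code from the OpenBCI GUI, and sign conventions for roll and pitch vary by application:

```python
import numpy as np

def roll_pitch_deg(ax, ay, az):
    """Estimate roll and pitch (degrees) from one static accelerometer
    reading in G.  Valid only when gravity is the dominant acceleration,
    i.e. the board is sitting still, not being shaken."""
    roll = np.degrees(np.arctan2(ax, az))                 # lean about the Y axis
    pitch = np.degrees(np.arctan2(ay, np.hypot(ax, az)))  # lean about the X axis
    return float(roll), float(pitch)

print(roll_pitch_deg(0.0, 0.0, 1.0))   # flat and level -> (0.0, 0.0)
print(roll_pitch_deg(-1.0, 0.0, 0.0))  # rolled 90 deg right (x points down) -> (-90.0, 0.0)
```

The two example readings correspond to maneuvers 1 and 2 of the procedure above.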
<br />
<div class="separator" style="clear: both; text-align: center;">
</div>
<b><u>Results, Scale Factor</u>:</b> With the behavior of the 3 axes shown to be reasonable, I then wanted to confirm that the magnitude of the values was correct -- in other words, that the scale factors used for interpreting the raw values were correct. The quickest way for me to confirm the scale factor was to compute the magnitude of the 3-axis acceleration vector. When the device is at rest, the magnitude of the measured acceleration should equal gravity, which is 1.0 G. As you can see below, the magnitude of our acceleration was generally close to 1.0 G (though often a little high), except when it was moving during its transitions between positions. This is good.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgu0zRuawnxfn0NQ0cRsTzY6NFx0fFOPDiY3YAfp6YaNtUccj8e-WaQ5HqBhzLjmo2RS6uetEQohQH9Q0UE4PBGDxrHWC5GbHyK3_TIbUmHr5N4PEwWzc5-OItRT9Zqivp3gihsPguQ99k/s1600/Magnitude.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="169" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgu0zRuawnxfn0NQ0cRsTzY6NFx0fFOPDiY3YAfp6YaNtUccj8e-WaQ5HqBhzLjmo2RS6uetEQohQH9Q0UE4PBGDxrHWC5GbHyK3_TIbUmHr5N4PEwWzc5-OItRT9Zqivp3gihsPguQ99k/s1600/Magnitude.png" width="480" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">The magnitude of the 3-axis acceleration vector should equal 1.0 G when at rest.<br />
Ours equals about 1.044 G, which is within the known offset error bounds of the device</td></tr>
</tbody></table>
<br />
When I look very closely at the values, it appears that the typical reading is actually 1.044 G instead of 1.000 G. There is a 44 mG difference. Is this unexpected? Well, yes, it was unexpected at first. And then I read the <a href="http://www.st.com/web/en/resource/technical/document/datasheet/CD00274221.pdf">datasheet</a>. Always look at the datasheet. In this case, it reports that the accelerometer should have a typical offset error of 40 mG per axis. For a 3-axis device, this could result in sqrt(40<sup>2</sup> + 40<sup>2</sup> + 40<sup>2</sup>) = 69 mG of error on my magnitude value. As a result, my 44 mG value appears to be in line with the device's advertised performance. That's satisfying.<br />
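Both the magnitude check and the error-budget arithmetic are easy to reproduce. A short Python sketch (the resting-state samples below are invented for illustration; the real values are in the notebook linked above):

```python
import numpy as np

# Invented resting-state samples (x, y, z) in G, for illustration only
accel = np.array([[0.02, -0.01, 1.04],
                  [0.03,  0.00, 1.03]])

# Magnitude of the 3-axis acceleration vector; should be ~1 G at rest
magnitude = np.sqrt(np.sum(accel**2, axis=1))
print(np.round(magnitude, 3))

# Datasheet: typical offset error of 40 mG per axis.  Combined
# (root-sum-square) effect on the magnitude:
offset_bound_mG = np.sqrt(3 * 40.0**2)
print(round(float(offset_bound_mG), 1))  # -> 69.3
```

So a 44 mG bias on the magnitude sits comfortably inside the ~69 mG root-sum-square bound.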
<br />
<b><u>Conclusion</u>:</b> With this test, I confirmed that my accelerometer is sending well-behaved data, with all three axes responding appropriately to known motions, with all axes having the correct scale factor. These are good results and I'm pleased. Now it's time to figure out something <b>fun</b> to do with the accelerometer! <br />
<br /></div>
Chiphttp://www.blogger.com/profile/10352943033779293161noreply@blogger.com5tag:blogger.com,1999:blog-7276377053120174333.post-90326543338017373432014-11-16T10:24:00.002-05:002017-04-14T16:42:09.929-04:00My Kickstarter OpenBCI Arrived!It's arrived! It's arrived! My OpenBCI Kickstarter award has arrived! And now, the guilty pleasure of unpacking a new piece of tech...<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
</div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjp8m2uWHgF-Fvv6TeEGQTnHErm0S0ui_455qVnT_V27h6EW4RIVvEY3IFrW6QQnoQt1o-acVF3jnWMTugVtzukAMqyE_UbwOAngtrgtr8sAMmvXGGKi9NorMcTeuPW7-bGA8HM1dToy1w/s1600/output_LwjmVa.gif" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="265" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjp8m2uWHgF-Fvv6TeEGQTnHErm0S0ui_455qVnT_V27h6EW4RIVvEY3IFrW6QQnoQt1o-acVF3jnWMTugVtzukAMqyE_UbwOAngtrgtr8sAMmvXGGKi9NorMcTeuPW7-bGA8HM1dToy1w/s1600/output_LwjmVa.gif" width="400" /></a></div>
<br />
Sure, Joel and Conor did send <a href="http://eeghacker.blogspot.com/2014/08/first-data-with-openbci-v3.html">an early unit</a> for me <a href="http://eeghacker.blogspot.com/2014/10/first-alpha-with-openbci-v3.html">to test for them</a>, but yesterday I received my actual purchased unit. For <a href="https://www.kickstarter.com/projects/openbci/openbci-an-open-source-brain-computer-interface-fo">their Kickstarter</a> back in January, I chose the "OpenBCI Board -- Early Bird Special". Based on their description, I thought that I'd get just the OpenBCI board. It turns out that I got quite a bit more!<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhblWq3DHVlMLUjPcwexjV8qvCBq0Qb3aOXMBwEj8mEX5dUZ1ZjaWQK_QQaoo76sg0eGS6557tMAqfu279skh8x2331aO6vCD7NPf52zIpjx0NPhWScmkq2OZw0_N5uJ53NZwKA47cxxi8/s1600/IMG_3479.JPG" imageanchor="1"><img border="0" height="266" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhblWq3DHVlMLUjPcwexjV8qvCBq0Qb3aOXMBwEj8mEX5dUZ1ZjaWQK_QQaoo76sg0eGS6557tMAqfu279skh8x2331aO6vCD7NPf52zIpjx0NPhWScmkq2OZw0_N5uJ53NZwKA47cxxi8/s1600/IMG_3479.JPG" width="400" /></a></div>
<br />
<div>
As you can see in center of the picture above, I got the OpenBCI board (8-bit version) as well as the USB Bluetooth dongle. That was expected. Looping around the outside are all the extra pieces that I didn't expect. Starting from the left side of the picture, I got: a 4xAA battery holder, an OpenBCI sticker and sew-on patch, an OpenBCI T-Shirt, a set of electrode adapters, and two little bags of solderable female headers (for expanding the functionality of the OpenBCI board and of the dongle). That's some good stuff!</div>
<div>
<br /></div>
<div>
Thanks, OpenBCI!</div>
<div>
<div>
<br /></div>
</div>
Chiphttp://www.blogger.com/profile/10352943033779293161noreply@blogger.com5tag:blogger.com,1999:blog-7276377053120174333.post-27635918940974092722014-11-02T21:59:00.000-05:002017-04-14T16:42:37.422-04:00Two Brains - One RobotAfter my success with <a href="http://eeghacker.blogspot.com/2014/10/sharing-brain-controlled-hex-bug.html">sharing the brain-controlled hex bug</a> with Conor and Joel, we brainstormed on how we could make this hack even more fun. We decided that the main problem with this hack is that only one person gets to participate -- the person driving the robot. The solution? Let's hook up multiple people <i>at the same time </i>to control the one robot. It'll be like that 3-legged race, where you tie your leg to the leg of another person, and then you stumble together in slapstick hilarity until you both get to the finish line. We are going to do the same thing, but with brain-controlled robots. Here's how far we've gotten so far...<br />
<br />
<div style="-webkit-text-stroke-width: 0px; color: black; font-family: 'Times New Roman'; font-size: medium; font-style: normal; font-variant: normal; font-weight: normal; letter-spacing: normal; line-height: normal; orphans: auto; text-align: start; text-indent: 0px; text-transform: none; white-space: normal; widows: auto; word-spacing: 0px;">
<div style="text-align: center;">
<div style="margin: 0px;">
<iframe allowfullscreen="" frameborder="0" height="360" src="https://www.youtube.com/embed/rMmXxku1fF4" width="480"></iframe></div>
</div>
</div>
<br />
<u>The Plan:</u> Our goal is to have multiple people control one robot via their brain waves. To do this, we aimed to connect multiple people to a single OpenBCI board. I have never connected multiple people to one EEG system before, so this was pretty exciting for me. As shown in the figure below, the idea is that each player is responsible for just one of the robot's actions -- one player is responsible for "Turn Left", another for "Turn Right", etc. Since the robot has four actions (Left, Right, Forward, Fire), we can have up to four players.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj46LrGf0bhFUsk7N-xyypDhBQuym-xHBXWLQunLkrsyWft4W6HwPtzAdzGHhbAeL6ptWMF6DjIMDKKdif0C2aFV1Ygdwxn9FAD1Kgul0PMczgUB_h9Q34AG0AjOYHM7qc4MyXr36khok0/s1600/SetupSchematic.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="334" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj46LrGf0bhFUsk7N-xyypDhBQuym-xHBXWLQunLkrsyWft4W6HwPtzAdzGHhbAeL6ptWMF6DjIMDKKdif0C2aFV1Ygdwxn9FAD1Kgul0PMczgUB_h9Q34AG0AjOYHM7qc4MyXr36khok0/s1600/SetupSchematic.png" width="520" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">The Hexbug robot has four commands (Left, Right, Forward, Fire), so for multi-player fun,<br />
connect four people to one OpenBCI board and work cooperatively!</td></tr>
</tbody></table>
<u><br /></u>
<u>Commanding the Robot:</u> In setting up this hack, I wanted to make it as easy as possible for the players to command the robot with their brain waves. The easiest brain waves to generate and the easiest brain waves to detect are Alpha rhythms (ie, 10 Hz oscillations), specifically the Alpha rhythm that naturally occurs when you close your eyes. So, with the setup above, we have the computer looking for Alpha waves in each person's EEG signal. If the computer sees Alpha waves from Player 1, the computer issues a "Turn Left" command to the robot. If the computer sees Alpha waves from Player 2, it issues a "Forward" command. And so on...<br />
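Once the per-player Alpha detections exist, the dispatch logic is tiny. A hedged Python sketch of the idea (the real version lives in the Processing GUI, and the channel-to-command table here is illustrative):

```python
# Player/channel index -> robot action (illustrative assignment)
COMMANDS = {0: "LEFT", 1: "FORWARD", 2: "RIGHT", 3: "FIRE"}

def robot_commands(alpha_detected):
    """alpha_detected: one boolean per player, True when that player's
    EEG channel currently shows eyes-closed Alpha.  Returns the list of
    commands to relay (via the Arduino) this update cycle."""
    return [COMMANDS[i] for i, on in enumerate(alpha_detected) if on]

# Players 1 and 4 have their eyes closed:
print(robot_commands([True, False, False, True]))  # -> ['LEFT', 'FIRE']
```

Each player owns exactly one entry in the table, which is what makes the cooperative stumbling-toward-the-finish-line part so fun.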
<u><br /></u>
<u>EEG Setup:</u> To detect these eyes-closed Alpha waves, we put one electrode on the back of a player's head over the visual cortex (position <a href="http://www.bci2000.org/wiki/index.php/User_Tutorial:EEG_Measurement_Setup">"O1" in the 10-20 system</a>). We put another electrode on one ear lobe to act as the EEG reference. Finally, we put a third electrode on the other ear lobe to act as the EEG Bias.<br />
<br />
<u>Individual Reference:</u> To allow each person to use their own reference electrode, we configured the software on the OpenBCI board to put the ADS1299 EEG chip into per-channel differential mode. Unlike our normal mode of operation, which uses a common reference electrode via SRB1 or SRB2, this differential mode allows each channel (ie, each player) to have its own reference. This is what we want! We simply plug the O1 electrode into the channel's "P" input and the ear lobe reference into the channel's "N" input.<br />
<br />
<u>Common Bias:</u> The only tricky part is that we want all four players to be connected to the OpenBCI Bias. This is tricky because the OpenBCI board does not have four Bias pins. Well, as you can see below, all it takes is a soldering iron and you can connect a piece of <a href="https://www.sparkfun.com/products/116">pin header</a> to turn the single Bias pin into four Bias pins. Now we're hacking!<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh9i-aPNALZojcGIoOGTOCqcZshor5SRlkyWXfXMxzGcFLLT8dmiwF7lpBQibAZPqDD2g6lmHKn1jejdS-0b1YuHJjuz7JcrV8oSUFZrTH1BR2epI8R9o6FptfjPpIEccmATeI0SebP3Ao/s1600/Modified+V3.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="262" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh9i-aPNALZojcGIoOGTOCqcZshor5SRlkyWXfXMxzGcFLLT8dmiwF7lpBQibAZPqDD2g6lmHKn1jejdS-0b1YuHJjuz7JcrV8oSUFZrTH1BR2epI8R9o6FptfjPpIEccmATeI0SebP3Ao/s1600/Modified+V3.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">OpenBCI V3 Board With Extra Pins Soldered to the Bias Output</td></tr>
</tbody></table>
<br />
<u>Connecting the Pieces:</u> The picture below shows all the connections to the OpenBCI board assuming three players. On the lower left, we've got three pairs of wires (one pair for each player) plugged into the "P" and "N" inputs of three different channels. Then, in the upper-left, you see three wires plugged into three of the four new Bias pins. Finally, in the upper-right, you see five wires that go off to command the <a href="http://eeghacker.blogspot.com/2014/05/arduino-control-of-hex-bug.html">hacked Hexbug remote control</a>.<br />
<div>
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg_ktfGem1jdmcV0G8djDnN1h7AK902o9NobpDyeCn8cMvvKo2AUNHwGObQlYGLhMnaZSlYgfICThzJWpsxZmnoAUKr11CBq-btFv3-oFAsLoveBLrXNPut7Y6ZFQCxZb6woWceFXB-WfM/s1600/Wired+Up+V3.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="262" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg_ktfGem1jdmcV0G8djDnN1h7AK902o9NobpDyeCn8cMvvKo2AUNHwGObQlYGLhMnaZSlYgfICThzJWpsxZmnoAUKr11CBq-btFv3-oFAsLoveBLrXNPut7Y6ZFQCxZb6woWceFXB-WfM/s1600/Wired+Up+V3.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">OpenBCI Board with Connections Ready for Three Players</td></tr>
</tbody></table>
<br />
<div>
<u>Making It Happen:</u> Since it's a rare thing that Joel, Conor, and I are all together, it was really fun to work together to make this hack happen. Joel worked the soldering iron to attach the pins and he modified <a href="https://github.com/biomurph/OpenBCI_8bit_HexBug">the Arduino code</a> running on the OpenBCI board to enable the per-channel differential mode. Conor further modified the Arduino code as well as the Processing GUI to enable slower turning of the robot (originally, it was turning WAY too fast). Then, I modified the <a href="https://github.com/chipaudette/OpenBCI/tree/variant_hexBugControl_teamAlpha">Processing GUI</a> to enable Alpha detection for the four individual players. We did all this in parallel. I'd never really done group-hacking before. It was definitely fun.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjJH1-Uezhl5Qwbs-fSbaGksp-PNisTO92YoHzWwe1j84Og3sWXaZrPL8UvUb-d-jL91GvzMIJBNM4JhsZKjiaZY_4CMWfsm6sPTr_paqohmnlSalO9k3E4IaY_PUl57-6u5rm02Mdzf2g/s1600/IMG_3336.JPG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="292" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjJH1-Uezhl5Qwbs-fSbaGksp-PNisTO92YoHzWwe1j84Og3sWXaZrPL8UvUb-d-jL91GvzMIJBNM4JhsZKjiaZY_4CMWfsm6sPTr_paqohmnlSalO9k3E4IaY_PUl57-6u5rm02Mdzf2g/s1600/IMG_3336.JPG" width="440" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Conor and Joel working through the details of connecting the Hexbug remote control.</td></tr>
</tbody></table>
<br />
<u>Testing It:</u> Once we pulled together all of the pieces, Conor and I began to test the complete setup (see pic below). After a little tweaking, we got the whole system working, as shown in the video at the top of this post. It was a group effort that worked out. Pretty sweet.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjNGNXJwEFVjQmsnRT5lvGPV6zTWeMh_Hc8kwaxrzptF0LOB_tyA5QVYJBVAHJHYngjHtISkkhRNT2wWafTGxJS47yYJGoJHiNnS3nQws3u3cyupBu8QCGUR63SKTNNFDUk8y4N730mfPI/s1600/IMG_3342.JPG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="313" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjNGNXJwEFVjQmsnRT5lvGPV6zTWeMh_Hc8kwaxrzptF0LOB_tyA5QVYJBVAHJHYngjHtISkkhRNT2wWafTGxJS47yYJGoJHiNnS3nQws3u3cyupBu8QCGUR63SKTNNFDUk8y4N730mfPI/s1600/IMG_3342.JPG" width="440" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Conor and Chip -- Two Brains, One Robot.</td></tr>
</tbody></table>
<br />
<div>
<u>Breaking Robots:</u> So our original vision was to get this hack working so that we could have *two* 4-person teams, with each team controlling their own robot. Luckily, we had multiple robots and multiple OpenBCI boards, so we thought that we could make it happen. Unfortunately, as soon as Conor and I made our video, the robots started to break. They don't like being stuffed in suitcases, I guess. So, we were left with just one working robot. Bummer.<br />
<br />
<u>Recruiting a Team:</u> At the <a href="http://www.labhack.org/">AF LabHack</a>, there were lots of folks doing their own hacking. By the time we got our system working (with the one healthy robot), the other teams were scrambling to get their last results prior to presenting to the group...so we had a tough time recruiting volunteers to be part of a robot-control team. In the short time we had left, we did get three enthusiastic folks to step up. We got them all equipped with EEG electrodes, tuned the system a bit, and let them play!<br />
<br /></div>
<div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhf_zo0w1AeEt_KvKVSlvniePYrUnB5-K7Ze53Aw1R4ulW8XdIbx9-HxowYqaDAZowgu1ONg3yULuLNx7TERFaBJkxa3HTOHnn5BmN4AmmxD9ADRIUNmw_07VB8mKLCCy9I6w14jbGmj6s/s1600/IMG_3345+-+anotated.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="313" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhf_zo0w1AeEt_KvKVSlvniePYrUnB5-K7Ze53Aw1R4ulW8XdIbx9-HxowYqaDAZowgu1ONg3yULuLNx7TERFaBJkxa3HTOHnn5BmN4AmmxD9ADRIUNmw_07VB8mKLCCy9I6w14jbGmj6s/s1600/IMG_3345+-+anotated.jpg" width="440" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Our Fine Volunteers. Three Brains, One Robot. </td></tr>
</tbody></table>
<br />
<u>No Video:</u> At this point, we should be presenting a triumphant video. Unfortunately, we don't have one. If we did, what you'd see is that two of the three players could easily and repeatably use their eyes-closed Alpha waves to command the robot. It was cool to see.<br />
<br />
<u>No Alpha:</u> The third player, though, did not have much luck controlling his part of the robot. At first, I assumed that it was a problem with our system, but after a little debugging, I came to the conclusion that his brain simply wasn't generating eyes-closed Alpha. He could have been trying too hard (you must be relaxed, without concentrating or being overly focused), or he could have been part of the 11% of the normal, healthy population that simply does not generate Alpha upon closing their eyes [Ref 1]. For these folks, I've got to come up with an alternate robot-control methodology...perhaps by the concentration signature of <a href="http://eeghacker.blogspot.com/2014/04/detecting-concentration.html">counting-backwards-by-three</a>.<br />
<br />
<u>Next Steps:</u> The next steps are clear -- I have to get a bunch of people together, hook them up, and enjoy the shenanigans of many brains trying to control a single robot. Should be fun!<br />
<br />
Ref [1]: Gibbs FA, Gibbs EL, Lennox WG. Electroencephalographic classification of epileptic patients and control subjects. Arch Neurol Psychiatry. 1943;50:111–28, as referenced by <a href="http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3927247/">http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3927247/</a><br />
<br />
<u>Follow-Up:</u> We used a similar approach to get a 5-person team to brain-control a swimming shark balloon. It's cool. Check it out <a href="http://eeghacker.blogspot.com/2015/03/brain-controlled-shark-attack.html">here</a>.</div>
</div>
Chiphttp://www.blogger.com/profile/10352943033779293161noreply@blogger.com12tag:blogger.com,1999:blog-7276377053120174333.post-88613873138200420502014-10-27T16:49:00.000-04:002017-04-14T16:42:53.620-04:00Sharing the Brain-Controlled Hex BugI got to meet up with Joel and Conor (of <a href="http://openbci.com/">OpenBCI</a>) for some hacking over the weekend. We were at a <a href="http://www.labhack.org/">hackathon</a> sponsored by the Air Force Research Laboratory. During some of the down-time between hackathon events, we got to do some hacking of our own. Since Joel and Conor had never seen my <a href="http://eeghacker.blogspot.com/2014/06/controlling-hex-bug-with-my-brain-waves.html">brain-controlled hex bug</a> up close, I brought out my stuff. And nerdy-fun shenanigans ensued.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjRtzdH5WquHdRaBYrtTHCbUHDd0M0pPgdOtCEhxOdy_atJDAoe2jtMT05vjNLeyTbZXjGJDJRsSMwCeLTc1WeL8pkM77P3EKiLqzrxctxxjIcF774vtNxF_zvASQMZedjj2V-U4jgr7Rw/s1600/IMG_3319-001.JPG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="299" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjRtzdH5WquHdRaBYrtTHCbUHDd0M0pPgdOtCEhxOdy_atJDAoe2jtMT05vjNLeyTbZXjGJDJRsSMwCeLTc1WeL8pkM77P3EKiLqzrxctxxjIcF774vtNxF_zvASQMZedjj2V-U4jgr7Rw/s1600/IMG_3319-001.JPG" width="420" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Brain-Controlled Robots Rule!</td></tr>
</tbody></table>
<br />
The EEG and computer setup was exactly the same as when <a href="http://eeghacker.blogspot.com/2014/06/controlling-hex-bug-with-my-brain-waves.html">I did it earlier</a> -- one electrode on the back of the head near O1, the reference electrode on one ear lobe, and the bias electrode on the other ear lobe. We used my same blinking movies to induce brain waves at 5 Hz and 7.5 Hz, and we used the normal eyes-closed response to induce Alpha waves (which are near 10 Hz). After playing around with the detection thresholds, we were able to get the system to work for both Joel and Conor.<br />
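Under the hood, the detection comes down to comparing narrow-band signal power at each target frequency (5 Hz, 7.5 Hz, and the ~10 Hz Alpha band). A self-contained Python sketch with a synthetic signal, to show the shape of the computation (this is not the GUI's actual code, and the normalization is relative rather than calibrated uVrms):

```python
import numpy as np

fs = 250.0                             # OpenBCI EEG sample rate, Hz
t = np.arange(0, 4.0, 1.0 / fs)
rng = np.random.default_rng(0)
# Synthetic "eyes closed" EEG: a 10 Hz alpha rhythm plus broadband noise
eeg = 10.0 * np.sin(2 * np.pi * 10.0 * t) + rng.normal(0.0, 2.0, t.size)

def band_power(sig, fs_Hz, f_lo, f_hi):
    """Mean squared FFT magnitude within [f_lo, f_hi] Hz (relative units)."""
    spec = np.abs(np.fft.rfft(sig)) ** 2
    freqs = np.fft.rfftfreq(sig.size, d=1.0 / fs_Hz)
    sel = (freqs >= f_lo) & (freqs <= f_hi)
    return float(spec[sel].mean())

alpha = band_power(eeg, fs, 8.0, 12.0)   # eyes-closed Alpha band
ssvep5 = band_power(eeg, fs, 4.5, 5.5)   # 5 Hz blinking-movie band
print(alpha > 10.0 * ssvep5)  # -> True: alpha dominates in this signal
```

In practice each band power is compared against a tuned threshold, which is exactly the "playing around with the detection thresholds" step mentioned above.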
<br />
As a spectator, I really enjoyed the tension and drama provided by Joel's showmanship:<br />
<br />
<div style="text-align: center;">
<iframe allowfullscreen="" frameborder="0" height="360" src="//www.youtube.com/embed/9XAWRo8_2E4" width="480"></iframe>
</div>
<br />
And I also enjoyed the authority of Conor's brain-control skills:<br />
<br />
<div style="text-align: center;">
<iframe allowfullscreen="" frameborder="0" height="360" src="//www.youtube.com/embed/F4jbZR9pZPk" width="480"></iframe>
</div>
<br />
...until his skills failed...<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEimZ3u8pq_9aBBu1YDED3p7-BZ2d0c9-SR1USn0i3NSmxM9xbxM7-jcA2XwJ-a7HxJaibPBNXu2nnUcxgaXowVhzclXymsyyr4I2edEtl5bBr1oPsjs28hSUY68tZyKk5_YYRqQ0TXEe4o/s1600/output_lCMZah.gif" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="400" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEimZ3u8pq_9aBBu1YDED3p7-BZ2d0c9-SR1USn0i3NSmxM9xbxM7-jcA2XwJ-a7HxJaibPBNXu2nnUcxgaXowVhzclXymsyyr4I2edEtl5bBr1oPsjs28hSUY68tZyKk5_YYRqQ0TXEe4o/s1600/output_lCMZah.gif" width="285" /></a></div>
<br />
It was really fun to share this hack with Joel and Conor. Personally, I find that making something move out here in the real world (like this flying shark) is way more fun than simply making traces move on a computer screen. Sharing hardware hacks is where it's at. Hardware hacking, FTW!<br />
<br />Chiphttp://www.blogger.com/profile/10352943033779293161noreply@blogger.com1tag:blogger.com,1999:blog-7276377053120174333.post-87125254990958780722014-10-25T00:21:00.001-04:002017-04-14T16:43:08.751-04:00Alpha Detection - Comparison Across EEG RecordingsIn the <a href="http://eeghacker.blogspot.com/2014/10/detected-alpha-waves-roc-curves.html">previous analysis</a>, I used ROC curves to determine that a good trade-off between good detection sensitivity and low false alarms occurs when my alpha detection threshold is at 3.75 uVrms and my guard rejection threshold is at 2.55 uVrms. These threshold values allowed me to successfully detect 56% of the eyes-closed data blocks while having zero false detections. As a follow-up, I promised that I'd look at additional recordings of my Alpha waves to see if these threshold values were also appropriate for other EEG recordings. Well, below are the results on 6 recordings. As you can see, the results are mixed.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhlkr27BTZhV8zSCOjIqK3omNnNLRbR_cg00fX810CEwG1xEpNtc-pyDrDjz01vHB2srhkrfC4derI5Scdyfs5HKJoO64kJnaiB9IlBpt4QbaGyEjFfjn52L3Hqz48h51kLw5WQJjRSU7A/s1600/all_files_thresh2_2.5.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="224" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhlkr27BTZhV8zSCOjIqK3omNnNLRbR_cg00fX810CEwG1xEpNtc-pyDrDjz01vHB2srhkrfC4derI5Scdyfs5HKJoO64kJnaiB9IlBpt4QbaGyEjFfjn52L3Hqz48h51kLw5WQJjRSU7A/s1600/all_files_thresh2_2.5.png" width="560" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Sensitivity and False Alarm Rate for an Alpha Detection Threshold of 3.75 uVrms and a Guard Rejection Threshold of 2.5 uVrms. As you can see, there is a large variation in detection sensitivity and false alarm rate across the six different EEG recordings.</td></tr>
</tbody></table>
<u>Multiple Files:</u> The plot above includes data from six EEG recordings. The data files are specified in my Python analysis code <a href="https://github.com/chipaudette/EEGHacker/tree/master/Data/2014-10-03%20V3%20Alpha">here</a>. Files 1-3 were from Oct 4, File 4 was from Oct 5 and was the subject of all my recent posts, File 5 was from May 31 (from my robot control) and File 6 was from May 8. As you can see in the plots above, File 4 (the one discussed in my recent posts) was a very high quality recording because it was easy to detect the Alpha waves with high sensitivity (left plot) and with few false alarms (right plot). By comparison, Files 3 and 5 were much more challenging because, for the same detection thresholds, we get far lower sensitivity (left plot) and far higher false alarms (right plot). How do we decide the best thresholds to use for all files?<br />
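To make the per-file comparison concrete, the scoring step can be sketched in a few lines of Python. This is just an illustrative sketch -- the function name, variable names, and toy numbers below are made up for this example; the real analysis code is linked above:

```python
import numpy as np

def evaluate_file(alpha_uVrms, guard_uVrms, eyes_closed,
                  alpha_thresh=3.75, guard_thresh=2.5):
    """Score one EEG recording with the Alpha+Guard detection rule.

    alpha_uVrms, guard_uVrms : per-block band amplitudes (uVrms)
    eyes_closed : boolean mask, True for blocks recorded with eyes closed
    Returns (sensitivity, n_false_alarms).
    """
    detected = (alpha_uVrms > alpha_thresh) & (guard_uVrms < guard_thresh)
    sensitivity = float(np.mean(detected[eyes_closed]))   # hit rate on true Alpha
    n_false_alarms = int(np.sum(detected & ~eyes_closed))  # detections w/ eyes open
    return sensitivity, n_false_alarms

# toy example: 10 data blocks, the first 5 with eyes closed
alpha = np.array([4.2, 5.0, 3.9, 4.1, 3.0, 1.0, 1.2, 4.5, 0.9, 1.1])
guard = np.array([1.5, 2.0, 1.8, 2.6, 1.9, 1.4, 1.6, 1.3, 1.2, 1.5])
closed = np.arange(10) < 5
sens, n_fa = evaluate_file(alpha, guard, closed)
```

Running this once per recording, with the same fixed thresholds, produces exactly the kind of per-file sensitivity and false alarm comparison plotted above.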
<br />
<u>Lumped ROC Curve:</u> If we lump together all 6 sets of data, we can compute the receiver operating characteristic (ROC) curve, as we discussed in my <a href="http://eeghacker.blogspot.com/2014/10/detected-alpha-waves-roc-curves.html">previous post</a>. The plot below shows the ROC curve for all of the EEG data lumped together. It shows that, if we were to target a false alarm rate of 1 false alarm per minute, we could achieve an Alpha detection sensitivity where we detect ~40% of all the data blocks where my eyes are closed. It's not great, but it's not bad, either.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjpe85EqEhiu9EBxb9zO3i9OxWdK-tAB2rj8rLVJYLU2iOaNqvvpnUdOJoB-FJXB5iQokUw9hAnTf8rNKh14hcFpW-qL8TB0q-YMQ13cpFlbgSBDzzss1uKRgcqTALaNq7Zhk33owSto7E/s1600/Lumped_ROC_only.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="285" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjpe85EqEhiu9EBxb9zO3i9OxWdK-tAB2rj8rLVJYLU2iOaNqvvpnUdOJoB-FJXB5iQokUw9hAnTf8rNKh14hcFpW-qL8TB0q-YMQ13cpFlbgSBDzzss1uKRgcqTALaNq7Zhk33owSto7E/s1600/Lumped_ROC_only.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Lumping Together All 6 EEG Recordings, We Can Evaluate the Relationship Between Sensitivity and False Alarms When using Our Alpha Band vs Guard Band Detection Approach.</td></tr>
</tbody></table>
<br />
<u>Detection Thresholds to Use:</u> If we wish to achieve a target false alarm rate of 1.0 per minute, our ROC analysis also yields the results below, which show the threshold values to use for our Alpha detection criterion (3.8 uVrms) and for our guard rejection criterion (1.6 uVrms).<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgUrBX5ZAmpoVt4c1ZL6uWyDohxclNRpCm8oQdufEx8ZIlZsdOR8VD3F8p6Dyc6AUZQo2FrG0ElpIKhxt3ykMDcR6G2LVAv1dWAlrb1bJnanTVNCjo2SaiUqr-JqO_XxgtBveXKyOlXhHI/s1600/Lumped_ROC_thresholds.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="215" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgUrBX5ZAmpoVt4c1ZL6uWyDohxclNRpCm8oQdufEx8ZIlZsdOR8VD3F8p6Dyc6AUZQo2FrG0ElpIKhxt3ykMDcR6G2LVAv1dWAlrb1bJnanTVNCjo2SaiUqr-JqO_XxgtBveXKyOlXhHI/s1600/Lumped_ROC_thresholds.png" width="560" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Thresholds to Use for Achieving 1.0 False Alarms Per Minute for All Six EEG Files Lumped Together Using the Alpha + Guard Discrimination Approach.</td></tr>
</tbody></table>
<br />
<u>Better Detection Performance:</u> Using these two detection thresholds determined through the lumped data analysis above, we can evaluate the sensitivity and false alarm rate for each individual EEG recording. These results are shown below. As expected, File 3 and File 5 still have the lowest sensitivity. File 2, however, now has the highest false alarm rate.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhknUf016ZZkJ62IutolhcBDFiGrI5BYpVo3lvHlelcbT7mcdvrf5rPnoXVg-QNWlaNAT2CopfRlpXlD9EfAfry87jw2XcpYhQj9USSWzFHCd71DilfgzMTRQ1o-2WW6uKEkfyPi0EsgBU/s1600/Results_at3.8_1.6.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="215" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhknUf016ZZkJ62IutolhcBDFiGrI5BYpVo3lvHlelcbT7mcdvrf5rPnoXVg-QNWlaNAT2CopfRlpXlD9EfAfry87jw2XcpYhQj9USSWzFHCd71DilfgzMTRQ1o-2WW6uKEkfyPi0EsgBU/s1600/Results_at3.8_1.6.png" width="560" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Detection Sensitivity (Left) and False Alarm Rate (Right) When Using the Detection and Rejection Thresholds That Should Yield an Overall False Alarm Rate of 1.0 Incorrect Detections Per Minute.</td></tr>
</tbody></table>
<br />
<u>Is it Good Enough:</u> While these summary plots look reasonable, is the performance good enough? Well, that can only be answered by looking at the detection plots for the individual recordings. Plots of the six recordings are presented below, if you really want to see the details...click on any one of them to see a bigger version. When I look at these figures, I'm feeling pretty good about these detection thresholds.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
Files 1,2: <a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi_IJH2Zezrj7Hg9DGPW4oA3NOsxx3Xr6IPTctbDKhC9wyaTn0dELSO7zcsacEBRq2SJAcpUliYyk9VcrWphdwCoOpUsL2CNRVNUEbrqVLeBWN8sEwyUALNE1ShgmN6pZIrNYa0-5cP8Sw/s1600/File1_detection.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="200" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi_IJH2Zezrj7Hg9DGPW4oA3NOsxx3Xr6IPTctbDKhC9wyaTn0dELSO7zcsacEBRq2SJAcpUliYyk9VcrWphdwCoOpUsL2CNRVNUEbrqVLeBWN8sEwyUALNE1ShgmN6pZIrNYa0-5cP8Sw/s1600/File1_detection.png" width="162" /></a><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgWeN9jK95KE44IV1S7ccFZkznTnhEwiCx8_l5vTFdCMFkUc09lOI5V5kRVGz9KrplRVGMs3KcnJqDNYOtyjcgQma-9sXGXj1ZnjenoREO_1hLz1lJE9eW8q5lKSZ5686sk48G3qtSv9sE/s1600/file2_detection.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="200" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgWeN9jK95KE44IV1S7ccFZkznTnhEwiCx8_l5vTFdCMFkUc09lOI5V5kRVGz9KrplRVGMs3KcnJqDNYOtyjcgQma-9sXGXj1ZnjenoREO_1hLz1lJE9eW8q5lKSZ5686sk48G3qtSv9sE/s1600/file2_detection.png" width="162" /></a></div>
<div class="separator" style="clear: both; text-align: center;">
Files 3,4: <a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjnKNMfgwBMtOjtTa_cYcWeWTOaRsxYhtrFY1KmoJ5YyURdWK7jZokMt23VVUCWcYHeVlqkOlrZ0zjlBJXgE2iuefqGQshCBO6Mo3ZFUM9Va3YXUzh78qgTwbwavFKxumjtzHUKV2ZyQ_k/s1600/file3_detection.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="200" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjnKNMfgwBMtOjtTa_cYcWeWTOaRsxYhtrFY1KmoJ5YyURdWK7jZokMt23VVUCWcYHeVlqkOlrZ0zjlBJXgE2iuefqGQshCBO6Mo3ZFUM9Va3YXUzh78qgTwbwavFKxumjtzHUKV2ZyQ_k/s1600/file3_detection.png" width="162" /></a><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgm1PwenZQaPdBAZNoHB_a5L8A7kZ2vudFW2vr11GEv8OSaAPzE16XClqPf9PLUbeh5oWh-fNJeO7ZvkNnxaVzUNYvia9tBN_5bVeZxSgypo_tqrPpDv16a4nYeeQLFMYzTKfWp5CV31AQ/s1600/File4_detection.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="200" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgm1PwenZQaPdBAZNoHB_a5L8A7kZ2vudFW2vr11GEv8OSaAPzE16XClqPf9PLUbeh5oWh-fNJeO7ZvkNnxaVzUNYvia9tBN_5bVeZxSgypo_tqrPpDv16a4nYeeQLFMYzTKfWp5CV31AQ/s1600/File4_detection.png" width="162" /></a></div>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
Files 5,6:<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiYp9C1nhZ7n7238XyUflWjvD7nV2UswDW3yzfT14hFWZoJWJStCwIJ3KkmLr-45Sj0DLKnWJj6DSeP00UQWJCJ5yrRx64bwzIDazpC2HmEufx2hWHWEvAzg-d8Z6IoouNrB_Fi_Mmxmr8/s1600/file5_detection.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="200" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiYp9C1nhZ7n7238XyUflWjvD7nV2UswDW3yzfT14hFWZoJWJStCwIJ3KkmLr-45Sj0DLKnWJj6DSeP00UQWJCJ5yrRx64bwzIDazpC2HmEufx2hWHWEvAzg-d8Z6IoouNrB_Fi_Mmxmr8/s1600/file5_detection.png" width="162" /></a><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjMlDFGzDb2V4nfCIPq13fiiA_oFeRG06_P5bv9gRTrJu4ocrwELVawa3ub6wyq5wdPz2f1DlECBtTY6qHpfbtXBI6dFT2pqANi7wGi_q7xIH4yV1ARtY19t7qdeFJInZEgG1q9l1_PJlE/s1600/file6_detection.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="200" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjMlDFGzDb2V4nfCIPq13fiiA_oFeRG06_P5bv9gRTrJu4ocrwELVawa3ub6wyq5wdPz2f1DlECBtTY6qHpfbtXBI6dFT2pqANi7wGi_q7xIH4yV1ARtY19t7qdeFJInZEgG1q9l1_PJlE/s1600/file6_detection.png" width="162" /></a></div>
<br />
<br />
<u>Next Steps:</u> Having established a good set of detection thresholds, we have two paths forward. One path would be to continue this analysis to find other types of detection algorithms that, via the ROC curve, might show better detection performance. Another approach would be to implement these detection threshold values in the real-time GUI to see if it gives good performance. I'm not sure which direction I'll take, but given that the <a href="http://www.labhack.org/">Air Force Hackathon</a> is starting shortly (which I and OpenBCI will be attending), I bet my next step will indeed involve actual hacking. Let's do it!Chiphttp://www.blogger.com/profile/10352943033779293161noreply@blogger.com0tag:blogger.com,1999:blog-7276377053120174333.post-10238059405913695672014-10-19T15:19:00.000-04:002017-04-14T16:43:25.859-04:00Detected Alpha Waves - ROC CurvesIn my <a href="http://eeghacker.blogspot.com/2014/10/detecting-alpha-waves-guard-bands.html">previous post</a>, I improved my Alpha detection processing by introducing "guard bands", which allowed me to reject most of the non-Alpha activity that looked like Alpha waves. In finding the best thresholds to use for detection and rejection, I had to find the right balance between keeping good sensitivity (ie, detecting most of the true Alpha activity) while minimizing false alarms (ie, avoiding detecting non-Alpha activity as if it were Alpha). Finding the right balance can be tough and can involve lots of guesswork. In this post, I'll show how that process can be formalized...by making a ROC curve!<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiPAT_vOf91tQsTsMiVXNs1jFpGqNU1WZR7ySPZKLw1MMFQju2py5WX9BNpZa56EFmn7s9ny26pXIic43Vw5LfBu_vu0NS7ZIOpGQDqyxrU16TqgwsZu2eoY1W2Sm0Np1BW4OPXLDl9D80/s1600/detect_alpha_and_guard.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="591" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiPAT_vOf91tQsTsMiVXNs1jFpGqNU1WZR7ySPZKLw1MMFQju2py5WX9BNpZa56EFmn7s9ny26pXIic43Vw5LfBu_vu0NS7ZIOpGQDqyxrU16TqgwsZu2eoY1W2Sm0Np1BW4OPXLDl9D80/s1600/detect_alpha_and_guard.png" width="480" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Example EEG data (top) and some example processing (middle and bottom). True Alpha activity is limited to the "Eyes Closed" period. I used the amplitude in the Alpha band (blue trace) to detect the Alpha activity. I used the amplitude in the Guard Band (green trace) to reject activity that is not specific to the Alpha band. For my given detection threshold (Alpha > 3.5 uVrms) and rejection threshold (Guard > 2.5 uV), the red circles show when my algorithm detects Alpha activity.</td></tr>
</tbody></table>
<br />
<u>Background:</u> The plot above shows the final result from the last post. It shows the raw EEG data in the spectrogram at the top, as recorded from the back of my head (O1 with earlobe reference). As you can see, when my eyes are closed, the horizontal stripe of color shows my Alpha rhythm. The middle plot shows the amplitude of the EEG signal in the Guard band (I picked 3-6.5 Hz and 13-18 Hz) and the bottom plot shows the amplitude of the EEG signal in the Alpha band (7.5-11.5 Hz). You can also see that, by eye, I picked a Guard rejection threshold of 2.5 uVrms and an Alpha detection threshold of 3.5 uVrms. The red circles show the "detections" when the EEG signal satisfies the Alpha detection threshold (ie, is > 3.5 uVrms) and avoids the Guard rejection threshold (ie, is < 2.5 uVrms). This algorithm works way better than the Alpha-only detection approach that I used <a href="http://eeghacker.blogspot.com/2014/10/detecting-alpha-waves-threshold.html">in my first attempt</a>. This Alpha+Guard algorithm isn't perfect, however. You see that I still have a pair of false detections around 84 seconds.<br />
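As a reference for anyone following along, the band-amplitude measurement and the detection rule can be sketched like this. It's a simplified single-window version (my real processing runs on a streaming spectrogram; the FFT scaling and the fake 10 Hz test signal below are my own illustrative choices, not code from my GUI):

```python
import numpy as np

FS = 250.0  # OpenBCI sample rate, Hz

def band_uVrms(eeg_uV, f_lo, f_hi, fs=FS):
    """RMS amplitude (uVrms) of one EEG window within [f_lo, f_hi] Hz."""
    n = len(eeg_uV)
    spec = np.fft.rfft(eeg_uV)            # one-sided spectrum
    freqs = np.fft.rfftfreq(n, 1.0 / fs)
    in_band = (freqs >= f_lo) & (freqs <= f_hi)
    # each one-sided bin (ignoring DC/Nyquist) contributes an RMS of sqrt(2)*|X|/n
    return np.sqrt(np.sum(2.0 * np.abs(spec[in_band]) ** 2)) / n

# fake 1-second "EEG" block: a pure 10 Hz sine with 10 uV amplitude
t = np.arange(int(FS)) / FS
eeg = 10.0 * np.sin(2.0 * np.pi * 10.0 * t)

alpha_amp = band_uVrms(eeg, 7.5, 11.5)               # Alpha band
guard_amp = np.sqrt(band_uVrms(eeg, 3.0, 6.5) ** 2   # combined Guard bands
                    + band_uVrms(eeg, 13.0, 18.0) ** 2)
is_alpha = (alpha_amp > 3.5) and (guard_amp < 2.5)   # the detection rule
```

For the 10 uV sine, the Alpha-band amplitude comes out near 10/sqrt(2) = 7.07 uVrms while the Guard bands stay near zero, so the rule fires -- which is the behavior the spectrogram plots above are showing.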
<br />
<u>Choosing the Thresholds, By Eye:</u> One of the key decisions in this algorithm is what value to use for the Alpha detection threshold and what value to use for the Guard rejection threshold. Initially, I chose them by eye from the plots above and then used trial-and-error to tune them further.<br />
<br />
<u>Alpha-Guard Scatter Plot:</u> Another approach for choosing these thresholds is to use a dedicated plot such as the one below. Here, I take the exact same Alpha and Guard values as plotted above and I re-plot the data. But, instead of plotting the data as a smooth trace as a function of time, I plot the data as a scatter plot of the Alpha and Guard value for each moment of time. For each moment in time when my eyes are closed (which means that true Alpha waves are likely present), I plotted the data point in blue. For each data point outside of this time period (so Alpha waves are likely not present), I plotted the values in red.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg-SOBwFOTYwoxbPeWX4A4LjERiYSbedGchC3reaYtOBQ0YOrIkROD2iYkxWtoAErZ7CKwwSA-uIqk2nx_DY0bNj7AzP09mZTORCRgaf957zgVVOlth2HvSmAMmOs_TsqE1c5S5-bo1a3A/s1600/alpha_v_guard_annotated.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="272" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg-SOBwFOTYwoxbPeWX4A4LjERiYSbedGchC3reaYtOBQ0YOrIkROD2iYkxWtoAErZ7CKwwSA-uIqk2nx_DY0bNj7AzP09mZTORCRgaf957zgVVOlth2HvSmAMmOs_TsqE1c5S5-bo1a3A/s1600/alpha_v_guard_annotated.png" width="420" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">For each moment in time, I measure the amplitude of the EEG signal in both the Alpha band and the Guard band. Each dot in this plot represents the Alpha and Guard values for one moment in time. The blue dots are from the time when my eyes were closed, so they should contain Alpha activity. You can see how the Alpha threshold (black line) and Guard threshold (red line) can be used to try to discriminate between the two types of activity.</td></tr>
</tbody></table>
<u><br /></u>
<u>Choosing the Thresholds, Graphically:</u> With the Alpha-Guard scatter plot shown above, it is much easier to visually move the Alpha threshold (the horizontal black line) up-and-down, or to move the Guard threshold (the vertical red line) left-and-right. Any points in the upper-left quadrant satisfy our detection rules (measured Alpha > Alpha Threshold and measured Guard < Guard Threshold). The thresholds are currently shown at the values from my previous post -- Alpha Threshold at 3.5 uVrms and Guard Threshold at 2.5 uVrms. We see that the vast majority of the detections are blue dots, which is good because that is the eyes-closed activity. We also see the two false detections as the two red circles. With this plot, we clearly see that if we raise our Alpha threshold a little bit, we could eliminate those two false detections...at the expense of losing more of the desired (blue dot) detections. Again, we see that it's a balance between high sensitivity and a low rate of false alarms.<br />
<br />
<u>Try All Alpha and Guard Threshold Values:</u> To get a better picture of the effect that these two threshold values have on the sensitivity and false alarm rate, we can have the computer brute-force reprocess the data trying a wide range of threshold values. By stepping through all the combinations of Alpha and Guard thresholds, the computer forms a dense table of the number of correct detections and the number of incorrect detections for our given EEG recording. We can then use a contour plot to visualize this dense table of values. These are shown below.<br />
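The brute-force sweep itself is short. Here's a sketch with illustrative names and a tiny toy data set (not my actual analysis script, which sweeps a much denser grid):

```python
import numpy as np

def sweep_thresholds(alpha, guard, eyes_closed, alpha_threshs, guard_threshs):
    """Try every (Alpha, Guard) threshold pair on one set of band amplitudes.

    Returns two 2-D tables indexed [i_alpha, i_guard]:
    sensitivity (fraction of eyes-closed blocks detected) and
    false-alarm count (detections outside the eyes-closed period).
    """
    sens = np.zeros((len(alpha_threshs), len(guard_threshs)))
    n_fa = np.zeros_like(sens, dtype=int)
    for i, a_th in enumerate(alpha_threshs):
        for j, g_th in enumerate(guard_threshs):
            det = (alpha > a_th) & (guard < g_th)   # the Alpha+Guard rule
            sens[i, j] = np.mean(det[eyes_closed])
            n_fa[i, j] = np.sum(det & ~eyes_closed)
    return sens, n_fa

# toy data: 6 blocks, the first 3 with eyes closed
alpha = np.array([4.0, 5.0, 3.0, 1.0, 1.0, 4.0])
guard = np.array([1.0, 1.0, 1.0, 1.0, 1.0, 1.0])
closed = np.arange(6) < 3
sens, n_fa = sweep_thresholds(alpha, guard, closed,
                              alpha_threshs=[2.0, 3.5],
                              guard_threshs=[2.5])
```

The two returned tables are exactly what gets fed into the contour plots below -- one contour plot per table.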
<br />
<u>Effect on Sensitivity: </u>The first plot below shows how the Alpha and Guard thresholds affect our detection sensitivity (the fraction of eyes-closed EEG activity that was correctly detected). The black "x" shows our current threshold values (3.5 uV for Alpha, and 2.5 uV for the Guard). At the end of my previous post, I said that this achieved a sensitivity of 65%, which agrees with the plot below. If I want higher sensitivity, this plot clearly shows that the ideal is toward the upper-left, which means lowering my Alpha detection threshold and raising my Guard rejection threshold.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjigmDrVPDrupk03xsiP3NOSeYizdaUiNbnPcwg5yScQXkyrXmykQR6P2Fv2edZ2HdgdQ6KzUwwQ_u1AZqGq3ELfw9__zGABL_kbX63wYgJQAFXWeMImmfXl8_hyphenhyphenThlhqY0_4Ru7gNY0dI/s1600/True_Contour.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="308" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjigmDrVPDrupk03xsiP3NOSeYizdaUiNbnPcwg5yScQXkyrXmykQR6P2Fv2edZ2HdgdQ6KzUwwQ_u1AZqGq3ELfw9__zGABL_kbX63wYgJQAFXWeMImmfXl8_hyphenhyphenThlhqY0_4Ru7gNY0dI/s1600/True_Contour.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">After trying all combinations of Alpha and Guard thresholds (as shown by the black dots), we can see how the threshold values affect our sensitivity in detecting the true Alpha activity. The highest sensitivity is in the top-left of this plot, meaning we want the lowest threshold for Alpha detection and the highest threshold for Guard rejection.</td></tr>
</tbody></table>
<br />
<u>Effect on False Alarms:</u> The desire for high sensitivity, however, must be balanced by the need for a low false alarm rate. The plot below shows how the threshold values affect the number of false alarms for this data set. As before, the black "x" shows our current threshold values: 3.5 uV for Alpha, and 2.5 uV for the Guard. In my previous post, I said that this point resulted in 2 false alarms, and the plot below agrees with that finding. This plot says that if we want to further lower the false alarm rate, we need to move down and to the right, which is to lower our Guard rejection threshold and to raise the Alpha detection threshold. This is exactly the opposite direction that was needed to increase our sensitivity. Again, we're seeing that it is a struggle to optimize sensitivity versus false alarms.<br />
<div>
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjQwHgxDK6aP7uBmzfHQad9oF5_OmA5qoqqK6s4yY4KtBFAX4qhpnpa_mfO9_5V1mHzXrSm5q40n_Xg7diWGPcX4tjdfoZx1VlL1Tw6yvVT0QesbBAgL8Pa1HDA78cxGvioPYcO9M2p4bg/s1600/False_Contour.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="311" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjQwHgxDK6aP7uBmzfHQad9oF5_OmA5qoqqK6s4yY4KtBFAX4qhpnpa_mfO9_5V1mHzXrSm5q40n_Xg7diWGPcX4tjdfoZx1VlL1Tw6yvVT0QesbBAgL8Pa1HDA78cxGvioPYcO9M2p4bg/s1600/False_Contour.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">After trying all combinations of Alpha and Guard thresholds (as shown by the black dots), we can see how the threshold values affect the number of false alarms (incorrect detection of non-Alpha activity). The fewest false alarms are seen in the bottom-right of this plot, meaning we want the highest threshold for Alpha detection and the lowest threshold for Guard rejection.</td></tr>
</tbody></table>
<br />
<u>Best Sensitivity for a Given False Alarm Rate:</u> Given that it is a trade between sensitivity and false alarm rate, how does one choose where to set the detection and rejection thresholds? One approach would be to simply start by picking a number of false alarms that would seem to be acceptable. Then, one can search through the threshold values to find the combination that achieves the best sensitivity at the given false alarm rate. That's what I did to make the plot below. I used the dense grid of values from before and directly plotted the sensitivity and false alarm count for every combination of threshold values. Clearly, there is an upper limit running across this plot. So, I added the green line to define the maximum sensitivity (for this detection algorithm) that can be achieved for any given false alarm rate.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhC0f2ZrM3Gy4oz3nJs7NHZIqMdl_Yl1n6bUw_YdmNfcrRC_gHDumyFObYqRHPMDqZ-_-QAwzowaP-AXWj-1kvqxZw2oKCMe550w-kl1tw3adrQTi6mbllKsLlK9QFbt472Pi16oswTKhY/s1600/BuildUpROC.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="278" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhC0f2ZrM3Gy4oz3nJs7NHZIqMdl_Yl1n6bUw_YdmNfcrRC_gHDumyFObYqRHPMDqZ-_-QAwzowaP-AXWj-1kvqxZw2oKCMe550w-kl1tw3adrQTi6mbllKsLlK9QFbt472Pi16oswTKhY/s1600/BuildUpROC.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">A plot of the sensitivity versus number of false alarms for every combination of threshold values analyzed earlier. There is clearly a maximum sensitivity value that occurs for each given false alarm rate. This is the Receiver Operating Characteristic (ROC) curve for this detection algorithm.</td></tr>
</tbody></table>
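Extracting that green line from the cloud of brute-force results amounts to taking, for each allowed false alarm count, the best sensitivity seen at or below that count. A sketch (with toy numbers standing in for the real sweep results):

```python
import numpy as np

def roc_envelope(sens, n_fa):
    """Best achievable sensitivity at each allowed false-alarm count.

    sens, n_fa : results of the threshold sweep (any shape; flattened here).
    Returns (fa_counts, best_sens) -- the upper envelope of the
    sensitivity-vs-false-alarm scatter, i.e., the ROC curve.
    Assumes at least one threshold pair yields zero false alarms.
    """
    sens = np.asarray(sens, dtype=float).ravel()
    n_fa = np.asarray(n_fa).ravel()
    fa_counts = np.arange(int(n_fa.max()) + 1)
    # allowing more false alarms can only help, so take a running best
    best_sens = np.array([sens[n_fa <= n].max() for n in fa_counts])
    return fa_counts, best_sens

# toy sweep results: four threshold pairs
fa_counts, best = roc_envelope(sens=[0.5, 0.7, 0.9, 0.6],
                               n_fa=[0, 1, 3, 1])
```

By construction the envelope is non-decreasing, which is why the green line never dips as you move right toward more allowed false alarms.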
<u>ROC Curve:</u> The green line in the plot above is a very useful tool for understanding the performance of the detection algorithm. It is so useful that it has a name -- it is the "<a href="http://en.wikipedia.org/wiki/Receiver_operating_characteristic">receiver operating characteristic</a>" (ROC) curve for this detection algorithm. The name is kinda funny because it originated in World War II during the development of radar. In radar, one is trying to detect pulses of radio energy that have been reflected off distant aircraft. When comparing different radar receivers, one can always appear to be more sensitive, though at the cost of increasing the rate of false alarms. So, they developed the ROC curve as a way to make it easier to understand, compare, and optimize detection systems. It was such an effective tool that it has become a core technique in the general field of signal detection and classification.<br />
<br />
<u>Comparing Alpha Detection Systems:</u> As an example of how useful it is in comparing detection "systems", the ROC curves below show the performance of the "Alpha+Guard" algorithm versus my original <a href="http://eeghacker.blogspot.com/2014/10/detecting-alpha-waves-threshold.html">"Alpha Only" detection algorithm</a>. As can be seen, the ROC curve for the Alpha+Guard algorithm is higher at every point -- it is superior at any allowed false alarm rate. Interestingly, this plot also exposes how the "Alpha Only" algorithm was never able to have fewer than 10 false alarms in this EEG recording. If one cares about a low false alarm rate (which I certainly do), the ROC curve exposes this critical information.<br />
<div>
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgphUNyfP3cRMI8QapstswWNn5f8Rfp-3HFB8ezC499_28kK8Yy1_R6HQeueHZfd9QkOQ9dvqTHOH1C-gzcdxtHRLiX2H0muMiBMkUzPrH0ZstKnGmg0san4jla6WPXFRyF0mNJQ3wNrZ0/s1600/ROC.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="267" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgphUNyfP3cRMI8QapstswWNn5f8Rfp-3HFB8ezC499_28kK8Yy1_R6HQeueHZfd9QkOQ9dvqTHOH1C-gzcdxtHRLiX2H0muMiBMkUzPrH0ZstKnGmg0san4jla6WPXFRyF0mNJQ3wNrZ0/s1600/ROC.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">ROC curves for two different detection algorithms: "Alpha+Guard" detects in the Alpha band and rejects based on the Guard bands, while "Alpha Only" does not include the rejection based on the Guard band. The ROC curve for "Alpha+Guard" is higher at all false alarm rates, thereby showing its performance is superior.</td></tr>
</tbody></table>
<br />
<u>Selecting the Operating Point:</u> The ROC curve is an excellent visualization for deciding which detection algorithm to use. It is also helpful for understanding where the steepest trade-off exists between sensitivity and false alarms. For the Alpha+Guard algorithm (unlike the Alpha Only algorithm), we see that the curve is generally quite flat, so there is not a severe penalty in targeting a very low false alarm rate. As an experiment, let's target zero false alarms. The plot above tells me that, at zero false alarms, it is possible to achieve a sensitivity of 56%. Fine. But what threshold values should I use to get this performance? The ROC curve itself does not tell me that information.<br />
<br />
<u>Finding the Threshold Values:</u> When I made the ROC curve, I built it up by plotting the data generated through a brute-force assessment of all pairs of Alpha and Guard threshold values. These were the green circles shown two plots ago. The ROC curve came from a subset of those green circles. So, I can go back and find the threshold values that correspond to the green circles that just fall on the ROC curve. These threshold values would be the optimal threshold values for a given false alarm rate. The figure below plots these optimal threshold values as a function of allowed false alarms (ie, allowed number of incorrect detections). If we are targeting zero false alarms, this plot says that we should choose an Alpha detection threshold of about 3.75 uV and a Guard rejection threshold of about 2.55 uV.<br />
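In code, this lookup is just a matter of masking the sweep tables by the false alarm budget and picking the most sensitive surviving threshold pair. A sketch with made-up tables (not my actual sweep results):

```python
import numpy as np

def best_thresholds(sens, n_fa, alpha_threshs, guard_threshs, max_fa):
    """Threshold pair with the highest sensitivity among all pairs
    whose false-alarm count is at or below max_fa."""
    masked = np.where(n_fa <= max_fa, sens, -1.0)  # disqualify noisy pairs
    i, j = np.unravel_index(np.argmax(masked), masked.shape)
    if masked[i, j] < 0.0:
        return None  # no pair meets the false-alarm budget
    return alpha_threshs[i], guard_threshs[j], sens[i, j]

# toy sweep tables: 2 Alpha thresholds x 2 Guard thresholds
sens = np.array([[0.9, 0.8],
                 [0.6, 0.5]])
n_fa = np.array([[3, 0],
                 [1, 0]])
a_th, g_th, s = best_thresholds(sens, n_fa,
                                alpha_threshs=[3.0, 4.0],
                                guard_threshs=[2.5, 1.5],
                                max_fa=0)
```

In this toy case the zero-false-alarm budget forces the pick away from the most sensitive pair (0.9, which costs 3 false alarms) to the best clean pair (0.8) -- the same trade the figure below captures across the whole range of allowed false alarms.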
<div>
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhRQDgG9Z9x6QDXrWIggEFldcC5UYXHNYguXoRd3kJvoYjKewsTfBMVB2NsTf3SzCoLJYdcFoj4yMUeplTI_0hwzig2NJjOJbSxqaeBcmA6rDEbQSz0S_HfJ_OpnSI6WRk8dMkOVfknAO4/s1600/ChoosingThresholdValues.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="260" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhRQDgG9Z9x6QDXrWIggEFldcC5UYXHNYguXoRd3kJvoYjKewsTfBMVB2NsTf3SzCoLJYdcFoj4yMUeplTI_0hwzig2NJjOJbSxqaeBcmA6rDEbQSz0S_HfJ_OpnSI6WRk8dMkOVfknAO4/s1600/ChoosingThresholdValues.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">For a given number of allowed incorrect detections (ie, false alarms), this plot shows the Alpha and Guard threshold values that result in the highest sensitivity (ie, most number of correctly identified Alpha activity). As with all figures in this analysis, the curves above are derived from a single EEG recording, which is why the curves are noisy.</td></tr>
</tbody></table>
<br />
<div>
<u>Re-Process the EEG Data:</u> If we use these two threshold values and reprocess our EEG data, we get the detection performance seen below. Again, the top plot is the raw EEG recording. The middle plot shows the EEG amplitude in the Guard band along with (in red) the Guard rejection threshold. The bottom plot shows the EEG amplitude in the Alpha band along with (in black) the Alpha detection threshold. The red circles show those points that satisfy both thresholds -- they are the "detections" resulting from our algorithm. You'll note that there are only detections when my eyes are closed -- there are no false detections. This looks really good! I think that I'm getting close to being able to deploy this into my OpenBCI GUI for real-time operation in detecting Alpha waves.<br />
<div>
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhNeiFmU1qxM7ntF-nUbIANum0YLM0IC3PFFhKY3wli6IdUE7CAup2QBXV-Ia7Zo4iPoyVcJubkVd_5lhq3o7b6Xd24VA6JNzFhqyPIHr_cBLHjFnfQLvFYdYx7Ry2WfHVOEDVmfUzMSvo/s1600/NoFalseAlarms.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="591" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhNeiFmU1qxM7ntF-nUbIANum0YLM0IC3PFFhKY3wli6IdUE7CAup2QBXV-Ia7Zo4iPoyVcJubkVd_5lhq3o7b6Xd24VA6JNzFhqyPIHr_cBLHjFnfQLvFYdYx7Ry2WfHVOEDVmfUzMSvo/s1600/NoFalseAlarms.png" width="480" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Detection Performance Using the Detection Thresholds for Zero False Alarms. The detections (red circles) resulting from this approach look pretty good!</td></tr>
</tbody></table>
<br />
<u>Summary:</u> In this post I introduced a number of new ways of analyzing and plotting EEG data so that we can optimize the performance of our Alpha detection algorithm. Specifically, I showed how the ROC curve is a great way of visualizing system performance in a way that smoothly handles the inherent trade-off between sensitivity and false alarms. I showed how the ROC curve is a simple way to quickly see how one detection algorithm is better than another ("Alpha+Guard" is clearly superior to "Alpha Only") and to quickly see that there is little penalty for driving to an even lower false alarm rate.</div>
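If you'd like to play with this kind of analysis yourself, here's a minimal Python sketch of the threshold sweep that underlies an ROC curve. It assumes you already have a per-block Alpha amplitude and an eyes-closed truth label for each block; the toy data and variable names below are mine for illustration, not from my actual analysis files.

```python
import numpy as np

def roc_points(alpha_amp, is_eyes_closed, thresholds):
    """Sweep a detection threshold; return (false alarms, sensitivity) pairs.

    alpha_amp      : per-block Alpha-band amplitude (uVrms)
    is_eyes_closed : boolean truth label for each block
    """
    points = []
    for thresh in thresholds:
        detected = alpha_amp > thresh
        n_false = int(np.sum(detected & ~is_eyes_closed))  # detections outside eyes-closed
        sens = float(np.mean(detected[is_eyes_closed]))    # fraction of true blocks caught
        points.append((n_false, sens))
    return points

# Toy data: eyes-closed blocks tend to have higher Alpha amplitude
rng = np.random.default_rng(0)
truth = np.arange(200) >= 100
amp = np.where(truth, rng.normal(4.5, 1.0, 200), rng.normal(2.0, 1.0, 200))
curve = roc_points(amp, truth, np.arange(1.0, 7.0, 0.5))
```

Plotting sensitivity against false alarms for the swept thresholds gives you the ROC curve; picking the operating point is then just choosing where on that curve you want to live.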
<div>
<br /></div>
<div>
<u>Next Steps:</u> Up until now, we've just been looking at one EEG recording. Therefore, all of our conclusions on which algorithm to use and which detection thresholds to use have been based upon a very small amount of data. The results seen here might not generalize well, which means that my algorithm might not perform well when faced with future data. To address this possibility, my next step is to bring in some of my other EEG recordings, all of which are noisier and more challenging. With the additional data, we can better optimize our algorithm and have greater confidence that it'll work well on future data that has not been part of its training.<br />
<div>
<br />
<u>Follow Up:</u> I examine the performance on six different EEG recordings <a href="http://eeghacker.blogspot.com/2014/10/alpha-detection-comparison-across-eeg.html">here</a>.</div>
</div>
</div>
Chiphttp://www.blogger.com/profile/10352943033779293161noreply@blogger.com3tag:blogger.com,1999:blog-7276377053120174333.post-40954406729984611252014-10-15T07:08:00.004-04:002014-10-19T15:21:31.993-04:00Detecting Alpha Waves - Guard BandsIn my <a href="http://eeghacker.blogspot.com/2014/10/detecting-alpha-waves-threshold.html">previous post</a>, I discussed a simple algorithm for detecting Alpha rhythms: (1) use an FFT spectrum to measure the EEG amplitude in the Alpha band and (2) compare this value to a fixed detection threshold to decide if Alpha waves are present. As shown in the figure below, this approach yields good detection sensitivity (it correctly flags 66% of the eyes-closed data blocks) and a reasonably low number of false alarms (it incorrectly flags 15 data blocks). While this is good, I think that I can do better. Let's talk about how...<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjq41tlUZS4wfkogl83CoGQfWiGl2klSZlCVZfSkmAafPzca5mf3VCBETFb5t6J6Y7bAcPynNzOu6B6M46YXQas2dApZWmJKbuyLSW7Hfr_GITUVVb48ZgV4r8ZcNVv8aQ_oRs03RQsHUo/s1600/spec_alpha_detect.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjq41tlUZS4wfkogl83CoGQfWiGl2klSZlCVZfSkmAafPzca5mf3VCBETFb5t6J6Y7bAcPynNzOu6B6M46YXQas2dApZWmJKbuyLSW7Hfr_GITUVVb48ZgV4r8ZcNVv8aQ_oRs03RQsHUo/s1600/spec_alpha_detect.png" height="392" width="480" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Example EEG data (top) showing Alpha rhythms when my eyes are closed. At each time slice, I measure the peak of the spectrum in the Alpha band (7.5-11.5 Hz), which yields the blue trace on the bottom. By looking for any value above 3.5 uVrms, we are able to detect the presence of Alpha waves (as indicated by the red circles).</td></tr>
</tbody></table>
<br />
<u>Alpha Band Detection is Not Specific Enough:</u> In the simple Alpha band detection algorithm discussed above, we are sensitive to any signal with lots of energy in the Alpha (7.5-11.5 Hz) band. The problem is that there are signals besides Alpha rhythms that have energy in the Alpha band. For example, the bottom plot below shows the spectrum (black line) for a segment of eyes-closed Alpha waves. The plot also shows the spectrum (red line) for a segment of "other" activity that is not an Alpha rhythm (it is probably motion artifact from the EEG lead wires). As can be seen, both spectra show substantial energy in the Alpha band, and so they would both be flagged as "Alpha!" using my simple threshold detection approach. For the segment of "other" activity, this would be a false alarm. I don't want that. I want to improve my algorithm to reject this kind of false alarm.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg9NLyoTWQf54AxhR16wKVw1A_hnqSzNQXaY7Z5GYoVzC25_o5yvWsWR08m3hlFSVO36AqCs_Hqv4qdQw_m1IgGUlb34pV3V1s5H9zExfRZ9RkVf4WLHqgNVEtDUtI6tctEBsHUuGtjHfI/s1600/spec_alpha_withExamples.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg9NLyoTWQf54AxhR16wKVw1A_hnqSzNQXaY7Z5GYoVzC25_o5yvWsWR08m3hlFSVO36AqCs_Hqv4qdQw_m1IgGUlb34pV3V1s5H9zExfRZ9RkVf4WLHqgNVEtDUtI6tctEBsHUuGtjHfI/s1600/spec_alpha_withExamples.png" height="392" width="480" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Two spectra: (1) a segment of eyes-closed Alpha rhythm that I do want to detect and (2) a segment of "other" EEG activity that I do not want to detect. Both, however, show a high amplitude in the Alpha band. So, my original detection rule that is simply based on the Alpha amplitude would not reject the "other" activity.</td></tr>
</tbody></table>
<u><br /></u>
<u>Introduce "Guard" Bands:</u> One way of distinguishing between the two example spectra above is to introduce "guard" bands on either side of the Alpha band. The idea is that we measure the signal amplitude both in the Alpha band <i>and</i> in the guard bands. Based on the plots above, we know that true eyes-closed Alpha activity will not show much energy in the guard bands whereas the confusing "other" activity can be rejected because it does show energy in the guard bands.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiGT6lNLcrgY-NvYpdWsGVsoh3vppxyl3i2KXeW1od6tb_12foxGFlWH43lGT2GTH_PpH2qu4cla36pFg61V5mu290Km2-XwABVkiF3mCU_0kSZfftPeJF9GCL-eYVKl0c31R9Kg3CUhck/s1600/guard_withExamples.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiGT6lNLcrgY-NvYpdWsGVsoh3vppxyl3i2KXeW1od6tb_12foxGFlWH43lGT2GTH_PpH2qu4cla36pFg61V5mu290Km2-XwABVkiF3mCU_0kSZfftPeJF9GCL-eYVKl0c31R9Kg3CUhck/s1600/guard_withExamples.png" height="198" width="480" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">If we measure the mean EEG amplitude in the guard bands, as well as in the Alpha band, we can distinguish between the two signals. True Alpha rhythms will not have much energy in the guard bands whereas most of our confusing "other" activity will show substantial energy in the guard bands.</td></tr>
</tbody></table>
<u><br /></u>
<u>Evaluating the Guard Amplitude:</u> To quantify the amplitude in the guard bands, I simply take the average of all the spectrum values that fall within our two guard bands (3-6.5 Hz and 13-18 Hz). When I do this for our EEG data, I get the green trace shown in the middle plot below. As you can see, it stays low during all of the legitimate eyes-closed Alpha activity and it jumps high only during the confusing other activity. This looks promising!<br />
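For anyone following along in code, here's a sketch (in Python, since that's where I'm headed) of how one might compute the Alpha and Guard amplitudes from a block of EEG samples. The FFT scaling and the Hann window are my simplifications, not necessarily what the OpenBCI GUI does internally.

```python
import numpy as np

def band_amplitudes(eeg_block, fs=250.0):
    """Return (alpha_amp, guard_amp) for one block of EEG samples.

    Alpha amplitude = peak spectrum value in the 7.5-11.5 Hz band.
    Guard amplitude = mean spectrum value in 3-6.5 Hz and 13-18 Hz.
    fs=250 Hz matches the OpenBCI default; the amplitude scaling here
    is a simplified stand-in for the GUI's uVrms calculation.
    """
    n = len(eeg_block)
    spec = np.abs(np.fft.rfft(eeg_block * np.hanning(n))) / n
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    alpha = (freqs >= 7.5) & (freqs <= 11.5)
    guard = ((freqs >= 3.0) & (freqs <= 6.5)) | ((freqs >= 13.0) & (freqs <= 18.0))
    return spec[alpha].max(), spec[guard].mean()

# A pure 10 Hz sine should score high in the Alpha band
# and low in the guard bands
t = np.arange(500) / 250.0
alpha_amp, guard_amp = band_amplitudes(np.sin(2 * np.pi * 10 * t))
```

A broadband artifact, by contrast, would push the guard amplitude up along with the Alpha amplitude, which is exactly what the combined rule exploits.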
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgyT3OddrcXBoVZh0mJ396XzSmwiQoD2gAZPIuJ1YdXj9N-I5pu_rS9GPjalQcAtB5aYuxIno6aicCmioWHet-9gtWxqYcb1lYL3wie55_O_K_rUM5fvd_jdUnNKXE-JIVL64WNcYCJfkQ/s1600/detect_alpha_and_guard.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgyT3OddrcXBoVZh0mJ396XzSmwiQoD2gAZPIuJ1YdXj9N-I5pu_rS9GPjalQcAtB5aYuxIno6aicCmioWHet-9gtWxqYcb1lYL3wie55_O_K_rUM5fvd_jdUnNKXE-JIVL64WNcYCJfkQ/s1600/detect_alpha_and_guard.png" height="591" width="480" /></a></div>
<br />
<u>Combined Detection Rules:</u> Based on the graphs above of the guard amplitude (green line) and of the Alpha amplitude (blue line), it looks like a good combination of rules would be to look for points where the Alpha amplitude is greater than 3.5 uVrms and, simultaneously, where the guard amplitude is less than 2.5 uVrms. When I apply these detection rules, I get the red circles shown in the figure above. Looks pretty good! You'll note that the addition of the guard band has successfully rejected the false alarms that we had been getting at t=58, t=77, and t=123. This is exactly what I was hoping for.<br />
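The combined rule itself is just two comparisons. Here's a quick sketch in Python; the threshold values are the ones from this post, while the amplitude arrays are made-up inputs for illustration.

```python
import numpy as np

# Combined rule: declare Alpha only when the Alpha amplitude exceeds its
# threshold AND the guard amplitude stays below its threshold.
ALPHA_THRESH = 3.5   # uVrms, from this post
GUARD_THRESH = 2.5   # uVrms, from this post

def detect_alpha(alpha_amp, guard_amp):
    alpha_amp = np.asarray(alpha_amp)
    guard_amp = np.asarray(guard_amp)
    return (alpha_amp > ALPHA_THRESH) & (guard_amp < GUARD_THRESH)

# Example: the second block has high Alpha amplitude but also high guard
# energy (e.g. a broadband motion artifact), so it is rejected.
hits = detect_alpha([4.0, 5.0, 2.0], [1.0, 3.0, 1.0])
```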
<br />
<u>Quantifying the Improvement:</u> Compared to yesterday's results (tabulated below), this new detection algorithm obtains nearly the same sensitivity (65% vs 66%) with a greatly reduced number of false alarms (2 vs 15). This is definitely an improvement in my Alpha detecting algorithm.<br />
<span style="font-family: 'Courier New', Courier, monospace;"> </span><br />
<span style="font-family: Courier New, Courier, monospace;">Guard N_TRUE N_FALSE</span><br />
<span style="font-family: Courier New, Courier, monospace;">None 101 (66%) 15</span><br />
<span style="font-family: Courier New, Courier, monospace;">2.5 uVrms 100 (65%) 2</span><br />
<span style="font-family: Courier New, Courier, monospace;"><br /></span>
<br />
<u>Moving Forward:</u> For this example EEG recording, I am satisfied with the performance of this algorithm. It would give me quite reliable performance while still being nicely sensitive. This EEG recording was pretty "clean", however -- its Alpha was pretty strong and there was not too much confusing "other" activity. I have other EEG recordings that are more difficult. Next time, we'll look at those harder recordings; you'll see that even the combined Alpha+Guard algorithm is insufficient, and I'll discuss yet another extension (hopefully an improvement!) on this detection approach.<br />
<br />
<u>Follow-Up:</u> I further optimize this algorithm by using ROC curves to attack, head-on, the trade-off between sensitivity and false alarms. Check it out <a href="http://eeghacker.blogspot.com/2014/10/detected-alpha-waves-roc-curves.html">here</a>.<br />
<br />
<br />Chiphttp://www.blogger.com/profile/10352943033779293161noreply@blogger.com3tag:blogger.com,1999:blog-7276377053120174333.post-13796997115631448302014-10-14T09:21:00.001-04:002017-04-14T16:43:39.232-04:00Detecting Alpha Waves - Threshold DetectionIn my <a href="http://eeghacker.blogspot.com/2014/10/first-alpha-with-openbci-v3.html">previous post</a>, I showed some EEG data that I recorded from the brand-new <a href="http://openbci.com/">OpenBCI</a> V3 board. The data that I showed included some <a href="http://en.wikipedia.org/wiki/Alpha_wave">Alpha waves</a> that my brain generated (like most people's brains) simply by closing my eyes. I've copied a spectrogram of that EEG data below. You can see the Alpha waves as the horizontal stripe of energy near 10 Hz. While it is pretty easy to see (to "detect") this signal by eye, it might be fun to get the computer to automatically detect these Alpha waves, so that you can use Alpha waves to make <a href="http://eeghacker.blogspot.com/2013/11/openbci-alpha-wave-detector.html">a brain-controlled light</a>, or <a href="http://eeghacker.blogspot.com/2014/06/controlling-hex-bug-with-my-brain-waves.html">a brain-controlled robot</a>, or to do some other cool hacking shenanigan. How do we get the computer to detect the Alpha waves? In this post, and in some follow-up posts, I'm going to discuss a few ways...<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjNiZlU40KbwQ22rhmpj2FYH1BgS5Y31f6swH-KY0YIXl3tfGFfbYxO_Ywtjx1piPqh56OQ6C2HD2wZzvufQMhxFZnbucOr22CaHr-TwAkQjSIX3Gy0CfrVLI2_A_RkLnVmJp3fGSsQSMs/s1600/spec.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="200" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjNiZlU40KbwQ22rhmpj2FYH1BgS5Y31f6swH-KY0YIXl3tfGFfbYxO_Ywtjx1piPqh56OQ6C2HD2wZzvufQMhxFZnbucOr22CaHr-TwAkQjSIX3Gy0CfrVLI2_A_RkLnVmJp3fGSsQSMs/s1600/spec.png" width="480" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Spectrogram of the EEG Signals from the Back of my Head ("O1"). Note the horizontal stripe of energy near 10 Hz when my eyes are closed. These are the Alpha waves being generated in my <a href="http://en.wikipedia.org/wiki/Occipital_lobe">occipital </a>lobe.</td></tr>
</tbody></table>
<u><br /></u>
<u>Simple Approach First:</u> There is a huge body of literature out there on the various signal processing techniques to address the "<a href="http://en.wikipedia.org/wiki/Detection_theory">detection</a>" problem. Most approaches (or, at least, the language used to describe the approaches) get very technical very fast, even <a href="http://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-011-introduction-to-communication-control-and-signal-processing-spring-2010/readings/MIT6_011S10_chap14.pdf">in introductory material</a>, so I'm going to take a simple approach first, and only add additional complication as needed to solve particular problems.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh7xNriK4SCuZm_1tsTyLK0_GeQTvwTPOHVDtSmi0CZVWnrST4O7zB2Ev5GcUV3mCDcl6LX3lIn_DObsTY4rex9o9BUyVMQPol9631eP7xEtbWtqQEtzM227K7JRcMYDNfgWBQjtE4koGg/s1600/alpha_spectrum.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="192" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh7xNriK4SCuZm_1tsTyLK0_GeQTvwTPOHVDtSmi0CZVWnrST4O7zB2Ev5GcUV3mCDcl6LX3lIn_DObsTY4rex9o9BUyVMQPol9631eP7xEtbWtqQEtzM227K7JRcMYDNfgWBQjtE4koGg/s1600/alpha_spectrum.png" width="480" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Average EEG Spectrum when My Eyes are Closed. The Alpha rhythm clearly shows up around 10 Hz.</td></tr>
</tbody></table>
<br />
<u>Frequency View:</u> As a human being looking at the spectrogram at the top of this post, I easily see the horizontal stripe of energy that represents my Alpha waves. Since the frequency is nearly constant, the Alpha waves should show up in a simple spectrum view of the signal. The spectrum view plotted above shows the EEG spectrum averaged over the entire period when my eyes are closed. Clearly, there is a strong peak in the Alpha Band (7.5-12 Hz). This is the tool that we will use to measure the Alpha waves.<br />
<br />
<u>Alpha Through Time:</u> Since the Alpha waves are clearly identifiable in the spectrum, and since the <a href="https://github.com/chipaudette/OpenBCI/tree/master/Processing_GUI/OpenBCI_GUI">OpenBCI GUI</a> already computes the spectrum as the EEG data arrives from the OpenBCI board, let's use the spectrum as our tool for focusing on just the Alpha waves. To quantify the amplitude of the Alpha waves, I find the maximum value of the spectrum within the 7.5-12 Hz band. Since the OpenBCI GUI computes a new spectrum every 200 msec, I get a new estimate of the Alpha amplitude five times a second. The plot below shows the estimate of Alpha amplitude that results from this process.<br />
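In code, this sliding estimate of the Alpha amplitude might look something like the sketch below. The 256-point FFT length, the Hann window, and the amplitude scaling are illustrative guesses rather than the GUI's exact settings; only the 200 msec update rate and the 7.5-12 Hz band come from the text.

```python
import numpy as np

def alpha_amplitude_trace(eeg, fs=250.0, fft_len=256, step_s=0.2):
    """Max spectrum value in the 7.5-12 Hz band for each time slice,
    recomputed every step_s seconds (mimicking the GUI's 200 ms update)."""
    step = int(step_s * fs)
    freqs = np.fft.rfftfreq(fft_len, d=1.0 / fs)
    band = (freqs >= 7.5) & (freqs <= 12.0)
    times, amps = [], []
    for start in range(0, len(eeg) - fft_len + 1, step):
        block = eeg[start:start + fft_len]
        spec = np.abs(np.fft.rfft(block * np.hanning(fft_len))) / fft_len
        amps.append(spec[band].max())
        times.append((start + fft_len / 2) / fs)  # time at the slice center
    return np.array(times), np.array(amps)

# Toy signal: 10 Hz "Alpha" is weak for t < 5 s and strong afterwards
t = np.arange(0, 10, 1 / 250.0)
eeg = np.where(t < 5, 0.1, 1.0) * np.sin(2 * np.pi * 10 * t)
times, amps = alpha_amplitude_trace(eeg)
```

The resulting `amps` trace is the kind of curve shown in the plot below: low while the Alpha is weak, high once it turns on.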
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgBp-buGcvW6zEGx_Pl4MIfN_uFd9hkIWDfNVxMr1sBdANsFaSWqQTjxizjjQVlFnqbsDnQnaZXy3BpUCW37mj7sv4e3PdS7y1UGx0ySF9cI-l8dzwqwYggDLpB88GsKR-PtHSUskLod_4/s1600/spec_alpha_noDetect.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="391" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgBp-buGcvW6zEGx_Pl4MIfN_uFd9hkIWDfNVxMr1sBdANsFaSWqQTjxizjjQVlFnqbsDnQnaZXy3BpUCW37mj7sv4e3PdS7y1UGx0ySF9cI-l8dzwqwYggDLpB88GsKR-PtHSUskLod_4/s1600/spec_alpha_noDetect.png" width="480" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Looking at the Alpha Band (7.5-11.5 Hz) through Time. Notice that the amplitude is highly variable. When my eyes are closed, the amplitude is generally much higher.</td></tr>
</tbody></table>
<br />
<u>Choose a Detection Threshold:</u> As you can see in the plot above, the EEG amplitude in the Alpha band increases greatly when my eyes are closed (and, a bit surprisingly, we also see that my Alpha amplitude is not very steady...it varies a lot when my eyes are closed). To have the computer decide when Alpha waves are present, the simplest approach is to pick a threshold value such that, when the signal amplitude is above the threshold, we declare that Alpha waves are present. Looking at the plot, I picked a threshold value of 3.5 uVrms.<br />
<br />
<div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiVjKN_wr6kBJPjSdwcrkwLDK_lt0md3fzq6YQAnR8lslt5VH66BG2Friq9jI0Ll3p75WbMDeu9jsyJ3sPFesj219noVf_72WO99xYY99OvBmj9-3poaWMaZJQ0NdDYDYPTz6PAeNwgAj4/s1600/spec_alpha_detect.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="391" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiVjKN_wr6kBJPjSdwcrkwLDK_lt0md3fzq6YQAnR8lslt5VH66BG2Friq9jI0Ll3p75WbMDeu9jsyJ3sPFesj219noVf_72WO99xYY99OvBmj9-3poaWMaZJQ0NdDYDYPTz6PAeNwgAj4/s1600/spec_alpha_detect.png" width="480" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">The red circles show those data points where the Alpha amplitude is greater than my detection threshold of 3.5 uVrms. It correctly captures most of the data when my eyes are closed, yet it also incorrectly captures a few moments of strong non-Alpha activity.</td></tr>
</tbody></table>
<br /></div>
<u>Detection Results:</u> The plot above shows the effect of setting the detection threshold at 3.5 uVrms. The red circles show those data points where the amplitude in the Alpha band is above the threshold and we would declare that Alpha is present. Based on the good coverage during the "eyes closed" portion of the data, I'd say that this detection threshold yields good sensitivity. <br />
<br />
<u>False Alarms:</u> To improve our sensitivity further, one could imagine lowering the detection threshold so that we capture more of the points within the "eyes closed" region. Doing this, though, would also cause more points outside of the "eyes closed" region to be falsely detected as Alpha waves. Even with our 3.5 uVrms threshold, there are several moments (t = 58, t = 77, t = 123) when broadband EEG activity happens to be strong enough to cross our detection threshold. Since these detections are not due specifically to Alpha activity, we call these false alarms. <br />
<br />
<u>Balancing Sensitivity with False Alarms:</u> Selecting a good detection threshold requires one to balance the desire for high sensitivity with the requirement for a low false alarm rate. After trying several different threshold values (see table below), 3.5 uVrms seems like it provides a decent balance for this EEG recording. Other recordings might require a different threshold value. <br />
<br />
<span style="font-family: "courier new" , "courier" , monospace;">Threshold N_TRUE N_FALSE</span><br />
<span style="font-family: "courier new" , "courier" , monospace;">2.5 uVrms 126 (82%) 43</span><br />
<span style="font-family: "courier new" , "courier" , monospace;">3.0 uVrms 112 (73%) 20 </span><br />
<span style="font-family: "courier new" , "courier" , monospace;">3.5 uVrms 101 (66%) 15</span><br />
<span style="font-family: "courier new" , "courier" , monospace;">4.0 uVrms 75 (49%) 12</span><br />
<span style="font-family: "courier new" , "courier" , monospace;">4.5 uVrms 59 (39%) 11</span><br />
<br />
<u>Moving Forward:</u> With this simple method of quantifying the Alpha amplitude (ie, take the maximum value from the spectrum in the 7.5-12 Hz band) and with this simple method of deciding whether Alpha is present (ie, using a pre-defined detection threshold), we can easily have the computer detect our eyes-closed Alpha waves. Sure, we might have a few false alarms but this is just our first try! In the next post, I'll try adding a few techniques to be more selective to reduce our false alarms, without significantly degrading our sensitivity.<br />
<br />
<u>Follow-Up:</u> See how I reduce the false alarms <a href="http://eeghacker.blogspot.com/2014/10/detecting-alpha-waves-guard-bands.html">by introducing Guard Bands</a>!Chiphttp://www.blogger.com/profile/10352943033779293161noreply@blogger.com3tag:blogger.com,1999:blog-7276377053120174333.post-21847249709692428492014-10-06T07:37:00.001-04:002017-04-14T16:43:45.279-04:00First Alpha with OpenBCI V3OK, I'm back to work now. After <a href="http://eeghacker.blogspot.com/2014/08/first-data-with-openbci-v3.html">my previous post</a>, where I got my first data ever with the new <a href="http://www.openbci.com/">OpenBCI</a> board (aka, "V3"), I took a little hiatus while Joel worked through some issues with the Bluetooth link. Everything appears to be working well now, so I'm back on the case. Yesterday, I connected everything up and recorded my first real EEG data with the V3. Exciting!<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEge1bP-30V476tl2lBVOMPA-1H0Pn2QF0JtIWekdIT7Fkw3CGRz6apYrP32QQGMuxTgcLtJbMauksPHxAVz4-Zr1uKGyATFBvR2y5Nl6n1Lf9-7gtk61uY2V3EZOGYUdBX4gK4-iGzPba8/s1600/IMG_3195.JPG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="266" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEge1bP-30V476tl2lBVOMPA-1H0Pn2QF0JtIWekdIT7Fkw3CGRz6apYrP32QQGMuxTgcLtJbMauksPHxAVz4-Zr1uKGyATFBvR2y5Nl6n1Lf9-7gtk61uY2V3EZOGYUdBX4gK4-iGzPba8/s1600/IMG_3195.JPG" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">My Little OpenBCI V3 Board (8-Bit Version) with Homemade Electrode Adapter</td></tr>
</tbody></table>
<br />
<u>EEG Setup:</u> To get started, I set an easy goal for myself -- just record some eyes-closed alpha waves. So, I got out my trusty gold electrodes, my trusty Ten20 electrode paste, and put on a few electrodes. I attached one electrode to the back-left of my head ("O1"), the reference electrode to my left earlobe, and the bias electrode to my right earlobe. So far, this is just like normal.<br />
<br />
<u>Software Setup:</u> For software, I used the Arduino IDE to program the OpenBCI Bluetooth dongle (aka, the RFduino "Host"), the remote Bluetooth module on the OpenBCI V3 board itself (aka, the RFduino "Device"), and the Atmel microcontroller that is the core of the OpenBCI V3 board (and which is programmed like an Arduino Uno). The software is surely going to change with time, but right now I'm working with this code <a href="https://github.com/chipaudette/EEGHacker/tree/master/Arduino/OBCI_V3_Debugging/03-ThirdCodeFromJoel">here</a>. On the PC side, I used a version of our <a href="https://github.com/chipaudette/EEGHacker/tree/master/Processing/OBCI_V3_Debugging/OpenBCI_GUI">Processing GUI</a> that we modified to accept the new binary data format being generated by the V3 board.<br />
<br />
<u>Data and Analysis:</u> I did a couple of recordings of my eyes-closed alpha waves. My data and analysis files are <a href="https://github.com/chipaudette/EEGHacker/tree/master/Data/2014-10-03%20V3%20Alpha">here</a>. Some example plots of the data that I recorded are shown below. This is my first time trying to analyze the data using <a href="http://eeghacker.blogspot.com/2014/10/moving-from-matlab-to-python.html">Python instead of Matlab</a>. Because I'm so new with Python, I was a lot slower in doing the analysis, but now that I've completed this one little task, I'm feeling pretty OK about the switch. Maybe, just maybe, it is possible to learn new tricks! <br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiMOyaPaelEsI0muuwiAfoPjxqL0_XXiyf9q8lm9xnBskhL0eDzrorBnYejl8izCUulFN28icvw1azFfyXpVJn1uh29MiubpDGvJQWW_86e7Ay1FLAA3cqrFm1DPuj5rnVLTCl1gGd_-BE/s1600/2014-10-05_Alpha_NoCaffeine.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="541" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiMOyaPaelEsI0muuwiAfoPjxqL0_XXiyf9q8lm9xnBskhL0eDzrorBnYejl8izCUulFN28icvw1azFfyXpVJn1uh29MiubpDGvJQWW_86e7Ay1FLAA3cqrFm1DPuj5rnVLTCl1gGd_-BE/s1600/2014-10-05_Alpha_NoCaffeine.png" width="440" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">EEG Data Recorded from the Back-Left of my Head ("O1") After Closing my Eyes<br />
around t = 88 seconds. By closing my eyes, I get alpha waves appearing near 10 Hz.</td></tr>
</tbody></table>
<br />
<u>Time-Domain Plot:</u> The top plot is a simple plot of the recorded EEG signal as a function of time. Actually, it's not a totally "simple" plot because I have done some processing of the data. I highpass filtered it to remove the DC component and I notch filtered it at 60 Hz and 120 Hz to get rid of power line interference. In my opinion, though, time-domain plots are not very useful when zoomed out to a wide range of time (like we're doing here). So, there's not much to say.<br />
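For reference, the highpass and notch filtering described above can be done in a few lines with scipy. The cutoff frequency and filter orders below are reasonable guesses rather than the exact values I used, and I notched both 60 Hz and 120 Hz in my analysis; only the 60 Hz notch is shown here.

```python
import numpy as np
from scipy import signal

FS = 250.0  # OpenBCI sample rate (Hz)

def clean_eeg(raw):
    """Highpass to remove the DC component, then notch out 60 Hz mains."""
    # 2nd-order Butterworth highpass at 0.5 Hz (cutoff is a guess)
    b_hp, a_hp = signal.butter(2, 0.5 / (FS / 2), btype='highpass')
    x = signal.filtfilt(b_hp, a_hp, raw)
    # Narrow IIR notch at 60 Hz
    b_n, a_n = signal.iirnotch(60.0, Q=30.0, fs=FS)
    return signal.filtfilt(b_n, a_n, x)

# Toy signal: 10 Hz brain wave + DC offset + 60 Hz interference
t = np.arange(0, 4, 1 / FS)
raw = np.sin(2 * np.pi * 10 * t) + 5.0 + 0.8 * np.sin(2 * np.pi * 60 * t)
cleaned = clean_eeg(raw)
```

Using `filtfilt` (forward-backward filtering) keeps the filters from shifting the EEG features in time, which matters when you're lining the traces up against an "eyes closed" marker.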
<br />
<u>Spectrogram:</u> The middle plot is a spectrogram of the same data. I love <a href="http://eeghacker.blogspot.com/2014/05/eeg-as-wav-files-go-spectrograms.html">spectrograms</a>. Here, time is again on the horizontal axis, but now frequency is on the vertical axis. The intensity of the color of each pixel shows how much signal energy is at the pixel's time and frequency. You can clearly see the alpha waves as the red horizontal line that appears near 10 Hz. Cool!<br />
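If you want to compute a spectrogram like this yourself, scipy makes it easy. The segment length and overlap below are typical choices, not necessarily the ones behind my plot; the toy signal just mimics alpha turning on partway through a recording.

```python
import numpy as np
from scipy import signal

FS = 250.0
t = np.arange(0, 8, 1 / FS)
# Toy EEG: 10 Hz "alpha" only while the eyes are closed (t > 4 s),
# plus a little background noise
eeg = np.where(t > 4, 1.0, 0.0) * np.sin(2 * np.pi * 10 * t) \
      + 0.2 * np.random.default_rng(1).standard_normal(len(t))

# f: frequency bins, tt: slice-center times, Sxx: power at each (f, tt)
f, tt, Sxx = signal.spectrogram(eeg, fs=FS, nperseg=256, noverlap=128)

# The row nearest 10 Hz should light up once the "eyes close"
row = np.argmin(np.abs(f - 10.0))
```

From there, something like `matplotlib.pyplot.pcolormesh(tt, f, 10 * np.log10(Sxx))` renders the time-frequency image.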
<br />
<u>Frequency-Domain Plot:</u> Unfortunately, it is difficult to be quantitative about the amplitude of signals that are seen in the spectrogram. So, once I located my alpha waves (t = 90 sec to t = 118 seconds), I plotted the mean spectrum for the data just in that time period. The bottom plot shows this spectrum -- it shows the spectrum of my brain waves during t = 90 sec to t = 118 sec. You can see the prominent bump around 10 Hz. These are my alpha waves. As can be seen, the amplitude is approximately 4.1 uVrms and the peak is focused at 9.38 Hz. That's my brain! Specifically, that's my visual cortex when it's bored because my eyes are closed!<br />
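The "average the spectrum over a window, then read off the peak" step can be sketched like this. I use scipy's Welch routine as a stand-in for averaging the FFT slices; the segment length is a guess, and the square root of the PSD is only a rough proxy for the uVrms amplitude, not an exact conversion.

```python
import numpy as np
from scipy import signal

FS = 250.0

def alpha_peak(eeg, fs=FS, t_start=90.0, t_stop=118.0):
    """Mean spectrum over a time window, then the frequency and rough
    amplitude of the peak in the 7.5-12 Hz Alpha band."""
    seg = eeg[int(t_start * fs):int(t_stop * fs)]
    freqs, psd = signal.welch(seg, fs=fs, nperseg=512)
    band = (freqs >= 7.5) & (freqs <= 12.0)
    i = np.argmax(psd[band])
    return freqs[band][i], np.sqrt(psd[band][i])  # peak freq, ~amplitude

# Toy recording: a 9.5 Hz "alpha" rhythm appears at t = 88 s
t = np.arange(0, 120, 1 / FS)
eeg = np.where(t > 88, 1.0, 0.0) * np.sin(2 * np.pi * 9.5 * t)
peak_freq, peak_amp = alpha_peak(eeg)
```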
<br />
So, that's the quick fun that I had using the new OpenBCI V3 hardware and the fun that I had using Python for the first time to make decent graphs. Learning new things makes me feel pretty empowered. To celebrate, I'm going to go eat some breakfast now. Mmm...Wheat Chex...I really know how to party. ;)<br />
<br />
<u>Follow-Up</u>: Here's some additional discussion on <a href="http://eeghacker.blogspot.com/2014/10/detecting-alpha-waves-threshold.html">how to detect these Alpha waves</a><br />
<br />Chiphttp://www.blogger.com/profile/10352943033779293161noreply@blogger.com2tag:blogger.com,1999:blog-7276377053120174333.post-38297315104873791442014-10-05T22:44:00.003-04:002017-04-14T16:43:53.282-04:00Moving from Matlab to PythonFor any real data-hounds out there, you've probably noticed that I do most of my data analysis and plotting in Matlab. It's a computing and plotting package that I've been using for a long, long time (almost 20 years...yikes!). As a result, I'm very comfortable with it, which makes it nearly effortless for me to use it to explore new data. Unfortunately, Matlab is very expensive (thousands of dollars), so it's unlikely that there are very many other hobbyists that are likely to have this tool. As a result, while I have been sharing all of the Matlab EEG analysis code on <a href="https://github.com/chipaudette/EEGHacker/tree/master/Data">my GitHub</a>, it is a bit pointless since Matlab itself is so unavailable.<br />
<br />
To make my EEG analysis code more usable for other folks, I've decided to put on my big boy pants and to try to learn something new. I'm making the jump to Python. Look out!<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEivLNUXkIzOZbrtTYHjNRzXdCKcLCCutd3dUlf33swxoBjoOj0sFiIliTj0xDBJq_z9zHs7wIxOkXpfqOklFUTnqHJX6HEypVLjlpdw-gS2jrD4PsItoMG3gG9d35We8D0HIz8MWORBEiI/s1600/PythonScreenCapture.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="250" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEivLNUXkIzOZbrtTYHjNRzXdCKcLCCutd3dUlf33swxoBjoOj0sFiIliTj0xDBJq_z9zHs7wIxOkXpfqOklFUTnqHJX6HEypVLjlpdw-gS2jrD4PsItoMG3gG9d35We8D0HIz8MWORBEiI/s1600/PythonScreenCapture.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Developing Python Code in the Spyder IDE</td></tr>
</tbody></table>
<br />
The primary benefits of Python are that it's free and that it has a huge community of developers. As an added benefit to me, there are also a lot of former Matlab programmers who have made the jump to Python, which means that there is a lot of Matlab-reminiscent Python code out there for Matlab junkies like myself to use as a gateway drug to Python.<br />
<br />
When getting started in Python, it's helpful to use a pre-packaged Python distribution that includes a good selection of helpful packages along with the Python core. Based on some guidance from my buddy Rob, I decided to use the "Anaconda" distribution because it includes tried-and-true packages numpy (numerical math routines and data structures), scipy (scientific computing routines, such as filtering), and matplotlib (a collection of Matlab-style plotting routines). It's a pretty sweet distribution. You can download it for free from the <a href="http://continuum.io/downloads">Anaconda website</a>.<br />
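To give a taste of what those packages buy you, here's a minimal sketch of a Matlab-style job done with just numpy: estimating the spectrum of a synthetic alpha-like signal. This is my own illustration (the sample rate and signal levels are assumptions), not code from any OpenBCI project.

```python
import numpy as np

fs = 250.0                          # assumed sample rate (Hz)
t = np.arange(0, 4, 1 / fs)         # 4 seconds of data
rng = np.random.default_rng(0)
# synthetic "EEG": a 20 uV alpha wave at 10 Hz riding on 5 uV of noise
eeg = 20e-6 * np.sin(2 * np.pi * 10 * t) + 5e-6 * rng.standard_normal(t.size)

# Matlab-style spectrum: FFT magnitude of a Hann-windowed segment
amp = np.abs(np.fft.rfft(eeg * np.hanning(t.size)))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
peak_freq = freqs[np.argmax(amp)]
print(peak_freq)                    # the alpha peak, near 10 Hz
```

If you squint, each line has a near-direct Matlab equivalent (`fft`, `hanning`, `max`), which is exactly why the transition is survivable.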
<br />
As with any programming language, one's first experience is highly dependent upon the quality of the development environment. I've chosen to start with the Spyder IDE, which is shown in the screenshot above. I don't know if it is considered to be an especially good or especially bad IDE, but it seems to do the job. If you're a Matlab ninja turned Python super-hero, I'd love to hear what IDE you use.<br />
<br />
So, moving forward, you should expect to see me share my Python code for my upcoming EEG Hacker posts. While my Python style will be ugly (being a Python newbie, how can it be anything but ugly?), hopefully you can read it well enough to learn a few ideas (or to teach me a few ideas) on how to do some EEG processing.<br />
<br />
Wish me luck...'cause I'm jumping in!<br />
<br />Chiphttp://www.blogger.com/profile/10352943033779293161noreply@blogger.com6tag:blogger.com,1999:blog-7276377053120174333.post-14265379977632734982014-08-18T10:50:00.002-04:002017-04-14T16:44:02.077-04:00First Data with OpenBCI V3It's here! It's here! The (near) future has arrived! I have received an early OpenBCI V3 board from my friends over at <a href="http://www.openbci.com/">OpenBCI</a>. Check out the photo below. It's a little smaller than I expected. Very cool! Let's get it running and see what it can do!<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgcy9xxIPeMSiuXHpZqkawQJsmrsyH56hEDibTeZEwN3PUjU7hU39VXeCHXgQAauag_E719srwAu192Uz8H86F3GJu5Vh5LYfw4kIY11FvYNEr1h0GSU_mcaCqj2LlSCz0vWc52OdFvg7E/s1600/IMG_3039.JPG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="266" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgcy9xxIPeMSiuXHpZqkawQJsmrsyH56hEDibTeZEwN3PUjU7hU39VXeCHXgQAauag_E719srwAu192Uz8H86F3GJu5Vh5LYfw4kIY11FvYNEr1h0GSU_mcaCqj2LlSCz0vWc52OdFvg7E/s1600/IMG_3039.JPG" width="400" /></a></td></tr>
<tr><td class="tr-caption"><div style="font-size: 13px;">
The OpenBCI V3 board is smaller than I expected. Looks great!</div>
</td></tr>
</tbody></table>
<br />
<div>
Note that the V3 board includes a built-in microprocessor. This means that you no longer have to buy a separate Arduino to act as host. That's pretty sweet. OpenBCI gives you two choices of microprocessor: an ATmega328 (the <a href="https://openbci.myshopify.com/collections/frontpage/products/openbci-8-bit-board-kit">8-bit option</a>) or a PIC32 (the <a href="https://openbci.myshopify.com/collections/frontpage/products/openbci-32-bit-board-kit">32-bit option</a>). While the power of the 32-bit PIC is appealing, I chose the ATmega because that allows me to program the OpenBCI board as if it were an Arduino Uno. I hear that the 32-bit PIC version can also be programmed from the Arduino IDE, but the Uno is my friend, and I chose to stick with him. </div>
<div>
<br /></div>
<div>
Another change with the V3 version is that it has a built-in Bluetooth module. In fact, to maximize electrical safety for the user, the wireless Bluetooth link is now the only way to get data off the device in real-time (though it does have a built-in SD card for those looking to simply log data). This is quite a change...and a change for the better, in my opinion.<br />
<br />
OpenBCI says that the Bluetooth module is compatible with standard protocols (to enable connection to your mobile device) and that it has a special high-speed mode, if you have a mating BT module for your PC. To enable this high-speed mode, OpenBCI includes a BT USB dongle, which is shown at the bottom of the picture below.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjzOfyOlyYARtGJW_FQU01Yt_L_YsIoPbfayZPJxbq89vpXA7JiYrSeeXtzaTEksBPMFNUpr7Jr6LUFDtx823eGchAozIkbS1Rw4z6pK8u0g3Q1Fiy3Z9IEyPeIdW-wuYDIGloBv1OFFjI/s1600/IMG_3046.JPG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="266" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjzOfyOlyYARtGJW_FQU01Yt_L_YsIoPbfayZPJxbq89vpXA7JiYrSeeXtzaTEksBPMFNUpr7Jr6LUFDtx823eGchAozIkbS1Rw4z6pK8u0g3Q1Fiy3Z9IEyPeIdW-wuYDIGloBv1OFFjI/s1600/IMG_3046.JPG" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Using the new OpenBCI V3 board to record my ECG. I used one disposable ECG electrode<br />
on each wrist. The OpenBCI board was powered by a 9V battery, which is in the black<br />
battery case. The OpenBCI board transferred the data to the PC via Bluetooth.<br />
OpenBCI includes a BT dongle for the PC, which is shown in the bottom-left.</td></tr>
</tbody></table>
</div>
<div>
<div>
<br /></div>
<div>
Once I got my hardware, I started in. Unfortunately, as part of the deal with me getting this hardware so early, the software to run the hardware is not yet complete. So, I had a little work to do. <br />
<br />
When diving into new hardware, it's best to take baby steps. Start from something that works and then add features incrementally, with lots of tests along the way. This makes it much easier to identify and squash the bugs as they pop up. For my first work with OpenBCI V3, here's my approach:<br />
<br />
<ol>
<li>Test the wireless link using pre-defined dummy data</li>
<li>Test getting data from the ADS1299 using its built-in test signals</li>
<li>Test getting real data from the ADS1299 by recording my ECG</li>
<li>Test the full system by recording my EEG</li>
</ol>
<br />
<u>Dummy Data:</u> To get this process started, Joel (of OpenBCI) provided some example code that exercised the wireless link using dummy data. He even pre-loaded this software on to the V3 board for me. So, all I had to do was plug in the USB BT dongle, connect a battery to the OpenBCI board, and I was good to go. I started my Terminal program on my PC and was immediately interacting with the OpenBCI V3 board. It correctly transferred the pre-defined dummy data. Success!<br />
<br />
<u>Built-In Test Signals:</u> Building from this working code base and from OpenBCI's initial code for configuring the ADS1299 EEG chip, I added the ability to grab data from the ADS1299 and send it out the wireless link. I started with the ADS1299's built-in test signals. After some fiddling with 24-bit vs 32-bit number formats (and then discovering that Joel already programmed the solution for me), I got the nice square wave signal as shown below. This proves that I can communicate with the ADS1299 chip and that I've got all the number formats correct. Success again!<br />
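For anyone hitting the same number-format snag: the ADS1299 sends each channel's sample as three bytes of 24-bit two's complement, which must be sign-extended before it fits in an ordinary signed integer. Here's a sketch of the idea in Python (hypothetical helper name; not Joel's actual fix, which lives in the Arduino-side code):

```python
def ads1299_sample_to_int(b_hi, b_mid, b_lo):
    """Assemble a 24-bit two's-complement ADS1299 sample (MSB first)
    into a signed integer by extending the sign bit."""
    raw = (b_hi << 16) | (b_mid << 8) | b_lo
    if raw & 0x800000:          # bit 23 set means the value is negative
        raw -= 1 << 24
    return raw

print(ads1299_sample_to_int(0xFF, 0xFF, 0xFF))  # -1
print(ads1299_sample_to_int(0x7F, 0xFF, 0xFF))  # 8388607 (full-scale positive)
```

Skip the sign extension and every negative voltage shows up as a huge positive count, which makes for some very exciting-looking (and very wrong) EEG.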
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjSKW0TBzStYbPYJnVIRc3Z_99u6_-XkMRrbte2OheNgFd1MgmokiaIteeB8CwvnFDGwejIvzlzP3aME6s_NqwsYCD6Q_bNpth1o_OYp-CSBwR0g1TPvg1-K5kg1p4wxldacwWoa1aR9vY/s1600/TestSig.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="257" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjSKW0TBzStYbPYJnVIRc3Z_99u6_-XkMRrbte2OheNgFd1MgmokiaIteeB8CwvnFDGwejIvzlzP3aME6s_NqwsYCD6Q_bNpth1o_OYp-CSBwR0g1TPvg1-K5kg1p4wxldacwWoa1aR9vY/s1600/TestSig.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Data from my OpenBCI V3 board...this is a built-in test signal being generated<br />
by the ADS1299 EEG chip. Since it looks beautiful, it means that I have confirmed<br />
that I can configure the chip and that all my number formats are correct.</td></tr>
</tbody></table>
<br />
<u>ECG Data:</u> As you may know from my <a href="http://eeghacker.blogspot.com/2013/11/collecting-ecg-with-my-eeg-setup.html">previous "getting started" post</a>, I like to start my collection of real data by recording my ECG (ie, heart signals). I do this because ECG signals are so much stronger and simpler than EEG signals. Having strong signals makes it more obvious when the system is working correctly (and when it is not working correctly). So, I got out my disposable ECG electrodes, put one on each wrist, and started recording. As you can see in the plot below, I got a nice sample of my ECG. My code for OpenBCI V3 might be rough, and it's definitely not feature-complete, but it does work.<br />
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhNiv9gDdV9M3Pt3nUNlO8wL_sA7bt8uH7t4ECsQl0dXZwuaY49y7ERx0RuYdrPzrejcXxPbS5a6djJZc5ONn728ay8sCpz8uOtN2Lno0WspQcjr12Kl6nEtDl8agL1hU3i-wrL2Wp6Yrc/s1600/ECG+on+OpenBCI+V3.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="258" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhNiv9gDdV9M3Pt3nUNlO8wL_sA7bt8uH7t4ECsQl0dXZwuaY49y7ERx0RuYdrPzrejcXxPbS5a6djJZc5ONn728ay8sCpz8uOtN2Lno0WspQcjr12Kl6nEtDl8agL1hU3i-wrL2Wp6Yrc/s1600/ECG+on+OpenBCI+V3.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">My wrist-to-wrist ECG looks pretty good.</td></tr>
</tbody></table>
<div>
<div>
<br /></div>
<div>
The sample of data shows that my heart rate was about 80 beats per minute, which is a little high for simply sitting in a chair at my computer. Maybe I was just excited to be having success playing with the new hardware! I'm like that.<br />
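As a sanity check on a trace like this, the heart rate can be estimated just by counting R-peaks. Here's a crude sketch of the idea (my own illustration, with an assumed threshold, not my actual analysis code):

```python
import numpy as np

def heart_rate_bpm(ecg, fs):
    """Estimate heart rate by counting upward threshold crossings of
    the R-peaks (assumes upright R-waves that dominate the trace)."""
    above = ecg > 0.6 * np.max(ecg)                      # assumed threshold
    beats = np.count_nonzero(above[1:] & ~above[:-1])    # rising edges
    return 60.0 * beats / (len(ecg) / fs)

# quick check on a synthetic spike train at 80 beats per minute
fs = 250.0
t = np.arange(0, 15, 1 / fs)
fake_ecg = (np.sin(2 * np.pi * (80 / 60) * t) > 0.99).astype(float)
print(heart_rate_bpm(fake_ecg, fs))   # 80.0
```

A fixed 60% threshold is fragile on real, drifting ECG, but on a clean wrist-to-wrist recording like this one it gets you in the right ballpark.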
<br />
So, returning to my four-step process described earlier, I've got 3 of the steps completed. Before I do the last step -- collecting actual EEG data -- I'd like to revise the code a little more. Right now, I cannot view the streaming data in real time because the data format is a little different than before. To move forward, I need to adjust the code in my Processing GUI so that it can interpret the data packets and plot the data in real time. Once I get that to happen, I'll hook up some electrodes to my head and maybe <a href="http://eeghacker.blogspot.com/2014/06/controlling-hex-bug-with-my-brain-waves.html">make my robot dance some more</a>! Wish me luck!<br />
<br />
<u>Follow-Up</u>: Raw data and analysis code is <a href="https://github.com/chipaudette/EEGHacker/tree/master/Data/2014-08-17%20First%20V3%20Data">here</a>.<br />
<div>
<u>Follow-Up:</u> First EEG Data from the V3 Board is <a href="http://eeghacker.blogspot.com/2014/10/first-alpha-with-openbci-v3.html">here</a>.</div>
</div>
</div>
</div>
Chiphttp://www.blogger.com/profile/10352943033779293161noreply@blogger.com17tag:blogger.com,1999:blog-7276377053120174333.post-52917021209024657472014-06-08T11:59:00.001-04:002015-03-03T12:54:17.305-05:00Controlling a Hex Bug with my Brain WavesEver since my effort with OpenBCI began, I've been looking to control <i>something</i> with my brain. Sure, a while back, I was successful in <a href="http://eeghacker.blogspot.com/2013/11/openbci-alpha-wave-detector.html">lighting an LED with my brain waves</a>, but that's pretty simple. I wanted something more. And now I can do it. I can control a robot with my mind! Yes!<br />
<br />
<div style="text-align: center;">
<iframe allowfullscreen="" frameborder="0" height="360" src="//www.youtube.com/embed/RtM2R-L25W0" width="480"></iframe>
</div>
<br />
<u>Approach:</u> My robot has just a few actions that it can do...turn left, turn right, walk forward, and fire. To make this brain-controlled, I need a way to invoke these commands using signals from my brain. Ideally, I'd just think the word "Fire!" and the robot would respond. Unfortunately, those kinds of brain waves are too hard to detect. Instead, I need to use brain waves that are easy to detect. For me, "easy" brain waves include the Alpha waves (10 Hz oscillations) that occur when I close my eyes, as well as the brain waves that occur when I watch my blinking movies (a.k.a. <a href="http://eeghacker.blogspot.com/2014/05/controlling-entrainment-through.html">visual entrainment</a>). So, my approach is to use OpenBCI to record my brainwaves, to write software to detect these specific types of brain waves, and to issue commands to the robot based on which brain waves are detected.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgxtKRRl3EOP4_NfteATgq0NnrzoMpxt9yRDE8UDuUA_w7wnB2kJO0UbFQxdogFAqVqvbVBHRpyK1jhA-IJpYSIJYOQyX8p28GebOqZcNs160uCYPOFNVTCTgOOPG46PZ_z5jq5yo5xf5s/s1600/Setup.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgxtKRRl3EOP4_NfteATgq0NnrzoMpxt9yRDE8UDuUA_w7wnB2kJO0UbFQxdogFAqVqvbVBHRpyK1jhA-IJpYSIJYOQyX8p28GebOqZcNs160uCYPOFNVTCTgOOPG46PZ_z5jq5yo5xf5s/s1600/Setup.png" height="375" width="540" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Here are all the pieces that you see in the video</td></tr>
</tbody></table>
<u>Hardware Setup:</u> The core hardware for this hack is similar to my usual OpenBCI setup: <a href="http://www.neuro-source.com/product.php?id_product=544">EEG electrodes</a>, an <a href="http://www.openbci.com/">OpenBCI</a> board, an <a href="http://arduino.cc/en/Main/arduinoBoardUno">Arduino Uno</a>, and my computer. Added to this setup is the <a href="http://www.radioshack.com/product/index.jsp?productId=31269256">Hex Bug</a> itself and its remote control, which <a href="http://eeghacker.blogspot.com/2014/05/arduino-control-of-hex-bug.html">I hacked</a> so that the remote can be controlled by an Arduino. So, as shown below, my brain wave signals go from my head all the way to the PC. The PC processes the EEG data looking for the Alpha waves or the visually-entrained waves. If any are detected, it decides what commands to give the robot. The commands are conveyed back to the Arduino, which then drives the remote control, which the Hex Bug receives over its usual IR link. <br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEihGOEXjZ78UWoW-fMyS3yYw9G43yXzhEWNEL7ZVhgBAt7axQ-gPW87W0_ImhDGIXr4HYC2lneykrxVHjq4vBVRsMlyavMebd9iqQ1NE0_Eh_DW_bbgxxyLz59ySQf1LLnGjQ9Wsa2AbMA/s1600/Setup-Schematic2.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEihGOEXjZ78UWoW-fMyS3yYw9G43yXzhEWNEL7ZVhgBAt7axQ-gPW87W0_ImhDGIXr4HYC2lneykrxVHjq4vBVRsMlyavMebd9iqQ1NE0_Eh_DW_bbgxxyLz59ySQf1LLnGjQ9Wsa2AbMA/s1600/Setup-Schematic2.png" height="299" width="540" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Here is the schematic of how the pieces work together.</td></tr>
</tbody></table>
<br />
<div>
<u>EEG Setup:</u> I'm going to be measuring my Alpha waves and I'm going to be measuring the brain waves induced through visual entrainment. Based on my previous experience, I know that both are best recorded using an electrode on the back of the head (at the "O1" position, if you're into your <a href="http://en.wikipedia.org/wiki/10-20_system_(EEG)">10-20 electrode placement</a> standard). I do not need electrodes all over my head. That's the only <i>sensing </i>electrode that I'm using. That's it. Of course, EEG also requires a reference electrode, which I put on my left earlobe. And, finally, EEG often has a third electrode ("bias" or "driven ground"), which I placed on my right earlobe.<br />
<br />
<u>Looking at the Frequency of my Brain Waves:</u> As mentioned above, my approach is to control my robot by detecting Alpha waves and by detecting visually-entrained brain waves. These are easily detectable because they occur at specific frequencies. Alpha waves occur around 10 Hz and the visually-entrained brain waves occur at the blink rate(s) of whatever movies I use (<a href="http://eeghacker.blogspot.com/2014/05/controlling-entrainment-through.html">my best results</a> were from 5 Hz and 7.5 Hz movies). So, to control my robot, I will be looking for EEG signals at these frequencies: 5 Hz, 7.5 Hz, and 10 Hz. I'm going to "look" for them by writing EEG processing software that examines the frequency content of my EEG signal.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgDOANKuscbqTH-QiWJrVmhAK6d0h96TjdOY7oMWxpbY42jeAYgNVJeJVgc_euBdy4V4rwQ2jWN3Z8Ozkq3QOi-2O7uDZ4jSDH3iBW_T9FT1Etb6BfShtKGnHaXqd4nGFAGnCdCgmovyvc/s1600/EEG+Processing.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgDOANKuscbqTH-QiWJrVmhAK6d0h96TjdOY7oMWxpbY42jeAYgNVJeJVgc_euBdy4V4rwQ2jWN3Z8Ozkq3QOi-2O7uDZ4jSDH3iBW_T9FT1Etb6BfShtKGnHaXqd4nGFAGnCdCgmovyvc/s1600/EEG+Processing.png" height="158" width="504" /></a></div>
<br /></div>
<div>
<u>EEG Processing:</u> The flow chart above shows the steps that I use to process the EEG signal (my software is <a href="https://github.com/chipaudette/OpenBCI/tree/variant_hexBugControl_visualEntrainment/Processing_GUI/OpenBCI_GUI">here</a>). Once the PC gets EEG data from the OpenBCI board, the first step is to compute the spectrum of the signal, which tells me the content of the EEG signal as a function of frequency. I then search through the relevant part of the spectrum (4-15 Hz) to find the peak value. I note both its frequency value and its amplitude. In parallel, I also compute the average EEG amplitude across the 4-15Hz frequency band. This average value is my baseline for deciding whether my peak is tall (strong) or short (weak). By dividing the amplitude of my peak by this baseline value, I get the signal-to-noise ratio (SNR) of the peak. The SNR is my measure of the strength of the peak. The outputs of the EEG processing, therefore, are two values: the frequency of the peak and the SNR of the peak.<br />
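In code, that processing chain boils down to something like the sketch below. My actual GUI does this in Processing; this is a Python re-sketch with illustrative names, not the real code.

```python
import numpy as np

def find_peak_and_snr(eeg, fs, f_lo=4.0, f_hi=15.0):
    """Locate the tallest spectral peak in [f_lo, f_hi] and score it
    against the band's mean amplitude, returning (freq_Hz, snr_dB)."""
    amp = np.abs(np.fft.rfft(eeg * np.hanning(len(eeg))))
    freqs = np.fft.rfftfreq(len(eeg), 1 / fs)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    i_peak = np.argmax(amp[band])                    # peak within the band
    snr_dB = 20 * np.log10(amp[band][i_peak] / np.mean(amp[band]))
    return freqs[band][i_peak], snr_dB

# a pure 10 Hz test tone should score as a strong alpha-band peak
fs = 250.0
t = np.arange(0, 4, 1 / fs)
freq, snr = find_peak_and_snr(np.sin(2 * np.pi * 10 * t), fs)
print(freq, snr)    # peak near 10 Hz with a large SNR
```

Note that the baseline here includes the peak itself, which makes the SNR a little pessimistic; that bias is the same for every detection, so it washes out when you pick a threshold empirically.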
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhUC8D1SlTHy9PmqMVuxqiQ4qM8xyvaHcrkvEqf7NbqUxYgUhKVSICnsYQD4OQd2rZCsbKamfQKUnG7uh5Wc9I0W5Fi72peSgwV7L1Fv6NBrP4Lx04CsoVei84-WW-BbdYpDuoo-5Kugfg/s1600/DecideCommand.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhUC8D1SlTHy9PmqMVuxqiQ4qM8xyvaHcrkvEqf7NbqUxYgUhKVSICnsYQD4OQd2rZCsbKamfQKUnG7uh5Wc9I0W5Fi72peSgwV7L1Fv6NBrP4Lx04CsoVei84-WW-BbdYpDuoo-5Kugfg/s1600/DecideCommand.png" height="250" width="540" /></a></div>
<br />
<u>Deciding My Robot's Action:</u> Once my EEG processing finds the frequency and SNR of the peak in my EEG spectrum, I now have to decide how to act on that information. After some trial and error, I settled on the algorithm shown in the flow chart above. It's got three steps:<br />
<ul>
<li><u>SNR Check:</u> First, I decide whether the current peak in the spectrum is legitimate, or if it is likely to be just noise. I don't want to issue a command if it is just noise because then my robot will be taking all sorts of actions that I didn't intend. That is not what I want. So, to decide if the peak is likely to be legitimate, I look at the SNR of the peak. If it has a big SNR, I'll accept it as a legitimate peak. If it is too small, I'll take no further action. Right now, my threshold for this decision is at 6 dB. Setting a higher threshold results in fewer false commands (which would be good), but it also makes the system less sensitive to legitimate commands (which is bad). This 6 dB threshold resulted in an OK (but not great) balance.</li>
</ul>
<ul>
<li><u>Frequency Check:</u> If the peak seems legitimate, I decide how to command the robot based on the frequency of the peak. If the peak is between 4.5-6.5 Hz, I must be looking at the right-side of my 2-speed blinking movie (ie, the portion that blinks at 5 Hz), so the computer prepares the "Turn Right" command. Alternatively, if the EEG peak is 6.5-8.5 Hz, I must be looking at the left-side of my 2-speed blinking movie (ie, the portion that blinks at 7.5 Hz), so it prepares the "Turn Left" command. Finally, if the EEG peak is 8.5-12 Hz, it must be my eyes-closed Alpha waves, so the computer prepares the "Move Forward" command.</li>
</ul>
<ul>
<li><u>New Command Check</u>: Before issuing the command, I check to see whether this command is the same as the last command that was extracted from my brain waves. If the latest command is <i>different,</i> I hijack the command and, instead, issue the "Fire!" command. If the latest command is the same, I go ahead and issue the left / right / forward command like normal. The reason for this hijack is that I have no other type of easily-detected brain wave that I can use for commanding the robot to fire. This approach of issuing "Fire!" on every <i>change</i> in command seemed like a decent way of getting a 4th command out of 3 types of brain waves.</li>
</ul>
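The three checks above can be sketched as one small decision function (hypothetical command names; the real logic lives in my Processing GUI):

```python
def decide_command(peak_freq, snr_dB, last_cmd, snr_thresh_dB=6.0):
    """Map a detected spectral peak to a Hex Bug command using the
    SNR, frequency, and new-command checks described above.
    Returns (command_to_send, new_last_cmd); None means 'do nothing'."""
    if snr_dB < snr_thresh_dB:
        return None, last_cmd                 # peak is likely just noise
    if 4.5 <= peak_freq < 6.5:
        cmd = "TURN_RIGHT"                    # 5 Hz blinking movie
    elif 6.5 <= peak_freq < 8.5:
        cmd = "TURN_LEFT"                     # 7.5 Hz blinking movie
    elif 8.5 <= peak_freq <= 12.0:
        cmd = "FORWARD"                       # eyes-closed Alpha waves
    else:
        return None, last_cmd                 # peak outside all bands
    if last_cmd is not None and cmd != last_cmd:
        return "FIRE", cmd                    # any change of command fires
    return cmd, cmd
```

Threading the detected command through as `last_cmd` state is what makes the "fire on change" trick possible with only three detectable brain-wave types.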
<u>Putting It All Together:</u> As you can see in the movie, I was eventually able to get all of these pieces working together to allow me to command the Hex Bug using just my brain waves. Of course, it didn't work the first time. Even once I got all the hardware working, I still needed to tune a bunch of the software parameters (FFT parameters and the detection threshold) until I got something that worked somewhat reliably. To help with this tuning process, I used the spectrum display that is in my Processing GUI. Some screen shots are below.<br />
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjN_tqx-fUDS4zaEjsoGmuum8Uf9DyWciH-IwzeyRV3SJncf-RmzoJtDPH-9OiF-Dl5cV0cKkrYDTzuYfO9hPEq7O3AuOyy4LeoE9Y6HcypqgD9pq000KyxGPZKprXrNCj1mBJ-a9gm49o/s1600/Spectrum-TurnRight.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjN_tqx-fUDS4zaEjsoGmuum8Uf9DyWciH-IwzeyRV3SJncf-RmzoJtDPH-9OiF-Dl5cV0cKkrYDTzuYfO9hPEq7O3AuOyy4LeoE9Y6HcypqgD9pq000KyxGPZKprXrNCj1mBJ-a9gm49o/s1600/Spectrum-TurnRight.png" height="240" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Example EEG spectrum when I stared at the right side of my two-speed blinking<br />
movie. It induced 5 Hz brain waves. I programmed 5 Hz to mean "Turn Right".<br />
The SNR here is between 6 and 7 dB.</td></tr>
</tbody></table>
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjxJni2CpxtxJdekCV-opc37OUtsDoFRquZy79Eela8KgcMKP19wTPIJW5dMUWSQK4_iqa43hbGCR9vFD-xMvW0HZNbuuShR5r3qzKKA3ABFL8w8yf04gvhXKXuNqypbPlavW2h0KZDbAM/s1600/Spectrum-TurnLeft.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjxJni2CpxtxJdekCV-opc37OUtsDoFRquZy79Eela8KgcMKP19wTPIJW5dMUWSQK4_iqa43hbGCR9vFD-xMvW0HZNbuuShR5r3qzKKA3ABFL8w8yf04gvhXKXuNqypbPlavW2h0KZDbAM/s1600/Spectrum-TurnLeft.png" height="233" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Here's an example EEG spectrum when I stared at the left side of my two-speed <br />
blinking movie. It induced 7.5 Hz brain waves. When the GUI detected 7.5 Hz, <br />
it issued a "Turn Left" command to the Hex Bug. The SNR is only 6-7 dB.</td></tr>
</tbody></table>
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi8v20-kFppUry-IigTHN9dxLHAAN4ZxbwZb17JMP-6frStl3qsYL8I_i-zfGHIMXHSrBVvu4eq2-yaWojT5SqqTvNzP18OzLXEYbcLBjvFiUzs2I1yyVjJWGPWYo8jgimXJzHpcLk53S0/s1600/Spectrum-Forward.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi8v20-kFppUry-IigTHN9dxLHAAN4ZxbwZb17JMP-6frStl3qsYL8I_i-zfGHIMXHSrBVvu4eq2-yaWojT5SqqTvNzP18OzLXEYbcLBjvFiUzs2I1yyVjJWGPWYo8jgimXJzHpcLk53S0/s1600/Spectrum-Forward.png" height="233" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Finally, here's an example EEG spectrum with my eyes closed so that I was<br />
exhibiting Alpha waves, which are near 10 Hz. When it detected 10 Hz, I<br />
programmed it to issue a "Forward" command. The SNR is > 8 dB.</td></tr>
</tbody></table>
<br />
<u>Weak Peaks:</u> In the screenshots above, the red line shows the current EEG spectrum. The heavy black circle shows the spectral peak that my software algorithms have detected. The black dashed line is the "background noise" from which the SNR is computed. To be declared a legitimate detection, the peak must be 6 dB higher than the black dashed line (unfortunately, I don't show this on the plot...sorry!). As can be seen, the 5 Hz and 7.5 Hz examples are not very strong (the SNR is only 6-7 dB). Other peaks within the plots are very close to being the same size, which would cause false commands to be sent to the robot. In my movie at the top of this post, there were several false commands. <br />
<br />
<u>Balancing Sensitivity with False Commands:</u> To reduce the number of false commands, I could raise my detection threshold above 6 dB. Unfortunately, as seen in the first two spectrum plots above, my 5 Hz and 7.5 Hz peaks are usually pretty weak (< 7 dB). Therefore, any attempt to raise my detection threshold above 6 dB would cause me to no longer detect my legitimate brain waves. I know because this is exactly the tuning process that I tried. Bummer! So, if I want more reliable performance, I'll need to develop fancier signal processing beyond this simple FFT-threshold approach. Future challenges!<br />
<br />
<u>Wrapping Up:</u> Even with the false commands seen in my movie, I was still able to command the robot to move around the table. I could get it to go (roughly) where I wanted it to go. And, I did it all with just my brain waves. I think that this is pretty exciting! Yay! What are the next steps? Well, now that I have this under my belt, maybe I can move on to controlling a <a href="http://www.airswimmers.com/">flying fish</a>, or maybe a <a href="http://www.hobbyking.com/hobbyking/store/__35630__Walkera_QR_Infra_X_Micro_Quadcopter_w_IR_and_Altitude_Hold_Mode_1_RTF_.html">quadcopter</a>! Do you have any other cool ideas for things I can control with my brain?<br />
<br />
<u>Coolness:</u> This hack got picked up by IEEE Spectrum as part of an article on OpenBCI. Cool! Check it out <a href="http://spectrum.ieee.org/biomedical/devices/building-mindcontrolled-gadgets-just-got-easier">here</a>.<br />
<u><br /></u>
<u>More Coolness:</u> This hack also got picked up by <a href="http://www.wired.com/2014/08/mind-controlled-robot/">Wired</a>. Fun!<br />
<br />
<u>Follow-Up</u>: I got to share this hack with Joel and Conor of OpenBCI. You can see their luck with controlling the robot <a href="http://eeghacker.blogspot.com/2014/10/sharing-brain-controlled-hex-bug.html">here</a>.<br />
<br />
<u>Follow-Up:</u> We used a similar approach to get a 5-person team to brain-control a swimming shark balloon. It's cool. Check it out <a href="http://eeghacker.blogspot.com/2015/03/brain-controlled-shark-attack.html">here</a>.<br />
<br /></div>
Chiphttp://www.blogger.com/profile/10352943033779293161noreply@blogger.com44tag:blogger.com,1999:blog-7276377053120174333.post-10681484695668150292014-05-15T07:27:00.004-04:002015-01-03T10:58:14.592-05:00Arduino Control of a Hex BugBased on <a href="http://eeghacker.blogspot.com/2014/05/controlling-entrainment-through.html">my previous success</a> with visual entrainment, I'm moving forward with my plans for making a brain-computer interface (BCI) using my blinky lights. To really kick this project into high gear, though, I need a good goal -- I need something that would be really fun to control with my brain. Luckily, my <a href="http://www.openbci.com/">OpenBCI </a>friend <a href="http://www.openbci.com/people/">Conor</a> found these cool <a href="http://www.radioshack.com/product/index.jsp?productId=31269256">remote-controlled 6-legged robots</a> that can walk around and fire a little gun. Bingo! As you can see below, today's post shows how to hack the robot's remote to make it controllable from an <a href="http://www.arduino.cc/">Arduino</a>. Once it is controllable from an Arduino, it's only one more step until it's controllable from my brain!<br />
<br />
<div style="text-align: center;">
<iframe allowfullscreen="" frameborder="0" height="360" src="//www.youtube.com/embed/hRHOhHUqdoM" width="480"></iframe>
</div>
<br />
<u><br /></u>
<u>The Hex Bug Battle Spider:</u> The remote-controlled robot that Conor found is a <a href="http://www.hexbug.com/mechanical/battlespider/">Hex Bug Battle Spider</a>. They are available in two colors and you can have them do battle. Hex Bug makes smaller and cheaper versions of this robot, but I believe that only the Battle Spider has the ability to do battle. I'm looking forward to facing off cerebro-a-cerebro with Conor, so I'm sticking with the Battle Spider.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhKUWm7ti1D_9eDKykQLbKXRrtUIyNyVUaJxHkNDETCpkH0Qp-B52-qAKk0fbUpeyv3kdiMyJrBWIFT6HRfrV8dMiAZGxgWU8IbYk6On1yU8NVQAsxhYT6LlwZtK4EJtgGGrysGio1Jhrc/s1600/IMG_1164.JPG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhKUWm7ti1D_9eDKykQLbKXRrtUIyNyVUaJxHkNDETCpkH0Qp-B52-qAKk0fbUpeyv3kdiMyJrBWIFT6HRfrV8dMiAZGxgWU8IbYk6On1yU8NVQAsxhYT6LlwZtK4EJtgGGrysGio1Jhrc/s1600/IMG_1164.JPG" height="266" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Hex Bug Battle Spider - My Hacking Target for Today</td></tr>
</tbody></table>
<br />
<u>IR Remote Control:</u> The Hex Bug is commanded using an infrared (IR) remote control. It is the remote control that I will hack so that the Hex Bug can be commanded from an Arduino. As you can see in the picture below, I was so anxious to hack the remote that I never got a picture of it while it was still in one piece...I just couldn't wait to smell the solder! Regardless, as you can see, the remote is simply a bunch of plastic pieces, a couple of coin cell batteries (not shown), and a printed circuit board (PCB).<br />
<div>
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjdzeSW9CzIvzlmP170grc-DbrBu0_B34V91dTufotrZlCchjU4hIUzGFy4RogBvJX0HNO0k_byk5jWD_dm4ZY6bl50LJf6iStW0lmn3k3Ot7Isi95VqNoslLMpHjoWk00CtgEkgbuV9lc/s1600/IMG_1170.JPG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjdzeSW9CzIvzlmP170grc-DbrBu0_B34V91dTufotrZlCchjU4hIUzGFy4RogBvJX0HNO0k_byk5jWD_dm4ZY6bl50LJf6iStW0lmn3k3Ot7Isi95VqNoslLMpHjoWk00CtgEkgbuV9lc/s1600/IMG_1170.JPG" height="266" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">The Hex Bug Infrared Remote Control (in Pieces)</td></tr>
</tbody></table>
<br />
<u>Thank You, Test Points!</u> Flipping over the PCB, I was very pleased to see that this board has test points for <i>everything</i>. Oh, the joy! Because of these test points, it is much easier to probe the board with my multimeter or with an oscilloscope to figure out how this thing works. The test points also make it much easier to attach wires to control this thing from the outside (like from my Arduino). It turns out that there is a test point for each of the four user buttons on this board. The relevant test points are shown in the picture below. These test points will be the focus of my work.<br />
<div>
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgy0BMg1-XLYQCkKWyVgmJuhwHacxozf6F4LX8QZoZJS7DnzM_vMjGJMwNVLZWaGwHp53zu5VWDJMwxBBbFErAR_bTtRqYGHYRnYa44fogJjLGUAmFO-PTGtyf58GZ5IwCuwuuGiebrf_Q/s1600/TopOfBoard.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgy0BMg1-XLYQCkKWyVgmJuhwHacxozf6F4LX8QZoZJS7DnzM_vMjGJMwNVLZWaGwHp53zu5VWDJMwxBBbFErAR_bTtRqYGHYRnYa44fogJjLGUAmFO-PTGtyf58GZ5IwCuwuuGiebrf_Q/s1600/TopOfBoard.png" height="268" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">All those test points enable easy hacking. The test points<br />
with the arrows are for the user buttons.</td></tr>
</tbody></table>
<br />
<u>How the Buttons Work:</u> After a bit of probing of these test points, I learned that the buttons work like the buttons in most small devices (see the <a href="http://arduino.cc/en/tutorial/button">"Button" demo</a> by the Arduino folks for more info). The buttons on this remote control are simply switches that are normally open-circuit. When you press a button, the switch becomes closed-circuit. The "low" side of each button is tied to ground. The "high" side is connected to 3.3V via a pull-up resistor (probably inside the microcontroller). The microcontroller is continually sensing the voltage on the high side of the switch. When the button <i>is not</i> pressed, no current flows through it or the pull-up resistor, so the voltage seen by the micro is high. When the button<i> is </i>pressed, current flows through the button, which drops the voltage seen by the micro. As a result, the micro knows that the button was pressed. Easy!<br />
<br />
<u>Hacking Approach:</u> Based on this discussion, it is clear that the microcontroller on the remote control knows nothing about the buttons...it only knows about the voltage being controlled by the buttons. When one of those lines goes from high to low, the microcontroller thinks that a button has been pressed. My hacking approach, therefore, is to wire the Arduino to the remote control so that the Arduino can pull the lines low for me. This requires me to attach a wire to the high side of each button and to attach a wire to the remote control's ground. Normally, I'll keep the Arduino's pins in a high-impedance state so that no current flows. When I want it to "press" a button for me, I'll command the relevant pin to go into a low-impedance state to allow it to conduct current to ground. Electrically, this will mimic the behavior of the buttons themselves. No extra components will be necessary!<br />
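To make the idea concrete, here is a toy model of one button line, written in Python purely for illustration (the 3.3V supply comes from my probing; the function and its names are mine, not from any real firmware):

```python
# Toy model of one button line on the remote. The remote's micro only
# sees a voltage; it cannot tell a finger from an Arduino pulling low.

V_SUPPLY = 3.3  # volts on the high side of the pull-up resistor

def line_voltage(button_pressed, arduino_drives_low):
    """Voltage the remote's microcontroller sees on the button line."""
    if button_pressed or arduino_drives_low:
        return 0.0       # line tied to ground -> micro reads "pressed"
    return V_SUPPLY      # pull-up holds the line high -> "not pressed"
```

When idle, the Arduino pin stays high-impedance and the pull-up wins; to "press" a button, the Arduino drives the line low, which looks identical to a finger on the button.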
<br />
<u>Connecting Ground:</u> OK, we're mostly done talking. Now let's start soldering. First, I connected a wire to the remote's ground. After looking around the PCB, I decided that I liked the solder that was on the low side of the "fire" button. So, as you can see below, I sneaked a black wire into that location and soldered it in place.<br />
<div>
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgpkxB17LHzaBmHYkRwH7uSk-Trv5Q4jmBsl3gKXjIYCsOGl_wvAHUZHJa9084tW26N_D2RF1gNGzWd1ESZPklkyc3j_BMgXExwC3xNqLpCt3kkiVNCAyH7gMx09hiT8H7od9ee9Ka7r4s/s1600/GroundWire.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgpkxB17LHzaBmHYkRwH7uSk-Trv5Q4jmBsl3gKXjIYCsOGl_wvAHUZHJa9084tW26N_D2RF1gNGzWd1ESZPklkyc3j_BMgXExwC3xNqLpCt3kkiVNCAyH7gMx09hiT8H7od9ee9Ka7r4s/s1600/GroundWire.png" height="270" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">A blurry picture showing where I soldered a black wire to<br />
attach to the remote control's ground.</td></tr>
</tbody></table>
<br />
<u>Connecting Each Button:</u> Then, I flipped the board over and soldered a colored wire to each button's test point.<br />
<div>
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjZyeRPGugOT9KcazXiQfQMBStqr-TSurq5QUWTKW4vz8H5HcDxl3ucJqkmwZG45HfowhXgo9WCKLe6k5pjBVHB6OfqFfVj6fNLm4oq2mIgMznpmFBAaG9kFs61Xvhx1msag8i3F2ZoX2o/s1600/IMG_1180.JPG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjZyeRPGugOT9KcazXiQfQMBStqr-TSurq5QUWTKW4vz8H5HcDxl3ucJqkmwZG45HfowhXgo9WCKLe6k5pjBVHB6OfqFfVj6fNLm4oq2mIgMznpmFBAaG9kFs61Xvhx1msag8i3F2ZoX2o/s1600/IMG_1180.JPG" height="267" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">All of my wires are now soldered to the test points.</td></tr>
</tbody></table>
<br />
<u>Snip a Pass-Through for the Wires:</u> While it is not necessary to do this, I like the idea of re-assembling the remote so that I can use it with my fingers (as if it were not modified) or so that I can use it with the Arduino. To enable the reassembly of the remote, one simply has to cut a hole in the plastic housing to get the wires out. I used a "nibbler" tool to cut a small hole. As you can see below, it worked really well!<br />
<div>
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgrvTbqC_Tj33sKTCVUUopjl8x3LzeKD1_A9P-1CLNnX8LJJ2xFf65zBgMFLdVqFdrY3n1g4k8DK21TnD4SrgX3HQGZjuGzdLiH2bxKL6p03tvJf3p1JXFjZw4CLwn-_cc5ppBfhP85DBs/s1600/IMG_1184.JPG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgrvTbqC_Tj33sKTCVUUopjl8x3LzeKD1_A9P-1CLNnX8LJJ2xFf65zBgMFLdVqFdrY3n1g4k8DK21TnD4SrgX3HQGZjuGzdLiH2bxKL6p03tvJf3p1JXFjZw4CLwn-_cc5ppBfhP85DBs/s1600/IMG_1184.JPG" height="266" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">For extra credit, I used a nibbler to cut a hole in the plastic housing so that<br />
I can get the wires out, even after I fully re-assemble the remote control.</td></tr>
</tbody></table>
<br />
<u>Attach a Pin Header:</u> To ease the connection of these 5 wires to the Arduino, I decided to solder the free ends of the wires to a piece of <a href="https://www.sparkfun.com/products/117">basic pin header</a>. With these pins, I can easily insert the five wires as a single unit into the sockets on the Arduino board.<br />
<div>
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhijn2IvAFuB7yVI-RDHgiK216erlTVGmOeUk_ArMv28tKO2cu7m1wkqxz2U4m2Lt01Jt3A2r0X7gySG2j5YVoOgEf0sTYXb0p6bdimKAOCcmweots5A_ykG2qicK7YhplzprpJEJc3h0g/s1600/IMG_1188.JPG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhijn2IvAFuB7yVI-RDHgiK216erlTVGmOeUk_ArMv28tKO2cu7m1wkqxz2U4m2Lt01Jt3A2r0X7gySG2j5YVoOgEf0sTYXb0p6bdimKAOCcmweots5A_ykG2qicK7YhplzprpJEJc3h0g/s1600/IMG_1188.JPG" height="266" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">To make it easier to connect the wires to an Arduino, I attach the wires to<br />
a basic pin header.</td></tr>
</tbody></table>
<br />
<u>The Hacked Remote:</u> The picture below shows the hacked remote after I reassembled it. I tested it by pressing the buttons with my fingers and the robot moved. So far, so good!<br />
<div>
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh4ai2wH3C6jqg6kPs877pcNBw8W3B-HOCodezOioKI15VOMPThaN_suJsgyd1UG30aQBnb47pQ6xIN_XWpjiSZxq4nrBh9VwrqRxHcaheqElsc23kxJUMzW7gfQ0ePfh67sZQrZaUIUds/s1600/IMG_1190.JPG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh4ai2wH3C6jqg6kPs877pcNBw8W3B-HOCodezOioKI15VOMPThaN_suJsgyd1UG30aQBnb47pQ6xIN_XWpjiSZxq4nrBh9VwrqRxHcaheqElsc23kxJUMzW7gfQ0ePfh67sZQrZaUIUds/s1600/IMG_1190.JPG" height="266" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">My hacked remote control is now re-assembled and ready for testing.</td></tr>
</tbody></table>
<br />
<u>Software:</u> If I'm going to command this robot from my Arduino, the Arduino needs software. So, I plugged my remote into the Arduino and began coding. (Because the Arduino is flexible, I used the Analog Input pins even though these signals are neither analog nor inputs...the analog pins work just fine as digital inputs and outputs, too.) <a href="https://github.com/chipaudette/EEGHacker/tree/master/Arduino">My code is available on my GitHub</a> as "TestHexBugController". This code tells the Arduino to listen for commands coming over Serial from the PC. I assign one key on the PC's keyboard to each of the Hex Bug's four functions: "forward", "turn left", "turn right", and "fire". When the Arduino receives one of these commands, it toggles the relevant pin to pull it LOW for 500 msec. That's all it takes!<br />
<div>
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgiDur7Ge6IqBof6i_JkAf5XH_8WrtBJKHOGK4XuSOuUYhAfRYv_zyWNQIJQ0PMoZ6ES8RZM6B7242LaMsLiWSMIDtcG8dw1gD4SiRitoqEX5_Abe9pgwa6MM7-qzrvk_-UOkvnVhTKRIc/s1600/Full+Test.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgiDur7Ge6IqBof6i_JkAf5XH_8WrtBJKHOGK4XuSOuUYhAfRYv_zyWNQIJQ0PMoZ6ES8RZM6B7242LaMsLiWSMIDtcG8dw1gD4SiRitoqEX5_Abe9pgwa6MM7-qzrvk_-UOkvnVhTKRIc/s1600/Full+Test.png" height="348" width="520" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Using the hacked remote so that I can use an Arduino to control my Hex Bug via<br />
commands entered from the PC.</td></tr>
</tbody></table>
<br />
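The command handling boils down to a lookup plus a timed pulse. Here is that logic re-expressed in Python just to show its shape; the command letters and pin labels below are illustrative guesses, so check the real "TestHexBugController" Arduino sketch on GitHub for the actual ones:

```python
# The shape of the command logic: one serial byte maps to one pin,
# and that pin gets pulled LOW for a fixed pulse. Names are my guesses.

PULSE_MS = 500  # each simulated "button press" is held this long

COMMAND_TO_PIN = {
    'w': 'A0',  # forward
    'a': 'A1',  # turn left
    'd': 'A2',  # turn right
    'f': 'A3',  # fire
}

def handle_serial_byte(cmd):
    """Map one serial command byte to (pin, pulse_ms), or None."""
    pin = COMMAND_TO_PIN.get(cmd)
    return (pin, PULSE_MS) if pin else None
```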
<u>Testing it Out:</u> As you can see in the video at the top of this post, this setup works pretty well for controlling the Hex Bug. I can make it walk anywhere and shoot its gun on command. It's pretty fun. If I cared to, I could now script a whole series of maneuvers. If I attached a pen to the Hex Bug, maybe I could make it walk around on a big piece of paper so that it would draw out a funny picture. That could be fun. Or, I could combine it with some computer vision on my PC and have it chase my cat around. That would definitely be fun. Sadly, I don't know anything about computer vision.<br />
<br />
<u>Brain Control:</u> Really, though, my next step is to control this with my brain. So, I'll attach my OpenBCI shield to this same Arduino and I'll have the Arduino pipe my EEG signal to the PC. On the PC, I'll process the EEG signal and, if it detects the right brainwave signatures, I'll have the PC send robot commands back to the Arduino. The Arduino will then convey those commands to the robot via the IR remote. All of the hardware pieces are in place...now it's time to put it all together!<br />
<br />
<u>Follow-Up:</u> I finally did put all the pieces together. I can now <a href="http://eeghacker.blogspot.com/2014/06/controlling-hex-bug-with-my-brain-waves.html">control the Hex Bug with my brain waves!</a><br />
<div>
</div>
Chiphttp://www.blogger.com/profile/10352943033779293161noreply@blogger.com1tag:blogger.com,1999:blog-7276377053120174333.post-53754419442601893162014-05-11T09:37:00.001-04:002015-12-08T07:07:48.817-05:00EEG as WAV Files, Go Spectrograms!OK, let's say that I just finished some cool new EEG experiment where I recorded my EEG response to watching <a href="https://www.youtube.com/watch?v=KIePsbJSS04">cat videos</a> while listening to<a href="https://www.youtube.com/watch?v=DZNM9ANajsY"> the Pink Panther at half speed</a>. My next step would be to take a quick look at the data to get the overall big picture. My favorite way of getting that overall view is to make a spectrogram (see example below). My love for these oh-so-colorful plots runs deep. The question is, how does one make spectrograms? Well, in my opinion, if you don't have Matlab (and are afraid of Python), the next best way to make spectrograms is to use one of the multitude of audio editing software packages out there. Many audio edit programs provide a spectrogram view. This post is about getting EEG data into an audio program so that you can see your data.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEixyX6jnniwoY4rSkPmm82q5mLFD28kkaWsp2ory8kk1MlNraQBDZ_QrBggNe7hrQ44bK1fbGXOSdzZ2I6fd8C_wzyCbYRxIac4ahyphenhyphenxBQe8EuKmu3qegdP5mnhLwddDI64tnEYlJEKNh08/s1600/Matlab_Chan2.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="210" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEixyX6jnniwoY4rSkPmm82q5mLFD28kkaWsp2ory8kk1MlNraQBDZ_QrBggNe7hrQ44bK1fbGXOSdzZ2I6fd8C_wzyCbYRxIac4ahyphenhyphenxBQe8EuKmu3qegdP5mnhLwddDI64tnEYlJEKNh08/s1600/Matlab_Chan2.png" width="560" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">A Spectrogram of EEG data that was Made in Matlab. This shows data from my previous post,<br />
where I was watching a movie with two different blink rates. You can see how my brainwaves<br />
entrained with the changing blink rate in the movie.</td></tr>
</tbody></table>
<br />
<u>Converting to a WAV File:</u> The first step in using an audio program for EEG analysis is to convert one's EEG data into an audio file. Since I usually work in Windows, I tend to convert all of my EEG data into WAV files. I choose WAV because it is uncompressed. I never choose MP3 because it is very unclear what its "perceptual coding" would do to my precious brainwave data. So, a WAV file is what I would recommend. But how do you get EEG data into a WAV format? If your EEG data is in text format (such as is logged by the OpenBCI GUI), you could use my Processing sketch "ConvertToWAV". This sketch will read in an OpenBCI log file and write each EEG channel out as its own WAV file. You can get the sketch <a href="https://github.com/chipaudette/OpenBCI/tree/master/Processing_GUI">on my GitHub</a>.<br />
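If you'd rather not run Processing, the same conversion can be sketched in a few lines of Python using only the standard library. This assumes you've already parsed one channel of EEG samples out of your log file; the 250 Hz sample rate is just an example and should match your hardware:

```python
# Minimal sketch: write one channel of EEG samples as an uncompressed,
# 16-bit, mono WAV file (no lossy coding to mangle the brainwaves).
import wave, struct

def eeg_to_wav(samples, out_path, fs=250):
    """samples: list of floats (e.g., uV); fs: EEG sample rate in Hz."""
    peak = max(abs(s) for s in samples) or 1.0
    ints = [int(32767 * s / peak) for s in samples]  # normalize to int16
    with wave.open(out_path, 'wb') as w:
        w.setnchannels(1)   # one EEG channel per file
        w.setsampwidth(2)   # 16-bit samples
        w.setframerate(fs)  # preserve the EEG sample rate
        w.writeframes(struct.pack('<%dh' % len(ints), *ints))
```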
<br />
<u>Audacity:</u> Once the data is in WAV format, you can open it in any audio program. A popular (and free!) audio editing program is <a href="http://audacity.sourceforge.net/">Audacity</a>. While it is not my favorite audio editing program, it is perfectly sufficient for working with EEG data. After opening your EEG data, the trick is to figure out how to switch the display from waveform to spectrogram. The screen shot below shows how to do it.<br />
<div>
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjVwd7K-DOK6ORZtZatUvlYDo9nAnmBTeffXXz8RBVkx1aHOc5fTjilkAD2JfwBmIxp0cyk37LeaWV3S5Iixd4htEokW71GxBTcv_7Oh08dMCGr1S8oHcq8UnbxRPC46awKKiInWEmW9m0/s1600/01-Audacity_ChangeToSpectrogram.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="374" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjVwd7K-DOK6ORZtZatUvlYDo9nAnmBTeffXXz8RBVkx1aHOc5fTjilkAD2JfwBmIxp0cyk37LeaWV3S5Iixd4htEokW71GxBTcv_7Oh08dMCGr1S8oHcq8UnbxRPC46awKKiInWEmW9m0/s1600/01-Audacity_ChangeToSpectrogram.png" width="560" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Changing to Spectrogram View in Audacity</td></tr>
</tbody></table>
<br />
Once Audacity is in spectrogram mode, you need to zoom in on the vertical axis in order to see the interesting EEG features, which are usually focused in the lower frequencies. In Audacity, you zoom simply with a click-and-drag on the vertical axis. Then, after manipulating the spectrogram settings under the "Preferences" menu, you can get a spectrogram like the one shown below. While the color scheme hurts the eyes a bit, this spectrogram is good enough to see the same kind of EEG entrainment as seen in my original Matlab plot. Furthermore, the tools in Audacity let you further analyze the EEG data by zooming, filtering, and amplifying. And, if you change the file's sample rate to increase the playback speed, you can use Audacity to listen to your own brain waves! Audacity is definitely a useful tool for working with EEG data.<br />
<div>
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhxEMvdBJ56t4w_-zhbNhyphenhyphenfKPntZLavW0b1t5fuWIjAFfYevVi4fttq_JZ6qS-crg9EPTfvR6WjwfAOYrl617f1zGLbceYJRgpNKAe-sKjAj47XZnIBl7ZiUiv_V1OSYSUs9KUgygn1maY/s1600/02-Audacity.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="336" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhxEMvdBJ56t4w_-zhbNhyphenhyphenfKPntZLavW0b1t5fuWIjAFfYevVi4fttq_JZ6qS-crg9EPTfvR6WjwfAOYrl617f1zGLbceYJRgpNKAe-sKjAj47XZnIBl7ZiUiv_V1OSYSUs9KUgygn1maY/s1600/02-Audacity.png" width="560" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">In Audacity, a Spectrogram of my EEG Data</td></tr>
</tbody></table>
<br />
The spectrogram settings that I used are shown in the screen shot below.<br />
<div>
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj8RjD3nsKW9ktKkUw06QRWy7NDQbY7-7_3wAvVEMHlWpB3GIFbyyyphX-LpZ4_tAHLHiT-fkQuDgZtiTP7PTR7VqM4vzwIIrF28mvb-fpEWJgvOIsfvxThQ0LVPXCN3im8hWnvNqPKCRo/s1600/03-AudacitySettings.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="242" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj8RjD3nsKW9ktKkUw06QRWy7NDQbY7-7_3wAvVEMHlWpB3GIFbyyyphX-LpZ4_tAHLHiT-fkQuDgZtiTP7PTR7VqM4vzwIIrF28mvb-fpEWJgvOIsfvxThQ0LVPXCN3im8hWnvNqPKCRo/s1600/03-AudacitySettings.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">My Display Settings for Making EEG Spectrograms in Audacity. I changed<br />
the Window Size, the Gain, and the Range.</td></tr>
</tbody></table>
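Incidentally, the listening trick doesn't even require Audacity. Rewriting the WAV file's declared sample rate leaves the samples untouched but speeds up playback, shifting EEG frequencies into the audible range (a 10 Hz alpha rhythm played 100x faster comes out at 1 kHz). A minimal Python sketch, with the factor of 100 being just an example:

```python
# Re-write a WAV at a higher sample rate so it plays back faster.
# The samples themselves are copied through unchanged.
import wave

def speed_up(in_path, out_path, factor=100):
    with wave.open(in_path, 'rb') as r:
        params = r.getparams()
        frames = r.readframes(params.nframes)
    with wave.open(out_path, 'wb') as w:
        w.setnchannels(params.nchannels)
        w.setsampwidth(params.sampwidth)
        w.setframerate(params.framerate * factor)  # e.g., 250 Hz -> 25 kHz
        w.writeframes(frames)
```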
<u><br /></u>
<u>Cool Edit Pro:</u> I first started getting into spectrograms in the late 90's because this is when I started working with audio and music on the computer. What got me hooked on spectrograms was a piece of shareware called Cool Edit. It was a stupid name for an otherwise outstanding program. It was so useful that I spent the extra dollars and bought its upgrade -- <a href="http://en.wikipedia.org/wiki/Cool_Edit_Pro">Cool Edit Pro</a>. Cool Edit Pro has a *great* spectrogram display, as shown below. Unlike Audacity, which requires lots of manipulation of the spectrogram settings to get a useful view, the Cool Edit Pro display always seems just right. Unfortunately, Cool Edit Pro isn't available anymore -- it was bought by Adobe in the early 2000s and became <a href="http://www.adobe.com/products/audition.html">Adobe Audition</a>. Audition is also fine for making spectrograms (I have only used up to Audition 3.0), but it is expensive.<br />
<div>
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi5vHSoTdWmLFC0aXLCbNCfD2wbnsz3W8LaQt57lCb1hPtJg1Lt-hi2FwRFLBnYJIZtf9U8Hv7Y6jkMZqAWlrDYoHzWETt1j5ES44AQ7XS9_4YfgGwI6PdUdUYQRqUdB6d07GbRuS2xngo/s1600/04-CoolEditPro.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="336" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi5vHSoTdWmLFC0aXLCbNCfD2wbnsz3W8LaQt57lCb1hPtJg1Lt-hi2FwRFLBnYJIZtf9U8Hv7Y6jkMZqAWlrDYoHzWETt1j5ES44AQ7XS9_4YfgGwI6PdUdUYQRqUdB6d07GbRuS2xngo/s1600/04-CoolEditPro.png" width="560" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">An EEG Spectrogram in Cool Edit Pro V1.2a. It's an old school program that totally rocks.</td></tr>
</tbody></table>
<br />
In Cool Edit Pro, the only display parameter that you need to change is the "Resolution" (ie, FFT size). You do that under the "Settings" menu.<br />
<div>
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg2qS7p2H0lQfWq_LZrU2_4W-hhnoMMu8Y96qu4F8F_OkYIP92e4z2eMe1Vb2vQ_qibJdkBq_tNUrk81ibov8YgAjRdXA1b7OiRTYonxOwPiQcRCHlFX9ym2gs-564pKyAdTO-XODXlFv8/s1600/05-CoolEditProSettings.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="318" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg2qS7p2H0lQfWq_LZrU2_4W-hhnoMMu8Y96qu4F8F_OkYIP92e4z2eMe1Vb2vQ_qibJdkBq_tNUrk81ibov8YgAjRdXA1b7OiRTYonxOwPiQcRCHlFX9ym2gs-564pKyAdTO-XODXlFv8/s1600/05-CoolEditProSettings.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">My settings for viewing EEG spectrograms in Cool Edit Pro.<br />
I changed the Resolution value.</td></tr>
</tbody></table>
<br />
<u>Raven Lite:</u> A third option for making spectrograms is a bit more obscure. A bunch of years ago, I came across a program called "<a href="http://www.birds.cornell.edu/brp/raven/ravenoverview.html">Raven Lite</a>", which is produced by the Ornithology Lab (ie, bird science) at Cornell University. The "Lite" version is free. You can download it and immediately use it for spectrograms, though it is crippled in other ways until you email them for a free (non-commercial) key. What I really like about Raven is that, as shown in the screen shot below, its spectrogram controls are right on the main window for easy manipulation. Also, I like its color map options way better than what is available in Audacity. Finally, Raven is one of the few programs that let you see both the spectrogram view and the waveform view <i>at the same time</i> (not shown). It is really nice to have that capability.<br />
<div>
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh2mb5oUVAKVreRc_bZNUjAyoUTVqI02aaONh31iXB5MeWUOzXW3FTp1lfaooZkhA86g6N39_E5ZAg0PhrLRuMMDGxmOCvtxZOOWruWXp9dfwS82WjU7aoQJCpiwGA8Us2sctx1wpZUfXM/s1600/06-RavenLite.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="305" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh2mb5oUVAKVreRc_bZNUjAyoUTVqI02aaONh31iXB5MeWUOzXW3FTp1lfaooZkhA86g6N39_E5ZAg0PhrLRuMMDGxmOCvtxZOOWruWXp9dfwS82WjU7aoQJCpiwGA8Us2sctx1wpZUfXM/s1600/06-RavenLite.png" width="560" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Raven Lite 1.0 from the Cornell Laboratory of Ornithology. It's a pretty good viewer. The settings for<br />
the display are right here in the main window.</td></tr>
</tbody></table>
<div>
<br />
<div>
<u>Other Options:</u> Because I have Matlab and Cool Edit Pro (and Audacity and Raven) I haven't spent a lot of time looking at other options. Does Garage Band offer a spectrogram view? Is there a plug-in for iTunes or Windows Media Player that gives spectrograms? I'm curious to hear what you folks use. Drop a comment and let me know!<br />
<br /></div>
</div>
Chiphttp://www.blogger.com/profile/10352943033779293161noreply@blogger.com12tag:blogger.com,1999:blog-7276377053120174333.post-10274774708821894652014-05-10T10:48:00.003-04:002014-06-09T21:09:18.078-04:00Controlling Entrainment Through AttentionIn <a href="http://eeghacker.blogspot.com/2014/05/visual-entrainment-blinking-screen.html">a previous post</a>, I showed that I could induce (entrain) brain waves at different frequencies simply by staring at blinking movies playing on my computer. Having demonstrated this basic feasibility, my goal now is to exploit this phenomenon to make a brain-computer interface (BCI) to control future hacks. My idea is to play two blinking movies simultaneously -- one at a slow speed and one at a fast speed. I'm hoping that my brainwaves will only entrain with the blinking from the one movie that I choose to focus on. Does my brain work this way? Will my brain successfully reject the blinking from the movie that I'm ignoring? Let's find out!<br />
<br />
<div style="text-align: center;">
<iframe allowfullscreen="" frameborder="0" height="315" src="//www.youtube.com/embed/WJDOmC1U0Y0" width="420"></iframe>
</div>
<br />
<u><br /></u>
<u>Simultaneous Blinking at Two Speeds:</u> Previously, I made some blinking movies where the whole screen would blink black or white at a given speed. To make this idea work for a BCI, I want my screen to blink at two different rates at the same time. So, as you can see in the video above, I made the left side of my screen blink at one rate while the right side of my screen blinks at a different rate. I'm hoping that, if I focus my attention on the left side of my screen, my brainwaves will only become entrained at the left-side blink frequency, whereas if I were to focus on the right side of the screen, my brainwaves would follow the right-side blink frequency.<br />
<br />
<u>Swapping Sides:</u> To help with this test, I wanted to remove any effect of turning my head to change my gaze between the two sides of my screen. So, in creating my dual-rate blinking movie, I had the movie automatically swap sides every 20 seconds. As a result, it starts with fast blinking on the left and slow blinking on the right. After 20 seconds, it swaps so that slow is on the left and fast is on the right. It does this swap a few times. The Matlab code that I used to make these movies is <a href="https://github.com/chipaudette/EEGHacker/tree/master/Matlab">here</a>.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhg2rJbiU4Ub-DYFAG5AjqP46IbDxXFsQ1I3_XF3Xvr94SK3YS4phGCqe_0bBFSFYnxtSwgh8FfhmlEWC4HXjiccbVEyvEadYwJCQm3IFjgqj7gHwZKbP6fgZi5gUMk6NacBtXmjhCwo9Y/s1600/BothScreens.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhg2rJbiU4Ub-DYFAG5AjqP46IbDxXFsQ1I3_XF3Xvr94SK3YS4phGCqe_0bBFSFYnxtSwgh8FfhmlEWC4HXjiccbVEyvEadYwJCQm3IFjgqj7gHwZKbP6fgZi5gUMk6NacBtXmjhCwo9Y/s1600/BothScreens.png" height="254" width="530" /></a></td></tr>
<tr><td class="tr-caption" style="font-size: 13px;">I created a movie where the left and right sides blink independently -- left is fast and right is slow.<br />
For this test, the two blink rates swapped sides every 20 seconds.</td></tr>
</tbody></table>
<br />
<u>Choosing my Blink Rates:</u> Based on <a href="http://eeghacker.blogspot.com/2014/05/visual-entrainment-blinking-screen.html">my previous results</a>, it looks like my brain (coupled with my computer's <a href="http://eeghacker.blogspot.com/2014/05/measuring-video-blink-rate-with-arduino.html">limited ability to blink steadily</a>) is most easily entrained in the 6-10 Hz frequency range. So, for this dual-rate movie, I chose "slow" to toggle between black and white at 10 Hz (ie, a 5 Hz white-white rate) and "fast" to toggle at 15 Hz (ie, a 7.5 Hz white-white rate). In truth, I made a bunch of movies at different rates, but the 10/15 Hz movie worked the best, so I'll only show its results.<br />
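For anyone without Matlab, the timing logic of the movie can be re-sketched in Python (the 10/15 Hz toggle rates and the 20-second swap follow the description above; the real Matlab code surely differs in its details):

```python
# At time t (seconds), decide whether each half of the screen is white.
# "Toggle at 15 Hz" means that half changes state 15 times per second.

def frame_state(t, fast_hz=15.0, slow_hz=10.0, swap_s=20.0):
    """Return (left_is_white, right_is_white) at time t."""
    fast = int(t * fast_hz) % 2 == 0     # fast toggling pattern
    slow = int(t * slow_hz) % 2 == 0     # slow toggling pattern
    swapped = int(t // swap_s) % 2 == 1  # sides trade places every 20 s
    return (slow, fast) if swapped else (fast, slow)
```

Calling this once per video frame, and painting each half of the screen accordingly, reproduces the fast-left/slow-right movie that swaps sides every 20 seconds.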
<br />
<u>EEG Setup</u>: With my movies prepared, I gathered up my EEG stuff. Like usual, I used my <a href="http://www.openbci.com/">OpenBCI </a>board and a few cup electrodes with Ten20 paste. I put one electrode on the left side of my forehead (Fp1), one on the left side of the back of my head (O1), and one on the right side of the back of my head (O2). Using the <a href="http://eeghacker.blogspot.com/2014/04/openbci-measuring-electrode-impedance.html">impedance measuring feature</a>, my impedances were 11 kOhm, 67 kOhm, and 28 kOhm (I seem to have an on-going problem getting a low impedance at O1). My reference electrode was on my left ear lobe and my bias electrode was on my right ear lobe. My OpenBCI board was connected to the PC via USB and I was logging data using my <a href="https://github.com/chipaudette/OpenBCI/tree/master/Processing_GUI">OpenBCI GUI in Processing</a>. For this test, I also used <a href="http://eeghacker.blogspot.com/2014/05/measuring-video-blink-rate-with-arduino.html">my photocell</a> to confirm that my computer's blinking was sufficiently steady.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhU9-yvISQftELlA1x_fqevjA3wDkf4pYbJXdHdwRDofk4q1M7YTNXkPP1Y97ZlxktVCrHbT35pNVRuVzS0eBMGK5EUImEsOtHIEx-Y09QmJG4TG5LBVvReQZ8cxdQKhogiUw3xaS5UrDY/s1600/Setup.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhU9-yvISQftELlA1x_fqevjA3wDkf4pYbJXdHdwRDofk4q1M7YTNXkPP1Y97ZlxktVCrHbT35pNVRuVzS0eBMGK5EUImEsOtHIEx-Y09QmJG4TG5LBVvReQZ8cxdQKhogiUw3xaS5UrDY/s1600/Setup.png" height="324" width="480" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">I used my OpenBCI V1 board to record my EEG into the computer. I also attached a<br />
photocell to confirm that the screen was blinking at the right rate.</td></tr>
</tbody></table>
<br />
<u>Results:</u> After setting everything up, I started recording my EEG data and then I started playing the dual-rate blinking movie. It was nighttime, so my room was pretty dark. I focused my attention on the center of the left-hand movie. As described above, the left movie toggled fast-slow-fast-slow every 20 seconds, while the right movie played the opposite -- slow-fast-slow-fast. Spectrograms of the EEG signals from my head are shown in the figure below. As you can see, there was no entrainment seen in the signals from my forehead (as expected) but there was entrainment in the back of my head (also as expected). The best entrainment was seen on the left side of my head. <br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgXk-puErCkLrB-JCFuzj3iDXDEbxgVjzbbG2QckWpXq0JCChFkAv3mzfQpht0m-sfOhkmVpZ5fzRjsttnVb56_Q8VzRd9LGxrKOu5bigKdmk1ZEHi42S1VGxBZ1BbZc_BQU0gwRHA4yfU/s1600/AllResults.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgXk-puErCkLrB-JCFuzj3iDXDEbxgVjzbbG2QckWpXq0JCChFkAv3mzfQpht0m-sfOhkmVpZ5fzRjsttnVb56_Q8VzRd9LGxrKOu5bigKdmk1ZEHi42S1VGxBZ1BbZc_BQU0gwRHA4yfU/s1600/AllResults.png" height="485" width="560" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Spectrograms of my EEG signals recorded while watching my dual-rate blinking movie. The left-back<br />
of my head exhibited the strongest entrainment to the blinking of my movie. </td></tr>
</tbody></table>
<br />
<div class="separator" style="clear: both; text-align: center;">
</div>
<div>
<div class="separator" style="clear: both; text-align: center;">
</div>
<div class="separator" style="clear: both; text-align: center;">
</div>
<br />
<u>Only Seeing the Left Blink Rate:</u> Because the left-back of my head (O1) gave the best entrainment, let's just focus on its results. The figure below shows just the results for the left-back of my head. Note how, once the movie starts playing, my EEG signals seem to toggle between a fast blink rate (~7.5 Hz) and a slow blink rate (~5 Hz). This exactly follows the white-white blink rate of the left movie. So, my brainwaves successfully entrained to the movie that I was watching. Most importantly, there seems to be no signature in my EEG data from the blinking of the right movie. This is success!<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjCdSHmksbwIvU6m4c0NWRxqtAJgokCQI9EKSl_s1ranuAjo4q1RpI7GhezSWcZVVzErpqN-MK8r2M1Pmppb4oJ9mrgILaxf9yGr25QlnUGi8h4EH9A-AQrhgFIPal20QyaT4xfAGvJR0o/s1600/AllResults_Chan2.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjCdSHmksbwIvU6m4c0NWRxqtAJgokCQI9EKSl_s1ranuAjo4q1RpI7GhezSWcZVVzErpqN-MK8r2M1Pmppb4oJ9mrgILaxf9yGr25QlnUGi8h4EH9A-AQrhgFIPal20QyaT4xfAGvJR0o/s1600/AllResults_Chan2.png" height="200" width="560" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Spectrogram of my EEG data from the left-back of my head (O1) while watching my dual-rate<br />
blinking movie. I was focused just on the left movie. Because of this focus, my brainwaves<br />
appear to have entrained only with the left movie's blink rate.</td></tr>
</tbody></table>
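If you'd like to try this kind of peak-picking yourself, here is a minimal sketch in Python with NumPy/SciPy (not the code that produced the spectrograms above, which came from my own processing chain). It finds the dominant frequency in the entrainment band of an EEG trace; the synthetic test signal below stands in for an O1 recording with a 7.5 Hz "fast" response:

```python
import numpy as np
from scipy.signal import welch

FS = 250  # OpenBCI sample rate used in these recordings (Hz)

def dominant_freq(eeg, fs=FS, band=(4.0, 12.0)):
    """Return the frequency (Hz) with the most power inside `band`,
    e.g. to separate the 5 Hz vs 7.5 Hz entrainment peaks."""
    freqs, psd = welch(eeg, fs=fs, nperseg=2 * fs)  # 0.5 Hz bins
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return freqs[mask][np.argmax(psd[mask])]

# Synthetic stand-in for an O1 trace: 7.5 Hz entrainment buried in noise
rng = np.random.default_rng(0)
t = np.arange(0, 20, 1.0 / FS)
eeg = 2.0 * np.sin(2 * np.pi * 7.5 * t) + rng.normal(0.0, 1.0, t.size)
print(dominant_freq(eeg))  # 7.5
```

Sliding this over short windows of the recording would reproduce the fast/slow toggling visible in the spectrogram.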
<br />
<div>
<u>Purposely Shifting My Attention:</u> OK, so I've demonstrated that my mind can successfully ignore one of the movies. That's really good. But, maybe I'm just biased toward looking left. To really make this work for a BCI, I need to be able to shift my attention to either movie and have my brainwaves follow. So, for my 2nd test, I started the same movie playing back. But, this time, when the movies swapped sides every 20 seconds, I switched my attention to follow the movie that blinked faster. This means that I started by watching the left movie, then I watched the right, then left, then right. My EEG response is shown below. Note that I showed strong entrainment and, most importantly, that my brainwaves only show the fast blink rate (7.5 Hz). So, by shifting my attention to follow the faster movie, I successfully rejected the effect of the slower blinking movie. Success again!<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj6c5MPEonpeivFSDxQbDBrfMnJ4xVMBoy3k1sj_vVq96Dnpg2bXzwBa1BkS_qiJLOiFTM6VmXdmvtwZSI_RWtMj92NoXEjSw_-7w6mcfUjGpMuq4P0UKWmmJG42qUFWPATN_2NDLv0ehs/s1600/AllResults_Switch.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj6c5MPEonpeivFSDxQbDBrfMnJ4xVMBoy3k1sj_vVq96Dnpg2bXzwBa1BkS_qiJLOiFTM6VmXdmvtwZSI_RWtMj92NoXEjSw_-7w6mcfUjGpMuq4P0UKWmmJG42qUFWPATN_2NDLv0ehs/s1600/AllResults_Switch.png" height="226" width="560" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Spectrogram of my EEG data from the left-back of my head (O1) while watching my dual-rate blinking<br />
movie. While watching the movie, I switched my attention between left and right to follow the movie<br />
that blinked faster. Because of this focus, my brainwaves remained entrained only at the faster rate.</td></tr>
</tbody></table>
<br />
<u>All the Elements are In Place:</u> It looks like I now have the elements in place for a 3-state BCI. If I don't look at the movie at all, I get State 1: "Nothing". If I watch the blinking of the left movie, I get State 2: "Left". If I watch the blinking of the right movie, I get State 3: "Right". It may be possible to further divide my screen to get more blinking regions to add more BCI states. Maybe that's a good experiment for the future. Right now, though, I think that I'm going to turn my attention to <a href="http://www.hexbug.com/mechanical/battlespider">a little robot that I got</a> (thanks for the pointer Conor!) to see if I can control it with visual entrainment. This is gonna be fun!</div>
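As a sketch of how those three states might be separated in software, here is a hypothetical Python classifier: it compares EEG power near the two blink rates against an off-stimulus reference band. The 3:1 power-ratio threshold and the slow-rate-means-Left mapping are illustrative assumptions, not values tuned in my tests:

```python
import numpy as np
from scipy.signal import welch

FS = 250  # sample rate (Hz)

def band_power(eeg, f0, fs=FS, half_width=0.5):
    """Mean power spectral density within f0 +/- half_width Hz."""
    freqs, psd = welch(eeg, fs=fs, nperseg=2 * fs)
    return psd[np.abs(freqs - f0) <= half_width].mean()

def classify_state(eeg, f_slow=5.0, f_fast=7.5, ratio=3.0):
    """Map one EEG window to 'Nothing', 'Left' (slow), or 'Right' (fast).
    The 3:1 ratio against a quiet reference band is an arbitrary choice."""
    p_slow = band_power(eeg, f_slow)
    p_fast = band_power(eeg, f_fast)
    p_ref = band_power(eeg, 15.0)  # a band away from both stimulus rates
    if max(p_slow, p_fast) < ratio * p_ref:
        return "Nothing"
    return "Left" if p_slow > p_fast else "Right"

rng = np.random.default_rng(1)
t = np.arange(0, 10, 1.0 / FS)
noise = rng.normal(0.0, 1.0, t.size)
print(classify_state(noise))                                    # Nothing
print(classify_state(noise + 3 * np.sin(2 * np.pi * 5.0 * t)))  # Left
print(classify_state(noise + 3 * np.sin(2 * np.pi * 7.5 * t)))  # Right
```

On real data, the window length and threshold would need tuning against the strength of the entrainment actually seen at O1.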
<div>
<br />
<u>Follow-Up:</u> Interested in getting the EEG data from this post? Try downloading it <a href="https://github.com/chipaudette/EEGHacker/tree/master/Data/2014-05-08%20Multi-Rate%20Visual%20Evoked%20Potentials">from my github</a>!<br />
<br />
<u>Follow-Up:</u> I successfully used <a href="http://eeghacker.blogspot.com/2014/06/controlling-hex-bug-with-my-brain-waves.html">visual entrainment to control a six-legged robot</a>!</div>
</div>
Chip, 2014-05-07<br />
<br />
<u>Measuring Video Blink Rate with an Arduino</u><br />
<br />
In my <a href="http://eeghacker.blogspot.com/2014/05/visual-entrainment-blinking-screen.html">previous post</a>, I used blinking videos on my computer to entrain my brain waves. A key question, though, was whether my computer could play those blinking videos steadily. If the blinking isn't steady, it won't induce brainwaves that can be easily detected. So, in this new post, I show how I hacked a photocell and my Arduino to measure the blink rate that my computer is actually producing. It's a pretty simple (and cheap!) setup and, as you'll see below, its data explains some of the important findings in my EEG data!<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjC1sl92LPPTZwQ_sz-2YDtoqNygodO_uhJeRP-n43ZnFuTYqBfDXIsNRXz8A7Uv9KhyCKNVsKZ_LM6AgzrvJlusmF4vvx8TfVKhaQIfQHUYWC8iU8M2akBuF_Aa3_neJvL7TGaqi8L-sY/s1600/IMG_2797.JPG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjC1sl92LPPTZwQ_sz-2YDtoqNygodO_uhJeRP-n43ZnFuTYqBfDXIsNRXz8A7Uv9KhyCKNVsKZ_LM6AgzrvJlusmF4vvx8TfVKhaQIfQHUYWC8iU8M2akBuF_Aa3_neJvL7TGaqi8L-sY/s1600/IMG_2797.JPG" height="266" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Measuring the Blink Rate From the Movies Played Back by my Computer</td></tr>
</tbody></table>
<br />
<u>Using a Photocell:</u> My approach to measuring the video blink rate is to quickly and continuously measure the light produced by my computer screen. I chose to use a <a href="https://www.sparkfun.com/products/9088">photocell</a>, mostly because I had one that came with my very first Arduino. To learn how to use a photocell, I followed the <a href="https://learn.adafruit.com/photocells">tutorial at Adafruit</a>. It explains what a photocell is and it explains exactly how to<a href="https://learn.adafruit.com/photocells/using-a-photocell"> hook it up to an Arduino</a>. The key is that you connect a photocell and a 10K resistor in series. Together, they form a voltage divider. Then, you connect one end to +5V and the other end to ground. In the middle, at the junction between the photocell and the 10K resistor, you connect that point to the Arduino's analog input pin. Pretty easy!<br />
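The divider math is worth sanity-checking. With the photocell on the +5V side and the 10K resistor to ground (as in the Adafruit tutorial), more light means lower photocell resistance, which means a higher voltage at the analog pin. A quick Python check, using made-up example resistances:

```python
VCC = 5.0            # Arduino supply voltage
R_FIXED = 10_000.0   # the 10K resistor on the ground side
ADC_MAX = 1023       # full scale of the Arduino's 10-bit analogRead()

def divider_volts(r_photocell):
    """Voltage at the photocell/resistor junction (photocell on the +5V side)."""
    return VCC * R_FIXED / (r_photocell + R_FIXED)

def adc_counts(r_photocell):
    """What analogRead() would report for a given photocell resistance."""
    return round(divider_volts(r_photocell) * ADC_MAX / VCC)

print(adc_counts(10_000))  # photocell equals the 10K -> mid-scale, 512
print(adc_counts(1_000))   # bright light, low resistance -> 930
```

So a blinking screen should swing the analog reading up (white frame) and down (black frame), which is exactly the waveform recorded later in this post.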
<br />
<u>Wiring It Up:</u> To connect all of the bits together, I needed to solder a few things. First, I gathered my components -- the photocell, some wire, and some shrink tube (to keep the soldered wires from shorting to each other).<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh6kJW7DRzQHTNh2Me-D-KhlffSPwQ2h7elDBB39-3wTVqa_Ow99ZEua8evKzAlMYnISlUVZcZnKPtWz3AB4kZR_ppGXDLMI1dNEiYFKlEhFQNHj4aSjbQdSu_-NfmE6UEFWAUgq6GFiPc/s1600/IMG_2778.JPG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh6kJW7DRzQHTNh2Me-D-KhlffSPwQ2h7elDBB39-3wTVqa_Ow99ZEua8evKzAlMYnISlUVZcZnKPtWz3AB4kZR_ppGXDLMI1dNEiYFKlEhFQNHj4aSjbQdSu_-NfmE6UEFWAUgq6GFiPc/s1600/IMG_2778.JPG" height="266" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">The Components: A Super-Cheap Photocell, some Wire,<br />
and some Shrink Tube. The 10K resistor is not shown.</td></tr>
</tbody></table>
<br />
Then, I soldered the wires to the legs of the photocell and insulated them with the shrink tube. The photo below shows the components after this assembly. Looks decent enough.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiwtu-lgtWQ78quoUF6CMmZgxzheSkX5jTx0W5GBsybBpGSt7VOIw2whYun-AL1m2WINIeZaam_yj_cE7md_I5G9CZ2NrzvXHhgVVoUnHIRcnCgHE1lu8Y3G6_fLCZQact2YkXAS89npVM/s1600/IMG_2782.JPG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiwtu-lgtWQ78quoUF6CMmZgxzheSkX5jTx0W5GBsybBpGSt7VOIw2whYun-AL1m2WINIeZaam_yj_cE7md_I5G9CZ2NrzvXHhgVVoUnHIRcnCgHE1lu8Y3G6_fLCZQact2YkXAS89npVM/s1600/IMG_2782.JPG" height="266" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Fully Assembled. This is the first version that I tried. Notice that the back<br />
and sides of the photocell are exposed. This turned out to be bad.</td></tr>
</tbody></table>
<br />
Unfortunately, when I hooked it up to my Arduino, I found that I was not seeing any change in the light level from my computer screen. After some playing around, I found that the photocell is sensitive to light from the back and sides, in addition to being sensitive to light from the front. So, as seen in the photo below, I added another layer of shrink tube to block out the light entering from the back and sides.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjl5su9WR_Hq_shtAGoYyRdSkjKBSCILy13GnrwftZOsx8MaGOCXIA_awfRMKuNUOhTct_eie7zy5rCPj5GZfNj6F5UhMGiGmrm_aiD9BKqW6ij7WB3NA4KseUVTyOw2pIJNpEYFMX4Zvo/s1600/IMG_2783.JPG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjl5su9WR_Hq_shtAGoYyRdSkjKBSCILy13GnrwftZOsx8MaGOCXIA_awfRMKuNUOhTct_eie7zy5rCPj5GZfNj6F5UhMGiGmrm_aiD9BKqW6ij7WB3NA4KseUVTyOw2pIJNpEYFMX4Zvo/s1600/IMG_2783.JPG" height="266" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Modified Assembly. I added more shrink tube to wrap around the sides and<br />
back of the photocell. You have to keep out that light!</td></tr>
</tbody></table>
<br />
<u>Mounting Everything:</u> Once I had my photocell on those long(ish) lead wires, I connected it to the Arduino as discussed on the Adafruit site. I then needed a way to hold the photocell close to the computer screen so that I could measure its blink rate. As shown in the photo below, I found that my adjustable soldering fixture (sometimes called a "3rd Hand" fixture) works really well. It works best if you position the photocell to be VERY close to the computer screen.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhWZo-M4YkfTbc-4JsUBZkRdgzO2dHFWCTOJ0AT0-i_h_URB2-zKyMT27xGxXsVqKGNAiArRUFDQXKXzxUJtNPn5LfCHQeFBlnGEdrIqb9G3S-YVPW15FU1CJNBNM9k0pNt7oYzMhCZWQM/s1600/AnnotatedSetup.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhWZo-M4YkfTbc-4JsUBZkRdgzO2dHFWCTOJ0AT0-i_h_URB2-zKyMT27xGxXsVqKGNAiArRUFDQXKXzxUJtNPn5LfCHQeFBlnGEdrIqb9G3S-YVPW15FU1CJNBNM9k0pNt7oYzMhCZWQM/s1600/AnnotatedSetup.png" height="285" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">I held the photocell to the computer screen using a "3rd Hand" soldering fixture.<br />
To record the photocell signal, I used one of the Analog Inputs available on the<br />
Arduino that is the host for my OpenBCI shield.</td></tr>
</tbody></table>
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh2czGwCnwGhZAIi9HUgAYPduL6BdqTaWdmPpqt8E-jTOBECw_qkXJOfiu4CWFRMwiq6YM8bf4FNJ5Tw1-lhYJQLHPdc6AZteJKGcBHIhLVwrQ2_lNKQZswRlK38p-nd-d7DXYAIlCYUg8/s1600/IMG_2791.JPG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh2czGwCnwGhZAIi9HUgAYPduL6BdqTaWdmPpqt8E-jTOBECw_qkXJOfiu4CWFRMwiq6YM8bf4FNJ5Tw1-lhYJQLHPdc6AZteJKGcBHIhLVwrQ2_lNKQZswRlK38p-nd-d7DXYAIlCYUg8/s1600/IMG_2791.JPG" height="285" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Position the photocell to be VERY close to the screen.</td></tr>
</tbody></table>
<br />
<u>Arduino Software:</u> If I'm going to use my Arduino to read the photocell, I need some software for the Arduino. My first step was to use the built-in Arduino example called "<a href="http://arduino.cc/en/Tutorial/AnalogInOutSerial">AnalogInOutSerial</a>". I then extended this program to report the actual resistance of the photocell under different lighting conditions ("<a href="https://github.com/chipaudette/EEGHacker/tree/master/Arduino">ReadPhotocellResistance</a>"). While either of these programs works fine to read the photocell, neither is clocked to read the values at a steady pace. If the sampling isn't steady, there's no way to know if the video blinking itself is steady. To fix this, you need to set up an Arduino timer. Or, you could...<br />
<br />
<u>Integrate with OpenBCI:</u> The OpenBCI shield generates data packets at a very precise rate (I usually configure mine to sample at 250 Hz). It could act as the clock to drive the Arduino to sample the photocell steadily. So, I modified the <a href="https://github.com/OpenBCI/OpenBCI/tree/master/Arduino">OpenBCI Arduino sketch</a> to read one of the analog input pins every time that it receives data from the OpenBCI shield. It then appends this extra data value to the OpenBCI data packet and sends it to the PC. Finally, I modified my <a href="https://github.com/chipaudette/OpenBCI/tree/master/Processing_GUI/OpenBCI_GUI">OpenBCI GUI</a> to receive the extra data and to include it in its log file.<br />
<br />
<u>Results:</u> I used this system to record the blinking produced by the blinking movies from <a href="http://eeghacker.blogspot.com/2014/05/visual-entrainment-blinking-screen.html">my previous post</a>. Each movie was about 20 seconds long and each movie blinked at a different rate. The digitized photocell values are shown in the figure below as raw counts from the Arduino. Clearly, this graph is a bit too zoomed out to see much of interest (though you can see the non-steady amplitude at the fastest speeds on the right). We need to zoom in to see more detail...<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiANJJBMgzpIDa0iWsc5DSWUjEPtoRuMiL4OFSMwMT5M-5Kflz-NiIj4LkLZB3hmwRncuEbFFQoEbga0wtwIWvccXhDnwUFWxGGOBotEFijNnBFcYbfyWN7CMzA6lyGm1bbN057YmtoS7M/s1600/01-FullRawWaveform.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiANJJBMgzpIDa0iWsc5DSWUjEPtoRuMiL4OFSMwMT5M-5Kflz-NiIj4LkLZB3hmwRncuEbFFQoEbga0wtwIWvccXhDnwUFWxGGOBotEFijNnBFcYbfyWN7CMzA6lyGm1bbN057YmtoS7M/s1600/01-FullRawWaveform.png" height="210" width="560" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Sample values recorded from the photocell by the Arduino's analog input pin. I played<br />
my 10 whole-screen-blinking movies. Each movie is about 20 seconds long. Each movie<br />
has a different blink rate -- from a 1 Hz white-to-white blink rate up to a 10 Hz w-w rate.</td></tr>
</tbody></table>
<br />
<u>Zooming-In</u>: Excerpts from three of the movies are shown below. In these plots, you can see that the light pulses recorded during the 3 Hz and 10 Hz movies look to be steadily paced, whereas the pulses in the 7 Hz movie look much more irregular. Based on this qualitative view, I'd say that the irregularity of the 7 Hz movie might cause complications when used for EEG experiments.<br />
<div>
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjcrK6Gg5MhfXlnB28em-uM53kxwTStgoDsYl5yTYjN6bb32DEyW_62a9xyhFLk_H2SCmLMWncl75fXEbsjxQ2_H7qh52bM_sLQkCUB_xPtLZy7sn_jx0EK5msAtal6vDRhXEIqOWzo1Vo/s1600/02-ZoomedWaveforms.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjcrK6Gg5MhfXlnB28em-uM53kxwTStgoDsYl5yTYjN6bb32DEyW_62a9xyhFLk_H2SCmLMWncl75fXEbsjxQ2_H7qh52bM_sLQkCUB_xPtLZy7sn_jx0EK5msAtal6vDRhXEIqOWzo1Vo/s1600/02-ZoomedWaveforms.png" height="164" width="560" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Zoomed-In waveforms recorded from the photocell during my blinking movies. Excerpts at three<br />
different blink rates are shown. The red and blue dots show features that I used to quantify each<br />
movie's blink rate. Note that the time scale is different for each of the movies so that you always<br />
see 4 periods, despite their increasing speed.</td></tr>
</tbody></table>
<br />
<u>Measuring the Blink Rate:</u> To better assess the steadiness of each movie, I set up a routine to quantify the blink rate on a blink-by-blink basis. I did this by, first, computing the mean sensor value for the whole recording. This is my threshold for deciding whether the screen is "white" or "black". This threshold value is shown by the horizontal black line in the excerpts above. Then, I detected when the signal crossed this threshold. Each threshold crossing is shown as a blue dot in the figures above. To compute the blink rate, I computed the time difference between the dots. The "white-to-white" blink rate is the difference between the red dots. Alternatively, to get the rate at which the screen merely changed (either white-to-black or black-to-white), I measured the difference between the blue dots.<br />
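This threshold-and-crossing procedure is easy to reproduce. Below is a minimal Python version (the function names and details are mine, not the code used for these figures), checked against an ideal 5 Hz white-to-white square wave, which should show 5 Hz between "red dots" and 10 Hz between "blue dots":

```python
import numpy as np

FS = 250  # photocell sampled at the OpenBCI packet rate (Hz)

def blink_rates(sig, fs=FS):
    """Blink rates from mean-threshold crossings: white-to-white rate
    (rising edges only, the 'red dots') and the any-transition rate
    (all crossings, the 'blue dots')."""
    above = (sig > sig.mean()).astype(int)
    edges = np.diff(above)
    crossings = np.flatnonzero(edges != 0) + 1  # any transition
    rising = np.flatnonzero(edges == 1) + 1     # black -> white only
    return fs / np.diff(rising), fs / np.diff(crossings)

# Ideal 5 Hz white-to-white blink (10 Hz counting both transitions);
# the phase offset keeps samples off the exact zero crossings
t = np.arange(0, 4, 1.0 / FS)
sig = (np.sin(2 * np.pi * 5 * t + 0.3) > 0).astype(float)
ww_rate, any_rate = blink_rates(sig)
print(round(ww_rate.mean(), 2), round(any_rate.mean(), 2))  # 5.0 10.0
```

On the real recordings, the spread of these per-blink rates around their mean is exactly the "messiness" discussed below.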
<br />
<div>
<u>Blink Rate Throughout the Test:</u> The plot below shows the results of quantifying the blink rate throughout the test. In red, the plot shows the white-to-white blink rate. In blue, I show the blink rate from both transitions. As expected, the blink rate counting both transitions is twice as fast as the blink rate when counting just from white-to-white.<br />
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhXtQYS-yfAmkDn4qK39hXWae8blbzMCweNwKclpvcThHX3NUUNWBMZ0FumQEgPx2x1cntYkJzGS7FLm-OFproEvov02dUh6tHB93q-ETFN1kOU-9I_GGPH_fpfGB5Vx0xs5UPccm6NXpI/s1600/03-BlinkRate.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhXtQYS-yfAmkDn4qK39hXWae8blbzMCweNwKclpvcThHX3NUUNWBMZ0FumQEgPx2x1cntYkJzGS7FLm-OFproEvov02dUh6tHB93q-ETFN1kOU-9I_GGPH_fpfGB5Vx0xs5UPccm6NXpI/s1600/03-BlinkRate.png" height="211" width="560" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Blink Rate Measured for my 10 Movies. The measured blink rate generally follows the expected blink<br />
rate, though the measured blink rate exhibits unsteadiness at the faster speeds. The blink rate when<br />
counting both transitions (white-to-black and black-to-white) is especially unsteady at the higher speeds.</td></tr>
</tbody></table>
<br />
<u>Unsteadiness:</u> As can be seen in the plot above, the blink rate is pretty steady for the first four movies (ie, speeds of 1-4 Hz W-W). For the 5th movie (5 Hz W-W), the plot above starts to look messier, especially the blue line. This means that the system is not playing back the blinking movie smoothly. As we get into the faster movies (6-9 Hz), the blue line gets extremely messy. Clearly, the system is unable to keep a steady pace of white-to-black and black-to-white transitions (the blue line), though the white-to-white period (the red line) isn't as bad. Funnily enough, at 10 Hz, the white-to-white blink rate gets very stable again. It seems that at 10 Hz, the individual movie frames must be well-aligned with the natural update rate of the video system on my computer. <br />
<br />
<u>Relationship to EEG Data:</u> The whole purpose of this investigation was to see if my EEG results from my previous post (copied again below, for convenience) were reflecting properties of my brain, or if they were reflecting artifacts from imperfections in my computer's movie playback. My main question with my EEG data is why I exhibited no video-entrained EEG signals above 10 Hz. Well, looking at the graph of the computer's blink rate (above), we see that the video blink rate becomes extremely unstable for any frequency above 10 Hz. My computer, in other words, was unable to generate steady visual stimulation above 10 Hz. Without stable stimulation, my brain had nothing to entrain with. Therefore, these limitations in my video system mean that I cannot declare either way whether my brain can entrain with visual stimuli at speeds greater than 10 Hz. With a more stable video system, maybe I could entrain with the faster blinks.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhZpig1RHwQCQfy3yowIU64BTiTMiGln8iHH1zAtq1GqOTI0rBiyWLNN4B2gIRN1m80KM-Gvg5bAQsnQEJodIT0ga3iwzSZOtyU7F_yiFJvM9VgvkLVKwhjWAsAcO18cge7F2iKqCUPDDw/s1600/Spectrogram-Case05.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhZpig1RHwQCQfy3yowIU64BTiTMiGln8iHH1zAtq1GqOTI0rBiyWLNN4B2gIRN1m80KM-Gvg5bAQsnQEJodIT0ga3iwzSZOtyU7F_yiFJvM9VgvkLVKwhjWAsAcO18cge7F2iKqCUPDDw/s1600/Spectrogram-Case05.png" height="214" width="560" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">EEG data shown in my previous post. This is the signal recorded from the back of my head (reference<br />
on left ear) when staring at my blinking movies. The signals marked by the blue arrows seem to indicate<br />
periods when my brain entrained with the video on every transition of white-to-black and black-to-white.<br />
The periods marked by the red arrows seem to indicate periods when my brain entrained on just the<br />
white-to-white blink rate.</td></tr>
</tbody></table>
<br />
<div>
<u>Next Steps:</u> With this system, I have proven that I can assess the steadiness of my video playback system. Steady playback is critical to inducing visual entrainment of brainwaves. So, as I move forward with trying to create a BCI based on visual entrainment, I can use this synchronized photocell recording to confirm that the video stimulation is sufficient to (hopefully) induce EEG responses. Let the development of the visual BCI begin!<br />
<div>
<br /></div>
</div>
Chip, 2014-05-04<br />
<br />
<u>Inducing Brain Waves with Visual Entrainment</u><br />
<br />
A while back, I had a friend come over and I measured his <a href="http://eeghacker.blogspot.com/2014/01/blinky-lights-visual-entrainment.html">EEG in response to staring at a blinking light</a>. We saw (as we hoped) that his brainwaves oscillated in sync with the blinking of the light. I thought that this visual entrainment (aka "<a href="http://en.wikipedia.org/wiki/Steady_state_visually_evoked_potential">steady-state visually evoked potential</a>") was pretty cool. Since then, I've learned that it can be used as the basis for a brain-computer interface (BCI). Because I'm still searching for a good BCI paradigm, I decided to return to my exploration of visual entrainment. Today, I'm going to show how I successfully used visual stimuli to induce brainwaves at different frequencies. As a result, I can now see a good avenue for an EEG-based BCI. Yes! Let's go!<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhm4LEeNXbojtoGD7S6roBnqNnLYpZoFtJ8tyjFKYuCunraKFRclHDm3lWGmfxIzAbD03wRVvXGN1b1x6t-KYFaif6tKwbhNmJVivdr6gpic3q9-k7_mPX16quH3GrXvcf4ZiWLj9bOvgI/s1600/IMG_2724.JPG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhm4LEeNXbojtoGD7S6roBnqNnLYpZoFtJ8tyjFKYuCunraKFRclHDm3lWGmfxIzAbD03wRVvXGN1b1x6t-KYFaif6tKwbhNmJVivdr6gpic3q9-k7_mPX16quH3GrXvcf4ZiWLj9bOvgI/s1600/IMG_2724.JPG" height="270" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Inducing SSVEP Using a Toggling Checkerboard Pattern on my Computer Screen</td></tr>
</tbody></table>
<u>Goal:</u> My goal today is to use visual stimuli to induce brainwaves across a range of frequencies. Because I want to use this for a BCI, I'm trying to determine what kind of visual stimuli I should use and what EEG frequencies I can induce. What does it take to make this work reliably?<br />
<br />
<u>Visual Setup:</u> In <a href="http://eeghacker.blogspot.com/2014/01/blinky-lights-visual-entrainment.html">my previous post</a>, my visual stimulation was simply a blinking head-lamp. It was effective (and really bright!), but I had no control over its blink frequency. As a result, I also had no control over the frequency of the brainwaves that it induced. So, for today's test, I needed to get fancier. I ditched the head-lamp and, instead, created a series of blinking movies that I could play back on my computer. I controlled the "blinking rate" by saving my movies at different frame rates. What exactly did the movie look like? Well, at first, because of a paper that I read in the VEP literature, my movie used the checkerboard pattern shown in the picture above. The movie toggled back-and-forth between this image and the inverse image (swap blacks and whites). While this worked OK, I later switched to a simpler movie (<a href="https://github.com/chipaudette/EEGHacker/tree/master/Matlab/makeBlinkingMovie">code here</a>) where the screen was simply all-white or all-black. That seemed to work better.<br />
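For the curious, here's a hypothetical Python sketch of the frame bookkeeping behind such a movie. My actual movies were made with the Matlab code linked above; this version instead fixes the frame rate and computes which frames should be white, using integer math so the toggle boundaries land on exact frames:

```python
import numpy as np

def blink_frame_values(toggle_hz, duration_s, fps=30):
    """Per-frame value (255 = all-white, 0 = all-black) for a movie that
    toggles between white and black at an integer toggle_hz.
    Integer arithmetic avoids float rounding at the toggle boundaries."""
    k = np.arange(int(duration_s * fps))   # frame indices
    toggle_index = (k * toggle_hz) // fps  # floor(k * toggle_hz / fps)
    return np.where(toggle_index % 2 == 0, 255, 0)

# A 10 Hz toggle (5 Hz white-to-white) at 30 fps holds each color 3 frames
frames = blink_frame_values(toggle_hz=10, duration_s=1, fps=30)
print(frames[:6])  # three white frames, then three black
```

Writing these frames out with any video encoder would give a movie like the ones used here, though (as the next post shows) whether the player actually displays them steadily is another matter.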
<br />
<u>EEG Setup:</u> Once I made my movies, I set myself up with my EEG system (<a href="http://www.openbci.com/">OpenBCI </a>). I used my usual gold cup electrodes with Ten20 EEG paste. I put one electrode on the back of my head (near O1) and I put another electrode on my forehead. My reference electrode was on my left ear lobe and my bias electrode was on my right ear lobe. Using the <a href="http://eeghacker.blogspot.com/2014/04/impedance-of-electrodes-on-my-head.html">impedance-measuring feature</a> of OpenBCI, the electrode on my forehead had an impedance of 24 kOhm and the one on the back of my head was about 65 kOhm. I couldn't seem to get the back electrode to a lower value.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjO6h6akoYuGaq5dh_TxJiXBQCGd7IAaLR_hKmjOrGKhhBl2LYPkUBkCyKMbUWEybzYemtWRhQQiXOErE0tH0C3RKOFyLM2KjnSbI8y4QdFexmRZvBIgg3euOVel8OlTxTz6MVw3xBP-kc/s1600/Electrodes+on+Head.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjO6h6akoYuGaq5dh_TxJiXBQCGd7IAaLR_hKmjOrGKhhBl2LYPkUBkCyKMbUWEybzYemtWRhQQiXOErE0tH0C3RKOFyLM2KjnSbI8y4QdFexmRZvBIgg3euOVel8OlTxTz6MVw3xBP-kc/s1600/Electrodes+on+Head.png" height="225" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Channel 1 was on my Forehead, Channel 2 was on the back of my head.<br />
My left ear lobe was my reference. My right ear lobe was the bias.</td></tr>
</tbody></table>
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi1CJZXMiObFP18wEqkYY8LI6M8wCasEAtLWsjxYwVLu1M1xb7O7MASTgrx7kuh7xZQHcyegBOlv4apq2OsaFDiVPcQV9jHFinkMjhnobjGdM_9opq3xnMeTemqa_YgcRaXdI7M1gadECg/s1600/IMG_2744.JPG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi1CJZXMiObFP18wEqkYY8LI6M8wCasEAtLWsjxYwVLu1M1xb7O7MASTgrx7kuh7xZQHcyegBOlv4apq2OsaFDiVPcQV9jHFinkMjhnobjGdM_9opq3xnMeTemqa_YgcRaXdI7M1gadECg/s1600/IMG_2744.JPG" height="266" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">I used my OpenBCI V1 board with Ten20 Paste.<br />
I think that those owl napkins are fun!</td></tr>
</tbody></table>
<br />
<u>Test Method:</u> Once I got everything set up, I launched the <a href="https://github.com/OpenBCI/OpenBCI/tree/master/Processing_GUI">OpenBCI GUI in Processing</a> and started an EEG recording. To play back my homemade blinking movies, I opened up Windows Media Player and set it to full-screen mode. I had ten movies, with each movie blinking at a different rate. I had WMP play all 10 movies continuously in sequence. Each movie was 20 seconds long, so the whole test took about 200 seconds. It was nighttime when I did this test and my room was dimly lit. I tried to stare at the screen and I tried to only blink my eyes at the transition between the different blinking rates.<br />
<br />
<u>Results, Checkerboard:</u> As usual, my preferred way to view the data is to make spectrograms. In the figure below, the top plot is the data from my forehead and the bottom plot is the data from the back of my head. From my forehead, there is nothing interesting except my eye blinks. From the back of my head, we see several interesting features, which I've marked with blue and white arrows. Note that these interesting features change every 20 seconds, which matches the length of my 20-second movies. It seems clear to me that these features are my brainwaves responding to the changing blink rate of my movies. Excellent!<br />
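If you want to make spectrograms like these yourself, SciPy makes it easy. Here's a minimal Python sketch (the 10 Hz sine below is just a stand-in for real EEG data; OpenBCI samples at 250 Hz):

```python
import numpy as np
from scipy.signal import spectrogram

fs = 250.0                                # OpenBCI sample rate, Hz
t = np.arange(0, 10, 1 / fs)
eeg = 10.0 * np.sin(2 * np.pi * 10 * t)   # synthetic 10 Hz "entrained" wave, uV

# ~1-second FFT windows give roughly 1 Hz frequency resolution
f, seg_t, Sxx = spectrogram(eeg, fs=fs, nperseg=256, noverlap=128)
peak_hz = f[np.argmax(Sxx.mean(axis=1))]  # frequency bin with the most power
```

Plotting `10*np.log10(Sxx)` versus `seg_t` and `f` gives a picture like the ones in this post.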
<u><br /></u>
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi69pJcLAgbzNaBc5UcDSbYwVTZPa4klEo75C-4sydVCxUR5w5V3rIqczo10wohs9ZILUvrSOoCNtmFBTmvLMM-ccrEsW4JzSfysX1nnCKxnHAcz6CpUobka5hIMobw5WRcl-ovpdYuE2c/s1600/Spectrogram-Case02.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi69pJcLAgbzNaBc5UcDSbYwVTZPa4klEo75C-4sydVCxUR5w5V3rIqczo10wohs9ZILUvrSOoCNtmFBTmvLMM-ccrEsW4JzSfysX1nnCKxnHAcz6CpUobka5hIMobw5WRcl-ovpdYuE2c/s1600/Spectrogram-Case02.png" height="410" width="560" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Spectrogram of EEG Signal Recorded While Watching the Blinking Checkerboard Pattern.<br />
The top plot is the signal from my forehead. The bottom plot is from the back of my head.<br />
"W-B" is the rate at which the movie switched from either white-to-black or black-to-white.<br />
"W-W" is the rate if you measure just from white-to-white.</td></tr>
</tbody></table>
<br />
<u>Entrained with the Blink Rate?</u> Looking at the three blue arrows, it appears that I have entrained brainwaves at 2 Hz, 4 Hz, and 6 Hz. At these times, any given square in my movie was blinking at 1 Hz, 2 Hz, and then 3 Hz, if you count from white period to white period ("W-W"). Because I have entrained brainwaves at 2x the white-to-white frequency, it suggests that it is NOT white-to-white that matters, but that it is the transition between white/black or black/white that matters. At least, that is what is implied for these three (out of 10) cases for the checkerboard stimuli.<br />
<br />
<u>Complications:</u> While that would be a fine conclusion, why does this rule not continue through the other 7 cases in this checkerboard test? Why does it only work for the three cases with the blue arrows? The cases with the white arrows do show some sort of EEG response, but not at any frequency that makes sense given the speed of my movies. What is going on? I've got two possible explanations: (1) either my movies are not playing back reliably during these other cases, or (2) the checkerboard pattern is too complicated to be a good starting point for learning about my brainwaves.<br />
<br />
<u>Modifying the Test:</u> Of these two possible explanations, it's easier for me to simplify the checkerboard than it is for me to fix the reliability of my movie playback. So, I changed my movies so that the whole screen is either all black or all white. Hopefully, this simpler visual stimulus will make my EEG response easier to understand. <br />
<br />
<u>Results, Whole-Screen Blinking:</u> After recording my EEG while staring at the new movies, the spectrograms of my data are shown below. Again, all of the interesting action is in the back of my head. The bottom plot shows that I got good entrainment of my brainwaves for nearly *all* of the new movies. I'm very pleased. I'm also very curious about the jump between the cases marked with blue arrows versus the cases marked with red arrows. What is happening here?<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg_fpv1bAd8DbI20kPJD0NCZBAF8QVsdPhjPTItPje6Tinb_CiH0Leca68nsdG1aJPfGdQkSMeSoyDOD9X9bb4_3xYSFtZd6r3MjEqalr3_nDHeYAmlnTc_QyVgkt3DkzKQGl68DtYx-uU/s1600/Spectrogram-Case05.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg_fpv1bAd8DbI20kPJD0NCZBAF8QVsdPhjPTItPje6Tinb_CiH0Leca68nsdG1aJPfGdQkSMeSoyDOD9X9bb4_3xYSFtZd6r3MjEqalr3_nDHeYAmlnTc_QyVgkt3DkzKQGl68DtYx-uU/s1600/Spectrogram-Case05.png" height="410" width="560" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Spectrogram of EEG Signal Recorded While Watching the Whole Screen Toggle White or Black.<br />
The top plot is the signal from my forehead. The bottom plot is from the back of my head.</td></tr>
</tbody></table>
<br />
<div>
<u>What Blink Rate Matters?</u> Looking at the first half of this plot, the blue arrows indicate cases that have results similar to the checkerboard data shown earlier. Here, my brain seems to respond to every <i>transition</i> from white to black and from black to white (aka, the "W-B-W" speed). But, for the second half of this plot, when the blinking is faster, it looks like my brainwaves follow the slower rate resulting from just the white-to-white frequency ("W-W"). Based on this weird result, I'm thinking that my brain doesn't actually care so much about whether the stimulus is W-B-W or W-W...it is simply sensitive to rhythmic visual stimuli in a certain frequency range. I'm thinking that, whenever a rhythmic stimulus falls in this frequency range, my brain will become entrained with it.<br />
<u><br /></u>
<u>Quantifying Entrainment vs Frequency:</u> If it's simply the frequency that matters, it would be good to see which frequencies yield the strongest entrainment. Sure, the spectrograms above suggest which frequencies are best, but I took the next step and actually measured the EEG response at each of the stimulation frequencies. The plot below shows the EEG amplitude that I measured for each of the visual blinking frequencies. Note that there are two lines: one counts the stimulation frequency from white-to-white (blue line), the other counts every white/black and black/white transition (red line). This graph suggests that I get decent responses in the 6-10 Hz frequency range. So, if I'm looking to use visual entrainment for a BCI, I should focus on the 6-10 Hz band.<br />
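This amplitude measurement boils down to reading the FFT magnitude at the bin nearest each stimulation frequency. Here's a minimal Python sketch of that kind of calculation (a reconstruction for illustration, not my exact analysis code):

```python
import numpy as np

def tone_amplitude(sig_uV, fs, f_target):
    """Amplitude (uV) of the spectral component nearest f_target,
    using a Hann window and one-sided FFT scaling."""
    n = len(sig_uV)
    win = np.hanning(n)                     # taper to limit spectral leakage
    spec = np.fft.rfft(sig_uV * win)
    freqs = np.fft.rfftfreq(n, 1 / fs)
    k = np.argmin(np.abs(freqs - f_target))  # bin nearest the target frequency
    return 2.0 * np.abs(spec[k]) / win.sum()

fs = 250.0
t = np.arange(0, 20, 1 / fs)
sig = 3.0 * np.sin(2 * np.pi * 8 * t)   # pretend 3 uV entrained response at 8 Hz
amp = tone_amplitude(sig, fs, 8.0)      # recovers ~3.0
```

Running this at each movie's W-W and W-B-W rates would produce the two lines in the plot below.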
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiKSwvdhWfPU2nrsDGnFrZr4tt9GHpyrECPfwTfFsOmtU-mqMuewQG1J4-3OVegmziiJB0ro1lK-HZJfzG7WFSdzJ1LwD-w7hQ8nszU3Wq3RtxWNGV8-2sXDw0tmXv2hq71OVik4OOeoe0/s1600/SummaryAmplitude.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiKSwvdhWfPU2nrsDGnFrZr4tt9GHpyrECPfwTfFsOmtU-mqMuewQG1J4-3OVegmziiJB0ro1lK-HZJfzG7WFSdzJ1LwD-w7hQ8nszU3Wq3RtxWNGV8-2sXDw0tmXv2hq71OVik4OOeoe0/s1600/SummaryAmplitude.png" height="222" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Amplitude of EEG Signals Induced by Visual Entrainment.<br />
My best responses seem to be in the 6-10 Hz band.<br />
That could be a good target frequency range for use in a BCI.</td></tr>
</tbody></table>
<br />
<u>Computer Could be Limiting my Performance:</u> As mentioned earlier, all of these results could be confounded by the possibility that my computer cannot reliably and steadily refresh my screen. Perhaps it can reliably handle the frequencies at 10 Hz and below, but is not steady above 10 Hz. Perhaps that's why my apparent response above 10 Hz falls off. Sure, my computer claims that the screen has a 60 Hz refresh rate, but that doesn't mean that Windows or that Windows Media Player can keep up. So, any next steps should include some method of assessing whether the computer is actually displaying my movies smoothly at the rate that I expect.<br />
<br />
<u>Entrainment for BCI:</u> My overall goal is to make a cool brain-computer interface (BCI). Because I am showing that I can successfully measure visual entrainment, I would like to further explore how visual entrainment could be exploited for a BCI. One idea is that I could simultaneously show two movies side-by-side, each blinking at its own rate. Perhaps, if I'm lucky, my brainwaves will only respond to the one movie that I'm actually watching. If that's the case, then I would have conscious control over my brainwaves (and, therefore, the BCI) simply by selecting which of the two movies I watch. That could be very cool.<br />
<br />
<br />
<u>Follow-Up:</u> I set up a photocell and my Arduino to measure the actual blink rate of the movies on my computer. <a href="http://eeghacker.blogspot.com/2014/05/measuring-video-blink-rate-with-arduino.html">In my results</a>, I found that I can't generate steady blinking faster than 10 Hz. This is probably a strong reason why my EEG recordings exhibited no entrainment above 10 Hz...how can I entrain to signals that aren't there?!?<br />
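Once you have the photocell samples, measuring the actual blink rate is just thresholding and counting transitions. A minimal Python sketch of the idea (the real measurement used an Arduino; the sample rate and square-wave test signal here are arbitrary examples):

```python
import numpy as np

def measured_toggle_hz(trace, fs):
    """Threshold a photocell trace at its mean and count light/dark
    transitions per second."""
    binary = (trace > trace.mean()).astype(int)
    transitions = int(np.sum(np.diff(binary) != 0))
    return transitions / (len(trace) / fs)
```

Comparing this measured rate against the movie's intended rate would reveal exactly where the playback stops keeping up.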
<br />
<u>Follow-Up: </u>I extended this work by having one movie blink at two different rates. I found that I could control my entrained brainwaves by choosing which of the blink rates I focused on. Pretty cool! If you're interested, you can see the results<a href="http://eeghacker.blogspot.com/2014/05/controlling-entrainment-through.html"> in this post</a>.<br />
<br />
<u>Follow-Up:</u> Interested in getting the EEG data from this post? Try downloading it <a href="https://github.com/chipaudette/EEGHacker/tree/master/Data/2014-04-30%20Visual%20Steady-State%20Evoked%20Potentials">from my github</a>!</div>
<b>Concentration - Birds Beat the Internet</b> (posted 2014-04-29)<br />
<br />
Now I'm getting serious. In my <a href="http://eeghacker.blogspot.com/2014/04/detecting-concentration.html">last post</a>, I finally saw for myself what "concentration" looks like in my EEG signals. And now I'm totally hooked. Now I want to expand my goals. How? Well, let's see how my concentration varies during natural activities, not just during my synthetic concentration exercise. Where to start? Well, how about at breakfast? For me, breakfast includes some eating, some Internet, some bird watching...good stuff! I hoped that my new EEG metric for "concentration" might reveal some interesting trends about my brain while breakfasting. And, as you'll see, I was not disappointed...<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEikhV_FrfcsWzvgZvdjS2gIXFHNXMs0yoMhRM_FAa7T_-PBk4z69qePXBxiwKr0myqTT6gJrnujqujof7AFa6DzFDw3_maJYha6uYUZYQedB4M_OO1PpXR_5AZVDbgaNPWNFMEgDZnTC-M/s1600/IMG_2716.JPG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEikhV_FrfcsWzvgZvdjS2gIXFHNXMs0yoMhRM_FAa7T_-PBk4z69qePXBxiwKr0myqTT6gJrnujqujof7AFa6DzFDw3_maJYha6uYUZYQedB4M_OO1PpXR_5AZVDbgaNPWNFMEgDZnTC-M/s1600/IMG_2716.JPG" height="266" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Today's Breakfast Attire...Electrodes on the Forehead and Ear Lobes.</td></tr>
</tbody></table>
<br />
<u>Goal:</u> My goal was to record some EEG signals to see how my concentration varies with different natural activities. Today, I recorded EEG while breakfasting.<br />
<br />
<u>Setup:</u> My setup for recording my EEG was similar to the previous post -- a gold electrode on the forehead, a gold electrode on my left ear lobe as reference, and an ear clip electrode on my right ear as bias. Today, I also added a second gold electrode to my forehead (Chan 1 is my left, Chan 2 is my right). The picture above shows their locations. To keep the wires out of my face (important for eating), I looped the electrode wires over my ears. I connected the electrodes to my <a href="http://www.openbci.com/">OpenBCI </a>V2 board (shown below) and recorded the data using my <a href="https://github.com/OpenBCI/OpenBCI">GUI in Processing</a>. My <a href="http://eeghacker.blogspot.com/2014/04/impedance-of-electrodes-on-my-head.html">electrode impedances</a> measured about 20 kOhm.<br />
<div>
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhFdebyILjroew2syh-pokZvMwsKcIrrTb1ED_ihJ8krWf-zivsBd9ZQLTZrvaO-i3aAivwMnCZiSPKxZJWwdsHOl9b1ltwWXK67cxhNonUQy6AL9Satq2LQOKHaWtw9Bc-tHJ3_-TYJng/s1600/IMG_2718.JPG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhFdebyILjroew2syh-pokZvMwsKcIrrTb1ED_ihJ8krWf-zivsBd9ZQLTZrvaO-i3aAivwMnCZiSPKxZJWwdsHOl9b1ltwWXK67cxhNonUQy6AL9Satq2LQOKHaWtw9Bc-tHJ3_-TYJng/s1600/IMG_2718.JPG" height="266" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">My Usual Connection to my OpenBCI Board. Confusingly, my electrode breakout<br />
is mislabeled..."SRB1" is actually SRB2.</td></tr>
</tbody></table>
<br />
<u>Procedure:</u> Since I wanted to record natural activities, I did not define a rigid test procedure prior to the test. Without a scripted procedure, it's really tough to know what you did (and exactly *when* you did it) during a long test such as this. To address this problem, I set up my camera to record a video of the whole test. That video is my "truth". In the movie (some example frames are below), I saw that I spent some time setting up the electrodes, some time eating my food, some time on the Internet (reading and writing), some time gazing out the window at the birds (my favorite part), and some time doing more work on the Internet. Finally, at the end, I did my regular EEG concentration test -- counting backwards by 3 from 100. I've got all this as one long EEG record.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEia5Po0wXDr2Td4OCg9weBP9Rk38ObYlMG06mC7N_DzTzO6-GFrTG5QlOtYILlFmhK32EoR3dY5_RAHP4dMVVuMQ0onPk_XEhaLo_3C7WmVOYh2G99YiN4wrGQ_ZKFL5_pBnxOti-7Zqvg/s1600/Pictures+at+Table.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEia5Po0wXDr2Td4OCg9weBP9Rk38ObYlMG06mC7N_DzTzO6-GFrTG5QlOtYILlFmhK32EoR3dY5_RAHP4dMVVuMQ0onPk_XEhaLo_3C7WmVOYh2G99YiN4wrGQ_ZKFL5_pBnxOti-7Zqvg/s1600/Pictures+at+Table.png" height="267" width="560" /></a></td></tr>
<tr><td class="tr-caption" style="font-size: 13px;">I used my camera to record a movie of me eating breakfast. I used this<br />
as a record of "truth" to see what activity caused what EEG signal.</td></tr>
</tbody></table>
<br />
<u>Data, The Quick Overview:</u> The EEG spectrogram below is the whole data record as seen by the electrode on the left side of my forehead. As you can see, there's a block of activity at the beginning (up to 240-300 sec). This is what was recorded while I was attaching the electrodes to my head. After that, there's a block of activity from 300-650 sec with some really crazy signals, followed by a long block with more typical EEG signals. What was happening during that crazy time? <br />
<br />
<div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh6DTokn3menH069XgT9ZnQJOMfUpBpoc23MPYvb1xxGa37NlxqZTu7BlDHP33WciFpjgBzN3dz46KMnDurUkSVm69Fh-e2wCsR3n1_Asiex06ciPGMEW98AQ6RnFR1uxYIesYOG5YE7Nk/s1600/01-Overview-Chan1.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh6DTokn3menH069XgT9ZnQJOMfUpBpoc23MPYvb1xxGa37NlxqZTu7BlDHP33WciFpjgBzN3dz46KMnDurUkSVm69Fh-e2wCsR3n1_Asiex06ciPGMEW98AQ6RnFR1uxYIesYOG5YE7Nk/s1600/01-Overview-Chan1.png" height="226" width="560" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">The Complete EEG Record During Breakfast. Chewing is clearly a very<br />
intense signal that masks all true EEG activity.</td></tr>
</tbody></table>
<u><br /></u>
<u>Chewing Destroys EEG Signals</u>: By aligning the EEG data with the movie, it is clear that this period from 300-650 seconds is when I was eating my breakfast. That morning, breakfast was some wheat Chex and grapefruit juice. Pretty exciting, no? Well, the EEG signals sure are exciting. See all that strong broadband red activity? That's the effect that chewing has on EEG. Dramatic! I don't know if the cause is muscle artifact or if it is the jiggling of the electrode wires (or both), but the signals are huge! If you zoom in (not shown), you can see each individual chew. So, if you wanted a "CCI" (a Chew-Computer Interface) in addition to a "BCI" (Brain-Computer Interface), an EEG system would be a great way to do it. But, if you wanted to see <b>brain</b>waves while eating (like I was hoping to see), the act of chewing will basically destroy your data.<br />
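Since chew artifacts are so much bigger than real EEG, a crude chew detector really is easy: just flag windows whose RMS amplitude is implausibly large for brainwaves. A minimal Python sketch (the window length and the 50 uV threshold are made-up illustrative values, not measured from this data):

```python
import numpy as np

def flag_artifacts(sig_uV, fs, win_s=0.5, thresh_uV=50.0):
    """Split the record into fixed windows and flag any window whose RMS
    exceeds thresh_uV as likely chew/motion artifact."""
    n_win = int(win_s * fs)                   # samples per window
    n = len(sig_uV) // n_win                  # number of whole windows
    rms = np.sqrt(np.mean(sig_uV[: n * n_win].reshape(n, n_win) ** 2, axis=1))
    return rms > thresh_uV
```

Dropping the flagged windows before any further analysis would keep the chewing from contaminating the "concentration" metric below.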
<br />
<u>The Rest of My Data:</u> After eating, I still had another 20 minutes (1200 sec) of EEG data, so it wasn't too sad that chewing destroyed the early part of my data. The spectrogram below zooms in on just the data after my chewing. This looks like a more normal EEG recording. Below the spectrogram, I show some processed results. Specifically, I show the magnitude of the EEG signal in just the 22-100 Hz band, which was chosen based on the "count backwards by 3" experiment in my previous post. So, if "counting backwards by 3" is considered "concentration", then this blue line is a measure of concentration. At least, it is a measure of one type of concentration. In the figure, note that my concentration level does seem to change in response to my different activities. I find this to be very cool.<br />
<br /></div>
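The blue "concentration" line is essentially a band-limited amplitude. Here's a minimal Python sketch of one way to compute such a metric (the Butterworth filter is my assumption for illustration, not necessarily the exact processing used here):

```python
import numpy as np
from scipy.signal import butter, filtfilt

def band_rms(sig_uV, fs, f_lo=22.0, f_hi=100.0):
    """RMS amplitude (uV) of the signal restricted to the f_lo..f_hi band."""
    # 4th-order Butterworth band-pass, applied forward-backward (zero phase)
    b, a = butter(4, [f_lo / (fs / 2), f_hi / (fs / 2)], btype="bandpass")
    return np.sqrt(np.mean(filtfilt(b, a, sig_uV) ** 2))
```

Sliding this over successive chunks of the recording gives a "concentration" vs time trace like the bottom plot.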
<div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjc1Z6ZIkEKxgeBIbJ1IHxcTruY3MnD5t2ZFjH8AJqHJhntpwC_VM2qKW1MJLmULOzlgdW6zXpJh5bCUIsMWFkTiOgOSo0S10_GQGA7VLb2vXcwmcY_wsedSdtHEFU7WQNQ3RlI5L7MQ0w/s1600/03-Chan1-withBP.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjc1Z6ZIkEKxgeBIbJ1IHxcTruY3MnD5t2ZFjH8AJqHJhntpwC_VM2qKW1MJLmULOzlgdW6zXpJh5bCUIsMWFkTiOgOSo0S10_GQGA7VLb2vXcwmcY_wsedSdtHEFU7WQNQ3RlI5L7MQ0w/s1600/03-Chan1-withBP.png" height="399" width="560" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Zooming in on the activity after my chewing. The top plot is the spectrogram of the data.<br />
The bottom plot shows the magnitude of the portion of the EEG in the 22-100 Hz band.</td></tr>
</tbody></table>
<u><br /></u>
<u>Birds are Better than the Internet:</u> Looking at the graph above, you can see that my concentration level starts pretty low while I'm working on the Internet. Surprisingly, the movie shows that I'm not passively reading. No, it shows that I am actively engaged (mostly typing a reply regarding a <a href="http://en.wikipedia.org/wiki/Theremin">theremin</a>). Given this engagement, I would have expected my concentration to be strong. Nope. Compare this to the next section of time, where I'm simply gazing out the window at the birds and trees. My apparent concentration level (or, at least, my EEG activity in the 22-100 Hz band) gets noticeably higher. Wow! Then, when I return to my Internet work, it drops strongly. I guess that birds are more stimulating than the Internet! Go birds!<br />
<u><br /></u>
<u>Stronger Concentration Today:</u> At the end of this test, I closed my eyes and relaxed, which caused the EEG signal level to drop, as expected. Then, I opened my eyes and did my concentration exercise where I count backwards by 3. This portion of my test repeats what I did in my previous post. In today's recording, however, my signal levels were much higher. As shown in the plot below, today's data shows 2.8 uV with my eyes closed and 7.6 uV while counting backwards. Compare this to the previous post where I showed only 2.0 uV and 3.4 uV, respectively. So, I was 3.4 uV and now I'm 7.6 uV. This means that my "concentration" intensity is more than twice as strong! Why? Was it because this data was from the morning, when I was fresher and could maybe concentrate "stronger"? I don't know. I do find it interesting, though.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgSTCP7lKGH5Xc0C4tb5focoDNWzYdf_5KQlAS-ui9HGQPigqE-eCFUeQ782T1MK9IwsaXrXxUrnTMzfNY16HaJ8aLCNfdvU-iNS1YIdP9FXy20EiW2z1kLtSNSfk0aAZdeE6P9X2JrRHU/s1600/04-Chan1-BPvalues.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgSTCP7lKGH5Xc0C4tb5focoDNWzYdf_5KQlAS-ui9HGQPigqE-eCFUeQ782T1MK9IwsaXrXxUrnTMzfNY16HaJ8aLCNfdvU-iNS1YIdP9FXy20EiW2z1kLtSNSfk0aAZdeE6P9X2JrRHU/s1600/04-Chan1-BPvalues.png" height="215" width="560" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Quantifying the EEG Signal Level During the Different Periods.</td></tr>
</tbody></table>
<br />
<u>Summary So Far:</u> Even with just this simplistic analysis, the data has been way more surprising than I would have guessed. I would have thought that breakfast would have been a little boring...I mean, I'm just <i>sitting </i>there. But this data has been surprisingly rich. Three things have surprised me:<br />
<ol>
<li>Chewing makes huge signals as seen by an EEG system</li>
<li>Birds and trees stimulate my brain* more than the Internet</li>
<li>My peak concentration level* can change a lot day-to-day</li>
</ol>
<div>
(* In both cases, "my brain" and "concentration level" really just mean "my EEG signals in the 22-100 Hz band". But it sounds a lot less exciting when said that way.)</div>
<div>
<br /></div>
<u>One More Thing...:</u> At this point, I figured that I was done. I mean, three new findings is certainly enough excitement for me. But then I remembered that I had data from the 2nd electrode that was on my forehead. We already looked at the data from the left electrode (spectrogram repeated below). What did the data from the right electrode show? Its data is shown as the 2nd spectrogram below, though it's not particularly exciting by itself...it shares many of the signatures seen in the first electrode. The excitement comes when I examine the "coherence" of the signals between these two electrodes. The coherence as a function of time and frequency is shown in the third plot. It looks pretty boring, except right there at the end. What is happening there?<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiIMs0QD3dT4DNPwVzJTSRMA1s-dOZ6iUbJJK4h2DezKY4AEA4bZ4qwI8abAp1nY_QYbMJY_B0Mnt0lqRU9N92CuWOPnoTxIBlux8OBPOSCre09R0enf2P0wEHMIELhES0BenTvrSzPf_M/s1600/05+-+Coherence.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiIMs0QD3dT4DNPwVzJTSRMA1s-dOZ6iUbJJK4h2DezKY4AEA4bZ4qwI8abAp1nY_QYbMJY_B0Mnt0lqRU9N92CuWOPnoTxIBlux8OBPOSCre09R0enf2P0wEHMIELhES0BenTvrSzPf_M/s1600/05+-+Coherence.png" height="534" width="560" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Measuring the Coherence Between the Left and Right Electrodes on my Forehead.<br />
For the "concentration" signals prior to counting backwards, the signals are<br />
not coherent. For the counting backwards, they are coherent. Why?!?</td></tr>
</tbody></table>
<u>What is Coherence?</u>: Coherence is a measure of how two signals move together -- if one signal gets stronger, does the other get stronger, too? If one gets weaker, does the other get weaker at the same time? Signals that move together have a high coherence (ie, a value near 1.0). Signals that do not move together have low coherence (near 0.0). I've analyzed the coherence a couple of times before, such as in this <a href="http://eeghacker.blogspot.com/2014/01/breathing-meditation-alpha-coherence.html">earlier post</a>. <br />
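For the curious, magnitude-squared coherence is available in SciPy. Here's a minimal sketch with two synthetic "forehead" channels that share a common 30 Hz component plus independent noise (all values arbitrary, for illustration only):

```python
import numpy as np
from scipy.signal import coherence

fs = 250.0
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(0)
shared = np.sin(2 * np.pi * 30 * t)            # activity common to both channels
left = shared + rng.standard_normal(t.size)    # left forehead + its own noise
right = shared + rng.standard_normal(t.size)   # right forehead + its own noise

f, Cxy = coherence(left, right, fs=fs, nperseg=256)
k = np.argmin(np.abs(f - 30.0))
# Cxy[k] is near 1 (the channels move together at 30 Hz);
# away from 30 Hz, the independent noise keeps coherence near 0
```

To get coherence as a function of time and frequency (like the plot above), you would repeat this calculation over successive short segments of the recording.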
<br />
<u>Today's Coherence Data:</u> For today's data, the coherence plot above shows a few interesting features. First, in the lower frequencies (10 Hz and below), this plot shows that the signals from the two electrodes on my forehead exhibit high coherence (the plot has a lot of red). OK. Above 10 Hz, though, the signals from these two electrodes are not coherent (blue). Fine. At the end, though, while I'm counting backwards, these higher frequency EEG signals suddenly become coherent (red). Whoa! What happened?!?<br />
<br />
<u>Counting Backward Must be Different:</u> If "concentration" is reflected as activity in the 22-100 Hz band, this coherence plot suggests that my "concentration" is different at the end compared to the rest of the test. It appears that the Internet and the gazing outdoors both induce independent (ie, not coherent) activity in the left and right sides of my forehead. Then, at the end, it appears that my counting exercise causes synchronized (ie, coherent) activity on both sides of my forehead. Counting backwards must require different brain activity than the concentration associated with the Internet and birds. While this sounds obvious, it is striking to see these objectively-recorded EEG signals say the same thing. I think that's amazing.<br />
<br />
<div>
<u>Next Steps:</u> I've discovered many features in this single recording that get me really excited. Before I get too excited, I should repeat the experiment. If these phenomena appear again (especially the finding regarding the coherence), I would feel a lot more confident that it is true. At that point, I would be really interested in seeing if something similar happens in other people. If so, perhaps it's a known phenomenon discussed in the literature. Perhaps there is a known cause and a description of the brain mechanism(s) in action. I'm interested to know!<br />
<br />
<u>Follow-Up:</u> Interested in getting the EEG data from this post? Try downloading it <a href="https://github.com/chipaudette/EEGHacker/tree/master/Data/2014-04-23%20OpenBCI%20EEG%20over%20Breakfast%20and%20Birds">from my github</a>!</div>
</div>
<b>Detecting Concentration</b> (posted 2014-04-22)<br />
<br />
A couple of weeks ago, <a href="http://www.sparkfun.com/">Sparkfun's </a>new product post was all about the <a href="https://www.sparkfun.com/products/12805">Neurosky Mindwave</a>. What really grabbed my attention was Nick Poole's video of his hack of using the Mindwave to bend a spoon. That was a really fun and creative way to use EEG to interact with the physical world. What also grabbed my attention was that it was yet another example of a consumer EEG system saying that it detects "concentration", as if it were a well-known and well-defined EEG signature. Along with terms like "focus" and "relaxation", I always felt that a word like "concentration" was too amorphous for serious consideration. I mean, what exactly do "concentration" brain waves look like? What is the signature? I don't know. But, given the coolness of Nick's demo, I decided to do some EEG Hacking to find out!<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<iframe allowfullscreen="" frameborder="0" height="315" src="//www.youtube.com/embed/WeQaNRgiwXQ" width="560"></iframe>
</div>
<br />
<u><br /></u>
<u>Neurosky Mindwave Electrode Setup</u>: I don't own a Neurosky Mindwave so I can't use that hardware to explore these "concentration" brain waves. But, I do have an <a href="http://www.openbci.com/">OpenBCI </a>system, and it's pretty flexible, so I'll try that instead. The main question is how to set up the electrodes. Looking at the videos for the Mindwave, and looking at the <a href="https://learn.sparkfun.com/tutorials/hackers-in-residence---hacking-mindwave-mobile/what-is-the-mindwave-mobile">Sparkfun hack pages</a>, the Mindwave appears to use an electrode on the forehead and then another on an ear clip. I'm assuming that the one on the ear clip is the reference electrode. It does not appear to use a bias electrode, probably because they found that it was not needed for this body-mounted, battery-powered system.<br />
<br />
<u>OpenBCI Electrode Setup:</u> To mimic the Mindwave setup, I put a gold cup electrode on my forehead and another on my left ear lobe. The one on my forehead was plugged into Channel 1 of my OpenBCI board and the one on my ear lobe was used as the reference. Because my system is not battery powered, I did use a bias electrode, which was an ear clip electrode placed on my right ear lobe (this is the first time I've tried the ear clip electrodes). I also chose to stick another gold cup electrode on the back of my head, just to see what happened back there during this experiment. Oh, and to attach my electrodes, I used standard <a href="http://www.mfimedical.com/weaver-ten20-conductive-paste.html">Ten20 conductive paste</a>. My <a href="http://eeghacker.blogspot.com/2014/04/impedance-of-electrodes-on-my-head.html">impedance check</a> showed about 30 kOhm for each electrode, so not too bad.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgGSq-RG9FfWPCwaDFS4aIdZAenBqZlzERjq6b6U0JShKRBibsb5qtjwCYQLMadb2buZvz5iUEMpBOQ3FbtJ_xA-ZHCEeg1JcNvaLM_JIVAD-v1awFPfCyli36146fF13FOvdRsjsmyGug/s1600/IMG_2687.JPG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgGSq-RG9FfWPCwaDFS4aIdZAenBqZlzERjq6b6U0JShKRBibsb5qtjwCYQLMadb2buZvz5iUEMpBOQ3FbtJ_xA-ZHCEeg1JcNvaLM_JIVAD-v1awFPfCyli36146fF13FOvdRsjsmyGug/s1600/IMG_2687.JPG" height="266" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Using OpenBCI (V2), 3 gold cup electrodes, and one ear clip.<br />
Oh, and some guy's brain, too.</td></tr>
</tbody></table>
<br />
<u>Procedure:</u> In his video, Nick says that he is able to trigger the Mindwave's concentration detector by mentally counting backwards by 3, starting from 100. This sounds pretty straightforward and he clearly had good success with it. Frankly, I was a little more skeptical about my own ability to make it happen. So, in my data, to make it clear to me where I was trying to concentrate, I closed my eyes for a short period before and after my mental counting. I did this because, by closing my eyes, I would generate strong alpha waves (10 Hz) that would clearly show up in the data. As a result, after the test, I could look for the data between the two alpha wave recordings...this would be the period when I was concentrating. Let's see what I got.<br />
<br />
<u>My First Look at the Data:</u> The spectrogram below shows how I typically look at an EEG recording for the first time. Note that frequency is on the vertical axis and time is on the horizontal axis. You can definitely see the signature of the alpha waves (that horizontal stripe around 10 Hz) at the beginning and at the end of my recording. In the middle is the period of time when I was concentrating. In this plot, I don't see anything interesting during the concentration portion of the test. I just see some "noise" that looks little different from everything around it. Bummer.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgEeVVtnb9pUDZTqCg3YMbs5tDCP4DiASGt13KRtcXqhkoSuBFGoJlngAX2W4UWeil_5sG-x5jxHYEkzeuTj4pA3s53BDwYLO7QPhpYLSMCVoG1pnDxpKMRyCE5KprFNoxMWOliDTyx0qY/s1600/02-Test12Overview.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgEeVVtnb9pUDZTqCg3YMbs5tDCP4DiASGt13KRtcXqhkoSuBFGoJlngAX2W4UWeil_5sG-x5jxHYEkzeuTj4pA3s53BDwYLO7QPhpYLSMCVoG1pnDxpKMRyCE5KprFNoxMWOliDTyx0qY/s1600/02-Test12Overview.png" height="228" width="560" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">You can definitely see the alpha waves from my eyes being closed. Good.<br />
But, is anything happening during concentration?</td></tr>
</tbody></table>
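If you want to make this kind of first-look spectrogram yourself, here's a minimal Python sketch using NumPy and SciPy. To be clear, this is not the code that made the plot above -- it's an illustration that assumes OpenBCI's 250 Hz sample rate and synthesizes a stand-in signal so that it runs on its own.

```python
import numpy as np
from scipy.signal import spectrogram

fs = 250.0  # assumed OpenBCI sample rate, in Hz

# Stand-in for one channel of recorded EEG (microvolts): white noise plus
# a strong 10 Hz "alpha" tone so the spectrogram has something to show.
rng = np.random.default_rng(0)
t = np.arange(0, 30, 1.0 / fs)
eeg = rng.standard_normal(t.size) + 5.0 * np.sin(2 * np.pi * 10.0 * t)

# 1-second FFT windows with 50% overlap give a time/frequency trade-off
# similar to the plots in this post.
f, t_spec, Sxx = spectrogram(eeg, fs=fs, nperseg=int(fs), noverlap=int(fs) // 2)

# Plot 10*log10(Sxx) with pcolormesh to get the usual dB view.
# Eyes-closed alpha appears as a horizontal stripe near the 10 Hz bin:
alpha_bin = np.argmin(np.abs(f - 10.0))
print("frequency bin closest to 10 Hz:", f[alpha_bin])
```

With a 1-second window, the frequency bins land on whole Hz, which is plenty of resolution for spotting the alpha stripe.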
<br />
<u>Higher Frequencies?</u> But then I remembered reading somewhere (like in one of <a href="http://eeghacker.blogspot.com/2013/10/eeg-frequency-bands-jorge-ochoa.html">my own early posts?</a>) that "concentration" was usually seen as increased activity in the higher EEG frequencies -- the so-called Beta waves (13-30 Hz). So, I replotted the data, this time zooming way out on the frequency axis. As you can see below, I'm now showing zero to 100 Hz. In this new plot, you can clearly see that there is more EEG activity when I was concentrating compared to when I was not. Now we're getting somewhere!<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiLjWXHwxBDfemld9MmT_Lhy6FFlw36Z7kimgiMjWaE9yOfpDCrIFrfOpwxdKA80ezO0c3__F9GWa6X7y8YJBFgxcV6QLmdYY4IkXLzlj-vglrSny_WDG5l4i22y5Q436g3yh6b6pNZhnk/s1600/03-Test12ZoomOut.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiLjWXHwxBDfemld9MmT_Lhy6FFlw36Z7kimgiMjWaE9yOfpDCrIFrfOpwxdKA80ezO0c3__F9GWa6X7y8YJBFgxcV6QLmdYY4IkXLzlj-vglrSny_WDG5l4i22y5Q436g3yh6b6pNZhnk/s1600/03-Test12ZoomOut.png" height="249" width="560" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">By zooming out to see the higher frequencies, it does look like there is more activity<br />
in the high frequencies (20-100 Hz) when I am concentrating. Cool! (Note: the dark<br />
horizontal stripe in the middle is the effect of my 60 Hz notch filter.)</td></tr>
</tbody></table>
<br />
<u>Comparing the Spectra:</u> While spectrograms like the one above are helpful for quick qualitative views of both time and frequency, it is difficult to be quantitative with a spectrogram. So, in the plot below, I show the average spectrum for a period of strong concentration (t = 90-130 sec) and I show the average spectrum for a period where my eyes were closed and my mind was especially quiet (t = 155-178 sec). As can be seen below, the two spectra are definitely different, especially for frequencies above 22 Hz. <br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEghhnHkzHKGV0ihTAKlbEQmwZK61QjLBZcyRBVjDmxB0Th7_xJiEp1C1TFJ6KHCnaBJ34H80XiklyP2KFvE5WJL90YX6xzpnDdzBycS4W60XFrkipO-NRklsyl4eW_1TcfIzSUu5iLFYAg/s1600/04-Test12Frequencies.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEghhnHkzHKGV0ihTAKlbEQmwZK61QjLBZcyRBVjDmxB0Th7_xJiEp1C1TFJ6KHCnaBJ34H80XiklyP2KFvE5WJL90YX6xzpnDdzBycS4W60XFrkipO-NRklsyl4eW_1TcfIzSUu5iLFYAg/s1600/04-Test12Frequencies.png" height="277" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Comparing the average spectrum while concentrating (t=90-130 sec)<br />
to the average spectrum with my eyes closed (t=155-178 sec). Note that<br />
above 22 Hz, concentration exhibits more signal energy.</td></tr>
</tbody></table>
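For anyone who wants to compute these window-averaged spectra themselves, here's a hedged Python sketch. Again, this is not my actual analysis code: the 250 Hz sample rate, the Welch settings, and the synthesized stand-in recording are all assumptions made so the example is self-contained.

```python
import numpy as np
from scipy.signal import welch

fs = 250.0  # assumed OpenBCI sample rate, in Hz

def mean_spectrum(eeg, fs, t_start, t_stop):
    """Average power spectral density over a time window given in seconds."""
    seg = eeg[int(t_start * fs):int(t_stop * fs)]
    return welch(seg, fs=fs, nperseg=int(fs))  # returns (frequencies, PSD)

# Stand-in recording: noise everywhere, plus a 10 Hz "alpha" tone after
# t = 150 sec to mimic the eyes-closed portion of the test.
rng = np.random.default_rng(0)
t = np.arange(0, 200, 1.0 / fs)
eeg = rng.standard_normal(t.size)
eeg[t > 150] += 5.0 * np.sin(2 * np.pi * 10.0 * t[t > 150])

f, psd_concentrating = mean_spectrum(eeg, fs, 90, 130)  # "concentrating" window
f, psd_eyes_closed = mean_spectrum(eeg, fs, 155, 178)   # "eyes closed" window

# The comparison in this post looks at the energy above 22 Hz:
hi = f > 22.0
print(psd_concentrating[hi].sum(), psd_eyes_closed[hi].sum())
```

Averaging over many 1-second Welch segments is what lets you make quantitative statements that a single spectrogram column can't support.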
<br />
<u>Detecting Concentration:</u> With the knowledge that, in my brain, "concentration" starts to show itself as increased EEG energy above 22 Hz, I can now contemplate building a concentration detector. The key is to filter my EEG data so that I can assess the intensity of EEG activity in frequencies above 22 Hz. Then, I'd pick a threshold to which I can compare the EEG intensity level. If my EEG signals are <b>stronger </b>than my threshold, my detector would say that I <b>am </b>concentrating. If I'm <b>weaker </b>than the threshold, my detector would say that I <b>am not</b> concentrating. Sounds pretty easy, right?<br />
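To make the recipe concrete, here's a sketch of such a detector in Python. This is an illustration, not production code: the 250 Hz sample rate is assumed, and the 2.7 uVrms threshold is the value I settle on from my own data later in this post -- your threshold would surely differ.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 250.0          # assumed OpenBCI sample rate, in Hz
THRESH_UVRMS = 2.7  # example threshold, from my own data later in this post

# 4th-order Butterworth band-pass covering the 22-100 Hz band of interest.
b, a = butter(4, [22.0 / (fs / 2), 100.0 / (fs / 2)], btype="bandpass")

def is_concentrating(eeg_uV):
    """True if the 22-100 Hz RMS of this EEG snippet (in uV) exceeds threshold."""
    filtered = filtfilt(b, a, eeg_uV)
    rms = np.sqrt(np.mean(filtered ** 2))
    return bool(rms > THRESH_UVRMS)

# Quick check: weak broadband activity stays under the threshold,
# while strong broadband activity trips the detector.
rng = np.random.default_rng(0)
n = int(2 * fs)  # two seconds of data
quiet = 0.5 * rng.standard_normal(n)
busy = 5.0 * rng.standard_normal(n)
print(is_concentrating(quiet), is_concentrating(busy))
```

A real-time version would run this on a sliding window of the most recent second or two of samples.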
<br />
<u>Applying to My Recorded Data:</u> In the figure below, I apply this idea to the data that we've been discussing. The top plot is the same spectrogram that I showed above. The bottom plot is what happens when I filter the EEG data to show the intensity just for frequencies between 22 and 100 Hz. As you can see, the trace does indeed move up and down to reflect whether I'm concentrating or not. Specifically, for the sustained concentration (t = 90-130 sec), my filtered EEG signal is running about 3.4 uVrms. Then, when I close my eyes and relax (after t = 140 sec), my EEG signals drop down to about 2.0 uVrms. So, if I were to define a threshold for detecting concentration, I might put it somewhere in the middle...say, around 2.7 uVrms.<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiE2YbtRYPOJsXvEt9D9BXVEicyEimncwKTMhTh0A4HVw4yCStK7MDod2_tt8Qh8hnM7sR4sSgwTfMnScVB8LregWJaJvacYFae1fdWnc8ELODY9uEspqpa0XBr97r09HxM_0ms40d2Xrw/s1600/05-Test12FilteredRMS.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiE2YbtRYPOJsXvEt9D9BXVEicyEimncwKTMhTh0A4HVw4yCStK7MDod2_tt8Qh8hnM7sR4sSgwTfMnScVB8LregWJaJvacYFae1fdWnc8ELODY9uEspqpa0XBr97r09HxM_0ms40d2Xrw/s1600/05-Test12FilteredRMS.png" height="416" width="560" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Measuring the EEG amplitude in the 22-100 Hz frequency band. Note how it is low while<br />
my eyes are closed and that it goes higher while concentrating.</td></tr>
</tbody></table>
<br />
<u>Feeling Some Success:</u> The plot above is making me pretty excited. It suggests that I have conscious control over my EEG signals. To date, I've only had strong success with controlling my Alpha waves (by opening and closing my eyes). I've also had some small success with <a href="http://eeghacker.blogspot.com/2013/11/waveforms-from-homemade-electrodes.html">Mu waves</a>, but they're really hard for me to get. So, seeing this concentration-induced Beta (13-30 Hz) and Gamma (30-100 Hz) is pretty darned exciting.<br />
<br />
<u>Criticism:</u> A critic reading this post might argue that I have not proven any link to concentration. A critic might say that the increased high frequency EEG energy could just be a natural result of opening my eyes. Based on the data shown so far, that would be a fair criticism.<br />
<br />
<u>Gathering More Eyes-Open Data:</u> To counter this criticism, the data below is from another test that I performed using the same setup. In this test, I performed a similar procedure where I started with my eyes closed, had a period with my eyes open, and then finished with my eyes closed. Unlike the previous test, though, I did not do my concentration exercise during the eyes-open period. As a result, we should be able to see whether the increased high-frequency EEG activity is due to concentration or due to simply having my eyes open. <br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhwO6H2nEimadUPIExOiUDGgcLoPYBLKtmVogqnb03CzsMXl43vHcbl1QafeFhlVXc063p04QISYtVclLW_JscpUklrrCFkhyphenhyphenOgbJQZVngyufB7TylKGKw7BwcdtQbX2WXx1O5RGbpauVc/s1600/07-Test10_NotConcentrating.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhwO6H2nEimadUPIExOiUDGgcLoPYBLKtmVogqnb03CzsMXl43vHcbl1QafeFhlVXc063p04QISYtVclLW_JscpUklrrCFkhyphenhyphenOgbJQZVngyufB7TylKGKw7BwcdtQbX2WXx1O5RGbpauVc/s1600/07-Test10_NotConcentrating.png" height="425" width="560" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">A second EEG test where I was NOT purposely concentrating during the eyes-open portion of the test. Note that the EEG intensity is much lower than seen during my previous test where I was purposely concentrating.</td></tr>
</tbody></table>
<br />
<u>Not Concentrating:</u> In the plot above, you can see that there is a trend in my EEG signal strength, but that it is not related to the opening of my eyes. At the beginning, when my eyes were closed, my high-frequency EEG signals were pretty low at 1.9 uVrms. Then, when I opened my eyes (t = 210 sec), my EEG intensity increased only slightly to 2.0 uVrms and stayed that way for quite a while. I think that this is strong evidence that simply opening your eyes does not specifically trigger increased Beta and Gamma activity.<br />
<br />
<u>Wandering Mind:</u> In the second half of my eyes-open period, we do see that my EEG intensity drifts upward. Eventually, it averages about 2.5 uVrms. Perhaps this increase reflects that I got bored and started thinking about my next EEG test. Regardless of the reason, you'll note that even the increase to 2.5 uVrms still does not exceed the 2.7 uVrms threshold that we set a couple of paragraphs ago. So, this small increase does not meet our criteria for "concentration".<br />
<br />
<u>Conclusion:</u> I think that this second data set is good evidence to declare that intense (>2.7 uVrms) Beta and Gamma activity is <b>not</b> due simply to opening my eyes. I am feeling pretty confident that the intense high frequency EEG activity seen in the first data set is due to my concentration. This means that Beta and Gamma activity is under my conscious control, which is the most exciting EEG result that I've had in a long time.<br />
<br />
<u>Next Steps:</u> Being under conscious control means that I could potentially use "concentration" as part of a <a href="http://eeghacker.blogspot.com/search/label/BCI">brain-computer interface</a> for future hacks. I'm always looking for ways that I can try to control things in the physical world using just my brain waves. Perhaps with some practice, I could use this technique to compete with Nick Poole in a spoon-bending competition!<br />
<br />
<u>Follow-Up:</u> I recorded my concentration level while eating breakfast, <a href="http://eeghacker.blogspot.com/2014/04/concentration-birds-beat-internet.html">and found some really cool changes</a>!<br />
<u>Follow-Up:</u> Interested in getting the data from this post? Try downloading it <a href="https://github.com/chipaudette/EEGHacker/tree/master/Data/2014-04-05%20Impedance%20and%20Concentration">from my github</a>!Chiphttp://www.blogger.com/profile/10352943033779293161noreply@blogger.com11tag:blogger.com,1999:blog-7276377053120174333.post-17364564839398337932014-04-11T22:12:00.001-04:002014-05-19T22:33:26.023-04:00Impedance of Electrodes on my HeadFollowing from my previous post on figuring out how to get OpenBCI to <a href="http://eeghacker.blogspot.com/2014/04/openbci-measuring-electrode-impedance.html">measure the electrode-to-skin impedance</a>, I figured that now is the time to actually measure the impedance of real electrodes on my actual skin. I decided to try two types of electrodes: (1) disposable ECG electrodes and (2) re-usable gold cup EEG electrodes. Here's the story of what I found...<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjQRWhcfN2kbfOh_XgXUoqYDDyjlAS3UC-u0VLK6KLqjpbZwBfgXRT6n43xKuZ4kDAsKAXMM-uyjNrLI_sxuTE5LiLFf50dZqHEWvuR9crZlO8ZwX-uH4eZuWiXEHBrU32_a7MmcvtMgvE/s1600/ChipPhotos.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjQRWhcfN2kbfOh_XgXUoqYDDyjlAS3UC-u0VLK6KLqjpbZwBfgXRT6n43xKuZ4kDAsKAXMM-uyjNrLI_sxuTE5LiLFf50dZqHEWvuR9crZlO8ZwX-uH4eZuWiXEHBrU32_a7MmcvtMgvE/s1600/ChipPhotos.png" height="282" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">On the left, I'm trying ECG electrodes. On the right, I'm trying<br />
gold cup EEG electrodes. In both cases, I'm looking pretty sharp.</td></tr>
</tbody></table>
<br />
<u>Disposable ECG Electrodes:</u> First, I decided to try some disposable ECG electrodes. These are cheap and really easy to use. They're not very good for using in your hair, but they're great for sticking on your forehead. For the reference and bias connections, I used an ECG electrode on the mastoid bone behind each of my ears. The picture below shows the pre-gelled, self-adhesive ECG electrodes that I used along with the clip-type ECG electrode wires (see<a href="http://eeghacker.blogspot.com/2013/11/collecting-ecg-with-my-eeg-setup.html"> this post</a> for specific recommendations). <br />
<div>
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjsMiZDdaPDWmuKuirbE-PjOIVzErdBsRi5Dmb3XMdN4IRKnsukTnR3GVoQwxFN0RuFaLgV63t8aLQLq-oB37sO6KHre6tn1r95z6kHDRxOKqbsyZn-o7YkWcjxPpB4jozQMVUFBChIlI0/s1600/IMG_2673.JPG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjsMiZDdaPDWmuKuirbE-PjOIVzErdBsRi5Dmb3XMdN4IRKnsukTnR3GVoQwxFN0RuFaLgV63t8aLQLq-oB37sO6KHre6tn1r95z6kHDRxOKqbsyZn-o7YkWcjxPpB4jozQMVUFBChIlI0/s1600/IMG_2673.JPG" height="266" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">ECG Electrodes and Clip Leads That I Used on my Head</td></tr>
</tbody></table>
<br />
Once I stuck three of these electrodes on my head (forehead and behind each ear), I connected the lead wires to my <a href="http://www.openbci.com/">OpenBCI </a>V1 board using <a href="http://eeghacker.blogspot.com/2013/11/making-eeg-electrode-adapter.html">my homemade adapter</a>.<br />
<div>
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiw2gKs8DUfoc4t1wQ3bjqZBF6AaAgIp0nX-OkyiCOk1LjyMKlXPW8w0N315WPoGcjk-ahZIJyDxqudlABvEJnkOzg4Ou3zWO-6E3NhZxbIW9Fs9vzj6kdwSBt4S96009urpGzfPIVN-mQ/s1600/IMG_2642.JPG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiw2gKs8DUfoc4t1wQ3bjqZBF6AaAgIp0nX-OkyiCOk1LjyMKlXPW8w0N315WPoGcjk-ahZIJyDxqudlABvEJnkOzg4Ou3zWO-6E3NhZxbIW9Fs9vzj6kdwSBt4S96009urpGzfPIVN-mQ/s1600/IMG_2642.JPG" height="268" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Connecting the ECG Leads to my OpenBCI V1 Board.</td></tr>
</tbody></table>
<br />
After getting all connected, I activated the ADS1299's "Lead Off" excitation signal for the channel that was connected to the ECG electrode on my forehead. As discussed in my previous post, the excitation signal is a 6 nA AC current source that the ADS1299 toggles at 31.2 Hz. The flow of current through the electrode creates a voltage that can be measured by OpenBCI just like a normal EEG signal. I configured my OpenBCI board to digitize the data and send it to the PC. On the PC, I used the <a href="https://github.com/OpenBCI/OpenBCI/tree/master/Processing_GUI">OpenBCI GUI</a> to view the data in real time and to record it for post-test analysis. <br />
<br />
A zoomed-in plot of the recorded waveform is shown on the left of the figure below. As you can see, it has a fairly large amplitude of 508 uVrms. This corresponds to an impedance of 120 kOhm. That's really big! Being surprised by that large value, I swapped the wires around so that I was measuring the electrode that I had been using as my reference (or "-") electrode. As seen in the waveform on the right, I got a very similar value. That's not cool.<br />
<div>
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjI6XfcefadQ6QH0YTbslAj3S3kGJqsyb0TypSPFO9766oSanCsRq430TvX0WVowBvwa45dqBvpoE4Vh8rlAa7BuH-b6L3bVSyzxzBEFOweXUkYHLkquooRCsRZ7Chd3lxe3kEKDtjNPhA/s1600/01-Traces-ECG_Traces.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjI6XfcefadQ6QH0YTbslAj3S3kGJqsyb0TypSPFO9766oSanCsRq430TvX0WVowBvwa45dqBvpoE4Vh8rlAa7BuH-b6L3bVSyzxzBEFOweXUkYHLkquooRCsRZ7Chd3lxe3kEKDtjNPhA/s1600/01-Traces-ECG_Traces.png" height="195" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Example Waveforms Recorded While Using OpenBCI To Measure the<br />
Electrode-to-Skin Impedance of (Left) the ECG Electrode on my<br />
Forehead and (Right) the ECG Electrode behind my Left Ear.</td></tr>
</tbody></table>
<br />
Zooming out so that you can see more of my recording, the figure below shows about a minute's worth of data. This is the full recording from which I made the excerpts shown above. The longer view below shows the story of me recording data for one electrode (up to about t = 119 sec), of how the signal goes away as I unplug and swap the electrode connections (from t = 120 to t = 132 sec), and then how the signal returns once I am connected to the other electrode. Again, you can see that I measured 120 kOhm from one electrode and 116 kOhm from the other electrode.<br />
<div>
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj64oipa2GLXNevc6U7qlZVou0v9qhm0FPJBV5omhGLTrmJD7Jsxo2cr6qxMjtTZ3azkoTS1N5T0TjZSeujGZNDuBXNrHKm_9AXqffz1GHW7BBkY_JigrcszKr3-yf_n_AzlFffRfZ3lrE/s1600/02-Traces-ECG_Story.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj64oipa2GLXNevc6U7qlZVou0v9qhm0FPJBV5omhGLTrmJD7Jsxo2cr6qxMjtTZ3azkoTS1N5T0TjZSeujGZNDuBXNrHKm_9AXqffz1GHW7BBkY_JigrcszKr3-yf_n_AzlFffRfZ3lrE/s1600/02-Traces-ECG_Story.png" height="460" width="540" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Zoomed-Out View of my Recordings Using the "Lead-Off Detection" Excitation<br />
while Using Disposable ECG electrodes.</td></tr>
</tbody></table>
<br />
<u>Gold Cup EEG Electrodes</u>: Because I found the impedance of the ECG electrodes to be surprisingly high, I tried using some <a href="http://www.mfimedical.com/reusable-gold-eeg-cup-electrodes.html">gold cup EEG electrodes</a>. The picture below shows the electrodes and the <a href="http://www.mfimedical.com/weaver-ten20-conductive-paste.html">conductive electrode paste</a> that I used. Like with the ECG electrodes, I put one of these on my forehead, one on the bone behind my left ear for the EEG reference, and one on the bone behind my right ear as the EEG bias.<br />
<div>
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj_LeuNC0Mt0ZZICWt8w1jEnwA0pIUK3OLHKa-AAGHMUo4Gl9096TX41mLKTZL0x0aGQuIPa55UDzAbJSc5ifpefU5x61SbJeCKkAW236gL8jjnzbhMEQbTBtFTepMa1SMmPCNU1k0QaWs/s1600/IMG_2672.JPG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj_LeuNC0Mt0ZZICWt8w1jEnwA0pIUK3OLHKa-AAGHMUo4Gl9096TX41mLKTZL0x0aGQuIPa55UDzAbJSc5ifpefU5x61SbJeCKkAW236gL8jjnzbhMEQbTBtFTepMa1SMmPCNU1k0QaWs/s1600/IMG_2672.JPG" height="266" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Gold Cup EEG Electrodes and Ten20 Brand EEG Paste</td></tr>
</tbody></table>
<br />
After putting on the electrodes, I activated the "Lead Off" excitation, like before. Some example waveforms from the data that I recorded are shown below. As expected, the waveform shape is the same as seen before, but the amplitude is different, which reflects the fact that the impedance of the electrode-to-skin interface is different. <br />
<div>
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjQnD1KYp9B4vX-EKNTRwcgrh8aIvGNQ_Vtm3f16UJG6sjQNoYNk8LsJOgUzSCbX1k7oqncI4KD8WPBlR9B18vhrnZE2OCWiS-TTpVuyb80GszZ2ECEI30O1LRJxCvSa-28mzTFX0-tGaw/s1600/03-Traces-EEG_Traces.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjQnD1KYp9B4vX-EKNTRwcgrh8aIvGNQ_Vtm3f16UJG6sjQNoYNk8LsJOgUzSCbX1k7oqncI4KD8WPBlR9B18vhrnZE2OCWiS-TTpVuyb80GszZ2ECEI30O1LRJxCvSa-28mzTFX0-tGaw/s1600/03-Traces-EEG_Traces.png" height="207" width="560" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Waveforms Recorded from the Gold Cup Electrodes During the "Lead Off" Excitation.</td></tr>
</tbody></table>
<br />
Notice that the three plots show decreasing amplitude, which means that I was getting better contact and lower impedance in each case. What was happening?<br />
<br />
Well, the first waveform (the one on the left) shows what I measured when I first attached the electrodes. It shows an RMS amplitude of 389 uV, which corresponds to an impedance of approximately 92 kOhm. This was still higher than I wanted, so I fiddled with the electrode and pushed it into my skin to try to make better contact. That's when I got the middle graph -- 230 uV and 54 kOhm. Finally, I pulled off the electrode, replaced the conductive paste, and really pressed and twisted the electrode against my skin. That's when I got the graph on the right -- 64 uV, which corresponds to 15 kOhm. That is more like the kind of value that I was hoping to see.<br />
<br />
Below is a zoomed-out plot of the whole scenario with the gold cup electrode. Again, this is the full record from which I made the excerpts above. On the left side of the plot, you can see the 389 uV, 92 kOhm condition that I showed before. Then, you can see my multiple attempts at re-seating and re-pasting the electrode. Finally, at the end, I got to the 64 uV / 15 kOhm condition. So, while it does take some effort, it is possible to improve the electrical contact between the electrode and your skin.<br />
<div>
<br /></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhEfvhWwuNcG-5QWwq4wWD_im1BlL_zl89pfWxijZ3hEyu8nRz9e-7WoZ0AtCAOn3RZMOAUPO8sK0c9CsuS15TXAEUcm-N0OTyPBUriXM-tTErHkJbOB1bcOl-y00B1CYZyQ-vteIWJhaw/s1600/04-Traces-EEG_Story.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhEfvhWwuNcG-5QWwq4wWD_im1BlL_zl89pfWxijZ3hEyu8nRz9e-7WoZ0AtCAOn3RZMOAUPO8sK0c9CsuS15TXAEUcm-N0OTyPBUriXM-tTErHkJbOB1bcOl-y00B1CYZyQ-vteIWJhaw/s1600/04-Traces-EEG_Story.png" height="530" width="580" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Zoomed-Out View of my Recordings Using the "Lead-Off Detection" Excitation<br />
while Using the Gold Cup EEG Electrodes</td></tr>
</tbody></table>
<br />
<u>Why Were the ECG Electrodes So Bad?</u> This experiment started with the ECG electrodes, which yielded a very high impedance of 120 kOhm. If I could only use the ECG electrodes, this high impedance value would have prompted me to remove the electrode, to scrub the skin (hard!) with alcohol and a rough pad, and then to attach a new electrode. Maybe that would have lowered the impedance, or maybe not. If it hadn't helped, the problem could have been that my ECG electrodes are really old. If you look really closely at my picture showing the electrodes, you'll see that the ECG packet in the background shows a date of "June 2012". Yikes! I have a friend who is developing a hacker-friendly EMG system (go <a href="http://www.flexvoltbiosensor.com/">FlexVolt</a>!) who has mentioned to me that he has seen difficulty when using old ECG electrodes. So, I'm thinking that maybe disposable electrodes have a limited shelf life...and that 2 years is maybe too old.<br />
<br />
<u>Next Steps</u>: With the impedance monitoring working on my OpenBCI board, I'm hoping that it will enable me to make more reliable EEG recordings. Hopefully, getting good low-impedance connections will increase my chances of detecting those low-level signals that have vexed me with their sometimes-I-see-them and sometimes-I-don't behavior. In particular, I'm thinking about those pesky <a href="http://eeghacker.blogspot.com/2013/10/finding-my-mu-waves.html">Mu waves</a> that have been hard for me! We'll see if this impedance checking can help...<br />
<div>
<br />
<div>
Thanks for reading!<br />
<br />
Follow-Up: Want to get the data from this post? Try downloading it <a href="https://github.com/chipaudette/EEGHacker/tree/master/Data/2014-04-05%20Impedance%20and%20Concentration">from my github</a>!<br />
<br /></div>
</div>
Chiphttp://www.blogger.com/profile/10352943033779293161noreply@blogger.com11tag:blogger.com,1999:blog-7276377053120174333.post-86419339028327972762014-04-10T07:01:00.001-04:002019-12-03T08:40:15.119-05:00OpenBCI: Measuring Electrode ImpedanceAn important driver of EEG signal quality is how well the electrodes are electrically connected to the skin. Common clinical and research guidance often says to use skin cleansing and skin abrasion to get the electrode-to-skin impedance down below <a href="http://www.acns.org/pdf/guidelines/Guideline-3.pdf">10 kOhm</a> or even <a href="http://www.medicine.mcgill.ca/physio/vlab/biomed_signals/eeg_n.htm">5 kOhm</a>. If you don't, you can get noisy or unrepeatable measurements. Most EEG systems allow you to measure the impedance at each electrode. The <a href="http://www.openbci.com/">OpenBCI</a> system that I have been helping to develop is also capable of doing this (see other's work on <a href="http://www.openbci.com/forums/topic/ads1299-electrode-impedance-measurement-algorithm/">using the ADS1299 to measure impedance</a>) but I have not taken the time to figure out how to use it. Until now...<br />
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhfv9DY9M2_iVGoWVHQmoiXE-W5tZK63AoiJvD50BhsTyPFf5MhntN_y42oEFfYPeQz6iSizXUKJpXsf0Y5h1REUNUB5IA-OH9jHW5ByZXr_FmQ5ro8N9KyG-O2ZTRARm0CF9HA6XovQU8/s1600/OverallSetup.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="295" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhfv9DY9M2_iVGoWVHQmoiXE-W5tZK63AoiJvD50BhsTyPFf5MhntN_y42oEFfYPeQz6iSizXUKJpXsf0Y5h1REUNUB5IA-OH9jHW5ByZXr_FmQ5ro8N9KyG-O2ZTRARm0CF9HA6XovQU8/s1600/OverallSetup.png" width="580" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Measuring Electrode Impedance Using the ADS1299's "Lead Off" 6nA Current Source</td></tr>
</tbody></table>
<u><br />Impedance from Voltage and Current:</u> The main idea with measuring the impedance of the electrode-to-skin interface is for the EEG system to inject a known current through the electrode and to measure the resulting voltage difference. Since V = I*R, you can easily compute impedance "R" by taking the measured voltage "V" and dividing by the known current "I". Pretty easy, right? Well, how do you inject the current? And how do you measure the voltage across the electrode-to-skin interface?<br />
<div class="separator" style="clear: both; text-align: center;">
</div>
<br />
<u>Injecting the Known Current:</u> The core of the OpenBCI board is the ADS1299 integrated circuit from Texas Instruments. It has a feature called "<a href="http://e2e.ti.com/support/data_converters/precision_data_converters/f/73/p/238755/959214.aspx#959214">Lead Off Detection</a>" that does this trick of injecting a known current into each electrode. As you can see in the figure above, a very small current (shown as 6 nA) is forced into the electrode line by a current source built into the ADS1299. So, no matter how much resistance or impedance is between the current source and ground (within reason, of course), the system forces 6 nA through the electrode to ground.<br />
<div>
<br />
<u>What is Ground?</u> Unless you are sitting in salt water or touching something big and metal, it is unlikely that your body is connected to ground. To address this, an EEG system often provides an extra connection in addition to the regular electrodes. This extra connection is usually called something like "<a href="http://e2e.ti.com/support/data_converters/precision_data_converters/f/73/t/314891.aspx">bias</a>", or "<a href="http://www.bci2000.org/wiki/index.php/User_Tutorial:EEG_Measurement_Setup">driven ground</a>", or "<a href="http://en.wikipedia.org/wiki/Driven_right_leg_circuit">driven right leg (DRL)</a>". The purpose of this connection is to keep your body's DC voltage level within an acceptable range and to keep any common-mode AC signals in your body minimized. Therefore, the bias line will act to source or sink whatever current is necessary (within reason) to minimize your DC and common-mode AC signals. As a result, for the 6 nA current that we are injecting, "ground" is the bias driver, as shown in the figure above.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
</div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEht1tV5BOWRKwf7r_1ivDVduRoMdukei49xmlunBAL7T522on0phgH3BnOjVnGolaKmabFgjbvpHDAdjhWugV12L16bS9rDmXBjzJOceYrBOCP-LnaYeJzC9Mz0E5gjV83vAUXj0q_VRqQ/s1600/CircuitModel.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="400" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEht1tV5BOWRKwf7r_1ivDVduRoMdukei49xmlunBAL7T522on0phgH3BnOjVnGolaKmabFgjbvpHDAdjhWugV12L16bS9rDmXBjzJOceYrBOCP-LnaYeJzC9Mz0E5gjV83vAUXj0q_VRqQ/s1600/CircuitModel.png" width="367" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Making a Mental Model of Where the Current Goes</td></tr>
</tbody></table>
<br />
<u>Measuring Just the Electrode-to-Skin Voltage:</u> If the bias driver is our ground, the figure shows one way to model where the current goes. Note that the current passes through several unknown impedances on its way to ground. How do we evaluate just the impedance of the "+" electrode's interface to the skin? The answer is to remember that an EEG system measures the voltage between its "+" input and its reference (or "-") input. Because of the high input impedance of the differential amplifier, effectively no current flows into the "-" electrode line, so none of its impedance elements matter (for the purpose of this measurement). Therefore, we measure just the voltage drop across the first three elements -- the 5K in-series resistor, the electrode-to-skin impedance, and the impedance of a portion of the human body. Because the series resistor is known, and because the impedance of the body is too small to matter, we have only one unknown remaining -- the impedance of the electrode-to-skin interface. <br />
<br />
<u>Calculating the Impedance:</u> The model above shows the electrode-to-skin interface as a simple resistor. While this is not quite right (it has a capacitive component as well), we can use this model to roughly estimate the number we need. Following from the basic V = I*R, we shuffle the terms to get R = V/I. We know both "V" (the measured voltage drop) and "I" (the known 6 nA current), so we easily get "R". The only trick is to make sure that you're using compatible units for voltage and current. In my case, I'm measuring the voltage as an *RMS* value (not an amplitude value). My current value (6 nA), however, is an amplitude value, not an RMS value. So, to make the units compatible, my calculation must include a factor of sqrt(2) to convert the RMS value into an amplitude:<br />
<br />
<span style="font-family: "courier new" , "courier" , monospace;">R = (Measured Voltage * sqrt(2))/(Known Current) </span><br />
<br />
Finally, remember that the "R" here is the series resistance of the electrode-to-skin interface plus the 5K resistor built into the OpenBCI board. So, to get the impedance of just the electrode-to-skin interface, you need to subtract 5K.<br />
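Here's a minimal Python sketch of this calculation. The 6 nA current and 5 kOhm series resistor are the values from this post; the function name is just for illustration:<br />

```python
import math

def electrode_impedance_ohm(v_meas_rms, i_drive_amp=6e-9, r_series=5000.0):
    """Estimate the electrode-to-skin impedance from the lead-off test signal.

    v_meas_rms  -- measured test-signal voltage on the channel, in Vrms
    i_drive_amp -- the "known" lead-off current, as a peak amplitude (6 nA)
    r_series    -- the board's in-series resistor (5 kOhm on OpenBCI V1)
    """
    # Multiply the RMS reading by sqrt(2) so both V and I are amplitudes
    v_amp = v_meas_rms * math.sqrt(2)
    # Ohm's law gives the total impedance seen by the current source...
    r_total = v_amp / i_drive_amp
    # ...and subtracting the series resistor leaves just the electrode interface
    return r_total - r_series

# For example, a 100 uVrms test-signal reading would imply roughly 18.6 kOhm
print(round(electrode_impedance_ohm(100e-6)))
```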
<br />
<u>Testing on OpenBCI:</u> To confirm that this all works in real life and not just on paper (or, um, just on a blog page), I tried it out on one of my OpenBCI boards. The simplest test that I could devise was to use clip leads (see picture below) to jump together the electrode connectors. Specifically, I connected four of the "+" electrode connectors to the single reference (ie, "-") electrode connector, which is then jumped to the bias electrode connector. This configuration eliminates all of the electrode-to-skin impedances and the human body impedances. The only impedances remaining are the 5K series resistors. Hopefully, when I do my voltage measurements and divide by the 6 nA current, I'll get a number close to 5K.<br />
<u><br /></u>
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiOxOVE-Uk8sJf21qHaXaFEwKmhAUhZelXRlJjkQr92TOcxWhuihfarbOpv63SUSJZv-RO0hxb-MWJ9qCUQJ5GRoTRJMlN-iqBvqOboFocWLvlnS4cNnYc5_UjA1AXwRvN9D4-F2hK4rew/s1600/OpenBCI_V1_Shorted_1-4_toRefAndtoBias.JPG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="266" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiOxOVE-Uk8sJf21qHaXaFEwKmhAUhZelXRlJjkQr92TOcxWhuihfarbOpv63SUSJZv-RO0hxb-MWJ9qCUQJ5GRoTRJMlN-iqBvqOboFocWLvlnS4cNnYc5_UjA1AXwRvN9D4-F2hK4rew/s1600/OpenBCI_V1_Shorted_1-4_toRefAndtoBias.JPG" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Testing the Impedance Measurements on an OpenBCI V1 Board.<br />
The colored wires are directly connecting (ie, shorting) the electrodes<br />
to the reference electrode and to the bias electrode.</td></tr>
</tbody></table>
<br />
<u>Configuring for the Test:</u> Because I only had a few clip leads, I could only jumper four of the eight "+" electrode connectors. As a result, when I ran OpenBCI, I only activated channels 1-4. I configured the "Lead-Off Detection" settings to generate a 6 nA current source at a frequency of 31.2 Hz. I then started to activate the current sources on the "P" side of each EEG channel. I saw that I could activate and deactivate the signal on any given channel without affecting the signals seen on the other channels. Excellent. <br />
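For the curious, here is my understanding of how these settings map onto the ADS1299's lead-off registers, sketched in Python. The register addresses and bit fields are from my reading of the TI datasheet, so double-check them against your copy; the OpenBCI firmware handles this for you:<br />

```python
# ADS1299 lead-off configuration sketch, per my reading of the TI datasheet.
# LOFF (0x04) bit layout: COMP_TH[7:5] | reserved | ILEAD_OFF[3:2] | FLEAD_OFF[1:0]
LOFF_ADDR = 0x04
ILEAD_OFF_6NA = 0b00 << 2   # 6 nA drive (01 = 24 nA, 10 = 6 uA, 11 = 24 uA)
FLEAD_OFF_31HZ = 0b10 << 0  # AC test signal at 31.2 Hz (01 = 7.8 Hz, 00 = DC)
loff_value = ILEAD_OFF_6NA | FLEAD_OFF_31HZ

# LOFF_SENSP (0x07) has one bit per channel; a set bit injects the test
# current into that channel's "P" input.  Channels 1-4 are the low four bits.
# (LOFF_SENSN, at 0x08, does the same for the "N" inputs.)
LOFF_SENSP_ADDR = 0x07
loff_sensp_ch1_to_4 = 0b00001111

print(hex(loff_value), hex(loff_sensp_ch1_to_4))
```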
<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgsbLrAm_0NxkJ9SL5o_6SwSmKaT1RePafVgxGTFRL_blsScSiaP9AP7TKXE163XzapQDc7vHQxIHFk5po07lFFELzTM_MUt-v4h-RZ1_Y4wQjEDKFUIRjDV41fBYY98A2sdpq0MmyrE_c/s1600/OpenBCI-2853_fourElecImpedanceTest.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="370" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgsbLrAm_0NxkJ9SL5o_6SwSmKaT1RePafVgxGTFRL_blsScSiaP9AP7TKXE163XzapQDc7vHQxIHFk5po07lFFELzTM_MUt-v4h-RZ1_Y4wQjEDKFUIRjDV41fBYY98A2sdpq0MmyrE_c/s1600/OpenBCI-2853_fourElecImpedanceTest.jpg" width="580" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Screenshot of the OpenBCI GUI when running an impedance test on channels 1-4. Click to zoom.<br />
Note that the ~31 Hz test signal is present in the first four channels<br />
and that the resulting per-channel impedance is 5.4-5.8 kOhm.</td></tr>
</tbody></table>
<br />
<u>Results</u>: With all four current sources active, I got the results seen in the screenshot above. As you can see (click to enlarge), a 31 Hz signal is present in the first four channels. You can see this in the time-domain montage on the right and in the frequency spectrum plot in the bottom-left. On the time-domain plot, you can see that my labels indicate that the voltage induced on each EEG channel is between 23.0 uVrms and 24.5 uVrms. Using my equation from a few paragraphs back, this yields impedance estimates of 5.43 to 5.77 kOhm. Since I'm expecting to see the resistance of each channel's 5 kOhm resistor, this corresponds to an error of 9-15%. Given that the ADS1299 datasheet says that the "known" current is only known to +/- 20%, I find my result to be very satisfying. I'm feeling pretty happy right now.<br />
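Running the numbers from the screenshot through the formula from earlier comes out within rounding of the figures quoted above. A quick Python sanity check:<br />

```python
import math

I_DRIVE = 6e-9       # known lead-off current amplitude, amps
R_EXPECTED = 5000.0  # each channel's in-series resistor, ohms

for v_rms in (23.0e-6, 24.5e-6):           # measured extremes, in Vrms
    r_total = v_rms * math.sqrt(2) / I_DRIVE   # total impedance estimate, ohms
    error = (r_total - R_EXPECTED) / R_EXPECTED
    print(f"{v_rms * 1e6:.1f} uVrms -> {r_total / 1e3:.2f} kOhm "
          f"({error:.0%} above the expected 5 kOhm)")
```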
<br />
<u>Super-Advanced Topics:</u> With the method described above, you can measure the electrode-to-skin impedance of the electrodes attached to the "P" inputs of this system. The question remains of how to measure the impedance of the reference electrode (which is attached to the "N" inputs) and of the bias electrode. The short answer for the reference electrode is that you can use the analogous ADS1299 feature for the "N" inputs (as long as you do not use SRB1 as a cheater way to mux the REF to all of the "N" inputs, which is what we do in OpenBCI V1). And the short answer for the bias electrode is that the ADS1299 can detect whether it is attached, but it cannot measure its impedance. Luckily, the impedance of the bias electrode is not as relevant.<br />
<br />
<u>Overall Success:</u> So, I'm feeling pretty good about getting this impedance measuring to work. The code for implementing this has been pushed to the <a href="https://github.com/OpenBCI/OpenBCI">OpenBCI GitHub</a>. The next step is to do some impedance measurements using actual electrodes on my actual head. Look for a follow-up post!<br />
<br />
<u>Follow-Up:</u> I measured the impedance using disposable ECG electrodes as well as re-usable gold cup EEG electrodes. You can see the results <a href="http://eeghacker.blogspot.com/2014/04/impedance-of-electrodes-on-my-head.html">here</a>.<br />
<br />
Follow-Up: I linked this post to answer a question on the Texas Instruments user forums (regarding the Texas Instruments ADS1299 chip at the heart of OpenBCI). This post was recognized by TI for a <a href="http://e2e.ti.com/group/helpcentral/b/weblog/archive/2015/08/06/ti-community-awards-august-2015">TI Community Award, Aug 2015</a>. Thanks, TI!</div>