UMassTTS: OpenBCI Tech Talk

July 24, 2019



Oh, the YouTube audio is playing in the background, that's why. There you go. I can just take this YouTube link and copy it myself, and then we can start.

Host: Maybe we want to get started, because I do have to sign off promptly at 6 p.m.

Connor: Sounds good. How's everyone doing? You're all UMass students? Cool. Are you in any specific department, or from all around? Show of hands: who has experience working with EEG? Not many; I can't see the back of the room, so, okay. How about any type of bio-tracking, bio-signals, bio-data? Nothing? Cool, a bunch of noobs.

Well, my name is Connor. I'm co-founder and CEO of OpenBCI. I'll give you a little camera tour. That's Joel, my business partner and fellow co-founder; you met him earlier. There's Irene, our resident neuroscientist, currently at the Ultracortex assembly table. These are our 3D printers, and we've got a bunch of other stuff. At any rate, you are peering into the OpenBCI laboratory, workshop, HQ, whatever you want to call it.

Are you all familiar with the device on my head? A little bit? This is the Ultracortex, the latest version, the Mark 3 "Supernova". Crazy name, I know. It has a 16-channel EEG amplifier embedded in the back. I'll show you a different headset, too. Here's another version: this is the eight-channel version of the board. This is an OpenBCI board that can take in eight channels of EEG, EMG, or ECG data. It's similar to an Arduino in the sense that it's programmable: you can change the firmware, you can adjust the settings, and there are a few GPIO pins you can use for inputs and outputs, so you can add additional sensors. It's also extendable into a 16-channel version: you put what we call the Daisy module onto the board, plug it in, and you double the inputs to 16 channels.
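For anyone who wants to pull that stream into their own code, here is a minimal sketch using the pyOpenBCI Python package, one of several OpenBCI client libraries. The serial port name is machine-specific, and the microvolt scale factor assumes the Cyton's default gain of 24.

```python
# Minimal sketch: stream raw samples from an OpenBCI Cyton board,
# assuming the pyOpenBCI package (pip install pyOpenBCI).
from pyOpenBCI import OpenBCICyton

UV_PER_COUNT = 4500000 / 24 / (2**23 - 1)  # ADS1299 counts -> microvolts at gain 24

def handle_sample(sample):
    # sample.channels_data holds one raw integer per channel
    print([count * UV_PER_COUNT for count in sample.channels_data])

# daisy=True would enable the 16-channel Daisy configuration described above
board = OpenBCICyton(port='/dev/ttyUSB0', daisy=False)
board.start_stream(handle_sample)  # blocks, calling handle_sample per sample
```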
Taking a step back: is everyone familiar with EEG to a certain degree? EEG stands for electroencephalogram, and it's the measurement of electrical signals from the scalp. When you're measuring from the head, you get not just brain data but also muscle data, from the muscles around your ears and your eyes and the muscles you use to close your jaw. So when you're wearing a headset like this, you can pick up micro-expressions from your face; every time your eye blinks, all of that data can be recorded and very easily seen. The brain data is a much smaller signal. Honestly, I'm personally just as interested in EMG data, the muscle data, as I am in EEG data, the actual brain activity.

What I'm going to do now is give you a live demo, so let me share my screen, if I can figure out how to do this... ah, there you go. Can you see my screen? Cool. Before the demo, one step back: this is our website, OpenBCI, and you can see a spinning 3D model of a head wearing the headset that I'm wearing. I'll go into more detail about the website later, if you want to dig deeper into the platform. But right now I'm going to show you our go-to software, which we call the OpenBCI GUI. Who here is familiar with Processing as a development language? Processing is a Java-based creative coding language, developed, I think, by some folks at MIT. It's a really open and accessible development language, so we built our software on top of it. Let me show it to you.

I'm actually plugged in right now. Can you guys see this graphical user interface? Great. You see eight channels here: channels one, two, three, four, five, six, seven, eight. Those eight channels are mapped onto my head, as seen in the head plot over here. The little triangle at the front is my nose, and the little half-ellipses are my ears. Channels one and two are on the front of my head, channels seven and eight are on the back of my head, and they're mapped to the time-series channels one through eight. So if I start blinking my eyes very rapidly, do you see how the front of my head lights up, and all of those big eye blinks show up right there? Wait, you can't actually see me anymore, can you? No? Okay, hang on one sec, let's see if I can trick the computer... well, whatever. While I'm doing these things, you see the eye blinks in channels one and two: basically, I'm blinking both of my eyes rapidly, and it's creating those very strong wave artifacts. Now I'm stopping. And now I'm going again. Cool.

Now I'm going to grit my jaw. Do you see all that EMG data? That's muscle data, the result of me clenching my teeth, and what you'll notice is that channels 3 and 4 produce the strongest signal. If I scale this down, do you see how channels 3 and 4 here produce the biggest-amplitude, scratchy, high-frequency wave? That's because the muscles responsible for working your jaw are on the sides of your head; if you put your hands there and grit your teeth, you can actually feel it happen.

Now I want to show you some brainwaves. When you close your eyes, you produce an alpha frequency, essentially a waveform at around ten hertz, in the back of your head. It comes as a result of relaxing your visual cortex: your eyes are closed, you're no longer processing tons of visual stimuli, and as a result your visual cortex returns to this kind of default frequency. So watch as I close my eyes, and look for a ten-hertz frequency. Each of the tall vertical gray lines separating the EEG montage marks one second of data, so try to count the number of waves in a second, and you'll see about ten. (I'm not doing it yet.) Simultaneously, you'll notice that in the frequency domain, over here on the lower right, the data spikes up at around ten or eleven hertz, specifically in the red and brown lines. I'm about to do it now. Did you see the waveform there? Cool, I'll do it again so you can get another look. As soon as I open my eyes, the alpha frequency disappears, and the spike in the FFT plot drops as well. I'll do it one more time. Cool, that right there was very good, so I just took a screenshot. Now let me stop sharing my screen for a second.
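That eyes-closed alpha check is easy to reproduce offline. Below is a minimal sketch, assuming you have a stretch of one occipital channel as a NumPy array in microvolts at the Cyton's 250 Hz sample rate; the three-times-baseline rule is an illustrative threshold, not a calibrated one.

```python
# Estimate alpha-band (8-12 Hz) power, the quantity that spikes in the
# GUI's FFT plot when the eyes close. Uses Welch's method for a smoother
# spectrum than a single raw FFT.
import numpy as np
from scipy.signal import welch

FS = 250  # Cyton sample rate, Hz

def alpha_power(eeg, fs=FS):
    freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)  # 2 s segments -> 0.5 Hz bins
    band = (freqs >= 8) & (freqs <= 12)
    return psd[band].mean()

def eyes_probably_closed(eeg, baseline, factor=3.0):
    # `baseline` is alpha_power() measured earlier with eyes open;
    # alpha rising well above it suggests the eyes have closed.
    return alpha_power(eeg) > factor * baseline
```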
And I'm back. Any questions on that? Was that the first time seeing EEG and EMG for many of you? Okay, cool. What we want to do at OpenBCI is essentially make that data as accessible as possible, and get grad students, PhDs, undergraduates, and maybe even high school students in the coming years a better understanding of what this biometric data looks like: having the ability to actually retrieve the data and interact with it in real time, to write code that responds to this data as an input. That, in many ways, is the basis of the OpenBCI platform. It's essentially a toolkit, or a prototyping platform, for harnessing the electrical data of the human body and using it as a medium to develop around.

Now let me get back into screen sharing... are we good? Are we back? Great. So I just changed the GUI a little bit. Do you see this visualizer here on the upper right? It's mapped to channel 3, and I'm specifically interested in the amplitude of the data coming through that channel, averaged over a short period of time. So I can grit my jaw, and every time I do it, you see how I fill the bar up and basically make the red circle fill the green circle? Yes? No? Okay. When I do that, I'm creating a large EMG artifact that I can then use as an input for software or hardware. We have a different version of the GUI that looks at up to 16 channels across the head and, in essence, gives you roughly sixteen digital potentiometers that you can trigger with muscle data from your scalp. We've mapped this data into 3D-printed robotic hands, RC helicopters, and musical instruments, where you trigger different audio samples based on whether you're blinking your right eye or your left eye, or flexing your bicep or your forearm. And the OpenBCI board can be used not just with the headset but with other types of electrodes, so you could put some electrodes on your head and some on your body, and simultaneously measure brain activity from your visual cortex and frontal lobe while also looking at EMG data from your bicep, or EKG, the electrical activity of your heart muscle, across your chest, and then use all of this data in real time for either research or applications.

I'll show you right now. When I get the command from someone in the audience, just tell me when, and I'll drive the bar up... all right, tell me again. Cool. Now tell me if you want it halfway or all the way... and now I'm letting go. So you can see that it's not just push-button interactivity. With pretty minimal training and relative ease, I can keep the bar at a mid level and then, when I decide to, drive it all the way up, like that.
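Under the hood, that bar is doing something simple: rectifying one channel, smoothing it over a short window, and normalizing it between a relaxed floor and a hard-clench ceiling. A minimal sketch follows; the floor and ceiling values are illustrative stand-ins for a per-user calibration, not OpenBCI's exact numbers.

```python
# Map a short window of one EMG channel (microvolts) onto a 0..1
# "digital potentiometer" level, like the GUI's bar visualizer.
import numpy as np

def emg_level(window, floor_uv=5.0, ceil_uv=80.0):
    # Rectified mean amplitude after removing the DC offset
    envelope = np.abs(window - window.mean()).mean()
    level = (envelope - floor_uv) / (ceil_uv - floor_uv)
    return float(np.clip(level, 0.0, 1.0))

# Example use: fire a trigger when the jaw channel crosses 90%, e.g.
# if emg_level(latest_quarter_second) > 0.9: close the robotic hand.
```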
I think the implications of this are really profound. A lot of times, newcomers to the space are really excited about the potential of brain-computer-interface this or mind-controlled that, but it's important to remember that we have this wealth of motor data that we still aren't implementing or harnessing, and it's actually a lifetime's worth of our brain training our muscles to behave in a certain way based on brain intention. So my only advice to you is: don't get blinded by the sex appeal of brain-computer interfaces when there's a wealth of bio-data outside of that that's a lot more practical for many applications. Just a little food for thought. We've been interested in helping an individual with ALS for a long time, and in our initial research we had a conversation with someone at the ALS foundation, and they said that 90% of ALS patients maintain some level of motor function, in other words muscle control, until they die. So you're looking at a subset of people who probably need some type of bio-computer interface, whether brain or muscle, more than anyone else in the world; I would say this group, and probably quadriplegics and people doing stroke recovery. 90% of this population still has motor function in muscles of their face, their scalp, or around their body. So when you're designing a practical application for a group of people that needs it, don't be blinded by the cool and wow factor of brain-computer interfaces when this other data is also available. I'm not saying EEG is not exciting, because it is, for both interactivity and other applications, but definitely keep in mind that there is a lot of other data.

When it comes to EEG, there are a few ways the data gets looked at. One is classification in real time: looking at the frequency spectrum and frequency bands in real time and essentially trying to reverse-engineer them, to understand the brain's intention at any given moment. One way you can do this is called motor classification, or sensorimotor classification, where you build an interface and tell a user or subject wearing the EEG device to think a certain thought for a period of time. Essentially: okay, think about closing your left hand, or squeezing your left hand, and we're going to record 30 seconds of data. Great, we got that data. Now do the same thing with your right hand, and we're going to record 30 more seconds. You do this over and over and over again, and then you switch the system live. Now you have these binned data sets, and you try to extract features, or artifacts, from the live data stream that match one of those binned data sets. If a match reaches a certain similarity threshold, you can say: okay, the user is most likely trying to trigger the thought that's correlated with opening and closing their right hand, so let's trip that switch. That, in a nutshell, is a motor classification system, and it's one way of using brain data for interactivity.
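Here is what that record-then-classify loop can look like in code: a toy sketch using per-channel band power as the feature and scikit-learn's linear discriminant analysis as the classifier. Both choices, and the 0.8 confidence threshold, are illustrative assumptions rather than a description of any particular OpenBCI tool.

```python
# Toy motor-imagery classifier: train on labeled "left hand" / "right
# hand" windows, then only act on live windows the model is sure about.
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

FS = 250  # samples per second

def band_power(window, lo, hi, fs=FS):
    # window: array of shape (channels, samples)
    freqs, psd = welch(window, fs=fs, nperseg=fs, axis=-1)
    band = (freqs >= lo) & (freqs <= hi)
    return psd[..., band].mean(axis=-1)  # one value per channel

def featurize(window):
    # Mu (8-12 Hz) and beta (13-30 Hz) power per channel: sensorimotor
    # rhythms weaken over the hemisphere opposite the imagined hand.
    return np.concatenate([band_power(window, 8, 12),
                           band_power(window, 13, 30)])

def train(windows, labels):
    X = np.array([featurize(w) for w in windows])
    return LinearDiscriminantAnalysis().fit(X, labels)

def classify_live(model, window, threshold=0.8):
    probs = model.predict_proba([featurize(window)])[0]
    # Trip the switch only above the similarity threshold; else do nothing
    return model.classes_[probs.argmax()] if probs.max() >= threshold else None
```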
But EEG can also be used in more of a passive recording mode, both for sleep studies and for looking at something called an event-related potential. A P300 is one subset of event-related potentials, but an event-related potential is essentially exactly what it sounds like: your brain's electrical reaction, its potential, to being presented with a stimulus or an event. So you're sitting calmly, and the expectation of a certain event is there. One example of this is a P300 speller. If you're trying to spell words, there's an interface that basically has a grid of letters. I'm going to switch back to my camera so I can do a visual demo. Here's a P300 interface: the grid has all the letters of the alphabet in a two-dimensional array, and there are two lines essentially looping over the letters. As someone with the P300 speller headwear on, you're expecting the next letter you want to spell to eventually be targeted by the crosshair of the two lines. The moment that happens, your brain elicits a potential, which is essentially your brain's way of saying, "Yay, that's the one." When that happens, the system can detect it, because it sees a spike in electrical activity, and the system can then decide: okay, great, the letter T was just targeted and the user wanted it spelled; moving on. This is a very slow way of typing, but in some cases it's the most effective way of typing for people who have no other means. So that's an example of using event-related potentials, once again, for interactivity.

But I think ERPs are also very interesting for thinking about things like interest and engagement, because there are large ERPs, large potentials, that come as a result of presented stimuli, but there are also much smaller versions correlated with everyday life and daily events. Most of the day, your mind is probably in a mind-wandering state, kind of searching for the next important thought, and every once in a while something distracts you or catches your attention, and that's associated with a small ERP, this moment where your brain goes, "Yeah, let's go there." That type of information is really interesting, because you essentially have a metric for interest or engagement as your mind shifts direction. Any questions at this point? Can you guys all still hear me? Yes? All right, cool. I think I'm going to leave you with that in terms of the science lesson for the day, and let you explore more on your own.
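The standard way to see a P300 in data like this is epoch averaging: cut a window around each flash, average the windows, and look for the positive deflection a few hundred milliseconds after the flashes that hit the target letter. A minimal sketch, with illustrative window bounds:

```python
# Average stimulus-locked epochs and score the 250-450 ms post-stimulus
# positivity (the P300). `signal` is one channel in microvolts; each
# flash index marks the sample at which a row/column lit up, and is
# assumed to have at least PRE seconds of signal before it.
import numpy as np

FS = 250
PRE = 0.2   # seconds kept before each flash, used as baseline
POST = 0.6  # seconds kept after each flash

def epoch(signal, flash_index, fs=FS):
    seg = signal[flash_index - int(PRE * fs): flash_index + int(POST * fs)]
    return seg - seg[: int(PRE * fs)].mean()  # baseline-correct each trial

def p300_score(signal, flash_indices, fs=FS):
    avg = np.mean([epoch(signal, i, fs) for i in flash_indices], axis=0)
    t0 = int(PRE * fs)  # sample within the epoch aligned to the flash
    window = avg[t0 + int(0.25 * fs): t0 + int(0.45 * fs)]
    return window.mean()  # more positive -> more P300-like

# A speller would score each row and each column and pick the letter at
# the intersection of the highest-scoring row and column.
```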
Now I want to show you where you can actually go explore more on your own, especially if you want to work with OpenBCI. I'm going to go back to screen sharing... cool, are we back? All right, great. This is the OpenBCI website, as I said before. That's not the link I want to be on; this is our Twitter account. We're pretty active on Twitter, so if you want to follow us, we post interesting information about what we're doing and about other stuff in the neurotech space. We are entirely open source as a company, and all of the code and hardware we design goes on GitHub, so I recommend checking out the OpenBCI GitHub and its many repositories.

A good example is the OpenBCI_Processing repository. This is where all of the code lives for the software I was just showing you, so you can come in here and tweak any of the code, and you can even check out all the other branches. Here's the audio player branch: this is where we used the data to trip audio samples. This is an example where we took the EMG data from channel three or four, I can't remember which, basically jaw data, and mapped it to a new serial port going out to control a 3D-printed robotic hand. This is a "neuro presentation", essentially, where we took left and right eye blinks and triggered slides moving forward and backward, and jaw grits to engage and disengage the presentation, essentially switching between GUI mode and presentation mode. And down here we've got a bunch of variants using alpha and something called SSVEP, the steady-state visually evoked potential, which is a signal from the back of the head, the visual cortex: if you look at a flashing frequency, your visual cortex will mimic that frequency. So we set up an interface where you could look at different flashing frequencies, the OpenBCI system would detect which frequency you were looking at, and we mapped those frequencies to triggers for a robot. You could control a little hex robot, make it turn left, make it turn right, and walk forward, using two different frequencies plus closing your eyes to produce alpha. All of these OpenBCI_Processing variants on the GitHub go through different implementations of that.
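SSVEP detection can be as simple as comparing spectral power at each candidate stimulus frequency. Here is a minimal sketch; the two flash frequencies and the command mapping are illustrative, not the exact values from that demo.

```python
# Pick a robot command by finding which stimulus frequency dominates an
# occipital channel. 10 Hz doubles as the eyes-closed alpha "command".
import numpy as np
from scipy.signal import welch

FS = 250
COMMANDS = {7.5: "turn_left", 12.0: "turn_right", 10.0: "walk_forward"}

def power_at(eeg, target_hz, fs=FS, half_width=0.5):
    freqs, psd = welch(eeg, fs=fs, nperseg=fs * 4)  # 0.25 Hz resolution
    band = (freqs >= target_hz - half_width) & (freqs <= target_hz + half_width)
    return psd[band].mean()

def ssvep_command(occipital_window):
    scores = {hz: power_at(occipital_window, hz) for hz in COMMANDS}
    return COMMANDS[max(scores, key=scores.get)]
```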
In general, most of the code for tweaking the software is in the EEG_Processing.pde file, because it has a class called EEG_Processing_User. This is essentially our code playground: if we're testing out new interactive features for the GUI, or we want to build an EMG visualizer, we build it here first, and when it gets fleshed out enough, we turn it into its own class and add it to the default OpenBCI_Processing GUI. On top of that, we also have a JavaScript SDK, a Node module, and a Python repository with some basic getting-started code for getting the data out of the OpenBCI hardware and into a Python app, plus a number of other code repositories, but I'd say the Node.js and Processing repos are our most developed software packages. All of the firmware that's uploaded to the OpenBCI boards via Arduino is also up here. We also have the Ultracortex repository on our GitHub; this is where all of the files for 3D printing the headset I'm wearing live. This is the Mark 1, the first version, and now we're all the way up to the Mark 3 "Nova", which is the one on my head; here you can see an image of the exact headset I'm wearing, actually. We're currently developing the Mark 4, which was featured on our recent Kickstarter campaign, and here's the concept image of the Mark 4, though it'll probably look a little different when it goes live.

Another thing I want to show you is the community page. We've been putting more and more effort into it recently. Essentially it's a clearinghouse where anybody who's working with OpenBCI, or interested in OpenBCI, can post projects they're working on, research they're interested in pursuing, or events that are coming up. So if you're doing anything neurotechnology-related, I recommend putting an event here, and maybe anyone near UMass can find it. And if you're working on projects related to OpenBCI, I highly recommend you post them here, and also reach out to me and let me know, and I'll help you go through the process. Essentially it's a group WordPress account where you can join, or log in if you already have an account, and once you do, you end up on the people page. These are all the people who have registered as community members, and every person has a little karma score. Here we've got Chip; he's our karma king, he's made the most posts of anyone, and here's his little bio and links to all of the posts he's made.

The learning page of our website is where we put the official tutorials, for instance the getting-started guide and tutorials on how to connect the OpenBCI hardware to MATLAB, OpenViBE, and neuromore, which are commonly used signal-processing tools for EEG and other biosensing. Downloads is essentially a portal to a lot of the stuff that's on GitHub. The forum is a great place for asking questions, and if you're interested in learning what people are doing with OpenBCI and with other EEG and related hardware, definitely go to the community page and dig into some of the posts people have created. And then, obviously, our store: this is where we sell gadgets and gizmos for working with OpenBCI. As you can see, we have a number of products from our last Kickstarter campaign up for pre-order, and the Nova and Supernova actually went into the store today; they're the latest revision of the Mark 3, the third version of the headset. We've also got our boards here, which are the microcontroller boards we use for actually acquiring the data, and they plug into all of our headsets. And if you're interested in the history of OpenBCI, you can check out both of our Kickstarter campaigns. We were funded on Kickstarter twice: we ran a successful Kickstarter in late 2013, and we just ran another one in late 2015, which we're still in the process of fulfilling and hope to finish by the end of the summer. So that is pretty much OpenBCI in a nutshell. I'm going to bring it back face to face. Do you guys have any questions before we wrap it up?

Audience: You said the headsets are all 3D printed?

Connor: Yeah. We offer print-it-yourself kits, and if you really dig in, you can source all the components yourself. If you have access to 3D printers, you can pretty much print the majority of the headset, at least the structure, the mechanical parts, the little nodes; the majority of the actual volume of the headset is 3D printed. Then we sell the electrodes, wires, and embedded electronics that make the headset function. And if you're too lazy to print your own headset, we sell a fully assembled or an unassembled version of the headset as well. Any other questions?

Audience: On ERPs, event-related potentials, the response to an event: could you take one person's training data and have it correspond to someone else trying to control the same task?

Connor: So you're asking whether a training data set is transferable between people? Not really. If you were training a system or a classifier, you would most likely target the same regions of the brain on a given person, though not necessarily; but if you were implementing, for instance, a motor-cortex interactivity system, you would have to retrain the system for a new individual. It would almost certainly not work to take the headset off one person and put it on someone else. At the very least you'd need some level of calibration, where you at least know what you're looking for on the new subject, but you still have to train that data set. There's something called the 10-20 system, which is essentially an internationally accepted map for placing electrodes in the context of EEG.
It's literally just called that, the 10-20 system, and this map is what we use for establishing the node network of our headsets. For instance, this node here is Fpz, the middle of the head is Cz, and the back of the head is Oz, and there's a numbering and lettering system to relay information. So if I said, "I had my electrodes on C3 and C4," to someone else working with EEG, they would know where to place the electrodes to replicate the experiment.
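Those labels also make channel placement reproducible in code. The sketch below records the eight-channel montage the OpenBCI GUI assumes by default, if memory serves (frontal, central, parietal, and occipital pairs); it's worth double-checking against your own setup, since any layout works as long as you record which electrode went where.

```python
# Default eight-channel Cyton montage in 10-20 coordinates (assumed
# here; verify against your own headset before relying on it).
CYTON_DEFAULT_MONTAGE = {
    1: "Fp1", 2: "Fp2",  # forehead: strong eye-blink artifacts
    3: "C3",  4: "C4",   # sides of the head: also picks up jaw EMG
    5: "P7",  6: "P8",   # parietal
    7: "O1",  8: "O2",   # occipital: where eyes-closed alpha shows up
}
```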
Audience: You said muscle data is better for applications. I'm wondering why that is, because you get a fairly large amount of data, especially from somebody with ALS who can move a part of their cheek or something. Why do you get more usable data from looking at the pulses of that muscle rather than something in their brain?

Connor: For one, brain data is teeny tiny in terms of power relative to the data coming out of a muscle. You're talking one to ten microvolts, as opposed to a hundred or two hundred microvolts, or maybe even more, from muscle data. So the strength of the signal is much smaller, and on top of that, all of that data has to get through skull, skin, sweat, and hair. The actual electricity has to flow through substantially more material, which creates noise and dissipation of the signal. You just have many more obstacles to getting a clear signal when you're recording EEG than muscle, because with muscle the only barrier is skin, and the signal is much stronger. Then there's also the fact that your muscle is essentially a biomechanical amplifier of brain data being sent through your nervous system. Your brain is the source; it's processing the information it receives and translating that into movement or action, but your muscle is at the end, the farthest extension of your nervous system, and it is literally burning calories to amplify electricity into movement, so the signal is significantly stronger than anything you record from the head. And then there's the intention of the signal: when you flex a muscle, the only reason that electricity is there is because your brain told it to be there, whereas your brain is this ocean of everything going on at the same time: vision, hearing, language, everything that could possibly be conceived by consciousness. It's just a much more complex signal. So for interactivity, and for getting the response you're anticipating, targeting a muscle with an electrode is a much more effective way of getting a one-to-one mapping of intention and output.

Host: I have a question. When you were creating OpenBCI, what was the most challenging part of the whole process? Was there one thing that really stood out?

Connor: That's a great question. I really think it's all been difficult, but fun, so that's really kind of too hard; there have been many difficult things. I think some of the most stressful or frustrating parts of the whole experience were Kickstarter fulfillment. If you're ever thinking about running a crowdfunding campaign, just make sure you don't bite off more than you can chew, because it's really easy during crowdfunding to promise more than you can deliver. You're kind of in the spotlight, and you feel like you have this limited amount of time to tell the world what you're capable of, and then you end up taking risks that maybe you shouldn't. I think we did a pretty good job of not biting off more than we could chew, but we definitely had our hands full after the first Kickstarter campaign, and I like to think that this time around we're a little better primed to deliver on time. So if you're ever going to run a crowdfunding campaign: for one, make sure you budget the value of your own time; make sure you're paying yourself to actually do the work. I guess the first suggestion would be: respect your own time, and don't turn yourself into an accidental charity. And two, make sure you're not making false claims about what you're actually capable of building, because then you're just going to end up in a really stressful place. I'm not saying we did that, but I think there was fear of it at times, and we ended up pulling through.

Audience: Are there other basic EEG signals, besides that ten-hertz alpha signal, that are really simple to produce?

Connor: Yes and no. Alpha is by far the easiest brainwave to just consciously turn on, because in most cases all you have to do is close your eyes. If you've had coffee recently, sometimes it's harder to produce alpha; if your mind is stimulated, it's a little bit harder. But I drink a lot of coffee, so I think at this point I can just always produce alpha. If you're falling asleep, or in deep sleep, your brain produces delta or theta, which are lower-frequency, higher-amplitude waves. And some people who are experienced, who have trained with neurofeedback systems, can actually engage and disengage various frequencies on command without closing their eyes. We had someone come into the lab a few weeks ago who runs a neurofeedback clinic, for helping people with ADHD and things like that, and it was evident that he had trained a lot. We put the headset on him, and he was like, "Ah, this is great, this is really easy." I asked him, "Can you produce alpha with your eyes open?", because that's something I've been trying to do for a while, basically training myself to not have to close my eyes to produce alpha. He said, "Yeah, let me see, it's been a while," and in two seconds every single channel was alpha: all eight channels on his head, almost instantly. I was blown away, because normally when I close my eyes I only produce alpha in the back of my head, in one place, and he basically went boom, all on. He was saying he trains because there are certain conditions associated with imbalances in frequencies between hemispheres and things like that, and that when he trains, he tries to increase his alpha frequency. He says he naturally has a low alpha frequency, meaning around nine hertz, because alpha, from person to person or from moment to moment, can actually vary between eight and twelve hertz.
He was like, "Yeah, when I'm not trying, when I'm just going about my business, my alpha is around eight or nine hertz, but I want it to be higher." I forget exactly why he said that; I think the average human bell curve puts the median at around ten hertz, and I guess if you have a higher-frequency alpha you're more alert or more attentive during the day. I'm not exactly sure, but he was like, "Yeah, I want to train this on a regular basis." That said, he could just turn alpha on and off on command, which was really impressive.

Host: Thank you for speaking today.

Connor: No problem, thank you for having me.

Host: Thank you again for presenting, and I hope you all enjoyed it.

Connor: All right, thank you. Feel free to reach out if you've got follow-up questions, anyone. I'll send this information out. Have a good rest of your day.

Host: You too.
