TURNING THOUGHTS INTO ACTION

Almost 16 years ago, a woman suffered a brainstem stroke that left her quadriplegic and unable to speak but cognitively intact—a condition called locked-in syndrome. Researchers know her as S3: the number she was assigned as a participant in a clinical trial of a neural interface system called BrainGate. Neural interface systems allow people who are paralyzed by disease or injury to control external devices just by thinking about moving their paralyzed limbs. Last year, S3 made news when she operated a robotic arm to serve herself a sip of coffee, a task she accomplished using only her thoughts. Behind this groundbreaking achievement was a multidisciplinary team of scientists based at Brown University, Massachusetts General Hospital, Stanford University, and the Providence VA Medical Center. Here, four BrainGate researchers discuss their contributions to this exciting project that, in the words of neuroengineer David Borton, makes “the thinkable possible.”
AT THE INTERFACE OF BRAIN AND MACHINE BY DAN BACHER
One day in late 2010, something remarkable happened that changed my life. I had been leading a project to develop a communication system for people with locked-in syndrome as part of my group’s work with the BrainGate Neural Interface System (NIS). For this specific project, our objective was to create an interface that would allow users to communicate using only their thoughts. I was responsible for developing the virtual keyboard software and integrating it with the NIS. On that day in 2010, the plan was to test my keyboard interface with clinical trial participant S3. At the time, her usual method of communicating was to slowly move her eyes to individual letters printed on a clear piece of plastic, while a person behind the plastic would record each letter she chose. But on this day, she would use only her thoughts to move and click a computer cursor to type with my onscreen keyboard. S3’s eyes lit up when she saw the keyboard. I was trying to
demonstrate some of the features when instead she defiantly started typing on her own: first “thank,” then “you.” Those two simple words—so commonly and automatically exchanged—were the most powerful words that had ever been spoken to me. (I do mean spoken: S3 used the built-in text-to-speech feature I’d integrated to have the computer speak her message.) This transformative moment was the first of what would become a series of exciting, humbling, and emotional experiences with S3 and other participants in the BrainGate clinical trial. In the following months, I worked with a team of engineers to create software that could translate the BrainGate system’s command signals into coordinated movements of an advanced robotic arm. Months of long hours spent developing, refining, and validating our software were put to the test in April 2011. I was by S3’s side once again when she used this robotic arm to give herself a drink of coffee. Controlling the robotic arm only with her imagined movements, she reached out, picked up a bottle, took a drink, and put the bottle back down onto the table—a feat she last performed with her own arm nearly 15 years earlier.
Sept/Oct 2012
Again I felt humbled and proud while sharing in this emotional moment with S3 and our research team. Again I realized how incredibly cool my job is and how amazing it is to be a part of this effort to create state-of-the-art technology that hopefully one day will help people with locked-in syndrome. If you’re ever searching for inspiration and purpose, I encourage you to seek out those who need help the most, identify what they need, and if you can’t help them find it, go make it for them yourself! As Goethe once wrote, “Knowing is not enough; we must apply. Willing is not enough; we must do.”

Dan Bacher received his BS in biomedical engineering from Syracuse University and his MS in bioengineering from the University of Pittsburgh. Now a senior research and development engineer at Brown University, Dan is also an aspiring entrepreneur who enjoys playing music, trying to stay in shape, and reading in his spare time.
www.cty.jhu.edu/imagine
DECODING NEURAL SIGNALS BY BEATA JAROSIEWICZ, PhD
I am a neuroscientist by training, but during the course of my research career, I have learned computer programming skills that have become crucial to my work on BrainGate. My focus has been on using my neuroscience knowledge to help improve the computer programs that decode neural signals associated with the intent to move a limb. The starting point of the BrainGate neural interface system is an electrode array placed in the hand/arm area of the motor cortex. These electrodes record action potentials, or “spikes,” from neurons. When the person opens or closes (or imagines opening or closing) her hand, we find some neurons that increase or decrease their spiking rate. Other neurons change their spike rate for different intended directions of movement. For example, one neuron might increase its spiking rate for a rightward arm movement and decrease its rate for a leftward movement. That neuron would be said to have a “preferred direction” to the right. Other nearby neurons might have preferred directions to the left, up, down, forward, backward, or anywhere in between.
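The “preferred direction” idea described above is often captured by a cosine-tuning model, a classic simplification from motor-cortex research. Here is a minimal sketch in Python; the baseline and modulation rates are made-up illustrative numbers, not values from the BrainGate study:

```python
import math

def firing_rate(movement_angle_deg, preferred_deg, baseline=20.0, modulation=15.0):
    """Cosine-tuning model: a neuron fires fastest when the intended
    movement points along its preferred direction and slowest when it
    points the opposite way. Rates are in spikes per second."""
    angle = math.radians(movement_angle_deg - preferred_deg)
    return baseline + modulation * math.cos(angle)

# A neuron whose preferred direction is rightward (0 degrees):
firing_rate(0, 0)    # rightward movement -> 35 spikes/s (peak)
firing_rate(180, 0)  # leftward movement  ->  5 spikes/s (trough)
firing_rate(90, 0)   # upward movement    -> 20 spikes/s (baseline)
```

Because every direction between the peak and the trough produces a distinct rate, a population of neurons with different preferred directions carries enough information to reconstruct the intended movement.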
We begin each research session with our study participants by figuring out how each recorded neuron’s firing rate modulates with intended movements. We do this by displaying a cursor programmed to move to targets that appear one by one on a computer monitor while the participant imagines using her hand to move the cursor. During this calibration, a computer registers the spike rate of each neuron. Then, using the spiking information and the imagined movement information, the computer creates a model of each recorded neuron’s preferred direction.
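The calibration described here can be sketched as a least-squares fit followed by a population-vector readout. The version below is a toy illustration with synthetic neurons, not the actual BrainGate decoder: the evenly spaced preferred directions, baseline rate, and noise level are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assume 8 neurons with evenly spaced, hidden preferred directions.
n_neurons = 8
pd_angles = np.linspace(0, 2 * np.pi, n_neurons, endpoint=False)
true_pd = np.column_stack([np.cos(pd_angles), np.sin(pd_angles)])

baseline = 20.0  # spikes/s, illustrative

# Calibration task: the participant imagines moving toward targets in
# 16 known directions while spike rates are recorded.
angles = np.linspace(0, 2 * np.pi, 16, endpoint=False)
directions = np.column_stack([np.cos(angles), np.sin(angles)])  # (16, 2)
rates = baseline + 10.0 * directions @ true_pd.T                # (16, 8)
rates += rng.normal(scale=0.5, size=rates.shape)                # noise

# Fit each neuron's preferred-direction vector by least squares.
X = np.column_stack([np.ones(len(directions)), directions])     # [1, vx, vy]
coef, *_ = np.linalg.lstsq(X, rates, rcond=None)
fitted_pd = coef[1:].T                                          # (8, 2)

def decode(rate_vector):
    """Population-vector decoder: weight each neuron's fitted preferred
    direction by how far its rate sits above baseline, sum, and
    normalize. The result is the intended cursor direction."""
    v = (rate_vector - baseline) @ fitted_pd
    return v / np.linalg.norm(v)

# An imagined rightward movement should decode to roughly [1, 0].
observed = baseline + 10.0 * true_pd @ np.array([1.0, 0.0])
print(decode(observed))
```

The real system estimates velocities continuously and with far more sophisticated statistics, but the shape of the problem is the same: learn each neuron’s tuning during calibration, then invert that model to recover intent.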
Before each research session, clinical trial participants first perform a calibration task designed to match their neural activity to their intended movements.

Once this model is created, the participant can start controlling the movement of the cursor. A computer algorithm called the “decoder” compares incoming neural activity with the model to figure out in which direction the person wants to move the cursor, and then sends that movement command to the cursor. In this way, the person can control the cursor’s movement just by thinking about where she wants it to go. This is called “neural control.”

My contribution to this research has been to figure out the best way to calibrate the model and keep it calibrated when neural signals change (which can happen with, for example, tiny movements of brain tissue resulting from local blood pressure changes). Combining my neuroscience knowledge and my computer programming skills, I helped design a method to recalibrate the model using neural data collected during neural control. This not only makes the model more accurate from the start, but also keeps it calibrated for long periods of time without having to interrupt neural control.

My next challenge is to make the NIS software fully automated or user-controlled so that its use does not depend on a trained technician or caregiver. This will bring us one step closer to helping people with paralysis communicate and interact with the environment more independently—a goal that motivates us all.

Beata Jarosiewicz earned her PhD in neuroscience from the Center for the Neural Basis of Cognition at the University of Pittsburgh and is now an investigator in the Neuroscience Department at Brown University. In her free time, Beata likes to play volleyball, do gymnastics, and train her cats to impersonate dogs and people.

UNTETHERING THE LOCKED-IN MIND BY DAVID BORTON, PhD

As a neuroengineer, I try to solve neuroscience problems with the use of modern technology, such as custom electrical circuits, chips, and software. People often say that technological advances have made the unthinkable possible, but in the case of neuroengineering, they’ve made the thinkable possible.

The human brain consists of more than 80 billion neurons making over 100 trillion connections. These neurons communicate with each other by sending electrical pulses, called action potentials, along their long axons and to neighboring neurons. How do we listen to, and make sense of, so many signals?

Neuroengineers have already met one part of the challenge by designing specialized “microphones” that can sense the millions of action potentials every second as the neurons communicate with one another. Currently, using a microelectrode array of 100 recording elements, we can listen to the activity of roughly 100 individual neurons at once. In the BrainGate project, these signals are transmitted outside the body through a long cable, amplified to distinguish them from background noise in the brain, digitized into binary code, and processed with computational algorithms to decode what the signals might mean.
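Borton’s recording chain (amplify, digitize, then pick out action potentials) can be sketched end to end on simulated data. Everything below is a toy stand-in: the sampling rate, amplifier gain, ADC range, and threshold rule are assumed values for illustration, not the real BrainGate hardware parameters.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated extracellular voltage: background noise plus three spikes.
n = 3000                                        # 100 ms at an assumed 30 kHz
signal_uV = rng.normal(scale=5.0, size=n)       # ~5 uV noise floor
spike_times = [500, 1200, 2500]                 # sample indices
for s in spike_times:
    signal_uV[s:s + 10] -= 80.0                 # ~80 uV negative deflections

# Amplify: microvolt-scale signals are too small to digitize directly.
gain = 100.0
amplified_mV = signal_uV * gain / 1000.0        # now in millivolts

# Digitize: a 12-bit ADC over an assumed +/-10 mV input range.
adc = np.clip(np.round(amplified_mV / 20.0 * 4096), -2048, 2047)

# Detect spikes: count downward crossings of a threshold set well
# below the noise floor.
threshold = -4.5 * np.std(adc)
crossings = np.flatnonzero((adc[1:] < threshold) & (adc[:-1] >= threshold))

print(len(crossings))  # one detection per simulated spike
```

The spike times recovered this way are what feed the per-neuron firing rates that the decoding stage consumes.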
While this method of neural recording works incredibly well, when we look to a future when locked-in patients are moving their own limbs to walk down the hallway, we realize that the transmission of all this neural data must be done wirelessly. To achieve this, we must reinvent the amplifier, digitizer, and data transmission mechanisms so they can be implanted in the patient. The amplifiers currently used by the BrainGate team are the size of a hardback book, and the digital signal processors take up the majority of a personal computer’s memory. To create smaller electronics, we leveraged advances in microelectronics ranging from chip design to flexible printed circuit board technology. We have designed custom ultra-low-power application-specific integrated circuits (or ASICs), amplifiers the size of an M&M, and integrated digitization circuitry—and put all of this into a device the size of a U.S. quarter. Through an encoded high-frequency radio transmission scheme similar to 4G LTE, this device transmits the digital neural data from the patient to a computer across the room, where it can be processed into prosthetic control signals. The device is packaged
in medical-grade silicone for implantation under the skin. We hope that this wireless neural interface will soon allow locked-in patients to be untethered from their hospital beds.

Neuroengineers have one of the most exciting jobs in the world. We get to design, develop, and deploy the next generation of brain-recording devices that will one day enable patients with spinal cord injury or neurodegenerative diseases to freely interact with the world around them. We have been working toward this goal for many years, but advances in neurotechnology are bringing it close to reality.

In the BrainGate neural interface system, an implanted microelectrode array detects brain signals that are then translated into computer code, allowing the user to operate external devices with only their thoughts.

David Borton earned his bachelor’s degree in biomedical engineering from Washington University in St. Louis and his PhD in biomedical engineering from Brown University. Dave is an avid soccer player and played professionally for a year in Brazil. He also plays trumpet and guitar, and enjoys sailing, biking, rock climbing, and running marathons.

PUTTING RESEARCH TO THE TEST BY ERIN GALLIVAN

One afternoon at work, S3 and I were having a conversation much like one you would have with any friend or coworker. She was telling me a story about her grandson, recalling his reaction to a present he received on his birthday. Our conversation, however, looked anything but typical: I was holding up a clear letter board as she focused her eyes on individual letters. When I met her gaze through the board, I said the letter out loud; if I was correct, we would move on to the next one, spelling out the words and sentences that made up the conversation.

With the BrainGate system, I have since seen S3 use typing interfaces to quickly spell out phrases on a computer. Our goal is that people will someday be able to use the system 24/7, without any assistance. For now, because our study is still in Phase 1, to evaluate the device’s safety, participants can use the BrainGate system only when a trained technician, like me, is present during our twice-weekly research sessions.

My job as a clinical research assistant is to run these sessions. On a typical day, I travel to the participant’s home and ask him or her to consent to a research session. After downloading the session software sent by the research scientists, I set up the neural connection by attaching a cable to the connector implanted on the participant’s head. I then explain the experiment, describing the task and the type of imagery to use to complete it, which varies from moving a computer cursor to operating a robotic arm. Sessions run for about four hours, and I interact with the participant the whole time, answering their questions and giving instructions, making a detailed log of the session, and recording video. At the end of the session, I send the data to the researchers at Brown for analysis.

It is part of my job to make sure that the technology we are developing is easy and enjoyable to use, and the participants offer a lot of great feedback and suggestions, which I pass on to the rest of the team. For example, when using the keyboard interfaces, participants offered suggestions for making the layout easier to navigate and adding shortcuts to help them type faster. My primary goal is to collect and deliver the data, but my personal interaction with our participants allows me to see how the research will directly impact their lives. Working with BrainGate has opened my eyes to all the good that can be done through science, medicine, and engineering. I will be leaving this position to go to medical school next month, and although I am sad to be moving on, the work I have done here has made me realize that this is the correct career path for me.

Erin Gallivan earned a BS in mechanical engineering from Boston University. She worked as a mechanical design engineer at Raytheon and then as a data specialist at Dana-Farber Cancer Institute before joining the BrainGate team. She is now in her first year at the Keck School of Medicine at the University of Southern California.
Learn more about BrainGate and the work these researchers do at www.braingate2.org.