Brain-Machine Interface for Reaching: Accounting for Target Size, Multiple Motor Plans, and Bimanual Coordination
Brain-machine interfaces (BMIs) offer the potential to assist millions of people worldwide suffering from immobility due to limb loss, paralysis, and neurodegenerative disease. BMIs function by decoding neural activity from intact cortical regions in order to control external devices in real time. While there has been exciting progress in the field over the past 15 years, the vast majority of the work has focused on restoring motor function of a single limb. In the work presented in this thesis, I first investigate the expanded role of primary somatosensory (S1) and primary motor (M1) cortices during reaching movements. By varying target size during reaching, I identified cortical correlates of the speed-accuracy tradeoff known as Fitts' law. I then analyzed cortical motor processing during tasks in which the motor plan must be rapidly reprogrammed. In each study, I found that parameters relevant to the reach, such as target size or alternative movement plans, could be extracted by neural decoders in addition to simple kinematic parameters such as velocity and position. As such, future BMI functionality could expand to account for relevant sensory information and to reliably decode intended reach trajectories, even amidst transiently considered alternatives.
The second portion of my thesis work was the successful development of the first bimanual brain-machine interface. To reach this goal, I expanded the neural recording system to enable bilateral, multi-site recordings from approximately 500 neurons simultaneously. I also upgraded the experimental setup to feature a realistic virtual-reality end effector, a customized primate chair, and an eye-tracking system. Finally, I modified the tuning function of the unscented Kalman filter (UKF) to conjointly represent both arms in a single 4D model. Aided by widespread cortical plasticity in M1, S1, the supplementary motor area (SMA), and posterior parietal cortex (PPC), the bimanual BMI enabled rhesus monkeys to simultaneously control two virtual limbs without any movement of their own bodies. I demonstrate the efficacy of the bimanual BMI both in a subject with prior joystick training on the task and in a subject entirely naïve to the task, which simulates a common clinical scenario. The neural decoding algorithm was selected through a methodical comparison of candidate decoders and decoder settings. Lastly, I introduce a two-stage switching model, comprising a classification step followed by a prediction step, designed and tested to generalize the decoding strategy to both unimanual and bimanual movements.
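The two-stage structure described above can be sketched in code. The following is a minimal illustration, not the implementation used in the thesis: the thesis employed an unscented Kalman filter with a conjoint 4D tuning model, whereas this sketch substitutes a nearest-centroid classifier and a linear Kalman filter purely to show the classify-then-predict switching logic. All class names, variable names, and model dimensions are illustrative assumptions.

```python
import numpy as np

class TwoStageDecoder:
    """Illustrative two-stage switching decoder: a classify step picks the
    movement class (left, right, or bimanual) from binned firing rates, then
    a predict step runs a linear Kalman filter whose observation model is
    chosen for that class. (Simplified stand-in for the UKF-based decoder.)"""

    def __init__(self, classes, A, W, H_by_class, Q):
        self.classes = classes      # e.g. ["left", "right", "bimanual"]
        self.centroids = {}         # mean firing-rate vector per class
        self.A, self.W = A, W      # shared state transition / process noise
        self.H = H_by_class         # per-class observation (tuning) models
        self.Q = Q                  # observation noise covariance

    def fit_classifier(self, rates, labels):
        # Classify step training: nearest-centroid over firing-rate vectors.
        for c in self.classes:
            self.centroids[c] = rates[labels == c].mean(axis=0)

    def classify(self, z):
        # Pick the movement class whose centroid is closest to this bin.
        return min(self.classes,
                   key=lambda c: np.linalg.norm(z - self.centroids[c]))

    def step(self, z, x, P):
        # Predict step: standard Kalman update using the observation model
        # selected by the classify step for this time bin.
        c = self.classify(z)
        H = self.H[c]
        x_pred = self.A @ x
        P_pred = self.A @ P @ self.A.T + self.W
        S = H @ P_pred @ H.T + self.Q
        K = P_pred @ H.T @ np.linalg.inv(S)
        x_new = x_pred + K @ (z - H @ x_pred)
        P_new = (np.eye(len(x)) - K @ H) @ P_pred
        return c, x_new, P_new
```

In this sketch, switching between unimanual and bimanual modes amounts to swapping the observation model each bin; a 4D state (left-hand x/y, right-hand x/y) is maintained throughout so that mode switches do not reset the kinematic estimate.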
This work is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 United States License.
Rights for Collection: Duke Dissertations