THE SINGAPORE MAGAZINE OF RESEARCH,
TECHNOLOGY AND EDUCATION
COVER STORY:
Mind-Brain-Machine Interface Put to Good Use: Revolutionizing Prosthetics
by Nitish Thakor

For millennia, philosophers and laypersons have been intrigued by the mind-brain duality. The past century has produced an explosion of research methods to rigorously study brain function, from recording the electrical activity of multitudes of brain cells (neurons) to imaging the functioning human brain, providing the tools and scientific understanding to open a window into the human mind. Now, armed with the technological developments of the past two to three decades, we are poised to put that brain-mind interface to good use. Our dream, of course, is to make this mind-brain-machine interface work for practical human use and benefit. That is, we want to translate our thoughts (mind) into the activity of neurons (brain) and then into action (machine) to improve health or overcome impairments. For example, can we use this interface to help a patient with paralysis control a wheelchair, or an amputee control a robotic arm and hand prosthesis, with his or her thoughts alone? This is the new frontier of the Brain-Machine Interface (BMI).

Wars and industrial or traffic accidents, among myriad other causes, result in hundreds of thousands of amputations annually. Amputees often receive prostheses for their lost upper or lower limb(s), although until recently that technology was quite primitive compared with our natural limbs. Designing an upper limb prosthesis with anything approaching the complexity and flexibility of the human arm poses a great number of technical challenges. An even greater challenge is that of controlling the artificial limb: how do you command a prosthetic arm with shoulder, elbow, wrist and multiple finger mechanisms to reach, grasp, touch, and even write? There are currently only a very few ways to control simple prosthetic arms, ranging from a harness providing mechanical actuation to the use of electrical activity in the amputee's residual muscles (myoelectric signals). These approaches remain quite limited in what they can do; for example, they can only command a claw-like prosthetic hand to open and close or a wrist to rotate. They do not provide solutions for controlling dexterous, anthropomorphic (human-like) prosthetic arms. The present mechanical or muscle-based solutions are also unnatural, as they do not directly link the human mind and its intent to the goal of carrying out dexterous activities. The ideal solution, therefore, would be to establish a direct mind-to-hand connection so that the amputee's own mind can influence and control an anthropomorphic prosthetic hand (Figure 1)!

To achieve this connection, we must first recognize that the mind sets in motion a chain of events (Figure 2). The desire to move a limb manifests as the electrical activity of brain cells, or neurons, in large areas of the brain, most importantly those areas associated with movement (the "sensorimotor" areas). The brain's commands proceed to the hand via the spinal cord and the nerves that innervate the muscles. By the brain's activity, we mean the activity, or "firing", of multitudes of neurons.

A large-scale, coordinated firing sequence of brain cells produces brain waves that can be recorded on the scalp surface as the Electroencephalogram, or EEG. Although an EEG signal appears quite complex, it also displays rhythmic patterns with rich frequency content (Figure 3). The changes in the EEG signal are surprisingly consistent across subjects for tasks in which an amputee uses motor imagery as a substitute for moving his or her arm.

These EEG signal changes can be seen as a modulation of the energy at particular frequencies (Figure 4). Motor imagery results in an intense modulation of the EEG spectrum. Using an array of electrodes on the scalp, a "topographic map" of EEG signals can be created. To summarize: an amputee thinks about moving a limb, and that mental process modulates the brain rhythms, typically in the µ (pronounced "mu", 8-12 Hz) and β (pronounced "beta", 12-30 Hz) frequency bands.
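For readers who want a concrete feel for this step, the short sketch below shows one common way such band power could be estimated from a single EEG channel; the 256 Hz sampling rate and the random stand-in signal are illustrative assumptions, not data from the experiments described here.

```python
# A minimal sketch (assumptions, not the article's actual pipeline) of estimating
# mu (8-12 Hz) and beta (12-30 Hz) band power from one EEG channel.
import numpy as np
from scipy.signal import welch

fs = 256                                  # assumed sampling rate (Hz)
t = np.arange(0, 2.0, 1.0 / fs)           # one 2-second epoch
eeg = np.random.randn(t.size)             # stand-in for a recorded EEG channel

def band_power(signal, fs, low, high):
    """Integrate the Welch power spectral density between low and high Hz."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs)
    mask = (freqs >= low) & (freqs <= high)
    return np.sum(psd[mask]) * (freqs[1] - freqs[0])

mu_power = band_power(eeg, fs, 8, 12)     # mu rhythm (8-12 Hz)
beta_power = band_power(eeg, fs, 12, 30)  # beta rhythm (12-30 Hz)
print(f"mu: {mu_power:.4f}  beta: {beta_power:.4f}")
```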

EEG frequency modulation can be detected in real-time and used to control a prosthetic limb. This signal changes, such as the power in the ß bands (or other derived quantitative measures) can be used to command a prosthetic hand to open and close. As the photographic sequence in Figure 5 shows, the subject wearing an EEG "cap" (array of EEG recording electrodes) modulates the brain rhythm to control the open and close sequence of a multi-fingered robotic arm.
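Once a band-power measure is available, the decision logic itself can be very simple. The toy continuation below, with an arbitrary threshold and a hypothetical send_command() stub, sketches how a drop in β-band power during motor imagery might be mapped to an open or close command.

```python
# Illustrative only: the threshold value and send_command() stub are assumptions,
# not the actual control scheme used with the robotic arm in Figure 5.
def classify_intent(beta_power, threshold=1.0):
    """Interpret a drop in beta-band power (motor imagery) as a 'close' command."""
    return "CLOSE_HAND" if beta_power < threshold else "OPEN_HAND"

def send_command(command):
    print("prosthesis command:", command)   # stand-in for the real hand interface

# a pretend stream of beta-band power values from successive EEG epochs
for beta_power in [1.8, 1.6, 0.7, 0.5, 1.5]:
    send_command(classify_intent(beta_power))
```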

This scientific frontier, broadly known as the field of Brain-Machine Interface (BMI), has progressed rapidly over the past 20 years. It has, however, become clear that EEG signals from the scalp are too weak and noisy to give us enough information to reliably control a prosthetic arm with the dexterity required for the functional tasks that amputees would like to accomplish. Other, more invasive sources of neural signals have therefore been investigated as potential sources of information about the subject's movement intent: electrodes can be placed directly on the brain surface, implanted into the brain to record the action potentials of single neurons, or inserted into the nerves that once sent signals to the amputated muscles.

In parallel, prosthesis technology has been revolutionized, thanks to an initiative undertaken by the Defense Advanced Research Projects Agency (DARPA) in the United States. DARPA, known for its support of bold and pioneering technologies, recognized the need to develop the next generation of limbs for war amputees. DARPA challenged a large team of researchers led by Johns Hopkins University to produce an anthropomorphic (human-like) dexterous limb, with size, weight, functionality, and dexterity comparable to those of the human arm. The Modular Prosthetic Limb (MPL) is a magnificent result of this initiative (Figure 6).

In its current form, the Johns Hopkins MPL is an incredibly complex and delicate machine, not yet ready for routine commercial or clinical use (nor is it affordable!). Like many pioneering ideas and initiatives, however, that research program also spawned the development of affordable, commercially and clinically ready devices. One of these, produced by the author's team and partners at Infinite Biomedical Technologies (Baltimore, USA) and displayed in Figure 7, is quite affordable and available for human use today.

Back to our goal: from mind to brain to hand. How do we relay the amputee's thoughts or intentions to this cutting-edge dexterous hand? How do we capture and interpret signals from the neurons in the brain, the downstream nerves, or the muscles themselves?

Nerves that formerly sent commands to the amputated muscles can be surgically re-routed to residual muscles in a procedure called Targeted Muscle Reinnervation (TMR). TMR uses the residual muscles as biological signal amplifiers so that the nerve signals can be easily detected on the surface of the skin. The re-routed nerves command the residual muscles; the muscle signals are picked up from the skin surface on the torso, decoded, and then used to control the advanced prosthetic arm (Figure 8). This nerve re-routing also offers the opportunity to restore feeling in the amputated limb by stimulating the residual muscles with vibration.
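To give a flavour of this decoding step, the sketch below trains a simple pattern classifier on synthetic surface-EMG features. The mean-absolute-value features, the linear discriminant classifier, and the data are illustrative assumptions, not the clinical TMR decoder itself.

```python
# A hedged sketch of the general idea of decoding surface EMG patterns into
# grasp classes; everything here (features, classifier, data) is illustrative.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
n_channels, n_trials = 8, 60              # e.g. 8 surface electrodes over reinnervated muscle

def features(emg_window):
    """Mean absolute value per channel: a common, simple EMG feature."""
    return np.mean(np.abs(emg_window), axis=-1)

# synthetic EMG windows for two intended movements (0 = hand open, 1 = hand close)
X = np.array([features(rng.normal(scale=1 + label, size=(n_channels, 200)))
              for label in (0, 1) for _ in range(n_trials)])
y = np.repeat([0, 1], n_trials)

decoder = LinearDiscriminantAnalysis().fit(X, y)
print("training accuracy:", decoder.score(X, y))
```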

Capturing the full intent to control a dexterous limb accurately and rapidly from the Central Nervous System (CNS) would require recording directly from a large population of neurons. Recording from neurons would, however, require brain surgery to implant microelectrodes, which record the action potentials of single neurons, or microelectrode arrays, which record the population activity of thousands of underlying neurons (Figure 9). The neurons produce a veritable symphony of signals, fairly random-looking to the naked eye, but nevertheless together encoding the information needed to control the muscles of our arm, hand and fingers. Advanced mathematical algorithms come into play to decode this seemingly complex pattern of neural activity and translate it into the coordinated action of the prosthetic hand and fingers.
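As a simplified illustration of what such decoding algorithms do, the sketch below fits a linear (ridge-regression) map from synthetic binned firing rates to a two-dimensional hand velocity. The model and the data are assumptions for illustration, not the algorithms used in the studies described here.

```python
# A minimal sketch of one common decoding idea: a linear map from binned neural
# firing rates to hand velocity. Synthetic data; illustrative assumptions only.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(1)
n_neurons, n_bins = 96, 1000              # e.g. a 96-channel microelectrode array

true_map = rng.normal(size=(n_neurons, 2))    # hidden mapping to (x, y) hand velocity
rates = rng.poisson(lam=5.0, size=(n_bins, n_neurons)).astype(float)
velocity = rates @ true_map + rng.normal(scale=2.0, size=(n_bins, 2))

decoder = Ridge(alpha=1.0).fit(rates[:800], velocity[:800])   # train on early bins
print("held-out R^2:", decoder.score(rates[800:], velocity[800:]))
```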

So here we are, at the frontier of the brain-machine interface. Very recently, human volunteer subjects have been implanted with microelectrodes and microelectronic circuits that interface with the brain. Even though these are early, pioneering investigations, they clearly demonstrate that the human mind and brain can control a sophisticated robotic arm or prosthesis. Still, it will be a long journey to establish that these brain interfaces are clinically ready, i.e. that they are both safe and effective devices. Such research studies and subsequent clinical procedures will need to be carried out under full ethical consideration and oversight (i.e. to help and do no harm). This exotic technology will also need to become more affordable to be accessible to a wide population. In the meantime we may ask: what is the future of this exciting field? There are many areas of exciting research, challenging problems, and frontiers of discovery that remain unexplored.

So far, we have mostly considered efforts towards achieving movement of the prosthesis. What about sensing or perceiving? Can sensation, whether it is the sense of touch or the perception of an object's texture or temperature, be conveyed to the amputee? How do we relay sensory perception back to the brain? One avenue is to electrically stimulate the brain areas involved in sensory perception. But will this electrical stimulation produce an unnatural sensory perception? Can we control and modulate vast arrays of sensory cues and convert them into a form that the subject's brain and mind will accept? Many challenging questions will have to be answered, but the technical path seems evident and feasible.

Another, more "futuristic" example is decoding human language or speech (Figure 10). The idea is to place electrodes in or on the regions of the brain known to be responsible for speech perception and production (e.g. Wernicke's area and Broca's area). Would it then be possible to decode and interpret speech or language from the brain signals recorded by these electrodes? There are early hints that it may be possible to decode sound or voice levels, and even some language features, from human brain signals. Additionally, we should be able to micro-stimulate the brain areas responsible for hearing to create synthetic sound perceptions. While such research is ongoing, hearing or language BMI systems are far from practical at this time.

Taking these BMI technologies to amputees, paralyzed patients, or patients who are "locked in" (i.e. unable to move or speak at all) will require considerable effort by engineers, clinicians, and industry to produce safe and effective solutions. Regulatory and ethical oversight will be critical so that we interface with the human brain and mind with due consideration for the health benefits to patients and a justifiable cost-benefit ratio for society. While the feasibility of BMI has been demonstrated in the context of the neuroprosthesis, significant technological, clinical, and regulatory challenges remain for its broad clinical utilization. Partnerships among researchers and industry will be needed to make this technology reliable and affordable so that it reaches the broad population. Nevertheless, a horizon is in sight at which we will be able to access the human mind, interface with the human brain, and develop practical interfaces, whether to control robotic limbs or to create artificial speech, with our thoughts alone.

Copyright © 2021 World Scientific Publishing Co Pte Ltd and National University of Singapore
INNOVATION magazine is a joint publication of Nanyang Technological University, National University of Singapore and World Scientific Publishing Co Pte Ltd