A Feeling Touch


The act of reading this article probably doesn’t seem like a complex task to you, or much of a tactile one. Your eyes simply scan the text; your brain processes the sentences and images to give them meaning. But if you think about it, there’s more to the reading experience. Chances are, you’re using your hands to access these lines, whether by turning the pages of the physical magazine, clicking around with a computer mouse, or tapping on a tablet screen. But these motions are so ingrained, so intuitive, you likely don’t think of them as part of the reading process at all. Now imagine what reading a magazine would be like if you weren’t able to feel those movements, if you had to flip through the pages without being able to perceive their texture with your fingers, or if you had to type and click without any feeling in your hands, using only vision to guide you. It would change the entire experience, and make it much more difficult.

This is the challenge experienced by patients who, for whatever reason, have lost the use of their hands or arms, and thus have lost their sense of touch. And giving them back that critical yet oft-ignored sense is why Caltech biologist Richard Andersen is working so hard to incorporate a sense of touch into the neural prosthetics he’s been helping develop for years—devices implanted in the brain that allow a paralyzed patient to manipulate a robotic arm.

Andersen and colleagues first reported success with their original implant in early 2015. The team, led by Andersen, placed the prosthesis in the posterior parietal cortex, an area that encodes one’s intent to move, rather than in regions that control movement directly, as previous experiments had done. This allowed Erik Sorto, a 35-year-old man who had been paralyzed from the neck down for more than 10 years, to use a robotic arm placed next to his body to perform a fluid hand-shaking gesture, play rock-paper-scissors, and even grasp a bottle of beer and bring it to his mouth for a sip, something he had long dreamed of doing.

“We showed that the posterior parietal cortex is an important source for gathering signals for the robotic arm that allow the patient to think just about the movement in general rather than in detail,” explains Andersen. “As a result, we think moving the arm becomes more intuitive to the patient and requires less concentration. It’s also faster and more efficient.”

THE NEXT LEVEL

This kind of innovative reimagining of how to make a robotic arm move garnered Andersen a 2015 National Science Foundation (NSF) grant from President Obama’s Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative, as well as seed money from the California Blueprint for Research to Advance Innovations in Neuroscience (Cal-BRAIN) program, the state’s complement to the federal initiative. Cal-BRAIN gave out its first-ever monetary awards last year to a group of researchers that included Andersen.

Andersen is now using those Cal-BRAIN funds—designed to bring together interdisciplinary teams of scientists and engineers from diverse fields for fundamental brain research—to take his team’s work to the next level. His hope: to enable people using robotic arms to literally regain their sense of touch, their ability to feel an object in their “hands.”

“Our patients with high-level spinal cord injuries are not only paralyzed, but they can’t feel below their necks,” explains Andersen. “The Cal-BRAIN grant is allowing us to put sensors on the hands of the robotic limbs, with the idea that these sensors will communicate with a prosthetic implant to stimulate an area of the brain that could reproduce a sense of touch when the robot hand touches an object.”

Right now, patients can use vision to guide the robotic arm. But that’s far from ideal, says Andersen, who notes that if you anesthetize even just the fingertips of a healthy individual, they will have great difficulty manipulating objects.

“It’s hard to change the position of the object in the hand or to know if it’s slipping or if the grasp is too tight if you can’t actually feel it,” he points out. “We want to return that sensation.”

To make that possible, researchers will need to know exactly how sensations of touch can be evoked in the brain through electrical stimulation, so that they can implant prostheses to stimulate those neurons. Andersen decided it would be best to start with participants who do have a sense of touch. So he teamed up with the same colleagues from the University of Southern California who had worked with Sorto to study a subset of patients with epilepsy who have had temporary grids of electrodes implanted in their brains to locate the source of their seizures, a fairly common diagnostic step for patients whose epilepsy has not responded to drugs and for whom surgery may be an option. These patients are of interest not because of their epilepsy, but because the implanted grids, which often extend into the somatosensory cortex where the sense of touch seems to live, allow Andersen and his team to stimulate this specific area of the cortex, electrode by electrode. Each electrode produces sensations at a different location on the subject’s hand, allowing the team to begin building a stimulation map of the hand.

“So far we’ve had two patients in whom we could map a hand representation using these grids,” says Andersen, who stresses that the stimulation doesn’t produce seizures or any other side effects.

Andersen and colleagues have also tested a healthy monkey’s use of a virtual arm, represented as an avatar on a computer screen. The monkey used movements of its own limb to control the avatar limb, and when the virtual limb touched a virtual object, stimulation delivered to the brain produced the sensation of touching that object.

“The idea here is that the subject is controlling the avatar hand movements with real movements of his own limb, but the sensation he feels in his hand is all coming from the brain,” says Andersen. “And it works very well.”

By combining the hand-representation maps with data from the virtual-arm avatar studies, Andersen and his colleagues now have enough evidence to move ahead with implanting a paralyzed patient soon.

“We’ll be implanting the recording electrodes in two areas of the cerebral cortex—in the posterior parietal cortex where we implanted Erik Sorto’s prosthesis, and one in the premotor cortex, an area in front of the motor cortex that’s also important for grasping,” says Andersen. Additionally, two arrays of stimulating microelectrodes will be implanted in the somatosensory cortex. “That will help us see if we can use the sensory information as feedback to improve dexterity in activities that the subject performs with the brain-controlled robotic limb.”
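To make that planned loop concrete, here is a rough sketch, in Python, of the kind of cycle such a system might run: decode the intended grasp from the recorded signals, let the robotic hand act on it, then translate the hand’s touch-sensor readings into stimulation of electrodes mapped to locations on the hand. Everything here, from the function names to the numbers, is an illustrative assumption rather than the team’s actual software.

```python
# Toy sketch of a bidirectional prosthetic loop (illustrative only).

# Hypothetical stimulation map: which stimulating electrode evokes a sensation
# at which hand location (built up electrode by electrode, as in the mapping
# sessions with the epilepsy patients).
STIMULATION_MAP = {"thumb": 3, "index": 7, "middle": 12, "palm": 20}

def decode_intent(spike_counts):
    """Stand-in decoder: pick the grasp whose recorded activity is highest."""
    grasps = {"open": spike_counts[0], "pinch": spike_counts[1], "power": spike_counts[2]}
    return max(grasps, key=grasps.get)

def pressure_to_current(pressure_newtons):
    """Map a fingertip pressure to a stimulation amplitude, capped at 100 microamps."""
    return min(100.0, 20.0 * pressure_newtons)

def one_cycle(spike_counts, fingertip_pressures):
    """One pass through the loop: decode, 'move' the hand, then feed touch back."""
    grasp = decode_intent(spike_counts)            # decode intent from recordings
    print(f"Robotic hand forms a {grasp} grasp")   # the arm carries out the intent
    for location, pressure in fingertip_pressures.items():
        electrode = STIMULATION_MAP[location]      # electrode for this spot on the hand
        amps = pressure_to_current(pressure)       # encode touch as stimulation
        print(f"Stimulate electrode {electrode} at {amps:.0f} microamps for the {location}")

# Example: strong activity on the 'power grasp' channel; a bottle pressing on the hand.
one_cycle([2, 5, 14], {"thumb": 1.5, "index": 2.0, "palm": 3.5})
```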

Since this would be their first attempt at providing sensory input to help a patient better use a prosthesis, there are a number of interesting challenges ahead, he says. For example, the team doesn’t know if stimulation through the somatosensory cortex implants will reproduce sensations in the hand naturally, or if it will be a bit like the cochlear implant, which creates sensations similar to natural hearing—but not exactly alike—that the users must then learn to interpret as speech.

“We’d like it to be as natural as possible, but I have a feeling our subjects will need to do some interpreting of the signals, because even though the implanted electrodes are tiny, the stimulation will still likely activate hundreds of neurons, which may be too many neurons to produce a highly natural sensation,” says Andersen. People with the implant, then, will likely need to learn how to use sensory signals that feel somewhat less natural.

A MORE HELPFUL ROBOT

Adding touch to a robotic arm isn’t Andersen’s only line of inquiry. Remember the story of Erik Sorto, who was able to reach out and drink beer for the first time since he had become a quadriplegic? That was a feasibility study showing that Sorto could use the arm to sip a drink at his own pace. But he didn’t do it entirely on his own; he was assisted by “smart” robotics functions, such as video cameras that determine the location and shape of an object in three dimensions and use that information to help the robotic arm form the correct grasp, so that its hand could grip the beer bottle. The smart robotics also provided more precise control of the hand’s digits (aka fingers) and determined how far the arm needed to reach for an object.

“By blending the intent of the subject with these smart robotics, it makes things much easier because the patient doesn’t have to worry about all the small details,” says Andersen.
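The blending Andersen describes can be pictured, very loosely, as a weighted mix of two movement commands: one decoded from the brain, one planned by the vision-guided robotics. The short Python sketch below is a hypothetical illustration of that idea, not the team’s actual control software; the function name, weights, and vectors are all assumptions.

```python
def blend(decoded_direction, planned_direction, assistance=0.7):
    """Weighted mix of the brain-decoded movement vector and the vision-planned
    vector: assistance=0 is pure brain control, assistance=1 is fully autonomous."""
    return tuple(
        (1 - assistance) * d + assistance * p
        for d, p in zip(decoded_direction, planned_direction)
    )

# The decoder says "roughly forward and to the right"; the cameras, which see the
# bottle's exact position and shape, plan "forward, slightly left, and down".
decoded = (0.8, 0.3, 0.0)
planned = (0.9, -0.1, -0.2)
print(blend(decoded, planned))  # the arm follows mostly the vision-planned path
```

Turning the assistance weight down would hand more of the fine control back to the patient, which is roughly the trade-off between brain signals and smart robotics that Andersen describes.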

It is to support this kind of research that Andersen received, and is using, his NSF award. The smart robotics study builds on Andersen’s prior experience with Erik Sorto and others like him, and brings together the same team: researchers and physicians from USC and Rancho Los Amigos National Rehabilitation Center. The Applied Physics Lab at Johns Hopkins, which develops the robotic technology, is also involved.

“The NSF grant is allowing us to continue our collaboration with our robotics colleagues to move forward with the smart robotics, which is really how things in this area are going to work in the future,” says Andersen. “Hoping that you can control everything you want the arm to do with just the few signals that you get out of the brain is not realistic. We need smart robotics to do at least some of the work.” The team plans to combine smart robotics with neural signals that select objects and the actions to perform on them, in order to expand the range of activities of daily living that the subjects can carry out.

Ultimately, Andersen hopes to bring this work and the sensory-feedback prosthetic work together, giving people with paralysis from a variety of causes, including strokes, nerve injuries, and peripheral neuropathies, the control they need to return to the most rewarding activities of their lives, like typing or playing the piano.

“Further down the line, with more grant funding and other support, we would like to combine a sensorized hand with smart robotics,” says Andersen. “The idea would be to get to the point where the robotic arm is as good as a human arm.”

But, he says, there are a lot of hurdles. “Among them is the current size of the system, which includes computers, cameras, and other machinery,” says Andersen. “Another hurdle is that electrodes themselves need to be more biocompatible so more signals can be recorded in the brain for a much longer period of time.”

In the meantime, Andersen has come up with a bit of a workaround: he’s helping restore some activities to paralyzed patients through the use of a computer tablet. In collaboration with UCLA, Andersen has been working with a subject, Nancy Smith, who has what he calls “really good bilateral finger representations,” meaning that she can imagine finger movements to type or play the piano using a virtual keyboard or virtual piano keys.

“We think an Android tablet or iPad is an especially valuable assistive device because so many functions are controlled by tablets now,” Andersen says. “And all these projects feed into one another—as we learn different information in one, we might be able to apply it elsewhere or to the whole range of studies.”

And all of this possibility and potential, he says, has come as a result of the awards he’s been given by Cal-BRAIN and the NSF, among others.

“Grants help to keep collaborations going and are beneficial to relationships that have been built through research,” says Andersen. “In fact, our projects are examples of essential interdisciplinary funding and collaboration. If all of us weren’t working on these projects, they wouldn’t work. We think that collaboration is going to make a huge difference not only in our understanding of the brain, but also in the lives of the patients. And that’s why we do this.”

Header image illustrated by Mark McGinnis for Caltech