Bridging Minds and Machines

Tackling the Brain-Robot Communication Challenge
By Jeff Brooks-Gillies

Amid all the galaxy-traversing technology in the Star Wars series, one of the greatest scientific advancements on display passes quickly in a seconds-long scene in a doctor’s office. A medical droid pokes Luke Skywalker’s new robotic hand, and the hand flinches, suggesting it not only relays realistic pain sensations but also reacts naturally to the motor control signals the brain sends in response.

Here on Earth, recreating this seamless two-way communication between the brain and robotic devices is a "grand challenge" in the field of robotics and neuroscience, according to Erik Engeberg, Ph.D., a professor in Florida Atlantic University’s Department of Ocean and Mechanical Engineering at the College of Engineering and a member of Florida Atlantic’s Stiles-Nicholson Brain Institute.

Ideally, an artificial limb would respond to the brain’s movement signals while also providing realistic feedback that mimics the full spectrum of touch sensations.

"They are two problems that interact with each other and compound upon one another," Engeberg said. "One is motor control, and one is on sensation and feedback, and both of those need to be working really well to enable high functionality."

While scientists have made great advances in brain-machine interfaces that communicate through the electrical signals traveling through our nerves, obstacles remain, including cost, risk and regulatory requirements. A new research platform developed by Engeberg and his collaborators could help overcome those hurdles.

The platform pairs a robotic, sensor-equipped hand with a biological neural network — neurons cultured in a multichannel microelectrode array — along with electrodes that record electrical activity. According to an article recently published in the journal Biomimetics, the platform showed that the neural network was able to differentiate between two categories of modeled human touch receptors while also exerting motor control over the robotic hand.

"I’m trying to find a way to accelerate progress through this new interface and platform," said Engeberg, who is also affiliated with the Department of Biomedical Engineering, Department of Electrical Engineering and Computer Science, and the Center for Complex Systems and Brain Sciences.

"There is good potential to reduce risk during the technology and hardware evaluation phase of experiments, and also for algorithm development with these kinds of interfaces," he said.

Even top-of-the-line assistive devices have limited functionality because they don’t speak the same language as our nervous system, Engeberg said. The new study builds on previous findings from his research group on translating robotic sensations into a code that nerves are able to understand and convert into a plan for action.

In the nervous system, information is transmitted in spikes called action potentials or nerve impulses. The robotic hand has a sensor that measures touch-related parameters like force and slippage. The platform converts those digital measurements into spikes that are quite similar to those produced naturally in a human, Engeberg said.
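
As a rough illustration of that encoding step, the sketch below turns a single force reading into a train of simulated spikes using a basic rate code. The parameter values and the Poisson-style model are assumptions chosen for clarity; the study’s biomimetic encoding models are considerably more detailed.

```python
import numpy as np

def encode_force_as_spikes(force_n, duration_s=0.5, max_rate_hz=200.0,
                           max_force_n=10.0, dt=0.001, seed=0):
    """Turn one fingertip force reading into a train of spike times.

    Simplified rate code: the firmer the touch, the faster the simulated
    afferent fires. Illustrative only; not the encoding used in the study.
    """
    rng = np.random.default_rng(seed)
    rate_hz = max_rate_hz * min(force_n / max_force_n, 1.0)
    n_bins = int(duration_s / dt)
    # A spike lands in each 1 ms bin with probability rate * dt.
    spike_bins = rng.random(n_bins) < rate_hz * dt
    return np.nonzero(spike_bins)[0] * dt  # spike times in seconds

# A firmer grasp yields a denser spike train than a light touch.
print(len(encode_force_as_spikes(2.0)), "spikes for a light touch")
print(len(encode_force_as_spikes(8.0)), "spikes for a firm grasp")
```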

That signal is then sent to electrically stimulate the cells in the biological neural network, which Engeberg refers to as a brain on a dish.

"The neurons are feeling, in a somewhat realistic way, what the robotic hand is feeling if it were all one biological organism, like a human or an animal," he said.

The platform then records the electrical activity evoked in the neurons by the tactile signal from the hand, and that activity is decoded and converted into another signal that can be used to control the hand.

"It forms a loop," Engeberg said. "The signal comes from the hand to the neurons, and then from the neurons to the hand."

For the study published in Biomimetics, the researchers explored how the platform would respond to mathematical models that mimic the sensations associated with two different categories of biological mechanoreceptors. Both types of receptors — rapidly adapting and slowly adapting — detect elements of touch like pressure and vibration. The rapidly adapting receptors respond briefly to a stimulus, while the slowly adapting receptors continue to respond as long as the stimulus is present.
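
That qualitative difference can be captured with two toy firing-rate expressions: a slowly adapting rate that tracks the pressure itself, and a rapidly adapting rate that tracks how quickly the pressure changes. The sketch below is an illustrative assumption, far simpler than the mathematical receptor models the researchers actually used.

```python
import numpy as np

dt = 0.001                                             # 1 ms time step
t = np.arange(0.0, 1.0, dt)
pressure = np.where((t > 0.2) & (t < 0.8), 1.0, 0.0)   # a 600 ms press

# Slowly adapting (SA): keeps firing for as long as the pressure is applied.
sa_rate_hz = 100.0 * pressure

# Rapidly adapting (RA): fires only when the pressure changes, so it bursts
# at contact and release and falls silent during the steady hold.
ra_rate_hz = 0.1 * np.abs(np.gradient(pressure, dt))   # arbitrary scaling

mid_press = int(0.5 / dt)
print("SA rate mid-press:", sa_rate_hz[mid_press], "spikes/s")   # sustained firing
print("RA rate mid-press:", ra_rate_hz[mid_press], "spikes/s")   # silent
print("RA peak rate:     ", ra_rate_hz.max(), "spikes/s")        # bursts at onset/offset
```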

When the research team stimulated the biological neural network using those models, the neurons responded and controlled the robotic system differently, Engeberg said. They were able to detect and classify these signatures in the recorded neural activity using a convolutional neural network, a type of artificial intelligence used for image recognition.

That shows that the closed-loop robotic hand system is responsive to different sensations of touch, and that researchers can quantify the outcomes that different encoding methods produce, Engeberg said.
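
For a rough sense of what such a classifier might look like, the sketch below builds a small convolutional network in PyTorch that treats a recording as a two-dimensional grid of electrode channels by time bins and outputs a score for each of the two receptor categories. The architecture, input dimensions and random input are assumptions for illustration, not the network or data from the study.

```python
import torch
import torch.nn as nn

# Hypothetical recording dimensions: 32 electrode channels x 200 time bins.
N_CHANNELS, N_TIME_BINS = 32, 200

model = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3, padding=1),    # local spatiotemporal features
    nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Conv2d(8, 16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(16 * (N_CHANNELS // 4) * (N_TIME_BINS // 4), 2),  # RA-like vs. SA-like
)

# One fake recording: batch of 1, a single input channel, 32 x 200 activity map.
recording = torch.randn(1, 1, N_CHANNELS, N_TIME_BINS)
logits = model(recording)
print(logits.shape)   # torch.Size([1, 2]) -> one score per receptor category
```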

"It’s like a proof of concept," he said. "We actually are able to, with some precision and fidelity, tease out the details and understand what the potential ramifications are when we’re trying to remap someone’s sensation of touch."