How bionic prostheses work
Until recently, prostheses were attached to the human body purely mechanically and had no connection to the nervous system. They could bend at their metal joints, but for every movement the wearer had to deliberately control the prosthesis, providing feedback manually. In this way a person signaled to an artificial leg that there was a puddle ahead to be stepped around, or to an artificial hand that it should pick up an egg gently to fry it or, conversely, grip a tool firmly. Learning to control a new limb this way took a long time, the set of commands was quite limited, and fine motor skills left much to be desired.
But scientists, inspired by the imagination of science fiction writers, have managed the incredible: connecting a mechanical arm to the human nervous system.
Timur Bergaliev, head of the Laboratory of Applied Cybernetic Systems at the Moscow Institute of Physics and Technology and project manager of GalvaniBionix: “To control prostheses, we are developing a technology that adapts to the individual. On the stump we place not one pair of electrodes, as is usually done, but several. The more electrodes we use, the larger the sample of signals we get for analysis. Yes, this greatly complicates the computer’s work, since it is harder for the processor to analyze so many signals. But the patient’s life becomes much simpler.”
When a person without a hand wants to move a finger, the brain generates the appropriate signal, which travels along the nerves leading to the muscles of the limb. But since there is no hand, the signal goes “into the void”. What if those nerve impulses could be “intercepted” somewhere along the way and, after the data is analyzed and processed, turned into commands for a robotic arm? This is the path taken by numerous scientific groups trying to develop prostheses that read nerve signals and transform them into movements.
At the University of Houston and Rice University in the United States, experiments were conducted on recording motor signals by electroencephalography (EEG), using electrodes placed on the scalp. The difficulty is that an EEG is a superposition of a large number of different signals, and picking out those that control the movement of a limb is akin to searching for a needle in a haystack.
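To see why this is hard, here is a toy sketch (my own illustration, not the Houston/Rice method, with made-up frequencies and amplitudes): a simulated scalp signal mixes a strong background alpha rhythm with a much weaker motor-related component, and only frequency-selective analysis can tell a "movement" trace from a "rest" trace.

```python
import math

FS = 250          # sampling rate, Hz (typical for EEG)
DURATION = 2.0    # length of the simulated recording, seconds

def synth_eeg(motor_amplitude):
    """Simulate a scalp trace: 10 Hz alpha + 22 Hz motor-beta + slow drift."""
    samples = []
    for n in range(int(FS * DURATION)):
        t = n / FS
        alpha = 40 * math.sin(2 * math.pi * 10 * t)               # strong background rhythm
        motor = motor_amplitude * math.sin(2 * math.pi * 22 * t)  # weak motor-related rhythm
        drift = 15 * math.sin(2 * math.pi * 3.7 * t + 1.0)        # slow artifact
        samples.append(alpha + motor + drift)
    return samples

def band_power(signal, freq):
    """Estimate power at one frequency by correlating with sine/cosine references."""
    n = len(signal)
    s = sum(x * math.sin(2 * math.pi * freq * i / FS) for i, x in enumerate(signal))
    c = sum(x * math.cos(2 * math.pi * freq * i / FS) for i, x in enumerate(signal))
    return (s * s + c * c) / (n * n)

rest = synth_eeg(motor_amplitude=0)
movement = synth_eeg(motor_amplitude=12)

# The 22 Hz "motor" component is buried under a far stronger alpha rhythm,
# yet frequency-selective analysis still separates the two conditions.
print(band_power(rest, 22) < band_power(movement, 22))        # True
print(band_power(movement, 10) > band_power(movement, 22))    # True: alpha dominates
```

Real EEG decoding faces the same problem at much worse odds: many overlapping sources, non-stationary noise, and no clean reference frequency to correlate against.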
Researchers from Chalmers University of Technology in Gothenburg, Sweden, together with colleagues from the NEBIAS consortium (a project of several European universities), took a different path. Instead of placing the electrodes on the surface of the skin, where the useful signal is very noisy, they tried to reduce interference by implanting the electrodes under the skin. But every person’s physiology is individual, and it is impossible to say in advance exactly where the electrodes should be placed for the best signal-to-noise ratio.
Currently, the most promising method of controlling bionic prostheses is electromyography (EMG): reading electrical potentials from the muscles of the stump. Such high-tech prostheses have already moved beyond the laboratory and are being mass-produced. However, teaching the patient to control the prosthesis properly remains a difficult problem.
In the Laboratory of Applied Cybernetic Systems at the Moscow Institute of Physics and Technology, they are trying to turn this problem on its head: that is, to “train” the prosthesis to correctly understand the commands of the human brain. The GalvaniBionix team, consisting of students and graduate students of the Moscow Institute of Physics and Technology led by Timur Bergaliev, uses not one pair of electrodes but many to read electrical potentials from the muscles. This approach yields a significant increase in the level of the useful signal and makes “self-learning” algorithms possible. Each combination of signals arriving from the different electrodes corresponds to a certain hand action, and the task is to build a library of correspondences that the system consults whenever it receives a new set of impulses. “The software learns to correctly recognize brain commands by adapting to a specific person,” Bergaliev explains. “We were able to demonstrate a working prototype of the system: an amputee could move a cursor on a screen using muscle signals alone. In the future, we plan to use machine learning algorithms to analyze how often various combinations of signals occur and to use these data to improve recognition.”

The article “By a Wave of Thought” was published in the journal Popular Mechanics (No. 2, February 2016).
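The correspondence-library idea described above can be sketched in a few lines. This is a minimal illustration under my own assumptions (hypothetical four-electrode amplitude patterns and action names, not the GalvaniBionix code): each stored entry maps a combination of per-electrode EMG amplitudes to a hand action, a new reading is matched to the nearest stored pattern, and confirmed readings are added back so the library adapts to the wearer.

```python
import math

# Hypothetical library: 4-electrode EMG amplitude patterns -> hand actions.
library = [
    ((0.9, 0.1, 0.2, 0.1), "close fist"),
    ((0.1, 0.8, 0.7, 0.2), "open hand"),
    ((0.2, 0.2, 0.1, 0.9), "rotate wrist"),
]

def distance(a, b):
    """Euclidean distance between two electrode-amplitude vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(reading):
    """Return the action whose stored pattern lies nearest to the reading."""
    return min(library, key=lambda entry: distance(entry[0], reading))[1]

def learn(reading, action):
    """'Self-learning': store a confirmed reading so the library adapts."""
    library.append((tuple(reading), action))

print(classify((0.85, 0.15, 0.25, 0.05)))  # close fist

# A new gesture the library has never seen can be taught on the fly.
learn((0.5, 0.5, 0.5, 0.5), "pinch")
print(classify((0.55, 0.45, 0.5, 0.5)))    # pinch
```

With more electrodes the vectors get longer, so distinct gestures become easier to separate, at the cost of more signals for the processor to analyze, exactly the trade-off Bergaliev describes.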