How are amputees able to control the fingers in their bionic arm ?



In: Technology

The same way you control your fingers. The brain sends a small pulse of electricity to the part you want to move and hey presto, you just went and moved a finger. Some of them work by attaching a small transmitter to the brain which allows the person to move the arm wirelessly. Kinda like a Bluetooth arm

Conventional myoelectric prostheses, which 99% of the bionic hands on the market use, are controlled by two inputs. Two sensors on the forearm (in the case of a below-elbow amputee) pick up muscle signals, and the amputee can either open or close the hand. The hand's programming uses a variety of combinations of these input codes to trigger different grips. For example, triggering open and close at the same time can engage one grip. Holding an open signal for a fixed time can trigger another grip. Sending a very fast/strong open signal can trigger a third grip. Same with a close impulse.
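The input-code idea above can be sketched as a small decoder. This is purely illustrative: the thresholds, hold time, and grip labels are made up, not taken from any real controller.

```python
# Illustrative sketch: decoding two myoelectric channels into hand actions.
# All thresholds and grip names are invented for the example.

def decode(open_level, close_level, open_hold_ms):
    """Map two muscle-signal levels (0.0-1.0) plus hold time to an action."""
    STRONG = 0.8    # a very fast/strong impulse
    ACTIVE = 0.3    # level at which a signal counts as "on"
    HOLD_MS = 1500  # holding open this long triggers a grip change

    if open_level > ACTIVE and close_level > ACTIVE:
        return "co-contraction: switch to grip A"   # open + close together
    if open_level > ACTIVE and open_hold_ms >= HOLD_MS:
        return "long open hold: switch to grip B"   # held open signal
    if open_level > STRONG:
        return "strong open impulse: switch to grip C"
    if open_level > ACTIVE:
        return "open hand"
    if close_level > ACTIVE:
        return "close hand"
    return "idle"
```

The point is just that two channels, combined with timing and intensity, yield far more than two commands.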

With practice, it can become somewhat natural.

Pattern recognition uses many more electrode sensors to pick up finer movements, capturing more detailed inputs directly from the remaining muscles, but it still is not at the individual-finger-control level yet.

To control flexing and extending of 5 individual fingers, you need 10 different inputs. This level of detail is only possible with implanted electrodes. To control every joint of each finger like you can with your natural hand, you would need about 35 different inputs.
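The input counts above follow from simple arithmetic; here's the rough tally (the joint count is a simplification of hand anatomy, and the "about 35" figure from the comment presumably also folds in wrist/thumb degrees of freedom):

```python
# Rough input-count arithmetic, illustrative only.
fingers = 5
directions = 2                            # flex and extend
per_finger_inputs = fingers * directions  # 10 inputs for whole-finger control

# Simplified joint count: ~3 joints per finger, ~2 for the thumb.
joints = 4 * 3 + 2                        # 14 joints
per_joint_inputs = joints * directions    # 28; "about 35" once wrist/thumb
                                          # degrees of freedom are added
```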

Just guessing, but no one as of yet has mentioned that the muscles which control your fingers are in your forearm. I bet the prosthesis has sensors in the forearm area to control the fingers.


The company I work at actually exclusively works on this!

/u/WashingtonFierce's post is incorrect; we do not yet have commercial technology designed specifically to physically interact with the brain and detect limb movement. The amount of time, money, and risk involved is prohibitively monumental. Imagine being an ethics review board member reviewing an experiment that involves opening up a subject's head to insert a sensor that you can't know will work effectively and reliably until you try.

/u/TheLazyD0G has more or less explained it correctly, and I can add some additional info.

* Your brain sends signals to nerves in the arm, and the nerves are connected to the arm muscles. What actually happens when your arm muscle contracts is a bunch of calcium and potassium ions moving across the cell membranes of your muscle fibers. Since these ions carry positive charges, their movement generates a very, very tiny voltage. The two sensors that sit in the prosthetic socket and touch the forearm are sensitive enough to pick up this change in voltage. This concept is called EMG (electromyography).

* So depending on the size of the voltage change, the sensors can detect how hard you're flexing, or whether you're flexing at all.

* The way the hands are programmed is that they cycle through different pre-set grip modes, and the patient can only open and close them within each mode. On the bebionic3, for example, you start out in Tripod grip (so you only close index, middle, and thumb) and can only open and close in that formation. You have to press the button on the back to change to Power grasp, and then you can only open and close them as a fist. You then have to press the button, AGAIN, to go into another grip, say Precision grasp, and then you can only open and close the thumb and index finger together. In a sense, they're just hand-shaped swiss army knives.

* The patient opens and closes them by flexing their limb in one direction or another. Imagine flexing your wrist towards your chest. That’s close. Now flex your wrist away from you. That’s open.

* This can get tedious (how many times did I have to press the button?) and can get frustrating if you make a mistake in a high-pressure situation (e.g. putting change into your wallet after the cashier hands it to you).

* The pattern recognition that /u/TheLazyD0G mentions attempts to use multiple (3+) sensors and machine learning to have the arm change the grip based on which hand gesture you trained it to do earlier. However, this concept is still bogged down by the hand’s programming of only changing between different pre-set gestures.

* We have not yet achieved the level of fineness needed to detect individual finger movements, largely due to the concept of "crosstalk". With the current size of these skin sensors, the region of muscle they observe is too broad to distinguish whether a movement was for one finger versus another. Implantable sensors could theoretically solve this issue, but research into them has so far been very preliminary.
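To make the EMG idea above concrete, here's a minimal sketch of the usual first processing steps: rectify the tiny voltage signal, smooth it into an envelope, and threshold it to decide whether the muscle is flexing. All sample values and thresholds here are invented for illustration.

```python
# Illustrative EMG processing: rectify, moving-average envelope, threshold.

def emg_envelope(samples, window=4):
    """Rectify a signal and smooth it with a trailing moving average."""
    rectified = [abs(s) for s in samples]
    env = []
    for i in range(len(rectified)):
        chunk = rectified[max(0, i - window + 1): i + 1]
        env.append(sum(chunk) / len(chunk))
    return env

# Made-up signal: quiet at first, then a burst of muscle activity.
signal = [0.01, -0.02, 0.01, 0.5, -0.6, 0.55, -0.5, 0.02, -0.01]
env = emg_envelope(signal)
flexing = [e > 0.2 for e in env]  # simple on/off "are you flexing?" decision
```

A real prosthesis does this with analog filtering and far higher sample rates, but the shape of the idea is the same: turn a noisy, sign-flipping voltage into a smooth intensity estimate.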
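The "hand-shaped swiss army knife" behaviour described above can be sketched as a tiny state machine. The grip names follow the bebionic3 examples in this comment, but the code itself is purely illustrative.

```python
# Illustrative mode-cycling hand: a button press cycles through pre-set
# grips; open/close only ever acts within the current grip.

GRIPS = ["tripod", "power", "precision"]

class ModeCyclingHand:
    def __init__(self):
        self.mode = 0  # start in Tripod grip, as in the example

    def press_button(self):
        """Cycle to the next pre-set grip and return its name."""
        self.mode = (self.mode + 1) % len(GRIPS)
        return GRIPS[self.mode]

    def current_grip(self):
        return GRIPS[self.mode]

hand = ModeCyclingHand()
hand.current_grip()  # starts in tripod
hand.press_button()  # now power grasp
hand.press_button()  # now precision grasp
```

Note there is no way to jump straight from tripod to precision; you have to cycle through, which is exactly the tedium the comment complains about.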

Let me know if you have other questions!

Depends on the exact type of bionic arm, but in general, bionic limbs have a sensor somewhere in the body (sometimes at the stump itself, sometimes elsewhere) that measures either muscle contraction or nerve/neuron activity directly and interprets the different signals as limb position or movement velocity. Some variations are more intuitive to control than others, but over time the person generally learns to control it with a bit more dexterity.