Prosthetics that can outperform real limbs
Developing the world's most advanced arm prosthesis with Model-Based Design
By James Burck, Michael J. Zeher, Robert Armiger, and James D. Beaty, Johns Hopkins University Applied Physics Laboratory
Few of us are aware of the intricate interplay of neural, mechanical, and sensory systems required to perform a task as simple as picking up a ball. Creating an arm prosthesis with a natural movement profile means replicating all of these complex subsystems and their interactions with state-of-the-art actuators, sensors, microprocessors, and embedded control software. Researchers took on this challenge in the Defense Advanced Research Projects Agency's (DARPA) "Revolutionizing Prosthetics" program.
The Applied Physics Laboratory (APL) at Johns Hopkins University leads a worldwide team of government institutions, universities, and private companies whose goal is to develop a prosthetic arm that far surpasses any prosthesis available to date. The final version of the arm will be driven by control algorithms that respond to nerve impulses, allowing the wearer to act with the speed, dexterity, and strength of a real arm. Using the latest sensory-feedback methods, the wearer will also be able to feel physical sensations such as pressure, force, and temperature. A central milestone of the project was the development of the Virtual Integration Environment (VIE), an environment built with MathWorks tools and Model-Based Design to simulate complete limb systems.
With its standardized architecture and well-defined interfaces, the VIE enables experts from more than two dozen partner organizations to collaborate. Model-Based Design with MathWorks tools was also used in other key phases of development: modeling limb mechanics, testing new neural decoding algorithms, and developing and verifying control algorithms.
Architecture of the Virtual Integration Environment
The VIE architecture consists of five main modules: inputs, signal analysis, controls, plant, and presentation. The inputs module encompasses all devices patients can use to signal their intentions: surface electromyography (EMG) electrodes, cortical and peripheral neural implants, implantable myoelectric sensors (IMES), and more conventional digital and analog inputs such as the switches and joysticks used in clinical research. The crucial task of the signal analysis module is to interpret this raw data with pattern-recognition algorithms, derive the user's intent, and pass it on to the controls module. There, these commands are converted into motor signals that drive the individual actuators moving the arm, hand, and fingers. The plant module consists of a physical model of the arm's mechanics. Finally, the presentation module generates a three-dimensional (3D) animation of the arm's movement (Fig. 1).
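To make the signal analysis module's job concrete, the following is a minimal sketch, not the VIE implementation: it extracts two standard time-domain features from synthetic surface-EMG windows (mean absolute value and waveform length) and maps them to an intent class with a nearest-centroid rule. The channel count, window length, and the "open"/"close" labels are all invented for illustration.

```python
import numpy as np

def emg_features(window):
    """Per-channel mean absolute value and waveform length of one EMG window."""
    window = np.asarray(window, dtype=float)
    mav = np.mean(np.abs(window), axis=1)                  # amplitude feature
    wl = np.sum(np.abs(np.diff(window, axis=1)), axis=1)   # cumulative signal length
    return np.concatenate([mav, wl])

class NearestCentroidDecoder:
    """Assign a window to the intent class with the closest feature centroid."""
    def fit(self, feats, labels):
        self.classes_ = sorted(set(labels))
        self.centroids_ = {
            c: np.mean([f for f, l in zip(feats, labels) if l == c], axis=0)
            for c in self.classes_
        }
        return self

    def predict(self, f):
        return min(self.classes_, key=lambda c: np.linalg.norm(f - self.centroids_[c]))

rng = np.random.default_rng(0)
# Two hypothetical intents on 4 channels: "open" (low activity) vs "close" (high).
windows = [rng.normal(0.0, amp, size=(4, 200)) for amp in (0.2,) * 20 + (1.0,) * 20]
labels = ["open"] * 20 + ["close"] * 20
decoder = NearestCentroidDecoder().fit([emg_features(w) for w in windows], labels)

test_window = rng.normal(0.0, 1.0, size=(4, 200))   # a high-activity burst
print(decoder.predict(emg_features(test_window)))    # "close"
```

Real myoelectric classifiers use richer feature sets and classifiers, but the pipeline shape is the same: window the raw signal, extract features, classify intent, forward the result to the controller.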
Connection with the nervous system
Simulink® and the VIE were indispensable in developing an interface that allows the prosthesis to be controlled in a natural, intuitive way. The researchers record data from implanted neurochips while test subjects perform tasks such as reaching for a ball in the virtual environment. The VIE's modular input systems receive this data, and MATLAB® algorithms deduce the subject's intent through pattern recognition that relates the neural activity to the subject's movement (Fig. 2). The results are fed back into the VIE, which allows such experiments to run in real time. A number of input devices have been developed using the same workflow, some of which are already being tested by prosthesis users at the Rehabilitation Institute of Chicago.
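The simplest form of a mapping that "relates neural activity to movement" is a linear decoder fitted by least squares. The sketch below is illustrative only, not APL's algorithm, and all data in it are synthetic: firing rates are generated as a noisy linear function of hand velocity, and the decoder then recovers intended velocity from rates alone.

```python
import numpy as np

rng = np.random.default_rng(1)
n_samples, n_units = 500, 12

# Synthetic ground truth: each unit's firing rate depends linearly on 2-D velocity.
true_tuning = rng.normal(size=(n_units, 2))        # hypothetical tuning weights
velocity = rng.normal(size=(n_samples, 2))         # recorded hand velocity
rates = velocity @ true_tuning.T + 0.1 * rng.normal(size=(n_samples, n_units))

# Training: least-squares decoder W such that rates @ W approximates velocity.
W, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

# Decoding a new trial: predict intended velocity from firing rates alone.
new_velocity = np.array([[0.5, -0.3]])
new_rates = new_velocity @ true_tuning.T
decoded = new_rates @ W
print(np.round(decoded, 2))   # close to [[0.5, -0.3]]
```

In a closed-loop setup like the VIE's, the decoded intent would then drive the virtual arm, and the subject's visual feedback closes the loop.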
Building a real-time control prototype
The VIE's signal analysis and controls modules form the heart of the control system that will be used in the final arm prosthesis. The software for these modules was developed at APL. The individual algorithms were written with the Embedded MATLAB™ subset and then integrated as function blocks into a Simulink model of the system. To build a real-time prototype of the control system, the team used Real-Time Workshop® to generate program code for the entire system of Simulink and Embedded MATLAB components and tested that code on an xPC Target™ real-time system.
This approach offered many advantages. With Model-Based Design and Simulink, the entire system could be modeled and then optimized and verified in simulation. A prototype could be built and tested through rapid prototyping long before a specific hardware platform was chosen. With Real-Time Workshop Embedded Coder™, target-specific code for the selected processor was finally generated from the Simulink system model. Because that model had already been tested and verified for safety in simulation, no manual programming that could introduce errors or unintended behavior was required. The developers could therefore trust that their Modular Prosthetic Limb would behave exactly as intended.
Physical modeling and visualization
To run closed-loop simulations of the control system, APL developed a plant model that reproduces the inertial properties of the arm. The starting point was CAD assemblies of the arm components developed by partners in SolidWorks®, from which a SimMechanics™ model of the arm was generated automatically and linked to the control system in Simulink. This plant model was then connected to a Java™ 3D graphics engine developed at the University of Southern California, which animates the arm's movement in a simulated environment.
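The essence of such a closed-loop setup can be sketched in a few lines: a rigid one-link "arm" plant driven by a PD controller, integrated with explicit Euler. This stands in for the SimMechanics plant and Simulink controller described above; all parameters and gains are invented for illustration.

```python
import math

m, L, g, b = 1.0, 0.3, 9.81, 0.05   # mass [kg], link length [m], gravity, damping
I = m * L * L                        # point-mass inertia about the joint
kp, kd = 25.0, 3.0                   # hypothetical PD gains
dt, t_end = 0.001, 2.0
target = math.radians(45)            # commanded joint angle

theta, omega = 0.0, 0.0
for _ in range(int(t_end / dt)):
    torque = kp * (target - theta) - kd * omega            # controller
    # Plant: I * theta'' = torque - gravity load - viscous damping
    alpha = (torque - m * g * L * math.sin(theta) - b * omega) / I
    omega += alpha * dt
    theta += omega * dt

print(round(math.degrees(theta), 1))   # settles a few degrees below 45°
```

The joint settles slightly short of the 45° command because a pure PD controller leaves a steady-state error against the gravity load; this is exactly the kind of behavior a closed-loop plant simulation exposes before any hardware exists.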
Building on this powerful virtual system, APL was also able to create an intuitive environment for clinical work, through which the system can be configured and trained. Clinical researchers use a graphical user interface built in MATLAB to configure VIE parameters and conduct test sessions with volunteer subjects (Fig. 3). The clinicians interact with the application on a host PC, which in turn communicates with the xPC Target system running the real-time control software. A third computer handles 3D rendering and display of the virtual arm. In tests with intact limbs, control signals can be correlated and visualized while the subject moves.
With Model-Based Design, the Revolutionizing Prosthetics team completed the Proto 1 and Proto 2 versions, as well as the first version of the VIE, well ahead of schedule. The team is currently working on the detailed design of the Modular Prosthetic Limb, the version that will ultimately be delivered to DARPA.
Many partner institutions use the VIE as a test environment to further improve their systems, and it is expected to remain the platform for future developments in prosthetics and neuroscience for years to come. The APL team has established a development workflow in which systems can be built quickly from reusable models and implemented on prototype hardware, benefiting not only the Revolutionizing Prosthetics project but also other associated programs.
Released for publication without restriction.
Developing a lifelike system on a strict timeline
Developing a mechatronic system that precisely mimics natural movement in just four years (as specified by DARPA), and making it available for clinical testing, requires radical innovation in neural control and sensory stimulation, state-of-the-art actuators and mechanics, and a completely new prosthesis design. The best arm prostheses currently available typically offer only three active degrees of freedom: elbow flexion/extension, wrist rotation, and opening and closing of the hand. Proto 1, APL's first prototype, adds five more degrees of freedom: two at the shoulder (flexion/extension and internal/external rotation), wrist flexion/extension, and additional grips for the hand. Yet to recreate full natural mobility, even the advances made with Proto 1 were just the beginning. Proto 2, essentially an electromechanical proof of concept, had more than 22 degrees of freedom, including additional lateral movements at the shoulder (abduction/adduction) and wrist (ulnar/radial deviation) and independent finger movement. The hand can also perform several complex grasps.
The Modular Prosthetic Limb, the version intended for DARPA, will have 27 degrees of freedom and the ability to sense temperature, touch, pressure, and vibration.
Released 2009 - 91782v00