# AI Integration Enhances Prosthetic Hand Control

Researchers have developed a prosthetic hand that uses artificial intelligence (AI) to give users more natural control. The system is designed to recognize user intent and then share control of the motions required to complete a task.

### Methodology and Outcomes

The approach combines AI with specialized sensors. Marshall Trout, a researcher at the University of Utah and lead author of the study, reported that four individuals missing a hand were able to simulate drinking from a cup. When the sensors and AI provided assistance, participants grasped the cup reliably. Without this shared control, participants reportedly either crushed or dropped the cup every time. The findings were published in the journal Nature Communications.

John Downey, an assistant professor at the University of Chicago who was not involved in the research, noted that exerting controlled grasp force remains a significant challenge in current prosthetics. Problems with control are a major factor leading many amputees to stop using bionic hands.

### Addressing Control Complexity

Modern bionic hands incorporate motors for movement and detect electrical signals from the user's muscles. As their capabilities have grown, however, so has the control burden on users. According to Trout, users typically must focus closely on their actions, in contrast to the largely subconscious nature of natural hand movements.

Natural hands perform routine tasks, such as reaching for an object, with minimal cognitive effort. Specialized neural circuits in the brain and spinal cord automate many motions, with conscious intervention reserved for unexpected events. The research team aimed to replicate this automaticity in a prosthetic.

### AI-Driven Shared Control

The team leveraged AI to take over some of these subconscious functions.
The AI system was trained to detect subtle muscle twitches that indicate an intention to grasp, triggering the machine controller to initiate the grasping action.

To support this, the bionic hand was fitted with proximity and pressure sensors. Proximity sensors gauge the distance to an object and assess its shape, while pressure sensors on the fingertips give the user feedback on grip firmness.

Jacob George, a professor at the University of Utah and director of the Utah NeuroRobotics Lab, said that while robotic hands can outperform human users in some tasks, users generally prefer devices that feel integrated and under their control. The shared control mechanism addresses this by combining machine assistance with human input.

Downey emphasized that human motor control relies on subconscious reflexes, suggesting that robotic imitations of these reflex loops are crucial. George added that integrating machine and human control is a step toward prosthetic limbs that feel like an extension of the user's own body.

### Future Outlook

The development aims to foster a sense of embodiment, making the robotic hand part of the user's lived experience rather than merely a tool. Current bionic hands still require conscious input for a broad range of tasks, such as switching between delicate and firm grips, but greater versatility is anticipated. Researchers expect that user control will remain central to future prosthetic development.
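To make the shared-control idea described above concrete, here is a minimal sketch of one control step. This is purely illustrative, not the published system: the function name, the scalar muscle-activation signal, and all thresholds (`TWITCH_THRESHOLD`, `NEAR_OBJECT_CM`, `TARGET_PRESSURE_N`) are assumptions chosen for the example. It shows the general pattern the article describes: a muscle twitch signals intent, an automated component closes the hand as the object gets closer, the user's own signal stays in the blend, and fingertip pressure caps the grip so the object is neither crushed nor dropped.

```python
# Hypothetical shared-control grasp step (illustrative only; not the
# system from the study). Assumed inputs: muscle activation in [0, 1],
# proximity in centimeters, fingertip pressure in newtons.

TWITCH_THRESHOLD = 0.2   # activation above this is read as intent to grasp
NEAR_OBJECT_CM = 5.0     # proximity at which automated closing ramps up
TARGET_PRESSURE_N = 2.0  # grip force the controller avoids exceeding

def shared_control_step(muscle_activation: float,
                        proximity_cm: float,
                        fingertip_pressure_n: float) -> float:
    """Return a grip command in [0, 1] (0 = open, 1 = fully closed)."""
    # No detected intent: keep the hand open.
    if muscle_activation <= TWITCH_THRESHOLD:
        return 0.0

    # Machine component: close more as the object gets closer.
    machine_cmd = max(0.0, min(1.0, 1.0 - proximity_cm / NEAR_OBJECT_CM))

    # Blend machine assistance with the user's direct command,
    # so the user retains a sense of control.
    user_cmd = min(1.0, muscle_activation)
    grip = 0.5 * machine_cmd + 0.5 * user_cmd

    # Safety cap: once fingertip pressure hits the target,
    # hold rather than squeeze harder.
    if fingertip_pressure_n >= TARGET_PRESSURE_N:
        grip = min(grip, 0.5)

    return grip
```

The equal 50/50 blend is an arbitrary choice for the sketch; a real controller would tune that balance, and the study's appeal lies precisely in how that machine/human split is negotiated.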