A robot can be useful for performing repetitive tasks with great efficiency and precision, but how can robots be designed with more of the versatility and adaptability of the human body? Michelle Esponda ’17 (MS) explored this problem in her master’s thesis in biomedical engineering at the University of Rochester.
For her thesis, she developed a prototype of an adaptable 3D-printed prosthetic hand that uses a wrist-mounted camera and forearm-mounted surface electromyography (EMG) sensors to control a five-fingered hand in a manner that mimics the grasping motions of a human hand. A neural network, trained from a set of examples, predicts the best grasp to perform from the camera image. The EMG sensors control the opening and closing of the fingers to perform the predicted grasp.
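The thesis itself does not publish its code, but the vision step described above — mapping camera-image features to one of several grasp types — can be sketched as a softmax classifier. Everything here (the grasp labels, the feature dimension, the function names) is illustrative, not taken from the actual system.

```python
import numpy as np

# Hypothetical grasp vocabulary; the real set of grasps is not specified here.
GRASPS = ["power", "pinch", "tripod", "lateral", "open"]

def predict_grasp(features: np.ndarray, weights: np.ndarray) -> str:
    """Return the grasp whose softmax score is highest for the image features."""
    scores = features @ weights               # one score per grasp class
    exp = np.exp(scores - scores.max())       # numerically stable softmax
    probs = exp / exp.sum()
    return GRASPS[int(np.argmax(probs))]

rng = np.random.default_rng(0)
weights = rng.normal(size=(8, len(GRASPS)))   # stand-in for trained weights
features = rng.normal(size=8)                 # stand-in for camera features
print(predict_grasp(features, weights))
```

In the actual device the features would come from a trained network operating on the wrist camera's image rather than a random vector; the sketch only shows the classify-then-select control flow.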
The novel aspect of her work is the development of multiple modes of interaction for acquiring new examples to retrain the neural network while wearing the device. In addition to a forearm-mounted touchscreen interface, the adaptable prosthetic hand can use voice commands to initiate desired behaviors and acquire data. The voice command module she developed is an adaptation of the distributed correspondence graph model: it converts spoken text to one of several grasps and pairs that behavior with the current camera image.
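A distributed correspondence graph performs probabilistic grounding of language to behaviors; as a much-simplified stand-in, the pairing step can be sketched as keyword matching from command text to a grasp label, bundled with the current image as a new training example. All names and keywords below are hypothetical.

```python
from typing import Optional

# Illustrative keyword table; a real DCG infers the grounding probabilistically.
GRASP_KEYWORDS = {
    "power": ["grab", "hold"],
    "pinch": ["pinch", "pick up"],
    "open": ["open", "release"],
}

def command_to_grasp(text: str) -> Optional[str]:
    """Return the grasp whose keywords appear in the spoken command, if any."""
    text = text.lower()
    for grasp, words in GRASP_KEYWORDS.items():
        if any(w in text for w in words):
            return grasp
    return None

def make_example(text: str, camera_image):
    """Pair the grasp implied by the command with the current camera image."""
    grasp = command_to_grasp(text)
    return (camera_image, grasp) if grasp is not None else None
```

The returned (image, grasp) pair is exactly the kind of example that can later be fed back into retraining.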
The adaptable prosthetic hand learns to initiate the grasp behaviors expressed in this training data by retraining the neural network offline with the acquired examples. In the future, the design could be improved to add more dexterity, reduce size and cost, and use new vision-based classification algorithms and more diverse language to enable more complex behaviors. This work was supported in part by the New York State Center of Excellence in Data Science.
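The offline retraining step could, under the simplifying assumption of a single softmax layer over image features, look like the sketch below: acquired examples are accumulated and the weights are refit by gradient descent on the cross-entropy loss. Shapes, learning rate, and epoch count are all illustrative.

```python
import numpy as np

def retrain(weights, X, y, n_classes, lr=0.1, epochs=100):
    """Refit a one-layer softmax classifier on the full example set."""
    for _ in range(epochs):
        scores = X @ weights
        exp = np.exp(scores - scores.max(axis=1, keepdims=True))
        probs = exp / exp.sum(axis=1, keepdims=True)
        onehot = np.eye(n_classes)[y]
        grad = X.T @ (probs - onehot) / len(X)   # cross-entropy gradient
        weights = weights - lr * grad
    return weights

rng = np.random.default_rng(1)
X = rng.normal(size=(20, 8))                     # stored image features
y = rng.integers(0, 5, size=20)                  # grasp labels 0..4
weights = retrain(np.zeros((8, 5)), X, y, 5)
```

The real system retrains a full neural network rather than a single linear layer, but the accumulate-then-refit pattern is the same.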