Cornell researchers have developed a robotic feeding system that uses computer vision, machine learning, and multimodal sensing to feed people with severe mobility limitations caused by conditions such as spinal cord injuries, cerebral palsy, and multiple sclerosis. Feeding these individuals is challenging because many cannot lean forward and need food placed directly into their mouths. The project, led by Tapomayukh “Tapo” Bhattacharjee, assistant professor of computer science, aims to improve the quality of life of care recipients with complex medical conditions.

A paper on the system, titled “Feel the Bite: Robot-Assisted Inside-Mouth Bite Transfer using Robust Mouth Perception and Physical Interaction-Aware Control,” was presented at the Human-Robot Interaction (HRI) conference in March, where it received a Best Paper Honorable Mention. The team also won a Best Demo Award for showcasing the broader feeding system. Bhattacharjee, a leader in assistive robotics, and his EmPRISE Lab have spent years teaching machines the intricate process of feeding humans: identifying food items, picking them up, and transferring them into a care recipient’s mouth.

Those challenges include limited mouth openings, muscle spasms, and the need to place food at precise biting locations inside the mouth. To handle them, the researchers’ system features real-time mouth tracking and a dynamic response mechanism that adjusts to users’ movements and physical interactions, allowing it to distinguish among involuntary spasms, intentional bites, and users’ attempts to manipulate the utensil inside their mouths. In a user study conducted at several sites, including the EmPRISE Lab, a medical center in New York City, and a care recipient’s home in Connecticut, the system successfully fed 13 individuals with diverse medical conditions.
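
To make that distinction concrete, here is a minimal sketch of how such a classifier might look if it relied only on force readings from a sensor behind the utensil. Every name, threshold, and window size below is invented for illustration; the paper’s actual perception and control stack is not reproduced here.

```python
import numpy as np

def classify_interaction(forces: np.ndarray, dt: float = 0.01) -> str:
    """Label a short window of utensil force magnitudes (newtons).

    forces: shape (T,), readings from a hypothetical wrist-mounted
    force-torque sensor. Returns "none", "spasm", "bite", or
    "manipulation". All thresholds are illustrative, not from the paper.
    """
    peak = float(forces.max())
    rate = float(np.abs(np.diff(forces)).max() / dt)  # fastest force change, N/s

    if peak < 0.5:
        return "none"            # below the contact threshold: no touch
    if rate > 200.0:
        return "spasm"           # abrupt spike suggests an involuntary spasm
    if peak > 2.0 and float(forces[-10:].mean()) > 1.5:
        return "bite"            # sustained, steady pressure on the utensil
    return "manipulation"        # moderate, slowly varying contact

# Demo on synthetic 1-second windows sampled at 100 Hz.
t = np.linspace(0.0, 1.0, 100)
spasm_window = np.where(t > 0.5, 6.0, 0.1)   # sudden jump in force
bite_window = np.clip(3.0 * t, 0.0, 2.5)     # gradual, then sustained press
print(classify_interaction(spasm_window))    # -> "spasm"
print(classify_interaction(bite_window))     # -> "bite"
```

The underlying intuition is that the three interactions have different force signatures: spasms are abrupt spikes, bites are sustained steady pressure, and in-mouth manipulation produces moderate, slowly varying contact; the real system additionally fuses visual cues rather than relying on force alone.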

The robot, a multi-jointed arm fitted with a custom-built utensil, uses mouth tracking trained on thousands of images to locate users’ mouths accurately, while its physical interaction-aware response mechanism combines visual and force sensing to perceive how users are interacting with it. Researchers, including Rajat Kumar Jenamani, were thrilled with the user study results, noting the robot’s strong emotional impact on care recipients and their caregivers: one session, in which the robot successfully fed a care recipient with schizencephaly quadriplegia, a rare birth defect, left her parents in tears of joy. While further research is needed on the system’s long-term usability, the results suggest it could improve care recipients’ independence and quality of life.
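
One plausible way to structure that fusion is to let force-based interaction events take priority over vision, since the robot must react to physical contact even if the mouth tracker momentarily drops out. The sketch below is a hypothetical decision layer reusing the interaction labels from the earlier sketch; the Action names and choose_action function are stand-ins, not the authors’ API.

```python
from enum import Enum

class Action(Enum):
    SERVO = "servo_toward_mouth"     # approach the tracked mouth
    HOLD = "hold_position"           # freeze in place
    RETRACT = "retract"              # back the utensil away for safety
    COMPLY = "comply_with_bite"      # soften so the user can take the bite
    FOLLOW = "follow_user_forces"    # admittance mode: the user steers

def choose_action(mouth_visible: bool, interaction: str) -> Action:
    """Fuse visual tracking and a force-based interaction label into one action.

    Force events outrank vision: contact demands a response even when
    the mouth tracker has lost the user.
    """
    if interaction == "spasm":
        return Action.RETRACT
    if interaction == "bite":
        return Action.COMPLY
    if interaction == "manipulation":
        return Action.FOLLOW
    if not mouth_visible:
        return Action.HOLD           # no contact and no tracking: stay put
    return Action.SERVO              # clear view, no contact: keep approaching

# Example: the user starts steering the utensil while tracking is good.
print(choose_action(mouth_visible=True, interaction="manipulation"))
# Action.FOLLOW
```

Ordering the checks this way encodes a safety hierarchy: retracting on a spasm beats everything else, and visual servoing toward the mouth happens only when the utensil is not being touched.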

Co-authors of the paper include Daniel Stabile, Ziang Liu, Abrar Anwar, and Katherine Dimitropoulou, from several collaborating universities. The research was funded primarily by the National Science Foundation. For Bhattacharjee and his team, the system’s ability to feed care recipients safely and comfortably demonstrates the potential of assistive robotics to offer greater independence and emotional fulfillment to people with severe mobility limitations and complex medical conditions.
