Robotic Feeding System Developed by Cornell Researchers to Aid Individuals with Mobility Challenges

Los Angeles CA (SPX) May 10, 2024 - Researchers at Cornell University have unveiled a robotic feeding system designed to aid individuals with severe mobility limitations, such as those affected by spinal cord injuries, cerebral palsy, and multiple sclerosis. The system leverages advanced technologies including computer vision, machine learning, and multimodal sensing to deliver food safely and effectively.

"Feeding individuals with severe mobility limitations with a robot is difficult, as many cannot lean forward and require food to be placed directly inside their mouths," said Tapomayukh "Tapo" Bhattacharjee, assistant professor of computer science in the Cornell Ann S. Bowers College of Computing and Information Science and senior developer behind the system. "The challenge intensifies when feeding individuals with additional complex medical conditions."

The technology was highlighted in a paper titled "Feel the Bite: Robot-Assisted Inside-Mouth Bite Transfer using Robust Mouth Perception and Physical Interaction-Aware Control," presented at the Human-Robot Interaction (HRI) conference. The paper received a Best Paper Honorable Mention, and a demo of the broader system won a Best Demo Award.

Bhattacharjee's EmPRISE Lab has spent years mastering the nuanced process of robotic feeding. "These last 5 centimeters, from the utensil to inside the mouth, are extremely challenging," Bhattacharjee commented.

The system is equipped with real-time mouth tracking and a dynamic response mechanism that adjusts to spontaneous user movements and interactions. "Current technology only looks at a person's face once and assumes they will remain still, which is often not the case and can be very limiting for care recipients," said Rajat Kumar Jenamani, the paper's lead author and a doctoral student.

The robotic feeder has been tested across three locations and received positive feedback for its safety and comfort. "This is one of the most extensive real-world evaluations of any autonomous robot-assisted feeding system with end-users," Bhattacharjee noted.

A dual-camera setup and a sensitive utensil tip facilitate accurate mouth tracking and interaction detection, enabling precise control even with minimal user movements, such as tongue manipulation. "We're empowering individuals to control a 20-pound robot with just their tongue," Jenamani explained.

The emotional impact of the system was profound, especially during a session where the parents of a child with severe disabilities saw her feed herself for the first time. "It was a moment of real emotion; her father raised his cap in celebration, and her mother was almost in tears," Jenamani recalled.

While further research is needed to enhance long-term usability, the system's potential to improve independence and quality of life for individuals with mobility challenges remains significant. "It's amazing," Bhattacharjee said, "and very, very fulfilling."

Research Report: Feel the Bite: Robot-Assisted Inside-Mouth Bite Transfer using Robust Mouth Perception and Physical Interaction-Aware Control