Integrating YOLO and ORB-SLAM via an Intel RealSense Camera on a Mobile Robot to Detect Indoor Objects for Blind and Low-Vision People

Supervisors: Yong Joon Thoo, Julien Nembrini, Denis Lalanne

Student: Benjamin Del Don

Project status: Ongoing

Year: 2024-2025

A mobile robot equipped with a 3D camera can map indoor environments in 3D and recognize objects, producing a 3D model annotated with object categories such as chairs, doors, and tables. Because the robot is autonomous, the mapping procedure can be repeated automatically to keep up with dynamic indoor environments, which are typically challenging for blind or low-vision people. The resulting map and object information can then be used in an AR headset to overlay cues that are meaningful to low-vision people.
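As a rough illustration of how a 2D object detection could be lifted into the 3D map, the sketch below back-projects the center pixel of a detected bounding box into 3D camera coordinates using a pinhole model. The focal lengths, principal point, detection pixel, and depth value are all hypothetical placeholders; in an actual RealSense pipeline the intrinsics come from the camera calibration (librealsense exposes this as `rs2_deproject_pixel_to_point`), and the detection would come from YOLO.

```python
# Sketch: turn a 2D detection plus a depth reading into a 3D point.
# All numeric values below are hypothetical placeholders, not project data.

def deproject(u, v, depth_m, fx, fy, cx, cy):
    """Pinhole back-projection of pixel (u, v) at depth depth_m (meters)
    into camera coordinates (x right, y down, z forward)."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# Placeholder intrinsics (in a real system: from RealSense calibration).
fx = fy = 600.0          # focal lengths in pixels
cx, cy = 320.0, 240.0    # principal point

# Hypothetical YOLO detection: a "chair" whose bounding-box center is at
# pixel (420, 260), with a median depth of 1.5 m inside the box.
chair_xyz = deproject(420, 260, 1.5, fx, fy, cx, cy)
print(chair_xyz)  # -> (0.25, 0.05, 1.5), position in meters in the camera frame
```

The resulting point is in the camera frame; anchoring it in the global 3D model additionally requires the camera pose estimated by the SLAM system.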
