This project integrates digital tactile paving with a smart guiding cane and MR (Mixed Reality) smart glasses to create a more accurate and user-friendly navigation aid. It addresses the limitations of physical tactile paving, such as damage, obstruction, and interference with wheelchair accessibility, and supports a more inclusive approach to public space design. The motivation for this project stems from the challenges faced by individuals with low vision or blindness, who often experience disorientation and fall risks in low-contrast or unfamiliar environments. A more reliable assistive system is needed to enhance mobility independence and ensure public safety.
To solve these issues, the project adopts a digital twin approach. Using cameras and depth sensors, the system scans the environment in real time and generates a virtual navigation pathway. The smart guiding cane delivers directional cues through distinct vibration patterns, helping users maintain heading and orientation along the route. Meanwhile, the MR smart glasses add visual guidance and obstacle alerts, providing a multisensory navigation experience.
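As an illustration of how the cane's directional cues could be driven by the virtual pathway, the sketch below maps the deviation between the user's heading and the path heading to a vibration pattern. This is a minimal sketch only; the function names, thresholds, and vibration patterns are hypothetical assumptions for illustration and are not taken from the project.

```python
# Minimal sketch: turning a heading deviation into a cane vibration cue.
# All names, thresholds, and patterns below are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class VibrationCue:
    pattern: str   # e.g. "none", "short-short", "continuous"
    side: str      # "left", "right", or "both"


def heading_deviation_deg(user_heading: float, path_heading: float) -> float:
    """Signed smallest-angle difference in degrees; positive means the path lies to the user's right."""
    return (path_heading - user_heading + 180.0) % 360.0 - 180.0


def select_cue(deviation: float, obstacle_ahead: bool) -> VibrationCue:
    """Choose a vibration pattern from the heading deviation and an obstacle flag."""
    if obstacle_ahead:
        return VibrationCue(pattern="continuous", side="both")    # caution / stop
    if abs(deviation) < 10.0:                                     # within on-path tolerance
        return VibrationCue(pattern="none", side="both")
    if deviation > 0:
        return VibrationCue(pattern="short-short", side="right")  # veer right
    return VibrationCue(pattern="short-short", side="left")       # veer left


if __name__ == "__main__":
    # Example: the virtual pathway bears 15 degrees to the user's right, no obstacle detected.
    dev = heading_deviation_deg(user_heading=90.0, path_heading=105.0)
    print(select_cue(dev, obstacle_ahead=False))
```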
Subsequent development will include mechanical and field testing, user trials, and public demonstrations, along with continued R&D on a second-generation system. The goal is to build a sustainable, scalable, and continuously improving assistive solution.
Link to Awarded Website: https://ustp.ntpu.edu.tw/info/6/method/176