Prototyping a user-testable product in 14 days.

This project presents SafePath, an assistive-technology system that combines computer vision, haptic feedback, and AI-generated voice descriptions to provide navigation assistance for visually impaired individuals. The system integrates multiple sensing technologies with intelligent processing to create an affordable and effective mobility aid.

[Photo: SafePath demonstration and user testing]
[Photo: SafePath smart glasses prototype at the Idea Fair]

System Architecture

Existing navigation aids for visually impaired users are often expensive or hard to obtain, leaving a significant gap in mobility assistance technology. SafePath addresses this gap with a multi-modal architecture that combines visual, tactile, and auditory feedback mechanisms in a single affordable device.

Technical Implementation

The SafePath system integrates multiple advanced technologies:

  • Computer vision processing with real-time environment scanning and object detection
  • Multi-modal haptic feedback using precision vibration alerts for spatial awareness
  • AI-powered natural language processing for contextual voice descriptions of surroundings
  • Open-source architecture enabling community development and continuous improvement

[Photo: Engaging with the visually impaired community for user feedback]
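The feedback pipeline described above can be sketched in a few lines of Python. This is a minimal, hypothetical illustration of the idea, not the actual SafePath firmware: the `Detection` structure, the distance-to-vibration mapping, and all thresholds are assumptions chosen for clarity.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """One object reported by the vision stage (hypothetical format)."""
    label: str          # object class, e.g. "bench"
    distance_m: float   # estimated distance to the object in meters
    bearing_deg: float  # angle relative to the wearer: negative = left

def haptic_intensity(distance_m: float, max_range_m: float = 4.0) -> float:
    """Map distance to vibration strength: closer obstacle, stronger pulse."""
    if distance_m >= max_range_m:
        return 0.0
    return round(1.0 - distance_m / max_range_m, 2)

def describe(detections: list[Detection]) -> str:
    """Compose a short spoken description of the nearest obstacle."""
    if not detections:
        return "Path is clear."
    nearest = min(detections, key=lambda d: d.distance_m)
    if abs(nearest.bearing_deg) < 15:
        side = "ahead"
    else:
        side = "to your left" if nearest.bearing_deg < 0 else "to your right"
    return f"{nearest.label} {side}, about {nearest.distance_m:.0f} meters."
```

For example, a bench detected 2 meters away and 30 degrees to the left would produce a vibration intensity of 0.5 and the spoken line "bench to your left, about 2 meters."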

Research & Validation

To ensure our solution addressed real needs, we conducted more than 30 in-depth interviews with members of the visually impaired community. This user-centered approach validated our design decisions and helped us build a product that is genuinely useful.

Team

This project was developed in collaboration with an amazing team: Anand Erdenebat, Fabian Schmid, Pedro, and Jinze Jiang, with invaluable support from the Think.Make.Start. coaches and fellow students.