A Multi-Camera Wearable Assistive System for Environmental Awareness in Visually Impaired Users Using Mobile Vision and Real-Time Feedback
- Status: Published
- Server: Preprints.org
- DOI: 10.20944/preprints202512.2022.v1
Traditional mobility aids for visually impaired individuals offer limited detection of obstacles at multiple heights and in dynamic environments. This paper presents a low-cost assistive system that leverages embedded cameras, mobile artificial intelligence (AI) models, and wearable mounting strategies to support safer navigation. The system integrates multiple camera options, including smartphone-mounted and shoe-mounted modules, to detect obstacles above, at, and below the user's waistline. A detailed algorithmic pipeline is proposed, along with potential datasets, evaluation metrics, and integration methods for audio and haptic guidance. The system addresses critical gaps in current assistive technology by using edge computing for privacy preservation, lightweight neural networks for real-time performance, and modular hardware design for affordability and accessibility.
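The abstract does not specify implementation details, but one stage of the described pipeline, assigning detections to the above-, at-, and below-waist zones and mapping them to audio or haptic cues, can be illustrated with a minimal Python sketch. All names here (`Zone`, `Detection`, the `WAIST_UPPER`/`WAIST_LOWER` thresholds) and the stand-in detections are hypothetical assumptions, not the authors' code; a real system would feed this logic from a lightweight on-device detector.

```python
from dataclasses import dataclass
from enum import Enum


class Zone(Enum):
    ABOVE_WAIST = "above"
    AT_WAIST = "at"
    BELOW_WAIST = "below"


@dataclass
class Detection:
    label: str         # object class reported by the on-device detector
    confidence: float  # detector confidence score in [0, 1]
    y_center: float    # normalized vertical center of the bounding box (0 = top of frame)


# Hypothetical frame-space thresholds separating the three obstacle zones;
# in practice these would be calibrated per camera mount (smartphone vs. shoe).
WAIST_UPPER = 0.40
WAIST_LOWER = 0.65


def classify_zone(det: Detection) -> Zone:
    """Assign a detection to the above/at/below-waist zone from its vertical position."""
    if det.y_center < WAIST_UPPER:
        return Zone.ABOVE_WAIST
    if det.y_center <= WAIST_LOWER:
        return Zone.AT_WAIST
    return Zone.BELOW_WAIST


def feedback_cue(det: Detection, zone: Zone) -> str:
    """Map a detection and its zone to a short audio/haptic cue description."""
    urgency = "high" if det.confidence > 0.8 else "normal"
    return f"{urgency}-priority {zone.value}-waist alert: {det.label}"


if __name__ == "__main__":
    # Stand-in detections; a deployed system would obtain these per frame from
    # a lightweight detector (e.g. a MobileNet-SSD or YOLO variant) running on-device.
    frame_detections = [
        Detection("tree branch", 0.91, 0.22),
        Detection("chair", 0.77, 0.55),
        Detection("curb", 0.85, 0.88),
    ]
    for det in frame_detections:
        print(feedback_cue(det, classify_zone(det)))
```

Keeping this zone-and-cue mapping separate from the detector is one way to let the same feedback logic serve both the smartphone-mounted and shoe-mounted camera modules, with only the calibration thresholds changing per mount.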