Asynchronous Particle Filter with Pedestrian Graph Integration for Visually Impaired Navigation


Wang M., Danis F. S., Renaudin V., Servières M.

14th International Conference on Indoor Positioning and Indoor Navigation, IPIN 2024, Kowloon, Hong Kong, 14 - 17 October 2024

  • Publication Type: Conference Paper / Full-Text Paper
  • DOI Number: 10.1109/ipin62893.2024.10786110
  • City of Publication: Kowloon
  • Country of Publication: Hong Kong
  • Keywords: map matching, particle filter, pedestrian navigation, visually impaired people
  • Affiliated with Galatasaray University: No

Abstract

Visually impaired people face particular challenges when it comes to positioning and navigation. This paper presents an innovative assistive technology designed to improve the independent mobility of visually impaired people by integrating multiple navigation inputs into a cohesive real-time positioning system. At the heart of our approach is the optimized use of an asynchronous particle filter - a probabilistic, recursive algorithm tailored to pedestrian navigation using Pedestrian Dead-Reckoning (PDR) and Map Matching. Unlike traditional mesh graphs, we employ pathway graphs specifically designed for visually impaired people, incorporating landmarks as critical nodes to enhance accuracy and flexibility. This design adapts to real pedestrian movement and improves the practicality of the system on walkable paths by allowing deviations from theoretical paths. Our system integrates graph edge observations, GNSS data, and landmarks to achieve high localization accuracy and robustness. We have evaluated three advanced navigation models that show significant improvement in stride trajectory estimation accuracy. Model 1, using GNSS, reduced the mean and median Euclidean distance to ground truth to 2.56 meters and 2.26 meters, respectively. Model 2, which incorporates simulated landmarks, achieved a mean distance of 2.17 meters and a median of 1.97 meters, while Model 3, which uses a graph edge-based approach, achieved the best results with a mean of 1.70 meters and a median of 1.47 meters. These results underline the improved accuracy of our navigation systems for visually impaired users, especially in complex environments.
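The core loop the abstract describes - PDR-based propagation followed by a map-matching update against walkable graph edges - can be sketched as a minimal particle filter. The sketch below is an illustrative assumption, not the paper's implementation: the function names (`pf_step`, `point_to_segment_dist`), the noise parameters, and the Gaussian edge-distance likelihood are all hypothetical choices made for the example.

```python
import math
import random

def point_to_segment_dist(p, a, b):
    """Euclidean distance from point p to the segment (graph edge) a-b."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    length_sq = dx * dx + dy * dy
    if length_sq == 0.0:  # degenerate edge: a single node
        return math.hypot(px - ax, py - ay)
    # Projection parameter clamped to the segment
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / length_sq))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def pf_step(particles, stride, heading, edges,
            sigma_stride=0.1, sigma_head=0.05, sigma_map=1.5):
    """One filter iteration: PDR propagation, graph-edge weighting, resampling.

    particles: list of (x, y, weight); stride in meters, heading in radians.
    edges: list of ((x1, y1), (x2, y2)) walkable pathway-graph segments.
    """
    n = len(particles)
    # 1. Propagate each particle with a noisy PDR stride (process model).
    moved = []
    for x, y, w in particles:
        s = stride + random.gauss(0.0, sigma_stride)
        h = heading + random.gauss(0.0, sigma_head)
        moved.append((x + s * math.cos(h), y + s * math.sin(h), w))
    # 2. Weight by distance to the nearest graph edge (Gaussian likelihood),
    #    so particles that drift off walkable paths lose influence.
    weighted = []
    for x, y, w in moved:
        d = min(point_to_segment_dist((x, y), a, b) for a, b in edges)
        weighted.append((x, y, w * math.exp(-0.5 * (d / sigma_map) ** 2)))
    total = sum(w for _, _, w in weighted) or 1e-12
    # 3. Resample proportionally to weight; reset weights to uniform.
    points = [(x, y) for x, y, _ in weighted]
    ws = [w / total for _, _, w in weighted]
    resampled = random.choices(points, weights=ws, k=n)
    return [(x, y, 1.0 / n) for x, y in resampled]
```

With a single corridor edge along the x-axis and repeated forward strides, the particle cloud advances along the edge while the map-matching weight keeps it close to the walkable path. Fusing GNSS fixes or landmark detections (Models 1 and 2 in the abstract) would amount to multiplying additional likelihood terms into step 2.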