Towards walkable footpath detection for the visually impaired on Bangladeshi roads with smartphones using deep edge intelligence

Bibliographic Details
Main Authors: Md. Ishan Arefin Hossain, Jareen Anjom, Rashik Iram Chowdhury
Format: Article
Language:English
Published: Elsevier 2025-07-01
Series:Array
Subjects:
Online Access:http://www.sciencedirect.com/science/article/pii/S2590005625000153
Description
Summary:One of the most prevalent ongoing issues is the challenge faced by visually impaired people when navigating footpaths, especially in a densely populated city such as Dhaka, Bangladesh, where numerous accidents occur, often resulting in the death of the affected individuals. Visually impaired people find themselves in precarious situations while navigating these footpaths, so an accessible edge device such as a smartphone capable of predicting walkable footpaths by detecting obstacles in real time would be invaluable. However, little work has been done on efficient real-time obstacle detection on footpaths and the corresponding distance prediction. To address this issue, this research proposes a U-Net-based lightweight deep learning model called QPULM, along with an obstacle distance measurement technique called SODD. Both are utilized in an Android application that detects walkable footpaths by avoiding obstacles in the captured image and broadcasts the directions of the walkable paths using audio feedback. The proposed lightweight model at the edge achieved an accuracy of 99.37% with real-time prediction times in milliseconds, significantly outperforming existing related solutions in efficiency.
ISSN:2590-0056