Vision-based approach to knee osteoarthritis and Parkinson’s disease detection utilizing human gait patterns

Bibliographic Details
Main Authors: Zeeshan Ali, Jihoon Moon, Saira Gillani, Sitara Afzal, Muazzam Maqsood, Seungmin Rho
Format: Article
Language:English
Published: PeerJ Inc. 2025-05-01
Series:PeerJ Computer Science
Online Access:https://peerj.com/articles/cs-2857.pdf
Description
Summary: Recently, the number of cases of musculoskeletal and neurological disorders, such as knee osteoarthritis (KOA) and Parkinson’s disease (PD), has increased significantly. Numerous clinical methods have been proposed to diagnose these disorders; however, a current trend in diagnosis is the analysis of human gait patterns. Several researchers have proposed methods in this area, including gait detection from sensor-based data and vision-based systems encompassing both marker-based and marker-free techniques. The majority of current studies focus on the classification of Parkinson’s disease. Furthermore, many vision-based algorithms rely on human gait silhouettes or gait representations and employ traditional similarity-based methodologies. In this study, by contrast, a novel approach is proposed in which spatiotemporal features are extracted via deep learning methods under a transfer-learning paradigm. Subsequently, advanced deep learning approaches, including sequential models such as the gated recurrent unit (GRU), are used for further analysis. Experiments are performed on the publicly available KOA–PD–normal dataset, comprising gait videos with various abnormalities, and the proposed model achieves the highest accuracy, approximately 94.81%.
ISSN:2376-5992
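The pipeline described in the abstract (per-frame spatiotemporal feature extraction followed by a GRU sequential model) can be sketched as below. This is not the authors' code: the tiny convolutional frame encoder, layer sizes, and three-class head (KOA / PD / normal) are illustrative assumptions; the paper itself uses a pretrained backbone via transfer learning in place of the small encoder shown here.

```python
# Hypothetical sketch of a per-frame-encoder + GRU gait classifier (PyTorch).
# The frame encoder below is a placeholder; the paper uses a pretrained
# (transfer-learning) backbone for spatiotemporal feature extraction.
import torch
import torch.nn as nn

class GaitGRUClassifier(nn.Module):
    def __init__(self, feat_dim=64, hidden_dim=128, num_classes=3):
        super().__init__()
        self.frame_encoder = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, feat_dim),
        )
        self.gru = nn.GRU(feat_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, num_classes)  # e.g. KOA / PD / normal

    def forward(self, clips):
        # clips: (batch, time, channels, height, width)
        b, t = clips.shape[:2]
        feats = self.frame_encoder(clips.flatten(0, 1))  # (b*t, feat_dim)
        seq = feats.view(b, t, -1)                       # frame-feature sequence
        _, h = self.gru(seq)                             # h: (1, b, hidden_dim)
        return self.head(h[-1])                          # logits: (b, num_classes)

# Example: a batch of 2 clips, 8 frames each, 64x64 RGB
logits = GaitGRUClassifier()(torch.randn(2, 8, 3, 64, 64))
print(logits.shape)  # torch.Size([2, 3])
```

The GRU's final hidden state summarizes the whole frame sequence, so the classifier sees temporal gait dynamics rather than isolated frames.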