PIFR: A novel approach for analyzing pose angle-based human activity to automate fall detection in videos.


Bibliographic Details
Main Authors: Vungsovanreach Kong, Saravit Soeng, Munirot Thon, Wan-Sup Cho, Anand Nayyar, Tae-Kyung Kim
Format: Article
Language: English
Published: Public Library of Science (PLoS) 2025-01-01
Series: PLoS ONE
Online Access: https://doi.org/10.1371/journal.pone.0325253
_version_ 1850208818295734272
author Vungsovanreach Kong
Saravit Soeng
Munirot Thon
Wan-Sup Cho
Anand Nayyar
Tae-Kyung Kim
author_facet Vungsovanreach Kong
Saravit Soeng
Munirot Thon
Wan-Sup Cho
Anand Nayyar
Tae-Kyung Kim
author_sort Vungsovanreach Kong
collection DOAJ
description Falls pose a significant health risk for elderly populations, necessitating advanced monitoring technologies. This study introduces a novel two-stage fall detection system that combines computer vision and machine learning to accurately identify fall events. The system uses the YOLOv11 object detection model to track individuals and estimate their body pose by identifying 17 key body points across video frames. The proposed approach extracts nine critical geometric features, including the center of mass and various body angles. These features are used to train a support vector machine (SVM) model for binary classification, distinguishing between standing and lying with high precision. The system's temporal validation method analyzes sequential frame changes, ensuring robust fall detection. Experimental results, evaluated on the University of Rzeszow Fall Detection (URFD) dataset and the Multiple Cameras Fall Dataset (MCFD), demonstrate exceptional performance, achieving 88.8% precision, 94.1% recall, an F1-score of 91.4%, and a specificity of 95.6%. The method outperforms existing approaches by effectively capturing complex geometric changes during fall events. The system is applicable to smart homes, wearable devices, and healthcare monitoring platforms, offering a scalable, reliable, and efficient solution to enhance safety and independence for elderly individuals, thereby contributing to advancements in health-monitoring technology.
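The pipeline described above (17 pose keypoints → geometric features such as the center of mass and body angles → standing/lying classification) can be sketched in plain Python. This is a minimal illustration only: it assumes the standard COCO 17-keypoint ordering, computes just a trunk angle, a joint angle, and a center of mass rather than the paper's exact nine features, and substitutes a simple angle threshold for the trained SVM and temporal validation stage.

```python
import math

# Assumed COCO 17-keypoint indices (the paper's exact ordering is not given)
L_SHOULDER, R_SHOULDER = 5, 6
L_HIP, R_HIP, L_KNEE, R_KNEE, L_ANKLE, R_ANKLE = 11, 12, 13, 14, 15, 16

def midpoint(a, b):
    return ((a[0] + b[0]) / 2.0, (a[1] + b[1]) / 2.0)

def angle_deg(a, b, c):
    """Interior angle at vertex b formed by segments b->a and b->c, in degrees."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    norm = math.hypot(*v1) * math.hypot(*v2)
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def trunk_angle_deg(kpts):
    """Deviation of the shoulder-midpoint -> hip-midpoint axis from vertical.
    ~0 degrees for an upright pose, ~90 degrees for a horizontal (lying) pose."""
    sh = midpoint(kpts[L_SHOULDER], kpts[R_SHOULDER])
    hp = midpoint(kpts[L_HIP], kpts[R_HIP])
    dx, dy = hp[0] - sh[0], hp[1] - sh[1]
    return math.degrees(math.atan2(abs(dx), abs(dy)))

def center_of_mass(kpts):
    """Mean of all keypoint coordinates, used here as a rough center of mass."""
    xs = [p[0] for p in kpts]
    ys = [p[1] for p in kpts]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def is_lying(kpts, thresh_deg=45.0):
    """Threshold stand-in for the paper's SVM classifier (hypothetical cutoff)."""
    return trunk_angle_deg(kpts) > thresh_deg
```

In the full system these per-frame features would feed an SVM, and a fall would be flagged only after the temporal validation step observes a standing-to-lying transition across consecutive frames.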
format Article
id doaj-art-4f61eaef8b024adf8c20187a2a6af2ed
institution OA Journals
issn 1932-6203
language English
publishDate 2025-01-01
publisher Public Library of Science (PLoS)
record_format Article
series PLoS ONE
spelling doaj-art-4f61eaef8b024adf8c20187a2a6af2ed2025-08-20T02:10:09ZengPublic Library of Science (PLoS)PLoS ONE1932-62032025-01-01206e032525310.1371/journal.pone.0325253PIFR: A novel approach for analyzing pose angle-based human activity to automate fall detection in videos.Vungsovanreach KongSaravit SoengMunirot ThonWan-Sup ChoAnand NayyarTae-Kyung KimFalls pose a significant health risk for elderly populations, necessitating advanced monitoring technologies. This study introduces a novel two-stage fall detection system that combines computer vision and machine learning to accurately identify fall events. The system uses the YOLOv11 object detection model to track individuals and estimate their body pose by identifying 17 key body points across video frames. The proposed approach extracts nine critical geometric features, including the center of mass and various body angles. These features are used to train a support vector machine (SVM) model for binary classification, distinguishing between standing and lying with high precision. The system's temporal validation method analyzes sequential frame changes, ensuring robust fall detection. Experimental results, evaluated on the University of Rzeszow Fall Detection (URFD) dataset and the Multiple Cameras Fall Dataset (MCFD), demonstrate exceptional performance, achieving 88.8% precision, 94.1% recall, an F1-score of 91.4%, and a specificity of 95.6%. The method outperforms existing approaches by effectively capturing complex geometric changes during fall events. The system is applicable to smart homes, wearable devices, and healthcare monitoring platforms, offering a scalable, reliable, and efficient solution to enhance safety and independence for elderly individuals, thereby contributing to advancements in health-monitoring technology.https://doi.org/10.1371/journal.pone.0325253
spellingShingle Vungsovanreach Kong
Saravit Soeng
Munirot Thon
Wan-Sup Cho
Anand Nayyar
Tae-Kyung Kim
PIFR: A novel approach for analyzing pose angle-based human activity to automate fall detection in videos.
PLoS ONE
title PIFR: A novel approach for analyzing pose angle-based human activity to automate fall detection in videos.
title_full PIFR: A novel approach for analyzing pose angle-based human activity to automate fall detection in videos.
title_fullStr PIFR: A novel approach for analyzing pose angle-based human activity to automate fall detection in videos.
title_full_unstemmed PIFR: A novel approach for analyzing pose angle-based human activity to automate fall detection in videos.
title_short PIFR: A novel approach for analyzing pose angle-based human activity to automate fall detection in videos.
title_sort pifr a novel approach for analyzing pose angle based human activity to automate fall detection in videos
url https://doi.org/10.1371/journal.pone.0325253
work_keys_str_mv AT vungsovanreachkong pifranovelapproachforanalyzingposeanglebasedhumanactivitytoautomatefalldetectioninvideos
AT saravitsoeng pifranovelapproachforanalyzingposeanglebasedhumanactivitytoautomatefalldetectioninvideos
AT munirotthon pifranovelapproachforanalyzingposeanglebasedhumanactivitytoautomatefalldetectioninvideos
AT wansupcho pifranovelapproachforanalyzingposeanglebasedhumanactivitytoautomatefalldetectioninvideos
AT anandnayyar pifranovelapproachforanalyzingposeanglebasedhumanactivitytoautomatefalldetectioninvideos
AT taekyungkim pifranovelapproachforanalyzingposeanglebasedhumanactivitytoautomatefalldetectioninvideos