Anthropometric Landmark Detection Network via Geodesic Heatmap on 3D Human Scan

Bibliographic Details
Main Authors: Min Hee Cha, Jae Hyeon Park, Ji Sun Byun, Sangyeon Ahn, Gyoomin Lee, Seung Hyun Yoon, Sung In Cho
Format: Article
Language: English
Published: IEEE 2024-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/10806647/
Description
Summary: In recent years, with the commercialization of three-dimensional (3D) scanners, there has been an increasing demand for automated techniques that can extract anthropometric data accurately and swiftly from 3D human body scans. With advances in computer vision and machine learning, researchers have increasingly focused on developing automated anthropometric data extraction techniques. In this paper, we propose a deep learning method for automatic anthropometric landmark extraction from 3D human scans. We adopt a coarse-to-fine approach consisting of a global detection stage and a local refinement stage to fully utilize the original geometric information of the input scan. Moreover, we introduce a novel geodesic heatmap that effectively captures the point distribution of 3D shapes, even in the presence of variations in scanning pose. As a result, our method provides the lowest average detection error on the SHREC'14 dataset over the six anthropometric landmarks, demonstrating a maximum error reduction of 76.14%. Additionally, we created a dataset consisting of human scans in various poses to demonstrate the robustness of our method. On this new dataset, our end-to-end strategy proved effective across various human postures without any predefined features or templates.
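
The record does not describe how the geodesic heatmap is computed. As a rough illustration only (not the authors' implementation), the sketch below assumes a human scan given as a triangle mesh reduced to an edge graph, geodesic distances approximated by Dijkstra over mesh edges, and a hypothetical Gaussian bandwidth sigma; it builds a per-vertex heatmap for a single landmark vertex.

# Minimal sketch: per-vertex geodesic heatmap for one landmark on a mesh.
# All names and parameters here are illustrative assumptions, not taken
# from the paper.
import heapq
import math

def geodesic_distances(num_vertices, edges, source):
    """Approximate geodesic distances with Dijkstra over the mesh edge graph.
    edges: list of (i, j, length) tuples for mesh edges."""
    adj = [[] for _ in range(num_vertices)]
    for i, j, w in edges:
        adj[i].append((j, w))
        adj[j].append((i, w))
    dist = [math.inf] * num_vertices
    dist[source] = 0.0
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:
            continue
        for v, w in adj[u]:
            nd = d + w
            if nd < dist[v]:
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

def geodesic_heatmap(num_vertices, edges, landmark_vertex, sigma=0.05):
    """Gaussian falloff of geodesic distance: exp(-d^2 / (2 * sigma^2))."""
    dist = geodesic_distances(num_vertices, edges, landmark_vertex)
    return [math.exp(-(d * d) / (2.0 * sigma * sigma)) for d in dist]

# Tiny example: a 4-vertex strip of two triangles with the landmark at vertex 0.
edges = [(0, 1, 1.0), (1, 2, 1.0), (0, 2, 1.4), (2, 3, 1.0), (1, 3, 1.4)]
print(geodesic_heatmap(4, edges, landmark_vertex=0, sigma=1.0))

Because the heat values depend on geodesic rather than Euclidean distances, such a representation changes little when limbs bend, which is consistent with the pose robustness claimed in the summary.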
ISSN: 2169-3536