Distortion Measurement Metric for Human Image Refinement and Evaluation Using Distorted Image Datasets
This paper introduces novel evaluation metrics for quantifying distortion in human image retouching and presents a COCO-based retouched image dataset to validate their effectiveness. The dataset is constructed using human images from the COCO dataset, which provides a large-scale and diverse collection of images, including a variety of facial structures, poses, and lighting conditions, making it well-suited for human-centered retouching research.
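The abstract contrasts PSNR/SSIM with edge- and corner-based distortion measures. As a rough illustration only (this record does not give the paper's actual DLS and P2P definitions), the sketch below computes a Dice overlap between gradient-based edge maps and a mean displacement between matched feature points; the names `edge_map`, `line_similarity`, and `point_to_point` are hypothetical, and NumPy gradients stand in for proper edge/corner detectors.

```python
import numpy as np

def edge_map(img, thresh=0.2):
    # Gradient-magnitude edge map; a crude stand-in for Canny-style edge detection.
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    return mag > thresh * (mag.max() + 1e-12)

def line_similarity(ref, dist):
    # Dice overlap of the two edge maps: 1.0 means the line structure is
    # unchanged; lower values indicate distorted lines and curves.
    e1, e2 = edge_map(ref), edge_map(dist)
    inter = np.logical_and(e1, e2).sum()
    return 2.0 * inter / (e1.sum() + e2.sum() + 1e-12)

def point_to_point(pts_ref, pts_dist):
    # Mean Euclidean displacement between matched feature points
    # (e.g., corners found by a detector and matched across the two images).
    return float(np.mean(np.linalg.norm(pts_ref - pts_dist, axis=1)))

# Toy check: a vertical edge vs. a copy shifted right by two pixels.
ref = np.zeros((32, 32)); ref[:, 16:] = 1.0
warped = np.zeros((32, 32)); warped[:, 18:] = 1.0
print(round(line_similarity(ref, ref), 6))     # 1.0: identical edge maps
print(round(line_similarity(ref, warped), 6))  # 0.0: shifted edges no longer overlap
print(point_to_point(np.array([[0.0, 0.0], [3.0, 4.0]]),
                     np.zeros((2, 2))))        # 2.5: mean of displacements 0 and 5
```

Unlike PSNR or SSIM, which average pixel-level differences, measures of this shape respond specifically to displaced structure, which is why a two-pixel shift drives the edge overlap to zero here while leaving most pixels unchanged.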
Saved in:
| Main Authors: | Gwangyeol Yu, Kyungseo Yoon, Sangwoo Lee, Yeju Shin, Jonghyuk Park |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2025-01-01 |
| Series: | IEEE Access |
| Subjects: | Computer vision, dataset, image distortion, metric |
| Online Access: | https://ieeexplore.ieee.org/document/10892112/ |
| _version_ | 1850078504773746688 |
|---|---|
| author | Gwangyeol Yu, Kyungseo Yoon, Sangwoo Lee, Yeju Shin, Jonghyuk Park |
| author_facet | Gwangyeol Yu, Kyungseo Yoon, Sangwoo Lee, Yeju Shin, Jonghyuk Park |
| author_sort | Gwangyeol Yu |
| collection | DOAJ |
| description | This paper introduces novel evaluation metrics for quantifying distortion in human image retouching and presents a COCO-based retouched image dataset to validate their effectiveness. The dataset is constructed using human images from the COCO dataset, which provides a large-scale and diverse collection of images, including a variety of facial structures, poses, and lighting conditions, making it well-suited for human-centered retouching research. Despite extensive research into the perceptual assessment of visual discomfort induced by distorted images, a significant gap exists in the availability of quantifiable evaluation metrics and dedicated distortion datasets. To address this issue, we developed two distinct evaluation metrics and generated an optimal dataset tailored for this purpose. The two evaluation metrics proposed in this study are the Distorted Line Similarity (DLS) metric, which uses edge detection, and the Point-to-Point (P2P) metric, which leverages corner detection. In contrast to traditional image quality assessment metrics such as PSNR and SSIM, these new metrics quantify the changes in lines, curves, and feature points within an image, providing meaningful results when assessing image distortion. Furthermore, the proposed framework produces results with reduced distortion compared to existing retouching applications. Using the proposed metrics, we demonstrate that the images generated by the proposed framework exhibit significantly less distortion than the original distorted images. The demand for retouching images in which the main subject is a person is increasing. Using the proposed metrics, excessive distortions can be quantified and detected in contexts such as immigration procedures.
By publicly releasing the COCO-based retouched image dataset, which includes the original images, distorted images, and images with minimized distortions used in our experiments, we aim to demonstrate the quality of our dataset and contribute to the field of image distortion minimization research. |
| format | Article |
| id | doaj-art-8f27f9efdfdc43e3a9016d295caaee70 |
| institution | DOAJ |
| issn | 2169-3536 |
| language | English |
| publishDate | 2025-01-01 |
| publisher | IEEE |
| record_format | Article |
| series | IEEE Access |
| spelling | doaj-art-8f27f9efdfdc43e3a9016d295caaee702025-08-20T02:45:32ZengIEEEIEEE Access2169-35362025-01-0113343903440810.1109/ACCESS.2025.354354210892112Distortion Measurement Metric for Human Image Refinement and Evaluation Using Distorted Image DatasetsGwangyeol Yu0https://orcid.org/0009-0000-7873-9356Kyungseo Yoon1https://orcid.org/0009-0001-5629-703XSangwoo Lee2https://orcid.org/0009-0002-1459-0163Yeju Shin3https://orcid.org/0009-0001-2151-9034Jonghyuk Park4https://orcid.org/0000-0003-4283-1155Department of AI, Big Data, and Management, Kookmin University, Seongbuk-gu, Seoul, Republic of KoreaDepartment of AI, Big Data, and Management, Kookmin University, Seongbuk-gu, Seoul, Republic of KoreaDepartment of AI, Big Data, and Management, Kookmin University, Seongbuk-gu, Seoul, Republic of KoreaDepartment of AI, Big Data, and Management, Kookmin University, Seongbuk-gu, Seoul, Republic of KoreaDepartment of AI, Big Data, and Management, Kookmin University, Seongbuk-gu, Seoul, Republic of KoreaThis paper introduces novel evaluation metrics for quantifying distortion in human image retouching and presents a COCO-based retouched image dataset to validate their effectiveness. The dataset is constructed using human images from the COCO dataset, which provides a large-scale and diverse collection of images, including a variety of facial structures, poses, and lighting conditions, making it well-suited for human-centered retouching research. Despite extensive research into the perceptual assessment of visual discomfort induced by distorted images, a significant gap exists in the availability of quantifiable evaluation metrics and dedicated distortion datasets. To address this issue, we developed two distinct evaluation metrics and generated an optimal dataset tailored for this purpose. 
The two evaluation metrics proposed in this study are the Distorted Line Similarity (DLS) metric, which uses edge detection, and the Point-to-Point (P2P) metric, which leverages corner detection. In contrast to traditional image quality assessment metrics such as PSNR and SSIM, these new metrics quantify the changes in lines, curves, and feature points within an image, providing meaningful results when assessing image distortion. Furthermore, the proposed framework produces results with reduced distortion compared to existing retouching applications. Using the proposed metrics, we demonstrate that the images generated by the proposed framework exhibit significantly less distortion than the original distorted images. The demand for retouching images in which the main subject is a person is increasing. Using the proposed metrics, excessive distortions can be quantified and detected in contexts such as immigration procedures. By publicly releasing the COCO-based retouched image dataset, which includes the original images, distorted images, and images with minimized distortions used in our experiments, we aim to demonstrate the quality of our dataset and contribute to the field of image distortion minimization research.https://ieeexplore.ieee.org/document/10892112/Computer visiondatasetimage distortionmetric |
| spellingShingle | Gwangyeol Yu Kyungseo Yoon Sangwoo Lee Yeju Shin Jonghyuk Park Distortion Measurement Metric for Human Image Refinement and Evaluation Using Distorted Image Datasets IEEE Access Computer vision dataset image distortion metric |
| title | Distortion Measurement Metric for Human Image Refinement and Evaluation Using Distorted Image Datasets |
| title_full | Distortion Measurement Metric for Human Image Refinement and Evaluation Using Distorted Image Datasets |
| title_fullStr | Distortion Measurement Metric for Human Image Refinement and Evaluation Using Distorted Image Datasets |
| title_full_unstemmed | Distortion Measurement Metric for Human Image Refinement and Evaluation Using Distorted Image Datasets |
| title_short | Distortion Measurement Metric for Human Image Refinement and Evaluation Using Distorted Image Datasets |
| title_sort | distortion measurement metric for human image refinement and evaluation using distorted image datasets |
| topic | Computer vision, dataset, image distortion, metric |
| url | https://ieeexplore.ieee.org/document/10892112/ |
| work_keys_str_mv | AT gwangyeolyu distortionmeasurementmetricforhumanimagerefinementandevaluationusingdistortedimagedatasets AT kyungseoyoon distortionmeasurementmetricforhumanimagerefinementandevaluationusingdistortedimagedatasets AT sangwoolee distortionmeasurementmetricforhumanimagerefinementandevaluationusingdistortedimagedatasets AT yejushin distortionmeasurementmetricforhumanimagerefinementandevaluationusingdistortedimagedatasets AT jonghyukpark distortionmeasurementmetricforhumanimagerefinementandevaluationusingdistortedimagedatasets |