Generating Automatically Print/Scan Textures for Morphing Attack Detection Applications

The focus of Morphing Attack Detection (MAD) is to identify unauthorised attempts to use a legitimate identity. One common scenario involves creating altered images and using them in passport applications. Currently, there are limited datasets available for training the MAD algorithm due to privacy concerns and the challenges of obtaining and processing a large number of printed and scanned images. A larger and more diverse dataset representing passport application scenarios, including various devices and resulting printed, scanned, or compressed images, is needed to enhance the detection capabilities and identify such morphing attacks. However, generating training data that accurately represents the variety of attacks is a labour-intensive task since the training material is created manually. This paper presents two methods based on texture transfer techniques for the automatic generation of digital print and scan facial images, which are utilized to train a Morphing Attack Detection algorithm. Our proposed methods achieve an Equal Error Rate (EER) of 3.84% and 1.92% on the FRGC/FERET database when incorporating our synthetic and texture-transferred print/scan images at 600 dpi alongside handcrafted images, respectively.
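The abstract reports results as an Equal Error Rate (EER). For orientation only, the minimal Python sketch below (using hypothetical, randomly generated detection scores, not data or code from the paper) shows how an EER is commonly approximated from per-image scores, assuming higher scores indicate a morphing attack:

```python
import numpy as np

def compute_eer(bona_fide_scores, attack_scores):
    """Approximate the Equal Error Rate: the threshold at which the rate of
    attacks accepted as bona fide equals the rate of bona fide images
    rejected as attacks. Higher scores are assumed to mean 'more attack-like'."""
    thresholds = np.sort(np.concatenate([bona_fide_scores, attack_scores]))
    best_eer, best_gap = 1.0, np.inf
    for t in thresholds:
        far = np.mean(attack_scores < t)      # attacks wrongly accepted
        frr = np.mean(bona_fide_scores >= t)  # bona fide images wrongly rejected
        gap = abs(far - frr)
        if gap < best_gap:
            best_gap, best_eer = gap, (far + frr) / 2
    return best_eer

# Hypothetical scores for illustration only.
rng = np.random.default_rng(0)
bona_fide = rng.normal(0.2, 0.1, 1000)  # scores for bona fide images
attacks = rng.normal(0.8, 0.1, 1000)    # scores for morphed print/scan images
print(f"EER: {compute_eer(bona_fide, attacks):.2%}")
```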

Bibliographic Details
Main Authors: Juan E. Tapia, Maximilian Russo, Christoph Busch
Format: Article
Language: English
Published: IEEE, 2025-01-01
Series: IEEE Access
Subjects: Biometrics; face generation; print-scan; morphing; GANs
Online Access: https://ieeexplore.ieee.org/document/10945320/
author Juan E. Tapia
Maximilian Russo
Christoph Busch
collection DOAJ
description The focus of Morphing Attack Detection (MAD) is to identify unauthorised attempts to use a legitimate identity. One common scenario involves creating altered images and using them in passport applications. Currently, there are limited datasets available for training the MAD algorithm due to privacy concerns and the challenges of obtaining and processing a large number of printed and scanned images. A larger and more diverse dataset representing passport application scenarios, including various devices and resulting printed, scanned, or compressed images, is needed to enhance the detection capabilities and identify such morphing attacks. However, generating training data that accurately represents the variety of attacks is a labour-intensive task since the training material is created manually. This paper presents two methods based on texture transfer techniques for the automatic generation of digital print and scan facial images, which are utilized to train a Morphing Attack Detection algorithm. Our proposed methods achieve an Equal Error Rate (EER) of 3.84% and 1.92% on the FRGC/FERET database when incorporating our synthetic and texture-transferred print/scan images at 600 dpi alongside handcrafted images, respectively.
format Article
id doaj-art-ec9c5da6afd94d80843ddb2fda7acd70
institution DOAJ
issn 2169-3536
language English
publishDate 2025-01-01
publisher IEEE
record_format Article
series IEEE Access
spelling Juan E. Tapia (https://orcid.org/0000-0001-9159-4075), Maximilian Russo, and Christoph Busch (https://orcid.org/0000-0002-9159-2923), Hochschule Darmstadt, da/sec-Biometrics and Internet Security Research Group, Darmstadt, Germany. "Generating Automatically Print/Scan Textures for Morphing Attack Detection Applications," IEEE Access, vol. 13, pp. 55277-55289, 2025. ISSN 2169-3536. DOI: 10.1109/ACCESS.2025.3555922. IEEE Xplore document 10945320: https://ieeexplore.ieee.org/document/10945320/. Keywords: Biometrics, face generation, print-scan, morphing, GANs. DOAJ record doaj-art-ec9c5da6afd94d80843ddb2fda7acd70, 2025-08-20T03:07:05Z.
title Generating Automatically Print/Scan Textures for Morphing Attack Detection Applications
topic Biometrics
face generation
print-scan
morphing
GANs
url https://ieeexplore.ieee.org/document/10945320/