Facial Kinship Recognition Through Dilated-Stacked-Unified Attention Network

Bibliographic Details
Main Authors: Ahmed A. Dabas, Mohamed A. Ismail, Nagia M. Ghanem
Format: Article
Language:English
Published: IEEE 2025-01-01
Series:IEEE Access
Subjects:
Online Access:https://ieeexplore.ieee.org/document/11048515/
Description
Summary:Facial kinship verification (FKV) and facial kinship identification (FKI) are the two major tasks in facial kinship recognition (FKR). Nevertheless, most recent works focus mainly on FKV, and only a few address FKI. Joint learning of both tasks can improve inference metrics and has already been adopted in recent works that model the problem as a stacked channel-spatial attention scheme. However, those joint models learn the channel and spatial features in an attention chain using a bottom-up top-down feed-forward mechanism, which can impede the global spatial feature representation and fail to capture the genetic similarities that are spread across the entire facial image. Motivated by this, we propose an FKV network built on top of a dilated stack of channel-spatial attention modules that enlarges the receptive fields, enhancing attention to discriminative kinship features while preserving global spatial information. We also investigate the effects of learning those features in an alternative decoupled channel-spatial domain. Finally, we extend our work by building a jointly learnt ensemble of unified FKV networks for the FKI task and show that our scheme performs well compared with state-of-the-art methods on benchmark datasets.
ISSN:2169-3536
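
The dilated channel-spatial attention idea in the summary can be sketched as follows. This is an illustrative toy, not the authors' architecture: the sigmoid gating, the global-average channel squeeze, and the 3x3 dilated spatial neighborhood are all assumptions chosen to show how dilation enlarges the receptive field of the spatial attention map.

```python
import math

def sigmoid(x):
    """Logistic gate used for both attention maps (an assumption, not the paper's choice)."""
    return 1.0 / (1.0 + math.exp(-x))

def channel_attention(fmap):
    """Gate each channel by a sigmoid of its global average (squeeze-style channel attention).

    fmap: list of C channels, each an H x W list of floats.
    """
    weights = [
        sigmoid(sum(sum(row) for row in ch) / (len(ch) * len(ch[0])))
        for ch in fmap
    ]
    return [[[v * w for v in row] for row in ch] for ch, w in zip(fmap, weights)]

def dilated_spatial_attention(fmap, dilation=2):
    """Spatial attention over a 3x3 neighborhood whose offsets are scaled by `dilation`.

    A larger dilation samples farther-apart positions, enlarging the receptive
    field without adding parameters -- the mechanism the summary attributes to
    the dilated attention stack.
    """
    C, H, W = len(fmap), len(fmap[0]), len(fmap[0][0])
    # Collapse channels into one spatial map by averaging.
    spatial = [
        [sum(fmap[c][i][j] for c in range(C)) / C for j in range(W)]
        for i in range(H)
    ]
    # 3x3 neighborhood offsets, stretched by the dilation rate.
    offs = [(di * dilation, dj * dilation) for di in (-1, 0, 1) for dj in (-1, 0, 1)]
    attn = [[0.0] * W for _ in range(H)]
    for i in range(H):
        for j in range(W):
            vals = [
                spatial[i + di][j + dj]
                for di, dj in offs
                if 0 <= i + di < H and 0 <= j + dj < W
            ]
            attn[i][j] = sigmoid(sum(vals) / len(vals))
    # Reweight every channel by the shared spatial attention map.
    return [
        [[fmap[c][i][j] * attn[i][j] for j in range(W)] for i in range(H)]
        for c in range(C)
    ]

# Stacking the two modules mimics one channel-spatial attention stage.
fmap = [[[1.0] * 4 for _ in range(4)] for _ in range(2)]  # 2 channels, 4x4
out = dilated_spatial_attention(channel_attention(fmap), dilation=2)
```

Stacking several such stages with increasing dilation rates is one way to read "dilated stack": each stage attends over a wider area of the face while the attention maps themselves stay cheap to compute.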