Gender Classification Using Face Vectors: A Deep Learning Approach Without Classical Models
| Main Authors: | , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2025-06-01 |
| Series: | Information |
| Subjects: | |
| Online Access: | https://www.mdpi.com/2078-2489/16/7/531 |
| Summary: | In recent years, deep learning techniques have become increasingly prominent in face recognition tasks, particularly through the extraction and classification of face vectors. These vectors enable the inference of demographic attributes such as gender, age, and ethnicity. This study introduces a gender classification approach based solely on face vectors, avoiding the use of traditional machine learning algorithms. Face embeddings were generated using three popular models: dlib, ArcFace, and FaceNet512. For classification, the Average Neural Face Embeddings (ANFE) technique was applied by calculating distances between vectors. To improve gender recognition performance for Asian individuals, a new dataset was created by scraping facial images and related metadata from AsianWiki. The experimental evaluations revealed that ANFE models based on ArcFace achieved classification accuracies of 93.1% for Asian women and 90.2% for Asian men. In contrast, the models utilizing dlib embeddings performed notably lower, with accuracies dropping to 76.4% for women and 74.3% for men. Among the tested models, FaceNet512 provided the best results, reaching 97.5% accuracy for female subjects and 94.2% for males. Furthermore, this study includes a comparative analysis between ANFE and other commonly used gender classification methods. |
| ISSN: | 2078-2489 |
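The summary describes the Average Neural Face Embeddings (ANFE) technique as classifying gender by measuring distances between face vectors, without a traditional classifier. The paper's implementation is not reproduced here; the sketch below shows one plausible reading of that idea — average the embeddings of each gender class and assign a query face to the nearest average — using synthetic vectors in place of real ArcFace/FaceNet512 embeddings. Function names and the Euclidean metric are illustrative assumptions, not details from the article.

```python
import numpy as np

def average_embedding(embeddings):
    # Mean of all class embeddings: the "average neural face embedding" (assumed reading).
    return np.mean(embeddings, axis=0)

def classify_gender(query, anfe_by_class):
    # Pick the class whose average embedding is closest (Euclidean distance assumed).
    return min(anfe_by_class, key=lambda c: np.linalg.norm(query - anfe_by_class[c]))

# Synthetic 512-d vectors stand in for real FaceNet512/ArcFace embeddings.
rng = np.random.default_rng(0)
female_vecs = rng.normal(loc=0.5, scale=0.1, size=(100, 512))
male_vecs = rng.normal(loc=-0.5, scale=0.1, size=(100, 512))

anfe = {
    "female": average_embedding(female_vecs),
    "male": average_embedding(male_vecs),
}

# A query vector drawn from the "female" cluster should land nearest that average.
query = rng.normal(loc=0.5, scale=0.1, size=512)
print(classify_gender(query, anfe))  # → female
```

In practice the query vector would come from the same embedding model used to build the class averages (dlib, ArcFace, or FaceNet512 in the study), since distances across different embedding spaces are not comparable.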