Masked and Unmasked Face Recognition Model Using Deep Learning Techniques: A Case of the Black Race
Main Authors: , ,
Format: Article
Language: en_US
Published: Kabale University, 2024
Online Access: http://hdl.handle.net/20.500.12493/2003
Summary: Currently, many institutions of higher learning in Uganda face major security threats ranging from burglary to cyber attacks. Consequently, the institutions have recruited and deployed trained personnel to provide the desired security. As human beings, these personnel can make errors of either commission or omission. To overcome the limitations of trained security personnel, many face recognition models have been developed that automatically detect masked and unmasked faces to control access to sensitive premises. However, the state-of-the-art models are not generalizable across populations and will probably not work in the Ugandan context, because they have not been implemented with capabilities to eliminate racial bias in face recognition. This study therefore developed a deep learning model for masked and unmasked face recognition based on the local context. The model was trained and tested on 1,000 images of Kabale University students taken with a Nikon D850 camera. Machine learning techniques such as Principal Component Analysis, geometric feature-based methods, and double-threshold techniques were used in the development phase, while results were classified using pre-trained CNN models. From the results obtained, VGG19 achieved the highest accuracy at 91.2%, followed by Inception V3 at 90.3% and VGG16 at 89.69%, whereas the developed model achieved 90.32%.
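The abstract names Principal Component Analysis among the feature-extraction techniques used in the development phase. A minimal sketch of PCA-based feature extraction for flattened face images (the classic eigenfaces approach) might look like the following; the function name, image sizes, and component count are illustrative assumptions, not details from the paper:

```python
import numpy as np

def pca_features(images, n_components=4):
    """Project flattened face images onto their top principal
    components (eigenfaces-style dimensionality reduction).

    images: (n_samples, n_pixels) float array, one flattened image per row.
    Returns (projections, components, mean).
    """
    mean = images.mean(axis=0)
    centered = images - mean
    # SVD of the centered data: rows of vt are the principal axes,
    # ordered by decreasing explained variance.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    components = vt[:n_components]          # (n_components, n_pixels)
    projections = centered @ components.T   # low-dimensional features
    return projections, components, mean

# Toy demo: ten synthetic 8x8 "faces" reduced to 4 features each.
rng = np.random.default_rng(0)
faces = rng.random((10, 64))
feats, comps, mu = pca_features(faces, n_components=4)
print(feats.shape)  # (10, 4)
```

In a full pipeline like the one the abstract describes, features of this kind (or raw images) would then be passed to a classifier, e.g. a pre-trained CNN such as VGG19, for the final masked/unmasked decision.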