Effectiveness and optimization of bidirectional long short-term memory (BiLSTM) based fast detection of deep fake face videos for real-time applications

Bibliographic Details
Main Author: Haoxiang Wang
Format: Article
Language: English
Published: PeerJ Inc. 2025-05-01
Series: PeerJ Computer Science
Online Access: https://peerj.com/articles/cs-2867.pdf
Description
Summary: This study proposes a rapid detection method for deepfake face videos designed for real-time applications using bidirectional long short-term memory (BiLSTM) networks. The aim is to overcome the limitations of current technologies in terms of efficiency and accuracy. An optimized BiLSTM architecture and training strategy are employed, enhancing recognition capability through data preprocessing and feature enhancement while minimizing computational complexity and resource consumption during detection. Experiments were conducted on the FaceForensics++ dataset, which includes authentic videos alongside four types of manipulated videos. The results show that the proposed BiLSTM-based approach outperforms existing methods in real-time detection. Specifically, integrating temporal analysis with conditional random fields (CRF) yielded significant improvements: a 1.6% increase in precision, a 2.0% improvement in recall, and a 2.5% increase in the F1-score. The BiLSTM-based rapid detection approach demonstrated high efficiency and accuracy across multiple standard datasets, achieving notable performance gains over current technologies. These findings highlight the method's potential and value for real-time deepfake detection applications.
ISSN: 2376-5992
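
For readers who want a concrete picture of the architecture the abstract describes, the sketch below shows a minimal BiLSTM sequence classifier over per-frame face features, written in PyTorch. It is an illustration only, not the authors' released code: the feature dimension, hidden size, layer count, mean-pooling head, and the omission of the CRF stage and the feature-enhancement preprocessing are all simplifying assumptions made here for brevity.

import torch
import torch.nn as nn

class BiLSTMDeepfakeDetector(nn.Module):
    """Minimal sketch of a BiLSTM classifier over per-frame face features.
    Feature extraction (e.g. a CNN backbone) and the CRF temporal layer
    described in the abstract are out of scope; dimensions are illustrative."""

    def __init__(self, feat_dim: int = 512, hidden_dim: int = 256,
                 num_layers: int = 2, num_classes: int = 2):
        super().__init__()
        # A bidirectional LSTM reads the frame sequence forwards and
        # backwards, so each timestep sees both past and future context.
        self.bilstm = nn.LSTM(
            input_size=feat_dim,
            hidden_size=hidden_dim,
            num_layers=num_layers,
            batch_first=True,
            bidirectional=True,
        )
        # Forward and backward hidden states are concatenated (2 * hidden_dim).
        self.head = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, frame_feats: torch.Tensor) -> torch.Tensor:
        # frame_feats: (batch, num_frames, feat_dim) per-frame embeddings.
        seq_out, _ = self.bilstm(frame_feats)   # (B, T, 2 * hidden_dim)
        per_frame_logits = self.head(seq_out)   # (B, T, num_classes)
        # Mean-pool per-frame logits into a video-level decision; the paper
        # instead refines the per-frame sequence with a CRF.
        return per_frame_logits.mean(dim=1)     # (B, num_classes)

# Usage with dummy data: 8 videos, 30 frames each, 512-d features per frame.
model = BiLSTMDeepfakeDetector()
logits = model(torch.randn(8, 30, 512))
print(logits.shape)  # torch.Size([8, 2])

A full system along the paper's lines would feed real per-frame CNN embeddings in place of the random tensor and replace the mean-pooling head with the CRF-based temporal refinement that the abstract credits for the precision, recall, and F1 gains.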