Enhancing the YOLOv8 model for real-time object detection to ensure online platform safety

Bibliographic Details
Main Authors: Mohammed Kawser Jahan, Fokrul Islam Bhuiyan, Al Amin, M. F. Mridha, Mejdl Safran, Sultan Alfarhood, Dunren Che
Format: Article
Language: English
Published: Nature Portfolio 2025-07-01
Series: Scientific Reports
Online Access: https://doi.org/10.1038/s41598-025-08413-4
Description
Summary: In today’s digital environment, effectively detecting and censoring harmful and offensive objects such as weapons, addictive substances, and violent content on online platforms is increasingly important for user safety. This study introduces an Enhanced Object Detection (EOD) model that builds upon the YOLOv8-m architecture to improve the identification of such harmful objects in complex scenarios. Our key contributions include enhancing the cross-stage partial fusion blocks and incorporating three additional convolutional blocks into the model head, leading to better feature extraction and detection capabilities. Using a public dataset covering six categories of harmful objects, our EOD model achieves superior performance, with precision, recall, and mAP50 scores of 0.88, 0.89, and 0.92 on standard test data and 0.84, 0.74, and 0.82 on challenging test cases, surpassing existing deep learning approaches. Furthermore, we employ explainable AI techniques to validate the model’s confidence and decision-making process. These advancements not only enhance detection accuracy but also set a new benchmark for harmful object detection, contributing significantly to safety measures across online platforms.
ISSN: 2045-2322
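
The abstract's central modification, appending extra convolutional blocks to the YOLOv8-m detection head, can be illustrated with a minimal PyTorch-style sketch. All module names below (ConvBlock, EnhancedHead) are hypothetical stand-ins chosen for this illustration; the paper's exact EOD architecture, including its enhanced cross-stage partial fusion blocks, is not reproduced here.

```python
# Minimal sketch of a YOLO-style detection head with three extra
# convolutional blocks, as the abstract describes. Hypothetical names;
# not the paper's actual implementation.
import torch
import torch.nn as nn


class ConvBlock(nn.Module):
    """Conv -> BatchNorm -> SiLU, the basic unit used throughout YOLOv8."""

    def __init__(self, c_in: int, c_out: int, k: int = 3, s: int = 1):
        super().__init__()
        self.conv = nn.Conv2d(c_in, c_out, k, s, padding=k // 2, bias=False)
        self.bn = nn.BatchNorm2d(c_out)
        self.act = nn.SiLU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.act(self.bn(self.conv(x)))


class EnhancedHead(nn.Module):
    """Hypothetical head: three added ConvBlocks before the prediction conv."""

    def __init__(self, c_in: int, num_outputs: int):
        super().__init__()
        self.extra = nn.Sequential(  # the "three additional convolutional blocks"
            ConvBlock(c_in, c_in),
            ConvBlock(c_in, c_in),
            ConvBlock(c_in, c_in),
        )
        self.pred = nn.Conv2d(c_in, num_outputs, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.pred(self.extra(x))


if __name__ == "__main__":
    # Six harmful-object categories, as in the paper's dataset.
    head = EnhancedHead(c_in=256, num_outputs=6 + 4)  # 6 classes + 4 box coords
    feats = torch.randn(1, 256, 20, 20)  # one feature map from the neck
    print(head(feats).shape)  # torch.Size([1, 10, 20, 20])
```

A full implementation would attach such blocks at each output scale of the YOLOv8 head (for instance, by editing the model YAML in the Ultralytics framework), but that wiring depends on details specific to the paper.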