Comparative Analysis of YOLO Variants Based on Performance Evaluation for Object Detection

Bibliographic Details
Main Author: Chen Aoxiang
Format: Article
Language: English
Published: EDP Sciences, 2025-01-01
Series: ITM Web of Conferences
Online Access: https://www.itm-conferences.org/articles/itmconf/pdf/2025/01/itmconf_dai2024_03008.pdf
Description
Summary: This study focuses on analysing and exploring the You Only Look Once (YOLO) algorithm. Specifically, this article analyses the evolution and performance of three versions (YOLOv1, YOLOv5, and YOLOv8) in object detection. The research begins by detailing the fundamental concepts of object detection and the datasets commonly used in this field. It then delves into the specific architectures and experimental outcomes associated with each YOLO version. The analysis reveals that while YOLOv8 introduces advanced features and improvements, earlier versions like YOLOv5 may offer superior stability and performance under certain conditions, particularly in specific tasks such as car detection. The discussion emphasizes the significant impact of factors such as batch size on model performance, suggesting that fine-tuning these parameters can optimize the algorithm for particular applications. The study concludes that the future of YOLO development lies in exploring and refining different variants, particularly those of YOLOv8, to better meet diverse requirements. By focusing on five distinct YOLOv8 variants, the research aims to enhance the adaptability and effectiveness of the YOLO framework across a wide range of object detection challenges, thereby contributing valuable insights into the ongoing advancement of this technology.
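The abstract describes sweeping five YOLOv8 variants while varying training parameters such as batch size, but the record does not reproduce the paper's exact experimental setup. As a minimal sketch, the comparison can be organised as an experiment grid; the variant names below follow the standard Ultralytics naming (n/s/m/l/x), and the batch-size values are illustrative assumptions, not the paper's settings:

```python
from itertools import product

# The five public YOLOv8 variants, ordered by model size
# (n = nano, s = small, m = medium, l = large, x = extra large).
VARIANTS = ["yolov8n", "yolov8s", "yolov8m", "yolov8l", "yolov8x"]

# Illustrative batch sizes to sweep; the abstract notes batch size
# significantly affects model performance, so each variant is paired
# with every candidate value.
BATCH_SIZES = [8, 16, 32]

def experiment_grid(variants, batch_sizes):
    """Return one run configuration per (variant, batch size) pair."""
    return [{"model": v, "batch": b} for v, b in product(variants, batch_sizes)]

runs = experiment_grid(VARIANTS, BATCH_SIZES)
print(len(runs))  # 5 variants x 3 batch sizes = 15 runs
```

Each configuration in `runs` could then be passed to a training call (e.g. the Ultralytics `YOLO(...).train(batch=...)` API) and evaluated on a common detection benchmark, so that variant and batch-size effects are compared under otherwise identical conditions.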
ISSN: 2271-2097