YOLO-Type Neural Networks in the Process of Adapting Mathematical Graphs to the Needs of the Blind

Bibliographic Details
Main Authors: Mateusz Kawulok, Michał Maćkowski
Format: Article
Language: English
Published: MDPI AG 2024-12-01
Series: Applied Sciences
Online Access: https://www.mdpi.com/2076-3417/14/24/11829
Description
Summary: This publication focuses on verifying the effectiveness of AI in adapting traditional educational materials to digital form for blind people. Despite the existence of solutions that assist visually impaired people, the adaptation of graphics remains problematic. To address these challenges, machine learning, which is becoming increasingly prominent in modern solutions, can be effective. Of particular note are YOLO neural networks, known for their ability to analyze images accurately and in real time. The potential of these networks has not yet been fully validated in the context of mathematical graphics for the visually impaired. This research determined the effectiveness of selected versions of YOLO in recognizing relevant elements in mathematical graphs and identified the advantages and limitations of each version. It also pointed out further potential developments in adapting graphs into forms accessible to blind people. The obtained results indicate that YOLOv5 and YOLOv8 have the most potential in this field. This research not only highlights the applicability of machine learning to accessibility challenges but also provides a foundation for the development of automated tools that can assist teachers in inclusive classroom environments.
ISSN: 2076-3417
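
For context, the abstract describes applying YOLO object detectors to images of mathematical graphs. Below is a minimal sketch, not the authors' published pipeline, of how a YOLOv8 model could be run on such an image using the ultralytics Python package; the weights file graph_elements.pt, the input image name, and the element class names are illustrative assumptions.

# Minimal sketch: detect elements in a mathematical graph image with YOLOv8.
# "graph_elements.pt" stands in for hypothetical weights trained on graph
# elements (e.g. axes, curves, labels); it is not an artifact of this paper.
from ultralytics import YOLO

model = YOLO("graph_elements.pt")       # hypothetical custom-trained weights
results = model("function_plot.png")    # run inference on one graph image

for result in results:
    for box in result.boxes:
        cls_name = result.names[int(box.cls)]    # detected element class
        conf = float(box.conf)                   # detection confidence
        x1, y1, x2, y2 = box.xyxy[0].tolist()    # bounding box coordinates
        # Detected regions and labels could then be converted into a
        # non-visual description (text or audio) for a blind student.
        print(f"{cls_name}: {conf:.2f} at ({x1:.0f}, {y1:.0f}, {x2:.0f}, {y2:.0f})")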