Tiny Machine Learning and On-Device Inference: A Survey of Applications, Challenges, and Future Directions

The growth in artificial intelligence and its applications has led to increased data processing and inference requirements. Traditional cloud-based inference solutions are often used but may prove inadequate for applications requiring near-instantaneous response times. This review examines Tiny Machine Learning, also known as TinyML, as an alternative to cloud-based inference. The review focuses on applications where transmission delays make traditional Internet of Things (IoT) approaches impractical, thus necessitating a solution that uses TinyML and on-device inference. This study, which follows the PRISMA guidelines, covers TinyML’s use cases for real-world applications by analyzing experimental studies and synthesizing current research on the characteristics of TinyML experiments, such as machine learning techniques and the hardware used for experiments. This review identifies existing gaps in research as well as the means to address these gaps. The review findings suggest that TinyML has a strong record of real-world usability and offers advantages over cloud-based inference, particularly in environments with bandwidth constraints and use cases that require rapid response times. This review discusses the implications of TinyML’s experimental performance for future research on TinyML applications.
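As a purely illustrative sketch (not drawn from the article itself), the snippet below shows the general pattern of on-device inference that TinyML deployments rely on: a compact model is loaded from local storage and evaluated directly on the device, so no network round trip is needed at prediction time. The model file name model.tflite and the randomly generated sensor reading are hypothetical placeholders, and the example assumes the tflite-runtime (or full TensorFlow) Python package is available on the target device.

```python
# Illustrative sketch: on-device inference with a TensorFlow Lite model.
# "model.tflite" is a hypothetical placeholder for a compact (e.g., quantized)
# model that has already been copied onto the device.
import numpy as np

try:
    # Lightweight interpreter-only package intended for edge devices.
    from tflite_runtime.interpreter import Interpreter
except ImportError:
    # Fall back to the interpreter bundled with full TensorFlow.
    import tensorflow as tf
    Interpreter = tf.lite.Interpreter

# Load the model once at startup; inference afterwards needs no connectivity.
interpreter = Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Simulated sensor reading, shaped and typed to match the model's input tensor.
sample = np.random.random_sample(input_details[0]["shape"]).astype(
    input_details[0]["dtype"]
)

# Run inference locally, avoiding the round-trip latency of a cloud service.
interpreter.set_tensor(input_details[0]["index"], sample)
interpreter.invoke()
prediction = interpreter.get_tensor(output_details[0]["index"])
print("On-device prediction:", prediction)
```

On microcontroller-class hardware the same pattern is typically expressed in C++ with TensorFlow Lite for Microcontrollers rather than Python, but the workflow (load a small model once, then run every inference entirely on the device) is the same.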

Bibliographic Details
Main Authors: Soroush Heydari, Qusay H. Mahmoud (both: Department of Electrical, Computer and Software Engineering, Ontario Tech University, Oshawa, ON L1G 0C5, Canada)
Format: Article
Language: English
Published: MDPI AG, 2025-05-01
Series: Sensors, Vol. 25, Iss. 10, Article 3191
ISSN: 1424-8220
DOI: 10.3390/s25103191
Subjects: TinyML; IoT; sensors; edge AI; edge computing; embedded ML
Online Access: https://www.mdpi.com/1424-8220/25/10/3191