State of the Art and Future Directions of Small Language Models: A Systematic Review

Small Language Models (SLMs) have emerged as a critical area of study within natural language processing, attracting growing attention from both academia and industry. This systematic literature review provides a comprehensive and reproducible analysis of recent developments and advancements in SLMs post-2023. Drawing on 70 English-language studies published between January 2023 and January 2025, identified through Scopus, IEEE Xplore, Web of Science, and ACM Digital Library, and focusing primarily on SLMs (including those with up to 7 billion parameters), this review offers a structured overview of the current state of the art and potential future directions. Designed as a resource for researchers seeking an in-depth global synthesis, the review examines key dimensions such as publication trends, visual data representations, contributing institutions, and the availability of public datasets. It highlights prevailing research challenges and outlines proposed solutions, with a particular focus on widely adopted model architectures, as well as common compression and optimization techniques. This study also evaluates the criteria used to assess the effectiveness of SLMs and discusses emerging de facto standards for industry. The curated data and insights aim to support and inform ongoing and future research in this rapidly evolving field.


Bibliographic Details
Main Authors: Flavio Corradini, Matteo Leonesi, Marco Piangerelli
Format: Article
Language:English
Published: MDPI AG 2025-07-01
Series:Big Data and Cognitive Computing
Subjects: small language models; systematic literature review; architectural compression; benchmarking; future directions; generative AI
Online Access:https://www.mdpi.com/2504-2289/9/7/189
collection DOAJ
issn 2504-2289
citation Big Data and Cognitive Computing, vol. 9, no. 7, art. 189 (2025)
doi 10.3390/bdcc9070189
affiliations All three authors (Flavio Corradini, Matteo Leonesi, Marco Piangerelli): Computer Science Division, School of Science and Technology, University of Camerino, Via Madonna delle Carceri 7, 62032 Camerino, Italy