A Survey on Hardware Accelerators for Large Language Models
Large language models (LLMs) have emerged as powerful tools for natural language processing tasks, revolutionizing the field with their ability to understand and generate human-like text. As the demand for more sophisticated LLMs continues to grow, there is a pressing need to address the computational challenges associated with their scale and complexity. This paper presents a comprehensive survey of hardware accelerators designed to enhance the performance and energy efficiency of large language models. By examining a diverse range of accelerators, including GPUs, FPGAs, and custom-designed architectures, we explore the landscape of hardware solutions tailored to meet the unique computational demands of LLMs. The survey encompasses an in-depth analysis of architecture, performance metrics, and energy efficiency considerations, providing valuable insights for researchers, engineers, and decision-makers aiming to optimize the deployment of LLMs in real-world applications.
Main Author: | Christoforos Kachris
---|---
Format: | Article
Language: | English
Published: | MDPI AG, 2025-01-01
Series: | Applied Sciences
Subjects: | large language models; hardware accelerators; FPGAs; GPU; survey; Transformer
Online Access: | https://www.mdpi.com/2076-3417/15/2/586
_version_ | 1832589302335275008 |
---|---|
author | Christoforos Kachris |
author_facet | Christoforos Kachris |
author_sort | Christoforos Kachris |
collection | DOAJ |
description | Large language models (LLMs) have emerged as powerful tools for natural language processing tasks, revolutionizing the field with their ability to understand and generate human-like text. As the demand for more sophisticated LLMs continues to grow, there is a pressing need to address the computational challenges associated with their scale and complexity. This paper presents a comprehensive survey of hardware accelerators designed to enhance the performance and energy efficiency of large language models. By examining a diverse range of accelerators, including GPUs, FPGAs, and custom-designed architectures, we explore the landscape of hardware solutions tailored to meet the unique computational demands of LLMs. The survey encompasses an in-depth analysis of architecture, performance metrics, and energy efficiency considerations, providing valuable insights for researchers, engineers, and decision-makers aiming to optimize the deployment of LLMs in real-world applications. |
format | Article |
id | doaj-art-ff54f2e133a741929dded9d035aef12a |
institution | Kabale University |
issn | 2076-3417 |
language | English |
publishDate | 2025-01-01 |
publisher | MDPI AG |
record_format | Article |
series | Applied Sciences |
spelling | doaj-art-ff54f2e133a741929dded9d035aef12a; 2025-01-24T13:19:56Z; eng; MDPI AG; Applied Sciences; 2076-3417; 2025-01-01; vol. 15, iss. 2, art. 586; doi: 10.3390/app15020586; A Survey on Hardware Accelerators for Large Language Models; Christoforos Kachris, Department of Electrical and Electronics Engineering, University of West Attica, 12243 Egaleo, Greece; https://www.mdpi.com/2076-3417/15/2/586; large language models; hardware accelerators; FPGAs; GPU; survey; Transformer
spellingShingle | Christoforos Kachris; A Survey on Hardware Accelerators for Large Language Models; Applied Sciences; large language models; hardware accelerators; FPGAs; GPU; survey; Transformer
title | A Survey on Hardware Accelerators for Large Language Models |
title_full | A Survey on Hardware Accelerators for Large Language Models |
title_fullStr | A Survey on Hardware Accelerators for Large Language Models |
title_full_unstemmed | A Survey on Hardware Accelerators for Large Language Models |
title_short | A Survey on Hardware Accelerators for Large Language Models |
title_sort | survey on hardware accelerators for large language models |
topic | large language models; hardware accelerators; FPGAs; GPU; survey; Transformer
url | https://www.mdpi.com/2076-3417/15/2/586 |
work_keys_str_mv | AT christoforoskachris asurveyonhardwareacceleratorsforlargelanguagemodels AT christoforoskachris surveyonhardwareacceleratorsforlargelanguagemodels |