Large-Language-Model-Enabled Text Semantic Communication Systems
Large language models (LLMs) have recently demonstrated state-of-the-art performance on various natural language processing (NLP) tasks, achieving near-human levels in multiple language understanding challenges and aligning closely with the core principles of semantic communication. Inspired by LLMs’...
Saved in:
| Main Authors: | Zhenyi Wang, Li Zou, Shengyun Wei, Kai Li, Feifan Liao, Haibo Mi, Rongxuan Lai |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2025-06-01 |
| Series: | Applied Sciences |
| Subjects: | |
| Online Access: | https://www.mdpi.com/2076-3417/15/13/7227 |
Similar Items
- Lightweight joint source-channel coding method based on MIMO-CSI prediction
  by: YU Chuangyu, et al. Published: (2025-06-01)
- Importance-Aware Resource Allocations for MIMO Semantic Communication
  by: Yue Cao, et al. Published: (2025-06-01)
- A Deep Variational Approach to Multiterminal Joint Source-Channel Coding Based on Information Bottleneck Principle
  by: Shayan Hassanpour, et al. Published: (2025-01-01)
- JSCC-Aided INR for High-Frequency Detail Preservation in LiDAR
  by: Akihiro Kuwabara, et al. Published: (2025-01-01)
- Next-Gen Decoding: Non-Binary LDPC Algorithms for Emerging Power Line and Visible Light Communications
  by: Waheed Ullah, et al. Published: (2025-01-01)