Frozen Weights as Prior for Parameter-Efficient Fine-Tuning
In natural language processing and computer vision, the emergence of large pre-trained models has made fine-tuning them for downstream tasks an important paradigm. However, full fine-tuning often comes at a hefty cost, which is infeasible for many...
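The general idea behind parameter-efficient fine-tuning can be sketched as follows: the pre-trained weights stay frozen and only a small set of extra parameters is trained. The snippet below is a generic low-rank-adapter (LoRA-style) illustration of this on a toy linear-regression task; all names, dimensions, and the training setup are illustrative assumptions, not the specific method proposed in this article.

```python
import numpy as np

# Generic sketch of parameter-efficient fine-tuning: the pre-trained
# weight matrix W0 is frozen, and only a small low-rank update B @ A
# is trained (LoRA-style). All names and sizes are illustrative; this
# is NOT the article's specific method.
rng = np.random.default_rng(0)
d_in, d_out, r, n = 16, 8, 2, 256

W0 = rng.normal(size=(d_out, d_in))            # frozen pre-trained weights
W0_init = W0.copy()                            # kept to verify W0 never changes
shift = 0.1 * rng.normal(size=(d_out, d_in))   # unknown task-specific shift
X = rng.normal(size=(n, d_in))
Y = X @ (W0 + shift).T                         # synthetic downstream targets

# Trainable adapter: d_out*r + r*d_in = 48 params vs d_out*d_in = 128
A = 0.1 * rng.normal(size=(r, d_in))
B = np.zeros((d_out, r))                       # zero init: training starts at W0

initial_loss = 0.5 * np.mean((X @ W0.T - Y) ** 2)

lr = 0.1 / n
for _ in range(1000):
    W = W0 + B @ A            # effective weights; only A and B move
    err = X @ W.T - Y         # (n, d_out) residuals
    G = err.T @ X             # dL/dW for L = 0.5 * sum(err**2)
    B -= lr * (G @ A.T)       # chain rule: dL/dB = (dL/dW) A^T
    A -= lr * (B.T @ G)       # chain rule: dL/dA = B^T (dL/dW)

final_loss = 0.5 * np.mean((X @ (W0 + B @ A).T - Y) ** 2)
```

Because the frozen base weights are never touched, the pre-trained model acts as a fixed starting point and only the 48 adapter parameters need to be stored per downstream task.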
Main Authors: Xiaolong Ma, Peishun Liu, Haojie Gao, Zikang Yan, Ningning Ma, Wenqiang Liu, Xuefang Wang, Ruichun Tang
Format: Article
Language: English
Published: IEEE, 2025-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/10840174/
Similar Items
- TIBW: Task-Independent Backdoor Watermarking with Fine-Tuning Resilience for Pre-Trained Language Models
  by: Weichuan Mo, et al.
  Published: (2025-01-01)
- A scientific-article key-insight extraction system based on multi-actor of fine-tuned open-source large language models
  by: Zihan Song, et al.
  Published: (2025-01-01)
- Fine-tuning a local LLaMA-3 large language model for automated privacy-preserving physician letter generation in radiation oncology
  by: Yihao Hou, et al.
  Published: (2025-01-01)
- Innovative Design of the Carbon-free Car Fine-tuning Mechanism based on the Trajectory Analysis Method
  by: Liu Yang, et al.
  Published: (2015-01-01)
- Classification of Artificial-Intelligence-Generated Images Using the Fine-Tuning Method on a Residual Network
  by: Sulthan Abiyyu Hakim, et al.
  Published: (2024-07-01)