BERT-Residual Quantum Language Model Inspired by ODE Multi-Step Method
Quantum-inspired language models capture finer-grained semantic interactions in higher-order Hilbert spaces. However, previous methods usually build semantic features on context-free word vectors such as Word2Vec and GloVe. Building on contextual natural language encoding, quantum-inspired density matrix modeling can capture more fine-grained semantic interactions; however, when applied to large pre-trained language models such as BERT, quantum density matrices often cause gradient explosion or vanishing. How to effectively integrate the quantum-inspired language model with the pre-trained model, so that both function under the pre-trained model's fine-tuning paradigm, has therefore become a key issue for the further development of quantum-inspired language models. In this paper, we propose the BERT-Residual quantum language model, inspired by the multi-step method for ordinary differential equations (ODEs). A density matrix captures the high-order semantic interaction features missing from BERT's modeling process and yields a sentence representation, to which the first residual step is applied. Quantum measurement is then performed on the sentence representation, and a second residual connection is made with the BERT layer. This multi-step residual connection combines the strengths of BERT representations and quantum density matrix representations to enhance representation learning. Experiments on text classification benchmarks show that the proposed method generally surpasses baseline models.
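The abstract describes a pipeline in which a density matrix is built from token states, residually combined with the BERT representation, then read out via quantum measurement for a second residual step. The following is a minimal NumPy sketch of those ingredients; the function names, the uniform mixing weights, the standard-basis projectors, and the exact residual wiring are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def density_matrix(word_vecs, weights):
    """Mixed-state density matrix rho = sum_i p_i |w_i><w_i|
    built from L2-normalized word vectors."""
    p = np.asarray(weights, dtype=float)
    p = p / p.sum()                          # mixing probabilities sum to 1
    rho = np.zeros((word_vecs.shape[1], word_vecs.shape[1]))
    for pi, w in zip(p, word_vecs):
        w = w / np.linalg.norm(w)            # unit-norm state vector
        rho += pi * np.outer(w, w)
    return rho

def measure(rho, projectors):
    """Quantum measurement: outcome_k = tr(rho @ P_k) for projectors P_k."""
    return np.array([np.trace(rho @ P) for P in projectors])

# Toy stand-in for a BERT token matrix (n_tokens x hidden_dim).
rng = np.random.default_rng(0)
n_tokens, dim = 4, 8
h_bert = rng.normal(size=(n_tokens, dim))
h_pooled = h_bert.mean(axis=0)               # pooled BERT sentence vector

rho = density_matrix(h_bert, np.ones(n_tokens))   # sentence density matrix
sent = h_pooled + rho @ h_pooled             # first residual step (sketch)
projectors = [np.outer(e, e) for e in np.eye(dim)]  # standard-basis readout
m = measure(rho, projectors)                 # measurement outcomes
out = h_pooled + sent + m                    # second residual step (sketch)
```

By construction `rho` is symmetric with unit trace, and with a complete set of rank-1 projectors the measurement outcomes sum to tr(rho) = 1, which is what makes the measured vector a probability-like feature that can be residually added back to the BERT representation.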
Main Authors: Shaohui Liang, Yingkui Wang, Shuxin Chen
Format: Article
Language: English
Published: IEEE, 2025-01-01
Series: IEEE Access
Subjects: Pre-trained models; quantum language models; residual connection
Online Access: https://ieeexplore.ieee.org/document/10852213/
author | Shaohui Liang; Yingkui Wang; Shuxin Chen |
collection | DOAJ |
description | Quantum-inspired language models capture finer-grained semantic interactions in higher-order Hilbert spaces. However, previous methods usually build semantic features on context-free word vectors such as Word2Vec and GloVe. Building on contextual natural language encoding, quantum-inspired density matrix modeling can capture more fine-grained semantic interactions; however, when applied to large pre-trained language models such as BERT, quantum density matrices often cause gradient explosion or vanishing. How to effectively integrate the quantum-inspired language model with the pre-trained model, so that both function under the pre-trained model's fine-tuning paradigm, has therefore become a key issue for the further development of quantum-inspired language models. In this paper, we propose the BERT-Residual quantum language model, inspired by the multi-step method for ordinary differential equations (ODEs). A density matrix captures the high-order semantic interaction features missing from BERT's modeling process and yields a sentence representation, to which the first residual step is applied. Quantum measurement is then performed on the sentence representation, and a second residual connection is made with the BERT layer. This multi-step residual connection combines the strengths of BERT representations and quantum density matrix representations to enhance representation learning. Experiments on text classification benchmarks show that the proposed method generally surpasses baseline models. |
id | doaj-art-db5ec8913338486a88a50a6d0abaa31b |
institution | Kabale University |
issn | 2169-3536 |
citation | IEEE Access, vol. 13, pp. 19320-19328, 2025-01-01. DOI: 10.1109/ACCESS.2025.3533554 (IEEE document 10852213). Authors: Shaohui Liang (https://orcid.org/0009-0009-3088-9967), Library, Tianjin Renai College, Tianjin, China; Yingkui Wang (https://orcid.org/0000-0002-2382-880X), School of Intelligent Computing Engineering, Tianjin Renai College, Tianjin, China; Shuxin Chen (https://orcid.org/0009-0009-1611-2481), School of Intelligent Computing Engineering, Tianjin Renai College, Tianjin, China. |
title | BERT-Residual Quantum Language Model Inspired by ODE Multi-Step Method |
topic | Pre-trained models; quantum language models; residual connection |
url | https://ieeexplore.ieee.org/document/10852213/ |