Contrastive Learning Pre-Training and Quantum Theory for Cross-Lingual Aspect-Based Sentiment Analysis

Bibliographic Details
Main Authors: Xun Li, Kun Zhang
Format: Article
Language: English
Published: MDPI AG, 2025-07-01
Series: Entropy
Online Access: https://www.mdpi.com/1099-4300/27/7/713
Description
Summary: The cross-lingual aspect-based sentiment analysis (ABSA) task remains a significant challenge: a classifier is trained on high-resource source languages and then applied to texts in low-resource target languages, bridging linguistic gaps while preserving accuracy. Most existing methods achieve strong performance by relying on multilingual pre-trained language models (mPLMs) and translation systems to transfer knowledge across languages. However, little attention has been paid to factors beyond semantic similarity, which ultimately hinders classification performance in target languages. To address this challenge, we propose CLQT, a novel framework that combines contrastive learning pre-training with quantum theory for the cross-lingual ABSA task. First, we develop a contrastive learning strategy to align data between the source and target languages. Subsequently, we incorporate a quantum network that employs quantum projection and quantum entanglement to facilitate effective knowledge transfer across languages. Extensive experiments show that the CLQT framework achieves strong results and has a beneficial overall influence on the cross-lingual ABSA task.
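The abstract does not give the exact training objective, but a contrastive strategy that "aligns data between the source and target languages" is commonly realized as an InfoNCE-style loss over translation pairs encoded by an mPLM. The following is a minimal sketch under that assumption; the temperature value, symmetric formulation, and use of sentence-level embeddings are illustrative choices, not details confirmed by the paper.

```python
import torch
import torch.nn.functional as F

def contrastive_alignment_loss(src_emb: torch.Tensor,
                               tgt_emb: torch.Tensor,
                               temperature: float = 0.07) -> torch.Tensor:
    """InfoNCE-style loss: pull each source sentence toward its translation,
    push it away from the other target sentences in the batch."""
    src = F.normalize(src_emb, dim=-1)           # (B, d) unit vectors
    tgt = F.normalize(tgt_emb, dim=-1)           # (B, d)
    logits = src @ tgt.T / temperature           # (B, B) cosine similarities
    labels = torch.arange(src.size(0), device=src.device)  # i-th source pairs with i-th target
    # Symmetric loss over source->target and target->source directions.
    return 0.5 * (F.cross_entropy(logits, labels) +
                  F.cross_entropy(logits.T, labels))

# Toy usage: random vectors standing in for mPLM sentence embeddings.
if __name__ == "__main__":
    src = torch.randn(8, 768)   # source-language sentence embeddings
    tgt = torch.randn(8, 768)   # their target-language translations
    print(contrastive_alignment_loss(src, tgt).item())
```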
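Likewise, the abstract names "quantum projection" without specifying the layer. In quantum-inspired NLP, projection is often modeled by building a density matrix from normalized token states and measuring it with learnable projectors; the sketch below follows that common construction purely as an assumption, and the actual CLQT layers may differ.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class QuantumProjection(nn.Module):
    """Quantum-inspired projection (hypothetical): a sentence is a mixed state
    rho = sum_i p_i |w_i><w_i| over token states, measured by learnable
    projectors |m_k><m_k|."""

    def __init__(self, dim: int, n_measurements: int):
        super().__init__()
        # Learnable measurement states |m_k>.
        self.measure_states = nn.Parameter(torch.randn(n_measurements, dim))

    def forward(self, token_emb: torch.Tensor, weights: torch.Tensor) -> torch.Tensor:
        # token_emb: (B, L, d) token embeddings; weights: (B, L) importance scores.
        states = F.normalize(token_emb, dim=-1)   # unit-norm token states |w_i>
        p = F.softmax(weights, dim=-1)            # mixture probabilities p_i
        # Density matrix rho[b] = sum_l p[b,l] * |w_l><w_l|  ->  (B, d, d)
        rho = torch.einsum("bl,bli,blj->bij", p, states, states)
        m = F.normalize(self.measure_states, dim=-1)  # (K, d)
        # Measurement outcome tr(|m_k><m_k| rho) = <m_k| rho |m_k>  ->  (B, K)
        return torch.einsum("ki,bij,kj->bk", m, rho, m)

if __name__ == "__main__":
    layer = QuantumProjection(dim=64, n_measurements=16)
    tokens = torch.randn(2, 10, 64)     # two sentences, ten tokens each
    scores = torch.randn(2, 10)         # unnormalized token importance
    print(layer(tokens, scores).shape)  # torch.Size([2, 16])
```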
ISSN:1099-4300