Deep transfer learning mechanism for fine-grained cross-domain sentiment classification
| Main Authors: | |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Taylor & Francis Group, 2021-10-01 |
| Series: | Connection Science |
| Subjects: | |
| Online Access: | http://dx.doi.org/10.1080/09540091.2021.1912711 |
| Summary: | The goal of cross-domain sentiment classification is to utilise useful information in the source domain to help classify sentiment polarity in the target domain, which contains a large amount of unlabelled data. Most existing methods focus on extracting the invariant features between the two domains, but they cannot make full use of the unlabelled data in the target domain. To solve this problem, we present a deep transfer learning mechanism (DTLM) for fine-grained cross-domain sentiment classification. DTLM provides a transfer mechanism that better transfers sentiment across domains by incorporating BERT (Bidirectional Encoder Representations from Transformers) and KL (Kullback-Leibler) divergence. We introduce BERT as a feature encoder to map the text data of different domains into a shared feature space. Then, we design a domain-adaptive model using KL divergence to reduce the difference in feature distributions between the source and target domains. In addition, we introduce entropy minimisation and consistency regularisation to process unlabelled samples in the target domain. Extensive experiments on datasets from YelpAspect, SemEval 2014 task 4 and Twitter demonstrate the effectiveness of the proposed method and provide a better approach to cross-domain sentiment classification. (A minimal loss-level sketch of these components follows this record.) |
| ISSN: | 0954-0091, 1360-0494 |
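The summary names three trainable transfer components: a KL-divergence term that aligns source and target feature distributions, entropy minimisation on target predictions, and consistency regularisation across perturbed views of unlabelled target samples. The sketch below illustrates one plausible way to combine these losses in PyTorch. It is a hedged illustration only: the function names, the batch-mean KL formulation, and the loss weights (`lam_kl`, `lam_ent`, `lam_con`) are assumptions rather than the paper's actual definitions, and a plain linear head over precomputed features stands in for the full BERT encoder.

```python
# A minimal sketch of a DTLM-style loss, assuming the formulations below;
# the paper's exact definitions may differ.
import torch
import torch.nn.functional as F
from torch import nn


class SentimentHead(nn.Module):
    """Linear classifier over encoder features (stand-in for BERT + head)."""

    def __init__(self, hidden: int = 768, num_classes: int = 3):
        super().__init__()
        self.fc = nn.Linear(hidden, num_classes)

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        return self.fc(feats)


def kl_domain_loss(src_feats: torch.Tensor, tgt_feats: torch.Tensor) -> torch.Tensor:
    # Assumed formulation: turn the batch-mean feature vector of each domain
    # into a distribution and penalise their KL divergence, pulling the two
    # domains toward a shared feature space.
    log_p = F.log_softmax(src_feats.mean(dim=0), dim=-1)
    q = F.softmax(tgt_feats.mean(dim=0), dim=-1)
    return F.kl_div(log_p, q, reduction="sum")  # KL(q || p)


def entropy_loss(logits: torch.Tensor) -> torch.Tensor:
    # Entropy minimisation: push predictions on unlabelled target samples
    # toward confident (low-entropy) class distributions.
    probs = F.softmax(logits, dim=-1)
    return -(probs * F.log_softmax(logits, dim=-1)).sum(dim=-1).mean()


def consistency_loss(logits_a: torch.Tensor, logits_b: torch.Tensor) -> torch.Tensor:
    # Consistency regularisation: two stochastic views of the same unlabelled
    # sample (e.g. different dropout masks) should yield similar predictions.
    return F.mse_loss(F.softmax(logits_a, dim=-1), F.softmax(logits_b, dim=-1))


def dtlm_loss(head, src_feats, src_labels, tgt_feats_a, tgt_feats_b,
              lam_kl=1.0, lam_ent=0.1, lam_con=1.0):
    """Supervised source loss plus the three transfer terms (weights assumed)."""
    sup = F.cross_entropy(head(src_feats), src_labels)
    logits_a, logits_b = head(tgt_feats_a), head(tgt_feats_b)
    return (sup
            + lam_kl * kl_domain_loss(src_feats, tgt_feats_a)
            + lam_ent * entropy_loss(logits_a)
            + lam_con * consistency_loss(logits_a, logits_b))


if __name__ == "__main__":
    # Random tensors stand in for BERT [CLS] features of two mini-batches.
    head = SentimentHead()
    src = torch.randn(8, 768)
    tgt = torch.randn(8, 768)
    labels = torch.randint(0, 3, (8,))
    loss = dtlm_loss(head, src, labels, tgt, tgt + 0.01 * torch.randn_like(tgt))
    loss.backward()
    print(f"total loss: {loss.item():.4f}")
```

In a real pipeline the two target views would most likely come from the same sentences passed twice through BERT with dropout active (or through light text augmentation), and the loss weights would be tuned per source-target domain pair.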