Entity Pair Relation Classification Based on Contrastive Learning and Biaffine Model

In natural language processing, the biaffine model effectively captures sentence structure and word relationships for tasks like text classification and relation extraction. However, it struggles with entity pair relation classification, particularly in overlapping or complex scenarios. To addre...


Bibliographic Details
Main Authors: Songhua Hu, Ziming Zhang, Hengxin Wang, Lihui Jiang
Format: Article
Language: English
Published: IEEE 2025-01-01
Series: IEEE Access
Subjects:
Online Access: https://ieeexplore.ieee.org/document/11095676/
_version_ 1850075412648951808
author Songhua Hu
Ziming Zhang
Hengxin Wang
Lihui Jiang
author_facet Songhua Hu
Ziming Zhang
Hengxin Wang
Lihui Jiang
author_sort Songhua Hu
collection DOAJ
description In natural language processing, the biaffine model effectively captures sentence structure and word relationships for tasks like text classification and relation extraction. However, it struggles with entity pair relation classification, particularly in overlapping or complex scenarios. To address this, this paper proposes BERT-CL-Biaffine, an improved relation classification model integrating bidirectional entity contrastive learning and a global pointer network. The model enhances the biaffine architecture by training it to identify entity boundaries and leveraging contrastive learning to strengthen semantic associations between overlapping entity pairs. Experiments on the NYT and WebNLG datasets demonstrate that BERT-CL-Biaffine outperforms baseline models, achieving F1 score improvements of 1% and 1.2%, respectively. The model excels in classifying overlapping entity pairs and handles challenges like imbalanced relation types and ambiguous entity features, particularly in complex scenarios. The results validate that bidirectional entity contrastive learning and global pointer networks significantly enhance the biaffine model's feature representation and classification performance. This approach offers a robust solution for relation extraction in intricate textual contexts.
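The biaffine scoring mechanism named in the abstract is a standard construction from dependency parsing and span-based extraction: an ordered pair of encodings is scored with a bilinear term plus a linear term over their concatenation. A minimal pure-Python sketch follows; the toy dimensions, variable names, and parameter values are illustrative assumptions, not taken from the paper.

```python
# Biaffine pair scorer: s(h_i, h_j) = h_i^T U h_j + w . [h_i; h_j] + b
# Pure-Python sketch with toy dimensions; a real model would use a tensor
# library and one U per relation type.

def biaffine_score(h_i, h_j, U, w, b):
    """Score an ordered entity pair from their d-dimensional encodings."""
    d = len(h_i)
    # Bilinear term: captures head-tail interactions via the d x d matrix U
    bilinear = sum(h_i[a] * U[a][c] * h_j[c]
                   for a in range(d) for c in range(d))
    # Linear term over the concatenation [h_i; h_j] (w has length 2d)
    linear = sum(wk * hk for wk, hk in zip(w, h_i + h_j))
    return bilinear + linear + b

# Toy example with d = 2
h_head = [1.0, 0.0]
h_tail = [0.0, 1.0]
U = [[0.5, 2.0],
     [0.0, 0.5]]          # U[0][1] rewards this particular (head, tail) pairing
w = [0.1, 0.1, 0.1, 0.1]
b = 0.0

score = biaffine_score(h_head, h_tail, U, w, b)
print(score)  # bilinear 2.0 + linear 0.2 -> 2.2
```

Note that the score is asymmetric in its arguments (swapping head and tail changes the bilinear term unless U is symmetric), which is what lets a biaffine layer distinguish the direction of a relation.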
format Article
id doaj-art-4bc51895e88f4ac0886dc39f047e09d2
institution DOAJ
issn 2169-3536
language English
publishDate 2025-01-01
publisher IEEE
record_format Article
series IEEE Access
spelling doaj-art-4bc51895e88f4ac0886dc39f047e09d2 2025-08-20T02:46:19Z eng IEEE IEEE Access 2169-3536 2025-01-01 Vol. 13, pp. 131289-131302 doi: 10.1109/ACCESS.2025.3592203 11095676
Entity Pair Relation Classification Based on Contrastive Learning and Biaffine Model
Songhua Hu (https://orcid.org/0000-0002-7071-1356), Ziming Zhang (https://orcid.org/0009-0006-0172-1311), Hengxin Wang, Lihui Jiang (https://orcid.org/0000-0003-4770-2459)
School of Artificial Intelligence and Big Data, Hefei University, Hefei, China (all authors)
In natural language processing, the biaffine model effectively captures sentence structure and word relationships for tasks like text classification and relation extraction. However, it struggles with entity pair relation classification, particularly in overlapping or complex scenarios. To address this, this paper proposes BERT-CL-Biaffine, an improved relation classification model integrating bidirectional entity contrastive learning and a global pointer network. The model enhances the biaffine architecture by training it to identify entity boundaries and leveraging contrastive learning to strengthen semantic associations between overlapping entity pairs. Experiments on the NYT and WebNLG datasets demonstrate that BERT-CL-Biaffine outperforms baseline models, achieving F1 score improvements of 1% and 1.2%, respectively. The model excels in classifying overlapping entity pairs and handles challenges like imbalanced relation types and ambiguous entity features, particularly in complex scenarios. The results validate that bidirectional entity contrastive learning and global pointer networks significantly enhance the biaffine model's feature representation and classification performance. This approach offers a robust solution for relation extraction in intricate textual contexts.
https://ieeexplore.ieee.org/document/11095676/
Entity pair relation classification; contrastive learning; natural language processing; text classification
spellingShingle Songhua Hu
Ziming Zhang
Hengxin Wang
Lihui Jiang
Entity Pair Relation Classification Based on Contrastive Learning and Biaffine Model
IEEE Access
Entity pair relation classification
contrastive learning
natural language processing
text classification
title Entity Pair Relation Classification Based on Contrastive Learning and Biaffine Model
title_full Entity Pair Relation Classification Based on Contrastive Learning and Biaffine Model
title_fullStr Entity Pair Relation Classification Based on Contrastive Learning and Biaffine Model
title_full_unstemmed Entity Pair Relation Classification Based on Contrastive Learning and Biaffine Model
title_short Entity Pair Relation Classification Based on Contrastive Learning and Biaffine Model
title_sort entity pair relation classification based on contrastive learning and biaffine model
topic Entity pair relation classification
contrastive learning
natural language processing
text classification
url https://ieeexplore.ieee.org/document/11095676/
work_keys_str_mv AT songhuahu entitypairrelationclassificationbasedoncontrastivelearningandbiaffinemodel
AT zimingzhang entitypairrelationclassificationbasedoncontrastivelearningandbiaffinemodel
AT hengxinwang entitypairrelationclassificationbasedoncontrastivelearningandbiaffinemodel
AT lihuijiang entitypairrelationclassificationbasedoncontrastivelearningandbiaffinemodel