Entity Pair Relation Classification Based on Contrastive Learning and Biaffine Model
| Main Authors: | , , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2025-01-01 |
| Series: | IEEE Access |
| Subjects: | |
| Online Access: | https://ieeexplore.ieee.org/document/11095676/ |
| Summary: | In natural language processing, the biaffine model effectively captures sentence structure and word relationships for tasks such as text classification and relation extraction. However, it struggles with entity pair relation classification, particularly in overlapping or complex scenarios. To address this limitation, this paper proposes BERT-CL-Biaffine, an improved relation classification model that integrates bidirectional entity contrastive learning and a global pointer network. The model enhances the biaffine architecture by training it to identify entity boundaries and by leveraging contrastive learning to strengthen semantic associations between overlapping entity pairs. Experiments on the NYT and WebNLG datasets demonstrate that BERT-CL-Biaffine outperforms baseline models, with F1 score improvements of 1% and 1.2%, respectively. The model excels at classifying overlapping entity pairs and handles challenges such as imbalanced relation types and ambiguous entity features, particularly in complex scenarios. The results validate that bidirectional entity contrastive learning and global pointer networks significantly enhance the biaffine model's feature representation and classification performance, offering a robust solution for relation extraction in intricate textual contexts. |
|---|---|
| ISSN: | 2169-3536 |
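The summary describes a biaffine architecture for scoring relations between entity pairs. The paper's actual BERT-CL-Biaffine implementation is not reproduced here; the following is a minimal, self-contained sketch of a generic biaffine scorer, in which the shapes, parameter names, and random inputs are illustrative assumptions only.

```python
# Hypothetical sketch of a biaffine relation scorer: given head and tail
# entity representations h and t, score each of r relation types with a
# bilinear term plus a linear term over the concatenated pair.
import numpy as np

def biaffine_score(h, t, U, W, b):
    """Score each relation type for one (head, tail) entity pair.

    h, t : (d,) head and tail entity representations
    U    : (r, d, d) bilinear weights, one matrix per relation type
    W    : (r, 2*d) linear weights over the concatenated pair [h; t]
    b    : (r,) per-relation bias
    Returns an (r,) vector of unnormalized relation scores.
    """
    bilinear = np.einsum('rij,i,j->r', U, h, t)  # h^T U_r t for each relation r
    linear = W @ np.concatenate([h, t])          # W_r [h; t]
    return bilinear + linear + b

# Toy usage with random parameters (d-dim entities, r relation types).
rng = np.random.default_rng(0)
d, r = 4, 3
h, t = rng.normal(size=d), rng.normal(size=d)
scores = biaffine_score(h, t, rng.normal(size=(r, d, d)),
                        rng.normal(size=(r, 2 * d)), rng.normal(size=r))
print(scores.shape)  # one score per relation type: (3,)
```

In practice the highest-scoring relation (after a softmax) would be assigned to the entity pair; the contrastive-learning and global-pointer components the paper adds are separate training objectives not shown in this sketch.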