Learning With Partial-Label and Unlabeled Data: Contrastive With Negative Example Separation
| Main Authors: | , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2025-01-01 |
| Series: | IEEE Access |
| Subjects: | |
| Online Access: | https://ieeexplore.ieee.org/document/11098874/ |
| Summary: | Semi-Supervised Partial Label Learning (SSPLL) is an important branch of weakly supervised learning in which the data consists of both partial-label examples and unlabeled ones. In SSPLL, the presence of unlabeled examples makes it challenging to train a model with good generalization ability. To tackle this challenge, we propose a new method, Learning with Partial-Label and Unlabeled Data: Contrastive with Negative Example Separation (LPU-CNES), which leverages the power of contrastive learning to extract high-level semantic representations in weakly supervised and unsupervised scenarios. Specifically, LPU-CNES first transforms unlabeled examples into partial-label ones by assigning them pseudo candidate label sets, then introduces negative example separation to construct the contrastive loss, and finally trains the model by minimizing the sum of the contrastive loss, a regularization loss, and classic PLL classification losses. We demonstrate the effectiveness of our method in terms of classification accuracy across multiple benchmarks. |
|---|---|
| ISSN: | 2169-3536 |
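The negative example separation idea from the abstract could be sketched roughly as follows. This is an illustrative interpretation, not the paper's actual formulation: the function name `cnes_contrastive_loss`, the rule that a "separated negative" is an example whose candidate label set is disjoint from the anchor's (so the two cannot share the true label), and the choice of positives (examples with an identical candidate set) are all assumptions made here for the sketch.

```python
import numpy as np

def cnes_contrastive_loss(z, candidate_sets, tau=0.5):
    """Sketch of a contrastive loss with negative example separation.

    z: (n, d) array of L2-normalized embeddings.
    candidate_sets: list of n sets of candidate labels (pseudo candidate
        sets for examples that were originally unlabeled).

    Assumption (not the paper's exact rule): example j is a reliable,
    "separated" negative for anchor i only when their candidate sets are
    disjoint; only such pairs enter the denominator as negatives.
    Positives are taken to be other examples with the same candidate set.
    """
    n = z.shape[0]
    sim = z @ z.T / tau  # pairwise cosine similarities (z is normalized)
    losses = []
    for i in range(n):
        pos = [j for j in range(n)
               if j != i and candidate_sets[j] == candidate_sets[i]]
        neg = [j for j in range(n)
               if candidate_sets[j].isdisjoint(candidate_sets[i])]
        if not pos or not neg:
            continue  # no usable pairs for this anchor
        denom = np.exp(sim[i, pos]).sum() + np.exp(sim[i, neg]).sum()
        # InfoNCE-style term averaged over the anchor's positives
        losses.append(-np.log(np.exp(sim[i, pos]) / denom).mean())
    return float(np.mean(losses)) if losses else 0.0
```

In a full training loop this term would be summed with the regularization and PLL classification losses mentioned in the abstract; filtering negatives by candidate-set disjointness is one way to avoid treating a possible same-class example as a negative.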