Learning With Partial-Label and Unlabeled Data: Contrastive With Negative Example Separation
Semi-Supervised Partial Label Learning (SSPLL) is an important branch of weakly supervised learning in which the data consists of both partial-label examples and unlabeled ones. In SSPLL, the presence of unlabeled examples makes it challenging to train a model with good generalization ability. To tackle this challenge, we propose a new method termed Learning with Partial-Label and Unlabeled Data: Contrastive with Negative Example Separation (LPU-CNES), which leverages the power of contrastive learning to extract high-level semantic representations in weakly supervised and unsupervised scenarios. Specifically, LPU-CNES first transforms unlabeled examples into partial-label ones by assigning them pseudo candidate label sets, then introduces negative example separation to construct the contrastive loss, and finally trains the model by minimizing the sum of the contrastive loss, a regularization loss, and classic PLL classification losses. We demonstrate the effectiveness of our method in terms of classification accuracy across multiple benchmarks.
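The abstract describes a three-part training objective: a classic PLL classification loss over candidate label sets, a contrastive loss in which only examples separated out as negatives contribute, and a regularization term. The PyTorch sketch below illustrates that structure only; the specific loss formulations, the negative-separation mask, the function names, and the loss weights are illustrative assumptions and are not taken from the paper.

```python
import torch
import torch.nn.functional as F


def pll_classification_loss(logits, candidate_mask):
    """Classic PLL-style loss: concentrate probability mass on the candidate label set."""
    probs = F.softmax(logits, dim=1)
    candidate_prob = (probs * candidate_mask).sum(dim=1).clamp_min(1e-8)
    return -candidate_prob.log().mean()


def contrastive_loss_negative_separation(z_anchor, z_positive, z_bank, neg_mask, temperature=0.5):
    """InfoNCE-style loss where neg_mask keeps only the examples separated out as negatives."""
    z_anchor = F.normalize(z_anchor, dim=1)
    z_positive = F.normalize(z_positive, dim=1)
    z_bank = F.normalize(z_bank, dim=1)
    pos_sim = (z_anchor * z_positive).sum(dim=1, keepdim=True) / temperature  # (N, 1)
    neg_sim = z_anchor @ z_bank.t() / temperature                             # (N, M)
    neg_sim = neg_sim.masked_fill(~neg_mask, float("-inf"))                   # drop non-negatives
    logits = torch.cat([pos_sim, neg_sim], dim=1)                             # positive sits at index 0
    targets = torch.zeros(z_anchor.size(0), dtype=torch.long)
    return F.cross_entropy(logits, targets)


# Toy tensors: 8 examples, 10 classes, 32-dim embeddings, a bank of 16 potential negatives.
# In practice the embeddings would come from an encoder over augmented views.
logits = torch.randn(8, 10, requires_grad=True)
candidate_mask = (torch.rand(8, 10) > 0.5).float()
candidate_mask[torch.arange(8), torch.randint(0, 10, (8,))] = 1.0  # guarantee non-empty candidate sets
z_anchor, z_positive, z_bank = torch.randn(8, 32), torch.randn(8, 32), torch.randn(16, 32)
neg_mask = torch.rand(8, 16) > 0.3                                 # hypothetical negative-separation mask

total_loss = (pll_classification_loss(logits, candidate_mask)
              + contrastive_loss_negative_separation(z_anchor, z_positive, z_bank, neg_mask)
              + 1e-4 * logits.pow(2).mean())                       # placeholder regularization term
total_loss.backward()
```

How negatives are actually separated is specific to LPU-CNES and is not reproduced here; the random mask above serves only to show how such a mask would enter the contrastive term.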
| Main Authors: | Bangfa Jiang, Chengkun Liu, Jing Chai |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2025-01-01 |
| Series: | IEEE Access |
| Subjects: | Partial label learning; semi-supervised learning; semi-supervised partial label learning; negative example separation |
| Online Access: | https://ieeexplore.ieee.org/document/11098874/ |
| _version_ | 1849772011444764672 |
|---|---|
| author | Bangfa Jiang; Chengkun Liu; Jing Chai |
| author_sort | Bangfa Jiang |
| collection | DOAJ |
| description | Semi-Supervised Partial Label Learning (SSPLL) is an important branch of weakly supervised learning in which the data consists of both partial-label examples and unlabeled ones. In SSPLL, the presence of unlabeled examples makes it challenging to train a model with good generalization ability. To tackle this challenge, we propose a new method termed Learning with Partial-Label and Unlabeled Data: Contrastive with Negative Example Separation (LPU-CNES), which leverages the power of contrastive learning to extract high-level semantic representations in weakly supervised and unsupervised scenarios. Specifically, LPU-CNES first transforms unlabeled examples into partial-label ones by assigning them pseudo candidate label sets, then introduces negative example separation to construct the contrastive loss, and finally trains the model by minimizing the sum of the contrastive loss, a regularization loss, and classic PLL classification losses. We demonstrate the effectiveness of our method in terms of classification accuracy across multiple benchmarks. |
| format | Article |
| id | doaj-art-0191b7d799684346abfa0614cf89f1f2 |
| institution | DOAJ |
| issn | 2169-3536 |
| language | English |
| publishDate | 2025-01-01 |
| publisher | IEEE |
| record_format | Article |
| series | IEEE Access |
| spelling | DOAJ record doaj-art-0191b7d799684346abfa0614cf89f1f2, indexed 2025-08-20T03:02:26Z. Bangfa Jiang, Chengkun Liu, and Jing Chai (https://orcid.org/0000-0001-6203-2874), School of Information Science and Engineering, Yunnan University, Kunming, China. "Learning With Partial-Label and Unlabeled Data: Contrastive With Negative Example Separation," IEEE Access, vol. 13, pp. 134497-134505, 2025-01-01, doi: 10.1109/ACCESS.2025.3593642, article 11098874, ISSN 2169-3536. Online access: https://ieeexplore.ieee.org/document/11098874/ |
| title | Learning With Partial-Label and Unlabeled Data: Contrastive With Negative Example Separation |
| topic | Partial label learning; semi-supervised learning; semi-supervised partial label learning; negative example separation |
| url | https://ieeexplore.ieee.org/document/11098874/ |