Method for Knowledge Transfer via Multi-Task Semi-Supervised Self-Paced
Adequate labeled data is essential for learning a reliable and generalizable model in many machine learning tasks. However, labeled data is becoming scarce and costly to obtain, which has spurred consistent interest in knowledge transfer techniques. Semi-supervised and multi-task learning are therefore combined to alleviate this challenge, but the complexity of each task must also be considered. To achieve more effective knowledge transfer with limited labeled data, we propose a unified multi-task semi-supervised self-paced learning (MSSP) scheme in this paper. MSSP naturally integrates the common structures shared by multiple related tasks with the manifold structure regularized by unlabeled data, enabling the knowledge transferred from the feature space and the instance space to complement and constrain each other. This leads to faster and more accurate searches in the underlying hypothesis space. We adopt the alternating convex search (ACS) method to solve MSSP: each iteration first trains the prediction model on a fixed set of labeled instances and then updates the labeled training set by adding more complex instances. With the aid of a self-controlled learning pace, a more robust and globally optimal model can be gradually constructed. Experimental results on several benchmark datasets show that our method achieves a performance gain of 3%-15% in classification accuracy over baseline algorithms, along with significant advantages in convergence speed.
| Main Authors: | Yao Zhao, Hongying Liu, Huaxian Pan, Zhen Song, Chunting Liu, Anni Wei, Baoshuang Zhang, Wei Lu |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2025-01-01 |
| Series: | IEEE Access |
| Subjects: | Multi-task learning; self-paced learning; semi-supervised learning; alternating convex search |
| Online Access: | https://ieeexplore.ieee.org/document/11017642/ |
| _version_ | 1850219269394006016 |
|---|---|
| author | Yao Zhao, Hongying Liu, Huaxian Pan, Zhen Song, Chunting Liu, Anni Wei, Baoshuang Zhang, Wei Lu |
| author_facet | Yao Zhao, Hongying Liu, Huaxian Pan, Zhen Song, Chunting Liu, Anni Wei, Baoshuang Zhang, Wei Lu |
| author_sort | Yao Zhao |
| collection | DOAJ |
| description | Adequate labeled data is essential for learning a reliable and generalizable model in many machine learning tasks. However, labeled data is becoming scarce and costly to obtain, which has spurred consistent interest in knowledge transfer techniques. Semi-supervised and multi-task learning are therefore combined to alleviate this challenge, but the complexity of each task must also be considered. To achieve more effective knowledge transfer with limited labeled data, we propose a unified multi-task semi-supervised self-paced learning (MSSP) scheme in this paper. MSSP naturally integrates the common structures shared by multiple related tasks with the manifold structure regularized by unlabeled data, enabling the knowledge transferred from the feature space and the instance space to complement and constrain each other. This leads to faster and more accurate searches in the underlying hypothesis space. We adopt the alternating convex search (ACS) method to solve MSSP: each iteration first trains the prediction model on a fixed set of labeled instances and then updates the labeled training set by adding more complex instances. With the aid of a self-controlled learning pace, a more robust and globally optimal model can be gradually constructed. Experimental results on several benchmark datasets show that our method achieves a performance gain of 3%-15% in classification accuracy over baseline algorithms, along with significant advantages in convergence speed. |
| format | Article |
| id | doaj-art-99471709e7964bda9ec196cf8a9ef361 |
| institution | OA Journals |
| issn | 2169-3536 |
| language | English |
| publishDate | 2025-01-01 |
| publisher | IEEE |
| record_format | Article |
| series | IEEE Access |
| spelling | doaj-art-99471709e7964bda9ec196cf8a9ef361; 2025-08-20T02:07:26Z; eng; IEEE; IEEE Access; ISSN 2169-3536; published 2025-01-01, vol. 13, pp. 101405-101414; DOI 10.1109/ACCESS.2025.3574982; IEEE article 11017642. Title: Method for Knowledge Transfer via Multi-Task Semi-Supervised Self-Paced. Authors: Yao Zhao (ORCID 0009-0008-4302-3663), Hongying Liu, Huaxian Pan (ORCID 0009-0006-9505-8097), Zhen Song, Chunting Liu, Anni Wei (ORCID 0009-0008-9449-8722), Baoshuang Zhang (ORCID 0000-0002-4166-6568), Wei Lu (ORCID 0000-0002-0098-7584). Affiliations: School of Economics and Statistics, Xingzhi College, Xi'an University of Finance and Economics, Xi'an, Shaanxi, China (Zhao, H. Liu, Pan, Song, C. Liu, Wei, Zhang); School of Information, Xi'an University of Finance and Economics, Xi'an, Shaanxi, China (Lu). Abstract, online access URL, and keywords as given in the description, url, and topic fields. |
| spellingShingle | Yao Zhao, Hongying Liu, Huaxian Pan, Zhen Song, Chunting Liu, Anni Wei, Baoshuang Zhang, Wei Lu; Method for Knowledge Transfer via Multi-Task Semi-Supervised Self-Paced; IEEE Access; Multi-task learning; self-paced learning; semi-supervised learning; alternating convex search |
| title | Method for Knowledge Transfer via Multi-Task Semi-Supervised Self-Paced |
| title_full | Method for Knowledge Transfer via Multi-Task Semi-Supervised Self-Paced |
| title_fullStr | Method for Knowledge Transfer via Multi-Task Semi-Supervised Self-Paced |
| title_full_unstemmed | Method for Knowledge Transfer via Multi-Task Semi-Supervised Self-Paced |
| title_short | Method for Knowledge Transfer via Multi-Task Semi-Supervised Self-Paced |
| title_sort | method for knowledge transfer via multi task semi supervised self paced |
| topic | Multi-task learning; self-paced learning; semi-supervised learning; alternating convex search |
| url | https://ieeexplore.ieee.org/document/11017642/ |
| work_keys_str_mv | AT yaozhao methodforknowledgetransferviamultitasksemisupervisedselfpaced AT hongyingliu methodforknowledgetransferviamultitasksemisupervisedselfpaced AT huaxianpan methodforknowledgetransferviamultitasksemisupervisedselfpaced AT zhensong methodforknowledgetransferviamultitasksemisupervisedselfpaced AT chuntingliu methodforknowledgetransferviamultitasksemisupervisedselfpaced AT anniwei methodforknowledgetransferviamultitasksemisupervisedselfpaced AT baoshuangzhang methodforknowledgetransferviamultitasksemisupervisedselfpaced AT weilu methodforknowledgetransferviamultitasksemisupervisedselfpaced |
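The abstract describes an alternating scheme: train the model on the currently selected "easy" labeled instances, then grow the training set by admitting harder (higher-loss) instances under a self-controlled pace. The following is a minimal generic self-paced learning sketch of that loop, not the authors' MSSP implementation: the logistic model, the toy data, the hard 0/1 instance selection, and the doubling pace schedule are all illustrative assumptions.

```python
# Generic self-paced learning sketch (illustrative; not the paper's MSSP/ACS solver).
# Outer loop: fit on instances whose loss is below the pace parameter `lam`,
# then raise `lam` so harder instances enter the training set.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30, 30)))

def train_logreg(X, y, w, steps=200, lr=0.5):
    # Plain gradient descent on the averaged logistic loss of the selected subset.
    for _ in range(steps):
        p = sigmoid(X @ w)
        w = w - lr * X.T @ (p - y) / len(y)
    return w

def sample_losses(X, y, w):
    p = np.clip(sigmoid(X @ w), 1e-9, 1 - 1e-9)
    return -(y * np.log(p) + (1 - y) * np.log(1 - p))

# Toy noisy linear-boundary data: label depends on the first feature.
X = rng.normal(size=(200, 2))
y = (X[:, 0] + 0.3 * rng.normal(size=200) > 0).astype(float)
X = np.hstack([X, np.ones((200, 1))])       # bias column

w = 0.01 * rng.normal(size=3)               # near-zero start
lam = 0.3                                   # pace parameter: max admissible loss
for it in range(5):
    losses = sample_losses(X, y, w)
    easy = losses <= lam                    # hard selection weights v_i in {0, 1}
    if easy.sum() == 0:                     # bootstrap from the easiest half
        easy = losses <= np.median(losses)
    w = train_logreg(X[easy], y[easy], w)
    lam *= 2.0                              # relax the pace: admit harder samples

acc = ((sigmoid(X @ w) > 0.5) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

Each outer iteration mirrors the two ACS-style steps from the abstract: model training with the selected instances fixed, then an update of the instance selection with the model fixed. A real self-paced objective would optimize both jointly with a regularizer on the selection weights; the threshold rule above is the closed-form minimizer for the simplest hard-weighting case.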