Error Bounds for lp-Norm Multiple Kernel Learning with Least Square Loss
The problem of learning the kernel function with linear combinations of multiple kernels has attracted considerable attention recently in machine learning. Specifically, by imposing an lp-norm penalty on the kernel combination coefficients, multiple kernel learning (MKL) was proved useful and effective for theoretical analysis and practical applications (Kloft et al., 2009, 2011). In this paper, we present a theoretical analysis of the approximation error and learning ability of lp-norm MKL. Our analysis gives explicit learning rates for lp-norm MKL and demonstrates some notable advantages over traditional kernel-based learning algorithms in which the kernel is fixed.
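The lp-norm MKL setting the abstract describes can be illustrated with a minimal sketch: a nonnegative weight vector over candidate kernels is constrained to the unit lp-ball, and learning alternates between a kernel ridge regression step (least-square loss) and an analytic weight update of the form used by Kloft et al. The function name `lp_mkl_ridge`, the fixed iteration count, and the alternating scheme below are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def lp_mkl_ridge(Ks, y, p=2.0, lam=1.0, n_iter=20):
    """Alternating optimization for lp-norm MKL with least-square loss.

    Ks: list of (n, n) precomputed kernel matrices; y: (n,) targets.
    The kernel weights theta live on the unit lp-norm ball; for fixed
    theta the combined-kernel ridge problem is solved in closed form,
    and theta is then refit via the analytic lp-norm update.
    """
    M, n = len(Ks), len(y)
    theta = np.full(M, M ** (-1.0 / p))  # uniform start on the lp ball
    for _ in range(n_iter):
        K = sum(t * Km for t, Km in zip(theta, Ks))      # combined kernel
        alpha = np.linalg.solve(K + lam * np.eye(n), y)  # kernel ridge step
        # squared RKHS norms of the per-kernel components of the solution
        norms = np.array([t ** 2 * alpha @ Km @ alpha
                          for t, Km in zip(theta, Ks)])
        norms = np.maximum(norms, 1e-12)                 # numerical floor
        theta = norms ** (1.0 / (p + 1.0))               # analytic update
        theta /= (theta ** p).sum() ** (1.0 / p)         # renormalize to lp ball
    return theta, alpha

# Usage: with one informative and one uninformative kernel, the learned
# weights concentrate on the kernel that explains the targets.
X = np.linspace(-1.0, 1.0, 20)
y = X.copy()
K_lin = np.outer(X, X)   # linear kernel; y lies in its span
K_noise = np.eye(20)     # uninformative "noise" kernel
theta, alpha = lp_mkl_ridge([K_lin, K_noise], y, p=2.0, lam=0.1)
```

The choice p = 2 makes the constraint an l2-ball; taking p close to 1 recovers the sparse kernel selection regime the lp-norm framework interpolates away from.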
| Main Authors: | Shao-Gao Lv, Jin-De Zhu |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Wiley, 2012-01-01 |
| Series: | Abstract and Applied Analysis |
| Online Access: | http://dx.doi.org/10.1155/2012/915920 |
| _version_ | 1850220103290847232 |
|---|---|
| author | Shao-Gao Lv, Jin-De Zhu |
| author_facet | Shao-Gao Lv, Jin-De Zhu |
| author_sort | Shao-Gao Lv |
| collection | DOAJ |
| description | The problem of learning the kernel function with linear combinations of multiple kernels has attracted considerable attention recently in machine learning. Specifically, by imposing an lp-norm penalty on the kernel combination coefficients, multiple kernel learning (MKL) was proved useful and effective for theoretical analysis and practical applications (Kloft et al., 2009, 2011). In this paper, we present a theoretical analysis of the approximation error and learning ability of lp-norm MKL. Our analysis gives explicit learning rates for lp-norm MKL and demonstrates some notable advantages over traditional kernel-based learning algorithms in which the kernel is fixed. |
| format | Article |
| id | doaj-art-7790ca5ad25245fb876b8c5e94dc0cff |
| institution | OA Journals |
| issn | 1085-3375, 1687-0409 |
| language | English |
| publishDate | 2012-01-01 |
| publisher | Wiley |
| record_format | Article |
| series | Abstract and Applied Analysis |
| spelling | Record doaj-art-7790ca5ad25245fb876b8c5e94dc0cff (indexed 2025-08-20T02:07:10Z). DOI: 10.1155/2012/915920. Shao-Gao Lv: Statistics School, Southwestern University of Finance and Economics, Chengdu 611130, China. Jin-De Zhu: The 2nd Geological Party of Bureau of Geology and Mineral Resources, Henan, Jiaozuo 450000, China. |
| title | Error Bounds for lp-Norm Multiple Kernel Learning with Least Square Loss |
| url | http://dx.doi.org/10.1155/2012/915920 |