Recursive Sample Scaling Low-Rank Representation
The low-rank representation (LRR) method has recently gained enormous popularity due to its robust approach to the subspace segmentation problem, particularly in cases involving corrupted data. In this paper, the recursive sample scaling low-rank representation (RSS-LRR) method is proposed.
Saved in:
Main Authors: Wenyun Gao, Xiaoyun Li, Sheng Dai, Xinghui Yin, Stanley Ebhohimhen Abhadiomhen
Format: Article
Language: English
Published: Wiley, 2021-01-01
Series: Journal of Mathematics
Online Access: http://dx.doi.org/10.1155/2021/2999001
author | Wenyun Gao; Xiaoyun Li; Sheng Dai; Xinghui Yin; Stanley Ebhohimhen Abhadiomhen |
collection | DOAJ |
description | The low-rank representation (LRR) method has recently gained enormous popularity due to its robust approach to the subspace segmentation problem, particularly in cases involving corrupted data. In this paper, the recursive sample scaling low-rank representation (RSS-LRR) method is proposed. The advantage of RSS-LRR over traditional LRR is that it further introduces a cosine scaling factor, which imposes a penalty on each sample to better suppress the influence of noise and outliers. Specifically, the cosine scaling factor is a learned similarity measure that captures each sample's relationship with the principal components of the low-rank representation in the feature space. In other words, the smaller the angle between an individual data sample and the low-rank representation's principal components, the more likely it is that the sample is clean. The proposed method can therefore obtain a good low-rank representation influenced mainly by clean data. Several experiments are performed with varying levels of corruption on ORL, CMU PIE, COIL20, COIL100, and LFW to evaluate RSS-LRR's effectiveness against state-of-the-art low-rank methods. The experimental results show that RSS-LRR consistently outperforms the compared methods in image clustering and classification tasks. |
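The cosine scaling idea described in the abstract can be illustrated with a minimal sketch. This is not the paper's algorithm: the function name, the synthetic data, and the use of a plain SVD to approximate the principal components are all assumptions for illustration. The point is only that samples nearly aligned with the principal subspace receive weights close to 1, while off-subspace outliers are down-weighted.

```python
import numpy as np

def cosine_scaling_weights(X, k=2):
    """Weight each sample (column of X) by the cosine of the angle between
    it and the span of the top-k principal components. Samples close to the
    principal subspace (likely clean) score near 1; outliers score lower.
    Illustrative sketch only, not the RSS-LRR algorithm itself."""
    U, _, _ = np.linalg.svd(X, full_matrices=False)
    Uk = U[:, :k]                        # basis of the principal subspace (d x k)
    proj = Uk @ (Uk.T @ X)               # orthogonal projection of each sample
    # cos(theta_i) = ||P x_i|| / ||x_i||; an orthogonal projection never
    # lengthens a vector, so each weight lies in [0, 1].
    return np.linalg.norm(proj, axis=0) / (np.linalg.norm(X, axis=0) + 1e-12)

rng = np.random.default_rng(0)
clean = rng.standard_normal((5, 2)) @ rng.standard_normal((2, 50))  # rank-2 samples
outliers = rng.standard_normal((5, 3))                              # off-subspace noise
X = np.hstack([clean, outliers])
w = cosine_scaling_weights(X, k=2)
print(w[:50].mean() > w[50:].mean())    # clean samples get larger weights on average
```

In the paper's setting this weighting would be applied recursively against the evolving low-rank representation rather than once against the raw data, so the sketch only captures the single-step intuition.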
format | Article |
id | doaj-art-1b6010b3ba414392b17ffe3b8a940c98 |
institution | Kabale University |
issn | 2314-4785 |
language | English |
publishDate | 2021-01-01 |
publisher | Wiley |
record_format | Article |
series | Journal of Mathematics |
spelling | Wenyun Gao, Xiaoyun Li, Sheng Dai (Nanjing LES Information Technology Co., LTD); Xinghui Yin (College of Computer and Information); Stanley Ebhohimhen Abhadiomhen (Department of Computer Science); Journal of Mathematics, Wiley, 2021-01-01, ISSN 2314-4785, doi:10.1155/2021/2999001 |
title | Recursive Sample Scaling Low-Rank Representation |
url | http://dx.doi.org/10.1155/2021/2999001 |