Pruning convolution neural networks using filter clustering based on normalized cross-correlation similarity
Despite the recent development and success of deep neural networks, deploying a deep model on resource-constrained devices remains challenging. Model pruning can mitigate this issue for Convolutional Neural Networks (CNNs), as it is one of the most popular approaches to reducing computational complexity. This article therefore presents a pruning method for convolutional neural networks. The proposed method groups similar filters into the same cluster, where similarity is measured using a three-dimensional normalized cross-correlation. These steps operate entirely on the filter values, requiring neither a set of test images nor the acquisition of any filter activations. In the evaluation, the proposed pruning method proves computationally light, requiring significantly less time and fewer resources than ML-based and activation-based approaches. In experiments with the VGG16 model on the CIFAR-10 dataset, the proposed approach yields pruned models comparable in performance to those found by activation-based methods and expensive ML-based methods. Similar results are obtained when pruning a custom CNN on the MNIST and Fashion MNIST datasets.
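The abstract describes clustering convolutional filters by three-dimensional normalized cross-correlation (NCC) computed from filter weights alone, with no test images or activations. The following is a minimal sketch of that idea, not the authors' implementation: the zero-lag NCC is taken as the cosine similarity of the mean-centred filter tensors, and the greedy grouping, the 0.9 threshold, and the toy 3x3x3 filters are illustrative assumptions.

```python
import numpy as np

def ncc(f, g):
    # Zero-lag 3-D normalized cross-correlation: cosine similarity
    # of the mean-centred, flattened filter tensors.
    a = f.ravel() - f.mean()
    b = g.ravel() - g.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b) / denom if denom > 0 else 0.0

def cluster_filters(filters, threshold=0.9):
    # Greedy clustering: a filter joins the first cluster whose
    # representative (its first member) matches with NCC >= threshold,
    # otherwise it starts a new cluster. A pruner would then keep one
    # filter per cluster and drop the redundant ones.
    clusters = []
    for i, f in enumerate(filters):
        for c in clusters:
            if ncc(filters[c[0]], f) >= threshold:
                c.append(i)
                break
        else:
            clusters.append([i])
    return clusters

# Toy conv layer: four 3x3x3 filters; filters 0 and 1 are
# near-duplicates and should fall into the same cluster.
rng = np.random.default_rng(0)
base = rng.standard_normal((3, 3, 3))
filters = [
    base,
    base + 0.01 * rng.standard_normal((3, 3, 3)),
    rng.standard_normal((3, 3, 3)),
    rng.standard_normal((3, 3, 3)),
]
clusters = cluster_filters(filters)
print(clusters)  # filters 0 and 1 share a cluster
```

Because the similarity uses only the weight tensors, the whole procedure is data-free, which is what makes it cheaper than activation-based or ML-based pruning.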
Saved in:
| Main Authors: | Niaz Ashraf Khan; A. M. Saadman Rafat |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Taylor & Francis Group, 2025-04-01 |
| Series: | Journal of Information and Telecommunication |
| Subjects: | Convolutional neural networks; deep neural networks; pruning |
| Online Access: | https://www.tandfonline.com/doi/10.1080/24751839.2024.2415008 |
| _version_ | 1850128829233758208 |
|---|---|
| author | Niaz Ashraf Khan; A. M. Saadman Rafat |
| author_facet | Niaz Ashraf Khan; A. M. Saadman Rafat |
| author_sort | Niaz Ashraf Khan |
| collection | DOAJ |
| description | Despite the recent development and success of deep neural networks, deploying a deep model on resource-constrained devices remains challenging. Model pruning can mitigate this issue for Convolutional Neural Networks (CNNs), as it is one of the most popular approaches to reducing computational complexity. This article therefore presents a pruning method for convolutional neural networks. The proposed method groups similar filters into the same cluster, where similarity is measured using a three-dimensional normalized cross-correlation. These steps operate entirely on the filter values, requiring neither a set of test images nor the acquisition of any filter activations. In the evaluation, the proposed pruning method proves computationally light, requiring significantly less time and fewer resources than ML-based and activation-based approaches. In experiments with the VGG16 model on the CIFAR-10 dataset, the proposed approach yields pruned models comparable in performance to those found by activation-based methods and expensive ML-based methods. Similar results are obtained when pruning a custom CNN on the MNIST and Fashion MNIST datasets. |
| format | Article |
| id | doaj-art-75d70d0a7d4e4e1cb34ea813312b971b |
| institution | OA Journals |
| issn | 2475-1839 2475-1847 |
| language | English |
| publishDate | 2025-04-01 |
| publisher | Taylor & Francis Group |
| record_format | Article |
| series | Journal of Information and Telecommunication |
| spelling | doaj-art-75d70d0a7d4e4e1cb34ea813312b971b 2025-08-20T02:33:11Z eng Taylor & Francis Group Journal of Information and Telecommunication 2475-1839 2475-1847 2025-04-01 vol. 9, no. 2, pp. 190-208 10.1080/24751839.2024.2415008 Pruning convolution neural networks using filter clustering based on normalized cross-correlation similarity Niaz Ashraf Khan (Department of Computer Science and Engineering, University of Liberal Arts Bangladesh, Dhaka, Bangladesh); A. M. Saadman Rafat (Department of Electrical and Computer Engineering, North South University, Dhaka, Bangladesh) https://www.tandfonline.com/doi/10.1080/24751839.2024.2415008 Convolutional neural networks; deep neural networks; pruning |
| spellingShingle | Niaz Ashraf Khan; A. M. Saadman Rafat; Pruning convolution neural networks using filter clustering based on normalized cross-correlation similarity; Journal of Information and Telecommunication; Convolutional neural networks; deep neural networks; pruning |
| title | Pruning convolution neural networks using filter clustering based on normalized cross-correlation similarity |
| title_full | Pruning convolution neural networks using filter clustering based on normalized cross-correlation similarity |
| title_fullStr | Pruning convolution neural networks using filter clustering based on normalized cross-correlation similarity |
| title_full_unstemmed | Pruning convolution neural networks using filter clustering based on normalized cross-correlation similarity |
| title_short | Pruning convolution neural networks using filter clustering based on normalized cross-correlation similarity |
| title_sort | pruning convolution neural networks using filter clustering based on normalized cross correlation similarity |
| topic | Convolutional neural networks; deep neural networks; pruning |
| url | https://www.tandfonline.com/doi/10.1080/24751839.2024.2415008 |
| work_keys_str_mv | AT niazashrafkhan pruningconvolutionneuralnetworksusingfilterclusteringbasedonnormalizedcrosscorrelationsimilarity AT amsaadmanrafat pruningconvolutionneuralnetworksusingfilterclusteringbasedonnormalizedcrosscorrelationsimilarity |