The Comparison of Activation Functions in Feature Extraction Layer using Sharpen Filter
Activation functions are a critical component in the feature extraction layer of deep learning models, influencing their ability to identify patterns and extract meaningful features from input data. This study investigates the impact of five widely used activation functions—ReLU, SELU, ELU, sigmoid, and tanh—on convolutional neural network (CNN) performance when combined with sharpening filters for feature extraction.
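The abstract describes the comparison only at a high level, and the authors' ALI (Analytical Libraries for Intelligent-computing) CNN module is not publicly documented, so the following is a minimal NumPy sketch of the kind of experiment described: a 3x3 sharpening kernel is applied to an image, each candidate activation function is applied to the filtered feature map, and a mean squared error against a reference map is recorded. The kernel coefficients, function names, and reference choice are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# A standard 3x3 sharpening kernel; the paper's exact filter coefficients are
# not reproduced in this record, so this choice is an assumption.
SHARPEN = np.array([[ 0, -1,  0],
                    [-1,  5, -1],
                    [ 0, -1,  0]], dtype=float)

# The five activation functions compared in the study.
ACTIVATIONS = {
    "relu":    lambda x: np.maximum(0.0, x),
    "elu":     lambda x: np.where(x > 0, x, np.exp(x) - 1.0),
    "selu":    lambda x: 1.0507 * np.where(x > 0, x, 1.67326 * (np.exp(x) - 1.0)),
    "sigmoid": lambda x: 1.0 / (1.0 + np.exp(-x)),
    "tanh":    np.tanh,
}

def filter2d(image, kernel):
    """Valid-mode 2D filtering (the sharpen kernel is symmetric, so
    correlation and convolution coincide here)."""
    kh, kw = kernel.shape
    out = np.empty((image.shape[0] - kh + 1, image.shape[1] - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def compare_activations(image, target):
    """Sharpen the image, apply each activation, and return the MSE of the
    resulting feature map against a reference map."""
    feature = filter2d(image, SHARPEN)
    return {name: float(np.mean((act(feature) - target) ** 2))
            for name, act in ACTIVATIONS.items()}

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.random((32, 32))   # toy grayscale image in [0, 1]
    ref = img[1:-1, 1:-1]        # illustrative reference: the central crop
    results = compare_activations(img, ref)
    for name, mse in sorted(results.items(), key=lambda kv: kv[1]):
        print(f"{name:8s} MSE = {mse:.4f}")
```

Note that in the study itself the MSE values were obtained during CNN training within the ALI library, whereas this sketch only scores a single filtered feature map against a fixed reference.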
| Main Authors: | Oktavia Citra Resmi Rachmawati, Ali Ridho Barakbah, Tita Karlita |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Yayasan Pendidikan Riset dan Pengembangan Intelektual (YRPI), 2025-06-01 |
| Series: | Journal of Applied Engineering and Technological Science |
| Subjects: | Convolutional Neural Networks; Activation Function; Feature Extraction; Sharpen Filter; Image Processing; Deep Learning |
| Online Access: | http://journal.yrpipku.com/index.php/jaets/article/view/5895 |
| author | Oktavia Citra Resmi Rachmawati; Ali Ridho Barakbah; Tita Karlita |
|---|---|
| collection | DOAJ |
| description | Activation functions are a critical component in the feature extraction layer of deep learning models, influencing their ability to identify patterns and extract meaningful features from input data. This study investigates the impact of five widely used activation functions—ReLU, SELU, ELU, sigmoid, and tanh—on convolutional neural network (CNN) performance when combined with sharpening filters for feature extraction. Using a custom-built CNN program module within the researchers’ machine learning library, Analytical Libraries for Intelligent-computing (ALI), the performance of each activation function was evaluated by analyzing mean squared error (MSE) values obtained during the training process. The findings revealed that ReLU consistently outperformed other activation functions by achieving the lowest MSE values, making it the most effective choice for feature extraction tasks using sharpening filters. This study provides practical and theoretical insights, highlighting the significance of selecting suitable activation functions to enhance CNN performance. These findings contribute to optimizing CNN architectures, offering a valuable reference for future work in image processing and other machine-learning applications that rely on feature extraction layers. Additionally, this research underscores the importance of activation function selection as a fundamental consideration in deep learning model design. |
| format | Article |
| id | doaj-art-5c99cd603f344ea0b921346186c43f1e |
| institution | Kabale University |
| issn | 2715-6087; 2715-6079 |
| language | English |
| publishDate | 2025-06-01 |
| publisher | Yayasan Pendidikan Riset dan Pengembangan Intelektual (YRPI) |
| record_format | Article |
| series | Journal of Applied Engineering and Technological Science |
| spelling | Journal of Applied Engineering and Technological Science, vol. 6, no. 2 (2025-06-01); article 10.37385/jaets.v6i2.5895; authors Oktavia Citra Resmi Rachmawati, Ali Ridho Barakbah, and Tita Karlita, all affiliated with the Electronic Engineering Polytechnic Institute of Surabaya |
| title | The Comparison of Activation Functions in Feature Extraction Layer using Sharpen Filter |
| topic | Convolutional Neural Networks Activation Function Feature Extraction Sharpen Filter Image Processing Deep Learning |
| url | http://journal.yrpipku.com/index.php/jaets/article/view/5895 |