The Comparison of Activation Functions in Feature Extraction Layer using Sharpen Filter

Activation functions are a critical component in the feature extraction layer of deep learning models, influencing their ability to identify patterns and extract meaningful features from input data. This study investigates the impact of five widely used activation functions—ReLU, SELU, ELU, sigmoid...
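The abstract describes applying activation functions to features produced with a sharpen filter. A minimal NumPy sketch of that idea is below; the 3x3 kernel, alpha/scale constants, and the naive convolution are common illustrative defaults, not the paper's actual configuration.

```python
import numpy as np

# A common 3x3 sharpen kernel (illustrative; the paper's exact filter may differ)
SHARPEN = np.array([[ 0, -1,  0],
                    [-1,  5, -1],
                    [ 0, -1,  0]], dtype=float)

def conv2d_valid(image, kernel):
    """Naive 2-D 'valid' cross-correlation, for illustration only."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Four of the activation functions named in the abstract
# (alpha/scale values are the usual textbook defaults, not taken from the paper)
ACTIVATIONS = {
    "relu":    lambda x: np.maximum(0.0, x),
    "sigmoid": lambda x: 1.0 / (1.0 + np.exp(-x)),
    "elu":     lambda x: np.where(x > 0, x, np.exp(np.minimum(x, 0)) - 1),
    "selu":    lambda x: 1.0507 * np.where(x > 0, x,
                                           1.67326 * (np.exp(np.minimum(x, 0)) - 1)),
}

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    patch = rng.random((8, 8))            # stand-in for an input image patch
    feat = conv2d_valid(patch, SHARPEN)   # sharpen-filtered feature map
    for name, act in ACTIVATIONS.items():
        out = act(feat)
        print(f"{name:8s} mean={out.mean():.4f} min={out.min():.4f}")
```

Comparing the summary statistics of each activated feature map hints at the kind of differences the study measures, e.g. ReLU zeroing negative responses while sigmoid compresses everything into (0, 1).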

Bibliographic Details
Main Authors: Oktavia Citra Resmi Rachmawati, Ali Ridho Barakbah, Tita Karlita
Format: Article
Language: English
Published: Yayasan Pendidikan Riset dan Pengembangan Intelektual (YRPI) 2025-06-01
Series: Journal of Applied Engineering and Technological Science
Online Access: http://journal.yrpipku.com/index.php/jaets/article/view/5895