Enhancing Early Exit Performance With Uncertainty-Aware Training in Convolutional Neural Networks for Image Classification
Deep Neural Networks (DNNs) have grown increasingly complex in recent years, delivering strong results but demanding ever more computational resources. Traditional cloud-based DNN inference offers poor real-time performance because of the network latency it incurs. Fog and edge computing, in which computation takes place near the data source or between the source and the cloud, have therefore become popular; however, the heavy computational demands of DNNs leave little room for deployment on such resource-constrained devices. DNNs with a multiple-exit architecture are an effective way to save time and computation by predicting results at an early stage through multiple early exits, supporting edge and fog intelligence. Although early exits are lightweight and energy efficient, they usually perform worse than later exits because they misclassify more samples. To overcome this shortcoming, we propose adding an uncertainty estimate to the loss function as an additional term for classification. Experimental results on CIFAR, MNIST, human lung CT scan, and brain MRI datasets with the VGG-16, LeNet-5, and ResNet architectures show that the proposed model outperforms existing early-exit deep learning models in both accuracy and prediction confidence.
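The central idea in the abstract, adding an uncertainty estimate to the training loss so that early exits learn to be confident as well as accurate, can be illustrated with a short sketch. The PyTorch code below is a minimal, hypothetical example under that assumption: the toy network EarlyExitNet, the Monte Carlo dropout entropy penalty in uncertainty_aware_loss, and the weight lam are all illustrative choices, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class EarlyExitNet(nn.Module):
    """Toy CNN with one early exit; the dropout layers make Monte Carlo
    sampling of the predictions possible (hypothetical, for illustration)."""

    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.stem = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2), nn.Dropout2d(0.2))
        self.early_exit = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, num_classes))
        self.tail = nn.Sequential(
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Dropout(0.2), nn.Linear(64, num_classes))

    def forward(self, x):
        h = self.stem(x)
        return self.early_exit(h), self.tail(h)


def uncertainty_aware_loss(model, x, y, mc_samples: int = 3, lam: float = 0.1):
    """Cross-entropy at both exits plus a predictive-entropy penalty on the
    early exit, estimated by averaging softmax outputs over several
    dropout-perturbed forward passes (Monte Carlo dropout)."""
    early_logits, final_logits = model(x)
    ce = F.cross_entropy(early_logits, y) + F.cross_entropy(final_logits, y)

    # In train mode dropout is active, so repeated forward passes give
    # stochastic samples of the early exit's predictive distribution.
    probs = torch.stack(
        [F.softmax(model(x)[0], dim=1) for _ in range(mc_samples)]).mean(0)
    entropy = -(probs * probs.clamp_min(1e-8).log()).sum(dim=1)
    return ce + lam * entropy.mean()


# Usage: one training step on random data (shapes only, for illustration).
model = EarlyExitNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x, y = torch.randn(8, 3, 32, 32), torch.randint(0, 10, (8,))
model.train()
loss = uncertainty_aware_loss(model, x, y)
opt.zero_grad()
loss.backward()
opt.step()
```

The penalty is kept differentiable: the dropout-perturbed forward passes are averaged and the resulting predictive entropy is added to the cross-entropy of both exits, so training pushes the early branch toward lower uncertainty as well as lower classification error. An inference-time counterpart, showing how the same uncertainty can gate the early-exit decision, is sketched after the record fields at the end of this page.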
| Main Authors: | Thanu Kurian, Somasundaram Thangam |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2025-01-01 |
| Series: | IEEE Access |
| Subjects: | Deep neural networks; early exit; edge computing; fog computing; Monte Carlo dropout; uncertainty estimation |
| Online Access: | https://ieeexplore.ieee.org/document/11008656/ |
| _version_ | 1849471963719794688 |
|---|---|
| author | Thanu Kurian; Somasundaram Thangam |
| author_facet | Thanu Kurian; Somasundaram Thangam |
| author_sort | Thanu Kurian |
| collection | DOAJ |
| description | Deep Neural Networks (DNNs) have grown increasingly complex in recent years, delivering strong results but demanding ever more computational resources. Traditional cloud-based DNN inference offers poor real-time performance because of the network latency it incurs. Fog and edge computing, in which computation takes place near the data source or between the source and the cloud, have therefore become popular; however, the heavy computational demands of DNNs leave little room for deployment on such resource-constrained devices. DNNs with a multiple-exit architecture are an effective way to save time and computation by predicting results at an early stage through multiple early exits, supporting edge and fog intelligence. Although early exits are lightweight and energy efficient, they usually perform worse than later exits because they misclassify more samples. To overcome this shortcoming, we propose adding an uncertainty estimate to the loss function as an additional term for classification. Experimental results on CIFAR, MNIST, human lung CT scan, and brain MRI datasets with the VGG-16, LeNet-5, and ResNet architectures show that the proposed model outperforms existing early-exit deep learning models in both accuracy and prediction confidence. |
| format | Article |
| id | doaj-art-37a6e77b7ab84a3fba65e3be7cdb3c7c |
| institution | Kabale University |
| issn | 2169-3536 |
| language | English |
| publishDate | 2025-01-01 |
| publisher | IEEE |
| record_format | Article |
| series | IEEE Access |
| spelling | doaj-art-37a6e77b7ab84a3fba65e3be7cdb3c7c; 2025-08-20T03:24:39Z; eng; IEEE; IEEE Access; ISSN 2169-3536; published 2025-01-01; vol. 13, pp. 91671-91681; DOI 10.1109/ACCESS.2025.3572415; IEEE article no. 11008656. Enhancing Early Exit Performance With Uncertainty-Aware Training in Convolutional Neural Networks for Image Classification. Thanu Kurian (https://orcid.org/0009-0006-1559-2645) and Somasundaram Thangam (https://orcid.org/0000-0001-9284-7724), Department of Computer Science and Engineering, Amrita School of Computing, Amrita Vishwa Vidyapeetham, Bengaluru, India. Online access: https://ieeexplore.ieee.org/document/11008656/. Keywords: Deep neural networks; early exit; edge computing; fog computing; Monte Carlo dropout; uncertainty estimation. |
| spellingShingle | Thanu Kurian; Somasundaram Thangam; Enhancing Early Exit Performance With Uncertainty-Aware Training in Convolutional Neural Networks for Image Classification; IEEE Access; Deep neural networks; early exit; edge computing; fog computing; Monte Carlo dropout; uncertainty estimation |
| title | Enhancing Early Exit Performance With Uncertainty-Aware Training in Convolutional Neural Networks for Image Classification |
| title_full | Enhancing Early Exit Performance With Uncertainty-Aware Training in Convolutional Neural Networks for Image Classification |
| title_fullStr | Enhancing Early Exit Performance With Uncertainty-Aware Training in Convolutional Neural Networks for Image Classification |
| title_full_unstemmed | Enhancing Early Exit Performance With Uncertainty-Aware Training in Convolutional Neural Networks for Image Classification |
| title_short | Enhancing Early Exit Performance With Uncertainty-Aware Training in Convolutional Neural Networks for Image Classification |
| title_sort | enhancing early exit performance with uncertainty aware training in convolutional neural networks for image classification |
| topic | Deep neural networks; early exit; edge computing; fog computing; Monte Carlo dropout; uncertainty estimation |
| url | https://ieeexplore.ieee.org/document/11008656/ |
| work_keys_str_mv | AT thanukurian enhancingearlyexitperformancewithuncertaintyawaretraininginconvolutionalneuralnetworksforimageclassification AT somasundaramthangam enhancingearlyexitperformancewithuncertaintyawaretraininginconvolutionalneuralnetworksforimageclassification |
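As a counterpart to the training sketch above, the snippet below illustrates the generic early-exit inference rule the abstract alludes to: a sample leaves at the early exit only when its Monte Carlo dropout predictive entropy falls below a threshold, otherwise it continues to the final exit. It assumes a model shaped like the EarlyExitNet sketch above, returning (early_logits, final_logits); mc_samples and max_entropy are illustrative parameters, not values from the paper.

```python
import torch
import torch.nn.functional as F


@torch.no_grad()
def predict_with_early_exit(model, x, mc_samples: int = 5, max_entropy: float = 0.5):
    """Route each sample: take the early exit when its MC-dropout predictive
    entropy is below `max_entropy`, otherwise use the final exit.
    `model` is assumed to return (early_logits, final_logits)."""
    model.train()  # keep dropout stochastic for Monte Carlo sampling
    early_probs = torch.stack(
        [F.softmax(model(x)[0], dim=1) for _ in range(mc_samples)]).mean(0)
    entropy = -(early_probs * early_probs.clamp_min(1e-8).log()).sum(dim=1)

    model.eval()   # deterministic pass for the full network
    final_probs = F.softmax(model(x)[1], dim=1)

    # Confident samples take the early exit; uncertain ones use the final exit.
    use_early = entropy < max_entropy
    preds = torch.where(use_early,
                        early_probs.argmax(dim=1),
                        final_probs.argmax(dim=1))
    return preds, use_early
```

Calling preds, took_early = predict_with_early_exit(model, x) returns one prediction per sample plus a mask of which samples exited early; the fraction of True entries in the mask is the early-exit rate that determines how much computation is saved.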