A Jackson-type estimate in terms of the \(\tau\)-modulus for neural network operators in \(L^{p}\)-spaces

In this paper, we study the order of approximation with respect to the \(L^{p}\)-norm for the (shallow) neural network (NN) operators. We establish a Jackson-type estimate for the considered family of discrete approximation operators using the averaged modulus of smoothness introduced by Sendov and Popov, also known as the \(\tau\)-modulus, in the case of bounded and measurable functions on the interval \([-1,1]\). The results proved here improve those given by Costarelli (J. Approx. Theory 294:105944, 2023), yielding a sharper approximation. In order to provide quantitative estimates in this context, we first establish an estimate for functions belonging to Sobolev spaces. In the case \(1 < p < +\infty\), a crucial role is played by the Hardy-Littlewood maximal function. The case \(p = 1\) is covered for density functions with compact support.
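
For orientation, here is a brief sketch, in the notation commonly used in the Sendov-Popov and NN-operator literature, of the objects named above; the exact normalizations, index ranges, and constants adopted in the paper may differ. The \(\tau\)-modulus is the \(L^{p}\)-norm of the local modulus of continuity,
\[
\omega(f, x; \delta) := \sup\bigl\{\,|f(t)-f(s)| \;:\; t, s \in \bigl[x-\tfrac{\delta}{2},\, x+\tfrac{\delta}{2}\bigr] \cap [-1,1] \bigr\},
\qquad
\tau(f; \delta)_{p} := \bigl\|\, \omega(f, \cdot\,; \delta) \,\bigr\|_{L^{p}([-1,1])},
\]
and a Jackson-type estimate in this setting bounds \(\|F_{n}(f) - f\|_{L^{p}([-1,1])}\) by a multiple of \(\tau(f; 1/n)_{p}\), the precise rate being the content of the paper. The shallow NN operators \(F_{n}\) are typically taken in the form
\[
F_{n}(f, x) := \frac{\displaystyle\sum_{k=-n}^{n} f\!\left(\tfrac{k}{n}\right)\, \phi_{\sigma}(nx-k)}{\displaystyle\sum_{k=-n}^{n} \phi_{\sigma}(nx-k)},
\qquad
\phi_{\sigma}(x) := \tfrac{1}{2}\bigl[\sigma(x+1) - \sigma(x-1)\bigr],
\]
where \(\sigma\) is a sigmoidal function, and for \(1 < p < +\infty\) the Hardy-Littlewood maximal function \((Mf)(x) := \sup_{r>0} \tfrac{1}{2r} \int_{x-r}^{x+r} |f(t)|\, dt\) is the standard tool for controlling the resulting averages.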


Bibliographic Details
Main Authors: Lorenzo Boccali (University of Florence; ORCID: https://orcid.org/0009-0003-3509-5281), Danilo Costarelli (University of Perugia; ORCID: https://orcid.org/0000-0001-8834-8877), Gianluca Vinti (University of Perugia; ORCID: https://orcid.org/0000-0002-9875-2790)
Format: Article
Language: English
Published: Tuncer Acar, 2024-08-01
Series: Modern Mathematical Methods
ISSN: 3023-5294
Subjects: neural network operators; averaged moduli of smoothness; Jackson-type estimates; sigmoidal functions; Hardy-Littlewood maximal function
Online Access: https://modernmathmeth.com/index.php/pub/article/view/42