Symmetrized Neural Network Operators in Fractional Calculus: Caputo Derivatives, Asymptotic Analysis, and the Voronovskaya–Santos–Sales Theorem

Bibliographic Details
Main Authors: Rômulo Damasclin Chaves dos Santos, Jorge Henrique de Oliveira Sales, Gislan Silveira Santos
Format: Article
Language: English
Published: MDPI AG 2025-06-01
Series: Axioms
Online Access: https://www.mdpi.com/2075-1680/14/7/510
Description
Summary: This work presents a comprehensive mathematical framework for symmetrized neural network operators operating under the paradigm of fractional calculus. By introducing a perturbed hyperbolic tangent activation, we construct a family of localized, symmetric, and positive kernel-like densities, which form the analytical backbone for three classes of multivariate operators: quasi-interpolation, Kantorovich-type, and quadrature-type. A central theoretical contribution is the derivation of the Voronovskaya–Santos–Sales Theorem, which extends classical asymptotic expansions to the fractional domain, providing rigorous error bounds and normalized remainder terms governed by Caputo derivatives. The operators exhibit key properties such as partition of unity, exponential decay, and scaling invariance, which are essential for stable and accurate approximation in high-dimensional settings and in systems governed by nonlocal dynamics. The theoretical framework is validated through applications in signal processing and fractional fluid dynamics, including the formulation of nonlocal viscous models and fractional Navier–Stokes equations with memory effects. Numerical experiments demonstrate a relative error reduction of up to 92.5% compared to classical quasi-interpolation operators, with observed convergence rates reaching O(n^{-1.5}) under Caputo derivatives, using parameters λ = 3.5, q = 1.8, and n = 100. This synergy between neural operator theory, asymptotic analysis, and fractional calculus not only advances the theoretical landscape of function approximation but also provides practical computational tools for complex physical systems characterized by long-range interactions and anomalous diffusion.
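
The record provides only the abstract, but the construction it describes can be sketched concretely. Below is a minimal, hypothetical Python illustration assuming the standard perturbed hyperbolic tangent activation g_{q,λ}(x) = (e^{λx} − q e^{−λx})/(e^{λx} + q e^{−λx}), the induced density M_{q,λ}(x) = (g_{q,λ}(x+1) − g_{q,λ}(x−1))/4, symmetrization by averaging the q and 1/q densities (via the identity M_{q,λ}(−x) = M_{1/q,λ}(x)), and a univariate quasi-interpolation operator A_n(f)(x) = Σ_k f(k/n) Ψ(nx − k); the paper's exact multivariate definitions may differ.

```python
import numpy as np

Q, LAM = 1.8, 3.5  # parameter values reported in the abstract

def g(x, q=Q, lam=LAM):
    # Perturbed hyperbolic tangent (e^{lam x} - q e^{-lam x}) / (e^{lam x} + q e^{-lam x}),
    # rewritten as a shifted tanh for numerical stability.
    return np.tanh(lam * x - 0.5 * np.log(q))

def density(x, q=Q, lam=LAM):
    # Positive, exponentially decaying density M_{q,lam}(x) = (g(x+1) - g(x-1)) / 4.
    # Its integer translates telescope, so sum_k M(x - k) = 1 (partition of unity).
    return 0.25 * (g(x + 1.0, q, lam) - g(x - 1.0, q, lam))

def sym_density(x, q=Q, lam=LAM):
    # Symmetrized density: since M_{q,lam}(-x) = M_{1/q,lam}(x), averaging the q and
    # 1/q densities gives an even kernel that still sums to 1.
    return 0.5 * (density(x, q, lam) + density(x, 1.0 / q, lam))

def quasi_interpolant(f, x, n=100, width=40):
    # Univariate quasi-interpolation operator A_n(f)(x) = sum_k f(k/n) Psi(n x - k),
    # truncated to |k - n x| <= width; the kernel's exponential decay makes the
    # truncation error negligible.
    k0 = int(np.floor(n * x))
    k = np.arange(k0 - width, k0 + width + 1)
    return float(np.sum(f(k / n) * sym_density(n * x - k)))

if __name__ == "__main__":
    ks = np.arange(-50, 51)
    print("partition of unity:", sym_density(0.37 - ks).sum())  # ~1.0
    print("A_100(sin)(0.5) =", quasi_interpolant(np.sin, 0.5), "vs", np.sin(0.5))
```

With n = 100 this toy sketch reproduces sin(0.5) to several decimal places; the fractional error analysis of the Voronovskaya–Santos–Sales Theorem, with remainder terms expressed through Caputo derivatives, is developed in the article itself.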
ISSN: 2075-1680