The Construction and Approximation of ReLU Neural Network Operators

In the present paper, we construct a new type of two-hidden-layer feedforward neural network operators with the ReLU activation function. We estimate the rate of approximation of the new operators in terms of the modulus of continuity of the target function. Furthermore, we analyze features such as paramet...
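The abstract refers to ReLU network operators whose approximation error is measured by the modulus of continuity of the target function. The sketch below is a minimal illustration of that general idea only; it does not reproduce the two-hidden-layer construction of Chen, Yu, and Li (see the full text at the link under Online Access). It expresses the ordinary piecewise-linear interpolant on [0, 1] as a one-hidden-layer ReLU combination and compares its sup-norm error with an empirical modulus of continuity omega(f, 1/n); the helper names relu_interpolant and modulus_of_continuity and the test function are hypothetical.

import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def relu_interpolant(f, n):
    """Piecewise-linear interpolant of f on [0, 1] at n+1 equispaced nodes,
    written as a linear combination of ReLU units (illustration only, not the
    operator constructed in the paper)."""
    xs = np.linspace(0.0, 1.0, n + 1)
    ys = f(xs)
    h = 1.0 / n
    slopes = np.diff(ys) / h  # slope of the interpolant on each subinterval
    # A continuous piecewise-linear function equals
    # f(0) + slopes[0]*relu(x - x_0) + sum_k (slopes[k] - slopes[k-1])*relu(x - x_k).
    coeffs = np.concatenate(([slopes[0]], np.diff(slopes)))
    knots = xs[:-1]

    def operator(x):
        x = np.asarray(x, dtype=float)
        return ys[0] + relu(x[..., None] - knots) @ coeffs

    return operator

def modulus_of_continuity(f, delta, grid=10_000):
    """Empirical modulus of continuity omega(f, delta) of f on [0, 1]."""
    xs = np.linspace(0.0, 1.0, grid)
    fx = f(xs)
    shift = int(np.ceil(delta * (grid - 1)))
    return max(np.max(np.abs(fx[k:] - fx[:-k])) for k in range(1, shift + 1))

if __name__ == "__main__":
    f = lambda x: np.abs(np.sin(3 * np.pi * x)) ** 0.8  # continuous test function
    xs = np.linspace(0.0, 1.0, 5001)
    for n in (8, 32, 128):
        op = relu_interpolant(f, n)
        err = np.max(np.abs(f(xs) - op(xs)))
        omega = modulus_of_continuity(f, 1.0 / n)
        print(f"n={n:4d}  sup-error={err:.4f}  omega(f,1/n)={omega:.4f}")

As expected for operators of this kind, the printed sup-norm error decreases at a rate comparable to omega(f, 1/n) as n grows.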

Bibliographic Details
Main Authors: Hengjie Chen, Dansheng Yu, Zhong Li
Format: Article
Language: English
Published: Wiley 2022-01-01
Series: Journal of Function Spaces
Online Access: http://dx.doi.org/10.1155/2022/1713912