Local kernel renormalization as a mechanism for feature learning in overparametrized convolutional neural networks
Abstract: Empirical evidence shows that fully-connected neural networks in the infinite-width limit (lazy training) eventually outperform their finite-width counterparts in most computer vision tasks; on the other hand, modern architectures with convolutional layers often achieve optimal performance...
| Main Authors: | R. Aiudi, R. Pacelli, P. Baglioni, A. Vezzani, R. Burioni, P. Rotondo |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Nature Portfolio, 2025-01-01 |
| Series: | Nature Communications |
| Online Access: | https://doi.org/10.1038/s41467-024-55229-3 |
Similar Items
- Wilsonian renormalization of neural network Gaussian processes
  by: Jessica N Howard, et al.
  Published: (2025-01-01)
- A Sliding‐Kernel Computation‐In‐Memory Architecture for Convolutional Neural Network
  by: Yushen Hu, et al.
  Published: (2024-12-01)
- Audio copy-move forgery detection with decreasing convolutional kernel neural network and spectrogram fusion
  by: Canghong Shi, et al.
  Published: (2025-07-01)
- Neural quantum kernels: Training quantum kernels with quantum neural networks
  by: Pablo Rodriguez-Grasa, et al.
  Published: (2025-06-01)
- Renormalization of general Effective Field Theories: formalism and renormalization of bosonic operators
  by: Renato M. Fonseca, et al.
  Published: (2025-07-01)