The Relationship between Sparseness and Energy Consumption of Neural Networks
About 50-80% of total energy in neural networks is consumed by signaling. A network with many active neurons consumes much energy; a network with few active neurons consumes very little. The ratio of active neurons to all neurons in a network, that is, its sparseness, therefore affects the network's energy consumption.
| Main Authors: | Guanzheng Wang, Rubin Wang, Wanzeng Kong, Jianhai Zhang |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Wiley, 2020-01-01 |
| Series: | Neural Plasticity |
| Online Access: | http://dx.doi.org/10.1155/2020/8848901 |
| author | Guanzheng Wang, Rubin Wang, Wanzeng Kong, Jianhai Zhang |
|---|---|
| collection | DOAJ |
| description | About 50-80% of total energy in neural networks is consumed by signaling. A network with many active neurons consumes much energy; a network with few active neurons consumes very little. The ratio of active neurons to all neurons in a network, that is, its sparseness, therefore affects the network's energy consumption. Laughlin's studies show that the sparseness of an energy-efficient code depends on the balance between signaling and fixed costs, but they give neither an exact ratio of signaling to fixed costs nor the ratio of active neurons to all neurons in the most energy-efficient networks. In this paper, we calculate the ratio of signaling costs to fixed costs from physiological data and find that it lies between 1.3 and 2.1. We then calculate the ratio of active neurons to all neurons in the most energy-efficient networks and find that it lies between 0.3 and 0.4. These results are consistent with data from many relevant physiological experiments, indicating that the model used in this paper may capture neural coding under realistic conditions. The calculation results may be helpful to the study of neural coding. (A numerical sketch of this trade-off follows the record below.) |
| format | Article |
| id | doaj-art-1bfbf84970d04ecc80b3693550e7a852 |
| institution | Kabale University |
| issn | 2090-5904, 1687-5443 |
| language | English |
| publishDate | 2020-01-01 |
| publisher | Wiley |
| record_format | Article |
| series | Neural Plasticity |
| affiliations | Guanzheng Wang, Rubin Wang: Institute for Cognitive Neurodynamics, School of Science, East China University of Science and Technology, Meilong Road 130, Shanghai 200237, China. Wanzeng Kong, Jianhai Zhang: Key Laboratory of Brain Machine Collaborative Intelligence of Zhejiang Province, Hangzhou Dianzi University, Zhejiang, China |
| title | The Relationship between Sparseness and Energy Consumption of Neural Networks |
| url | http://dx.doi.org/10.1155/2020/8848901 |
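
The abstract above turns on a simple trade-off: information per neuron grows with the fraction p of active neurons, while expected energy grows with p times the signaling cost. The record does not state the paper's exact model, but one common formalization of Laughlin's balance argument (in the style of Levy and Baxter's energy-efficient coding measure; the objective below is an assumption, not taken from this record) maximizes bits per unit energy, H(p) / (1 + r·p), where H is the binary entropy, energy is measured in units of the fixed cost, and r is the signaling-to-fixed ratio. A minimal Python sketch under that assumption:

```python
import numpy as np

def entropy_bits(p):
    """Shannon entropy (bits) of a neuron that is active with probability p."""
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def efficiency(p, r):
    """Bits per unit energy under an assumed cost model: each neuron pays a
    fixed cost of 1 plus an extra signaling cost r when active, so the
    expected energy is 1 + r * p (r = signaling-to-fixed cost ratio)."""
    return entropy_bits(p) / (1.0 + r * p)

# Grid-search the sparseness p that maximizes efficiency at the boundary
# ratios reported in the abstract (1.3 and 2.1).
p = np.linspace(0.001, 0.999, 9981)
for r in (1.3, 2.1):
    p_best = p[np.argmax(efficiency(p, r))]
    print(f"signaling/fixed = {r}: most efficient active fraction ~ {p_best:.2f}")
```

With r = 1.3 the optimum lands near p ≈ 0.36, and with r = 2.1 near p ≈ 0.31, so this assumed objective reproduces the 0.3-0.4 range of active neurons that the abstract reports for the measured 1.3-2.1 cost ratios.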