Spike-Based Bayesian-Hebbian Learning of Temporal Sequences.

Many cognitive and motor functions are enabled by the temporal representation and processing of stimuli, but it remains an open issue how neocortical microcircuits can reliably encode and replay such sequences of information. To better understand this, a modular attractor memory network is proposed in which meta-stable sequential attractor transitions are learned through changes to synaptic weights and intrinsic excitabilities via the spike-based Bayesian Confidence Propagation Neural Network (BCPNN) learning rule. We find that the formation of distributed memories, embodied by increased periods of firing in pools of excitatory neurons, together with asymmetrical associations between these distinct network states, can be acquired through plasticity. The model's feasibility is demonstrated using simulations of adaptive exponential integrate-and-fire model neurons (AdEx). We show that the learning and speed of sequence replay depend on a confluence of biophysically relevant parameters including stimulus duration, level of background noise, ratio of synaptic currents, and strengths of short-term depression and adaptation. Moreover, sequence elements are shown to flexibly participate multiple times in the sequence, suggesting that spiking attractor networks of this type can support an efficient combinatorial code. The model provides a principled approach towards understanding how multiple interacting plasticity mechanisms can coordinate hetero-associative learning in unison.
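The central mechanism named in the abstract, the spike-based BCPNN learning rule, low-pass filters pre- and postsynaptic spike trains into slow traces that estimate firing probabilities, sets the synaptic weight to the log ratio of the joint to the marginal probabilities, and uses a log-probability bias as an intrinsic-excitability term. The following is only a minimal illustrative sketch of that idea, not the paper's implementation: the time constants and the eps floor are placeholder values, and the eligibility-trace stage and firing-rate normalization of the full rule are omitted.

```python
import numpy as np

# Minimal sketch of a spike-based BCPNN-style update (illustrative only; the
# time constants and eps floor below are placeholders, and the eligibility
# trace and firing-rate normalization of the full rule are omitted).
dt = 1.0        # time step (ms)
tau_z = 10.0    # fast spike-trace time constant (ms)
tau_p = 5000.0  # slow probability-trace time constant (ms)
eps = 1e-3      # floor on probability estimates

def bcpnn_step(state, pre_spike, post_spike):
    """Advance traces one step; return updated state, weight and bias."""
    zi, zj, pi, pj, pij = state
    # Fast traces: exponential decay plus a unit jump on each spike.
    zi += -dt * zi / tau_z + pre_spike
    zj += -dt * zj / tau_z + post_spike
    # Slow traces estimating marginal and joint activation probabilities.
    pi += dt * (zi - pi) / tau_p
    pj += dt * (zj - pj) / tau_p
    pij += dt * (zi * zj - pij) / tau_p
    # Bayesian-Hebbian weight (log odds) and intrinsic-excitability bias.
    w = np.log((pij + eps**2) / ((pi + eps) * (pj + eps)))
    bias = np.log(pj + eps)
    return (zi, zj, pi, pj, pij), w, bias

# Example: correlated pre/post spiking drives the weight above zero.
rng = np.random.default_rng(0)
state = (0.0, 0.0, 0.0, 0.0, 0.0)
for _ in range(20000):                       # 20 s of 1 ms steps
    s = float(rng.random() < 0.02)           # shared ~20 Hz event train
    state, w, bias = bcpnn_step(state, s, s)
print(w, bias)
```

A positive weight marks co-active units; the asymmetric associations between sequence elements that the abstract describes would arise when such traces span the transition from one stimulus to the next.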


Bibliographic Details
Main Authors: Philip J Tully, Henrik Lindén, Matthias H Hennig, Anders Lansner
Format: Article
Language: English
Published: Public Library of Science (PLoS) 2016-05-01
Series: PLoS Computational Biology
Online Access: https://journals.plos.org/ploscompbiol/article/file?id=10.1371/journal.pcbi.1004954&type=printable
_version_ 1849472189694214144
author Philip J Tully
Henrik Lindén
Matthias H Hennig
Anders Lansner
author_facet Philip J Tully
Henrik Lindén
Matthias H Hennig
Anders Lansner
author_sort Philip J Tully
collection DOAJ
description Many cognitive and motor functions are enabled by the temporal representation and processing of stimuli, but it remains an open issue how neocortical microcircuits can reliably encode and replay such sequences of information. To better understand this, a modular attractor memory network is proposed in which meta-stable sequential attractor transitions are learned through changes to synaptic weights and intrinsic excitabilities via the spike-based Bayesian Confidence Propagation Neural Network (BCPNN) learning rule. We find that the formation of distributed memories, embodied by increased periods of firing in pools of excitatory neurons, together with asymmetrical associations between these distinct network states, can be acquired through plasticity. The model's feasibility is demonstrated using simulations of adaptive exponential integrate-and-fire model neurons (AdEx). We show that the learning and speed of sequence replay depend on a confluence of biophysically relevant parameters including stimulus duration, level of background noise, ratio of synaptic currents, and strengths of short-term depression and adaptation. Moreover, sequence elements are shown to flexibly participate multiple times in the sequence, suggesting that spiking attractor networks of this type can support an efficient combinatorial code. The model provides a principled approach towards understanding how multiple interacting plasticity mechanisms can coordinate hetero-associative learning in unison.
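The description states that the model's feasibility was demonstrated with adaptive exponential integrate-and-fire (AdEx) neurons. Below is a minimal forward-Euler sketch of a single AdEx neuron for orientation only; the parameter values are the generic ones from Brette and Gerstner (2005), not those used in the study, and the network, BCPNN plasticity and short-term depression are not included.

```python
import numpy as np

# Minimal forward-Euler AdEx neuron (generic Brette-Gerstner parameters,
# not the values used in the paper).
C, g_L, E_L = 281.0, 30.0, -70.6   # capacitance (pF), leak (nS), rest (mV)
V_T, Delta_T = -50.4, 2.0          # exponential threshold and slope (mV)
tau_w, a, b = 144.0, 4.0, 80.5     # adaptation: time constant (ms), nS, pA
V_reset, V_peak = -70.6, 0.0       # reset and spike-detection voltage (mV)
dt = 0.1                           # time step (ms)

def simulate_adex(I_ext, t_max=500.0):
    """Return spike times (ms) for a constant input current I_ext (pA)."""
    V, w = E_L, 0.0
    spikes = []
    for step in range(int(t_max / dt)):
        # Leak, exponential spike-initiation term, adaptation and input.
        dV = (-g_L * (V - E_L)
              + g_L * Delta_T * np.exp((V - V_T) / Delta_T)
              - w + I_ext) / C
        # Subthreshold adaptation (conductance a) with decay tau_w.
        dw = (a * (V - E_L) - w) / tau_w
        V += dt * dV
        w += dt * dw
        if V >= V_peak:            # spike: reset and spike-triggered adaptation
            spikes.append(step * dt)
            V = V_reset
            w += b
    return spikes

# Example: an 800 pA step current produces an adapting spike train.
print(simulate_adex(800.0))
```

Spike-frequency adaptation (the w variable above) and short-term synaptic depression are among the parameters the abstract lists as governing replay speed, consistent with their role in terminating one attractor state so that the learned asymmetric weights can pull the network into the next.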
format Article
id doaj-art-5634f8a41be9463c8367baa90e1008f1
institution Kabale University
issn 1553-734X
1553-7358
language English
publishDate 2016-05-01
publisher Public Library of Science (PLoS)
record_format Article
series PLoS Computational Biology
spelling Tully PJ, Lindén H, Hennig MH, Lansner A. Spike-Based Bayesian-Hebbian Learning of Temporal Sequences. PLoS Computational Biology. 2016;12(5):e1004954. doi:10.1371/journal.pcbi.1004954. Public Library of Science (PLoS). English. ISSN 1553-734X, 1553-7358. Record doaj-art-5634f8a41be9463c8367baa90e1008f1, indexed 2025-08-20T03:24:36Z. Full text: https://journals.plos.org/ploscompbiol/article/file?id=10.1371/journal.pcbi.1004954&type=printable
spellingShingle Philip J Tully
Henrik Lindén
Matthias H Hennig
Anders Lansner
Spike-Based Bayesian-Hebbian Learning of Temporal Sequences.
PLoS Computational Biology
title Spike-Based Bayesian-Hebbian Learning of Temporal Sequences.
title_full Spike-Based Bayesian-Hebbian Learning of Temporal Sequences.
title_fullStr Spike-Based Bayesian-Hebbian Learning of Temporal Sequences.
title_full_unstemmed Spike-Based Bayesian-Hebbian Learning of Temporal Sequences.
title_short Spike-Based Bayesian-Hebbian Learning of Temporal Sequences.
title_sort spike based bayesian hebbian learning of temporal sequences
url https://journals.plos.org/ploscompbiol/article/file?id=10.1371/journal.pcbi.1004954&type=printable
work_keys_str_mv AT philipjtully spikebasedbayesianhebbianlearningoftemporalsequences
AT henriklinden spikebasedbayesianhebbianlearningoftemporalsequences
AT matthiashhennig spikebasedbayesianhebbianlearningoftemporalsequences
AT anderslansner spikebasedbayesianhebbianlearningoftemporalsequences