Manz, Paul: Stability of dynamics and memory in the balanced state. - Bonn, 2023. - Dissertation, Rheinische Friedrich-Wilhelms-Universität Bonn.
Online edition in bonndoc: https://nbn-resolving.org/urn:nbn:de:hbz:5-71580
@phdthesis{handle:20.500.11811/11247,
urn = {https://nbn-resolving.org/urn:nbn:de:hbz:5-71580},
author = {Manz, Paul},
title = {Stability of dynamics and memory in the balanced state},
school = {Rheinische Friedrich-Wilhelms-Universität Bonn},
year = 2023,
month = jan,

note = {Computational modeling of neural circuits has successfully explained the observed irregular and asynchronous activity in the brain as the result of a dynamical balance of excitatory and inhibitory inputs to individual neurons. In this balanced state, the activity of each neuron is governed by the fluctuations rather than the mean of the net input, resulting in irregular spiking. The balanced state raises some questions concerning stability, however: First, the high-dimensional and irregular activity seems to imply chaos, which may be at odds with stably representing information as sequences of spike times. A second issue arises when synaptic strengths are subject to activity-dependent change, called synaptic plasticity: it is not clear how the stability of synaptic weight patterns, which are believed to encode memories, is maintained during irregular activity.
We first study the dynamical stability and phase space structure of balanced networks of inhibitory neurons with external excitatory input. We consider two types of neurons: standard leaky integrate-and-fire neurons and novel antileaky integrate-and-fire neurons, whose voltage accelerates toward the spike threshold (see the first sketch after this entry). We determine the voltage probability distributions and self-consistent firing rates of networks with both neuron types. Further, we compute the full spectrum of Lyapunov exponents (LEs) and the covariant Lyapunov vectors (CLVs). While networks with only leaky integrate-and-fire neurons are dynamically stable, we find that there is approximately one positive LE for each antileaky integrate-and-fire neuron in a network, indicating chaos. A simple mean-field approach, which can be justified by properties of the CLVs, explains this finding. As an application, we propose a spike-based computing scheme in which our networks serve as computational reservoirs and their different stability properties yield different computational capabilities.
We then study how strongly interconnected groups of neurons, called assemblies, which may encode memories, can remain stable during balanced-state activity. Hebbian plasticity, which strengthens the connections of neurons that receive correlated input, can reinforce connections within existing assemblies but is unstable on its own. Previous models of assemblies require additional mechanisms of fast homeostatic plasticity, often with biologically implausible timescales, to stabilize Hebbian plasticity. We provide a model of neuronal assembly generation and maintenance based purely on spike-timing-dependent plasticity (STDP) between excitatory neurons. It uses stochastically spiking neurons and an STDP rule that depresses the connections of uncorrelated neurons (see the second sketch after this entry). We find that assemblies do not grow beyond a certain size, because temporally imprecise spike correlations dominate plasticity in large assemblies. We also demonstrate that assemblies in our model can generate and maintain prominent and stable overlap structures. Our model can furthermore exhibit representational drift, where assemblies exchange neurons with each other on a slow timescale. Finally, the model indicates that assembly size is inversely related to the density of connectivity.},

url = {https://hdl.handle.net/20.500.11811/11247}
}
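
A minimal sketch of the two single-neuron models contrasted in the abstract, assuming standard integrate-and-fire dynamics in which the two types differ only in the sign of the leak term; all function names and parameter values here are illustrative assumptions, not taken from the thesis:

import numpy as np

def simulate_if(leak_sign, mu, v_thresh=1.0, v_reset=0.0,
                tau=0.02, dt=1e-4, t_max=0.5, noise=0.3, seed=0):
    """Euler-Maruyama integration of one integrate-and-fire neuron.

    leak_sign = -1 with mu < v_thresh: leaky neuron; the voltage relaxes
    toward mu, and only noise fluctuations carry it across threshold.
    leak_sign = +1 with mu < v_reset: antileaky neuron; mu is an unstable
    fixed point, so the voltage accelerates toward the threshold.
    All values are illustrative, not the thesis's parameters.
    """
    rng = np.random.default_rng(seed)
    v, spikes = v_reset, []
    for step in range(int(t_max / dt)):
        drift = leak_sign * (v - mu) / tau
        v += drift * dt + noise * np.sqrt(dt / tau) * rng.standard_normal()
        if v >= v_thresh:            # threshold crossing: spike and reset
            spikes.append(step * dt)
            v = v_reset
    return spikes

# Leaky: fluctuation-driven, irregular spiking (the balanced-state regime).
# Antileaky: drift-driven, accelerating approach to the threshold.
print("leaky spike count:    ", len(simulate_if(-1, mu=0.8)))
print("antileaky spike count:", len(simulate_if(+1, mu=-0.5)))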

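The saturation of assembly growth described in the abstract rests on an STDP rule whose net effect depends on the temporal precision of spike correlations. A minimal sketch, assuming a generic pair-based STDP window with a narrow potentiation lobe and a broad depression lobe; the kernel shape and all constants are illustrative assumptions, not the thesis's actual rule:

import numpy as np

def stdp_window(lag, a_plus=1.0, tau_plus=0.01, a_minus=0.4, tau_minus=0.06):
    """Illustrative pair-based STDP window W(lag), lag = t_post - t_pre (s).

    Near-coincident spike pairs fall under the narrow potentiation lobe;
    pairs with larger lags, as produced by weakly correlated neurons,
    fall under the broad depression lobe. Constants are illustrative.
    """
    return (a_plus * np.exp(-np.abs(lag) / tau_plus)
            - a_minus * np.exp(-np.abs(lag) / tau_minus))

def mean_weight_drift(jitter, n_pairs=100_000, seed=0):
    """Average weight change over spike pairs whose lags are Gaussian
    with standard deviation `jitter` (seconds)."""
    rng = np.random.default_rng(seed)
    return stdp_window(rng.normal(0.0, jitter, n_pairs)).mean()

# Temporally precise correlations potentiate on average; imprecise ones
# depress, which caps assembly growth once internal correlations loosen.
print("precise pairs (2 ms jitter):   ", mean_weight_drift(0.002))
print("imprecise pairs (50 ms jitter):", mean_weight_drift(0.050))
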
The following terms of use apply to this resource:

Attribution-NonCommercial 4.0 International (CC BY-NC 4.0)