
Connection weight changes and learning dynamics in models of neural networks

dc.contributor.advisorMemmesheimer, Raoul-Martin
dc.contributor.authorKlos, Christian
dc.date.accessioned2022-05-06T07:06:29Z
dc.date.available2022-05-06T07:06:29Z
dc.date.issued06.05.2022
dc.identifier.urihttps://hdl.handle.net/20.500.11811/9787
dc.description.abstractThe brain can be considered a complex system of interacting neurons. Like many other complex systems, neural circuits can be understood as networks of linked model units. On their own, the dynamics of the individual units are often rather simple; many interesting phenomena, however, emerge from their interaction. In neural network models, these interactions are typically mediated by weighted connections, whose weights specify the strength of the coupling between the simple dynamical systems that model the neurons. In the brain, the connection sites between neurons are called synapses and are subject to a variety of plasticity mechanisms that affect their properties. Yet, the roles that synaptic plasticity plays in the functioning of neural circuits are in general poorly understood.
In this thesis, we use neural network models to study, numerically and analytically, four aspects of biologically inspired forms of connection weight changes.
First, synaptic plasticity can effectively change the dynamics of neural networks and thus underlies many types of learning. We develop a novel scheme in which weight changes endow neural networks with the ability to subsequently learn with static weights. Such dynamical learning is faster and less laborious than weight learning and thus has high potential for applications in physics, biology and engineering. We illustrate our scheme by constructing networks that can dynamically learn dynamics ranging from simple oscillations to chaotic dynamical systems, and we analyze the underlying network mechanisms using dynamical systems theory.
Second, recent results indicate that seemingly random weight changes are a ubiquitous phenomenon in the brain. A learning method called weight perturbation, which performs a random search in weight space, could be the cause of such plasticity. However, it is widely considered to perform poorly due to the high dimensionality of the weight space. By taking into account the temporal extension and the typically low dimensionality of neural dynamics, we show numerically and analytically that it performs much better than expected in biologically realistic settings.
Third, while weight changes can allow the learning of new tasks, they may also interfere with previously acquired memories that are stored in the weights. For example, weight changes may destroy strongly connected assemblies of neurons that represent an associative memory. We show how this can be avoided in a model where noisy weight changes only lead to switches of neurons between different assemblies. These noise-induced transitions between metastable states can be tracked by the network, thus preventing the memory from being forgotten. To further elucidate the underlying network mechanisms, we construct a random walk model of the weight dynamics.
Fourth, synaptic plasticity is affected by neurological diseases such as epilepsy. Based on experimental data, we construct a model of a network motif that is potentially important for the spread of epileptic seizures. In doing so, we determine how short-term synaptic plasticity at the synapses of the motif, as well as other network properties, changes in epilepsy. In addition, we predict how these changes influence the spread of epilepsy-associated activity.
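The abstract describes weight perturbation as a random search in weight space: perturb all weights in a random direction, observe how the error changes, and step accordingly. The following is a minimal sketch of that basic rule on a toy linear task, not a reproduction of the thesis's models; the task, network size, perturbation size, and learning rate are all illustrative assumptions.

```python
import numpy as np

# Minimal weight-perturbation sketch: estimate the effect of one random
# perturbation of all weights on the loss, then step against it.
rng = np.random.default_rng(0)

# Toy task (assumed for illustration): reproduce a fixed linear map y = A @ x.
A = rng.standard_normal((3, 3))
X = rng.standard_normal((3, 50))   # batch of inputs
Y = A @ X                          # target outputs

def loss(W):
    return np.mean((W @ X - Y) ** 2)

W = np.zeros((3, 3))
sigma, eta = 0.1, 0.1              # perturbation size and learning rate

for _ in range(5000):
    xi = rng.standard_normal(W.shape)        # random direction in weight space
    dL = loss(W + sigma * xi) - loss(W)      # loss change under the perturbation
    W -= eta * (dL / sigma) * xi             # descend along the sampled direction

# After training, the random search has driven the loss far below its
# initial value loss(np.zeros((3, 3))).
final_loss = loss(W)
```

Despite using only a scalar error signal per trial, the update equals the gradient in expectation, which is why the scheme can work even in moderately high-dimensional weight spaces.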
en
dc.language.isoeng
dc.rightsIn Copyright
dc.rights.urihttp://rightsstatements.org/vocab/InC/1.0/
dc.subjectKomplexe Systeme
dc.subjectDynamische Systeme
dc.subjectNeuronale Netzwerke
dc.subjectLernen
dc.subjectSynaptische Plastizität
dc.subjectComplex systems
dc.subjectDynamical systems
dc.subjectNeural networks
dc.subjectLearning
dc.subjectSynaptic plasticity
dc.subject.ddc530 Physik
dc.titleConnection weight changes and learning dynamics in models of neural networks
dc.typeDissertation oder Habilitation
dc.publisher.nameUniversitäts- und Landesbibliothek Bonn
dc.publisher.locationBonn
dc.rights.accessRightsopenAccess
dc.identifier.urnhttps://nbn-resolving.org/urn:nbn:de:hbz:5-66272
dc.relation.doihttps://doi.org/10.1103/PhysRevLett.125.088103
dc.relation.doihttps://doi.org/10.1101/2021.10.04.463055
dc.relation.doihttps://doi.org/10.1073/pnas.2023832118
dc.relation.doihttps://doi.org/10.1523/JNEUROSCI.2594-18.2019
ulbbn.pubtypeErstveröffentlichung
ulbbnediss.affiliation.nameRheinische Friedrich-Wilhelms-Universität Bonn
ulbbnediss.affiliation.locationBonn
ulbbnediss.thesis.levelDissertation
ulbbnediss.dissID6627
ulbbnediss.date.accepted04.04.2022
ulbbnediss.instituteMathematisch-Naturwissenschaftliche Fakultät : Fachgruppe Biologie / Institut für Genetik
ulbbnediss.fakultaetMathematisch-Naturwissenschaftliche Fakultät
dc.contributor.coRefereeMeißner, Ulf-G.
ulbbnediss.contributor.orcidhttps://orcid.org/0000-0001-7434-7523
ulbbnediss.contributor.gnd1268240079

