Klos, Christian: Connection weight changes and learning dynamics in models of neural networks. - Bonn, 2022. - Dissertation, Rheinische Friedrich-Wilhelms-Universität Bonn.
Online-Ausgabe in bonndoc: https://nbn-resolving.org/urn:nbn:de:hbz:5-66272
@phdthesis{handle:20.500.11811/9787,
urn = {https://nbn-resolving.org/urn:nbn:de:hbz:5-66272},
author = {Klos, Christian},
title = {Connection weight changes and learning dynamics in models of neural networks},
school = {Rheinische Friedrich-Wilhelms-Universität Bonn},
year = 2022,
month = may,

note = {The brain can be considered as a complex system of interacting neurons. Like many other complex systems, neural circuits can be understood as networks of linked model units. The dynamics of the model units are often rather simple on their own; many interesting phenomena, however, emerge from their interaction. In the case of neural network models, these interactions are typically mediated by weighted connections. The weights specify the strength of the coupling between the simple dynamical systems that model the neurons. In the brain, the connection sites between neurons are called synapses and are subject to a variety of plasticity mechanisms that affect their properties. Yet, the roles that synaptic plasticity plays in the functioning of neural circuits are in general poorly understood.
In this thesis, we use neural network models to study, numerically and analytically, four aspects of biologically inspired forms of connection weight changes. First, synaptic plasticity can effectively change the dynamics of neural networks and thus underlies many types of learning. We develop a novel scheme that uses weight changes to endow neural networks with the ability to subsequently learn with static weights. Such dynamical learning is faster and less laborious than weight learning and thus has high potential for applications in physics, biology and engineering. We illustrate our scheme by constructing networks that dynamically learn dynamics ranging from simple oscillations to chaotic dynamical systems, and we analyze the underlying network mechanisms using dynamical systems theory. Second, recent results indicate that seemingly random weight changes are a ubiquitous phenomenon in the brain. A learning method called weight perturbation, which performs a random search in weight space, could underlie such plasticity. However, it is widely considered to perform poorly due to the high dimensionality of the weight space. By taking the temporal extension and the typically low dimensionality of neural dynamics into account, we show numerically and analytically that it performs much better than expected in biologically realistic settings. Third, while weight changes can allow the learning of new tasks, they may also interfere with previously acquired memories stored in the weights. For example, weight changes may destroy strongly connected assemblies of neurons that represent an associative memory. We show how this can be avoided in a model where noisy weight changes only lead to switches of neurons between different assemblies. These noise-induced transitions between metastable states can be tracked by the network, thus avoiding the forgetting of the memory. To further elucidate the underlying network mechanisms, we construct a random walk model of the weight dynamics. Fourth, synaptic plasticity is affected by neurological diseases such as epilepsy. Based on experimental data, we construct a model of a network motif that is potentially important for the spread of epileptic seizures. In doing so, we determine how short-term synaptic plasticity, which affects the synapses of the motif, and other network properties change in epilepsy. In addition, we predict how these changes influence the spread of epilepsy-associated activity.},

url = {https://hdl.handle.net/20.500.11811/9787}
}

License: InCopyright