Kalle Kossio, Felipe Yaroslav: Neural Network Growth, Structure and Dynamics. - Bonn, 2022. - Dissertation, Rheinische Friedrich-Wilhelms-Universität Bonn.
Online-Ausgabe in bonndoc: https://nbn-resolving.org/urn:nbn:de:hbz:5-66816
@phdthesis{handle:20.500.11811/10062,
urn = {https://nbn-resolving.org/urn:nbn:de:hbz:5-66816},
author = {{Felipe Yaroslav Kalle Kossio}},
title = {Neural Network Growth, Structure and Dynamics},
school = {Rheinische Friedrich-Wilhelms-Universität Bonn},
year = 2022,
month = jul,

note = {The evolutionary emergence of the nervous system provided individual organisms with a substrate for developing complex behaviors and adapting quickly to dynamic environments. One of the main ways a nervous system is thought to adapt is by modifying its synapses. These couplings between neurons can change their strength, appear, and disappear, thereby influencing neural network dynamics and function. The plasticity rules governing changes in synaptic wiring appear to be relatively simple. This thesis examines how simple plasticity rules may lead to the development of nervous systems with complex dynamics, and how they may preserve memory despite constant change in synaptic wiring. Finally, the thesis proposes a model of learning without synaptic modifications.
During development, some nervous systems display characteristic bursts of activity called neuronal avalanches, which indicate operation near a critical point. Using a computational model, we show how simple plasticity rules can wire developing nervous systems such that their dynamics settle close to a critical point. The simplicity of the model allows us to derive analytical expressions for the distributions of neuronal avalanche size and duration. We also analyze how deviations from critical dynamics may depend on neuronal properties such as the spontaneous spiking rate and refractoriness.
Synapses are believed to be the main structures responsible for memory storage. Interestingly, however, they are not very stable: the average synapse lifetime is shorter than the lifetimes of some memories and behaviors. How can neural networks with such unstable couplings store information over long periods of time? We show that simple plasticity rules can maintain neural network function and memory despite complete rewiring of the synaptic connections between neurons. In our model, changes happen gradually, which allows plasticity to perform a type of error correction. Synaptic rewiring in our model also naturally leads to changes in neural representations (the patterns of neural activity corresponding to particular behaviors or memories). Neural representations are often found to drift: they change over time, apparently without affecting the corresponding behaviors or memories. Our analysis elucidates the mechanisms that preserve memories and behaviors despite representational drift and synaptic remodeling.
Surprisingly, there is evidence that learning some behaviors does not require modification of synaptic wiring. How is information stored during such learning? We propose that a suitably pretrained neural network may store the information in its dynamics, eliminating the need for synaptic modifications. In our model, a network first learns, through synaptic modifications, a representative sample of tasks from a task family, together with an indexing of the tasks in that family. Later, a novel task from the same family can be learned by the network dynamics alone, without synaptic modifications; only the task's low-dimensional index needs to be stored, possibly by neuron-intrinsic mechanisms.},

url = {https://hdl.handle.net/20.500.11811/10062}
}

The following terms of use apply to this resource:

InCopyright