University of Notre Dame

Fixed Points, Learning, and Plasticity in Recurrent Neuronal Network Models

Thesis, posted on 2023-04-05, authored by Vicky Zhu

Recurrent neural network models (RNNs) are widely used in machine learning and in computational neuroscience. While recurrent artificial neural networks (ANNs) share some basic building blocks with cortical neuronal networks in the brain, they differ in some fundamental ways. For example, their neurons communicate and learn differently: units in ANNs communicate through continuous-valued activations, whereas biological neurons communicate through synapses, with signals shaped by synaptic processing and spiking dynamics. To link neuroscience and machine learning, I study models of recurrent neuronal networks to establish direct, one-to-one analogs between artificial and biological neuronal networks.

I first established this connection by formalizing features of cortical networks into theorems that link them to activation functions used in machine learning. This work extended the traditional excitatory-inhibitory balanced network theory to a “semi-balanced” state in which networks implement high-dimensional, nonlinear stimulus representations. To understand brain operations and neuronal plasticity, I combined numerical simulations of biological networks with mean-field rate models to evaluate the extent to which homeostatic inhibitory plasticity learns to compute prediction errors in randomly connected, unstructured neuronal networks. I found that homeostatic synaptic plasticity alone is not sufficient to learn and perform non-trivial predictive coding tasks in unstructured neuronal network models. To investigate learning further, I derived two new biologically inspired RNN learning rules for the fixed points of recurrent dynamics. Under a natural re-parameterization of the network model, they can be interpreted as steepest descent and gradient descent on the weight matrix with respect to a non-Euclidean metric and gradient, respectively. Moreover, compared with standard gradient-based learning methods, one of the alternative learning rules is more robust and computationally more efficient. These learning rules produce results with implications for training RNNs for computational neuroscience studies and machine learning applications.
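To illustrate the generic fixed-point setting referred to above, the sketch below assumes a simple rate model and an illustrative delta-rule weight update; neither is the specific learning rule derived in the thesis. The steady state r* of the dynamics dr/dt = -r + f(W r + x) satisfies r* = f(W r* + x), and the recurrent weights W can be nudged so that this fixed point moves toward a target pattern.

```python
import numpy as np

# Minimal sketch of learning a fixed point of a rate-model RNN.
# The network size, tanh nonlinearity, and delta-rule update are
# illustrative assumptions, not the learning rules derived in the thesis.

rng = np.random.default_rng(0)
n = 50                                    # number of units (arbitrary)
W = 0.1 * rng.standard_normal((n, n))     # recurrent weight matrix
x = rng.standard_normal(n)                # fixed external input
r_target = rng.uniform(-0.5, 0.5, n)      # hypothetical target fixed point
f = np.tanh                               # pointwise rate nonlinearity

def fixed_point(W, x, steps=500, dt=0.1):
    """Relax the rate dynamics dr/dt = -r + f(W r + x) toward a fixed point."""
    r = np.zeros(len(x))
    for _ in range(steps):
        r = r + dt * (-r + f(W @ r + x))
    return r

eta = 0.05                                # learning rate (arbitrary)
for _ in range(200):
    r_star = fixed_point(W, x)            # fixed point under current weights
    err = r_target - r_star               # error evaluated at the fixed point
    W += eta * np.outer(err, r_star)      # illustrative delta-rule weight update

print("fixed-point error:", np.linalg.norm(r_target - fixed_point(W, x)))
```

In this toy setup the error is evaluated only at the fixed point, so no backpropagation through time is needed; the thesis's rules address this setting in a principled way via a re-parameterization of the weight matrix.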

History

Date Modified

2023-04-23

Defense Date

2023-03-31

CIP Code

  • 27.9999

Research Director(s)

Robert J. Rosenbaum

Degree

  • Doctor of Philosophy

Degree Level

  • Doctoral Dissertation

Alternate Identifier

1375495317

OCLC Number

1375495317

Program Name

  • Applied and Computational Mathematics and Statistics
