University of Notre Dame
DuSellB042023D.pdf (1.16 MB)

Nondeterministic Stacks in Neural Networks
posted on 2023-04-17, 00:00 authored by Brian DuSell

Human language is full of compositional syntactic structures, and although neural networks have contributed to groundbreaking improvements in computer systems that process language, widely used neural network architectures still exhibit limitations in their ability to process syntax. To address this issue, prior work has proposed adding stack data structures to neural networks, drawing inspiration from theoretical connections between syntax and stacks. However, these methods employ deterministic stacks designed to track one parse at a time, whereas syntactic ambiguity, which requires a nondeterministic stack to parse, is extremely common in language. In this dissertation, we remedy this discrepancy by proposing a method of incorporating nondeterministic stacks into neural networks. We develop a differentiable data structure that efficiently simulates a nondeterministic pushdown automaton, representing an exponential number of computations with a dynamic programming algorithm. Since it is differentiable end-to-end, it can be trained jointly with other neural network components using standard backpropagation and gradient descent. We incorporate this module into two predominant architectures: recurrent neural networks (RNNs) and transformers. We show that this raises their formal recognition power to arbitrary context-free languages, and that it also aids training, even on deterministic context-free languages. Empirically, neural networks with nondeterministic stacks learn context-free languages much more effectively than prior stack-augmented models, including a language with theoretically maximal parsing difficulty. We also show that an RNN augmented with a nondeterministic stack is capable of surprisingly powerful behavior, such as learning cross-serial dependencies, a well-known non-context-free pattern. We demonstrate improvements on natural language modeling and provide analysis on a syntactic generalization benchmark.
This work represents an important step toward building systems that learn to use syntax in a more human-like fashion.
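The kind of nondeterminism at issue can be made concrete with a small sketch. The toy below simulates a nondeterministic pushdown automaton on even-length palindromes, a language that forces the machine to "guess" the midpoint of the string, by explicitly enumerating every reachable configuration. This is purely illustrative and not the dissertation's model: all names are hypothetical, and the explicit enumeration is worst-case exponential, which is precisely the cost the dissertation's dynamic-programming formulation avoids.

```python
# Naive simulation of a nondeterministic PDA for the language of nonempty
# even-length palindromes {w w^R : w in {a,b}+}. The machine nondeterministically
# guesses where the midpoint is, so we track ALL live configurations at once.
# (Illustrative toy only; the dissertation replaces this enumeration with a
# differentiable dynamic program, not shown here.)

def step(configs, symbol):
    """Advance every configuration (state, stack) on one input symbol."""
    new = set()
    for state, stack in configs:
        if state == "push":
            # Option 1: still in the first half; push the symbol.
            new.add(("push", stack + (symbol,)))
            # Option 2: guess we just crossed the midpoint; match and pop.
            if stack and stack[-1] == symbol:
                new.add(("pop", stack[:-1]))
        elif state == "pop":
            # Second half: the symbol must match the stack top, which we pop.
            if stack and stack[-1] == symbol:
                new.add(("pop", stack[:-1]))
    return new

def accepts(string):
    configs = {("push", ())}
    for symbol in string:
        configs = step(configs, symbol)
        if not configs:
            return False
    # Accept if some run has matched the whole second half (empty stack).
    return any(state == "pop" and not stack for state, stack in configs)

print(accepts("abba"))  # True: "ab" + reverse("ab")
print(accepts("abab"))  # False
```

A deterministic stack can commit to only one of the two options at each step, which is why tracking a single parse fails on ambiguous structure like this; the set of configurations grows as the machine defers its midpoint guess.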



CIP Code

  • 40.0501

Research Director(s)

David Chiang

Committee Members

  • Walter Scheirer
  • Taeho Jung
  • Robert Frank


Degree

  • Doctor of Philosophy

Degree Level

  • Doctoral Dissertation


Program Name

  • Computer Science and Engineering
