Nondeterministic Stacks in Neural Networks

Doctoral Dissertation

Abstract

Human language is full of compositional syntactic structures, and although neural networks have contributed to groundbreaking improvements in computer systems that process language, widely used neural network architectures still exhibit limitations in their ability to process syntax. To address this issue, prior work has proposed adding stack data structures to neural networks, drawing inspiration from theoretical connections between syntax and stacks. However, these methods employ deterministic stacks designed to track one parse at a time, whereas syntactic ambiguity, which is extremely common in language, requires a nondeterministic stack to parse. In this dissertation, we remedy this discrepancy by proposing a method of incorporating nondeterministic stacks into neural networks. We develop a differentiable data structure that efficiently simulates a nondeterministic pushdown automaton, representing an exponential number of computations with a dynamic programming algorithm. Because it is differentiable end-to-end, it can be trained jointly with other neural network components using standard backpropagation and gradient descent. We incorporate this module into two predominant architectures: recurrent neural networks (RNNs) and transformers. We show that this raises their formal recognition power to arbitrary context-free languages, and that it also aids training, even on deterministic context-free languages. Empirically, neural networks with nondeterministic stacks learn context-free languages much more effectively than prior stack-augmented models, including a language with theoretically maximal parsing difficulty. We also show that an RNN augmented with a nondeterministic stack is capable of surprisingly powerful behavior, such as learning cross-serial dependencies, a well-known non-context-free pattern. We demonstrate improvements on natural language modeling and provide an analysis on a syntactic generalization benchmark. This work represents an important step toward building systems that learn to use syntax in a more human-like fashion.
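
To make the abstract's central idea concrete, the following is a minimal, hypothetical sketch in PyTorch. It is not the dissertation's implementation: the actual module runs a dynamic program that accounts for entire stack contents, whereas this toy version tracks only a distribution over (controller state, top stack symbol) pairs. All names here (NondeterministicStackSketch, scorer, etc.) are invented for illustration.

    # Hypothetical sketch only -- not the dissertation's implementation.
    import torch

    class NondeterministicStackSketch(torch.nn.Module):
        """Tracks a distribution over (PDA state, top stack symbol) pairs.

        The dissertation's module simulates a full nondeterministic
        pushdown automaton; this toy version marginalizes everything below
        the top symbol, which is enough to illustrate how a weighted sum
        over many nondeterministic branches stays differentiable.
        """

        def __init__(self, num_states: int, num_symbols: int, input_dim: int):
            super().__init__()
            self.num_states = num_states
            self.num_symbols = num_symbols
            # The neural controller emits transition scores
            # (state, top) -> (state', top') from its hidden vector.
            n = num_states * num_symbols
            self.scorer = torch.nn.Linear(input_dim, n * n)

        def initial_distribution(self, batch_size: int) -> torch.Tensor:
            # Start deterministically: state 0 with the bottom symbol on top.
            dist = torch.zeros(batch_size, self.num_states, self.num_symbols)
            dist[:, 0, 0] = 1.0
            return dist

        def forward(self, hidden, dist):
            # hidden: (batch, input_dim); dist: (batch, states, symbols)
            b = hidden.size(0)
            scores = self.scorer(hidden).view(
                b, self.num_states, self.num_symbols,
                self.num_states, self.num_symbols)
            # Each configuration spreads its probability mass over all of
            # its nondeterministic successor configurations.
            trans = torch.softmax(scores.flatten(3), dim=3).view_as(scores)
            # Forward-algorithm-style update: sum over all predecessors.
            return torch.einsum('bqs,bqsrt->brt', dist, trans)

    # Usage: update the distribution once per input position.
    stack = NondeterministicStackSketch(num_states=2, num_symbols=3, input_dim=8)
    dist = stack.initial_distribution(batch_size=4)
    for h in torch.randn(5, 4, 8):   # five steps of controller hidden states
        dist = stack(h, dist)        # stays a valid distribution over configurations

Because every update is a differentiable tensor contraction, gradients flow from any downstream loss back through the distribution updates into the scorer, which is what allows a nondeterministic stack to be trained jointly with the rest of the network by ordinary backpropagation, as the abstract describes.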

Attributes

Author: Brian DuSell
Contributor: David Chiang, Research Director
Contributor: Walter Scheirer, Committee Member
Contributor: Taeho Jung, Committee Member
Contributor: Robert Frank, Committee Member
Degree Level: Doctoral Dissertation
Degree Discipline: Computer Science and Engineering
Degree Name: Doctor of Philosophy
Banner Code: PHD-CSE
Defense Date: 2023-03-31
Submission Date: 2023-04-17
Subject:
  • computer science
  • natural language processing
  • formal language theory
  • pushdown automata
  • nondeterminism
  • neural networks
  • syntax
  • context-free languages
Language: English
Record Visibility: Public
Content License: All rights reserved

Digital Object Identifier

doi:10.7274/jh343r10k4d

This DOI is the best way to cite this doctoral dissertation.
