University of Notre Dame

File(s) under permanent embargo

Quantifying the Impact of Missing-Data Mechanism Uncertainty: A Tailored Sensitivity Analysis Approach for the Behavioral Sciences

Thesis, posted on 2021-07-09, authored by Brenna Gomer

The problem of missing-data mechanism uncertainty can be addressed in a variety of ways; one statistical approach is sensitivity analysis. Most sensitivity analysis procedures involve specifying a range of fixed values for a sensitivity parameter, which can reflect differences in the assumed missing-data mechanism and/or the severity of the departure from the missing at random (MAR) mechanism. A drawback of this strategy is that researchers must explicitly make a series of somewhat subjective mathematical judgments. In some settings, however, particularly in the behavioral sciences, comparing results obtained under different assumed missing-data mechanisms and missingness relationships may be of greater interest than comparing results obtained from fixed parameter values. This is a slightly different question than the one posed by typical sensitivity analysis procedures.
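
For concreteness, the sketch below illustrates the conventional fixed-parameter strategy just described, not the method developed in this dissertation. It assumes a simple pattern-mixture "delta adjustment" for estimating a mean: missing values are imputed from the observed distribution and shifted by a fixed sensitivity parameter delta, with delta = 0 corresponding to MAR. The simulated data and the grid of delta values are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(2021)

    # Simulate a variable whose values are missing not at random (MNAR):
    # larger values are more likely to be missing.
    n = 500
    y = rng.normal(loc=50.0, scale=10.0, size=n)
    p_miss = 1.0 / (1.0 + np.exp(-(y - 55.0) / 5.0))  # higher y -> more missing
    observed = rng.random(n) > p_miss
    y_obs = y[observed]
    n_missing = n - observed.sum()

    # Conventional sensitivity analysis: a grid of fixed values for the
    # sensitivity parameter (here, an assumed shift of the missing values
    # relative to the observed ones; delta = 0 corresponds to MAR).
    for delta in [0.0, 2.5, 5.0, 7.5, 10.0]:
        imputed = rng.choice(y_obs, size=n_missing, replace=True) + delta
        estimate = np.concatenate([y_obs, imputed]).mean()
        print(f"delta = {delta:4.1f}  ->  estimated mean = {estimate:6.2f}")

The analyst then inspects how much the estimate moves across the grid; the subjective step is choosing the grid itself, which is the drawback noted above.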

In this dissertation, I develop a sensitivity analysis method that requires fewer subjective decisions and aims to provide objective information that is straightforward to interpret. My procedure is designed to capture the degree to which statistical results are affected by the choice of assumed missing-data mechanism. Broadly, the method I propose modifies the usual sensitivity analysis procedure and provides a statistic that quantifies the stability of results in the face of missing-data mechanism uncertainty. I provide two alternatives for conducting hypothesis tests in this modified sensitivity analysis framework: a classical ANOVA approach and a Monte Carlo approach. I also examine several candidate statistics that can serve as an effect size of sensitivity to missing-data mechanism uncertainty. The performance of my procedure is evaluated in three Monte Carlo simulation studies.
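
The dissertation defines the exact testing procedure; as a rough illustration of the classical ANOVA alternative only, the sketch below assumes that each assumed mechanism yields a set of parameter estimates (e.g., one per stochastic imputation or resample) and treats the assumed mechanism as a factor in a one-way ANOVA. The mechanism labels and simulated estimates are hypothetical.

    import numpy as np
    from scipy.stats import f_oneway

    rng = np.random.default_rng(7)

    # Hypothetical inputs: 200 parameter estimates per assumed mechanism,
    # e.g., one estimate per stochastic imputation under each assumption.
    estimates = {
        "MCAR": rng.normal(50.0, 1.0, 200),
        "MAR":  rng.normal(50.2, 1.0, 200),
        "MNAR": rng.normal(52.0, 1.0, 200),
    }

    # Classical ANOVA approach: test whether the estimates differ
    # systematically across the assumed mechanisms.
    F, p = f_oneway(*estimates.values())
    print(f"F = {F:.2f}, p = {p:.4g}")

    # A small p-value flags sensitivity to the assumed mechanism;
    # a large one suggests the results are stable across assumptions.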

Results suggest that the hypothesis-testing aspect of my proposed procedure works as intended under a variety of circumstances; the Monte Carlo approach works particularly well. Cohen's f and the coefficient of variation perform well as effect size measures. Applications of my procedure are illustrated using real-data examples.
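
As an illustration of the two effect size measures named above, the sketch below computes Cohen's f (the standard deviation of the per-mechanism means divided by the pooled within-group standard deviation) and a coefficient of variation of those means; the dissertation's exact definitions may differ, and the inputs here are hypothetical.

    import numpy as np

    def cohens_f(groups):
        """Cohen's f: SD of group means around the grand mean, divided
        by the pooled within-group SD (equal group sizes assumed)."""
        means = np.array([g.mean() for g in groups])
        sigma_means = np.sqrt(np.mean((means - means.mean()) ** 2))
        pooled_sd = np.sqrt(np.mean([g.var(ddof=1) for g in groups]))
        return sigma_means / pooled_sd

    def coef_variation(groups):
        """Coefficient of variation: SD of the per-mechanism means
        relative to the absolute grand mean."""
        means = np.array([g.mean() for g in groups])
        return means.std(ddof=1) / abs(means.mean())

    # Hypothetical per-mechanism estimates, as in the earlier sketch:
    rng = np.random.default_rng(7)
    groups = [rng.normal(m, 1.0, 200) for m in (50.0, 50.2, 52.0)]
    print(f"Cohen's f = {cohens_f(groups):.3f}")
    print(f"CV of mechanism means = {coef_variation(groups):.4f}")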

History

Date Modified

2021-09-08

Defense Date

2021-06-28

CIP Code

  • 42.2799

Research Director(s)

Ke-Hai Yuan

Committee Members

  • Zhiyong Zhang
  • Ross Jacobucci
  • Lijuan Wang

Degree

  • Doctor of Philosophy

Degree Level

  • Doctoral Dissertation

Alternate Identifier

1264420081

Library Record

6106624

OCLC Number

1264420081

Program Name

  • Psychology, Research and Experimental
