Model flexibility analysis.

Psychol Rev

711th Human Performance Wing, U.S. Air Force Research Laboratory.

Published: October 2015

A good fit of model predictions to empirical data is often used as an argument for model validity. However, if the model is flexible enough to fit a large proportion of potential empirical outcomes, finding a good fit becomes less meaningful. We propose a method for estimating the proportion of potential empirical outcomes that the model can fit: Model Flexibility Analysis (MFA). MFA aids model evaluation by providing a metric for gauging the persuasiveness of a given fit. We demonstrate that MFA can be more informative than merely discounting the fit by the number of free parameters in the model, and show that the number of free parameters does not necessarily correlate with the flexibility of the model. Additionally, we contrast MFA with other flexibility-assessment techniques, including Parameter Space Partitioning, Model Mimicry, Minimum Description Length, and Prior Predictive Evaluation. Finally, we provide examples of how MFA can help to inform modeling results and discuss a variety of issues relating to the use of MFA in model validation.
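The core computation the abstract describes, estimating the proportion of potential empirical outcomes a model can fit, can be sketched as a Monte Carlo estimate. The toy two-parameter model, the tolerance criterion, and all names below are illustrative assumptions, not the procedure from the paper itself:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-parameter model predicting accuracy in 3 conditions.
def model(a, b):
    return np.clip([a, a * b, b], 0.0, 1.0)

def mfa_flexibility(n_outcomes=2000, n_params=500, tol=0.05):
    """Estimate the fraction of the outcome space the model can fit.

    Draw candidate outcomes uniformly from the space of possible data
    patterns; an outcome counts as "fit" if some sampled parameter
    setting predicts it within `tol` in every condition.
    """
    # Predictions for randomly sampled parameter settings: (n_params, 3).
    params = rng.uniform(0.0, 1.0, size=(n_params, 2))
    preds = np.array([model(a, b) for a, b in params])

    # Candidate empirical outcomes drawn uniformly from the outcome space.
    outcomes = rng.uniform(0.0, 1.0, size=(n_outcomes, 3))

    # Worst-condition deviation of each outcome from each prediction.
    dists = np.abs(outcomes[:, None, :] - preds[None, :, :]).max(axis=2)

    # An outcome is "fittable" if its best parameter setting is within tol.
    return (dists.min(axis=1) <= tol).mean()
```

A flexibility estimate near 1 would mean a good fit is nearly guaranteed regardless of the data, which is the sense in which the abstract argues such a fit is less persuasive.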

Source: http://dx.doi.org/10.1037/a0039657


