Bayesian inference has become a powerful and popular technique for understanding psychological phenomena. However, compared with frequentist statistics, current methods employing Bayesian statistics typically require time-intensive computations, often hindering our ability to evaluate alternatives thoroughly. In this article, we advocate for an alternative strategy for performing Bayesian inference, called variational Bayes (VB). VB methods posit a parametric family of distributions that could conceivably contain the target posterior distribution, and then attempt to identify the parameters within that family that best match the target. In this sense, acquiring the posterior becomes an optimization problem rather than a complex integration problem. VB methods have enjoyed considerable success in fields such as neuroscience and machine learning, yet have received surprisingly little attention in psychology. Here, we identify and discuss both the advantages and disadvantages of using VB methods. In considering possible strategies to make VB methods appropriate for psychological models, we develop the differential evolution variational inference algorithm and compare its performance with that of a widely used VB algorithm. As test problems, we evaluate the algorithms on their ability to recover the posterior distributions of the linear ballistic accumulator model and a hierarchical signal detection model. Although we cannot endorse VB methods in their current form as a complete replacement for conventional methods, we argue that their accuracy and speed warrant inclusion within the cognitive scientist's toolkit.
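To make the "optimization instead of integration" idea concrete, below is a minimal sketch of one common VB recipe: a Gaussian variational family fit by stochastic gradient ascent on the evidence lower bound (ELBO) using the reparameterization trick, applied to a toy conjugate model whose exact posterior is known. This is not the differential evolution variational inference algorithm developed in the article; the model, variable names, and hyperparameters here are hypothetical choices made purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: infer the mean mu of a Normal(mu, 1) likelihood under a
# Normal(0, 10) prior. The exact posterior is Gaussian, so the Gaussian
# variational family below can in principle match it exactly, which lets
# us check the quality of the VB fit against a closed form.
data = rng.normal(2.0, 1.0, size=50)

def grad_log_joint(mu):
    # Per-sample gradient of log p(data | mu) + log p(mu) with respect
    # to mu; mu is an array of draws from the variational distribution.
    return np.sum(data[:, None] - mu[None, :], axis=0) - mu / 10.0

# Variational family q(mu) = Normal(m, exp(log_s)**2). Maximize the ELBO
# by stochastic gradient ascent using the reparameterization trick:
# mu = m + exp(log_s) * eps, with eps ~ N(0, 1).
m, log_s = 0.0, 0.0
lr = 1e-3
for step in range(5000):
    eps = rng.normal(size=64)
    mu = m + np.exp(log_s) * eps
    g = grad_log_joint(mu)
    m += lr * np.mean(g)                                  # dELBO/dm
    log_s += lr * (np.mean(g * eps) * np.exp(log_s) + 1)  # + entropy gradient

# Exact conjugate posterior, for comparison with the VB estimate.
post_var = 1.0 / (len(data) + 1.0 / 10.0)
post_mean = post_var * np.sum(data)
print(f"VB:    mean={m:.3f}, sd={np.exp(log_s):.3f}")
print(f"Exact: mean={post_mean:.3f}, sd={np.sqrt(post_var):.3f}")
```

In this conjugate example the variational family contains the target, so the optimization can recover the posterior essentially exactly; for models like the linear ballistic accumulator, the chosen family typically cannot contain the true posterior, and the quality of the approximation is exactly what the article evaluates.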
DOI: http://dx.doi.org/10.1037/met0000242