A basic assumption of Signal Detection Theory - a special case of Bayesian Decision Theory - is that decisions are based on likelihood ratios (the likelihood ratio hypothesis). In a preceding paper, Glanzer et al. (2009) tested this assumption in recognition memory tasks.
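As a minimal formal sketch of what the likelihood ratio hypothesis asserts (added here for illustration; the notation is assumed, not taken from the abstract): if x is the evidence elicited by a test item, and f(x | old) and f(x | new) are the evidence densities for studied and unstudied items, the observer responds "old" whenever

\[
\lambda(x) = \frac{f(x \mid \mathrm{old})}{f(x \mid \mathrm{new})} \ge \beta,
\]

where \beta is the decision criterion, with \beta = 1 corresponding to an unbiased observer under equal priors and symmetric payoffs.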
The mirror effect is a pattern of results generally found in two-condition recognition memory experiments that is consistent with normative signal detection theory as a model of recognition. However, the claim has been made that there is a distinct mirror effect, the "strength mirror effect," that differs from the normative one. This claim is based on experiments on recognition memory in which repetition or study time is varied to produce differences in accuracy, where typically the ordinary mirror effect pattern is absent.
A basic assumption of Signal Detection Theory is that decisions are made on the basis of likelihood ratios. In a preceding paper, Glanzer, Hilford, and Maloney (Psychonomic Bulletin & Review, 16, 431-455, 2009) showed that the likelihood ratio assumption implies that three regularities will occur in recognition memory: (1) the Mirror Effect, (2) the Variance Effect, and (3) the normalized Receiver Operating Characteristic (z-ROC) Length Effect. The paper offered formal proofs and computational demonstrations that decisions based on likelihood ratios produce the three regularities.
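For intuition about how a likelihood ratio decision axis yields the mirror effect, here is a sketch under the standard equal-variance normal model (an assumed illustration, not the paper's proof): if new-item evidence is distributed N(0, 1) and old-item evidence N(d', 1), then

\[
\log \lambda(x) = d'x - \tfrac{d'^{2}}{2},
\]

so with the neutral criterion \lambda = 1 the observer says "old" exactly when x > d'/2, giving

\[
\mathrm{HR} = \Phi\!\left(\tfrac{d'}{2}\right), \qquad \mathrm{FAR} = 1 - \Phi\!\left(\tfrac{d'}{2}\right).
\]

A condition with larger d' therefore produces both more hits and fewer false alarms, which is the mirror effect pattern.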
We analyze four general signal detection models for recognition memory that differ in their distributional assumptions. Our analyses show that a basic assumption of signal detection theory, the likelihood ratio decision axis, implies three regularities in recognition memory: (1) the mirror effect, (2) the variance effect, and (3) the z-ROC length effect. For each model, we present the equations that produce the three regularities and show, in computed examples, how they do so.
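As a computed example in the same spirit (a minimal simulation sketch assuming the equal-variance normal model and a fixed likelihood ratio criterion of 1; the variable and function names are hypothetical, not the authors' code), the following compares a weak and a strong study condition and reproduces the mirror-effect pattern:

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
N = 200_000  # simulated trials per item type

def rates(d_prime):
    """Hit and false-alarm rates when "old" is chosen iff the
    log-likelihood ratio of the evidence exceeds 0 (LR criterion = 1),
    assuming equal-variance normal evidence distributions."""
    new_x = rng.normal(0.0, 1.0, N)      # evidence for new (lure) items
    old_x = rng.normal(d_prime, 1.0, N)  # evidence for old (studied) items

    def log_lr(x):
        # log likelihood ratio of N(d', 1) versus N(0, 1)
        return d_prime * x - d_prime ** 2 / 2.0

    return (log_lr(old_x) > 0).mean(), (log_lr(new_x) > 0).mean()

for label, d in [("weak  (d' = 1)", 1.0), ("strong (d' = 2)", 2.0)]:
    hr, far = rates(d)
    # Analytic predictions at LR = 1: HR = Phi(d'/2), FAR = 1 - Phi(d'/2)
    print(f"{label}: HR = {hr:.3f} (pred. {norm.cdf(d / 2):.3f}), "
          f"FAR = {far:.3f} (pred. {1 - norm.cdf(d / 2):.3f})")

The stronger condition shows a higher hit rate and a lower false-alarm rate than the weaker one, the mirror effect, even though the criterion in likelihood ratio terms never moves.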