Recent work reframes direct effects of covariates on items in mixture models as differential item functioning (DIF) and shows that, when DIF is present in the data but omitted from the fitted latent class model, it can lead to the overextraction of classes. Less is known, however, about how DIF affects model performance, including parameter bias, classification accuracy, and distortion of class-specific response profiles, once the correct number of classes is chosen. First, we replicate and extend prior findings relating DIF to class enumeration using a comprehensive simulation study. In a second simulation study using the same parameters, we show that, while latent class analysis (LCA) is robust to the misspecification of DIF effects, its performance degrades when DIF is omitted entirely. Moreover, the robustness of LCA to omitted DIF varies widely with the degree of class separation. Finally, the simulation results are contextualized by an empirical example.
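To make the setup concrete, below is a minimal sketch of the kind of data-generating model the abstract describes: a two-class model for binary items in which a covariate influences class membership and also has a direct effect (DIF) on one item. This is not the authors' simulation code, and all parameter values are illustrative assumptions.

```python
# Hypothetical sketch: two-class LCA data with a covariate direct effect (DIF)
# on one item. Parameter values are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)
n, n_items = 1000, 6

# Covariate (e.g., a group indicator) and class membership influenced by it.
x = rng.binomial(1, 0.5, size=n)
p_class1 = 1 / (1 + np.exp(-(-0.5 + 1.0 * x)))   # covariate -> class (allowed path)
c = rng.binomial(1, p_class1)                     # latent class: 0 or 1

# Class-specific item intercepts on the logit scale; classes differ on all items.
intercepts = np.where(c[:, None] == 1, 1.5, -1.5) * np.ones((n, n_items))

# DIF: a direct effect of x on item 0 only, bypassing the latent class.
dif_effect = 1.2
logits = intercepts.copy()
logits[:, 0] += dif_effect * x

# Observed binary item responses.
y = rng.binomial(1, 1 / (1 + np.exp(-logits)))

# An LCA fitted to y that ignores the x -> item 0 path omits the DIF effect;
# per the abstract, this omission can inflate the number of classes selected
# and distort class-specific response profiles.
print(y.mean(axis=0))
```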

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7247772
DOI: http://dx.doi.org/10.1080/00273171.2019.1596781


