AI Article Synopsis

  • Separable nonlinear least squares (SNLLS) estimation can be applied effectively to all linear structural equation models (SEMs) represented in RAM notation, enhancing convergence and reducing computation time compared to traditional methods.
  • The method applies to models in which a subset of parameters enters the objective function linearly; for models without unknown directed effects, this means researchers can obtain least squares estimates analytically.
  • The study also employs trek rules to connect graphical models to their covariance formulations and provides an efficient gradient expression, leading to improved performance in simulations with faster convergence rates and fewer iterations needed.

Article Abstract

We show that separable nonlinear least squares (SNLLS) estimation is applicable to all linear structural equation models (SEMs) that can be specified in RAM notation. SNLLS is an estimation technique that has successfully been applied to a wide range of models, for example, neural networks and dynamic systems, often leading to improvements in convergence and computation time. It is applicable to models of a special form, where a subset of parameters enters the objective linearly. Recently, Kreiberg et al. (Struct Equ Model Multidiscip J 28(5):725-739, 2021. https://doi.org/10.1080/10705511.2020.1835484) have shown that this is also the case for factor analysis models. We generalize this result to all linear SEMs. To that end, we show that undirected effects (variances and covariances) and mean parameters enter the objective linearly, and therefore, in the least squares estimation of structural equation models, only the directed effects have to be obtained iteratively. For model classes without unknown directed effects, SNLLS can be used to analytically compute least squares estimates. To provide deeper insight into the nature of this result, we employ trek rules that link graphical representations of structural equation models to their covariance parametrization. We further give an efficient expression for the gradient, which is crucial to make a fast implementation possible. Results from our simulation indicate that SNLLS leads to improved convergence rates and a reduced number of iterations.
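The core idea of SNLLS (also known as variable projection) described in the abstract can be illustrated with a minimal sketch: the parameters that enter the objective linearly are solved in closed form by ordinary least squares at each step, so the iterative search runs only over the nonlinear parameters. The toy model below (y = b1·exp(theta·x) + b2, with b1, b2 linear and theta nonlinear) is a hypothetical example, not a model from the paper, and the paper's own application concerns SEM covariance structures rather than curve fitting.

```python
# Minimal variable-projection (SNLLS) sketch on a hypothetical model:
#   y = b1 * exp(theta * x) + b2
# b1, b2 enter linearly and are solved analytically; only theta is iterated.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * np.exp(-1.5 * x) + 0.5 + 0.01 * rng.standard_normal(x.size)

def projected_objective(theta):
    # Design matrix for the linear parameters (b1, b2) at fixed theta.
    A = np.column_stack([np.exp(theta * x), np.ones_like(x)])
    b, *_ = np.linalg.lstsq(A, y, rcond=None)  # analytic inner solve
    r = y - A @ b
    return r @ r, b

# Outer iteration runs only over the single nonlinear parameter theta.
res = minimize_scalar(lambda t: projected_objective(t)[0],
                      bounds=(-5.0, 0.0), method="bounded")
theta_hat = res.x
_, (b1_hat, b2_hat) = projected_objective(theta_hat)
print(theta_hat, b1_hat, b2_hat)
```

Because the inner least squares solve is exact, the outer optimizer sees a lower-dimensional, better-conditioned problem, which is the mechanism behind the improved convergence the abstract reports.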

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC9977899
DOI: http://dx.doi.org/10.1007/s11336-022-09891-5

Publication Analysis

Top Keywords

structural equation — 16
equation models — 16
trek rules — 8
separable nonlinear — 8
nonlinear squares — 8
linear structural — 8
snlls estimation — 8
objective linearly — 8
directed effects — 8
models — 7
