“Distributionally-Weighted Least Squares in Structural Equation Modeling” has been accepted by Psychological Methods

In real data analysis with structural equation modeling, data are unlikely to be exactly normally distributed. If we ignore this non-normality, the parameter estimates, standard error estimates, and model fit statistics from normal theory based methods such as maximum likelihood (ML) and normal theory based generalized least squares (GLS) estimation are unreliable. On the other hand, the asymptotically distribution free (ADF) estimator does not rely on any distributional assumption, but it cannot demonstrate its efficiency advantage at small and moderate sample sizes. Methods that adopt misspecified loss functions, including ridge GLS (RGLS), can provide better estimates and inferences than the normal theory based methods and the ADF estimator in some cases. We propose a distributionally-weighted least squares (DLS) estimator and expect it to perform better than the existing generalized least squares estimators, because it combines normal theory based and ADF based generalized least squares estimation. Computer simulation results suggest that model-implied covariance based DLS (DLS_M) provides relatively accurate and efficient estimates in terms of RMSE. In addition, the empirical standard errors, the relative biases of standard error estimates, and the Type I error rates of the Jiang-Yuan rank adjusted model fit test statistic (T_JY) under DLS_M were competitive with those of the classical methods, including ML, GLS, and RGLS. The performance of DLS_M depends on its tuning parameter a. We illustrate how to implement DLS_M and how to select the optimal a by a bootstrap procedure in a real data example.
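The core weighting idea can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes the DLS weight matrix is the inverse of a convex combination of the normal-theory and ADF asymptotic covariance matrices, with the tuning parameter a controlling the blend (the exact construction of DLS_M, including how the Gamma matrices are estimated from the model-implied covariance, follows the paper). The function names and the toy matrices below are hypothetical.

```python
import numpy as np

def dls_weight_matrix(gamma_nt, gamma_adf, a):
    """Blend the normal-theory (gamma_nt) and ADF (gamma_adf) asymptotic
    covariance matrices with tuning parameter a in [0, 1], then invert.
    a = 0 recovers normal-theory GLS weighting; a = 1 recovers ADF weighting.
    (Convex-combination form is an assumption for illustration.)"""
    gamma = (1.0 - a) * gamma_nt + a * gamma_adf
    return np.linalg.inv(gamma)

def wls_discrepancy(s, sigma_theta, weight):
    """Weighted least squares discrepancy (s - sigma)' W (s - sigma),
    where s and sigma_theta are vectorized sample and model-implied
    covariance elements."""
    r = s - sigma_theta
    return float(r @ weight @ r)

# Toy example: 3 vectorized covariance elements with made-up Gammas.
gamma_nt = 2.0 * np.eye(3)
gamma_adf = 4.0 * np.eye(3)

w_half = dls_weight_matrix(gamma_nt, gamma_adf, a=0.5)   # blend of both
s = np.array([1.0, 0.4, 0.9])
sigma = np.array([0.9, 0.5, 0.8])
print(wls_discrepancy(s, sigma, w_half))
```

A bootstrap choice of a, as in the paper's real data example, would repeat this over a grid of a values on resampled data and keep the value that minimizes a chosen performance criterion.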
Du, H., & Bentler, P. M. (in press). Distributionally-weighted least squares in structural equation modeling. Psychological Methods.