A Gibbs Sampling Algorithm with Monotonicity Constraints for Diagnostic Classification Models

Diagnostic classification models (DCMs) are restricted latent class models with a set of cross-class equality constraints and additional monotonicity constraints on their item parameters, both of which are needed to ensure the interpretability of the classes and the model parameters. In this paper, we develop an efficient Gibbs sampling-based Bayesian Markov chain Monte Carlo estimation method for general DCMs with monotonicity constraints. A simulation study evaluating parameter recovery showed that the algorithm estimates the model parameters accurately. Moreover, the proposed algorithm was compared with a previously developed Gibbs sampling algorithm that imposes monotonicity constraints only on the main-effect item parameters of the log-linear cognitive diagnosis model; the new algorithm showed less bias and faster convergence. Finally, the proposed algorithm was applied to reading assessment data from the 2000 Programme for International Student Assessment.

Yamaguchi, K., & Templin, J. (2022). A Gibbs sampling algorithm with monotonicity constraints for diagnostic classification models. Journal of Classification, 39, 24-54. https://doi.org/10.1007/s00357-021-09392-7
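
For intuition only, the sketch below is not the paper's general-DCM sampler; it illustrates how monotonicity can be enforced inside a Gibbs sampler for the simpler DINA special case, where the constraint reduces to keeping each item's guessing parameter below one minus its slipping parameter. Under conjugate Beta priors, the item-parameter full conditionals are then Beta distributions truncated to the constrained region, sampled here by inverse-CDF sampling. All priors, hyperparameters, and function names are illustrative assumptions, not details taken from the paper.

```python
# Illustrative sketch only: Gibbs sampling for an assumed DINA special case,
# with the monotonicity constraint g_j < 1 - s_j enforced via truncated Betas.
import numpy as np
from itertools import product
from scipy.stats import beta as beta_dist

rng = np.random.default_rng(0)

def rtrunc_beta(a, b, lo, hi):
    """Draw from Beta(a, b) truncated to (lo, hi) by inverse-CDF sampling."""
    u = rng.uniform(beta_dist.cdf(lo, a, b), beta_dist.cdf(hi, a, b))
    return float(np.clip(beta_dist.ppf(u, a, b), 1e-9, 1.0 - 1e-9))

def gibbs_dina(X, Q, n_iter=2000):
    """Gibbs sampler for a DINA model; X is N x J binary data, Q is a J x K Q-matrix."""
    N, J = X.shape
    K = Q.shape[1]
    profiles = np.array(list(product([0, 1], repeat=K)))        # all 2^K attribute profiles
    C = profiles.shape[0]
    eta = (profiles @ Q.T == Q.sum(axis=1)).astype(int)         # C x J: profile masters item?
    g, s = np.full(J, 0.2), np.full(J, 0.2)                     # guessing / slipping parameters
    pi = np.full(C, 1.0 / C)                                    # class proportions
    g_draws, s_draws = [], []
    for it in range(n_iter):
        # 1) Latent class of each respondent: categorical full conditional.
        p = np.where(eta == 1, 1.0 - s, g)                      # C x J correct-response probs
        loglik = X @ np.log(p).T + (1 - X) @ np.log(1.0 - p).T  # N x C
        w = np.exp(loglik - loglik.max(axis=1, keepdims=True)) * pi
        w /= w.sum(axis=1, keepdims=True)
        cls = np.array([rng.choice(C, p=w[i]) for i in range(N)])
        # 2) Class proportions: Dirichlet full conditional (flat Dirichlet(1) prior assumed).
        pi = rng.dirichlet(1.0 + np.bincount(cls, minlength=C))
        # 3) Item parameters: Beta(1, 1)-prior full conditionals truncated so g_j < 1 - s_j.
        eta_i = eta[cls]                                        # N x J mastery indicators
        for j in range(J):
            x0 = X[eta_i[:, j] == 0, j]                         # responses of non-masters
            x1 = X[eta_i[:, j] == 1, j]                         # responses of masters
            g[j] = rtrunc_beta(1 + x0.sum(), 1 + (1 - x0).sum(), 0.0, 1.0 - s[j])
            s[j] = rtrunc_beta(1 + (1 - x1).sum(), 1 + x1.sum(), 0.0, 1.0 - g[j])
        if it >= n_iter // 2:                                   # keep post burn-in draws
            g_draws.append(g.copy()); s_draws.append(s.copy())
    return np.mean(g_draws, axis=0), np.mean(s_draws, axis=0)
```

Because each guessing draw is truncated at 1 - s_j and each slipping draw at 1 - g_j, every retained state of the chain respects the monotonicity constraint, i.e., masters always have a higher correct-response probability than non-masters. This is the property that, per the abstract, the paper's sampler guarantees for general DCMs rather than only this two-parameter special case.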