
Evaluating Manifest Monotonicity Using Bayes Factors

Published online by Cambridge University Press:  01 January 2025

Jesper Tijmstra*
Affiliation:
Tilburg University; Cito, National Institute for Educational Measurement
Herbert Hoijtink
Affiliation:
Cito, National Institute for Educational Measurement; Utrecht University
Klaas Sijtsma
Affiliation:
Tilburg University
*Correspondence should be made to Jesper Tijmstra, Department of Methodology and Statistics, Faculty of Social and Behavioral Sciences, Tilburg University, PO Box 90153, 5000 LE Tilburg, The Netherlands. Email: j.tijmstra@uvt.nl

Abstract

The assumption of latent monotonicity in item response theory models for dichotomous data cannot be evaluated directly, but observable consequences such as manifest monotonicity facilitate the assessment of latent monotonicity in real data. Standard methods for evaluating manifest monotonicity typically produce a test statistic that is geared toward falsification, which can only provide indirect support in favor of manifest monotonicity. We propose the use of Bayes factors to quantify the degree of support available in the data in favor of manifest monotonicity or against manifest monotonicity. Through the use of informative hypotheses, this procedure can also be used to determine the support for manifest monotonicity over substantively or statistically relevant alternatives to manifest monotonicity, rendering the procedure highly flexible. The performance of the procedure is evaluated using a simulation study, and the application of the procedure is illustrated using empirical data.

Type
Original Paper
Creative Commons
This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
Copyright
Copyright © 2015 The Psychometric Society

1. Introduction

In item response theory (IRT) for dichotomously scored items, the assumption of latent monotonicity is shared by most parametric and nonparametric models. This assumption states that the probability of observing a positive response to an item is monotonically nondecreasing as a function of the latent variable, and plays an important role in obtaining the monotone likelihood-ratio property of the total score (Grayson, 1988; Hemker, Sijtsma, Molenaar, & Junker, 1997). The monotone likelihood-ratio property implies that the total score stochastically orders respondents on the latent variable, and this ordinal level of measurement is crucial to most applications of IRT. Latent monotonicity also captures the idea that the items in a test measure the latent variable (Junker & Sijtsma, 2000). For these reasons, investigating whether the assumption of latent monotonicity holds is important and relevant for many applications of IRT.

Because the latent variable is unobservable, latent monotonicity can only be evaluated indirectly, by considering observable consequences of the assumption. Given the assumption of local independence, latent monotonicity implies monotonicity over a variety of manifest scores, such as a single item score (Mokken, 1971), the unweighted restscore (Rosenbaum, 1984; Junker & Sijtsma, 2000), and any other sum score that does not include the item under consideration. By testing whether monotonicity holds at the manifest level (manifest monotonicity, for short), one can, given the assumption of local independence, investigate whether latent monotonicity is violated. Tijmstra, Hessen, Van der Heijden, and Sijtsma (2013) showed how the property of manifest monotonicity can be evaluated for a variety of manifest scores using order-constrained statistical inference, resulting in a likelihood-ratio test that determines whether there is sufficient evidence to reject monotonicity for the manifest score. A violation of manifest monotonicity implies a violation of latent monotonicity; hence, a significant test statistic results in the rejection of latent monotonicity. Alternative methods for investigating latent monotonicity exist that use a manifest score (see, e.g., Rosenbaum, 1984) or the set of observed item-score patterns (Scheiblechner, 2003). Other nonparametric approaches estimate the item response function (IRF), making use of binning (Molenaar & Sijtsma, 2000), kernel smoothing (Ramsay, 1991), or spline-fitting (Abrahamowicz & Ramsay, 1992). These methods use local statistical tests and confidence bands to assess manifest monotonicity.

The aforementioned approaches have in common that they use a null hypothesis that specifies a boundary case of manifest monotonicity, also known as the 'least favorable null hypothesis' (Silvapulle & Sen, 2005). This null hypothesis is tested against the alternative hypothesis that manifest monotonicity does not hold. The specific form of this null hypothesis differs across approaches, but all use the boundary case where there is no association between the item scores and hence where the item-response probabilities are unrelated to the manifest score. The rationale behind using this hypothesis is that it considers the boundary of the part of the parameter space that corresponds to manifest monotonicity: if the null hypothesis cannot be rejected for those parameter values, the data are consistent with at least one point in the parameter space that corresponds to manifest monotonicity. However, since in test construction items are usually designed to measure one common attribute, this null hypothesis is highly implausible in most practical settings.

Although these approaches are theoretically sound, by using the least favorable null hypothesis they may have suboptimal power to detect violations of manifest monotonicity. That is, in controlling the Type I error rate (ensuring that it does not exceed the specified significance level, and that latent monotonicity is not rejected if there is at least one point in the parameter subspace with which the data are consistent), these approaches may err on the conservative side and inflate the Type II error rate: they may fail to accumulate enough evidence to correctly reject latent monotonicity. Failing to detect violations of latent monotonicity could lead to using an IRT model whose estimates cannot be trusted. Arguably, this could be worse than incorrectly concluding that latent monotonicity does not hold and not applying an IRT model. Thus, it is important that a test for latent monotonicity has sufficient power to detect violations.

Furthermore, the approaches discussed so far use the null hypothesis testing framework and aim at falsification. That is, the tests attempt to provide a ‘critical test’ for the model assumption to see whether the assumption is able to ‘survive’ this test. However, failing to reject an assumption does not imply that it actually holds, since a Type II error could have been made. Since model assumptions have to hold for the model to be valid, simply noting that the assumption has failed to be rejected does not suffice as justification for applying the model. A power analysis may help to some extent to indirectly assess the amount of support that the model assumption receives when it fails to be rejected. However, one could argue that a more direct way of assessing support in favor of the model assumption is needed if a decision needs to be made whether using the model would be justifiable. The discussed frequentist approaches do not provide this kind of confirmatory support.

It is with these goals of increasing the power and directly assessing the support in favor of monotonicity in mind that we will pursue a Bayesian approach to evaluating latent monotonicity. Many different Bayesian model comparison approaches are available (e.g., see Gelman, Carlin, Stern, & Rubin, 2004), but of special interest here is the approach that focuses on the Bayes factor (see Hoijtink, 2012; Kass & Raftery, 1995). Using this approach, different hypotheses may be compared without assigning special status to one of the hypotheses by labeling it as a 'null hypothesis.' Rather than attempting to reject this null hypothesis, one investigates which hypothesis receives the most support from the data. Also, rather than resulting in a dichotomous outcome to reject or retain the assumption of latent or manifest monotonicity, an approach that uses the Bayes factor quantifies the degree of support each hypothesis receives from the data. This approach provides researchers with more information about the plausibility of the different hypotheses and enables them to make an informed decision about the credibility of the assumption of latent monotonicity. Furthermore, a Bayes factor approach allows for more than just contrasting the hypothesis of manifest monotonicity with the general hypothesis that manifest monotonicity does not hold (Tijmstra et al., 2013). Rather, a wide variety of hypotheses that are relevant in the context of monotonicity can be compared, allowing for finer nuances than just accepting or rejecting monotonicity.

This article proposes a Bayesian approach to evaluating manifest monotonicity for dichotomous item scores, in line with the Bayesian informative hypothesis testing framework discussed by Hoijtink (2012). First, several hypotheses that are relevant for latent monotonicity are discussed. Second, following Hoijtink (2012), we discuss how Bayes factors can be used to evaluate informative hypotheses, and we propose a procedure for estimating the relevant Bayes factors using Gibbs sampling. Third, we discuss a simulation study in which the performance of the procedure is evaluated under varying conditions and compared to a null hypothesis testing procedure that evaluates the same hypotheses (Tijmstra et al., 2013). Fourth, we discuss an empirical example of the application of the proposed procedure. The article concludes with a discussion.

2. Relevant Competing Hypotheses

For a test containing $k$ dichotomous items, let $X_{i}$ denote the score on item $i$, with realization $x_{i} = 0, 1$ for a negative and positive score, respectively. Let $\theta$ denote the latent variable. Latent monotonicity specifies that the IRF, denoted by $P(X_{i} = 1|\theta)$, is nondecreasing in $\theta$ (Hambleton & Swaminathan, 1985). The manifest score, denoted by $Y$ and with realization $y$, is defined (Tijmstra et al., 2013) as

(1) $Y = \sum_{i=1}^{k} c_{i}X_{i},$

where $c_{1}, \ldots, c_{k}$ are binary item inclusion coefficients that are chosen by the researcher. For example, by choosing $c_{j} = 0$ and $c_{i} = 1$ for all $i \ne j$, one obtains the unweighted restscore for item $j$. Because including item $j$ itself in the manifest score may confound the results (Junker & Sijtsma, 2000), one may consider using the unweighted restscore instead of the total score. Although other manifest scores could be considered, the restscore is a more reliable ordinal estimator of the latent variable than a manifest score based on fewer items, provided the items included in the restscore are of good quality. The proposed procedure can be applied regardless of the specific choice of the manifest score.
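As a concrete illustration, the manifest score of Equation 1 can be computed directly from a matrix of dichotomous item scores. The following is a minimal sketch; the function names and the small example matrix are hypothetical.

```python
import numpy as np

def manifest_score(X, c):
    """Equation 1: Y = sum_i c_i * X_i for each respondent.

    X : (n_respondents, k) array of 0/1 item scores.
    c : (k,) array of binary item inclusion coefficients.
    """
    return X @ np.asarray(c)

def restscore(X, j):
    """Unweighted restscore for item j: c_j = 0 and c_i = 1 for all i != j."""
    c = np.ones(X.shape[1], dtype=int)
    c[j] = 0
    return manifest_score(X, c)

# Three respondents, four items (hypothetical data).
X = np.array([[1, 0, 1, 1],
              [0, 0, 1, 0],
              [1, 1, 1, 1]])
print(restscore(X, j=0))  # restscores based on items 2-4: [2 1 3]
```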

Let $h$ denote the highest possible value of manifest score $Y$, obtained as $h = \sum_{i=1}^{k} c_{i}$. Furthermore, let $\pi_{y} = P(X = 1|Y = y)$ for the item that is investigated, where subscript $j$ is dropped for notational convenience. The hypothesis that manifest monotonicity over $Y$ holds for a specific item corresponds to

$H_{\mathrm{MM}}: \pi_{0} \le \cdots \le \pi_{y} \le \cdots \le \pi_{h}.$

$H_{\mathrm{MM}}$ corresponds to the null hypothesis in the order-constrained statistical inference framework discussed by Tijmstra et al. (2013), and can be contrasted with its negation, which is the hypothesis that there are manifest nonmonotonicities:

$H_{\mathrm{NM}}: \pi_{y} > \pi_{y+1}$, for at least one value of $y$.

Because these hypotheses are mutually exclusive and exhaustive, evaluating manifest monotonicity effectively boils down to choosing between $H_{\mathrm{MM}}$ and $H_{\mathrm{NM}}$. However, $H_{\mathrm{NM}}$ is quite general, and hence not very informative. That is, if one accepts $H_{\mathrm{NM}}$, then little can be said about the ordering of the conditional item probabilities $\pi_{0}, \ldots, \pi_{h}$, other than that their ordering is not completely monotone. Following the terminology of Hoijtink (2012), $H_{\mathrm{NM}}$ has a high complexity, or similarly, $H_{\mathrm{NM}}$ is relatively unspecific or uninformative.
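Because the two hypotheses are mutually exclusive and exhaustive, whether a given vector $(\pi_{0}, \ldots, \pi_{h})$ falls under $H_{\mathrm{MM}}$ or $H_{\mathrm{NM}}$ reduces to a single nondecreasingness check. A sketch with illustrative numbers:

```python
import numpy as np

def satisfies_mm(pi):
    """True if pi_0 <= pi_1 <= ... <= pi_h (H_MM); False otherwise (H_NM)."""
    return bool(np.all(np.diff(np.asarray(pi, dtype=float)) >= 0))

print(satisfies_mm([0.2, 0.4, 0.4, 0.7]))  # True: nondecreasing
print(satisfies_mm([0.2, 0.5, 0.4, 0.7]))  # False: pi_1 > pi_2
```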

In practical applications, it may be important to know to what extent manifest monotonicity holds, that is, the extent to which the ordering of the conditional item probabilities is similar to the ordering specified by manifest monotonicity. Items for which the two orderings are almost the same could be considered essentially monotone, and might still be of practical use. For example, one could define essential monotonicity as a less restrictive version of manifest monotonicity, allowing for local violations of manifest monotonicity ($\pi_{y} > \pi_{y+1}$ for some $y$) as long as these violations occur only between adjacent values of $Y$. If one considers including such essentially monotone items in a test, one should carefully consider whether this does not threaten the stochastic ordering of persons. The extent to which the stochastic ordering of persons based on the total score is robust against the inclusion of not fully monotone items has not been studied extensively (but see Van der Ark, 2005); in case the scale is robust against these kinds of violations, essentially monotone items could provide a useful addition to a test. Hence, finding out whether items are strictly monotone, essentially monotone, or nonmonotone can be of interest to, for example, test constructors.

The hypothesis that a form of ‘essential monotonicity’ holds for a specific item may be formulated as

$H_{\mathrm{EM}}:$
$\pi_{0} \le \min\{\pi_{2}, \pi_{3}\},$
$\pi_{1} \le \min\{\pi_{3}, \pi_{4}\},$
$\vdots$
$\pi_{h-3} \le \min\{\pi_{h-1}, \pi_{h}\},$
$\pi_{h-2} \le \pi_{h}.$

In this formulation, essential monotonicity is violated as soon as $\pi_{y} > \pi_{y+d}$ for some $y$ and some $d \in \{2, \ldots, h-y\}$. More liberal versions of essential monotonicity can be obtained by letting $d \in \{e, \ldots, h-y\}$, where $e > 2$. The larger the value chosen for $e$, the less restrictive and the less informative $H_{\mathrm{EM}}$ becomes, up to the point where $H_{\mathrm{EM}}$ hardly captures monotonicity anymore.
In addition to its potential substantive relevance, investigating essential monotonicity helps to increase the power to detect small violations of manifest monotonicity. This potential increase in power is due to $H_{\mathrm{EM}}$ placing more restrictions on the conditional item probabilities than $H_{\mathrm{NM}}$; hence, $H_{\mathrm{EM}}$ is more specific.
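Because the constraints of $H_{\mathrm{EM}}$ amount to $\pi_{y} \le \pi_{y+d}$ for every $d \ge e$ (with $e = 2$ as the default), they can be checked exhaustively. The following sketch, with illustrative names and numbers, flags a vector that is essentially but not strictly monotone:

```python
import numpy as np

def satisfies_em(pi, e=2):
    """True if pi_y <= pi_{y+d} for all y and all d in {e, ..., h - y} (H_EM)."""
    pi = np.asarray(pi, dtype=float)
    h = len(pi) - 1
    return all(pi[y] <= pi[y + d]
               for y in range(h) for d in range(e, h - y + 1))

pi = [0.3, 0.2, 0.4, 0.5]                  # one adjacent dip: pi_0 > pi_1
print(satisfies_em(pi))                    # True: the violation is only local
print(satisfies_em([0.5, 0.2, 0.3, 0.4]))  # False: pi_0 > pi_2 and pi_0 > pi_3
```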

Another interesting alternative to $H_{\mathrm{MM}}$ is the postulation of a ceiling or a floor effect, formulated in $H_{\mathrm{C}}$ and $H_{\mathrm{F}}$ as, respectively:

$H_{\mathrm{C}}: \pi_{0} \le \cdots \le \pi_{c};\ \{\pi_{c+1}, \ldots, \pi_{h}\}$, and
$H_{\mathrm{F}}: \{\pi_{0}, \ldots, \pi_{f-1}\};\ \pi_{f} \le \cdots \le \pi_{h},$

where $c$ denotes the 'ceiling-value' and $f$ the 'floor-value' of the manifest score. Both $H_{\mathrm{C}}$ and $H_{\mathrm{F}}$ leave the ordering of some of the conditional item probabilities open, thus allowing for nonmonotonicities above ($H_{\mathrm{C}}$) or below ($H_{\mathrm{F}}$) a particular value of $Y$. This weaker form of monotonicity may be of interest for selection or testing purposes, for example, when the main goal of a test is to distinguish respondents on either the low or the high end of the distribution but not necessarily across the entire scale. In addition, the hypotheses may be useful in the context of exam items, where the probability of providing the desired answer may decrease for examinees at the high end of the scale, or in the context of multiple-choice items, where some distractors may fail for low-ability examinees.
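Since $H_{\mathrm{C}}$ and $H_{\mathrm{F}}$ only constrain the probabilities up to the ceiling-value $c$ or from the floor-value $f$ onward, they too can be checked directly. A sketch with hypothetical values:

```python
import numpy as np

def satisfies_ceiling(pi, c):
    """H_C: pi_0 <= ... <= pi_c; pi_{c+1}, ..., pi_h unconstrained."""
    return bool(np.all(np.diff(np.asarray(pi[:c + 1], dtype=float)) >= 0))

def satisfies_floor(pi, f):
    """H_F: pi_f <= ... <= pi_h; pi_0, ..., pi_{f-1} unconstrained."""
    return bool(np.all(np.diff(np.asarray(pi[f:], dtype=float)) >= 0))

pi = [0.1, 0.3, 0.6, 0.5]          # nonmonotonicity only at the high end
print(satisfies_ceiling(pi, c=2))  # True: pi_0 <= pi_1 <= pi_2
print(satisfies_floor(pi, f=2))    # False: pi_2 > pi_3
```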

Like $H_{\mathrm{EM}}$, $H_{\mathrm{C}}$ and $H_{\mathrm{F}}$ are more restrictive than $H_{\mathrm{NM}}$, which could result in increased power to detect these specific violations of monotonicity; focusing on such deviations could also have substantive relevance in some applications of IRT. The section dealing with the empirical example illustrates the value of considering such informative alternative hypotheses in addition to considering $H_{\mathrm{NM}}$. In order to be able to evaluate these hypotheses, we first discuss the use of Bayes factors.

3. Bayes Factors

The relative support for either of two competing hypotheses can be quantified using the Bayes factor (Jeffreys, 1961; Kass & Raftery, 1995). The Bayes factor balances the fit of the different hypotheses against their complexity. To determine the fit and the complexity of a hypothesis $H_{\mathrm{Z}}$ imposing order constraints on $\pi_{0}, \ldots, \pi_{h}$, a prior distribution of $\boldsymbol{\pi} = (\pi_{0}, \ldots, \pi_{h})$ needs to be specified, and the posterior distribution of $\boldsymbol{\pi}$ after observing the data also needs to be determined.

In order to ensure that every ordering of $\pi_{0}, \ldots, \pi_{h}$ is equally likely a priori (Hoijtink, 2012), one can specify the prior distribution to be

(2) $h(\boldsymbol{\pi}) = \prod_{y=0}^{h} \mathrm{Beta}(\pi_{y}; 1, 1) = 1.$

This prior distribution does not favor any specific ordering of $\pi_0, \ldots, \pi_h$, and for each $\pi_y$ it assigns equal probability density to all values between 0 and 1; hence, it can be considered uninformative (Lynch, 2007). Since under the prior distribution in Equation 2 every ordering is a priori considered equally likely, the complexity of every inequality-constrained hypothesis can in principle be determined analytically (Hoijtink, 2012).

Assuming the scores on the item to be binomially distributed for each value of the manifest score, the likelihood of the data corresponds to

(3) $$f(\mathbf{X}|\varvec{\pi}) = \prod_{y=0}^{h}\pi_y^{s_y}(1 - \pi_y)^{n_y - s_y},$$

where $\mathbf{X}$ denotes the vector containing the scores on the item in question, $n_y$ denotes the number of respondents with manifest score $y$, and $s_y$ denotes the number of respondents with manifest score $y$ for whom $X_j = 1$. The posterior distribution of the conditional item probabilities is proportional to the product of the likelihood and the prior distribution, and corresponds to

(4) $$g(\varvec{\pi}|\mathbf{X}) = \prod_{y=0}^{h}\mathrm{Beta}(\pi_y; s_y + 1, n_y - s_y + 1).$$
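As a concrete illustration of the conjugate update in Equation 4 (using toy counts and illustrative names, not taken from the paper), the posterior only requires the success count $s_y$ and total count $n_y$ at each manifest score:

```python
def posterior_params(s, n):
    """Beta parameters (a_y, b_y) of the posterior of each pi_y,
    following Equation 4: a_y = s_y + 1, b_y = n_y - s_y + 1."""
    return [(s_y + 1, n_y - s_y + 1) for s_y, n_y in zip(s, n)]

# Toy data: s_y successes among n_y respondents at manifest scores y = 0, 1, 2.
params = posterior_params([2, 5, 9], [20, 20, 20])
# Posterior means a/(a + b): 3/22 ≈ 0.14, 6/22 ≈ 0.27, 10/22 ≈ 0.45.
```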

Following the framework proposed by Hoijtink (2012), the complexity $c_\mathrm{Z}$ of a hypothesis $H_\mathrm{Z}$ can be defined as the proportion of the prior distribution of $\varvec{\pi}$ that is in accordance with this hypothesis. Thus, for a hypothesis $H_\mathrm{Z}$,

(5) $$c_\mathrm{Z} = \frac{\int h(\varvec{\pi})\mathcal{I}_{\varvec{\pi}\in\mathcal{H}_\mathrm{Z}}\,d\varvec{\pi}}{\int h(\varvec{\pi})\,d\varvec{\pi}} = \int h(\varvec{\pi})\mathcal{I}_{\varvec{\pi}\in\mathcal{H}_\mathrm{Z}}\,d\varvec{\pi},$$

where $\mathcal{H}_\mathrm{Z}$ denotes the infinite set that contains all vectors $\varvec{\pi}$ for which $H_\mathrm{Z}$ is fulfilled, and where $\mathcal{I}_{\varvec{\pi}\in\mathcal{H}_\mathrm{Z}}$ is an indicator function that equals 1 if $\varvec{\pi} \in \mathcal{H}_\mathrm{Z}$, and 0 otherwise. Thus, the complexity of a hypothesis such as $H_\mathrm{MM}$ corresponds to the probability of obtaining a set of values for $\varvec{\pi}$ that match the constraints specified by $H_\mathrm{MM}$ if we were to randomly draw values from the prior distribution of $\varvec{\pi}$.
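This prior-probability interpretation suggests a simple simulation check of a complexity (an illustrative sketch with hypothetical names, not the paper's procedure): draw $\varvec{\pi}$ repeatedly from the uniform prior and count the draws satisfying the constraints. For $H_\mathrm{MM}$ the analytic value $1/(h+1)!$ is available for comparison.

```python
import math
import random

def complexity_mc(h, satisfies, draws=200_000, seed=1):
    """Monte Carlo estimate of a complexity c_Z: the proportion of draws
    of (pi_0, ..., pi_h) from the uniform prior that satisfy H_Z."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(draws):
        pi = [rng.random() for _ in range(h + 1)]
        if satisfies(pi):
            hits += 1
    return hits / draws

# Manifest monotonicity: pi_0 <= pi_1 <= ... <= pi_h.
def is_mm(pi):
    return all(pi[y] <= pi[y + 1] for y in range(len(pi) - 1))

h = 4
c_mm_hat = complexity_mc(h, is_mm)      # simulation estimate
c_mm_exact = 1 / math.factorial(h + 1)  # analytic value: 1/120
```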

In a similar vein, the posterior fit $f_\mathrm{Z}$ of hypothesis $H_\mathrm{Z}$ to the data can be defined as the proportion of the posterior distribution of $\varvec{\pi}$ that is in accordance with that hypothesis (Hoijtink, 2012), and corresponds to

(6) $$f_\mathrm{Z} = \frac{\int g(\varvec{\pi}|\mathbf{X})\mathcal{I}_{\varvec{\pi}\in\mathcal{H}_\mathrm{Z}}\,d\varvec{\pi}}{\int g(\varvec{\pi}|\mathbf{X})\,d\varvec{\pi}} = \int g(\varvec{\pi}|\mathbf{X})\mathcal{I}_{\varvec{\pi}\in\mathcal{H}_\mathrm{Z}}\,d\varvec{\pi}.$$

By comparing the fit of a hypothesis with its complexity, one can determine the extent to which the data provide evidence in favor of or against the hypothesis. The ratio $\frac{f}{c}$ quantifies how much more likely the hypothesis has become after observing the data, and hence reflects the amount of support that the hypothesis receives from the data (Kass & Raftery, 1995). The Bayes factor comparing two competing hypotheses that specify order constraints for $\varvec{\pi}$ can be calculated by taking the ratio of $\frac{f}{c}$ of the two hypotheses (Hoijtink, 2012). Thus, the Bayes factor does not simply contrast the fit of two hypotheses to the data, but rewards hypotheses that are more specific by taking their complexity into account.

3.1. Bayes Factors and Monotonicity

With regard to manifest monotonicity, the simplest comparison that can be made is between $H_\mathrm{MM}$ and the unconstrained alternative $H_\mathrm{U}: \{\pi_0, \ldots, \pi_h\}$. The corresponding Bayes factor (BF) can be computed by means of

$$BF_\mathrm{MM,U} = \frac{f_\mathrm{MM}/c_\mathrm{MM}}{f_\mathrm{U}/c_\mathrm{U}} = \frac{f_\mathrm{MM}}{c_\mathrm{MM}}.$$

Here, because $H_\mathrm{U}$ does not restrict $\varvec{\pi}$ and hence $f_\mathrm{U} = c_\mathrm{U} = 1$, the ratio $\frac{f_\mathrm{U}}{c_\mathrm{U}}$ drops out of the equation. If $BF_\mathrm{MM,U} > 1$, the data provide support for $H_\mathrm{MM}$, whereas $BF_\mathrm{MM,U} < 1$ indicates that the data do not support the hypothesis of manifest monotonicity.
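Because the posterior in Equation 4 factors into independent Beta distributions, $f_\mathrm{MM}$ can be approximated by direct posterior sampling when it is not too small. The sketch below (toy data and illustrative names; a simplification, not the Gibbs-based procedure described later) combines such an estimate with the analytic $c_\mathrm{MM} = 1/(h+1)!$ to compute $BF_\mathrm{MM,U}$.

```python
import math
import random

def sample_posterior(s, n, rng):
    """One draw of (pi_0, ..., pi_h) from the posterior in Equation 4:
    independent Beta(s_y + 1, n_y - s_y + 1) components."""
    return [rng.betavariate(s_y + 1, n_y - s_y + 1) for s_y, n_y in zip(s, n)]

def bf_mm_u(s, n, draws=100_000, seed=1):
    """BF_MM,U = f_MM / c_MM: f_MM estimated by direct posterior sampling,
    c_MM = 1/(h+1)! computed analytically."""
    rng = random.Random(seed)
    h = len(s) - 1
    hits = 0
    for _ in range(draws):
        pi = sample_posterior(s, n, rng)
        if all(pi[y] <= pi[y + 1] for y in range(h)):
            hits += 1
    f_mm = hits / draws
    c_mm = 1 / math.factorial(h + 1)
    return f_mm / c_mm

# Toy data: s_y successes among n_y = 20 respondents at scores y = 0, ..., 4.
# The observed proportions increase, so the BF should clearly exceed 1.
bf = bf_mm_u([2, 5, 9, 14, 18], [20, 20, 20, 20, 20])
```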

Since $H_\mathrm{U}$ incorporates $H_\mathrm{MM}$, contrasting $H_\mathrm{MM}$ with $H_\mathrm{U}$ is not very informative. In order to evaluate $H_\mathrm{MM}$, this hypothesis should be contrasted with a competing hypothesis. For example, one may contrast $H_\mathrm{MM}$ with its complement $H_\mathrm{NM}$, which posits that the conditional probabilities do not increase monotonically:

$$BF_\mathrm{MM,NM} = \frac{f_\mathrm{MM}c_\mathrm{NM}}{f_\mathrm{NM}c_\mathrm{MM}} = \frac{f_\mathrm{MM}\left(1 - c_\mathrm{MM}\right)}{\left(1 - f_\mathrm{MM}\right)c_\mathrm{MM}}.$$

Thus, $BF_\mathrm{MM,NM}$ quantifies the amount of support that $H_\mathrm{MM}$ receives from the data when contrasted with its complement. The comparison of $H_\mathrm{MM}$ and $H_\mathrm{NM}$ provides useful information about the general support for the hypothesis that the conditional item probabilities are ordered in accordance with manifest monotonicity.
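Computationally, $BF_\mathrm{MM,NM}$ requires nothing beyond the fit and complexity of $H_\mathrm{MM}$ itself. A minimal sketch (the numeric values are made up for illustration):

```python
def bf_mm_nm(f_mm, c_mm):
    """Bayes factor of H_MM against its complement H_NM,
    computed from the fit and complexity of H_MM only."""
    return (f_mm * (1 - c_mm)) / ((1 - f_mm) * c_mm)

# With, say, f_MM = 0.7 and c_MM = 1/120 (h = 4), the data strongly favor H_MM:
bf = bf_mm_nm(0.7, 1 / 120)  # ≈ 277.7
```

When the fit equals the complexity, the data have not shifted the balance at all and the Bayes factor is exactly 1.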

By only considering a subset of the orderings that $H_\mathrm{NM}$ allows, manifest monotonicity can be contrasted with more specific alternatives. If realistic alternative hypotheses are selected, the power to detect violations of manifest monotonicity may increase, since these alternatives may receive more support from the data than the uninformative $H_\mathrm{NM}$. For example, one may consider contrasting $H_\mathrm{MM}$ with $H_\mathrm{EM}$, thereby excluding all orderings that deviate strongly from monotonicity. Considering $H_\mathrm{EM}$ can be particularly useful when much is known about a test and possible deviations from monotonicity are expected to be modest. In order to construct hypotheses that are mutually exclusive, one can define $H_\mathrm{EM'}$ as $H_\mathrm{EM}$ with the constraint that $H_\mathrm{MM}$ does not hold. For this comparison, one obtains

$$BF_\mathrm{MM,EM'} = \frac{f_\mathrm{MM}c_\mathrm{EM'}}{f_\mathrm{EM'}c_\mathrm{MM}} = \frac{f_\mathrm{MM}\left(c_\mathrm{EM} - c_\mathrm{MM}\right)}{\left(f_\mathrm{EM} - f_\mathrm{MM}\right)c_\mathrm{MM}}.$$
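Computationally, this Bayes factor again reduces to the fits and complexities of the two non-primed hypotheses. A small sketch with hypothetical values:

```python
def bf_mm_em_prime(f_mm, c_mm, f_em, c_em):
    """Bayes factor of H_MM against H_EM', expressed in the fit and
    complexity of H_MM and H_EM (H_EM' excludes the H_MM orderings)."""
    return (f_mm * (c_em - c_mm)) / ((f_em - f_mm) * c_mm)

# Hypothetical values for h = 4: c_MM = 1/120 and c_EM = 8/120.
bf = bf_mm_em_prime(0.6, 1 / 120, 0.9, 8 / 120)  # ≈ 14
```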

Similarly, one can contrast $H_\mathrm{MM}$ with $H_\mathrm{C'}$ or $H_\mathrm{F'}$, where $H_\mathrm{C'}$ and $H_\mathrm{F'}$ are obtained from $H_\mathrm{C}$ and $H_\mathrm{F}$ by adding the constraint that $H_\mathrm{MM}$ does not hold. The Bayes factors $BF_\mathrm{MM,C'}$ and $BF_\mathrm{MM,F'}$ indicate whether there is reason to suspect that monotonicity is violated at the high end or the low end of the manifest scale, respectively.

3.2. Estimating the Bayes Factors

The estimation of the Bayes factor requires one to obtain the fit and the complexity of the two hypotheses of interest. Under the uninformative prior distribution of $\varvec{\pi}$ in Equation 2 (and under any exchangeable prior), each ordering of the conditional item probabilities is equally likely, and the complexity of any hypothesis $H_\mathrm{Z}$ about the ordering of these conditional item probabilities can be obtained by means of

$$c_{\mathrm{Z},h} = \frac{O_{\mathrm{Z},h}}{(h+1)!},$$

where $O_{\mathrm{Z},h}$ denotes the number of possible orderings of the conditional item probabilities that are allowed by $H_\mathrm{Z}$, given that the highest possible value on the manifest score equals $h$.

Thus, it follows that $O_{\mathrm{MM},h} = 1$, $O_{\mathrm{C},h} = \frac{(h+1)!}{(h+1-c)!}$, and $O_{\mathrm{F},h} = \frac{(h+1)!}{(f+1)!}$. The number of orderings that essential monotonicity allows is a number from the Fibonacci sequence. That is, $O_{\mathrm{EM},h} = \mathrm{Fib}_{h+3}$, where $\mathbf{Fib} = \{0, 1, 1, 2, 3, 5, 8, 13, \ldots\}$. Because the constraints in $H_\mathrm{EM}$ specify that conditional probabilities two score units apart cannot decrease, increasing $h$ by 1 increases the number of acceptable orderings by $O_{\mathrm{EM},h-1}$. That is, when $h$ increases by 1 (i.e., an item is added to the test), the highest possible manifest score becomes $h + 1$, and two types of orderings are allowed by $H_\mathrm{EM}$: orderings where $\pi_h \le \pi_{h+1}$, of which there are $O_{\mathrm{EM},h}$ in total, and orderings where $\pi_{h-1} \le \pi_{h+1} < \pi_h$, of which there are $O_{\mathrm{EM},h-1}$. Thus, for any $h > 0$, $O_{\mathrm{EM},h+1} = O_{\mathrm{EM},h} + O_{\mathrm{EM},h-1}$, resulting in the Fibonacci sequence.
The numbers of orderings allowed by $H_\mathrm{EM'}$, $H_\mathrm{C'}$, and $H_\mathrm{F'}$ can be obtained by subtracting 1 (the single ordering allowed by $H_\mathrm{MM}$) from $O_{\mathrm{EM},h}$, $O_{\mathrm{C},h}$, and $O_{\mathrm{F},h}$, respectively, from which the corresponding complexities follow.
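The Fibonacci count is easy to generate directly from the recurrence. The sketch below assumes the base cases $O_{\mathrm{EM},0} = 1$ and $O_{\mathrm{EM},1} = 2$ (a single probability has one ordering; with two probabilities, no pair lies two score units apart, so both orderings are allowed), and divides by $(h+1)!$ to obtain the complexity:

```python
import math

def orderings_em(h):
    """O_EM,h via the recurrence O_EM,h+1 = O_EM,h + O_EM,h-1,
    starting from the assumed base cases O_EM,0 = 1 and O_EM,1 = 2."""
    a, b = 1, 2
    for _ in range(h):
        a, b = b, a + b
    return a

def complexity(o_z_h, h):
    """c_Z,h = O_Z,h / (h + 1)! under the uniform prior."""
    return o_z_h / math.factorial(h + 1)

# For h = 4: essential monotonicity allows O_EM,4 = 8 of the 120 orderings,
# so c_EM = 8/120 = 1/15.
c_em = complexity(orderings_em(4), 4)
```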

Analytically determining the fit of the hypotheses is not straightforward. Instead of evaluating the integral in Equation 6 exactly, a Gibbs sampling procedure can be used to approximate the proportion of the posterior distribution that falls within the specified part of the parameter space. Because this procedure repeatedly samples values of $\boldsymbol{\pi}$ from the posterior distribution, the posterior can be approximated to any desired degree of precision, making it possible to approximate the value of $f_{\mathrm{Z}}$ for any $H_{\mathrm{Z}}$.
However, since $f_{\mathrm{Z}}$ may be extremely small for large values of $h$, estimating $f_{\mathrm{Z}}$ simply by counting the proportion of draws from the posterior distribution of $\boldsymbol{\pi}$ that are in accordance with the constraints specified in $H_{\mathrm{Z}}$ does not necessarily produce an accurate estimate, unless one evaluates an excessively large number of draws.

A computationally less demanding approach is to evaluate the individual constraints specified in $H_{\mathrm{Z}}$ sequentially. This can be done by decomposing the Bayes factor of a hypothesis $H_{\mathrm{Z}}$ with $w$ constraints against $H_{\mathrm{U}}$ into $w$ Bayes factors (Mulder et al., 2009) as

(7)
$$\begin{aligned} BF_{\mathrm{Z,U}} &= BF_{1,\mathrm{U}} \times BF_{2,1} \times \cdots \times BF_{v,v-1} \times \cdots \times BF_{w,w-1}\\ &= \frac{f_{1|\mathrm{U}}}{c_{1|\mathrm{U}}} \times \frac{f_{2|1}}{c_{2|1}} \times \cdots \times \frac{f_{v|v-1}}{c_{v|v-1}} \times \cdots \times \frac{f_{w|w-1}}{c_{w|w-1}}\\ &= \frac{f_{1|\mathrm{U}} \times f_{2|1} \times \cdots \times f_{v|v-1} \times \cdots \times f_{w|w-1}}{c_{\mathrm{Z}}}. \end{aligned}$$

Here, $BF_{1,\mathrm{U}}$ is the Bayes factor comparing the hypothesis that the first order constraint holds ($H_{1}$) with the unconstrained hypothesis ($H_{\mathrm{U}}$), and $BF_{v,v-1}$ is the Bayes factor comparing the hypothesis that the first $v$ order constraints hold ($H_{v}$) with the hypothesis that the first $v-1$ constraints hold ($H_{v-1}$). Furthermore, $f_{v|v-1}$ is the fit of $H_{v}$ conditional on the assumption that $H_{v-1}$ holds. For each hypothesis $H_{v}$, this conditional fit measure $f_{v|v-1}$ can be estimated using a Gibbs sampling procedure (see, e.g., Geman & Geman, 1984) that draws values from the joint posterior distribution of $\boldsymbol{\pi}$ under the $v-1$ constraints of $H_{v-1}$, that is,

(8)
$$g\big(\boldsymbol{\pi} \mid \mathbf{X};\, \boldsymbol{\pi} \in \mathcal{H}_{v-1}\big) \propto \prod_{y=0}^{h} \mathrm{Beta}\big(\pi_{y};\, s_{y}+1,\, n_{y}-s_{y}+1\big)\, \mathcal{I}_{\boldsymbol{\pi} \in \mathcal{H}_{v-1}}.$$

To sample from this multivariate distribution, in each iteration of the Gibbs sampler we sample in turn from the full conditional posterior distribution of each $\pi_{y}$, given the current values of all other parameters. Equation 8 implies that the full conditional posterior distribution of each $\pi_{y}$ is a truncated beta distribution if $\pi_{y}$ is constrained by $H_{v-1}$, and a regular beta distribution otherwise.
After allowing for a burn-in period (e.g., discarding the first 5000 draws), these draws provide an approximation of the joint posterior distribution $g(\boldsymbol{\pi} \mid \mathbf{X}; \boldsymbol{\pi} \in \mathcal{H}_{v-1})$ that can be used to estimate $f_{v|v-1}$ (e.g., using 10,000 draws). By sequentially applying this Gibbs sampler to estimate $f_{1|\mathrm{U}}, \ldots, f_{w|w-1}$, one can approximate $f_{\mathrm{Z}}$. This procedure enables the approximation of the fit of any hypothesis that imposes order constraints on $\boldsymbol{\pi}$.
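To make the sequential procedure concrete, the following Python sketch estimates the conditional fits $f_{v|v-1}$ for the constraints of $H_{\mathrm{MM}}$ ($\pi_{0} \le \pi_{1} \le \cdots \le \pi_{h}$) and combines them as in Equation 7. The counts `s` and `n` are hypothetical, the rejection-based truncated-beta sampler is a simplification chosen for brevity, and the prior mass $c_{\mathrm{MM}} = 1/(h+1)!$ relies on the fact that, under independent uniform priors, every ordering of the $\pi_{y}$ is a priori equally likely.

```python
import math

import numpy as np

rng = np.random.default_rng(1)

def sample_trunc_beta(a, b, lo, hi, rng, max_tries=10_000):
    # Rejection sampling from Beta(a, b) restricted to (lo, hi); adequate
    # for a sketch, but inefficient when the interval has little mass.
    for _ in range(max_tries):
        x = rng.beta(a, b)
        if lo < x < hi:
            return x
    return 0.5 * (lo + hi)  # crude fallback under extreme truncation

def conditional_fit(s, n, v, n_burn=500, n_iter=2000, rng=rng):
    """Estimate f_{v|v-1}: the posterior probability that constraint v
    (pi_{v-1} <= pi_v) holds, given the first v - 1 constraints."""
    H = len(s)
    pi = np.linspace(0.1, 0.9, H)  # starting values satisfying H_{v-1}
    hits = 0
    for it in range(n_burn + n_iter):
        for y in range(H):
            # Truncation bounds implied by the constraints of H_{v-1}
            lo = pi[y - 1] if 1 <= y <= v - 1 else 0.0
            hi = pi[y + 1] if y + 1 <= v - 1 else 1.0
            pi[y] = sample_trunc_beta(s[y] + 1, n[y] - s[y] + 1, lo, hi, rng)
        if it >= n_burn and pi[v - 1] <= pi[v]:
            hits += 1
    return hits / n_iter

# Hypothetical counts: s_y successes out of n_y respondents in score group y
s, n = [3, 10, 17], [20, 20, 20]
f = [conditional_fit(s, n, v) for v in range(1, len(s))]
f_MM = math.prod(f)                # numerator of Equation 7
c_MM = 1 / math.factorial(len(s))  # prior mass of the full ordering
BF_MM_U = f_MM / c_MM
BF_MM_NM = BF_MM_U * (1 - c_MM) / (1 - f_MM) if f_MM < 1.0 else float("inf")
```

Because the counts increase with the score group, the estimated $f_{\mathrm{MM}}$ is close to 1 here and the resulting Bayes factor strongly favors $H_{\mathrm{MM}}$ over its complement.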

3.3. Using the Bayes Factor

The Bayes factor can be obtained for any pair of order-constrained hypotheses about the conditional item probabilities. The procedure we discussed has been implemented as a function in R (R Core Team, 2014) that can be used to evaluate manifest monotonicity by contrasting $H_{\mathrm{MM}}$ with $H_{\mathrm{NM}}$ as well as with $H_{\mathrm{EM}'}$. The test function is available on request from the first author.

Kass and Raftery (1995) provide general guidelines for the interpretation of Bayes factors (see also Jeffreys, 1961): if $\frac{1}{3} < BF < 3$, there is little support for either hypothesis; if $3 \le BF < 20$ or $\frac{1}{20} < BF \le \frac{1}{3}$, there is some support in favor of the first hypothesis or the second hypothesis, respectively; and if $BF \ge 20$ or $BF \le \frac{1}{20}$, there is strong support in favor of the first hypothesis or the second hypothesis, respectively.
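These interpretation cutoffs can be collected in a small helper function; a minimal sketch, where `bf` is the Bayes factor of the first hypothesis against the second:

```python
def interpret_bf(bf):
    """Qualitative interpretation of a Bayes factor, following the
    cutoffs 1/20, 1/3, 3, and 20 from the guidelines above."""
    if bf >= 20:
        return "strong support for hypothesis 1"
    if bf >= 3:
        return "some support for hypothesis 1"
    if bf > 1 / 3:
        return "little support for either hypothesis"
    if bf > 1 / 20:
        return "some support for hypothesis 2"
    return "strong support for hypothesis 2"
```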

One might consider accepting latent monotonicity, and hence retaining the item under evaluation, only if there is strong support for $H_{\mathrm{MM}}$ over $H_{\mathrm{NM}}$ ($BF_{\mathrm{MM,NM}} \ge 20$). If the aim is falsification, one could instead reject latent monotonicity only when strong support is found against manifest monotonicity relative to its complement $H_{\mathrm{NM}}$ ($BF_{\mathrm{MM,NM}} \le \frac{1}{20}$). However, the latter strategy could result in keeping malfunctioning items in a test simply because the evidence was inconclusive.
We therefore propose to retain only items for which $BF_{\mathrm{MM,NM}} \ge 20$.

Figure 1. The item response functions of the three items that were analyzed.

One may wish to let the consequences of the comparison of $H_{\mathrm{EM}'}$ and $H_{\mathrm{MM}}$ depend on the particular circumstances of the application at hand. In some low-stakes settings, it may be sufficient that an item shows an overall positive trend (i.e., is essentially monotone), but for high-stakes tests one could deem even the small violations of latent monotonicity captured by $H_{\mathrm{EM}'}$ unacceptable, and retain only items for which there is at least some positive evidence ($BF_{\mathrm{MM,EM}'} \ge 3$) that $H_{\mathrm{MM}}$ rather than $H_{\mathrm{EM}'}$ holds.

4. Simulation Study

4.1. Method

To facilitate comparison of the proposed procedure with existing methods for evaluating latent monotonicity, conditions similar to those discussed by Tijmstra et al. (2013) were used in a simulation study. In this way, the decisions that would be made using the proposed method could be compared to those that would be made using the order-constrained null hypothesis test discussed by Tijmstra et al. (2013). The procedure was used to assess manifest monotonicity for three items, corresponding to three relevant scenarios: a ‘normal’ item with a monotone IRF that discriminates well, a weakly discriminating item with a monotone but nearly flat IRF, and an item with a locally nonmonotone IRF (Figure 1). For convenience, we label these three items ‘monotone item’, ‘weak item’, and ‘nonmonotone item’, respectively. The monotone item represents a typical desirable item that makes a useful contribution to the test, the weak item represents an item that contributes little to the reliable ordering of persons but does not violate latent monotonicity, and the nonmonotone item represents a problematic item that should not be included in the test.

The IRFs of the monotone item and the weak item were two-parameter logistic with difficulty parameters equal to 0 and discrimination parameters equal to 1 and .1, respectively. For the nonmonotone item, a locally nonmonotone IRF was obtained using a polynomial extension of the two-parameter logistic model previously used by Tijmstra et al. (2013),

$$P(X_{i}=1 \mid \theta) = \frac{\exp\left(\alpha_{1i}\big(\theta - \beta_{1i}\big) + \alpha_{2i}\big(\theta - \beta_{2i}\big)^{2} + \alpha_{3i}\big(\theta - \beta_{3i}\big)^{3}\right)}{1 + \exp\left(\alpha_{1i}\big(\theta - \beta_{1i}\big) + \alpha_{2i}\big(\theta - \beta_{2i}\big)^{2} + \alpha_{3i}\big(\theta - \beta_{3i}\big)^{3}\right)},$$

where $\beta_{1i}$, $\beta_{2i}$, and $\beta_{3i}$ influence the difficulty of the item, and $\alpha_{1i}$, $\alpha_{2i}$, and $\alpha_{3i}$ influence the slope of the IRF. Following Tijmstra et al. (2013), we chose $\alpha_{1i}$, $\alpha_{2i}$, and $\alpha_{3i}$ equal to 1, 1.2, and 0.25, respectively, and $\beta_{1i}$, $\beta_{2i}$, and $\beta_{3i}$ equal to 2.5, 1.6, and 1.5, respectively.
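With these parameter values, the nonmonotone IRF can be evaluated numerically; the following sketch (the function name is ours) confirms the local decrease, e.g., the success probability at $\theta = -1.5$ exceeds that at $\theta = 0.5$, while the IRF increases again in the upper tail.

```python
import math

def nonmonotone_irf(theta, a=(1.0, 1.2, 0.25), b=(2.5, 1.6, 1.5)):
    """Polynomial extension of the two-parameter logistic IRF, with the
    slope (a) and difficulty (b) parameters used for the nonmonotone item."""
    # Logit is a cubic polynomial in theta: sum_k a_k * (theta - b_k)^k
    z = sum(ai * (theta - bi) ** k
            for k, (ai, bi) in enumerate(zip(a, b), start=1))
    return math.exp(z) / (1.0 + math.exp(z))
```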

Test length was varied by considering manifest scores based on 5, 10, and 20 dichotomous monotone items. The items included in the manifest score were specified using the two-parameter logistic model; their IRFs are displayed in Figure 2. Five different IRFs were specified, with difficulty parameters $\{-1, -0.5, 0, 0.5, 1\}$ and discrimination parameters $\{0.5, 1.25, 1, 1.25, 1.50\}$, matching the design of Tijmstra et al. (2013). For manifest scores based on 10 and 20 items, two and four duplicates of the 5-item set were used, respectively. Sample sizes ($n$) of 100, 200, 500, and 1000 were used to study the effect of sample size on the values of the Bayes factors and the resulting decisions about manifest monotonicity based on the proposed guidelines.
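The data-generating step for the manifest-score items can be sketched as follows; the sample size and seed are arbitrary, and the duplication of the 5-item set for the longer test lengths is omitted.

```python
import numpy as np

rng = np.random.default_rng(7)

def simulate_2pl(n, alphas, betas, rng):
    """Generate an n x k matrix of dichotomous item scores under the
    two-parameter logistic model, with theta ~ N(0, 1)."""
    theta = rng.standard_normal(n)
    # P(X = 1 | theta) = logistic(alpha * (theta - beta)), broadcast over items
    p = 1.0 / (1.0 + np.exp(-np.asarray(alphas)
                            * (theta[:, None] - np.asarray(betas))))
    return (rng.random(p.shape) < p).astype(int)

betas = [-1.0, -0.5, 0.0, 0.5, 1.0]
alphas = [0.5, 1.25, 1.0, 1.25, 1.50]
X = simulate_2pl(200, alphas, betas, rng)
manifest_score = X.sum(axis=1)  # score used to group respondents
```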

Figure 2. The item response functions of the five monotone items, based on the two-parameter logistic model. The discrimination and difficulty parameters are denoted by $\alpha$ and $\beta$, respectively.

For each design condition, 1000 replications were generated. For each replication, $n$ values of the latent variable were drawn from a standard normal distribution, and item scores were subsequently generated, yielding data matrices for the item of interest (monotone, weak, or nonmonotone) and for the 5, 10, or 20 items that were used to compute the manifest score. Next, the Bayesian procedure was applied to the generated data, using 5000 iterations for the burn-in period of the Gibbs sampler and the subsequent 10,000 iterations to approximate the posterior distribution $g(\boldsymbol{\pi} \mid \mathbf{X}; \boldsymbol{\pi} \in \mathcal{H}_{v-1})$ for each order constraint $v$, as detailed in Equation 7. In this way, the Bayes factors of $H_{\mathrm{MM}}$ versus $H_{\mathrm{NM}}$ and of $H_{\mathrm{MM}}$ versus $H_{\mathrm{EM}'}$ were obtained for each replication.

Table 1. Proportion of rejections of latent monotonicity for the nonmonotone item using the Bayes factor procedure (1000 replications) and the order-constrained NHST procedure, for varying sample size (rows) and test length (columns).

* Computational limitations prohibited computation of these entries.

4.2. Results

For the nonmonotone item, Table 1 reports the proportion of replications in which strong support was found against manifest monotonicity relative to its complement ($BF_{\mathrm{MM,NM}} \le \frac{1}{20}$), leading to a rejection of latent monotonicity. The results show that even for small samples the proposed procedure had high power to correctly reject latent monotonicity: except for $k = 5$ with $n = 100$, the observed power levels exceeded .80 in all conditions. The evidence against latent monotonicity increased quickly as sample size increased. For $n \ge 500$, some of the 1000 replications encountered difficulties with the estimation of the Bayes factor (empty cells in Table 1), because the constraints were so unlikely that the estimation of some of the full conditional posteriors in Equation 8 became unfeasible. Consequently, the Bayes factor could not be estimated for every replication in these conditions.
This problem can only occur if there is overwhelming evidence against $H_{\mathrm{MM}}$, in which case the estimate of the Bayes factor approximately equals 0, as was the case when $n \ge 500$. Table 1 also shows that in at most 0.1% of the replications strong support was found for manifest monotonicity. Thus, if one uses a strict guideline and retains only items for which $BF \ge 20$, items like the nonmonotone item will almost always be removed successfully.

Table 1 compares the power of the Bayesian procedure with that of Tijmstra et al.'s (2013) procedure based on the null hypothesis significance testing (NHST) framework. The table presents the results obtained by Tijmstra et al. (2013) and compares them with the Bayesian results obtained under the same conditions. The Bayesian procedure outperformed the null hypothesis test, for which acceptable power levels were found only for large sample sizes ($n = 500$). Unlike the NHST procedure, the Bayes factor procedure showed a marked gain in power as test length increased.

Table 2. Proportion of replications strongly agreeing or disagreeing with $H_{\mathrm{MM}}$ when contrasted with $H_{\mathrm{NM}}$ for the monotone item and the weak item (1000 replications), for varying sample size (rows) and test length (columns).

Table 2 shows the results for the monotone item and the weak item when contrasting manifest monotonicity with its complement. For the monotone item, the proportion of replications where $BF_{\mathrm{MM,NM}} \ge 20$, indicating strong support for manifest monotonicity, exceeded .80 for most conditions, except for $n = 100$ and $k = 5$. The proportion of replications providing strong support against manifest monotonicity ($BF_{\mathrm{MM,NM}} \le \frac{1}{20}$) was always close to 0. As test length and sample size increased, the proportion of replications providing support for manifest monotonicity approached 1. Thus, in all but the most unfavorable conditions the procedure consistently indicated that manifest monotonicity held for the monotone item, and the monotone item had a high probability of correctly passing the first test of the procedure.

Table 2 also shows the results for the weak item. Compared to the monotone item, the proportion of replications providing strong support for manifest monotonicity was considerably smaller for the weak item in all conditions, especially for smaller sample sizes ($n = 100, 200$) and shorter tests ($k = 5, 10$). As $n$ or $k$ increased, the procedure more often found strong support for manifest monotonicity relative to its complement. For longer tests ($k = 20$) combined with smaller sample sizes ($n < 500$), the proportion of replications showing strong support against manifest monotonicity was relatively large, up to .246 for $k = 20$ and $n = 100$. Even though one may expect occasional rejections of manifest monotonicity for weak items such as this one, these results may be considered surprising. Further study showed that they are due to the low-score and high-score groups containing few observations in these conditions. When data are sparse, the uniform prior is relatively influential and pushes the estimates of the conditional probabilities toward .5. As a result, some replications resulted in $BF \le \frac{1}{20}$. For the monotone item, the evidence in favor of monotonicity was much stronger, almost always resulting in $BF \ge 20$ despite sparse data in some score groups.
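The influence of the uniform prior under sparsity is easy to quantify: with a uniform Beta(1, 1) prior, the posterior mean of a conditional probability given $s$ successes in $n$ trials is $(s+1)/(n+2)$, which is pulled toward .5 when $n$ is small. A minimal illustration (the counts are hypothetical):

```python
def posterior_mean(s, n):
    """Posterior mean of a proportion under a uniform Beta(1, 1) prior."""
    return (s + 1) / (n + 2)

# A sparse extreme score group: 0 successes out of 2 observations.
# The observed proportion is 0, but the posterior mean is 0.25,
# pulled substantially toward .5 by the prior.
sparse = posterior_mean(0, 2)    # 0.25

# With ample data the prior's pull is negligible.
dense = posterior_mean(0, 200)   # roughly 0.005
```

With only a handful of respondents in a score group, this shrinkage can turn a strictly increasing population ordering into an apparently nonmonotone sample ordering, which is exactly the mechanism behind the spurious rejections described above.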

The second part of the procedure contrasted $H_{\mathrm{MM}}$ with $H_{\mathrm{EM}'}$. Since it is more difficult to distinguish between $H_{\mathrm{MM}}$ and $H_{\mathrm{EM}'}$, we focused on the results suggesting at least some support in favor of one of the hypotheses ($BF \ge 3$ or $BF \le \frac{1}{3}$) rather than requiring strong support.
Table 3 shows that for the monotone item, the proportion of replications providing support for manifest monotonicity relative to essential monotonicity varied greatly depending on test length and sample size. The proportion of cases where $H_{\mathrm{MM}}$ was correctly supported increased strongly as the sample size increased.

As test length increases, it becomes more difficult to distinguish the two hypotheses for the monotone item; see the relatively low proportion of cases with support for $H_{\mathrm{MM}}$ when $k = 20$ (Table 3). The explanation is that as test length increases, the differences in mean ability between neighboring score groups grow smaller. Moreover, increasing test length for fixed $n$ results in fewer observations per score group and less accurate estimates per group, especially in the extreme score groups. As a result of this data sparsity, the estimates of the conditional probabilities in the extreme score groups may be strongly biased toward .5 by the influence of the uniform prior. Consequently, the estimated conditional probabilities often show a decrease across the first and the last couple of score groups, even though the population conditional probabilities are strictly monotone. Together, these influences impair finding evidence for a strictly monotone ordering relative to an essentially monotone ordering when $k$ is large and $n$ is small.
As $n$ increases, data sparsity becomes rare, and support for $H_{\mathrm{MM}}$ relative to $H_{\mathrm{EM}'}$ is found more frequently.

For the weak item, Table 3 shows that for short tests ($k \le 10$), the proportion of replications providing support for manifest monotonicity relative to essential monotonicity was small, even for $n = 1000$. This finding contrasts with the results for the monotone item, where for $k \le 10$ and $n = 1000$ more than 80% of the replications showed support for manifest monotonicity. However, for $k = 20$ the differences between the results for the weak item and the monotone item were less pronounced.
For longer tests ($k = 20$), the proportion of replications providing support for manifest monotonicity for the weak item increased slowly as $n$ increased, up to .353 for $n = 1000$.

Table 3. Proportion of cases agreeing or disagreeing with $H_{\mathrm{MM}}$ when contrasted with $H_{\mathrm{EM}'}$ for the items with a monotone and a flat IRF (1000 replications), for varying sample size (rows) and test length (columns).

5. Empirical Example

The procedure was applied to evaluate manifest monotonicity for each item from a set of eleven four-option multiple-choice items measuring reading comprehension in sixth-grade primary school students. Data were obtained as part of a larger pilot study, and dichotomously scored responses to these items were available for 773 Dutch students. Because there was no a priori reason to exclude any item from the test, the unweighted restscore was used as the manifest score across which monotonicity was evaluated. For each of the items, the Bayes factors contrasting $H_{\mathrm{MM}}$ with $H_{\mathrm{NM}}$ and $H_{\mathrm{MM}}$ with $H_{\mathrm{EM}'}$ were estimated. Each Bayes factor was obtained through the decomposition in Equation 7, where each decomposed Bayes factor was estimated based on 10,000 draws from the corresponding joint posterior distribution (obtained after a burn-in period of 5000 iterations).
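The encompassing-prior idea underlying such Bayes factors can be sketched in a few lines: under independent uniform priors on the conditional probabilities, the Bayes factor of an order-constrained hypothesis against the unconstrained model is the proportion of posterior draws satisfying the constraints divided by the corresponding prior proportion, and the Bayes factor against the complement follows from the odds of these proportions. The brute-force Monte Carlo version below is an illustration only, not the decomposition of Equation 7 (which is needed precisely when these proportions become too small to estimate directly); the count vectors are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

def bf_mm_vs_complement(successes, totals, n_draws=100_000):
    """Monte Carlo sketch of BF_MM,NM for manifest monotonicity (MM)
    against its complement (NM), using the encompassing-prior approach.

    With independent uniform priors, the unconstrained posterior for each
    score group's conditional probability is Beta(1 + s, 1 + n - s).
    If f_post and f_prior are the posterior and prior proportions of
    draws satisfying pi_0 <= ... <= pi_k, then
        BF_MM,NM = [f_post / (1 - f_post)] / [f_prior / (1 - f_prior)].
    With very strong evidence the estimate can hit 0 (or diverge),
    mirroring the estimation difficulties noted in the simulation study.
    """
    successes = np.asarray(successes)
    totals = np.asarray(totals)
    k = len(successes)
    post = rng.beta(1 + successes, 1 + totals - successes, size=(n_draws, k))
    prior = rng.uniform(size=(n_draws, k))
    f_post = np.mean(np.all(np.diff(post, axis=1) >= 0, axis=1))
    f_prior = np.mean(np.all(np.diff(prior, axis=1) >= 0, axis=1))
    return (f_post / (1 - f_post)) / (f_prior / (1 - f_prior))

# Increasing proportions correct across 5 score groups: support for MM.
bf_up = bf_mm_vs_complement([12, 15, 18, 22, 26], [40] * 5)
# Clearly decreasing proportions: strong evidence against MM.
bf_down = bf_mm_vs_complement([32, 26, 18, 10, 5], [40] * 5)
```

Because an ordered draw becomes increasingly rare under the prior as the number of score groups grows (the prior proportion is $1/(k+1)!$ for $k+1$ groups), direct estimation quickly breaks down for long tests, which is why the decomposition approach is used in practice.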

The results of the analysis are displayed in Table 4. It may be noted that since the composition of the restscore differs for each item, the number of observations per restscore group also differs from item to item. The number of observations per restscore group was relatively small for the lower-score groups, and a restscore equal to 0 was only observed for item 8. Thus, most of the information that was relevant for the assessment of monotonicity was obtained from the middle-score to higher-score groups.

Table 4. Conditional proportions $p_y$ and Bayes factors for the eleven reading comprehension items.

For the comparison of manifest monotonicity with its complement, the values of $BF_{\mathrm{MM,NM}}$ ranged from 0.001 to 90,189. Items 1 and 8 had a Bayes factor lower than $\frac{1}{20}$, while all the other items had a Bayes factor higher than 20. Items showing a larger and more stable increase of the proportion of correct responses across the restscore resulted in higher estimates of the Bayes factor. For 8 out of 11 items, the Bayes factor exceeded 1000.

Items 1 and 8 both displayed nonmonotone orderings. Because the items have a multiple-choice format, a possible explanation for the nonmonotonicity is that particular distractors fail to function for low-ability candidates, resulting in a local decrease of the conditional probabilities. To test the possibility of a floor effect ($H_{\mathrm{F}'}$), we considered the hypothesis that manifest monotonicity holds only for the highest half of the score groups ($\pi_5$ through $\pi_{10}$), allowing for possible nonmonotonicities in the lower score groups ($\pi_0$ through $\pi_4$).
Contrasting $H_{\mathrm{F}'}$ with $H_{\mathrm{MM}}$ for each of the 11 items resulted in Bayes factors that showed strong support for $H_{\mathrm{F}'}$ ($BF_{\mathrm{MM,F}'} < 0.0001$) for the two problematic items, while the Bayes factors for the other nine items showed support for manifest monotonicity.
For items 1 and 8, the Bayes factor contrasting $H_{\mathrm{F}}$ with its complement showed support for $H_{\mathrm{F}}$, which suggests that the two items may suffer from malfunctioning distractors for low-ability candidates.
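The floor-effect hypothesis constrains the ordering only in the upper score groups. A small helper (with made-up conditional probabilities) illustrates how such a partial order constraint differs from full manifest monotonicity:

```python
import numpy as np

def is_nondecreasing(pi, first, last):
    """True if pi[first] <= pi[first + 1] <= ... <= pi[last]."""
    pi = np.asarray(pi)
    return bool(np.all(np.diff(pi[first:last + 1]) >= 0))

# Hypothetical conditional probabilities over restscore groups 0..10,
# with a local decrease in the low groups (a floor-effect pattern).
pi = [0.40, 0.35, 0.30, 0.38, 0.42, 0.45, 0.55, 0.65, 0.75, 0.85, 0.90]
full_mm = is_nondecreasing(pi, 0, 10)   # False: 0.40 > 0.35
floor_ok = is_nondecreasing(pi, 5, 10)  # True: monotone from pi_5 onward
```

In the Bayes factor procedure, this corresponds to restricting the order constraints to $\pi_5 \le \dots \le \pi_{10}$ and leaving $\pi_0$ through $\pi_4$ unconstrained.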

Because nonmonotone items may confound the restscore, it is advisable to remove items sequentially until no item shows a violation, rather than removing all items with $BF_{\mathrm{MM,NM}} \le \frac{1}{20}$ at once. First, item 1 was eliminated from the test and the procedure was applied again to the remaining items. For item 8, the estimated Bayes factor equaled 0.016, and for the other items $BF_{\mathrm{MM,NM}} \ge 20$. After item 8 was also removed from the test, $BF_{\mathrm{MM,NM}} \ge 20$ for eight out of the remaining nine items, indicating strong support for manifest monotonicity over its complement. However, for item 10 the estimated Bayes factor was 7.11, indicating only modest support for manifest monotonicity. Because item 10 showed strong support for monotonicity in the previous two analyses, we decided to keep this item in the test.
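The sequential strategy amounts to a simple loop: remove the worst offender, recompute the Bayes factors on the reduced test (so the removed item no longer contaminates the restscores), and repeat until no retained item falls at or below the rejection threshold. The callback `bf_mm_nm` below is a hypothetical stand-in for the actual Bayes factor computation, and the toy masking effect is purely illustrative:

```python
def sequential_item_removal(items, bf_mm_nm, threshold=1 / 20):
    """Repeatedly drop the item with the smallest BF_MM,NM, recomputing
    all Bayes factors on the reduced test after each removal."""
    retained = list(items)
    while True:
        bfs = {i: bf_mm_nm(retained, i) for i in retained}
        worst = min(bfs, key=bfs.get)
        if bfs[worst] > threshold:
            return retained, bfs
        retained.remove(worst)

# Toy callback loosely mimicking the empirical example: items 1 and 8
# violate manifest monotonicity; all other items are clearly monotone.
def toy_bf(retained, item):
    if item == 1:
        return 0.001
    if item == 8:
        # Item 8's violation only shows once item 1 no longer
        # contaminates the restscore (an illustrative masking effect).
        return 0.016 if 1 not in retained else 0.4
    return 50.0

kept, final_bfs = sequential_item_removal(range(1, 12), toy_bf)
```

Here the loop first removes item 1, then detects and removes item 8 on the reduced test, and stops with the nine remaining items retained.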

While the values of $BF_{\mathrm{MM,NM}}$ suggest general support for latent monotonicity for the remaining items, one would like to exclude the possibility that there are small local violations of latent monotonicity for these items. For this purpose, the Bayes factor contrasting manifest monotonicity with essential monotonicity was used. Table 4 shows the estimates of $BF_{\mathrm{MM,EM}'}$ for the original set of 11 items. Only three items showed support for manifest monotonicity compared to essential monotonicity ($BF_{\mathrm{MM,EM}'} \ge 3$). After the nonmonotone items 1 and 8 were removed, the results improved, with seven out of the remaining nine items showing support for manifest monotonicity. The Bayes factors of items 2 and 10 did not show support for manifest monotonicity compared to essential monotonicity. Thus, the quality of these items and the extent to which they contribute to the reliability and validity of the test should be critically examined.
However, the simulation results suggested that this absence of support may also have resulted from lack of power, because support for $H_{\mathrm{MM}}$ relative to $H_{\mathrm{EM}'}$ was not always found for well-functioning items under conditions similar to the current condition ($n = 500, 1000$; $k = 10$). Overall, these results support latent monotonicity for these nine items.

6. Discussion

This article proposed a methodology for evaluating the amount of support the data provide in favor of manifest monotonicity, which is quantified using the Bayes factor. The procedure remains neutral with respect to whether the aim is verification or falsification. By determining the support for manifest monotonicity compared to its complement, the procedure provides a general measure of the amount of support for this property. Since the complement of manifest monotonicity is unspecific, the procedure can be supplemented by subsequently comparing manifest monotonicity with an informative alternative hypothesis. Informative alternatives can either serve as alternatives that are of substantive interest (such as the floor effect in the empirical example), or as a way of more extensively investigating the amount of support in favor of manifest monotonicity (such as essential monotonicity in the empirical example). Because the Bayes factor can be determined for any set of order constraints on the conditional item probabilities, the approach is flexible with respect to the range of hypotheses that can be compared.

The simulation results showed that contrasting manifest monotonicity with its complement effectively identified the nonmonotone item. Including a second step in the procedure, in which manifest monotonicity was contrasted with essential monotonicity, helped to identify weakly discriminating items, but mainly for short tests. Longer tests seemed to require larger sample sizes before $H_{\mathrm{MM}}$ and $H_{\mathrm{EM}'}$ could be distinguished sufficiently. This may indicate that for long tests it is useful to employ a more liberal version of essential monotonicity (allowing for nonmonotonicities between score groups more than one step removed) in order to successfully differentiate between a completely monotone ordering and approximately monotone orderings of the conditional item probabilities. In addition, these results illustrate that longer tests require larger sample sizes before one can expect to find support for manifest monotonicity relative to essential monotonicity, due to data sparsity in the score groups. Thus, for long tests and small sample sizes, removing items that do not show support for manifest monotonicity over essential monotonicity may result in an overly large proportion of well-functioning items being discarded, and hence is not advisable. Moreover, further research may show that for some applications, having items that are at least essentially monotone is sufficient.
In this case, one could consider contrasting $H_{\mathrm{EM}}$ with its complement, to determine whether there is support for essential monotonicity (rather than manifest monotonicity).

The procedure could be extended to assess monotonicity for a set of items at once. However, this approach runs the risk of masking violations for a particular item if the other items are monotone, so any global analysis should be followed by an analysis at the item level, even if the global analysis indicates overall support for latent monotonicity. Multiple testing does not appear to be problematic: the simulation study showed that, regardless of test length and sample size, the probability of rejecting monotonicity for a monotone item was close to 0. Likewise, the probability of finding strong support in favor of monotonicity for a nonmonotone item was close to 0. This suggests that multiple testing is not a major concern for the proposed procedure, especially if it is used in an exploratory rather than a confirmatory setting.

The Bayes factor provides a measure of relative support (Kass & Raftery, 1995); it does not directly inform the researcher about the probability that manifest monotonicity is true, but rather about the extent to which this hypothesis has become more likely after having observed the data. Hence, the Bayes factor provides researchers with an objective assessment of the degree of support in favor of or against the hypotheses, which they can use to determine whether they consider a hypothesis to be plausible after having observed the data.
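For order-constrained hypotheses such as manifest monotonicity, a Bayes factor of this kind can be approximated by Monte Carlo under the encompassing-prior approach (Hoijtink, 2012). The sketch below is a minimal illustration, not the authors' implementation: the function name, the Beta(1, 1) priors per rest-score group, and the example data are all assumptions made for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(1)

def bf_manifest_monotonicity(successes, totals, n_draws=100_000):
    """Bayes factor of H_MM versus its complement via the encompassing
    prior: fit and complexity are the proportions of posterior and prior
    draws, respectively, in which the conditional item probabilities are
    nondecreasing across rest-score groups."""
    s = np.asarray(successes)
    n = np.asarray(totals)
    # Posterior of p_y per score group under a Beta(1, 1) prior:
    # Beta(1 + s_y, 1 + n_y - s_y).
    post = rng.beta(1 + s, 1 + n - s, size=(n_draws, len(s)))
    # Draws from the encompassing (independent uniform) prior.
    prior = rng.uniform(size=(n_draws, len(s)))
    monotone = lambda p: np.all(np.diff(p, axis=1) >= 0, axis=1)
    fit = monotone(post).mean()
    complexity = monotone(prior).mean()
    bf_mm = fit / complexity                # H_MM vs. encompassing model
    bf_not = (1 - fit) / (1 - complexity)   # complement vs. encompassing
    return bf_mm / bf_not                   # H_MM vs. its complement

# Clearly increasing conditional proportions across five score groups
# should yield strong support for manifest monotonicity.
bf = bf_manifest_monotonicity([10, 18, 30, 41, 52], [60] * 5)
```

Because the fully monotone ordering occupies only a small fraction of the encompassing prior's support, even moderate posterior agreement with the ordering translates into a large Bayes factor in favor of $H_\mathrm{MM}$.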

The proposed procedure makes use of an uninformative prior distribution that does not favor any particular ordering of the conditional item probabilities. Because test items are artifacts constructed with the specific purpose of monotonically measuring a specific trait, one could argue that the prior distribution should take this substantive information into account and should to some degree favor monotonic and essentially monotonic orderings over orderings that show large deviations from monotonicity. Such a prior distribution would concentrate its density around the area corresponding to manifest monotonicity. However, such an informative prior would a priori favor the property that is evaluated by the procedure, and this would affect the Bayes factor. We posit that for the assessment of latent monotonicity, a measure of support should solely reflect the extent to which the data (and not the researcher’s prior expectations) support the model assumption, and hence that the use of an uninformative prior should be preferred. We contend that this is consistent with the idea that model assumptions should be critically evaluated and that concerns raised about this assumption should be eliminated not by indicating that items were meant to behave monotonically by the person who designed them, but rather by determining the extent to which the data support this claim. This is precisely what the proposed procedure aims to do.
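The quantitative consequence of this choice can be made explicit. Under an encompassing prior that treats the $J$ conditional item probabilities as independent and identically distributed (for example, independent uniforms), all $J!$ orderings are equally likely a priori, so the prior probability of the single fully monotone ordering is

$$P(H_\mathrm{MM}) = \frac{1}{J!}.$$

An informative prior that concentrates density near monotone orderings would increase this prior probability, so the resulting Bayes factor would partly reflect the researcher's expectations rather than the data alone.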

References

Abrahamowicz, M., & Ramsay, J. O. (1992). Multicategorical spline model for item response theory. Psychometrika, 57, 5–27.
Gelman, A., Carlin, J. B., Stern, H. S., & Rubin, D. B. (2004). Bayesian data analysis (2nd ed.). London, UK: Chapman & Hall.
Geman, S., & Geman, D. (1984). Stochastic relaxation, Gibbs distributions and the Bayesian restoration of images. IEEE Transactions on Pattern Analysis and Machine Intelligence, 6, 721–741.
Grayson, D. A. (1988). Two-group classification in latent trait theory: Scores with monotone likelihood ratio. Psychometrika, 53, 383–392.
Hambleton, R. K., & Swaminathan, H. (1985). Item response theory: Principles and applications. Boston, MA: Kluwer Nijhoff.
Hemker, B. T., Sijtsma, K., Molenaar, I. W., & Junker, B. W. (1997). Stochastic ordering using the latent trait and the sum score in polytomous IRT models. Psychometrika, 62, 331–347.
Hoijtink, H. J. A. (2012). Informative hypotheses: Theory and practice for behavioral and social scientists. Boca Raton, FL: CRC Press.
Jeffreys, H. (1961). Theory of probability (3rd ed.). Oxford, UK: Oxford University Press.
Junker, B. W., & Sijtsma, K. (2000). Latent and manifest monotonicity in item response models. Applied Psychological Measurement, 24, 65–81.
Kass, R. E., & Raftery, A. E. (1995). Bayes factors. Journal of the American Statistical Association, 90, 773–795.
Lynch, S. M. (2007). Introduction to applied Bayesian statistics and estimation for social scientists. New York, NY: Springer.
Mokken, R. J. (1971). A theory and procedure of scale analysis. Berlin, Germany: De Gruyter.
Molenaar, I. W., & Sijtsma, K. (2000). User's manual MSP5 for Windows. Groningen, The Netherlands: ProGAMMA.
Mulder, J., Klugkist, I., van de Schoot, R., Meeus, W., Selfhout, M., & Hoijtink, H. (2009). Bayesian model selection of informative hypotheses for repeated measurements. Journal of Mathematical Psychology, 53, 530–546.
R Core Team (2014). R: A language and environment for statistical computing. Vienna, Austria: R Foundation for Statistical Computing.
Ramsay, J. O. (1991). Kernel smoothing approaches to nonparametric item characteristic curve estimation. Psychometrika, 56, 611–630.
Rosenbaum, P. R. (1984). Testing the conditional independence and monotonicity assumptions of item response theory. Psychometrika, 49, 425–435.
Scheiblechner, H. (2003). Nonparametric IRT: Testing the bi-isotonicity of isotonic probabilistic models (ISOP). Psychometrika, 68, 79–96.
Silvapulle, M. J., & Sen, P. K. (2005). Constrained statistical inference: Inequality, order, and shape restrictions. Hoboken, NJ: Wiley.
Tijmstra, J., Hessen, D. J., Van der Heijden, P. G. M., & Sijtsma, K. (2013). Testing manifest monotonicity using order-constrained statistical inference. Psychometrika, 78, 83–97.
Van der Ark, L. A. (2005). Stochastic ordering of the latent trait by the sum score under various polytomous IRT models. Psychometrika, 70, 283–304.
Figure 1. The item response functions of the three items that were analyzed.

Figure 2. The item response functions of the five monotone items, based on the two-parameter logistic model. The discrimination and difficulty parameters are denoted by $\alpha$ and $\beta$, respectively.

Table 1. Proportion of rejections of latent monotonicity for the nonmonotone item using the Bayes factor procedure (1000 replications) and the order-constrained NHST procedure, for varying sample size (rows) and test length (columns).

Table 2. Proportion of replications strongly agreeing or disagreeing with $H_\mathrm{MM}$ when contrasted with $H_\mathrm{NM}$ for the monotone item and the weak item (1000 replications), for varying sample size (rows) and test length (columns).

Table 3. Proportion of cases agreeing or disagreeing with $H_\mathrm{MM}$ when contrasted with $H_\mathrm{EM'}$ for the items with a monotone and a flat IRF (1000 replications), for varying sample size (rows) and test length (columns).

Table 4. Conditional proportions $p_y$ and Bayes factors for the eleven reading comprehension items.