
SJTs as Measures of General Domain Knowledge for Multimedia Formats: Do Actions Speak Louder Than Words?

Published online by Cambridge University Press:  23 March 2016

Bobby Naemi*, Educational Testing Service, Washington, DC
Michelle Martin-Raugh, Educational Testing Service, San Francisco, California
Harrison Kell, Educational Testing Service, Princeton, New Jersey

*Correspondence concerning this article should be addressed to Bobby Naemi, Educational Testing Service, 1800 K Street, NW, Suite 900, Washington, DC 20006. E-mail: bnaemi@ets.org

Extract

Lievens and Motowidlo (2016) present a case for conceptualizing situational judgment tests (SJTs) as measures of general domain knowledge, which the authors define as knowledge of the effectiveness of general domains such as integrity, conscientiousness, and prosocial behavior across different jobs. This argument comes from work rooted in the use of SJTs as measures of implicit trait policies (Motowidlo & Beier, 2010; Motowidlo, Hooper, & Jackson, 2006), measured with a format described as a "single-response SJT" (Kell, Motowidlo, Martin, Stotts, & Moreno, 2014; Motowidlo, Crook, Kell, & Naemi, 2009). Given evidence that SJTs can be used as measures of general domain knowledge, the focal article concludes by suggesting that general domain knowledge can be measured not only with traditional text-based or paper-and-pencil SJTs but also through alternative formats, including multimedia SJTs and interactive SJTs.

Type
Commentaries
Copyright
Copyright © Society for Industrial and Organizational Psychology 2016


References

Callinan, M., & Robertson, I. T. (2000). Work sample testing. International Journal of Selection and Assessment, 8, 248–260.
Chan, D., & Schmitt, N. (1997). Video-based versus paper-and-pencil method of assessment in situational judgment tests: Subgroup differences in test performance and face validity perception. Journal of Applied Psychology, 82, 143–159.
Christian, M. S., Edwards, B. D., & Bradley, J. C. (2010). Situational judgment tests: Constructs assessed and a meta-analysis of their criterion-related validities. Personnel Psychology, 63, 83–117.
Daft, R. L., & Lengel, R. H. (1984). Information richness: A new approach to managerial behavior and organization design. In B. M. Staw & L. L. Cummings (Eds.), Research in organizational behavior (Vol. 6, pp. 191–233). Greenwich, CT: JAI Press.
Funke, U., & Schuler, H. (1998). Validity of stimulus and response components in a video test of social competence. International Journal of Selection and Assessment, 6, 115–123.
Gesn, P. R., & Ickes, W. (1999). The development of meaning contexts for empathic accuracy: Channel and sequence effects. Journal of Personality and Social Psychology, 77, 746–761.
Joseph, D. L., & Newman, D. A. (2010). Emotional intelligence: An integrative meta-analysis and cascading model. Journal of Applied Psychology, 95, 54–78.
Kell, H. J., Motowidlo, S. J., Martin, M. P., Stotts, A. L., & Moreno, C. A. (2014). Testing for independent effects of prosocial knowledge and technical knowledge on skill and performance. Human Performance, 27, 311–327.
Krumm, S., Lievens, F., Hüffmeier, J., Lipnevich, A. A., Bendels, H., & Hertel, G. (2015). How "situational" is judgment in situational judgment tests? Journal of Applied Psychology, 100, 399–416.
Lievens, F., & Motowidlo, S. J. (2016). Situational judgment tests: From measures of situational judgment to measures of general domain knowledge. Industrial and Organizational Psychology: Perspectives on Science and Practice, 9, 3–22.
Lievens, F., & Patterson, F. (2011). The validity and incremental validity of knowledge tests, low-fidelity simulations, and high-fidelity simulations for predicting job performance in advanced-level high-stakes selection. Journal of Applied Psychology, 96, 927–940.
Lievens, F., & Sackett, P. R. (2006). Video-based versus written situational judgment tests: A comparison in terms of predictive validity. Journal of Applied Psychology, 91, 1181–1188.
McArthur, L. Z., & Baron, R. M. (1983). Toward an ecological theory of social perception. Psychological Review, 90, 215–238.
McHenry, J. J., & Schmitt, N. (1994). Multimedia testing. In M. J. Rumsey, C. D. Walker, & J. Harris (Eds.), Personnel selection and classification research (pp. 193–232). Mahwah, NJ: Erlbaum.
Messick, S. (1995). Validity of psychological assessment: Validation of inferences from persons' responses and performances as scientific inquiry into score meaning. American Psychologist, 50, 741–749.
Motowidlo, S. J., & Beier, M. E. (2010). Differentiating specific job knowledge from implicit trait policies in procedural knowledge measured by a situational judgment test. Journal of Applied Psychology, 95, 321–333.
Motowidlo, S. J., Crook, A. E., Kell, H. J., & Naemi, B. (2009). Measuring procedural knowledge more simply with a single-response situational judgment test. Journal of Business and Psychology, 24, 281–288.
Motowidlo, S. J., Dunnette, M. D., & Carter, G. W. (1990). An alternative selection procedure: The low-fidelity simulation. Journal of Applied Psychology, 75, 640–647.
Motowidlo, S. J., Hooper, A. C., & Jackson, H. L. (2006). Implicit policies about relations between personality traits and behavioral effectiveness in situational judgment items. Journal of Applied Psychology, 91, 749–761.
Olson-Buchanan, J. B., & Drasgow, F. (2006). Multimedia situational judgment tests: The medium creates the message. In J. A. Weekley & R. E. Ployhart (Eds.), Situational judgment tests: Theory, measurement, and application (pp. 253–278). Mahwah, NJ: Erlbaum.
Potosky, D. (2008). A conceptual framework for the role of the administration medium in the personnel assessment process. Academy of Management Review, 33, 629–648.
Rockstuhl, T., Ang, S., Ng, K. Y., Lievens, F., & Van Dyne, L. (2015). Putting judging situations into situational judgment tests: Evidence from intercultural multimedia SJTs. Journal of Applied Psychology, 100, 464–480.
Trevino, L. K., Lengel, R. H., & Daft, R. L. (1987). Media symbolism, media richness, and media choice in organizations: A symbolic interactionist perspective. Communication Research, 14, 553–574.
Tuzinski, K., Drew, E., Lee, V., & Coughlin, C. (2012, April). Reactions to different multimedia formats of SJTs. Paper presented at the 27th Annual Conference of the Society for Industrial and Organizational Psychology, San Diego, CA.