
Computer-based oral exams in Italian language studies

Published online by Cambridge University Press:  26 June 2013

C. Paul Newhouse
Affiliation:
Edith Cowan University, Western Australia (email: p.newhouse@ecu.edu.au)
Martin Cooper
Affiliation:
Edith Cowan University, Western Australia (email: m.cooper@ecu.edu.au)

Abstract

In this paper we report on one component of a three-year study into the use of digital technologies for summative performance assessment in senior secondary courses in Western Australia. One of the courses was Italian Studies, which had an oral communication outcome assessed externally through an oral performance for which students travelled to a central location and undertook an interview with two assessors. Apart from the logistical difficulties for both students and the organising body, this method did not leave an enduring record of the process and raised questions about the reliability of the assessment. Over the three years of this study, we tried several approaches to using digital technology to assess oral performance, including a portfolio of sub-tasks leading up to a video-recorded oral presentation, a computer-based exam, a video-recorded interview, and an online exam that included oral audio-recordings. In each year, online marking tools supported two methods of drawing inferences about student performance from the representations: the more traditional analytical method and the comparative pairs method. Rasch analysis of the results of the two methods showed that both were at an acceptable level of reliability. Overall, students and teachers reported that they liked using audiovisual recordings and online performance tasks for revision but not for summative assessment. The study also demonstrated that the scores from externally marked computer-based oral tasks carried out in class time correlated highly with the scores from traditional face-to-face recorded interviews. Online assessment of oral performance therefore appears to be as effective as traditional methods, while offering additional affordances such as convenience, access from a variety of locations, and an enduring record of student performance.
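
For readers unfamiliar with the comparative pairs (comparative judgement) method mentioned above: judges repeatedly decide which of two student performances is the better, and the accumulated win/loss record is fitted with a Rasch-type pairwise model so that all performances can be placed on a common scale. The Python sketch below is illustrative only; the judgement data, student labels and fitting routine are invented for this note and are not the authors' marking tools or analysis. It simply shows how a minimal Bradley-Terry style fit can turn pairwise judgements into relative scores.

# Illustrative sketch only (not the authors' tooling): fitting a minimal
# Bradley-Terry model to invented pairwise ("comparative pairs") judgements.
import math
from collections import defaultdict

# Hypothetical judgements: each tuple records (winner, loser) of one comparison.
judgements = [
    ("student_A", "student_B"),
    ("student_A", "student_C"),
    ("student_A", "student_D"),
    ("student_B", "student_A"),
    ("student_B", "student_C"),
    ("student_C", "student_D"),
    ("student_D", "student_B"),
]

students = sorted({s for pair in judgements for s in pair})
theta = {s: 0.0 for s in students}  # latent quality estimate per performance

# Gradient ascent on the Bradley-Terry log-likelihood, where
# P(i beats j) = exp(theta_i) / (exp(theta_i) + exp(theta_j)).
for _ in range(500):
    grad = defaultdict(float)
    for winner, loser in judgements:
        p_win = 1.0 / (1.0 + math.exp(theta[loser] - theta[winner]))
        grad[winner] += 1.0 - p_win
        grad[loser] -= 1.0 - p_win
    for s in students:
        theta[s] += 0.1 * grad[s]

# Only differences between estimates are meaningful, so centre the scale.
mean_theta = sum(theta.values()) / len(theta)
for s in students:
    print(f"{s}: {theta[s] - mean_theta:+.2f}")

A production adaptive comparative judgement system would choose pairings adaptively and also estimate judge consistency, but the underlying scaling idea is the same.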

Type
Research Article
Copyright
Copyright © European Association for Computer Assisted Language Learning 2013 

