
Use of planning metrics software for automated feedback to radiotherapy students

Published online by Cambridge University Press: 25 October 2016

Pete Bridge*, Mark Warren and Marie Pagett
Affiliation: School of Health Sciences, University of Liverpool, Liverpool, UK
Correspondence to: Pete Bridge, School of Health Sciences, University of Liverpool, Liverpool L69 3BX, UK. Tel: 0151 795 8366. E-mail: pete.bridge@liverpool.ac.uk

Abstract

Background and purpose

Pre-registration teaching of radiotherapy planning in a non-clinical setting should give students the opportunity to develop clinical decision-making skills. Students frequently struggle to prioritise and optimise multiple competing objectives when producing a clinically acceptable plan. Emerging software applications providing quantitative assessment of plan quality are designed for clinical use but may have value for teaching these skills. This project aimed to evaluate the potential value of automated feedback to second year BSc (Hons) Radiotherapy students.

Materials and methods

All 26 students studying a pre-registration radiotherapy planning module were provided with automated predictions, generated by planning metrics software, of the relative feasibility of planning goals for a left lung tumour target. Students also received interim quantitative reports while developing their plans. Student perceptions of the software were gathered using an anonymous questionnaire. After module completion, independent blinded marking of the plans was performed and analysed for correlation with the software-assigned marks.
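To make the feedback mechanism concrete, the following is a minimal sketch of how planning metrics software of this kind might score a plan against dose-volume goals. The metric names, goal values, weights and linear scoring scheme are illustrative assumptions for a left lung case, not the scheme used by the software in this study.

    # Hypothetical goal table: metric -> (direction, ideal, fail, weight).
    GOALS = {
        "PTV_V95pc":  ("higher", 99.0, 90.0, 3.0),  # % of PTV covered by the 95% isodose
        "Lung_V20Gy": ("lower",  20.0, 35.0, 2.0),  # % of lung receiving >= 20 Gy
        "Cord_Dmax":  ("lower",  40.0, 50.0, 2.0),  # maximum spinal cord dose (Gy)
    }

    def goal_score(direction, ideal, fail, achieved):
        """Score one goal from 0 (at/beyond fail) to 1 (at/beyond ideal), linearly."""
        if direction == "higher":
            frac = (achieved - fail) / (ideal - fail)
        else:
            frac = (fail - achieved) / (fail - ideal)
        return max(0.0, min(1.0, frac))

    def plan_mark(achieved_metrics):
        """Weighted aggregate mark out of 100 across all goals."""
        total_weight = sum(w for _, _, _, w in GOALS.values())
        total = sum(
            w * goal_score(d, ideal, fail, achieved_metrics[name])
            for name, (d, ideal, fail, w) in GOALS.items()
        )
        return 100.0 * total / total_weight

    # Interim feedback for a hypothetical student plan:
    print(plan_mark({"PTV_V95pc": 97.5, "Lung_V20Gy": 24.0, "Cord_Dmax": 42.0}))

In this sketch each goal earns a score between 0 (fail value reached) and 1 (ideal value reached), and the weighted average yields an overall mark out of 100; an interim report to a student would list the per-goal scores alongside the aggregate.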

Results

In total, 25 plans were used for the marking comparison and 16 students submitted feedback on the software. Overall, this feedback was positive. A ‘strong’ Spearman’s rank-order correlation (rs=0·7165) was evident between the human and computer marks (p=0·000055).
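The reported statistic is straightforward to compute from paired marks; below is a minimal sketch using SciPy, with invented placeholder marks rather than the study data.

    from scipy.stats import spearmanr

    # Hypothetical paired marks for the same plans (not the study data).
    human_marks    = [62, 71, 55, 80, 68, 74, 59, 85, 66, 70]
    computer_marks = [60, 75, 50, 78, 70, 72, 61, 88, 63, 69]

    rs, p = spearmanr(human_marks, computer_marks)  # rank-order correlation
    print(f"Spearman rs = {rs:.4f}, p = {p:.6f}")

Spearman's test suits this comparison because marks are ordinal-like and the relationship need not be linear; it compares the rank orderings that the human marker and the software assign to the same plans.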

Conclusions

Automated software is capable of providing useful feedback to students as a teaching aid, particularly with regard to the relative feasibility of planning goals. The strong correlation between human and computer marks suggests a role in benchmarking or moderation; however, the narrow scope of the assessment parameters suggests value as an adjunct to, rather than a replacement for, human marking.

Type: Educational Note
Copyright: © Cambridge University Press 2016

