
Why does the MRCPsych examination need to change?

Published online by Cambridge University Press:  02 January 2018

Stephen Tyrer*
Affiliation:
Department of Psychiatry, University of Newcastle, Newcastle upon Tyne
Femi Oyebode
Affiliation:
Department of Psychiatry, University of Birmingham, Birmingham, UK
*Correspondence to: Dr Stephen Tyrer, Clinical Senior Lecturer in Psychiatry, University of Newcastle, Royal Victoria Infirmary, Newcastle upon Tyne NE1 4LP, UK. E-mail: s.p.tyrer@ncl.ac.uk

Editorial

Copyright © Royal College of Psychiatrists, 2004

Over the past decade and a half there has been a profound change in the way doctors are perceived by the general public and by regulatory authorities. Until recently it was generally accepted that doctors who had completed an accredited examination and were working in a senior capacity were competent at their jobs and cared for their patients appropriately. Patients' greater knowledge of medical matters, together with the development of lobby groups and their influence on medical policy decisions, has challenged the traditional self-regulation procedures of the medical establishment. These were rocked fundamentally by the ‘earthquake’ affecting British medicine (Klein, 1998; Smith, 1998) following the report of the Bristol heart surgery inquiry. The state had to take action, the need for clinical governance became imperative (National Health Service Executive, 1999) and the effects of the new procedures on the regulation of doctors' work are now apparent to all working in Britain today.

MEDICAL SELF-REGULATION AND THE MEDICAL ROYAL COLLEGES

One of the underlying tenets of clinical governance is that doctors should be competent in what they do. The medical Royal Colleges have an important role in this regard, as each College is responsible for ensuring that the specialists it accredits are able to practise competently. Each College therefore needs to determine the abilities required to practise in its sphere of competence, and how to assess those abilities reliably.

EVALUATION OF THE MEMBERSHIP EXAMINATION

The Royal College of Psychiatrists developed its membership examination, customarily abbreviated to the MRCPsych examination, 1 year after the College was founded in 1971. The prime aim of the examination was, and still is, to set a standard that determines whether candidates are suitable to progress to higher professional training at specialist registrar level. In addition, possession of the qualification is considered to be an indicator of professional competence in the clinical practice of psychiatry. The examination is clearly important, and failure in it means that access to higher training opportunities is denied. The examination is therefore termed, in educational parlance, a high-stakes examination. It is consequently vital to ensure that those with sufficient ability pass the examination and, of more importance, that candidates with inadequate skills do not. Discrimination on ethnic and gender grounds should be proscribed. Precise assessment of clinical competence is essential (Oyebode, 2002).

Tests of clinical competence require assessment of knowledge, comprehension of the subject matter, analysis of all aspects of the topic, evaluation of the problem or clinical scenario, synthesis of the issues, and application of these elements in the management of patients. Different parts of the MRCPsych examination address these aspects. It is possible to map the learning objectives of assessment procedures against the items that need to be tested in any examination, in what is known as an assessment blueprint (Dauphinee, 1994). The revised curriculum for basic specialist training and the MRCPsych examination sets out the precise knowledge and competencies required to graduate to higher professional training (Royal College of Psychiatrists, 2001). The content of the examination is now tailored to this course of study.
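
As a minimal sketch only, assuming hypothetical curriculum domains and component names rather than the College's actual blueprint, such a blueprint can be thought of as a mapping from curriculum objectives to the examination components that sample them:

```python
# Illustrative assessment blueprint: a mapping from curriculum domains to
# the examination components that sample them. The domains and component
# names are hypothetical, not the College's actual blueprint.
blueprint = {
    "descriptive psychopathology": ["MCQ paper", "OSCE"],
    "clinical assessment and interviewing": ["OSCE", "long case"],
    "critical appraisal of evidence": ["critical review paper"],
    "patient management": ["clinical viva (vignettes)", "essay"],
}

# Simple completeness check: every domain should be sampled by at least
# one component before the papers are assembled.
uncovered = [domain for domain, components in blueprint.items() if not components]
print("Domains without an assessment component:", uncovered or "none")
```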

The first MRCPsych examinations were designed to test factual knowledge and clinical skills. Knowledge was tested by written components - the multiple choice question (MCQ) examination and, later, short answer questions. The candidate also needed to be able to write an essay on a designated psychiatric topic in an acceptable literary style. Clinical competence was assessed by carrying out a psychiatric examination of a patient (the so-called ‘long case’) and by an oral examination or viva.

These components test essential aspects of psychiatry but are educationally incomplete. The reliability of assessing clinical competence from a single clinical case is very low (Wass et al, 2001), and assessment in the MRCPsych examination did not include direct observation of interviewing ability. These early examinations also did not assess candidates' skills in appraising evidence. The need for change was apparent, and the advice of a medical educationalist, Helen Mulholland, was sought in 1997. Her evaluation found that the examination was largely reliable but that greater efforts should be made to ensure that clinical skills are assessed more rigorously. The changes that have been proposed, and are now in the process of being implemented, have been reported elsewhere (Katona et al, 2000). What are the reasons for these changes?

THE OBJECTIVE STRUCTURED CLINICAL EXAMINATION

The most important change is in the clinical assessment component of the Part I examination. The original MRCPsych Part I clinical examination involved examination of a patient by the candidate, followed by interrogation of the findings by the examiners: a standard long case format. There is considerable variation among the patients who appear in the examination, including the difficulty of the diagnosis itself, the complexity of the case (e.g. degree and stage of illness, physical factors such as deafness) and patient factors (e.g. degree of cooperation, strong regional accent). It has been shown that each candidate would need to interview at least ten such long cases to achieve the test reliability required for a high-stakes examination (Wass et al, 2001).

To ensure that each candidate is exposed to an adequate number of patients, the long case in the Part I examination has been replaced by an objective structured clinical examination (OSCE). In this examination candidates proceed through a series of ‘stations’ that test clinical skills. In the psychiatry OSCE many of these stations involve assessment of simulated patients, actors who demonstrate identical behaviour for each candidate, thus ensuring standardisation of the procedure and excellent utility and reliability (Famuyiwa et al, 1991; Hodges et al, 1997). The more OSCE stations there are, the more reliable the examination; the new Part I OSCE includes 12 stations. The validity of this format in testing the clinical skills of undergraduates in psychiatry has been demonstrated (Hodges et al, 1998). However, these authors later confirmed what those involved in the design of psychiatric examinations have strongly suspected: that OSCE examinations, when marked by adding items on a checklist, are not suitable for the assessment of more advanced psychiatric skills (Hodges et al, 1999). It is for this reason that the long case has been retained in the Part II examination.
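
The dependence of reliability on the number of cases or stations can be illustrated with the Spearman-Brown prophecy formula, a standard psychometric result that is not part of the papers cited above and is offered here only as an illustration. If a single case or station yields reliability r, then k comparable cases or stations yield approximately

R_k = k r / (1 + (k − 1) r).

With an assumed single-case reliability of r = 0.4, ten long cases give R_10 ≈ 0.87, whereas one case alone falls well short of the level expected of a high-stakes examination; the same logic explains why adding OSCE stations improves reliability.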

TESTS OF KNOWLEDGE AND REASONING

Alterations have also been made to the MCQ paper in the Part I examination. Multiple choice questions have been used in the MRCPsych examination since its inauguration in 1972. From the outset the true–false format has been used, with a stem question followed by five alternative answers. Until recently this paper was marked by the negative marking technique, in which each correct answer is rewarded with one mark and each incorrect answer attracts a negative mark. In a discipline such as psychiatry, where there are few absolutes, this system of marking penalises the intelligent guesser. On the advice of the educationalist, the marking was altered so that wrong answers are no longer scored negatively. Furthermore, the stem question format used in the examination has been shown to limit the relevance of the questions that can be selected. It was therefore replaced 2 years ago with a selection of individual statements, still requiring a true–false response but avoiding the Procrustean constraint imposed by a single stem.
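
A brief worked example, using illustrative figures rather than any drawn from the examination itself, makes the penalty concrete. A candidate who judges a statement to be true with probability p has the following expected score per item:

Negative marking: p × (+1) + (1 − p) × (−1) = 2p − 1
No negative marking: p × (+1) + (1 − p) × 0 = p

For p = 0.7 the expected scores are 0.4 and 0.7 respectively, and under negative marking a risk-averse candidate may prefer to omit the item and score 0 with certainty, so reasonable but incomplete confidence is systematically under-rewarded.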

Reasoning skills are not tested by MCQs, which by design restrict the candidate to a choice of two responses. To evaluate knowledge in greater depth and provide a wider range of options, the extended matching item (EMI) format was piloted. With this technique a scenario, which may be clinical, is chosen; a number of options are listed; and a specific problem is posed for which the most appropriate option must be selected (Case & Swanson, 1993). Thus, a clinical scenario of depressive illness may be chosen, a range of treatment options listed and a number of vignettes of particular patients described; the candidate then selects the most appropriate treatment for each case. A pilot paper of EMI questions produced encouraging results, and they were introduced in the spring 2003 Part I MRCPsych examination.
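
As a purely illustrative sketch, the structure of an EMI item can be represented as a theme, a shared option list and one or more vignettes, each keyed to a single best option. The theme, options, vignette and scoring function below are invented for this example and are not drawn from any MRCPsych paper.

```python
# Hypothetical sketch of an extended matching item (EMI).
# The theme, options and vignette are invented for illustration only.
emi_item = {
    "theme": "Treatment of depressive illness",
    "options": [
        "A. Cognitive-behavioural therapy",
        "B. A selective serotonin reuptake inhibitor",
        "C. Electroconvulsive therapy",
        "D. Lithium augmentation",
        "E. Watchful waiting",
    ],
    "vignettes": [
        {
            "text": "A 45-year-old with severe depressive illness, refusing food "
                    "and fluids, with prominent psychotic features.",
            "best_answer": "C",
        },
    ],
}

def score(responses, item):
    """Award one mark for each vignette answered with its single best option."""
    return sum(
        1
        for vignette, response in zip(item["vignettes"], responses)
        if response == vignette["best_answer"]
    )

print(score(["C"], emi_item))  # 1
```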

CRITICAL REVIEW

The Part II examination has also been altered, as befits a high-stakes test that enables successful candidates to be included in the Specialist Register and undertake higher training. The principal changes made include the introduction of a critical review paper and greater standardisation of the clinical viva concerned with the management of patients.

The need to evaluate accurately the growing volume of medical literature is becoming more and more apparent to all doctors. Furthermore, the ability to appraise papers critically is an increasingly important skill in helping to maintain interest in one's area of work. It has also been shown that such instruction improves evidence-based medicine skills in postgraduate and medical students (Fritschke et al, 2002), although its educational value remains poorly researched (Parkes et al, 2003). It was primarily for educational reasons that the critical review paper was introduced into the MRCPsych examination in spring 1999. In this part of the examination, candidates review relevant information from a published scientific paper in psychiatry and answer questions on the design of the study, the appraisal of its methodology and the significance of the results for clinical practice. The introduction of this paper has been associated with a change in the format of the traditional journal club in the teaching of psychiatrists, and psychiatric tutors believe that the more stringent critical style is a good forum for teaching evidence-based medicine skills (Taylor & Warner, 2000). Examinations drive learning styles (Newble & Entwistle, 1986) and it is expected that future consultants will be more interested in appraising the psychiatric literature following this change.

VIVA AND ESSAY COMPONENTS

The Part II clinical examination involves both the assessment and management of a long case and a separate viva voce examination testing the candidate on the management of patients using clinical vignettes or scenarios. In the past the examiners themselves selected the vignettes used in this part of the examination. To ensure that the content and degree of difficulty of the vignettes are standardised, this part of the examination has now been structured, with approved scenarios selected for each examination and probes to assist the examiners during the exercise (McCreadie, 2002). Using the same clinical scenarios in each examination centre at identical times should further increase the reliability of this part of the examination.

The essay has been a consistent component of the MRCPsych Part II examination since the beginning. It assesses the ability to summarise and integrate information and has been retained despite its reliability not always reaching high levels. The consistency of marking has been improved by the formation of an essay marking panel, whose members mark the papers at the College.

STANDARD-SETTING

Previously, most written components of the MRCPsych examination were marked by the peer-referencing technique, which involves passing a predetermined proportion of the candidates taking the examination. This procedure is clearly influenced by the calibre of the candidates at the time: with a well-prepared and knowledgeable group of candidates, the chances of a borderline candidate passing the examination are reduced, whereas if those taking the examination are poorly informed, some who would not normally pass might have the opportunity to do so. It is clearly fairer to have a pass mark that represents a standard agreed by the examiners beforehand; this is known as criterion-referencing. This strategy was introduced in the MRCPsych examination during 2002 and involves setting an appropriate standard and maintaining it in successive examinations. Representative standard-setting panels now meet regularly to determine appropriate pass/fail criteria.
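
A minimal sketch of the difference, using invented marks and Angoff-style panel judgments (the figures, and indeed the choice of an Angoff-type method, are assumptions for illustration and do not describe the College's actual procedure): under peer- (norm-) referencing the cut-off is whatever mark passes a fixed proportion of the cohort, whereas under criterion-referencing the panel fixes the expected performance of a borderline candidate in advance and the pass rate is left to vary.

```python
# Hypothetical comparison of norm-referenced (peer-referenced) and
# criterion-referenced pass marks. All figures are invented.
import statistics

marks = [38, 42, 45, 47, 50, 52, 55, 58, 61, 66]  # candidates' marks out of 100

# Peer- (norm-) referencing: pass a fixed proportion of the cohort (here 60%),
# so the cut-off moves with the calibre of the candidates.
pass_proportion = 0.6
n_pass = int(round(pass_proportion * len(marks)))
norm_referenced_cutoff = sorted(marks, reverse=True)[n_pass - 1]

# Criterion-referencing (Angoff-style): each panel member estimates the
# probability that a borderline candidate would answer each item correctly;
# the averaged expected score fixes the pass mark before the sitting.
panel_item_estimates = [
    [0.6, 0.5, 0.4, 0.7, 0.5],  # judge 1, five illustrative items
    [0.5, 0.6, 0.5, 0.6, 0.4],  # judge 2
]
mean_borderline_fraction = statistics.mean(
    sum(judge) / len(judge) for judge in panel_item_estimates
)
criterion_referenced_cutoff = mean_borderline_fraction * 100  # scaled to 100 marks

print(norm_referenced_cutoff)       # 50: depends entirely on this cohort
print(criterion_referenced_cutoff)  # 53.0: fixed in advance, independent of the cohort
```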

FUTURE OPTIONS

The necessity for clinical competency has been stressed throughout this article. It would be inefficient and inappropriate to fail clinically able candidates on the grounds of low marks in a written examination that bears only peripheral relevance to clinical psychiatry. However, successful candidates should have sufficient knowledge of the constructs underlying psychiatric practice. Candidates who had not achieved this level of knowledge were previously restricted to a finite number of attempts at the examination. This ruling was modified recently and candidates were allowed as many attempts as they wished at the written parts of the examination. Analysis has since shown that candidates who fail the clinical examination have an increased chance of passing on subsequent attempts. It was therefore agreed recently that no restriction should be placed on the number of attempts at any part of the examination.

Examinations require continuous assessment and refinement. Further changes may be proposed to the format of the MCQ paper and to the assessment of the long case in Part II, to determine more precisely the quality of interaction between candidate and patient. Changes are also likely to follow from recent proposals by the Department of Health for radical reform of the senior house officer grade (Donaldson, 2002). Progress through the training programme will be determined by competency-based assessment. Medical Royal College examinations will be retained, but external accreditation of all the medical Royal College examinations will be introduced to ensure greater homogeneity. In their responses to the consultation exercise requested on publication of this document, the medical Royal Colleges accepted these proposals but expressed concern over who should (or could) undertake this exercise.

Political factors and external forces are likely to drive further changes in College examinations in the future. The direction of these changes is difficult to predict but may include the modularisation of courses, with assessments at the conclusion of each module. Royal College examinations will still be required as evidence of the competencies acquired at the end of a course of training. It seems likely that high-stakes tests of this nature will continue to take place at the end of basic specialist training, and it is possible that the record of in-training assessment (RITA) at the completion of higher specialist training will become a more formal exit examination. This would mean major alterations to the content of the present MRCPsych examination, but the structure would not necessarily change to the same extent.

The MRCPsych examinations are now taken by close to 2000 candidates every year, with almost two-thirds of these having received their undergraduate medical education outside the British Isles (Tyrer et al, 2002). The examination needs to be transparently fair as well as assuring appropriate standards. The standard of the examiners, as well as the structure of the examination itself, should be demonstrably high. Future Chief Examiners would do well to don the mantle of Sisyphus (Camus, 1942).

Footnotes

DECLARATION OF INTEREST

S.T. is the immediate past Chief Examiner of the Royal College of Psychiatrists and F.O. is the present Chief Examiner and is a member and examiner of the Professional and Linguistic Assessments Board of the General Medical Council. The views expressed in this article are those of the authors and do not necessarily represent those of the Examinations Department of the Royal College of Psychiatrists.

References

Camus, A. (1942) The Myth of Sisyphus. Current edn (2000). London: Penguin.
Case, S. M. & Swanson, D. B. (1993) Extended matching items: a practical alternative to free response questions. Teaching and Learning in Medicine, 5, 107-115.
Dauphinee, D. (1994) Determining the content of certification examinations. In The Certification and Recertification of Doctors: Issues in the Assessment of Clinical Competence (eds Newble, D. I., Jolly, B. C. & Wakeford, R. E.), pp. 92-104. Cambridge: Cambridge University Press.
Donaldson, L. (2002) Unfinished Business: Proposals for Reform of the Senior House Officer Grade. A Paper for Consultation. London: Department of Health.
Famuyiwa, O. O., Zachariah, M. P. & Ilechukwu, S. T. C. (1991) The objective structured clinical exam in psychiatry. Medical Education, 25, 45-50.
Fritschke, L., Greenhalgh, T., Falck-Ytter, Y., et al (2002) Do short courses in evidence-based medicine improve knowledge and skills? BMJ, 325, 1338-1341.
Hodges, B., Regehr, G., Hanson, M., et al (1997) An objective examination for evaluating psychiatric clinical clerks. Academic Medicine, 72, 715-721.
Hodges, B., Regehr, G., Hanson, M., et al (1998) Validation of an objective structured clinical examination in psychiatry. Academic Medicine, 73, 910-912.
Hodges, B., Regehr, G., McNaughton, N., et al (1999) OSCE checklists do not capture increasing levels of expertise. Academic Medicine, 74, 1129-1134.
Katona, C., Tyrer, S. P. & Smalls, J. (2000) Changes to the MRCPsych examinations. Psychiatric Bulletin, 24, 276-278.
Klein, R. (1998) Competence, professional regulation, and the public interest. BMJ, 316, 1740-1742.
McCreadie, R. G. (2002) Patient management problems: ‘the vignettes’. Psychiatric Bulletin, 26, 463-467.
National Health Service Executive (1999) Clinical Governance: Quality in the New NHS. HSC 1999/065. Leeds: National Health Service Executive.
Newble, D. I. & Entwistle, N. J. (1986) Learning styles and approaches: implications for medical education. Medical Education, 20, 162-175.
Oyebode, F. (2002) Commentary on: Simulated patients and objective structured clinical examinations; review of their use in medical education. Advances in Psychiatric Treatment, 8, 348-350.
Parkes, J., Hyde, C., Deeks, J., et al (2003) Teaching critical appraisal skills in health care settings. Cochrane Library, issue 2. Oxford: Update Software.
Royal College of Psychiatrists (2001) Curriculum for Basic Specialist Training and the MRCPsych Examination (Council Report CR95). London: Royal College of Psychiatrists.
Smith, R. (1998) The dark side of medicine (Editor's choice). BMJ, 316.
Taylor, P. & Warner, J. (2000) National survey of training needs for evidence-based practices. Psychiatric Bulletin, 24, 272-273.
Tyrer, S. P., Leung, W.-C., Smalls, J., et al (2002) The relationship between medical school of training, age, gender and success in the MRCPsych examinations. Psychiatric Bulletin, 26, 257-263.
Wass, V., Jones, R. & Van der Vleuten, C. (2001) Standardised or real patients to test clinical competence? The long case revisited. Medical Education, 35, 321-325.