
A critique of the specialty certificate examinations of the Federation of Royal Colleges of Physicians of the UK

John Mucklow and Jane Dacre
DOI: https://doi.org/10.7861/clinmedicine.10-5-519b
Clin Med October 2010, 10(5):519–520
John Mucklow
Associate medical director with responsibility for specialty certificate examinations
Jane Dacre
Medical director MRCP (UK) Central Office

Editor – We welcome John Cookson's interest in our new specialty certificate examinations (SCEs) (Clin Med April 2010 pp 141–4). However, his critique was based on a limited selection of the available information, so this correspondence provides a fuller update for readers. Four years on from the pilot examinations in 2006, after 11 diets in eight separate specialties and with almost 10,000 questions in the bank, there is much to report.1 The contribution from specialists throughout the UK to this effort has been superb.

Cookson is correct in saying that the pilot examinations were not mapped robustly to the curricula. Furthermore, progressive revision of the specialty curricula during the last two years has presented a moving target for the new examining boards. We have risen to this challenge. From 2009, each SCE blueprint has been mapped to the appropriate curriculum and every usable question related to the relevant curriculum domain. Question-writing groups are giving priority to the remaining gaps.

He criticises the curricula for differentiating between knowledge, skills and attitudes and expresses concern that the SCEs assess only knowledge. Although single best answer questions can evaluate problem-solving skills and clinical judgement, the SCEs were always intended as knowledge-based assessments. They were not designed to test skills or attitudes, which, we agree, are much better evaluated by direct observation and discussion face to face.

Cookson expresses disappointment that the indices of reliability in the pilots were inconsistent. Values of Cronbach's α obtained in examination diets of 200 questions, involving small cohorts with a narrow range of ability, are unlikely to reach 0.9. Indeed, recent research into the use of reliability indices suggests that the standard error of measurement may be a more appropriate metric.2 Nevertheless, it is reassuring that in nine out of 11 SCE diets to date reliability values have exceeded 0.8.
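The relation between the two metrics discussed here is the standard classical-test-theory formula SEM = SD × √(1 − reliability). A minimal sketch in Python, using invented figures purely for illustration (the score SD of 15 marks and the alpha values are assumptions, not data from the SCEs):

```python
import math

def sem_from_reliability(sd: float, reliability: float) -> float:
    """Standard error of measurement: SEM = SD * sqrt(1 - reliability)."""
    return sd * math.sqrt(1.0 - reliability)

# Hypothetical 200-item paper with a score SD of 15 marks,
# comparing Cronbach's alpha of 0.82 against the 0.9 benchmark.
print(round(sem_from_reliability(15, 0.82), 2))  # 6.36
print(round(sem_from_reliability(15, 0.90), 2))  # 4.74
```

Because SEM scales with √(1 − α), a drop in reliability from 0.9 to 0.8 widens the measurement-error band by roughly 40%, which is why the distinction between those two thresholds is material to both correspondents.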

We appreciate the challenge of standard setting for new examinations. For information, the SCEs use the same criterion-referencing process (the Angoff method) used for the MRCP(UK) written examinations in recent years. Although many of those involved in the process had no previous experience, their task was made simpler by taking as a consistent yardstick the knowledge expected of a newly appointed specialist.
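For readers unfamiliar with it, the Angoff procedure asks each judge to estimate, item by item, the probability that a borderline (minimally competent) candidate would answer correctly; the cut score is the sum of the per-item means across judges. A small sketch in Python, with entirely hypothetical judge ratings:

```python
def angoff_cut_score(ratings):
    """Angoff cut score from a panel of judges.

    `ratings` is a list of per-judge lists: each inner list holds one
    probability per item that a borderline candidate answers it correctly.
    The cut score is the sum of the item means, in marks.
    """
    n_judges = len(ratings)
    n_items = len(ratings[0])
    item_means = [
        sum(judge[i] for judge in ratings) / n_judges for i in range(n_items)
    ]
    return sum(item_means)

# Hypothetical panel of three judges rating a four-item paper.
judges = [
    [0.6, 0.8, 0.5, 0.9],
    [0.7, 0.7, 0.4, 0.8],
    [0.5, 0.9, 0.6, 1.0],
]
print(round(angoff_cut_score(judges), 1))  # 2.8 marks out of 4
```

In practice a panel would rate all 200 items, and the consistent yardstick described above, the knowledge expected of a newly appointed specialist, is what anchors the judges' probability estimates.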

Footnotes

  • Please submit letters for the editor's consideration within three weeks of receipt of Clinical Medicine. Letters should ideally be limited to 350 words, and sent by email to: Clinicalmedicine@rcplondon.ac.uk

  • © 2010 Royal College of Physicians

References

  1. Joint Committee on Higher Medical Training. Knowledge-based assessment; pilot project. London: Joint Committee on Higher Medical Training, 2006. www.jrcptb.org.uk/SiteCollectionDocuments/KBA%20Project%20Final%20Report.pdf.
  2. Tighe J, McManus IC, Dewhurst NG, Chis L, Mucklow J. The standard error of measurement is a more appropriate measure of quality for postgraduate medical assessments than is reliability: an analysis of MRCP(UK) written examinations, 2002–2008, and Specialty Certificate Examinations. BMC Medical Education (accepted for publication).

A critique of the specialty certificate examinations of the Federation of Royal Colleges of Physicians of the UK

It is good to read an update and to learn that data continue to be gathered about the performance of the examination. The progress in question writing is impressive, but it will be an ongoing task: some questions will perform poorly, some will go out of date, and soon many will be remembered by candidates after the examination and perhaps passed around.

It is correct that single best answer questions can evaluate problem solving, so it seems a pity that a decision was taken to test only factual recall rather than the utilisation of knowledge. The maintenance of validity is indeed not helped by the learning outcomes in the various curricula. My problem is not just that only knowledge outcomes are being assessed. They are written so that they mostly require knowledge recall rather than higher-order thinking and, furthermore, many outcomes are not listed under the appropriate heading, so that an examination testing only those listed under knowledge will miss important topics. This shows the importance of designing assessment systems alongside the outcomes, something previously neglected in the foundation programme. My anxieties about setting a specialist examination at the lowest level of Miller's triangle remain.

I identified educational impact, cost-effectiveness and acceptability as issues requiring more information, and I was hoping this would be forthcoming. It would be a pity, I think, if this examination led candidates to acquire most of their learning from books rather than from patients; a survey of their learning strategies would be of considerable interest. There must also now be some robust data on costs. Even if specialists are giving their time freely, there is an opportunity cost: if they are writing questions, they cannot be doing something else. The publication of robust costings would be a service to all of us who struggle to provide a good examination product for the resources available.

Establishing that standard setting is at the level of the new consultant will certainly be helpful. There must now be considerable information about the consistency of standard setting between the various diets and specialties. Publication of this would help answer the questions raised by the data in the pilot.
