Trials and tribulations of the annual review of competence progression – lessons learned from core medical training in London
ABSTRACT
The annual review of competence progression (ARCP) was introduced as a way of keeping records and reviewing satisfactory progress through a medical curriculum for doctors in training. It provides public assurance that doctors are trained to a satisfactory standard and are fit for purpose. A routine external review of the core medical training (CMT) ARCPs in London revealed documentation of satisfactory progression of trainees to the next level of training without the evidence to support their completion of the curriculum. An internal review and series of process interventions were subsequently conducted and implemented to improve the quality and standardisation of the ARCPs. This paper reviews these interventions, discusses the lessons learned from the internal review and highlights issues applicable to any ARCP process.
Introduction
Core medical training (CMT) is a 2-year programme forming the first stage of specialty training for most doctors training in physicianly specialties.1 Trainees use an e-portfolio to record evidence to support acquisition of competencies against their curriculum. This evidence is reviewed at 11 months and 23 months at an annual review of competence progression (ARCP), with the trainee’s ‘outcome’ being dependent on their level of satisfactory progress.1,2 The ARCP panel consists of at least three members appointed by the training programme management committee, of which one must be either the postgraduate dean (or their deputy, such as a head or deputy head of school) or a training programme director (TPD). Successful trainees in their first year of CMT are awarded an outcome 1 and successful trainees at the end of their second year of CMT are awarded an outcome 6 (Table 1).2,3
Up to and including 2014, London CMT ARCPs were conducted locally at the trainee’s hospital with external representation on ARCP panels provided by other TPDs and educational supervisors. Only trainees deemed to be in difficulty or those in whom unsatisfactory outcomes were predicted by trainers were reviewed by central ARCP panels on behalf of the London local education and training boards (North Central and East London, North West London, South London) at Health Education England (HEE). External assessors appointed by the Joint Royal Colleges of Physicians Training Board (JRCPTB) have always been a part of the central ARCP panels in London and, in addition, have sampled 5–10% of e-portfolios and ARCP recommendations in order to provide independent review, impartial advice and scrutiny of all processes of delivery, assessment and evaluation of medical training.2–4
Problem
In 2014, the external assessor’s review of CMT ARCPs in London reported that trainees were achieving satisfactory outcomes without the necessary evidence to demonstrate their competence as mandated by the JRCPTB ARCP decision aid.3 Indeed, 16 out of 30 (53%) e-portfolios randomly sampled were deemed to fall short of the curriculum requirements and a number of issues were raised (Box 1). This feedback was considered to be a potential patient safety issue prompting the existing process to be subjected to immediate review and reform.
Solution
Under the direction of the HEE North West London postgraduate dean, a project group within the School of Medicine in HEE in London was convened in August 2014, comprising clinical (head of school, deputy head of school, education fellow) and non-clinical (senior operations manager and CMT operation officer) members to manage a formal review of CMT ARCPs in London. This was considered to be best supported by a ‘plan-do-study-act’ (PDSA)-style quality improvement process; this has been shown to be an effective model with incremental improvements being achieved by cycling through the four phases inherent in its description (Fig 1, Box 2).5
Discussion
A review of CMT ARCPs and subsequent scrutiny of the process raised concerns not only about the way in which ARCPs were conducted in London, but also about how completion of the curriculum was documented. It raised anxieties about how clinical and educational supervisors should best support their trainees and how they themselves are supported to fulfil their educational role. A PDSA-style process of review and reform achieved standardisation of ARCPs, a significant reduction in the proportion of insufficient evidence documented by trainees and small, albeit limited in magnitude, incremental reductions in the proportion of core medical trainees with unsatisfactory ARCP outcomes.
There has been much debate about the best way to assess competence within the medical profession.6–8 The most widely accepted method for doctors in training is the use of workplace-based assessments (WPBAs), which form part of a competency-based curriculum. The validity and applicability of competency-based curricula and, indeed, whether WPBAs are the best method of assessing professional competence are beyond the scope of this paper.7–9 However, the lessons learned from the review of the CMT ARCP process in London have suggested areas for development with regard to the practical application of the curriculum.
Does the lack of compliance with the ARCP process demonstrate a failing cohort of trainees?
Despite a high proportion of core medical trainees in London being awarded an unsatisfactory outcome at their ARCPs, they do not appear to represent a ‘failing’ cohort. Indeed, almost all core medical trainees initially awarded an outcome 5 at their ARCP had this upgraded to an outcome 1 or 6 once supportive evidence of their clinical performance was provided. If a major issue with the ARCP process is considered to be a lack of e-portfolio documentation of clinical activity (enabling attainment of CMT curriculum elements), the ‘logbook’ model used by specialties such as surgery or anaesthetics could be followed.10,11 The JRCPTB has introduced a logbook for CMT, but this has not been made a compulsory requirement and uptake from trainees has been variable.1 Furthermore, there is little evidence in the literature to support the use of logbooks, with poor correlation between documentation of clinical exposures and clinical performance.1,12 Documented lists of activities will only highlight potential learning opportunities; how these are utilised will be down to the trainee and their supervisor(s) and, unless reflective components with formal WPBAs are employed, acquisition of competence in curriculum elements cannot be assumed.13
The CMT programme in London has a high competition ratio for available places, necessitating high academic performance to gain entry,14 and produces doctors with some of the highest MRCP (membership examination for the Royal College of Physicians) pass rates in the country.15 These trainees therefore appear to be motivated and of a high enough standard to pass the MRCP examination; thus, they should, in theory, be ready to fulfil the role of a medical registrar. Failing to produce the evidence for the required competencies laid out in the CMT curriculum may demonstrate a lack of understanding of the ARCP process or a lack of support to complete the documentation rather than a struggle to satisfy the curriculum requirements. Further work needs to be done to explore the barriers these trainees are facing and the reasons why there has been such poor compliance with the ARCP process.
Are CMT TPDs, educational supervisors and clinical supervisors appropriately trained for their role?
This review highlighted that trainers, as well as trainees, are not familiar with the CMT e-portfolio and ARCP process. Indeed, after implementation of the PDSA process that improved trainees’ documentation of acquisition of curricular competencies, approximately 15% of educational supervisors continued not to sign off trainees’ e-portfolios even when such evidence was present (Table 2). The General Medical Council (GMC) sets standards for trainers,16 but a pilot GMC survey showed that trainers felt they had a lack of time to devote to their trainees and poor information technology service availability.17 They also complained of a lack of local support and wanted further training for their role.17 Unless these issues are addressed, trainers will find it difficult to supervise trainees appropriately and facilitate successful completion of curricula. As a trainer, maintaining current knowledge of curricula can be difficult and, over the last few years, a multitude of changes to postgraduate medical training have occurred.18 While the ‘Gold Guide’ reference for postgraduate training contains all of the necessary information about ARCPs, it is not utilised well by trainers and trainees. This may be because it is such a large document covering all areas of postgraduate medicine that it can be difficult to find the relevant information.2
Workshops targeting trainees and trainers involved in the delivery of the CMT curriculum in London received excellent feedback, but were poorly attended and did not have the desired impact of significantly reducing the proportion of trainees awarded unsatisfactory outcomes at their ARCP. The reasons for low attendance at these workshops are unclear but may reflect a difficulty in obtaining study leave or a perceived lack of relevance of the subject matter. If such workshops, focusing on CMT ARCPs and curriculum requirements, were to be made compulsory for CMT trainers and trainees, one would expect fewer unsatisfactory ARCP outcomes from a cohort of doctors better educated in the field. Another approach could be for CMT TPDs, educational supervisors and clinical supervisors to focus only on CMT and not undertake responsibilities for other levels of medical training. This may result in trainers being responsible for more CMT trainees, but it creates trainers who are actively engaged and up to date with training in one area.
How do we demonstrate quality assurance?
Following the review of London CMT ARCPs in 2014, it was felt that, going forward, centralised review panels and follow-up would be the most transparent, appropriate and consistent approach. While the onus of correct documentation and fulfilment of curricular elements remains firmly with trainees, the administration and quality assurance of this time-consuming process rests with the School of Medicine at HEE. This facilitates a failsafe, independent mechanism for making judgements about trainees’ competence and performance and their suitability to progress to the next stage of training; it also ensures external review of their professionalism.
For which procedures should core medical trainees demonstrate competence?
Our analysis of ARCP data showed areas where trainees were either not producing evidence of WPBAs in their e-portfolios for competence with procedural skills or had had their competence for procedures signed off by their educational supervisor with no supportive evidence. The quality improvement process implemented achieved a significant reduction in the proportion of trainees lacking evidence for procedural competence, but increased the percentage of e-portfolios with incomplete sign off by educational supervisors in the face of appropriate evidence (Table 2).
Historically, trainees have found ‘sign off’ for procedures difficult and, in many CMT rotations, the lack of exposure to critically ill patients or specialised clinics means ‘sign off’ for procedures such as central venous cannula (CVC) access, pleural aspiration or advanced cardiopulmonary resuscitation is difficult. A recent core medical trainee survey by the Royal College of Physicians showed that up to 25% of core medical trainees had not led a cardiac arrest and almost 20% had never inserted a CVC.19 Feedback from higher trainees in medical specialties in London highlighted a lack of confidence in procedural skills and a theme that CVC insertion was more appropriately undertaken by anaesthetists.20 CVC insertion has been extensively reviewed and placement must now be performed under ultrasound guidance in a designated environment, such as an operating theatre or intensive care unit.21 Additionally, many pleural procedures are now performed away from the wards, in line with national guidelines.22 As such, it may not be reasonable to expect core medical trainees to develop competence in procedures less commonly performed in general medical settings and more often undertaken by specialists in specific clinical areas. However, while core medical trainees in London are often well supported in hospitals with many specialties represented, many of these doctors will go on to practise in settings where such support is less widely available. As such, developing independence in these skills is important and continues to be supported by national guidelines.22 Lifesaving procedures, however, can only be practised as procedural drills and reliably assessed if commonly performed.23 A new approach to tackling the lack of exposure to certain clinical procedures and the management of critically ill patients is therefore important for CMT.
Cross-specialty learning and increased exposure to simulation training and critical care environments are potential solutions and are supported by current medical educational agendas.24–26
Conclusion
The review of the London CMT ARCPs demonstrated issues of concern with regard to the process but did not question the quality of the trainees or patient safety. The review raised challenging questions about what should be considered an essential clinical procedure for CMT and about the wider application of the ARCP, including how best to record progress in training and how to engage and educate trainees and trainers in the process. A PDSA review process improved standardisation of ARCPs and reduced the amount of insufficient evidence in trainees’ e-portfolios. Going forward, we suggest that clear and accessible ARCP decision aids and checklists, which state the objective evidence required, should be available to trainees and trainers alike. Further work should explore the barriers trainees face in completing the curriculum and engaging with the ARCP process. This should involve consultation with core medical trainees for direct feedback on the process.
The important areas highlighted during this review of the ARCP process should be considered in all annual reviews of clinical training and may facilitate reproducibility and, ultimately, quality assurance so that doctors are trained consistently throughout the country. This can only help to strengthen trainees’ confidence that the process is fair and transparent and that their training needs are being met at every level, without patient safety being compromised. Most importantly, the ARCPs are an annual review of training with an ultimate goal to produce independent practitioners fit for purpose.
Conflicts of interest
The authors have no conflicts of interest to declare.
© Royal College of Physicians 2017. All rights reserved.
References
- Joint Royal Colleges of Physicians Training Board
- Academy of Medical Royal Colleges
- Joint Royal Colleges of Physicians Training Board
- Joint Royal Colleges of Physicians Training Board
- Taylor MJ
- General Medical Council
- Brightwell A
- ten Cate O
- Royal College of Anaesthetists, CPD online diary/logbook
- Royal College of Surgeons of England, Surgical tutor handbook
- Huang C
- Joint Royal Colleges of Physicians Training Board
- Royal College of Physicians
- General Medical Council
- General Medical Council
- Patel M
- Tasker F
- Royal College of Physicians
- National Institute for Health and Care Excellence
- British Thoracic Society
- Greenaway D
- Joint Royal Colleges of Physicians Training Board
- Joint Royal Colleges of Physicians Training Board