Clinical Medicine Journal
Evaluation of feedback given to trainees in medical specialties

Tony CK Tham, Bill Burr and Mairead Boohan
DOI: https://doi.org/10.7861/clinmedicine.17-4-303
Clin Med August 2017;17(4):303–306
Tony CK Tham
Ulster Hospital, Dundonald, Belfast, UK
Roles: consultant gastroenterologist
For correspondence: tctham1234@gmail.com
Bill Burr
Joint Royal Colleges of Physicians Training Board, London, UK
Roles: former medical director
Mairead Boohan
Centre for Medical Education, Queen’s University Belfast, Belfast, UK
Roles: lecturer and deputy director

ABSTRACT

The aim of this study was to evaluate the quality of feedback provided to specialty trainees (ST3 or higher) in medical specialties during their workplace-based assessments (WBAs). The feedback given in WBAs was examined in detail for a group of 50 trainees at ST3 or higher, randomly selected from those taking part in a pilot study of changes to the WBA system conducted by the Joint Royal Colleges of Physicians Training Board. They were based in Health Education North East (Northern Deanery) and Health Education East of England (Eastern Deanery). Thematic analysis was used to identify commonly occurring themes. Feedback was mainly positive, but there were differences in quality between specialties. Problems with feedback included insufficient detail (such that it was not possible to map the progression of the trainee), too few action plans, and feedback that was not contemporaneous (not given at the time of assessment). Recommendations include that feedback should be more specific, that feedback forms should offer more options for the supervisor to compare the trainee’s performance with what is expected, and that action plans should be made.

KEYWORDS
  • Assessments
  • feedback
  • medical specialties
  • quality
  • supervision
  • thematic analysis
  • trainees
  • workplace

Introduction

Workplace-based assessments (WBAs) aim to evaluate the working practices of doctors by being carried out in the actual workplace.1–3 WBAs in medical specialties can make reliable distinctions between doctors’ performance.4 For WBAs to have educational benefit, giving feedback is crucial.5,6 Feedback gives the trainee insight into their actions and their consequences and informs them about progress towards their personal objectives.5 There are recognised key quality indicators for effective feedback that should be familiar to all trainers. Some examples of these quality indicators include offering positive and negative comments; providing sufficient detail about the performance rather than generalisations; and developing an action plan so that the trainee can address problems more effectively.7,8

In November 2011, the General Medical Council publication Learning and assessment in the clinical environment: the way forward9 proposed two types of WBA:

  1. assessment for learning, or structured learning events (SLEs – formative assessment)

  2. assessment of performance (AoP – summative assessment).

In light of this, the Joint Royal Colleges of Physicians Training Board (JRCPTB) set up a working group in January 2011 to review WBAs. It was decided to accept the General Medical Council’s recommendations regarding the types of WBA, and the JRCPTB added the premise that trainees needed fewer assessments with a greater emphasis on feedback and trainee reflection, all of which should be carried out at appropriate times throughout the training year. In order to confirm that the recommendations were deliverable, equitable and perceived to improve the existing system, they were piloted. The evaluation of the pilot put forward many recommendations; one key recommendation was that there should be clear guidance and training on the process of feedback for SLEs.10

There is good evidence that if well implemented, feedback from WBAs leads to a perceived positive effect on clinical practice.11 However, there are very few studies in the literature evaluating the quality of the feedback provided to trainees during their WBAs, especially for medical specialties. We used accepted tools in educational research, such as qualitative analysis in the form of thematic analysis and also quantitative analysis.

The aim of this study was to evaluate the quality of feedback provided to specialty trainees during WBAs using recognised quality indicators as a benchmark.7,8 The WBAs that were examined were from trainees taking part in the JRCPTB pilot and a control group from a non-participating region.

Methods

The specialties that were involved in this project were acute internal medicine, clinical genetics, clinical neurophysiology, general internal medicine, genitourinary medicine, geriatric medicine, infectious diseases, neurology and palliative medicine. They were selected to achieve a mix of acute and non-acute specialties, and small and large specialties.

This study was approved by the Ethics Committee of the Queen’s University of Belfast. Part of the fair processing notice statement tells trainees that their data will be used appropriately for research purposes. They need to sign up to this when they enrol online with JRCPTB.

Subjects

All trainees assessed were specialty trainee year 3 (ST3) or higher. This comprised 25 trainees participating in the JRCPTB pilot in Health Education North East (Northern Deanery), who contributed 480 assessments, and 25 trainees from Health Education East of England (Eastern Deanery – not in the WBA pilot), who contributed 425 assessments.

The trainees from each locality were randomly selected by JRCPTB administrative staff so that they represented each training grade proportionally and different hospitals were included.

Data

JRCPTB trainees are registered on an electronic portfolio (ePortfolio) into which they enter training data, including full details of WBAs. The data contained within the portfolio may be accessed centrally by JRCPTB. The study took place over a 12-month period from August 2012–July 2013 and data for randomly selected trainees in this period were obtained. The identities of trainees were anonymised by JRCPTB. For each trainee, all mini clinical examination, acute care assessment tool (ACAT) and case-based discussion WBAs completed in that period were examined. Feedback was provided in the following fields: clinical assessment, investigation and management plans, clinical judgement, professionalism, action plan, and competence rating. All of the assessments were undertaken by consultants.

Analysis

Thematic analysis was used to evaluate the data and identify commonly occurring themes. Thematic analysis was chosen because the data set is qualitative in nature, a large body of data can be summarised, similarities and differences across the data set can be highlighted, and the data set can be organised and described.12 Thematic analysis was carried out by TCKT, who had training in thematic analysis. MB analysed the data independently to ensure consistency and verification of the analysis. MB is a trained medical educationalist and an expert in thematic analysis.

Results

A total of 905 WBAs were analysed from the two localities in the study.

Feedback tended to be positive

There were a total of 3,699 individual entries where feedback was given. Of these, 3,248 contained positive feedback (88%). Therefore, feedback was generally positive with very few trainees receiving negative feedback.
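As a quick arithmetic check, the headline proportion can be recomputed from the counts quoted above (the variable names and the rounding convention are illustrative, not from the study):

```python
# Re-derive the reported share of positive feedback entries.
# Counts are those quoted in the text: 3,248 positive out of 3,699 entries.
positive_entries = 3248
total_entries = 3699

share = positive_entries / total_entries
print(f"{share:.1%}")  # prints "87.8%", which the text rounds to 88%
```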

Insufficient detail in feedback given

In general, the feedback texts lacked detail and were vague. When negative feedback was given, there was often insufficient detail to accompany the comments; one example was simply ‘needed guidance’. There were eight instances of negative feedback (0.2% of all feedback given) and, of these, five had sufficient detail to enable the trainee to improve.

When positive feedback was given, the majority of the feedback was vague and brief, lacking specific examples. Illustrations of such feedback include ‘good and efficient assessment’, ‘excellent’, ‘good’, ‘good clinical judgement’ and ‘very impressive’.

Differences in quality of feedback between specialties

The specialties that provided good quality feedback7,8 were palliative care and genitourinary medicine. In palliative care, 296 of 345 (86%) feedback items were of good quality. In genitourinary medicine, 124 of 155 (80%) feedback items were of good quality.

Palliative medicine, in particular, is notable for the high quality of the feedback given to trainees. Examples of such feedback include:

Use this case as a trigger to consider two issues: 1. the role of ketorolac in management of severe pain; 2. consideration of the challenges of applying specialist management manoeuvres (eg initiation of ketamine) in a generalist environment, and how you might tackle these challenges as a consultant.

Feedback from acute internal medicine, geriatrics, infectious diseases and neurology tended to be brief and, in general, did not meet many of the quality indicators for good feedback. The number (and percentages) of good quality feedback items in relation to the total number of feedback items for each specialty are as follows: acute internal medicine 244/690 (35%), geriatrics 1017/2675 (38%), infectious diseases 38/135 (28%) and neurology 164/445 (37%).
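The quoted percentages can be reproduced from the good/total counts reported above. The short sketch below (illustrative code, not part of the study's analysis) recomputes each one with whole-percent rounding:

```python
# Good-quality feedback items / total feedback items, per specialty,
# as quoted in the Results section.
counts = {
    "palliative care": (296, 345),
    "genitourinary medicine": (124, 155),
    "acute internal medicine": (244, 690),
    "geriatrics": (1017, 2675),
    "infectious diseases": (38, 135),
    "neurology": (164, 445),
}

def pct(good: int, total: int) -> int:
    """Share of good-quality feedback, rounded to the nearest whole percent."""
    return round(100 * good / total)

for specialty, (good, total) in counts.items():
    print(f"{specialty}: {good}/{total} = {pct(good, total)}%")
```

Each recomputed value matches the percentage stated in the text (86, 80, 35, 38, 28 and 37 respectively).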

Feedback and rating of trainees in relation to their progression

For the majority of trainees, the feedback and rating could not be used to map the progression of the trainee’s performance over the year. For the majority of trainees, the rating by their assessors tended to be consistently similar with few variations over the training year.

Rating of trainees’ performance in relation to feedback

There were several examples where the rating given did in fact reflect the trainee’s performance. Some ST3s were rated as performing at a higher level, eg at the level of completion of higher medical training, and these trainees were rated at this higher level on a consistent basis.

In contrast, there were some senior trainees who were not performing at the level expected. The rating was justified by the feedback given as there were implications that the trainee should have done better in some of the settings.

Suggestions for trainee development/action plans in the feedback

In general, about half of the feedback given included suggestions for further learning and development for trainees. An example of a good action plan:

Agree to: (1) take opportunities to be involved in emergency healthcare planning in different settings; (2) complex discharge planning – practice [sic] ‘rallying the troops’/MDT [multidisciplinary team] to contribute to this process in a constructive manner; (3) negotiating a patient’s expectation about discharge and helping them to move forward when appropriate.

The remainder of the action plans and suggestions for future development were vague and non-specific. Examples included: ‘continue to see a wide variety of cases’ and ‘continue the good work’. There was a dominant trend of advising the trainee to continue with what they were doing without specific guidance or suggestions.

Timing of feedback in relation to workplace-based assessment

It is difficult to know if the feedback provided was contemporaneous. There was some evidence that, with the ticket system, feedback was being given some time after the assessment.

Discussion

Feedback tended to be positive

It was noted that feedback tended to be positive in most cases with few trainees receiving negative feedback or suggestions for improvement. This does not meet one of the quality criteria for good feedback, which states that feedback should be balanced, giving both positives and negatives.7,8 Without honest feedback regarding actual performance, trainees are unlikely to seek advice about how to proceed in order to close the learning gap. Another possible explanation for the disproportionately positive feedback may be that trainees are able to select cases they feel confident at dealing with for their case-based discussion or mini clinical examination.

Differences in quality of feedback between specialties

It is notable that palliative medicine trainers provided higher-quality feedback to trainees compared with the other specialties in this study. Specifically, they provided more detail in the feedback given and also came up with appropriate action plans. The question is why does palliative medicine perform better than the other specialties in this aspect? Unlike general internal medicine and acute specialties, palliative medicine physicians see a smaller volume of patients and, therefore, are more likely to be able to spend time with their patients; albeit many of their patients have complex needs and issues. The evidence that the less acute nature of the specialty predisposes towards better feedback is supported by the finding that genitourinary medicine, another non-acute specialty, also provided good-quality feedback. Perhaps palliative medicine attracts and trains doctors to obtain skill sets that are appropriate for the specialty and also transferrable to providing feedback to trainees.

Feedback and rating of trainees in relation to their progression

It was observed that it was not possible to map the progression of trainees from the start of the training year to the end based on the assessors’ ratings. This is surprising as one would expect trainees to progress over the year. A potential explanation may be that the rating scales are insufficiently sensitive to map the progression of trainees. One of the quality indicators of good feedback is that trainee performance should be benchmarked to an appropriate standard; therefore, these standards should perhaps be included in the feedback forms, thereby enabling the assessor to make direct comparisons. In the case of a trainee at the start of their training, the trainer ought to be able to state that he/she is making satisfactory progress in comparison to other trainees at his/her stage.

Another possible explanation is that the WBAs were clustered together towards the end of training (pre-annual review of competence progression) rather than spread out over the year as intended (unpublished data, JRCPTB).

Rating of trainees’ performance in relation to feedback

The feedback provided reflected the level of the trainee’s performance. Thus, good trainees at an earlier stage of training were rated as performing at the level of completion of higher medical training with positive feedback appropriate for this rating; of 689 ratings, 127 (18%) were rated higher than expected. It is encouraging that reading the ratings and feedback given does provide an indication as to the overall performance of the trainee.

Suggestions for trainee development/action plans in the feedback

In this study, 46% (421 of 905 assessments) of the feedback received included action plans for trainees. This is a key quality indicator for feedback and all feedback should include an action plan. It is considered that formulation of an action plan may constitute the most critical step in providing feedback.1 Feedback is only effective when the trainee takes action to narrow the gaps identified.13 The evaluation of the WBA pilot from the JRCPTB supported this finding and, in fact, in the JRCPTB study only a minority of feedback included any action plan (recurring theme reported by 15 of 76 trainees).14 Faculty development is very important as evidenced by the work of Holmboe and others.15,16

Timing of feedback in relation to workplace-based assessments

With regard to the timing of the feedback in relation to the assessment taking place, there was some evidence (based on what was written) that the written feedback was completed a few days after the assessment. This was supported by evidence from the JRCPTB’s evaluation of the WBA pilot. One of the quality indicators for feedback is that it is timely, ie given immediately after the observed assessment. A study comparing delayed with immediate feedback demonstrated that effectiveness was reduced when feedback was delayed.5,17

Limitations of the study

A potential limitation of this study is that it is a snapshot of two deaneries (local education and training boards) in the UK; therefore, it may not be representative of training in the rest of the UK.

Not all medical specialties were included in the pilot and notably some large specialties, such as cardiology, gastroenterology and respiratory medicine, were not involved.

Feedback from other WBAs, such as multisource feedback and directly observed procedures, was not assessed.

Conclusions

Although there is much that is good about the feedback trainees receive, there is considerable room for improvement. Feedback is such an essential part of training that, unless it is done properly, WBAs are wasted opportunities and become merely tick-box exercises. The recommendations for improvement below are based on these results and the wider literature:5–7,11,13,15–17

  • A culture where it is safe to give and receive feedback needs to be established so that corrective feedback is viewed as an opportunity to improve rather than as painful criticism.

  • Information provided in feedback needs to be more specific and should include examples not generalisations.

  • Palliative medicine provided high-quality feedback and reasons why this specialty can achieve this should be explored so that others can learn from this.

  • In order to be able to track a trainee’s progress, there should be more options in the feedback forms for the supervisor to compare the trainee’s performance to what is expected at the stage of their training. The current rating scale could be further refined and made more sensitive to detect progress.

  • Action plans need to be made for all feedback provided. One way of encouraging this might be to give examples of a typical action plan in the feedback form.

  • Timing of feedback given should be contemporaneous and not several days after the assessment. To avoid the temptation to give feedback some time after the assessment, the ticket system for trainees to request written feedback needs to be looked at again.

Author contributions

TCKT made substantial contributions to the conception of the work, acquisition, analysis and interpretation of the data and drafted and revised the manuscript. BB and MB made substantial contributions to the conception of the work, analysis and interpretation of the data, and also helped to draft and revise the manuscript. All authors gave approval for the final version to be published.

Conflicts of interest

The authors have no conflicts of interest to declare.

Acknowledgements

We would like to thank the following:

  • TCKT's masters in clinical education was funded by the South Eastern Trust (special thanks to Craig Renfrew, director of medical education) and the Northern Ireland Medical and Dental Training Agency;

  • Winnie Wade (director of education, Royal College of Physicians, London, UK), who was involved in the initial discussions and gave her backing for this article;

  • the Joint Royal Colleges of Physicians Training Board staff, namely Sylvia Filip, Hannah Watts, Rifa Begum and Kirstin Barnett, for their administrative support and help with the data.

© Royal College of Physicians 2017. All rights reserved.

References

  1. Norcini J, Burch V. Workplace-based assessment as an educational tool: AMEE Guide No. 31. Med Teach 2007;29:855–71.
  2. Shepard LA. The role of assessment in a learning culture. Educ Res 2000;29:4–14.
  3. Mitchell C, Bhat S, Herbert A, Baker P. Workplace-based assessments of junior doctors: do scores predict training difficulties? Med Educ 2011;45:1190–8.
  4. Wilkinson JR, Crossley JG, Wragg A, et al. Implementing workplace-based assessment across the medical specialties in the United Kingdom. Med Educ 2008;42:364–73.
  5. Hattie JA. Influences on student learning. Auckland: University of Auckland, 1999. https://cdn.auckland.ac.nz/assets/education/about/research/documents/influences-on-student-learning.pdf [Accessed 18 May 2017].
  6. Veloski J, Boex JR, Grassberger J, Evans A, Wolfson DB. Systematic review of the literature on assessment, feedback, and physicians’ clinical performance: BEME Guide No. 7. Med Teach 2006;28:118–28.
  7. Krackov SK. Giving feedback. In: Dent JA, Harden RM (eds). A practical guide for medical teachers, 4th edn. London: Churchill Livingstone, 2013:323–32.
  8. Hesketh EA, Laidlaw JM. Developing the teaching instinct: 1, Feedback. Med Teach 2002;24:245–8.
  9. General Medical Council. Learning and assessment in the clinical environment: the way forward. London: GMC, 2011.
  10. Cho PSP, Parry D, Wade W. Lessons learnt from a pilot of assessment for learning. Clin Med 2014;14:577–84.
  11. Saedon H, Salleh S, Balakrishnan A, Imray CH, Saedon M. The role of feedback in improving the effectiveness of workplace based assessments: a systematic review. BMC Med Educ 2012;12:25.
  12. Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol 2006;3:77–101.
  13. Rushton A. Formative assessment: a key to deep learning? Med Teach 2005;27:509–13.
  14. Joint Royal Colleges of Physicians Training Board. ST3 Recruitment: palliative medicine. www.st3recruitment.org.uk/specialties/palliative-medicine [Accessed 18 May 2017].
  15. Holmboe ES, Hawkins RE, Huot SJ. Direct observation of competence training: a randomized controlled trial. Ann Intern Med 2004;140:874–81.
  16. Bahar-Ozvaris S, Aslan D, Sahin-Hodoglugil N, Sayek I. A faculty development program evaluation: from needs assessment to long-term effects of the teaching skills improvement program. Teach Learn Med 2004;16:368–75.
  17. Hattie J, Timperley H. The power of feedback. Rev Educ Res 2007;77:81–112.