Determining doctors’ views on performance measurement and management of their clinical practice

Timothy M Trebble, Charles Carder, Maureen Paul, Emily Walmsley, Richard Jones, Peter Hockey and Nicholas Clarke
DOI: https://doi.org/10.7861/futurehosp.2-3-166
Future Healthcare Journal October 2015, 2(3):166–170
Timothy M Trebble, consultant gastroenterologist, Portsmouth Hospitals NHS Trust, Queen Alexandra Hospital, Portsmouth, UK
Charles Carder, foundation (FY1) doctor, Portsmouth Hospitals NHS Trust, Queen Alexandra Hospital, Portsmouth, UK
Maureen Paul, HRM management MSc student, Department of Human Resources, University of Southampton, Southampton, UK
Emily Walmsley, specialist registrar in public health medicine, Health Education Wessex, Winchester, UK
Richard Jones, consultant cardiologist and clinical director, Wessex cardiovascular clinical network, Portsmouth Hospitals NHS Trust, Queen Alexandra Hospital, Portsmouth, UK
Peter Hockey, deputy postgraduate dean, Health Education Wessex, Winchester, UK
Nicholas Clarke, professor of human resource management, Department of Human Resources, University of Southampton, Southampton, UK

ABSTRACT

Introducing performance measurement and management of clinicians’ practice may improve clinical productivity and quality of patient care; however, the attitudes of doctors to such approaches are poorly defined. This was investigated through an anonymous qualitative postal questionnaire in a large district general hospital. A total of 93 of an invited cohort of 368 senior grade doctors participated. The results suggested that doctors understood the need to evaluate and manage their performance in medical practice, and to address poor performance, but felt that current methods were inadequate. This principally related to poor validation and a lack of clinical ownership of data. The role of financial incentivisation was unclear, but value was attributed to local clinical leadership, professional autonomy, recognition and peer-group comparisons. This suggests that clinicians support the use of data-based performance measurement and management; however, how it is undertaken is key to successful clinical engagement.

KEYWORDS
  • Performance measurement
  • performance management
  • validation
  • ownership

Introduction

Effectively managing the performance of hospital staff is associated with improved healthcare outcomes.1,2 However, the management of hospital-based clinicians in the NHS is rarely based on their individual clinical performance,3 which may contribute towards variation in the quality and productivity of clinical practice.4,5 Performance measurement evaluates the quality and productivity of individual performance, facilitating improvement in clinical practice6,7 through feedback, goal setting and monitoring.8

Commonly, standards of medical practice in NHS hospitals are measured at service or departmental level, and rarely extend to effective evaluation and management of individual doctors’ performance,3 despite significant changes in the management of clinicians at a national level. The introduction of the Consultant Contract was intended to develop frameworks allowing managers to plan and monitor doctors’ work around organisational needs, but resulted in a reduction in performance.9 Likewise, the introduction of appraisal and revalidation under an external regulatory body, the General Medical Council,10 was intended to maintain professional standards of practice, but involves limited use of objective data-based performance monitoring and is considered an ineffective method of managing doctors’ day-to-day productivity.11 Furthermore, professional engagement with national initiatives to introduce performance management in the UK has been poor,12 reflecting resistance by clinicians,13 which has limited implementation.14 However, measuring and managing the productivity and quality of practice of individual clinicians remains a key strategy for improving clinical performance in the NHS, including through job planning,15 remuneration with clinical excellence and incremental pay awards,16 and public release of clinical outcomes.17,18

Introducing effective clinical performance measurement and successfully translating this into quality and productivity improvements may be dependent on engaging the medical workforce and identifying their attitude to possible models, methods of implementation, safeguards and agreed ownership of data. This may also be necessary to avoid the dysfunctional behaviour and unintended consequences19 associated with poorly planned approaches.

The current study, therefore, investigated the attitudes of clinicians to performance measurement and management of individual clinical practice.

Methodology

Questionnaire development

The study cohort was identified using an electronic database of all hospital doctors holding senior grade positions, and included consultants, staff grades and associate specialists at the Trust. Individuals in formal medical management positions (eg clinical director or lead, chief of service and medical director) were excluded. All subjects were contacted by post and email with details of the study and invited to participate, and were subsequently followed up with a further email inviting participation.

Box 1. Questionnaire.

An anonymous qualitative questionnaire methodology was chosen to encourage open responses and to allow novel themes to emerge. The questionnaire consisted of descriptive questions developed from interviews with medical managers and a review of the current literature (Box 1). The questionnaire methodology was reviewed for content validity in three stages: i) by three newly qualified hospital specialists (post-CCST specialist registrars); ii) by two hospital specialists in permanent positions in the Trust but not included in the study cohort; and, after changes had been made, iii) by two senior medical managers. A pilot study of 10 subjects was conducted for a final review of the methodology.

Data collection

Data collection, collation and analysis were undertaken by CC, a foundation year 2 doctor and junior medical manager; MP, a masters student in human resources management at the University of Southampton; and EW, a specialist registrar in public health medicine, Wessex Deanery. The questionnaires were reviewed for common themes based on an agreed coding strategy. Review of responses and results was undertaken jointly, allowing further review of coding and consistency of interpretation. A randomly selected sample (10%) of each reviewer's assigned questionnaires was additionally re-coded in full by a second researcher, to ensure consistency between the researchers’ interpretation of the coding themes. No discrepancies were identified.
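To illustrate how such a dual-coding consistency check could be quantified, the sketch below computes simple percentage agreement and a chance-corrected statistic (Cohen's kappa) for two coders over a shared sample. It is purely a hypothetical illustration: the theme labels and responses are invented, and the study itself reports only that no discrepancies were found, without describing any statistical agreement measure.

from collections import Counter

def percent_agreement(primary, second):
    # Proportion of re-coded responses assigned the same theme by both coders.
    matches = sum(p == s for p, s in zip(primary, second))
    return matches / len(primary)

def cohens_kappa(primary, second):
    # Chance-corrected agreement (Cohen's kappa) between the two coders.
    n = len(primary)
    observed = sum(p == s for p, s in zip(primary, second)) / n
    counts_a, counts_b = Counter(primary), Counter(second)
    expected = sum(counts_a[c] * counts_b.get(c, 0) for c in counts_a) / (n * n)
    return 1.0 if expected == 1 else (observed - expected) / (1 - expected)

if __name__ == "__main__":
    # Hypothetical theme codes assigned to the same 10% sample of questionnaires
    # by the primary coder and by a second researcher re-coding in full.
    coder_1 = ["barriers", "ownership", "incentives", "barriers", "feedback"]
    coder_2 = ["barriers", "ownership", "incentives", "barriers", "feedback"]
    print(f"Percentage agreement: {percent_agreement(coder_1, coder_2):.0%}")
    print(f"Cohen's kappa: {cohens_kappa(coder_1, coder_2):.2f}")

In practice, a chance-corrected statistic is often reported alongside raw agreement, because raw agreement alone can overstate consistency when a small number of themes dominate the coding.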

Ethics

Ethical approval for the study was granted by the University of Southampton (reference number 5858), with local trust recognition as a service evaluation study.

Results

The study was undertaken at Portsmouth Hospitals NHS Trust, a large secondary healthcare provider serving a population of 650,000 in Hampshire, UK. A total of 368 senior grade clinicians (9 associate specialists, 23 specialty doctors and 336 consultants) were invited to participate, of whom 93 responded, from the specialties of surgery (19), medicine (24), anaesthetics (10), paediatrics (8), radiology (5) and pathology (4), with 23 not indicating a specialty; respondents had been from 1 to 30 years in post (mean 12.2). The following seven common themes were identified.

Current methods to evaluate and manage doctors’ performance within a department are varied but felt by many to be inadequate

Several methods were reported as currently being used to evaluate and manage doctors’ performance within departments. These included annual appraisal (28%; 26/93), 360-degree feedback (26%; 24/93), evaluation of workload activity (24%; 22/93), peer review (17%; 16/93), patient survey (15%; 14/93) and job planning (13%; 12/93). However, approximately one-third of responders considered that performance in their department was either managed poorly or not managed at all by anyone other than the individual themselves, with only a minority (2%; 2/92) suggesting that it was managed well using regular education meetings and feedback.

“I see very little evaluation of the performance of doctors in my department. If this data is known, it is not being given to the doctors [sic].”

“Broad levels of activity are apparent but not the clinical value or patient benefit.”

There was a tendency to report the use of quantitative measures such as theatre lists and complication rates in the surgical specialties, but a general feeling that the identification of appropriate performance measures was a lot more complex in specialties such as paediatrics and elderly care.

“People overall perform well because their peers work closely with them and if they don't it would be evident. Outcome measures [are] difficult to measure as children don't die and we don't do procedures.”

“Difficult to do in geriatric medicine – crude mortality rate is useless in a population where death may be a good outcome if advanced dementia, bed bound, in pain etc. Activity is also not a good indicator.”

Consultants have noticed a recent increase in attempts to monitor performance and a greater awareness of its importance

Almost two-thirds of responders (62%; 58/93) reported a change in their attitude toward evaluating and managing specialists’ performance in the last three years, with a perceived greater scrutiny of their workload and efficiency, including through appraisal and revalidation.

“The advent of revalidation and a structured appraisal process has made Drs more aware of the need to prove and measure their performance.”

Suggested barriers to specialists being evaluated and managed effectively included both practical obstacles and resistance from doctors

The practical barriers included a lack of available data of sufficient quality (26%; 24/93), time constraints (24%; 22/93) and a lack of administrative support (9%; 8/93). Furthermore, it was considered that medical practice was inherently difficult to quantify or measure, and that this currently inhibited the development of standard methods of quantitatively assessing performance (25%; 23/93).

“What is meaningful is difficult to measure in paeds.”

“The heterogeneity of the workload/case mix and finding good ways of measuring/evaluating performance.”

“The ‘units’ used to measure performance are not defined. How is clinical competence measured?”

In addition, a number of respondents (26%; 24/93) felt that negative attitudes to performance management, or even direct resistance by doctors, represented an obstacle.

“Suspicion that it will be used as a management tool and used inappropriately.”

“There is a real risk that any system of performance management becomes punitive and is viewed negatively.”

“In house, it is difficult as the clinicians have to work together and relations may become strained.”

“Reticence from clinicians in general. Scepticism about the systems devised. Will they just cause individuals to ‘game’ the system?”

“Personal resistance/reluctance – some of it rightly inherited by the view that it doesn't weed out ‘poor’ doctors.”

Departmental-level medical managers (eg clinical directors or leads) should be responsible for evaluating and managing doctors’ performance

The majority of consultants felt that evaluating and managing specialists’ performance is the role of the clinical director (the departmental-level medical manager) (80%; 74/93) (‘Doctors need to evaluate doctors’). However, obstacles to this included the time available, the fixed period of time in post, and the need for appropriate training and support (‘It is not CDs job to collect the information’).

“Need to make sure that highly paid staff are providing value for money to their employer.”

“There is a role for the CD to oversee the process and identify and act on poor performance.”

“They should be in best position to balance ‘performance’ vs ‘professionalism’.”

“The only people who can understand and interpret the quality data is the specialists in that field.”

Consultants are prepared to address perceived performance issues with colleagues

The majority of consultants would be, or have been, prepared to address perceived underperformance with their colleagues (73%; 68/93), with patient safety a major factor. However, reported concerns included that this often reflected sensitive issues, potentially creating tensions within a department (23%; 21/93), previous negative experiences, that it may be perceived as ‘whistle blowing’, or that there was a lack of quality data to support such practice.

“Addressing underperformance would be very sensitive and there would be concerns about a ‘defensive attitude’.”

“Only [in a] subtle way, as there is a risk of breakdown of relationships.”

The majority of consultants had received some form of information on personal performance (71%; 66/93), but less than half (49%; 46/93) found this useful and only 23% (21/93) reported that it resulted in a change in their practice. A small number said they monitored their own performance (4%; 4/93).

There were mixed views about how useful doctors considered information on their own performance would be. Among responders who felt that it would not be useful, concerns were expressed regarding the quality and accuracy of the information.

“I would welcome this. I would find it useful providing assistance given to make changes if appropriate.”

“As long as performance involves numbers only it is useless.”

Consultants have mixed views on the ownership and sharing of performance data

A wide range of opinions were expressed by respondents regarding who should own doctors’ performance data, and whether it should be shared with colleagues, their employing hospital and the public. Over one-third of respondents (39%; 36/93) felt that data should be widely shared (including with the public), but only after adequate validation. A minority (11%; 10/93) of responders felt that all data should be shared without the need for validation prior to publication, but a third (36%) felt that sharing should be limited (ranging from no sharing at all (3%) to sharing with consultants only (2%)).

“Total transparency is only way.”

“Owned by individual and clinical leader, wider dissemination dangerous.”

“The public have little idea of significant [sic] of raw data…rely on the interpretation of the press.”

“NHS becoming increasingly isolated in terms of not sharing performance – hard to defend. Public pay for service so should know.”

Non-financial incentives would motivate consultants to work differently; financial incentives are motivational for only a minority

Respondents reported that they would be motivated to work differently by a number of different methods and incentives. Of these, feedback was the most frequently reported (52%; 48/93), followed by increased autonomy (27%; 25/93), recognition (22%) and comparison with peers (13%; 12/93). A minority of respondents stated that financial incentives would motivate them to work differently (27%; 25/93), while some felt that their effect may be detrimental or personally ineffective (8%; 7/93).

“A…supportive environment, some occasional thanks for all the hard work and relief from a culture of fear and litigation.”

“I would like to know how I am performing.”

“Involvement in decision making, good information from management, feedback from patients and colleagues.”

“Performance targets, financial rewards/punishments will only reduce the quality of work.”

“Ownership within the team and the feeling of being valued, money doesn't do it for me!”

“Professional pride, I do not believe this should ever be linked to financial incentives.”

Discussion

This study investigated the attitudes of hospital-based doctors to the role of performance measurement and the use of data-based performance management in changing practice. The results suggest that doctors are open to addressing performance issues with colleagues, especially when these relate to patient safety. However, how this is undertaken, and particularly how the data are validated (to ensure accuracy and relevance), who owns the data and how the process is led, is fundamental to obtaining doctors’ engagement. A wide variety of methods were currently being used for measuring the performance of doctors within the Trust; however, this was considered to be successful in only a minority of cases. Doctors’ views on the value of financial incentivisation to motivate change in practice were varied, with more weight given to feedback, autonomy, recognition, comparison with peers and clear objectives.

To our knowledge, this is the first study in the published literature to directly evaluate doctors’ opinions on performance measurement and management, but its findings are consistent with the conclusions of other bodies of work. For example, a report by the UK National Audit Office3 suggested that medical and non-medical managers had difficulty in challenging doctors’ performance, and that this reflected the absence of usable information on performance.11 Furthermore, clinicians have expressed fears over the negative unintended consequences of the publication of such data, for example distortions of activity, gaming and perverse incentives,4 or the potentially detrimental effects of poorly validated data.20 This may also reflect the low levels of trust doctors have in their managers.21 Finally, there is a large body of evidence suggesting that current informatics and available data are of poor quality and that this negatively influences medical engagement. The Royal College of Physicians’ (RCP's) iLab reported that current routinely generated data are not fit to support clinicians managing departmental performance and that developing clinician engagement should be an NHS priority.22 A further large RCP postal questionnaire study suggested that physicians considered routinely collected, centrally returned hospital episode statistics (HES) (coding) data insufficiently validated to provide meaningful evaluation of physicians’ performance, and that these data were associated with insufficient engagement with doctors.23

The varying responses to the roles of financial and non-financial rewards in incentivising performance improvement are notable and may reflect differences in the motivational needs of doctors, their cultures or their professional roles.24 This may need to be considered in our approach to engaging doctors with improving practice.

The Trust involved in the study did not use a standardised, organisationally mandated (and supported) system of clinician performance measurement and management; nor is one available for the NHS. This may be beneficial in allowing departments and their managers to adopt approaches consistent with their individual needs (eg a contingency approach), but it also has disadvantages: a lack of uniformity precludes direct comparison between departments and may adversely affect the provision of performance data for the organisation as a whole.

There are limitations to this study. First, the study was performed in a single organisation with a relatively small cohort, albeit one covering a cross-section of specialties; respondents were self-selected through the methodology used, and how representative they are of the broader medical body is unconfirmed. The response rate following written and follow-up email invitation was disappointing at approximately 25%; however, this is within the expected range of 10–50% for responses to a postal questionnaire.25 The relatively low numbers precluded subgroup analysis, for example surgeons compared with physicians. Second, as a qualitative study, responses are subject to the interpretation of the researchers, although considerable efforts were made to ensure consistency of the analysis.

The findings of this study suggest that doctors are most likely to be responsive to performance evaluation and management through modern systems of human resource management. Such approaches are more consistent with an ‘empower and facilitate’ as opposed to a ‘command and control’ organisational culture, focusing on intrinsic as opposed to extrinsic motivational strategies. Examples of this have been developed and evaluated within the context of individual performance review11 and performance measurement and management models26 for senior hospital clinicians, based on the systems used in successful non-healthcare public sector, commercial sector and charity sector organisations.

In conclusion, the results of this study suggest that hospital consultants are willing to engage with performance measurement, and to manage and monitor their peers; however, current management approaches are considered insufficient.

© Royal College of Physicians 2015. All rights reserved.

References

  1. West MA, Borrill CS, Dawson JF, et al. The link between the management of employees and patient mortality in acute hospitals. Int J Hum Resour Manag 2002;13:1299–310.
  2. West MA, Guthrie JP, Dawson JF, Borrill CS, Carter M. Reducing patient mortality in hospitals: the role of human resource management. J Organ Behav 2006;27:983–1002.
  3. Comptroller and Auditor General. Managing NHS hospital consultants. London: National Audit Office, DoH, 2013.
  4. Tomson CR, van der Veer SN. Learning from practice variation to improve the quality of care. Clin Med 2013;13:19–23.
  5. Faulkner AC, Harvey IM, Peters TJ, Sharp DJ, Frankel SJ. Profiling outpatient workload: practice variations between consultant firms and hospitals in south west England. J Epidemiol Community Health 1997;51:310–4.
  6. Soong C, High S, Morgan MW, Ovens H. A novel approach to improving emergency department consultant response times. BMJ Qual Saf 2013;22:299–305.
  7. Kiefe CI, Allison JJ, Williams OD, et al. Improving quality improvement using achievable benchmarks for physician feedback: a randomized controlled trial. JAMA 2001;285:2871–9.
  8. Van Dooren W, Bouckaert G, Halligan J. Performance management in the public sector. Abingdon: Routledge, 2010:16–37.
  9. Bloor K, Maynard A, Freemantle N. Variation in activity rates of consultant surgeons and the influence of reward structures in the English NHS. J Health Serv Res Pol 2004;9:76–84.
  10. Armitage M, Shah K, Dacre J, Levine D, Waller D. Appraisal and revalidation: guidance for doctors preparing for relicensing and specialist recertification. 1. Appraisal. London: RCP, 2007.
  11. Trebble TM, Cruickshank L, Hockey PM, et al. Individual performance review in hospital practice: the development of a framework and evaluation of doctors’ attitudes to its value and implementation. BMJ Qual Saf 2013;22:948–55.
  12. Royal College of Physicians iLab. Engaging clinicians in improving data quality in the NHS. London: RCP, 2006.
  13. Tavare A. Performance data: ready for the public. BMJ 2012;345:21–3.
  14. Epstein AM. Performance measurement and professional improvement. In: Smith P, Mossialos E, Papanicolas I, Leatherman S (eds), Performance measurement for health system improvement. Cambridge: Cambridge University Press, 2009:613–40.
  15. BMA and NHS Employers. A guide to consultant job planning. London: BMA and NHS Employers, 2011.
  16. Review Body on Doctors’ and Dentists’ Remuneration. Review of compensation levels, incentives and the clinical excellence and distinction award schemes for NHS consultants. London: Stationery Office, 2012.
  17. Iacobucci G. Patient survival rates for individual surgeons will be published from 2013. BMJ 2012;345:e8617.
  18. NHS Commissioning Board. Everyone counts: planning for patients 2013/14. London: NHS England, 2012.
  19. Sheldon T. Promoting health care quality: what role performance indicators? Qual Health Care 1998;7 Suppl:S45–50.
  20. Chinthapalli K. Surgeons’ performance data to be available from July. BMJ 2013;346:f3795.
  21. Davies HT, Hodges CL, Rundall TG. Views of doctors and managers on the doctor-manager relationship in the NHS. BMJ 2003;326:626–8.
  22. Royal College of Physicians and Informing Healthcare. The iLab in Wales: involving clinicians in improving data quality. London: RCP, 2007.
  23. Croft GP, Williams JG, Mann RY, Cohen D, Phillips CJ. Can hospital episode statistics support appraisal and revalidation? Randomised study of physician attitudes. Clin Med 2007;7:332–8.
  24. Conrad DA. Incentives for health-care performance improvement. In: Smith P, Mossialos E, Papanicolas I, Leatherman S (eds), Performance measurement for health system improvement. Cambridge: Cambridge University Press, 2009:613–40.
  25. Saunders M, Lewis P, Thornhill A. Selecting samples. In: Saunders M, Lewis P, Thornhill A (eds), Research methods for business students, 5th edn. Harlow: Pearson Education Ltd, 2009:210–55.
  26. Trebble TM, Paul M, Hockey PM, et al. Clinically led performance management in secondary healthcare: evaluating the attitudes of medical and non-clinical managers. BMJ Qual Saf 2015;24:212–20.