Outliers from national audits: their analysis and use by the Care Quality Commission in quality assurance and regulation of healthcare services in England
==========================================================================================================================================================

* Helen Grote
* Keiko Toma
* Laura Crosby
* Catherine Robson
* Clare Palmer
* Claire Land
* Jessica Ball
* Edward Baker

## ABSTRACT

The Care Quality Commission (CQC) is the independent regulator of health and adult social care in England. As part of its intelligence-driven approach to regulation, the CQC works closely with national clinical audit bodies to identify key metrics that reflect quality of care and to track the performance of providers against these metrics. Where outliers on national audits are identified that may reflect risks to patients, the CQC encourages the hospital to identify any learning points and implement changes to improve patient care. In this article, we describe the role of national audit outcomes in the regulatory process and how providers can use national audits to inform both quality assurance and quality improvement processes, with two illustrative case studies. We discuss the ongoing challenges with using audit data in the regulatory process and how these could be addressed.

KEYWORDS: audits and feedback, continuous QI, governance, patient safety, standards of care

## Introduction

The Care Quality Commission (CQC) is the independent regulator of health and adult social care in England. It was established in 2009 to ensure that care providers meet the fundamental standards required for good patient care. The CQC's four strategic priorities are to encourage improvement, deliver an intelligence-driven approach to regulation, promote a shared view of quality and improve the efficiency and effectiveness of its regulation. The use of national audit data for regulation spans all these strategic priorities, helping the CQC to better assess how providers meet the fundamental standard for good governance. Providers must have processes in place to review audit data, and must monitor and improve the quality and safety of their services accordingly.1

## The use of data from national audits

Participation in the National Clinical Audit and Patient Outcomes Programme (NCAPOP) is a condition of the NHS Standard Contract for hospitals, and the resulting feedback can be used by providers to monitor and improve the quality of services.2 Evidence indicates that this process drives improvements in care: a recent Cochrane review of 140 randomised trials found that audits produced a median 4.3% improvement in the compliance of healthcare professionals with standards of best practice.3

In England, the Healthcare Quality Improvement Partnership (HQIP) holds the contract to manage the NCAPOP on behalf of NHS England. HQIP commissions national clinical audits (NCAs) covering some of the most commonly occurring conditions. The CQC's analytics team works closely with NCA steering groups and expert clinicians to select high quality key metrics which demonstrate the effectiveness of care.4 These key metrics are used to benchmark providers against national standards and comparable peers. Where appropriate, the audit data are risk adjusted.
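To illustrate what risk adjustment involves in practice, the sketch below shows one common approach, indirect standardisation: a case mix model assigns each patient an expected risk of the outcome, and a unit's observed event count is divided by the sum of those expected risks. Everything here is hypothetical; the coefficients, covariate names and patients are invented for illustration, and real models (such as the SSNAP stroke case mix model discussed later) are derived and validated on national data.7

```python
import math

# Hypothetical logistic case mix model for 30-day mortality. The covariates
# mirror those used by SSNAP (age, atrial fibrillation, stroke type, NIHSS),
# but the coefficients are invented for illustration only.
COEFFS = {"intercept": -4.0, "age": 0.04, "af": 0.6, "haemorrhage": 0.9, "nihss": 0.12}

def expected_risk(patient: dict) -> float:
    """Probability of the outcome predicted by the case mix model."""
    z = (COEFFS["intercept"]
         + COEFFS["age"] * patient["age"]
         + COEFFS["af"] * patient["af"]
         + COEFFS["haemorrhage"] * patient["haemorrhage"]
         + COEFFS["nihss"] * patient["nihss"])
    return 1 / (1 + math.exp(-z))

def risk_adjusted_ratio(patients: list[dict]) -> float:
    """Observed events divided by model-expected events: a ratio well above 1
    suggests worse-than-expected outcomes after accounting for case mix."""
    observed = sum(p["died"] for p in patients)
    expected = sum(expected_risk(p) for p in patients)
    return observed / expected

# Invented patients for one unit.
unit = [
    {"age": 78, "af": 1, "haemorrhage": 0, "nihss": 14, "died": 1},
    {"age": 62, "af": 0, "haemorrhage": 0, "nihss": 4, "died": 0},
    {"age": 85, "af": 1, "haemorrhage": 1, "nihss": 22, "died": 1},
]
print(f"Risk-adjusted ratio: {risk_adjusted_ratio(unit):.2f}")
```

The point of the adjustment is that a unit admitting older, sicker patients is compared against what would be expected for those patients, rather than against a raw national average.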
Selected audit data, statistical benchmarks, data visualisations and guidance from clinicians on how to use audit data for improvement are developed by the CQC's analytics team in conjunction with HQIP. HQIP publishes this information primarily to support medical directors and local clinical audit staff, but also for the public.5,6 This streamlined approach to the presentation and visualisation of audit data can be used to generate a shared view of quality and assess the performance of providers. Where no national standards currently exist, the CQC encourages audit bodies to recommend appropriate standards, with the aim of generating a shared view of quality across partner organisations and encouraging improved performance of all providers over time.

Data from national clinical audits, such as the National Emergency Laparotomy Audit (NELA) and the Sentinel Stroke National Audit Programme (SSNAP), are particularly valuable in assessing the performance of providers, as these are generally well validated, with fewer limitations than some other national datasets.7 The CQC strongly encourages providers to participate in all audits listed in the NHS England quality accounts, both for the purposes of quality assurance and for quality improvement (QI). Good engagement with NCAs, together with high quality data, is required to ensure that national benchmarking retains its impact in encouraging improvement. Where the CQC is informed of poor engagement or non-participation in NCAs, a letter is issued to encourage provider chief executives to increase and sustain their participation in NCAs for the relevant clinical services in the hospital. These letters may also encourage service managers to provide clinicians with the time and resources needed to participate in audits, which are currently lacking in some organisations.

## How audits identify outliers

Many NCAs analyse key outcome and process measures for outliers. The outlier analysis identifies providers with the greatest variation from the expected levels of performance, thereby stimulating reviews of care with the aim of encouraging improvement. A negative outlier on a national audit does not necessarily mean there has been poor care; rather, it is a potential indicator of poor care that warrants further investigation. It is worth noting that a number of national audit programmes use statistical case mix adjustment to account for expected variance in relation to the patient cohort admitted to different units; for example, the SSNAP audit uses a case mix adjustment process taking into account the age of the patient, underlying atrial fibrillation, type of stroke (haemorrhage versus infarction) and National Institutes of Health Stroke Scale (NIHSS) score on arrival at hospital.7

The most common method used by NCAs to analyse the degree of variation within a system is a funnel plot. This method looks at a cross-section of data and how far each data point deviates from the mean, median or target value, which is shown as a horizontal line cutting through the graph. This can be seen clearly in Fig 1, which shows the mean adjusted glycated haemoglobin (HbA1c) values for children and young people with type 1 diabetes cared for by paediatric diabetes units in England and Wales. The dashed and dotted lines on the graph represent two and three standard deviations (SDs) from the mean, respectively. Units above the top dotted line are considered 'alarm' level (outside the three SDs control limit) as their mean HbA1c values are significantly worse than those of other units in England and Wales.
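As a concrete illustration of the funnel plot logic described above, the following sketch computes 2 SD ('alert') and 3 SD ('alarm') control limits around a national mean, using the standard error of the mean for a unit reporting n patients. The national mean, SD and unit figures are invented for illustration and are not taken from the audit.

```python
import math

def funnel_limits(national_mean: float, sd: float, n: int) -> dict:
    """Control limits for a unit reporting n patients: the standard error of
    the mean shrinks as n grows, which gives the plot its funnel shape."""
    sem = sd / math.sqrt(n)
    return {
        "alert_low": national_mean - 2 * sem, "alert_high": national_mean + 2 * sem,
        "alarm_low": national_mean - 3 * sem, "alarm_high": national_mean + 3 * sem,
    }

def classify(unit_mean: float, limits: dict) -> str:
    """'alarm' outside the 3 SD limits, 'alert' outside the 2 SD limits."""
    if not limits["alarm_low"] <= unit_mean <= limits["alarm_high"]:
        return "alarm"
    if not limits["alert_low"] <= unit_mean <= limits["alert_high"]:
        return "alert"
    return "within control limits"

# Invented figures: national mean HbA1c 65 mmol/mol, between-patient SD 12.
for name, unit_mean, n in [("Unit A", 64.0, 250), ("Unit B", 68.0, 80), ("Unit C", 78.0, 120)]:
    print(name, classify(unit_mean, funnel_limits(65.0, 12.0, n)))
```

Note that the same unit mean can be unremarkable for a small unit but an outlier for a large one, because the limits tighten as the number of reported patients increases.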
Fig 1. **Mean adjusted glycated haemoglobin values for children and young people with type 1 diabetes cared for by paediatric diabetes units in England and Wales: an example of 'alarm' level outliers.** Units outside the upper three standard deviation control limit are designated 'alarm' level outliers. Adapted with permission from the Healthcare Quality Improvement Partnership. HbA1c = glycated haemoglobin; PDUs = paediatric diabetes units; SDs = standard deviations.

The greater the sample size, the more accurate the funnel plot method. This is because, as the sample size increases (for example, when more units submit data), the standard error of the mean (reflecting how far the mean of the data is likely to be from the true population mean) decreases. Target values on the funnel plot are then more likely to reflect the true population mean; in this case, the HbA1c values for children and young people cared for by paediatric diabetes units in England and Wales. As a result, where the sample size is larger, those units outside the marked control limits can be more confidently designated as true outliers. The accuracy of funnel plots can also be affected by heterogeneity; for instance, in the clinical setting or types of participants. Publication bias in audits (such as selective reporting and missing data) can also affect the quality of the overall analysis.8

## The CQC actions following notification of an audit outlier and the response expected from providers

Regular engagement meetings with providers and on-site inspections of hospitals are critical to the work (see Box 1), as the factors affecting quality cannot be assessed from data alone.9 Data and indicators can be the starting points of conversations between the CQC and providers about their understanding of the outcome(s) over time and their local QI activity to improve care quality for patients. Data and indicators on their own are not judgements.

**Box 1. Case studies**

**Case study 1: The provider experience: King's College Hospital NHS Foundation Trust**
The national clinical audit programme is a significant component of the quality improvement programme at King’s College Hospital NHS Foundation Trust (KCH). The trust has two large acute hospital sites: 87 national audits and confidential enquiries are relevant to the King’s College Hospital (Denmark Hill) site and 64 to the Princess Royal University Hospital (Bromley) site.
Reviewing the results of national audits is a significant undertaking and happens both within the clinical teams and centrally. There is no standardisation in the reporting of the national audits, so the patient outcomes team analyses each report and, with input from the clinical leads, summarises the key messages in relation to successes, concerns and required improvements. This enables the busy clinical teams, clinical governance structures, executive team and board to quickly and easily understand the implications of the results and use the information to drive improvement.
As with all NHS organisations, excellent patient outcomes and experience are the key aims, so to get the most value from the effort expended, the trust focuses particularly on national audit indicators relating to patient outcomes and experience. Where the trust does not perform well against these indicators, it targets a detailed review of its processes through internal investigations. The trust embeds the improvement actions into existing care group-, division- and trust-level action plans, ensuring that the work is joined up with other organisational priorities and improvement actions.
The trust is currently working with the business intelligence unit to develop an application to enable the capture of these data for use within the trust’s data warehouse. This will support clinical teams to use data from the national audits alongside performance and activity data to provide a holistic picture of their service. In addition, this will enable the trust to analyse themes occurring across the national audits, such as those relating to complications, pain management and provision of services.
**Case study 2: The inspector experience: working with Calderdale and Huddersfield NHS Foundation Trust**
From an inspector's perspective, outcomes from externally validated data are an important part of engagement, monitoring and on-site inspections, as they help the Care Quality Commission (CQC) to objectively measure how an organisation is performing.
The approach to managing outlier alerts from national audits can also be indicative of an organisation's responsiveness and culture; for example, Calderdale and Huddersfield NHS Foundation Trust recently experienced an outlier alert in relation to a National Paediatric Diabetes Audit measure. The trust informed the CQC relationship owner (inspector) of this at the point at which they were alerted. This indicated that the trust was open and was proactively managing alerts. It also enabled the inspector to have timely oversight of potential areas of concern, greater awareness of trust investigations to determine why the alert occurred (for example, from audit peer review) and knowledge of any immediate actions taken. Ongoing work includes reviewing the completion of action plans and discussing performance during regular engagement meetings. The proactive approach taken by the trust helped to provide assurance to the CQC that the organisation was not merely collecting data but was actively using it to drive quality improvements, a marker of being well led.9

Once a potential alert level (greater than two SDs but less than three SDs from the mean) or alarm level (greater than the three SDs control limit) outlier is identified, a letter is sent to the provider to request a response. The clinical lead for the relevant service is expected to acknowledge the potential outlier status and validate that the data supplied to the audit for analysis were correct, or supply the correct data to allow the audit to confirm whether or not the provider is an outlier. Following the response, the NCA will confirm whether there is a case to answer.

The CQC inspection framework sets out what standards need to be met to achieve a 'good' or 'outstanding' rating against the 'effective' and 'well-led' domains. This stipulates that the CQC expects the provider to investigate the audit outlier and submit a robust action plan. The review of the audit outlier should be undertaken by an individual with the appropriate seniority and clinical knowledge.10 The plan should address the learning points identified in the review, provide evidence of changes that the hospital has made to improve care and explain how these improvements will be monitored over time, together with named leads and timescales (supplementary material S1). A multi-professional team, comprising clinicians, analysts and regulatory inspectors, meets regularly to review the responses from providers and assess whether each provider has adequately assessed its outlier and produced an appropriate action plan with learning points. The CQC will also monitor subsequent national audit benchmarking outcomes to see that performance is improving over time.

## How hospitals can use audit data to inform quality improvement initiatives

The review of clinical audit data locally should be a cyclical process, not a one-off exercise, and should be undertaken with a view to identifying areas for improvement, not just as a 'tick-box' exercise for regulation and assurance purposes.
A systematic review undertaken by the Cochrane Study Group demonstrated that audits were more likely to produce a beneficial effect where baseline adherence to recommended practice was low and where feedback was delivered at regular intervals, with explicit targets and an action plan.2 NCA steering groups and royal colleges have produced several high-quality resources to support clinicians and providers in using audit data to support continuous QI.10–13 Any QI approach should provide a mechanism for using data from NCAs to inform, drive and stimulate improvement, and should synchronise audit cycles and QI in order to sustain improvements in care.10 In practice, this means that a good and effective clinical service is one in which staff understand their data from national audits and use this information to drive local QI work and monitor improvements in patient care and safety.10–14

Many hospitals are now using statistical process control (SPC) to distinguish the common cause and special cause variation seen in data and thereby monitor their local QI initiatives over time (Fig 2). Common cause variation indicates the chance variation expected in a system, whereas special cause variation indicates that something unusual might be occurring. NELA uses SPC and makes these charts publicly available.

Fig 2. **Interval from decision to operate to arrival in theatre (expedited): an example of special cause variation.** The black line represents the month-by-month national benchmark hospital performance on the time interval from decision to operate to the patient arriving in theatre. The teal line illustrates the data for this particular hospital. The teal dots represent individual patients. There are three outliers (dots) above the upper control limit (red line), indicating patients who waited longer than the upper control limit. Adapted with permission from the National Emergency Laparotomy Audit.

The chart in Fig 2 illustrates the variation in the interval from the decision to operate to the time that the patient arrives in theatre. The lower control limit is zero hours and the upper control limit (red line) is 83 hours; we would expect 95% of data points to fall within these limits. In this example, there are three outliers (teal dots) above the upper control limit, indicating patients who had long waits from decision to operate to arrival in theatre. These outliers are an example of special cause variation and should prompt the clinical team to look into the circumstances of these three cases for opportunities for improvement.
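As a minimal sketch of how such control limits can be derived, the code below uses an individuals (XmR) chart, one common SPC method: the centre line is the mean of a baseline period and the upper control limit is the mean plus 2.66 times the average moving range between successive points. The waiting times are invented, and NELA's published charts may calculate their limits differently.

```python
def xmr_limits(baseline: list[float]) -> tuple[float, float]:
    """Individuals (XmR) chart: the centre line is the baseline mean and the
    upper control limit is the mean plus 2.66 times the average moving range."""
    centre = sum(baseline) / len(baseline)
    moving_ranges = [abs(b - a) for a, b in zip(baseline, baseline[1:])]
    ucl = centre + 2.66 * sum(moving_ranges) / len(moving_ranges)
    return centre, ucl

# Invented decision-to-theatre intervals (hours) for successive patients.
baseline = [6, 9, 5, 11, 8, 7, 6, 10, 12, 9, 7, 8]
centre, ucl = xmr_limits(baseline)

# Points beyond the upper control limit are special cause variation and
# should prompt a case review; points within it are common cause variation.
new_waits = [7, 95, 10, 88, 9]
flagged = [w for w in new_waits if w > ucl]
print(f"centre {centre:.1f} h, UCL {ucl:.1f} h, cases to review: {flagged}")
```

Deriving the limits from a stable baseline period, rather than from the monitored data themselves, stops extreme waits from inflating the limits and masking exactly the cases that need review.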
When a hospital has provided the CQC with evidence of its local changes and its analysis of the improvements over time (for example, with SPC), this helps provide assurance that the hospital is monitoring and improving patient outcomes and that the service is well led.9

## Does the use of data from national audits and assessment of outliers improve patient care?

Information from the CQC, NCAs and published research all highlight examples of systematic, data-driven approaches in which clinical audit has improved patient care.15–20 A CQC report into QI in hospital trusts found that those rated as 'good' were monitoring clinical effectiveness across all their services and took immediate action wherever they found concerns. A focus on continuous QI was a key feature of high-quality care.21

National audit steering groups also support improvement through the development of tools to help providers. The National Audit of Inpatient Falls identified two areas of weakness from the 2015 audit: measurement of visual acuity, and measurement of lying and standing blood pressure. Clinical tools were developed by the audit steering group to help providers measure these parameters. For example, South Tyneside District Hospital implemented these QI tools and, together with clear clinical leadership, support from a falls specialist nurse and engagement of patient-facing staff, improved measurement of lying and standing blood pressure from a baseline of 7.4% to 100% and reduced inpatient falls by 53% in just 6 months.16 Significant improvements in patient outcomes have also been observed following changes to clinical practice prompted by other audits, including the National Lung Cancer Audit, NELA and the SSNAP audit.13,19

Evidence also indicates that identification of outlier status alone can drive improvement. A study of 31 outlier hospitals (out of 86) in the states of New York and Massachusetts found that, although these institutions were larger and treated more patients, their mortality fell after public reporting of outlier status. The reduction in in-hospital mortality exceeded that observed at non-outlier institutions, with no significant corresponding reduction in the number of procedures performed, suggesting that local improvement processes had been implemented.17 More recently, East Kent Hospitals University NHS Foundation Trust was flagged as an outlier on two measures in the National Lung Cancer Audit: the number of small cell lung cancer patients receiving chemotherapy and the number of patients with advanced non-small cell lung cancer receiving systemic anticancer treatment. This prompted a service redesign including the recruitment of more specialised nurses and improvements to radiology and pathology reporting processes.19

## Ongoing challenges with the use of audit data in the regulatory process

Audit data and, in particular, the assessment of outliers already form part of the CQC's intelligence-led approach to regulation; developing this approach further could reduce the burden on providers, as the data are already submitted to NCAs. The CQC expects all providers, regardless of sector, to contribute to national audits and to use audit data to monitor and improve the quality and effectiveness of their clinical services. Poor quality and incomplete data submissions to audits have, in themselves, been linked to poor quality care: a review of missing data relating to baseline disease activity in the National Clinical Audit for Rheumatoid and Early Inflammatory Arthritis demonstrated that poor engagement with the audit correlated with poor quality care.22 Although submitting data to national audits is a time-consuming process, this is improving with the digitisation of healthcare records.
Providers expect a reciprocal relationship whereby NCA steering groups analyse the data and provide clear reports with benchmarking against peers and, where improvement is required, unambiguous presentation of 'headline' metrics.23 The development of clear dashboards highlighting key metrics (such as those developed by HQIP and the CQC through the national clinical audit benchmarking project: [https://ncab.hqip.org.uk](https://ncab.hqip.org.uk)) gives provider organisations valuable information in a form that can be used effectively.23 Making more efficient use of audit data in quality assurance and regulatory processes will require careful selection of key metrics and streamlining of data collection by NCAs where appropriate. Standardised reporting formats from NCAs would also better enable the development of apps and dashboards for easy data visualisation and benchmarking.

Although audits are increasingly including independent providers and mental health services, most NCAs operate solely within NHS acute hospitals, which limits their wider application to regulation. The CQC expects independent hospitals and mental health trusts to participate where possible and encourages NCA steering groups to enable these sectors to participate easily and to use the data to inform local quality assurance and improvement processes. The coverage of services by national audit processes is also inadequate at present: while surgical specialties and stroke services have a well-established culture of audit, benchmarking and reporting, other specialties (including psychiatry, children and young people's services and cancer services) have a limited number of national audits covering only a few conditions within their specialty remit.

Furthermore, the effectiveness of many clinical audits in promoting change is limited by their episodic approach, which provides only a snapshot of performance for a designated short timeframe of data collection, often separated by a period of a year or more. Where the audit cycle is separated by long periods of time, the effect of other factors (such as seasonal variations in demand or case mix) on service performance is also likely to be missed. Some audits, such as the Falls and Fragility Fracture Audit Programme and the SSNAP audit, have moved towards cycles of continuous audit. This provides continuous, timely feedback and a greater wealth of data from which patterns can be drawn to identify which interventions are most likely to improve patient care. Other programmes audit selected topics on a periodic basis only; the National Cardiac Audit Programme (NCAP), for example, publishes some of its data 3 years after collection. This variation in time between audit cycles limits the use of data for regulatory purposes and is hugely frustrating for clinicians seeking to implement QI processes in a timely manner.23

Traditional audit approaches are gradually being incorporated into cycles of continuous QI, implemented locally but designed according to internationally recognised methods (such as plan, do, study, act).24 The Royal College of Emergency Medicine (RCEM) audit programme, for example, has evolved into a national QI programme with a greater focus on continuous improvement incorporating audit data.25 This allows a more responsive approach, enabling problems to be detected earlier and providing a means by which systems, processes (not just outcomes) and non-technical aspects of care can be assessed.
A future challenge to regulation will be how best to benchmark and compare providers who are using recognised high-quality QI approaches to respond to the need for service improvements based on locally obtained data.

## Conclusion

While NCA participation and results are increasingly used to inform monitoring, inspection and rating of healthcare services, the goal of all audit activity should be to drive improvement in patient care. This requires commitment from hospital boards, who should ensure that there is provider-wide oversight of audit outcomes and outliers, enabling any trends of poor performance to be identified and acted upon, and areas of improvement to be celebrated and encouraged. Awareness of the role of national audits, and their place in systematic QI, should be encouraged among all clinical staff and at all stages of training.11 Findings from the CQC's inspection activity have consistently shown that, where there is a provider-wide commitment to QI, ratings and outcomes for patients are also consistently positive. It is the CQC's intention as a regulator to encourage this through the development of an intelligence-driven approach to regulation and to ensure that more patients receive safe, effective, high-quality care.1 This will require ongoing collaboration with NCA steering groups and specialty organisations to ensure that audit data cover a wider range of conditions, are collected in real time, are made contemporaneously available and are presented in dashboards with key metrics that can be easily used for both QI and quality assurance.

## Supplementary material

Additional supplementary material may be found in the online version of this article at [www.rcpjournals.org/clinmedicine](http://www.rcpjournals.org/clinmedicine):

S1 – An example of a good response from a trust, adapted with permission from Barking, Havering and Redbridge University Hospitals NHS Trust.

## Acknowledgements

The authors would like to thank the following who have been involved in the audits and outliers workstream at the CQC: Mike Zeiderman, Jimmy Walker, Jon Shelton, Amy Lloyd, Matthew Reynolds, Lisa Annaly, Mike Richards, Kevin Kelleher, Chris Brownett, Stephanie Charnley-Smith, Claire Land, Sophie Stevens, Alexander Armstrong, Sundeep Thusu, Iona Thorne, Edwin Selvaratnam, Sanjay Krishnamoorthy, Jane Ingham, Daniel Keenan, Kieran Mullan and Sidhartha Sinha, and the directors, associate directors, project managers, and the Methodology Advisory Group from HQIP. We would also like to thank HQIP and the Royal College of Paediatrics and Child Health for allowing us to reproduce the funnel plot in Fig 1, and NELA for the use of their example SPC chart in Fig 2. We would also like to thank Kirstin Hannaford, CQC, for editorial support.

* © Royal College of Physicians 2021. All rights reserved.

## References

1. Care Quality Commission. Our strategy for 2016 to 2021. CQC, 2016. [www.cqc.org.uk/sites/default/files/20160523_strategy_16-21_strategy_final_web_01.pdf](http://www.cqc.org.uk/sites/default/files/20160523_strategy_16-21_strategy_final_web_01.pdf) [Accessed 07 July 2020].
2. Healthcare Quality Improvement Partnership. Clinical audit – statutory and mandatory requirements. HQIP, 2017. [www.hqip.org.uk/wp-content/uploads/2018/02/hqip-statutory-and-mandatory-requirements-in-clinical-audit-guidance.pdf](http://www.hqip.org.uk/wp-content/uploads/2018/02/hqip-statutory-and-mandatory-requirements-in-clinical-audit-guidance.pdf) [Accessed 22 March 2021].
3. Foy R, Skrypak M, Alderson S, et al. Revitalising audit and feedback to improve patient care. BMJ 2020;368:m213.
4. Care Quality Commission, Healthcare Quality Improvement Partnership. Prioritisation of metrics from national clinical audits and clinical outcome review programmes. HQIP, 2018. [www.hqip.org.uk/wp-content/uploads/2018/02/maximising-ncapop-data-for-cqc-inspections-prioritisation-of-metrics-report.pdf](http://www.hqip.org.uk/wp-content/uploads/2018/02/maximising-ncapop-data-for-cqc-inspections-prioritisation-of-metrics-report.pdf) [Accessed 07 July 2020].
5. Healthcare Quality Improvement Partnership. National Lung Cancer Audit context page. HQIP, 2020. [https://ncab.hqip.org.uk/national-lung-cancer-audit](https://ncab.hqip.org.uk/national-lung-cancer-audit) [Accessed 07 July 2020].
6. Healthcare Quality Improvement Partnership. National Lung Cancer Audit: King's College Hospital NHS Foundation Trust. HQIP. [https://ncab.hqip.org.uk/reports/card/trusts/RJZ/NLCA/](https://ncab.hqip.org.uk/reports/card/trusts/RJZ/NLCA/) [Accessed 07 July 2020].
7. Bray BD, Campbell J, Cloud GC, et al. Derivation and external validation of a case mix model for the standardized reporting of 30-day stroke mortality rates. Stroke 2014;45:3374–80.
8. Sedgwick P, Marston L. How to read a funnel plot in a meta-analysis. BMJ 2015;351:h4718.
9. Care Quality Commission. Key lines of enquiry, prompts and ratings characteristics for healthcare services. CQC, 2018. [www.cqc.org.uk/sites/default/files/20180628%20Healthcare%20services%20KLOEs%20prompts%20and%20characteristics%20FINAL.pdf](http://www.cqc.org.uk/sites/default/files/20180628%20Healthcare%20services%20KLOEs%20prompts%20and%20characteristics%20FINAL.pdf) [Accessed 07 July 2020].
10. Healthcare Quality Improvement Partnership. Detection and management of outliers for national clinical audits. HQIP, 2018. [www.hqip.org.uk/wp-content/uploads/2019/12/Appendix10b_2018.05.08_HQIP-2018-implementation-guide_Eng-only.pdf](http://www.hqip.org.uk/wp-content/uploads/2019/12/Appendix10b_2018.05.08_HQIP-2018-implementation-guide_Eng-only.pdf) [Accessed 01 July 2021].
11. Royal College of Physicians. Unlocking the potential: supporting doctors to use national clinical audit to drive improvement. RCP, 2018. [www.rcplondon.ac.uk/projects/outputs/unlocking-potential-supporting-doctors-use-national-clinical-audit-drive](http://www.rcplondon.ac.uk/projects/outputs/unlocking-potential-supporting-doctors-use-national-clinical-audit-drive) [Accessed 07 July 2020].
12. Academy of Medical Royal Colleges. Quality improvement: training for better outcomes. AoMRC, 2016. [www.aomrc.org.uk/wp-content/uploads/2016/06/Quality_improvement_key_findings_140316-2.pdf](http://www.aomrc.org.uk/wp-content/uploads/2016/06/Quality_improvement_key_findings_140316-2.pdf) [Accessed 07 July 2020].
13. National Institute for Clinical Excellence. Principles for best practice in clinical audit. Radcliffe Medical Press, 2002. [www.nice.org.uk/media/default/About/what-we-do/Into-practice/principles-for-best-practice-in-clinical-audit.pdf](http://www.nice.org.uk/media/default/About/what-we-do/Into-practice/principles-for-best-practice-in-clinical-audit.pdf) [Accessed 07 July 2020].
14. Royal College of Anaesthetists. Raising the standard: a compendium of audit recipes for continuous quality improvement in anaesthesia. RCoA, 2012. [www.rcoa.ac.uk/sites/default/files/documents/2019-09/CSQ-ARB-2012_0.pdf](http://www.rcoa.ac.uk/sites/default/files/documents/2019-09/CSQ-ARB-2012_0.pdf) [Accessed 07 July 2020].
15. Care Quality Commission. Driving improvement: case studies from eight NHS trusts. CQC, 2017. [www.cqc.org.uk/sites/default/files/20170614_drivingimprovement.pdf](http://www.cqc.org.uk/sites/default/files/20170614_drivingimprovement.pdf) [Accessed 07 July 2020].
16. Healthcare Quality Improvement Partnership. National Audit of Inpatient Falls audit report. HQIP, 2017. [www.hqip.org.uk/resource/national-audit-of-inpatient-falls-audit-report-2017](http://www.hqip.org.uk/resource/national-audit-of-inpatient-falls-audit-report-2017) [Accessed 07 July 2020].
17. Waldo S, McCabe J, Kennedy K, et al. Quality of care in hospitals identified as outliers in publicly reported mortality statistics for percutaneous coronary intervention. Circulation 2017;135:1897–907.
18. Care Quality Commission. The state of care in NHS acute hospitals: 2014 to 2016: findings from the end of CQC's programme of NHS acute comprehensive inspections. CQC, 2016. [www.cqc.org.uk/sites/default/files/20170302b_stateofhospitals_web.pdf](http://www.cqc.org.uk/sites/default/files/20170302b_stateofhospitals_web.pdf) [Accessed 07 July 2020].
19. Clover B. Teaching trust flagged as cancer outlier two years in a row. Health Service Journal 2019. [www.hsj.co.uk/quality-and-performance/teaching-trust-flagged-as-cancer-outlier-two-years-in-a-row/7025986.article](http://www.hsj.co.uk/quality-and-performance/teaching-trust-flagged-as-cancer-outlier-two-years-in-a-row/7025986.article) [Accessed 07 July 2020].
20. Stewart K, Bray B, Buckingham R. Improving the quality of care through national clinical audit. Future Healthc J 2016;3:203–6.
21. Care Quality Commission. Quality improvement in hospital trusts: sharing learning from trusts on a journey of QI. CQC, 2018. [www.cqc.org.uk/publications/evaluation/quality-improvement-hospital-trusts-sharing-learning-trusts-journey-qi](http://www.cqc.org.uk/publications/evaluation/quality-improvement-hospital-trusts-sharing-learning-trusts-journey-qi) [Accessed 07 July 2020].
22. Yates M, Bechman K, Dennison E. Data quality predicts care quality: findings from a national clinical audit. Arthritis Res Ther 2020;22:87.
23. McVey L, Alvarado N, Keen J, et al. Institutional use of national clinical audits by healthcare providers. J Eval Clin Pract 2020;27:143–50.
24. Skull S. Embedding clinical audit into everyday practice: essential methodology for all clinicians. J Paediatr Child Health 2020;56:1533–6.
25. Royal College of Emergency Medicine. RCEM quality improvement guide. RCEM, 2020. [www.rcem.ac.uk/docs/QI%20Resources/RCEM%20Quality%20Improvement%20Guide%20(June%202020).pdf](http://www.rcem.ac.uk/docs/QI%20Resources/RCEM%20Quality%20Improvement%20Guide%20(June%202020).pdf) [Accessed 3 February 2021].