Systems, design and value-for-money in the NHS: mission impossible?
====================================================================

* Terry Young
* Alec Morton
* Sada Soorapanth

## ABSTRACT

NHS organisations are being challenged to transform themselves sustainably in the face of increasing demands, but they have little room for error. To manage trade-offs and risks precisely, they must integrate two very different streams of expertise: systems approaches to service design and implementation, and economic evaluation of the type pioneered by the National Institute for Health and Care Excellence (NICE) for pharmaceuticals and interventions. Neither approach is fully embedded in NHS service transformation, while the combination as an integrated discipline is still some way away. We share three examples to show how design methods may be deployed within a value-for-money framework to plan operationally and in terms of clinical outcomes. They are real cases, briefly described, and the unreferenced ones are anonymised. They have been selected by one of the authors (TY) during his sabbatical research because each illustrates a commonly observed challenge. To meet these challenges, we argue that the health economics cost per quality-adjusted life year (QALY) framework promulgated by NICE provides an under-appreciated lens for thinking about trade-offs, and we highlight some systems tools which have also been under-utilised in this context.

KEYWORDS

* Economic evaluation
* cost effectiveness analysis
* health service design
* value assessment
* health services

## Introduction

NHS Providers has described the NHS’s challenge as ‘mission impossible’.1 Its report recognises that while service improvement must not let up, there must be a new realism about trade-offs, and sometimes silver-plated services may have to suffice in place of gold. This demands an approach to service improvement that can avoid programmes that go nowhere or go wrong, while communicating realistic trade-offs to politicians, the public and staff. We frame this as a question of system design – in its broadest sense – within a proven value-for-money framework. For nearly two decades, the National Institute for Health and Care Excellence (NICE) has set the criteria for clinical decisions in a way that accommodates finance as well as outcomes. It is only starting to turn to the question of service design – the interest in value and design is growing2 – but we pursue this line and use a map upon which planners may position their intent for a service ahead of implementation, and evaluate it afterwards, by focusing on cost and health benefit.3–6

## Setting service design in a value-for-money framework

Any health service development must address two questions: is it cheaper and does it deliver better care? The cost-effective plane (CE-plane) in Fig 1 puts each on a separate axis: health benefit, measured typically in quality-adjusted life years (QALYs), and expenditure in £.7–10 The QALY is a measure of the utility gained by an individual from a drug or intervention. A new hip, for instance, might restore mobility for 15 years, so the extra utility (0.15, say) would be integrated over that period (at a modest discount rate of, say, 2.5%) to determine the health gained (almost 2 QALYs).

Fig 1. The cost-effective plane for assessing services. NICE = National Institute for Health and Care Excellence
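As a minimal sketch of the discounting arithmetic behind the hip illustration, using only the figures quoted above and assuming the gain accrues annually and is discounted from year one:

```python
# Illustrative only: a utility gain of 0.15 a year for 15 years, discounted at 2.5%.
utility_gain, years, discount_rate = 0.15, 15, 0.025

qalys = sum(utility_gain / (1 + discount_rate) ** t for t in range(1, years + 1))
print(f"{qalys:.2f} QALYs gained")  # ~1.86, ie 'almost 2 QALYs'
```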
Other countries use affordability thresholds but NICE, unusually, sets a cost-effectiveness threshold for NHS care (with suitable caveats) at £20,000–30,000/QALY.11,12 Therefore, if a hip replacement costs £10,000, its cost-effectiveness of ∼£5,000/QALY would place it well under the NICE threshold and so it would be deemed cost-effective. The average cost of a QALY in the NHS has been estimated at £12,90013 and the hip would be good value even on that basis. Using such a guide, an NHS hospital spending £200 million a year should deliver 15,500 QALYs/year. We think of each service as a QALY factory, delivering health utility to each patient on discharge.

By placing the existing service (orange dot) at the origin, one can design into the shaded area. The most desirable designs deliver greater efficiency and better outcomes (blue dot), but services that cost more and deliver more (green dot) may be cost-effective if the incremental cost-effectiveness ratio (ICER) is <£20,000/QALY. The principle works in reverse to make savings (purple dot). However, on the principle that people are more demanding when saving than when investing (both in general and for health14,15), the threshold gradient steepens and planners may seek to save £40,000–60,000 for each surrendered QALY.

## Vignettes of service changes

To explore how these ideas may be used, we appeal to three examples.

### Example 1: ward closure to save money

Allen cites the closure of a medical ward – a relatively easy savings measure – which led a frustrated staff member to observe: ‘Ask anybody on the ground what the impact would be of shutting 38 medical beds and they would have told you it was asking for trouble.’16 The justification for the decision? ‘[It] was assumed that there would be facilities in the community’.

Fig 2 shows the original plan (in purple) and the locus along which the final outcome lay (red dots). Although the aim was to save money, there were unplanned downsides. First, as beds became scarce, spreading care across inappropriate wards disrupted staff and patients. Second, delays in getting patients to a ward, or having them as outliers, led to poorer outcomes and reduced the health benefit the service delivered. In real terms, there was a risk of overall loss.

Fig 2. Cost-effectiveness plane for a decision to close a ward. NICE = National Institute for Health and Care Excellence; QALY = quality-adjusted life year

With hindsight, a different plan might have prevailed, and the question of how to embed effective foresight into decision making is one we must return to.

### Example 2: ergonomic design of wards

A design study for a new ward in a hospital in the north of England aimed to shorten regular nursing journeys, using paper drawings, pencils and rulers to estimate walking distances. The ward was completed in 2014/15 and was thoroughly evaluated in 2016 (with an update in spring 2017) by comparing measurements from the original ward and another (more modern) ward with the performance of the new one. Pedometer results suggested savings of up to half an hour per shift per nurse, while other benefits included lower maintenance, fewer medication errors, slips and falls, and greater patient satisfaction, together with reduced absenteeism and higher retention of staff.

The design delivered what it was supposed to, and the real position, measured afterwards, revealed unexpected benefits for patients and staff. In this case, the design tools were basic but the evaluation was detailed. The plans and realisation are captured in Fig 3.

Fig 3. Ward redesigned to reduce staff journeys: planned (purple) and measured (green). NICE = National Institute for Health and Care Excellence
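To connect a finding such as the pedometer result above to the cost axis of the CE-plane, a back-of-envelope sketch might run as follows; the ward staffing, shift pattern and hourly cost are purely hypothetical assumptions, not data from this study:

```python
# Purely illustrative: converting a time saving into released nursing hours per year.
# None of these parameters comes from the evaluation described above.
minutes_saved_per_shift = 30     # upper end of the reported pedometer savings
nurse_shifts_per_day = 12        # assumed ward staffing across all shifts
hourly_cost = 20.0               # assumed fully loaded cost of nursing time, £/hour

hours_per_year = minutes_saved_per_shift / 60 * nurse_shifts_per_day * 365
print(f"~{hours_per_year:,.0f} nursing hours/year, "
      f"roughly £{hours_per_year * hourly_cost:,.0f} of released time")
```

Whether such released time is banked as a cash saving or reinvested in care would determine where the new ward sits on the cost axis.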
### Example 3: planning options

A Clinical Commissioning Group (CCG) wishing to increase the level of community-based mental health provision commissioned a System Dynamics (SD) study to support its planning. SD uses computers to simulate how patients flow around a service under different configurations and with different mixes of staff. Through a series of scenarios the team identified, for example, a service configuration that increased the proportion of patients treated at home from 38% to 45%, while reducing those treated in hospital from 60% to 45% (a new primary care service handling the remainder). With a final target of providing 50% of care at home (20% in primary care and 30% as acute care), the scenario described a half-way house and generated confidence to plan the next cycle of service improvement.

In this case, the CE-plane can be used to sense-test the planning team’s proposals. Fig 4 shows the space that a new service might viably occupy in terms of operations and outcomes, and illustrates another way in which a map such as this may aid early decision making. Moreover, once a service has been commissioned, the same map may be used to evaluate its performance in operation.

Fig 4. Use of a model in planning to prototype service options and ensure that there is a viable design space for a new type of service. NICE = National Institute for Health and Care Excellence

## What design methods can be used in conjunction with the CE-plane?

The Royal Academy of Engineering’s recent report, *Engineering better care*,17 promoted engineering principles in healthcare. It describes two broad approaches – the double diamond and the spiral – within which specific methods may be deployed to better understand the problem and to generate a solution. These methods might include the following.

* Drawing pictures: healthcare managers and clinicians are familiar with pathway mapping and may well have encountered rich pictures,18–20 flowcharts and sketches of physical environments, or any other simple diagram.21–23 Diagrams aid systematic analysis and may highlight dependencies, clashes or obstructions, and help to sequence events.
* Decision trees:24–26 as simple pencil-and-paper studies, or backed by probabilities and run in spreadsheets, these tools are useful for sequencing decisions and help to eliminate loose ends (perhaps patients who do not fit the main categories and who slip out of sight).
* Simulations and computer models:27–32 computer games are perhaps the best known way to create synthetic worlds in which people can interact, and many have an element of design, as players build a theme park or even a hospital. Critically, models support ‘what if’ questions: what if I put more staff on the ward from 2 pm? What if the one-stop clinic also has access to ultrasound imaging? (A minimal sketch of such a model follows this list.) A challenge is the slow adoption of such methods in healthcare.33,34
* Prototypes:35 the literature is vast and varied, and Adrian Newey is readable on designing fast cars.36 Engineering relies heavily on prototypes, from computer-generated models to scale models to life-size mock-ups. Prototypes allow two critical classes of questions to be addressed: will a service do what it is intended to do, and what might happen that had not been anticipated?
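As a minimal sketch of the kind of ‘what if’ model referred to above, the toy stock-and-flow model below compares the caseload mix produced by two ways of routing referrals between home-based, primary and acute care. The routing shares echo Example 3, but the referral volume and discharge rates are entirely hypothetical and are not taken from the CCG study.

```python
# Toy stock-and-flow ("system dynamics"-style) model: illustrative only.
def simulate(weeks, referrals_per_week, routing, weekly_discharge):
    """Route new referrals between settings each week, discharge a setting-specific
    fraction of each caseload, and return the resulting caseload mix (%)."""
    stocks = {setting: 0.0 for setting in routing}
    for _ in range(weeks):
        for setting, share in routing.items():
            stocks[setting] += referrals_per_week * share                   # inflow
            stocks[setting] -= stocks[setting] * weekly_discharge[setting]  # outflow
    total = sum(stocks.values())
    return {s: round(100 * v / total, 1) for s, v in stocks.items()}

# Two routing policies (shares echo Example 3); discharge rates are assumptions.
current = {"home": 0.38, "primary": 0.02, "acute": 0.60}
option  = {"home": 0.45, "primary": 0.10, "acute": 0.45}
weekly_discharge = {"home": 0.10, "primary": 0.25, "acute": 0.20}

print("current caseload mix:", simulate(52, 100, current, weekly_discharge))
print("option caseload mix: ", simulate(52, 100, option, weekly_discharge))
```

Even a toy model of this kind shows that the caseload carried by each setting is not the same as the referral split, which is precisely the kind of insight a planning team needs before committing to a configuration.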
## Discussion

Nobody deliberately designs a service to be poorer or more expensive, but many plans fail. Bohmer and Imison assess the NHS’s attempts to reduce care costs through workforce redesign and report: *England’s experience is a cautionary tale… Well-intentioned reforms have often failed to generate the expected results because workforce redesigns were not accompanied by work redesigns. New roles became supplementary instead of substitutive, and gains on the cost side were offset by increases in utilization rates or transaction costs*.37

Although not couched in terms of systems or value, they put their finger on the NHS’s capacity to redesign and highlight the problem of visible savings set against hidden losses. We note that offset losses may take the form of poorer health outcomes, and that they can be quantified if we follow NICE’s lead. We have seen how closing a ward affects the whole hospital, and the impact may be felt across a wider system still, so planners need reliable ways to predict the gains and losses to the wider system when managing change locally. Table 1 makes some helpful connections between these problems and suitable methods to design a solution.

Table 1. Suitable design methods for the examples given

Implicit in the decision to close a ward (Example 1) is time pressure: therefore, a suitable method must be easy to use, yet test the underlying assumptions and identify hidden and unexpected side effects. An afternoon of drawing process maps or decision trees would probably answer the key questions: where do this ward’s patients come from and where would they go once it closes? The closure might then be accompanied by a robust patient-routing plan.

Ward flows (Example 2) are interesting, especially because there are now sophisticated ways to model patient and staff flows. What is noteworthy here is that even simple tools, applied within a compelling value framework (here, saved nursing activity) and informed by good before-and-after data, can make a difference and be shown to have done so. In this case, the ward was essentially a one-for-one replacement of another ward and so the risk of knock-on effects was reduced.

Planning options (Example 3) lends itself ideally to computer modelling, since many services have good data and predictions of population growth are available. Depending upon the scale and risks involved, one might choose to prototype some elements separately. For instance, people from a patient participation group might, with due ethical clearance, join a focus group, and the physical access of parts of the system could be tested using a mock-up. The key thing is that the CE-plane can be used to establish the context of a proposed change and then one of a series of methods can be used to see ahead of time whether the aim is possible.
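To make that concrete for Example 1, a decision tree of the sort suggested above can be reduced to a few lines of arithmetic. Every probability, cost and QALY effect below is invented for illustration; the point is the style of reasoning, not the numbers.

```python
# Toy decision tree for a ward-closure decision: all figures are invented.
options = {
    "close ward": [
        # (probability, expected annual saving in £, expected QALY change)
        (0.4, 1_000_000,   0.0),  # community capacity really does absorb the demand
        (0.6,   200_000, -25.0),  # outliers and delays erode both savings and outcomes
    ],
    "keep ward": [(1.0, 0, 0.0)],
}

DISINVESTMENT_GUIDE = 50_000  # £ saved per surrendered QALY (mid-point of £40,000-60,000)

for name, branches in options.items():
    saving = sum(p * s for p, s, _ in branches)
    qalys = sum(p * q for p, _, q in branches)
    note = ""
    if qalys < 0:
        note = f"; £{saving / -qalys:,.0f} saved per surrendered QALY (guide: £{DISINVESTMENT_GUIDE:,})"
    print(f"{name}: expected saving £{saving:,.0f}, expected QALY change {qalys:+.1f}{note}")
```

On these invented figures, the expected saving falls well short of the £40,000–60,000 per surrendered QALY that planners might demand, which is exactly the kind of warning that was missing in Example 1.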
## Conclusions

The challenging months and years to come will require the NHS to take a laser focus on *trade-offs* and on *risk*. We think the cost-effective plane – an indigenous NHS invention – provides an excellent way of framing trade-offs, and we have indicated how it might be linked to systems and design methods to address the risks associated with system change. We have used three examples to demonstrate how the value framework being established by NICE can be applied to service design in conjunction with systems thinking, and have reflected on the design methods that could be used.

Whether the mission which the NHS faces is a mission impossible – as NHS Providers claims – is open to discussion; what is not open to debate is that NHS staff have no choice about whether to accept it or not.

## Acknowledgements

The authors wish to thank Rosemary Jenssen and Peter Lacey for discussions and for their contributions to this paper.

* © Royal College of Physicians 2018. All rights reserved.

## References

1. NHS Providers. Mission impossible? The task for NHS providers in 2017/18. [http://nhsproviders.org/media/2727/mission-impossible-report.pdf](http://nhsproviders.org/media/2727/mission-impossible-report.pdf) [Accessed 2 March 2018].
2. Watson J, Salisbury C, Jani A, et al. Better value primary care is needed now more than ever. BMJ 2017;359.
3. Steinbrook R. Saying No isn’t NICE – the travails of Britain’s National Institute for Health and Clinical Excellence. N Engl J Med 2008;359:1977–81.
4. National Institute for Health and Clinical Excellence. The guidelines manual. NICE, 2012. [www.nice.org.uk/process/pmg6/chapter/introduction](http://www.nice.org.uk/process/pmg6/chapter/introduction) [Accessed 6 March 2018].
5. Stevens A, Chalkidou K, Littlejohns P. The NHS: assessing new technologies, NICE and value for money. Clin Med 2011;11:247–50.
6. Cerri KH, Knapp M, Fernandez JL. Decision making by NICE: examining the influences of evidence, process and context. Health Econ Policy Law 2014;9:119–41.
7. Drummond M, McGuire A (eds). Economic evaluation in health care: merging theory with practice. Oxford: OUP, 2001.
8. Drummond MF, Sculpher MJ, Claxton K, Stoddart GL, Torrance GW. Methods for the economic evaluation of health care programmes. Oxford: OUP, 2015.
9. Gold M, Siegel J, Russell L, Weinstein M. Cost-effectiveness in health and medicine. Oxford and New York: OUP, 1996.
10. Briggs A, Claxton K, Sculpher M. Decision modelling for health economic evaluation. Oxford: OUP, 2006.
11. Santos AS, Guerra-Junior AA, Godman B, Morton A, Ruas CM. Cost-effectiveness thresholds: methods for setting and examples from around the world. Expert Rev Pharmacoecon Outcomes Res 2018;18:277–88.
12. Sorensen C, Drummond M, Kanavos P. Ensuring value for money in health care. Denmark: WHO Regional Office for Europe on behalf of the European Observatory on Health Systems and Policies, 2008.
13. Claxton K, Martin S, Soares M, et al. Methods for the estimation of the National Institute for Health and Care Excellence cost-effectiveness threshold. Health Technol Assess 2015;19:1–503.
14. Kahneman D, Tversky A. Prospect theory: an analysis of decision under risk. Econometrica 1979;47:263–92.
15. O’Brien BJ, Gertsen K, Willan AR, Faulkner LA. Is there a kink in consumers’ threshold value for cost-effectiveness in health care? Health Economics 2002;11:175–80.
16. Allen D. The invisible work of nurses: hospitals, organisation and healthcare, 1st edn. Abingdon: Routledge, 2014.
17. Clarkson J, Bogle D, Dean J. Engineering better care: a systems approach to health and care design and continuous improvement. London: Royal Academy of Engineering, 2017.
18. Checkland P. Soft systems methodology. Human Systems Management 1989;8:273–89.
19. Checkland P, Poulter J. Learning for action: a short definitive account of soft systems methodology and its use for practitioners, teachers and students. Chichester: Wiley, 2006.
20. Checkland P, Scholes J. Soft systems methodology in action. New York: Wiley, 1991.
21. Smartdraw flowchart. [www.smartdraw.com/flowchart/](http://www.smartdraw.com/flowchart/) [Accessed 7 March 2018].
22. Jun TG, Hinrichs S, Jafri T, Clarkson PJ. Thinking with simple diagrams in healthcare systems design. Proceedings of DESIGN 2010, the 11th International Design Conference 2010:1787–94.
23. Simsekler MCE, Ward JR, Clarkson PJ. Evaluation of system mapping approaches in identifying patient safety risks. Int J Qual Health Care 2018;30:227–33.
24. Howard RA, Abbas AE. Foundations of decision analysis. Pearson, 2016.
25. Raiffa H. Decision analysis: introductory lectures on choice under uncertainty. New York: Random House, 1968.
26. Goodwin P, Wright G. Decision analysis for management judgement, 5th edn. Chichester: Wiley, 2014.
27. Pidd M. Tools for thinking – modelling in management science. Chichester: John Wiley and Sons, 1996.
28. Brailsford S, Klein JH. The value of modelling and simulation in healthcare. Brunel University London on behalf of the Cumberland Initiative, 2015. [http://cumberland-initiative.org/wp-content/uploads/2015/07/the_value_report_web.pdf](http://cumberland-initiative.org/wp-content/uploads/2015/07/the_value_report_web.pdf) [Accessed 20 November 2015].
29. Carson JS, Nelson NL, Nicol DM. Discrete event simulation. Englewood Cliffs, NJ: Prentice Hall, 2000.
30. Pidd M. Computer simulation in management science, 5th edn. Chichester: Wiley, 2004.
31. Robinson S. Simulation: the practice of model development and use. Chichester: Wiley, 2004.
32. Forrester JW. System dynamics, systems thinking, and soft OR. System Dynamics Review 1994;10:245–56.
33. Jahangirian M, Naseer A, Stergioulas L, et al. Simulation in healthcare: lessons from other sectors. Operational Research International Journal 2012;12:45–55.
34. Fackler J, Hankin J, Young T. Why healthcare professionals are slow to adopt modeling and simulation. Proceedings of the 2012 Winter Simulation Conference 2012:97.
35. Bullinger HJ, Warschat J, Fischer D. Rapid product development – an overview. Computers in Industry 2000;42:99–108.
36. Newey A. How to build a car. London: HarperCollins, 2017.
37. Bohmer RMJ, Imison C. Lessons from England’s health care workforce redesign: no quick fixes. Health Aff 2013;32:2025–31.