Artificial intelligence and the NHS: a qualitative exploration of the factors influencing adoption

Kirsty Morrison
DOI: https://doi.org/10.7861/fhj.2020-0258
Future Healthc J November 2021;8(3):e648–e654
University of Birmingham College of Medical and Dental Sciences, Birmingham, UK
Roles: medical student
For correspondence: kirsty_morrison@aol.co.uk

Abstract

Background AI has the potential to improve healthcare. However, there is limited research investigating the factors which influence the adoption of AI within a healthcare system.

Research aims I aimed to use innovation theory to understand the barriers and facilitators that influence AI adoption in the NHS, to explore solutions to overcome these barriers, and to examine these factors within radiology, pathology and general practice in particular.

Methodology Twelve semi-structured, one-to-one interviews were conducted with key informants. Interview data were analysed using thematic analysis.

Findings A range of barriers and facilitators to the adoption of AI within the NHS were identified, including IT infrastructure and language clarity. Several solutions to overcome the barriers were proposed by participants, including education strategies and innovation champions.

Conclusion Future research should explore the importance of IT infrastructure in supporting AI adoption, examine the terminology around AI and explore specialty-specific barriers to AI adoption in greater depth.

KEYWORDS:
  • artificial intelligence
  • machine learning
  • adoption
  • NHS

Introduction

Across clinicians, data scientists, managers and governments, there is an aspiration that AI will transform healthcare delivery over the coming years. The enthusiasm for such a transformation is growing rapidly; a Google search for ‘AI in health and social care’ produces 360 million search results, an increase from 109 million just 12 months ago.1 However, while technical breakthroughs in AI and healthcare are shared widely, there is often less interest in solutions to promote the adoption of such technologies across complex, and perhaps siloed, health systems.2,3 This is a critical gap within the literature. If doctors are unwilling to work with AI, if concerns about data bias and legal liability are unaddressed, or if the data needed to validate such technologies cannot be shared with industry, then these breakthroughs will remain hypothetical.

AI can be defined as ‘the science of making machines do things that would require intelligence if done by people’.4 Despite some controversy, there is a growing consensus that the term ‘AI’ encompasses the subsets of machine learning and deep learning algorithms.5,6 These are the two computer science disciplines most commonly applied in healthcare.6,7 Therefore, the broad term of AI will be used throughout this paper.

There is a consensus that the specialties that stand to realise the potential of AI soonest are those which are data-driven, such as radiology and pathology.3,8,9 For instance, machine learning algorithms can be as accurate as radiologists in interpreting four important chest X-ray findings.10 Meanwhile, other radiology applications include screening for breast and lung cancers, as AI could support the interpretation of mammography and computed tomography of the chest.9,11,12 Within pathology, AI could be applied to improve the efficiency of metastases detection in lymph nodes and the accuracy of prostate cancer grading.13,14 These applications are especially valuable because radiology and pathology are currently facing significant pressures across the NHS.15,16 However, while data-driven specialties may harness the benefits of AI earlier, its impact is sure to be felt more widely. Within primary care, the Royal College of General Practitioners believes that AI will have the most valuable near-term impact when used for administrative tasks, releasing time for healthcare professionals.17 Future AI applications in general practice include identifying the relevant guidelines for each consultation, and potentially even recognising, and challenging, cognitive biases.18,19
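Many of the imaging applications cited above share the same basic shape: a convolutional network mapping an image to one probability per finding. As a purely illustrative sketch of that shape, the Python snippet below builds a minimal multi-label classifier. It assumes PyTorch, and the four finding labels, the deliberately tiny network and the random stand-in data are hypothetical placeholders, not the models or datasets used in the cited studies.

```python
# Illustrative sketch only: a minimal multi-label chest X-ray classifier of the
# general kind described above. Assumes PyTorch; the four labels, the tiny
# network and the random data are hypothetical placeholders.
import torch
import torch.nn as nn

FINDINGS = ["cardiomegaly", "effusion", "opacity", "pneumothorax"]  # hypothetical labels

class TinyCXRNet(nn.Module):
    """A deliberately small CNN: conv blocks -> pooled features -> per-finding logits."""
    def __init__(self, n_labels: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, n_labels)

    def forward(self, x):
        f = self.features(x).flatten(1)
        return self.head(f)  # raw logits; sigmoid applied in the loss / at inference

model = TinyCXRNet(len(FINDINGS))
loss_fn = nn.BCEWithLogitsLoss()   # multi-label: each finding is an independent yes/no
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# One training step on random stand-in data (batch of 8 single-channel 224x224 images).
images = torch.randn(8, 1, 224, 224)
labels = torch.randint(0, 2, (8, len(FINDINGS))).float()
opt.zero_grad()
loss = loss_fn(model(images), labels)
loss.backward()
opt.step()

# At inference, per-finding probabilities are thresholded to report findings.
probs = torch.sigmoid(model(images[:1]))
print({f: round(p.item(), 2) for f, p in zip(FINDINGS, probs[0])})
```

The essential design choice is the multi-label head with a per-finding sigmoid, which lets a single radiograph carry several findings at once, mirroring how reports are written.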

There is a paucity of qualitative research evaluating the barriers to AI adoption within a healthcare system. This is significant because qualitative research is ideally suited to examining the attitudes, behaviours and decisions that determine whether an innovation is ultimately adopted. Of the two qualitative studies that explored barriers to adoption, one was conducted with French stakeholders, whose perspectives may not translate to the NHS, while the other focused solely on autonomous robots, which are unlikely to be the first AI product adopted at scale.20,21 Moreover, there is limited literature that explores the facilitators of AI adoption. Therefore, this paper moves beyond the current evidence base by examining both barriers and facilitators to AI adoption within the context of NHS clinical practice.

Diffusion of innovations (DOI) theory has been used as a theoretical framework to interpret the findings of this qualitative study, because it has been applied successfully to understand the adoption of other medical technologies.22–27 Rogers defined diffusion as ‘the process by which an innovation is communicated through certain channels over time, among the members of a social system.’22 DOI theory describes four elements which influence the diffusion of a given innovation: characteristics of the innovation (Table 1), communication channels, time and the social system.

Table 1. Further information on the diffusion of innovations framework.22

The primary objective of this research was to apply innovation theory to understand the barriers and facilitators to AI adoption in the NHS, as perceived by key informants. Where barriers were identified, potential solutions to overcome these were explored. The secondary objective was to examine these factors within the context of radiology, pathology and general practice.

Methods

This qualitative interview-based study examines the opinions of thought leaders in the UK healthcare AI landscape.

Ethics and permissions

The study received ethical approval from the University of Birmingham Internal Research Ethics Committee. Participants provided written informed consent to participate.

Sampling and recruitment

Purposive sampling was used to select key informants who were thought leaders from diverse backgrounds. Methods to identify participants included contacting relevant royal colleges, regulatory bodies and research organisations; hand searching lists of contributors to major reports in the field; and searching on LinkedIn. Four groups of key informants were sought (Table 2).

Table 2. Key informants sought and rationale.

Twelve participants were recruited. Initially, organisations were emailed using publicly accessible email addresses or existing contacts. This led to the recruitment of five participants. The remaining seven participants were identified by the researcher and contacted through email or LinkedIn. Thirteen prospective participants and four organisations who were contacted did not respond or were unable to participate.

Recruitment was planned to continue until data saturation was reached. However, COVID-19 placed considerable stress on the healthcare sector, so recruitment ended after 12 interviews. The researcher was satisfied that this number of interviews would generate useful data, because the aim of qualitative research is not to generalise the findings, and key informant sampling is recognised to generate rich data even from a small number of interviews.28,29

Data collection

Twelve interviews were conducted in March and April 2020. The average interview length was 37 minutes (range 23–52). The researcher conducted all interviews one-to-one with the participants. One interview occurred face-to-face, with the rest by Zoom. The researcher used a reflexive approach to consider how her personal characteristics shaped the research (supplementary material S1).

Interviews were semi-structured, guided by a topic guide (supplementary material S2) which was pilot tested. Interviews were audio recorded and transcribed verbatim by the researcher. Field notes were made following each interview.

To improve validity, data collection and analysis were iterative processes. The topic guide was modified following the initial coding of interview transcripts to allow unforeseen issues raised by participants to be explored further in subsequent interviews. A summary of the initial coding of each transcript was returned to participants for member checking. Saturation of codes was reached within 11 interviews.

Data analysis

Transcribed interviews were uploaded to NVivo 12 software, and Braun and Clarke's six-step guide to thematic analysis was then followed (Box 1).30

Box 1. Description of data analysis.

Findings

Twelve interviews were conducted with participants who worked in the UK healthcare AI ecosystem. Participants included NHS doctors, managers, researchers and personnel at regulatory bodies (supplementary material S3).

Three themes and nine sub-themes were identified (Table 3).

Table 3. Themes and sub-themes.

Theme 1: system

Socio-political context of the NHS in 2020

A lack of funding was cited by six participants (across all four key informant groups) as a barrier to AI adoption in the NHS. A typical sentiment was that ‘the NHS doesn't have any money.’ (P3).

Additionally, it was suggested that NHS organisations often cannot look past the initial start-up costs to future benefits; P7: ‘I think trust finances often focus on such a short-term basis at the moment that, if [AI] improves patient outcomes and efficiencies over 5 years, great. But what's it going to do to the trust finances for the next 12 months?’

The quality of IT infrastructure across the NHS was mentioned by nine participants as a barrier to AI adoption; P8: ‘If you can't invest in the basic digital infrastructure, then AI is out of your reach.’

However, two participants (P5 and P9) disagreed, and expressed the view that the quality of IT infrastructure was not a significant barrier to AI adoption.

Many participants also discussed how it would be important to fund the change, and not the technology in isolation; P3: ‘In terms of practical facilitators ... IT and tech project managers and dedicated funding to support them as well.’ and P12: ‘Champions can be really helpful to allay fears.’

Regulatory landscape

Eight participants highlighted the current regulatory landscape as a barrier; P5: ‘[Regulation] is an absolute mess, right? It's an absolute mess. ... If you've got a piece of kit, which is AI, where does that sit? The MHRA? The GMC? ... CQC?’

Those who felt regulation acted as a barrier highlighted that regulation was confusing for developers to navigate and that the roles and remits of regulators were unclear.

Fit within the puzzle

Nearly all participants discussed how some specialties would be more amenable to AI than others; P8: ‘I think obviously the low hanging fruit would be ... doctors with patterns. So, image recognition ... dermatology, radiology, pathology and ophthalmology.’

Most participants felt that there would be specialty-specific barriers to adoption, and some gave examples of these. However, four participants did not feel qualified to comment on specialty-specific barriers to adoption. Among those who did discuss the issue, many emphasised that adoption in primary care faces different challenges to secondary care; P9: ‘The general practitioners (GPs) and the tech suppliers aren't able to work together because there are just so many GP practices.’ P3: ‘GPs deal a lot with mental health, chronic illnesses, disabilities and things which are very non-digitisable healthcare problems.’

It was suggested that NHSX, the body responsible for the digital transformation of the NHS, should identify where the NHS could benefit most from AI and share those needs with developers so that they can create useful AI products; P9: ‘I think there's a role for NHSX ... and people like that to really identify. Well, what can this technology do? ... And then where does that best plug into clinical pathways to release value or improve care? And then to signal demand to the tech developers ... and then they can develop against it.’

Theme 2: people

What actually is AI?

The need for clarity of language around AI was raised by many participants; P10: ‘We all need to use the same language ... you know you can be 10 minutes into a meeting and no one's got a clue what anyone's talking about because there's no baseline terms.’

Several participants expressed dislike for the term ‘AI’, with some going as far as to claim ‘I hate the word AI.’ (P2).

Additionally, hype regarding AI was emphasised by four participants as a potential barrier; P2: ‘People talk about AI like ... it's a utopian dream that's going to solve all our needs financially and clinically and everything else.’

Finally, misunderstanding and fear around AI were highlighted by a few participants. It was suggested that fears around AI often stemmed from misunderstandings around ‘what AI might be, rather than what it actually is.’ (P1).

People-powered transformation

Seven participants outlined the need for the education of healthcare professionals, particularly ‘explaining what the benefits will be and how it will help them in their work.’ (P12).

Three participants proposed that real-world examples of AI being used in the NHS would support better understanding and dispel some fears; P1: ‘I think once we have one or two actual examples of AI being deployed into the NHS and people can see ... the benefits ... then people will think of it just as any other type of software that we would now struggle to do without.’

The need to educate, and communicate with, the public was raised by a few participants; P10: ‘Then someone needs to work on the comms plan, you know, with the public. Can you imagine ... the public with some of this stuff?’

It's not going to be a robo-doc

Five participants highlighted how they did not believe that AI would replace doctors, with one clarifying ‘that is nobody's intention.’ (P1). Instead, several participants suggested that AI had the potential to improve the working life of clinicians; P12: ‘So, [doctors] might be thinking that they're going to lose their job and this AI system is going to do everything. But the reality is, it's just going to do a discrete set of tasks. And that might free them up to do other things ... have more patient-facing time, do more research.’

However, three participants, all classified as ‘individuals with influence’, suggested that fewer healthcare professionals may be required in the future.

Several participants noted a lack of resistance among healthcare professionals towards AI. Most (including a doctor and royal college representative) felt that patient-facing staff were keen to try using AI; P6: ‘We're not seeing it as replacing us, we're seeing it as being very much a tool to assist us.’

However, one participant believed that some staff would not be keen to engage; P5: ‘There's a bottom 25% [of doctors] where it's going to be really difficult to get them to adopt or engage in it.’

Theme 3: technology

Data-driven nature

Several participants spoke about the ‘fragmented data pool’ (P4) within the NHS as a barrier to the development of AI products; P2: ‘I think people make an assumption about data usage in the NHS that there's just a single point of contact and you can get access to every patient record – doesn't work like that. We don't have that.’

Furthermore, five participants felt that information governance, especially at a local level, hampered data sharing; P10: ‘How do you deal with all of the things the public would be really worried about? Which is all the information governance.’

One interviewee proposed a solution of information governance templates, perhaps to be developed by NHSX; P9: ‘So, there's something for NHSX to do ... provide national data protection impact assessment templates for trusts to adapt ... that would speed things up massively.’

Challenges ahead

Many participants discussed the concept of ‘black box’ AI as raising new challenges, including the right to an explanation; P10: ‘To what extent should people have a right under the [General Data Protection Regulation] legislation to an answer about why their treatment is taking a particular course? If a computer has decided that.’

However, there was disagreement. Several participants believed that black box AI was theoretical, and that AI would never be truly unexplainable. Others thought that there was a degree of over-questioning; P2: ‘People talking about ... the black box algorithm, how do we know what decision is made? And my answer is ... how do you know the clinician ... who's sat in front of you is making the right decision?’

Several participants raised concerns regarding legal liability. One participant highlighted how there was no case law regarding AI yet.

Another issue highlighted by several participants was the transferability of an AI tool from one setting to another; P9: ‘We had a radiology machine learning [AI] provider in north London. Again, really high sensitivity and specificity. Applied it in south London. It's terrible. Different ethnic group, different scanner, different radiology positioning.’
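P9's anecdote describes what machine learning practitioners call distribution shift: a model tuned to one site's scanners and population degrading elsewhere. The usual safeguard is external validation, re-measuring sensitivity and specificity on data from a site the model was not developed on. The sketch below shows the mechanics only; the two ‘sites’, the agreement rates and the deployment gate are invented for illustration and are not the north/south London results described.

```python
# Illustrative sketch only: the kind of external-validation check P9 describes.
# The two "sites" and all numbers are hypothetical, invented to show the
# mechanics; they are not the north/south London results from the interview.
import numpy as np

def sensitivity_specificity(y_true, y_pred):
    """Compute sensitivity (true-positive rate) and specificity (true-negative rate)."""
    tp = np.sum((y_true == 1) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    return tp / (tp + fn), tn / (tn + fp)

rng = np.random.default_rng(0)

# Site A: predictions mostly agree with ground truth (the development site).
y_a = rng.integers(0, 2, 1000)
pred_a = np.where(rng.random(1000) < 0.95, y_a, 1 - y_a)   # ~95% agreement

# Site B: different scanners/population simulated as much noisier predictions.
y_b = rng.integers(0, 2, 1000)
pred_b = np.where(rng.random(1000) < 0.70, y_b, 1 - y_b)   # ~70% agreement

for site, (y, p) in {"site A (development)": (y_a, pred_a),
                     "site B (external)": (y_b, pred_b)}.items():
    sens, spec = sensitivity_specificity(y, p)
    print(f"{site}: sensitivity={sens:.2f}, specificity={spec:.2f}")
# A deployment gate might require both metrics to hold up at the *external* site,
# not just at the site where the model was developed.
```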

Finally, the risk of biased algorithms was raised by a few participants; P7: ‘How do we know an algorithm isn't biased?’

Evaluation

Most participants spoke about the regulatory and evaluation challenges introduced by self-learning AI; P11: ‘If you have a machine learning [AI] device that you want to learn while it's going along, each learning procedure creates effectively a new device. So, you would in fact have to then go around the regulatory circle again just for that one learning event.’
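P11's point can be made concrete by fingerprinting a model's parameters: any learning event changes the weights, so under a static-device regime every update is, in effect, a new artefact. The sketch below is illustrative only; it assumes PyTorch, and hash-based versioning is offered as an analogy, not a description of any regulator's actual process.

```python
# Illustrative sketch only: fingerprinting model weights to show that every
# learning event yields a distinct artefact. Hash-based versioning is an
# analogy for P11's point, not any regulator's actual process.
import hashlib
import torch
import torch.nn as nn

def model_fingerprint(model: nn.Module) -> str:
    """Hash all parameters in a fixed order; any weight change alters the digest."""
    h = hashlib.sha256()
    for name, param in sorted(model.state_dict().items()):
        h.update(name.encode())
        h.update(param.detach().cpu().numpy().tobytes())
    return h.hexdigest()[:12]

model = nn.Linear(10, 1)
print("version before update:", model_fingerprint(model))

# One 'learning event': a single gradient step on one data point.
opt = torch.optim.SGD(model.parameters(), lr=0.1)
x, y = torch.randn(1, 10), torch.randn(1, 1)
loss = nn.functional.mse_loss(model(x), y)
loss.backward()
opt.step()

print("version after update: ", model_fingerprint(model))
# The digests differ: under a static-device regime, the updated model is,
# in effect, a new device that would need re-assessment.
```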

Additionally, three participants discussed the lack of an agreed gold standard for how accurate an AI tool needs to be before it can be used in clinical practice; P5: ‘There's no real understanding as to where the benchmark is for being able to use a piece of AI software. So, does it have to be better than the percentage of, say, misdiagnoses the consultants make? As good as? Or what? Nobody really knows.’
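P5's question (‘better than’, ‘as good as’, ‘or what?’) is, underneath, a choice of statistical gate. The sketch below frames one possible version of that choice as a comparison of misdiagnosis rates with a non-inferiority margin; all of the figures, the 1% margin and the crude normal-approximation interval are invented for illustration and imply no actual benchmark.

```python
# Illustrative sketch only: one way the missing benchmark could be framed, as a
# comparison of misdiagnosis rates. All numbers are invented for illustration.
from math import sqrt

def misdiagnosis_rate(errors: int, cases: int) -> float:
    return errors / cases

# Hypothetical audit figures.
consultant_rate = misdiagnosis_rate(30, 1000)   # 3.0% benchmark
ai_rate = misdiagnosis_rate(22, 1000)           # 2.2% for the AI tool

# A crude 95% interval on the AI rate (normal approximation) to ask whether
# the apparent improvement could just be sampling noise.
n = 1000
half_width = 1.96 * sqrt(ai_rate * (1 - ai_rate) / n)
lo, hi = ai_rate - half_width, ai_rate + half_width

print(f"consultants: {consultant_rate:.1%}, AI: {ai_rate:.1%} "
      f"(95% CI {lo:.1%}-{hi:.1%})")
# 'As good as' and 'better than' are different gates: the first asks whether
# the upper bound stays within a tolerated margin of the benchmark; the
# second asks whether it falls strictly below the benchmark.
print("non-inferior (1% margin):", hi <= consultant_rate + 0.01)
print("clearly superior:", hi < consultant_rate)
```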

Discussion

DOI theory

The key aspects of Rogers’ DOI theory that were reflected in the findings were the perceived relative advantage, compatibility, complexity, trialability and observability of AI. These represent the five characteristics of an innovation explained in Table 1. Additionally, some elements of both the ‘time’ and the ‘social system’ aspects of Rogers’ model were reflected in the results (Table 4).

Table 4. Key findings mapped to the diffusion of innovations framework.

DOI theory suggests that of the five characteristics, relative advantage and compatibility are especially significant in explaining the rate of adoption.22 Indeed, these were the two characteristics that applied most often to the findings.

Implications for practice

Many of the findings corresponded to themes that are well explored in the existing literature; for example, the confusing regulatory landscape for AI, issues with data access to develop AI products, the view that some specialties are more amenable to AI and the need for education around AI.2,3,31–42

However, some findings were unexpected, moving beyond the existing literature. Many participants highlighted the need for improved language clarity around AI, and the term ‘AI’ itself was disliked by several interviewees. This issue of language clarity was not discussed in depth in the literature, with only one qualitative study indicating a poor understanding of the term ‘AI’.20 Moreover, most participants spoke of the financial pressures facing the NHS and highlighted how these may negatively impact the adoption of AI. These financial pressures are well documented generally; however, they did not feature in the literature on AI adoption.43,44 Another unanticipated finding was ambiguity around the gold standard that AI will need to reach before it can be deployed in a healthcare system. Champions (as facilitators of AI adoption) were also endorsed by several participants, despite the fact that they were not explored in the evidence base on AI adoption. Lastly, the transferability of an AI product from one healthcare setting to another was highlighted by several participants as likely to be poor, but this issue was not well explored in the literature and was only discussed in one narrative review.45

Some of the barriers to AI adoption highlighted by this study represent practical concerns. Considering the issue of language clarity when discussing AI, there was no consensus among participants on what terms should be used instead. This raises questions around how AI adoption can be meaningfully discussed among healthcare professionals if standardised terminology is yet to be agreed. Similarly, uncertainty regarding the gold standard for AI underlines the need for a national conversation regarding the threshold of performance that an AI tool has to meet before it can be deployed across the NHS. Ongoing uncertainty in this area has the potential to undermine confidence in AI among both healthcare professionals and the public. Lastly, the transferability of AI is significant because if new AI tools need to be developed or existing ones significantly re-designed to be used across different NHS sites, then this will result in an additional financial burden.

Four solutions to encourage AI adoption were proposed by participants. One was the use of champions, which are recognised to support the adoption of innovations in other sectors.46,47 Another was education for healthcare professionals and the public, emphasising the benefit AI can offer to clinicians’ working lives and sharing real-world case studies of AI use in the NHS to improve understanding. The final two solutions involved NHSX and similar organisations. Firstly, that these organisations should identify the areas in the NHS where AI can best address existing challenges and ask developers to focus on these. Secondly, that they could provide clarity on information governance, in the form of template data protection impact assessments, to support data sharing to develop AI products. The latter has also been suggested by others.31,33

Limitations

Although data saturation was reached, the sample size was smaller than planned. Additionally, all participants were male. This was unintentional: of the 20 potential participants contacted directly during recruitment, seven were women, but none participated. Women are underrepresented in leadership positions in both healthcare and health technology, so recruiting a gender-diverse sample was anticipated to be challenging.48–50 While the findings are internally coherent, they would likely be enhanced by a more gender-diverse sample, and this represents a future research area.

Furthermore, complete triangulation was not achieved, as data collection and analysis were conducted by a single researcher. Additionally, the research objective to explore barriers and facilitators within the context of radiology, pathology and general practice was not fully addressed, as some participants were unwilling to comment.

Conclusion

DOI theory was an applicable and relevant theoretical lens. The issue of language clarity around AI was a notable, and unexpected, finding which will have implications for practice. Additionally, the study identified some specialty-specific factors that could influence the adoption of AI, particularly affecting general practice, although these were not explored in as much depth as intended. Finally, there was disagreement among participants regarding whether the quality of IT infrastructure in some areas of the NHS acted as a barrier to AI adoption.

A significant implication for practice, which echoes the existing literature, is the need for education around AI for healthcare professionals, the media and the general public. This can promote understanding, while dispelling fears and myths. Unfortunately, this may be difficult as the media tends to seek a silver bullet for the challenges facing the NHS. Additionally, champions and real-world case studies should be considered as practical facilitators to support AI adoption within an NHS setting, while further clarity on information governance would also be welcomed.

Future research is needed to ascertain whether the quality of IT infrastructure could impact the ability of certain NHS organisations to adopt AI. Other areas of further research include the language around AI and specialty-specific barriers to AI adoption. Finally, future studies in this area should attempt to capture a more diverse sample of participants.

Supplementary material

Additional supplementary material may be found in the online version of this article at www.rcpjournals.org/fhj:

S1 – Reflexivity statement.

S2 – Interview topic guide.

S3 – Participant characteristics.

© Royal College of Physicians 2021. All rights reserved.

References

1. Fenech M. AI for physicians: What should you be doing? Royal College of Physicians, 2019. www.rcp.ac.uk/news/ai-physicians-what-should-you-be-doing [Accessed 02 December 2020].
2. Topol EJ. High-performance medicine: The convergence of human and artificial intelligence. Nat Med 2019;25:44–56.
3. Loh E. Medicine and the rise of the robots: A qualitative review of recent advances of artificial intelligence in health. BMJ Lead 2018;2:59–63.
4. Minsky ML. Semantic Information Processing. Massachusetts: The MIT Press, 1968.
5. Ongsulee P. Artificial intelligence, machine learning and deep learning. 15th International Conference on ICT and Knowledge Engineering (ICT&KE) 2017:1–6. https://ieeexplore.ieee.org/document/8259629 [Accessed 02 December 2020].
6. Panch T, Szolovits P, Atun R. Artificial intelligence, machine learning and health systems. J Glob Health 2018;8:020303.
7. Bini SA. Artificial intelligence, machine learning, deep learning, and cognitive computing: What do these terms mean and how will they impact health care? J Arthroplasty 2018;33:2358–61.
8. Kulkarni S, Seneviratne N, Baig MS, Khan AHA. Artificial intelligence in medicine: Where are we now? Acad Radiol 2020;27:62–70.
9. Hosny A, Parmar C, Quackenbush J, Schwartz LH, Aerts HJWL. Artificial intelligence in radiology. Nat Rev Cancer 2018;18:500–10.
10. Singh R, Kalra MK, Nitiwarangkul C, et al. Deep learning in chest radiography: Detection of findings and presence of change. PLoS One 2018;13:e0204155.
11. Lehman CD, Yala A, Schuster T, et al. Mammographic breast density assessment using deep learning: Clinical implementation. Radiology 2019;290:52–8.
12. Sathyakumar K, Munoz M, Singh J, et al. Automated lung cancer detection using artificial intelligence (AI) deep convolutional neural networks: A narrative literature review. Cureus 2020;12:e10017.
13. Steiner DF, MacDonald R, Liu Y, et al. Impact of deep learning assistance on the histopathologic review of lymph nodes for metastatic breast cancer. Am J Surg Pathol 2018;42:1636–46.
14. Nagpal K, Foote D, Liu Y, et al. Development and validation of a deep learning algorithm for improving Gleason scoring of prostate cancer. NPJ Digit Med 2019;2:48.
15. Martin J. Meeting pathology demand: Histopathology workforce census. London: Royal College of Pathologists, 2018.
16. Care Quality Commission. A national review of radiology reporting within the NHS in England. London: Care Quality Commission, 2018.
17. Royal College of General Practitioners. Artificial intelligence and primary care. London: RCGP, 2018.
18. Manning CL. Artificial intelligence could bring relevant guidelines into every consultation. BMJ 2019;366:l4788.
19. Summerton N, Cansdale M. Artificial intelligence and diagnosis in general practice. Br J Gen Pract 2019;69:324–5.
20. Laï MC, Brian M, Mamzer MF. Perceptions of artificial intelligence in healthcare: Findings from a qualitative survey study among actors in France. J Transl Med 2020;18:14.
21. Cresswell K, Cunningham-Burley S, Sheikh A. Health care robotics: Qualitative exploration of key challenges and future directions. J Med Internet Res 2018;20:e10410.
22. Rogers EM. Diffusion of Innovations, 5th edn. New York: Free Press, 2003.
23. Zhang X, Yu P, Yan J, et al. Using diffusion of innovation theory to understand the factors impacting patient acceptance and use of consumer e-health innovations: A case study in a primary care clinic. BMC Health Serv Res 2015;15:71.
24. Helitzer D, Heath D, Maltrud K, et al. Assessing or predicting adoption of telehealth using the diffusion of innovations theory: A practical example from a rural program in New Mexico. Telemed J E Health 2003;9:179–87.
25. Lee TT. Nurses' adoption of technology: Application of Rogers' innovation-diffusion model. Appl Nurs Res 2004;17:231–8.
26. Chew F, Grant W, Tote R. Doctors on-line: Using diffusion of innovations theory to understand internet use. Fam Med 2004;36:645–50.
27. Ash JS, Lyman J, Carpenter J, et al. A diffusion of innovations model of physician order entry. Proc AMIA Symp 2001:22–6.
28. Creswell J. Research Design: Qualitative, quantitative, and mixed methods approaches, 4th edn. Thousand Oaks: SAGE Publications, 2013.
29. Payne G, Payne J. Key concepts in social research. London: SAGE Publications, 2004.
30. Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol 2006;3:77–101.
31. Reform. Data-driven healthcare: Regulation & regulators. Reporting of the findings & methodology. London: Reform, 2019.
32. The AHSN Network. Accelerating artificial intelligence in health and care: Results from a state of the nation survey. The AHSN Network, 2018. https://wessexahsn.org.uk/img/news/AHSN%20Network%20AI%20Report-1536078823.pdf [Accessed 02 December 2020].
33. House of Lords Select Committee on Artificial Intelligence. AI in the UK: Ready, willing and able? London: Authority of the House of Lords, 2018.
34. Harwich E, Laycock K. Thinking on its own: AI in the NHS. London: Reform, 2018.
35. Academy of Medical Royal Colleges. Artificial intelligence in healthcare. London: AoMRC, 2019.
36. Ahmad OF, Stoyanov D, Lovat LB. Barriers and pitfalls for artificial intelligence in gastroenterology: Ethical and regulatory issues. Tech Gastrointest Endosc 2020;22:80–4.
37. Joshi I, Morley J. Artificial Intelligence: How to get it right. Putting policy into practice for safe data-driven innovation in health and care. London: NHSX, 2019.
38. Royal College of Radiologists. RCR position statement on artificial intelligence. RCR, 2018. www.rcr.ac.uk/posts/rcr-position-statement-artificial-intelligence [Accessed 02 December 2020].
39. Royal College of Physicians. RCP calls on health sector to ensure workforce is fully equipped for digital future. RCP, 2019. www.rcp.ac.uk/news/rcp-calls-health-sector-ensure-workforce-fully-equipped-digital-future [Accessed 02 December 2020].
40. Royal College of General Practitioners. Artificial intelligence and primary care. London: RCGP, 2018.
41. Blease C, Kaptchuk TJ, Bernstein MH, et al. Artificial intelligence and the future of primary care: Exploratory qualitative study of UK general practitioners' views. J Med Internet Res 2019;21:e12802.
42. Haan M, Ongena YP, Hommes S, et al. A qualitative study to understand patient perspective on the use of artificial intelligence in radiology. J Am Coll Radiol 2019;16:1416–9.
43. Robertson R, Wenzel L, Thompson J, et al. Understanding NHS financial pressures: How are they affecting patient care? London: The King's Fund, 2017.
44. Kraindler J, Firth Z, Charlesworth A. False economy: An analysis of NHS funding pressures. London: The Health Foundation, 2018.
45. Carter SM, Rogers W, Win KT, et al. The ethical, legal and social implications of using artificial intelligence systems in breast cancer care. Breast J 2020;49:25–32.
46. Chesbrough H, Crowther AK. Beyond high tech: Early adopters of open innovation in other industries. R D Manag 2006;36:229–36.
47. Sergeeva N. What makes an ‘innovation champion’? Eur J Innov Man 2016;19:72–89.
48. Denend L, McCutcheon S, Regan M, et al. Analysis of gender perceptions in health technology: A call to action. Ann Biomed Eng 2020;48:1573–86.
49. Kvaerner KJ, Aasland OG, Botten GS. Female medical leadership: Cross sectional study. BMJ 1999;318:91–4.
50. Kuhlmann E, Ovseiko PV, Kurmeyer C, et al. Closing the gender leadership gap: A multi-centre cross-country comparison of women in management and leadership in academic health centres in the European Union. Hum Resour Health 2017;15:2.