Stakeholder Perceptions of Challenges and Benefits of AI in Diagnostic Imaging: A Systematic Thematic Exploration within the NHS
DOI: https://doi.org/10.70749/ijbr.v3i9.2761
Keywords: Artificial Intelligence; Diagnostic Imaging; Stakeholder Perceptions; NHS; Thematic Analysis
Abstract
This study critically investigates NHS stakeholder perceptions of artificial intelligence (AI) adoption within diagnostic imaging, exposing the socio-technical, ethical, and institutional tensions that underpin implementation challenges. Despite sustained policy investment, much of the extant literature remains techno-centric, overlooking the epistemic concerns, professional disempowerment, and legitimacy anxieties of frontline radiographers, radiologists, patients, and healthcare leaders. To address this gap, the study adopts a theory-informed secondary qualitative synthesis, integrating the Technology Acceptance Model (TAM) and Stakeholder Theory to interrogate how acceptance, trust, and governance perceptions shape AI readiness. Thirteen UK-based empirical studies (2020–2025) were selected through a PRISMA-guided protocol and analysed using Braun and Clarke's six-phase reflexive thematic analysis. Seven analytically distinct themes emerged: Perceived Benefits of AI; Trust, Explainability, and Human-AI Collaboration; Governance, Ethical, and Safety Barriers; Workforce Readiness and Education Gaps; Equity, Inclusivity, and Bias Risks; Stakeholder Engagement and Co-Production; and Sustainability, Funding, and Public Trust. Findings reveal that trust in AI is not reducible to system accuracy or explainability; it is also shaped by power asymmetries, legitimacy deficits, and a lack of structured co-production. Educational gaps, governance ambiguities, and algorithmic bias further exacerbate stakeholder misalignment. Although reliant on secondary data, the study compensates for this limitation through methodological rigour and conceptual triangulation. It offers a novel theoretical and empirical contribution by mapping stakeholder-specific tensions and advancing a multidimensional framework for ethically aligned AI governance. The study concludes that responsible AI integration in NHS diagnostic imaging depends not solely on technical innovation, but on participatory design, equitable stakeholder inclusion, and institutional trust-building across all levels of the health system.
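As a purely illustrative aid to the analytical workflow summarised above, the sketch below shows how a theme-by-study coding matrix from a reflexive thematic analysis might be tallied in Python. It is not the authors' analysis pipeline: the study identifiers and theme assignments are hypothetical placeholders, and only the seven theme labels are taken from the abstract.

```python
from collections import Counter

# The seven themes reported in the abstract.
THEMES = [
    "Perceived Benefits of AI",
    "Trust, Explainability, and Human-AI Collaboration",
    "Governance, Ethical, and Safety Barriers",
    "Workforce Readiness and Education Gaps",
    "Equity, Inclusivity, and Bias Risks",
    "Stakeholder Engagement and Co-Production",
    "Sustainability, Funding, and Public Trust",
]

# Hypothetical coding matrix: each included study mapped to the themes it
# contributes evidence to. These assignments are placeholders for
# illustration, not the study's actual coding.
coding_matrix = {
    "Study_01": {THEMES[0], THEMES[1]},
    "Study_02": {THEMES[1], THEMES[2], THEMES[5]},
    "Study_03": {THEMES[3], THEMES[4], THEMES[6]},
    # ... the remaining ten studies would be coded in the same way
}

def theme_frequencies(matrix):
    """Count how many studies contribute evidence to each theme."""
    counts = Counter()
    for themes in matrix.values():
        counts.update(themes)
    return {theme: counts.get(theme, 0) for theme in THEMES}

if __name__ == "__main__":
    for theme, n in theme_frequencies(coding_matrix).items():
        print(f"{n:2d} of 13 studies -> {theme}")
```

In practice such coding is carried out interpretively, by hand or in qualitative analysis software; the sketch is offered only to make the structure of the theme-by-study mapping concrete.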
References
1. Al-Zahrani, A. M., & Alasmari, T. M. (2025). A comprehensive analysis of AI adoption, implementation strategies, and challenges in higher education across the Middle East and North Africa (MENA) region. Education and Information Technologies. https://doi.org/10.1007/s10639-024-13300-y
2. Aravazhi, P. S., Ravindran, K. O., Balasubramani, K., Kamil, M., Gouthaman, K., Karki, L., Thiyagarajan, S., & Nair, A. S. (2024). Radiologists’ perceptions and readiness for integrating artificial intelligence in diagnostic imaging: A survey-based study. Bioinformation. https://doi.org/10.6026/9732063002001943
3. Assaf, R., Omar, M., Saleh, Y., Attar, H., Alaqra, N. T., & Kanan, M. (2024). Assessing the Acceptance for Implementing Artificial Intelligence Technologies in the Governmental Sector: An Empirical Study. Engineering, Technology & Applied Science Research, 14(6), 18160–18170.
4. Born, J., Beymer, D., Rajan, D., Coy, A., Mukherjee, V. V., Manica, M., ... & Rosen-Zvi, M. (2021). On the role of artificial intelligence in medical imaging of COVID-19. Patterns, 2(6).
5. Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101. https://doi.org/10.1191/1478088706qp063oa
6. Brown, A., & Hartley, K. (2021). Digital transformation in community nursing. British Journal of Community Nursing, 26(9), 422–427. https://doi.org/10.12968/bjcn.2021.26.9.422
7. Buolamwini, J., & Gebru, T. (2018, January). Gender shades: Intersectional accuracy disparities in commercial gender classification. In Conference on Fairness, Accountability and Transparency (pp. 77–91). PMLR.
8. Byrne, D. (2022). A worked example of Braun and Clarke's approach to reflexive thematic analysis. Quality & Quantity, 56(3), 1391–1412.
9. Campbell, K. A., Orr, E., Durepos, P., Nguyen, L., Li, L., Whitmore, C., Gehrke, P., Graham, L., & Jack, S. M. (2021). Reflexive thematic analysis for applied qualitative health research. The Qualitative Report, 26(6), 2011–2028.
10. Chada, B. V., & Summers, L. (2022). AI in the NHS: a framework for adoption. Future Healthcare Journal, 9(3), 313–316.
11. Chen, C., & Sundar, S. S. (2024). Communicating and combating algorithmic bias: effects of data diversity, labeler diversity, performance bias, and user feedback on AI trust. Human–Computer Interaction, 1-37.
12. Curry, L. A., Nembhard, I. M., & Bradley, E. H. (2009). Qualitative and Mixed Methods Provide Unique Contributions to Outcomes Research. Circulation, 119(10), 1442–1452. https://doi.org/10.1161/CIRCULATIONAHA.107.742775
13. Cushnan, D., Bennett, O., Berka, R., Bertolli, O., Chopra, A., Dorgham, S., Favaro, A., Ganepola, T., Halling-Brown, M., & Imreh, G. (2021). An overview of the National COVID-19 Chest Imaging Database: Data quality and cohort analysis. Gigascience, 10(11), giab076.
14. Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly, 13(3), 319–340.
15. Doherty, G., McLaughlin, L., Hughes, C., McConnell, J., Bond, R., & McFadden, S. (2024). Radiographer Education and Learning in Artificial Intelligence (REAL-AI): A survey of radiographers, radiologists, and students’ knowledge of and attitude to education on AI. Radiography, 30, 79–87.
16. Fazakarley, C. A., Breen, M., Leeson, P., Thompson, B., & Williamson, V. (2023). Experiences of using artificial intelligence in healthcare: A qualitative study of UK clinician and key stakeholder perspectives. BMJ Open, 13(12), e076950.
17. Freeman, R. E. (1984). Strategic management: A stakeholder approach. Cambridge University Press.
18. Goktas, P., & Grzybowski, A. (2025). Shaping the future of healthcare: Ethical clinical challenges and pathways to trustworthy AI. Journal of Clinical Medicine, 14(5), 1605. https://doi.org/10.3390/jcm14051605
19. Hercheui, M., & Mech, G. (2021). Factors affecting the adoption of artificial intelligence in healthcare. Global Journal of Business Research, 15(1), 77–88.
20. HM Government. (2021, September). National AI strategy. Department for Digital, Culture, Media and Sport. https://assets.publishing.service.gov.uk/media/614db4ecd3bf7f7187208500/National_AI_Strategy__mobile_version_.pdf
21. Hole, L. (2024). Handle with care; considerations of Braun and Clarke approach to thematic analysis. Qualitative Research Journal, 24(4), 371–383.
22. Iacobucci, G. (2020). Government’s plan to digitise NHS risks wasting billions, MPs warn. British Medical Journal Publishing Group.
23. Ibrahim, F., Münscher, J.-C., Daseking, M., & Telle, N.-T. (2025). The technology acceptance model and adopter type analysis in the context of artificial intelligence. Frontiers in Artificial Intelligence, 7, 1496518.
24. Ivchyk, V. (2024). Overcoming barriers to artificial intelligence adoption. Three Seas Economic Journal. https://doi.org/10.30525/2661-5150/2024-4-3
25. Karpathakis, K., Pencheon, E., & Cushnan, D. (2024). Learning from international comparators of national medical imaging initiatives for AI development: Multiphase qualitative study. JMIR AI, 3(1), e51168.
26. Karran, A., Charland, P., Martineau, J.-T., Guinea, A. O. de, Lesage, A., Sénécal, S., & Léger, P.-M. (2024). Multi-stakeholder perspective on responsible artificial intelligence and acceptability in education. arXiv:2402.15027. https://doi.org/10.48550/arXiv.2402.15027
27. Kuo, R. Y. L., Freethy, A., Smith, J., Hill, R., Jerome, D., Harriss, E., Collins, G. S., Tutton, E., & Furniss, D. (2024). Stakeholder perspectives towards diagnostic artificial intelligence: A co-produced qualitative evidence synthesis. EClinicalMedicine, 71. https://www.thelancet.com/journals/eclinm/article/PIIS2589-5370(24)00134-2/fulltext?uuid=uuid%3Adfea1c5b-cdf1-4b2f-b17d-d4985801fe2e
28. Larson, D. B., Magnus, D. C., Lungren, M. P., Shah, N. H., & Langlotz, C. P. (2020). Ethics of using and sharing clinical imaging data for artificial intelligence: A proposed framework. Radiology, 295(3), 675–682.
29. Lip, G., Novak, A., Goyen, M., Boylan, K., & Kumar, A. (2024). Adoption, orchestration, and deployment of artificial intelligence within the National Health Service—facilitators and barriers: An expert roundtable discussion. BJR|Artificial Intelligence, 1(1), ubae009.
30. Majrashi, K. (2024). Determinants of Public Sector Managers’ Intentions to Adopt AI in the Workplace. International Journal of Public Administration in the Digital Age (IJPADA), 11(1), 1–26.
31. Morley, J., & Floridi, L. (2020). NHS AI Lab: Why we need to be ethically mindful about AI for healthcare.
32. Morrow, V., Boddy, J., & Lamb, R. (2014). The ethics of secondary data analysis: Learning from the experience of sharing qualitative data from young people and their families in an international study of childhood poverty.
33. National Health Executive. (2025, January 29). New report outlines NHS failures. https://www.nationalhealthexecutive.com/articles/new-report-outlines-nhs-failures
34. Newlands, R., Bruhn, H., Díaz, M. R., Lip, G., Anderson, L. A., & Ramsay, C. (2024). A stakeholder analysis to prepare for real-world evaluation of integrating artificial intelligent algorithms into breast screening (PREP-AIR study): A qualitative study using the WHO guide. BMC Health Services Research, 24(1), 569. https://doi.org/10.1186/s12913-024-10926-z
35. NHS England. (2019). The NHS long term plan (Version 1.2). https://www.longtermplan.nhs.uk/wp-content/uploads/2019/08/nhs-long-term-plan-version-1.2.pdf
36. NHS England. (2022, July). Our 2022/23 business plan. https://www.england.nhs.uk/wp-content/uploads/2022/08/B1733_Our-2022-23-Business-Plan_July-2022.pdf
37. NHS Transformation Directorate. (2021, June 30). Round 3 of the Artificial Intelligence in Health and Care Award is now open. NHS England. https://transform.england.nhs.uk/blogs/round-3-of-the-artificial-intelligence-in-health-and-care-award-is-now-open/
38. NHSX. (2019). Artificial intelligence: How to get it right – Putting policy into practice for safe data-driven innovation in health and care (p. 8). NHS Transformation Directorate. https://transform.england.nhs.uk/media/documents/NHSX_AI_report.pdf
39. Nirapai, A., & Leelasantitham, A. (2024). A new adoption model for quality of experience assessed by radiologists using AI medical imaging technology. Journal of Open Innovation: Technology, Market, and Complexity, 10(3), 100369.
40. Oxford Analytica. (2019). AI adoption in UK healthcare will be slower than hoped. Emerald Expert Briefings. https://doi.org/10.1108/OXAN-DB247738
41. Rainey, C., Bond, R., McConnell, J., Hughes, C., Kumar, D., & McFadden, S. (2024). Reporting radiographers’ interaction with Artificial Intelligence—How do different forms of AI feedback impact trust and decision switching? PLOS Digital Health, 3(8), e0000560.
42. Rainey, C., O’Regan, T., Matthew, J., Skelton, E., Woznitza, N., Chu, K.-Y., Goodman, S., McConnell, J., Hughes, C., & Bond, R. (2021). Beauty is in the AI of the beholder: Are we ready for the clinical integration of artificial intelligence in radiography? An exploratory analysis of perceived AI knowledge, skills, confidence, and education perspectives of UK radiographers. Frontiers in Digital Health, 3, 739327.
43. Rainey, C., O’Regan, T., Matthew, J., Skelton, E., Woznitza, N., Chu, K.-Y., Goodman, S., McConnell, J., Hughes, C., & Bond, R. (2022). UK reporting radiographers’ perceptions of AI in radiographic image interpretation–Current perspectives and future developments. Radiography, 28(4), 881–888.
44. Rawashdeh, M. A., Almazrouei, S., Zaitoun, M., Kumar, P., & Saade, C. (2024). Empowering radiographers: A call for integrated AI training in university curricula. International Journal of Biomedical Imaging, 2024(1), 7001343.
45. Redruello-Guerrero, P., Jimenez-Gutierrez, C., Ramos-Bossini, A. L., Jiménez-Gutiérrez, P. M., Rivera-Izquierdo, M., & Sánchez, J. B. (2022). Artificial intelligence for the triage of COVID-19 patients at the emergency department: a systematic review. Signa Vitae, 18(6).
46. Sandhu, M. (2025, March). AI in healthcare: Why public trust remains the critical missing ingredient. Nuom Health. https://www.nuom.health/insights/ai-healthcare-public-trust
47. Stogiannos, N., O’Regan, T., Scurr, E., Litosseliti, L., Pogose, M., Harvey, H., Kumar, A., Malik, R., Barnes, A., & McEntee, M. F. (2024). AI implementation in the UK landscape: Knowledge of AI governance, perceived challenges and opportunities, and ways forward for radiographers. Radiography, 30(2), 612–621.
48. Sujan, M. A., White, S., Habli, I., & Reynolds, N. (2022). Stakeholder perceptions of the safety and assurance of artificial intelligence in healthcare. Safety Science, 155, 105870.
49. Tanweer, A., Gade, E. K., Krafft, P. M., & Dreier, S. K. (2021). Why the Data Revolution Needs Qualitative Methods. Harvard Data Science Review.
50. Topff, L., Sánchez-García, J., López-González, R., Pastor, A. J., Visser, J. J., Huisman, M., ... & Imaging COVID-19 AI initiative. (2023). A deep learning-based application for COVID-19 diagnosis on CT: The Imaging COVID-19 AI initiative. PLOS ONE, 18(5), e0285121.
51. Topol Review. (2019). The Topol Review: Preparing the healthcare workforce to deliver the digital future. NHS Health Education England. https://topol.hee.nhs.uk/
52. Tripathy, J. P. (2013). Secondary data analysis: Ethical issues and challenges. Iranian Journal of Public Health, 42(12), 1478.
53. Williams, R. T. (2024). Paradigm shifts: Exploring AI’s influence on qualitative inquiry and analysis. Frontiers in Research Metrics and Analytics, 9, 1331589.
License
Copyright (c) 2025 Indus Journal of Bioscience Research

This work is licensed under a Creative Commons Attribution 4.0 International License.