USE OF ARTIFICIAL INTELLIGENCE TOOLS IN EMERGENCY MEDICINE: A CROSS-SECTIONAL SURVEY AND NARRATIVE REVIEW

Dr. Moses Akpoghene

Keywords

ChatGPT, Artificial Intelligence (AI), NHS.

Abstract

The use of artificial intelligence (AI) tools in emergency medicine is expanding rapidly, with clinicians increasingly integrating platforms such as ChatGPT, Gemini, and DeepSeek into their daily workflows. Despite the growing prevalence of these tools, there remains a lack of formal guidance around their use, particularly in relation to data governance, medico-legal accountability, and clinical safety. Understanding current utilisation patterns, clinician perceptions, and safety concerns is essential to guide safe and effective AI adoption in emergency departments (EDs).


Objective: To evaluate awareness, utilisation, and clinician attitudes toward AI tools among emergency medicine professionals in the UK, and to identify practical concerns and recommendations for integrating AI safely into NHS emergency care.


Methods: A cross-sectional survey was conducted among emergency medicine clinicians across various NHS hospital types. A total of 31 participants completed a structured questionnaire covering awareness, frequency of use, perceived safety, cognitive workload impact, and preferred safeguards. Responses were analysed descriptively, with data represented as frequencies and percentages.


Results: AI awareness was high, with 93.5% of respondents reporting familiarity with tools such as ChatGPT and Gemini. Over 83% reported occasional or regular use, predominantly for documentation (63%), clinical decision support (38.7%), prescribing guidance (38.7%), triage stratification (38.7%), and patient explanation (38.7%). While 77.4% believed AI could reduce cognitive workload, 87.1% cited data protection and GDPR compliance as the primary concern. Other concerns included medico-legal accountability (67.7%) and algorithmic bias (61.3%). All respondents supported NHS-hosted, GDPR-compliant AI tools, and 93.5% recommended formal training by Trusts. Open-ended responses emphasised the need for rigorous testing, clinician oversight, local hosting, and integration into existing EPR systems.


Conclusion: Emergency clinicians demonstrate strong awareness of, and emerging reliance on, AI tools, primarily for documentation and clinical decision support. However, widespread adoption is hindered by significant data governance concerns and the absence of NHS-sanctioned infrastructure. There is an urgent need for policy frameworks, training pathways, and secure AI integrations that align with legal, ethical, and clinical standards in the UK emergency care context.
