Could artificial intelligence write mental health nursing care plans?
Samuel Woodnutt, Chris Allen, Jasmine Snowden, Matt Flynn, Simon Hall, Paula Libberton, Francesca Purvis - Psychiatric Mental Health
Accessible Summary
What is Known on the Subject?
Artificial intelligence (AI) is freely available, responds to very basic text input (such as a question) and can now create a wide range of outputs, communicating in many languages or art forms. AI platforms such as OpenAI's ChatGPT can generate passages of text that could be used to write plans of care for people with mental health needs. Such output can be difficult to distinguish from human writing, so there is a risk that AI use could go unnoticed.
What this Paper Adds to Existing Knowledge?
Whilst it is known that AI can produce text and pass pre‐registration health‐profession exams, it is not known whether AI can produce meaningful results for care delivery. We asked ChatGPT basic questions about a fictitious person presenting with self‐harm and then evaluated the quality of the output. We found that the output could appear reasonable to a layperson, but it contained significant errors and raised ethical issues. There are potential harms to people in care if AI is used without an expert correcting or removing these errors.