With AI getting more intelligent each year, as evidenced by the recent news about ChatGPT passing the U.S. Medical Licensing Exam, many wonder what role AI will play in healthcare in the future. At this point, and with the technology available, it is safe to say that AI tools can play a valuable role in increasing efficiency and automating routine tasks – all valuable assets as providers strive to meet the growing needs of their communities.
While we are not suggesting that AI chatbots replace physicians, the technology is surprisingly good at producing correct answers and the rationale for reaching them. For now, it seems more likely that ChatGPT will be used to augment repetitive tasks that take up healthcare workers' time.
So, What is ChatGPT?
You have probably seen bots pop up on a website with a cute window and a “Can I help you?” message. You may have used one for customer service questions or on an internet shopping site to check on a package. These bots are typically trained with a range of text responses, and if your question doesn’t fit the pre-programmed list, then you get transferred to a human.
OpenAI’s groundbreaking GPT chatbot is changing the range of what a bot can do. GPT is short for generative pre-trained transformer – essentially an AI assistant that has been trained on enormous amounts of data and can provide answers to virtually any request. The newest version, ChatGPT, has been trained on billions of text samples and can generate useful results based on that knowledge. This AI-powered chatbot is capable of mirroring natural human conversation – and even producing songs, poetry, essays, and computer code.
The results are pretty impressive and, for the most part, sound like they were written by a human. The way ChatGPT replies can even be customized upon request. In an example given in a recent Uproxx article, the bot was asked to explain tort reform. The response was full of legal language and read like an encyclopedia entry. Then, amazingly, the bot was asked to explain the same concept to a ten-year-old. The new response was full of comparisons to age-appropriate games and used simpler language that made it easier to understand.
The possibilities are endless – results can be delivered in any language, written as a haiku, turned into your wedding thank-you cards, or shaped into a cover letter for a specific job opportunity.
Potential Healthcare Applications
With the vast range of tasks that ChatGPT can handle, it makes sense that it would be used in healthcare – one of our most overburdened and understaffed industries. According to Medical Economics, healthcare professionals who use ChatGPT are leaning into the technology in the following ways:
• To summarize patient records. ChatGPT can be taught to act as a virtual assistant for physicians. It can extract essential information from patient records, populating reports with data like family history, symptoms, current medications, allergies, labs, etc. This would give physicians an efficient summary at their fingertips, freeing up time to see more patients and focus more on patient care.
• To enhance clinical decision support systems. The technology can be trained to match information such as test results, labs, vital signs, and symptoms and give physicians recommendations for patient care and treatment decisions.
• To automate administrative functions. The combination of AI and healthcare may reduce the amount of time that doctors and their staff spend navigating insurance approvals, writing referral letters, and handling claim denials.
• To improve patient education. Physicians could use ChatGPT to keep patients informed throughout treatment by simplifying medical notes into layperson’s language and communicating that information.
• To provide answers to patient questions (FAQs). It is often difficult to reach a physician during the workday because of their busy schedule. ChatGPT can answer patients’ frequently asked questions about their diagnoses or about managing their conditions.
What is the Downside to AI and Healthcare?
While the technology is certainly stunning, and many are eager to adapt it to the healthcare world, there are some legal and ethical gray areas. ChatGPT has been shown to make up references or provide factually incorrect information. It was also trained only on information through 2021, meaning it will need to be updated consistently to stay relevant. The bot also learns from its input, meaning it can produce biased or unsuitable responses.
ChatGPT and Mental Health
One problem that healthcare has been struggling with is improving patient access to mental health resources. More patients than ever before are seeking help – without finding the psychiatrists and therapists they need available in their area. The AI-powered chatbot is particularly good at producing conversation-like text answers and large volumes of information.
AI-powered counseling sessions are already in use for mental health support at Woebot Health, which touts talk sessions that fit into your life whenever you want to chat – no appointments and no waiting rooms. This is certainly an advantage for those in need of a friendly, knowledgeable, non-judgmental conversation. One psychiatrist’s professional review listed several pros but also some cons, such as:
- Unable to provide much correction in response to free text, such as challenging the user’s thought patterns the way a real therapist would.
- Typically requires scrolling to find past examples.
- Can be repetitive and get stuck in a feedback loop.
The bot is trained with cognitive behavioral therapy techniques and has appropriate responses to most simple questions and comments. Psychiatrist Colleen Stiles-Shields recommends Woebot as a supplement to therapy between visits and for mild daily mental health issues, such as difficulty sleeping. She does not recommend it for diagnosis or as a sole treatment option. To its credit, Woebot also uses patient health questionnaire items to track moods, depressive symptoms, anxiety, and suicidal ideation.
Whatever the current concerns, it looks like AI tools like ChatGPT are here to stay. Overall, the legal and ethical use of AI in healthcare will require careful consideration of privacy, HIPAA compliance, security, accuracy, transparency, accountability, and informed consent by users. There are certainly improvements to be made and “bugs” to be worked out, but the technology shows promise for augmenting the abilities of the healthcare workforce in the future.
FormDr is trusted by over 7,000 providers to supply HIPAA-compliant electronic forms and form templates. By eliminating paper forms, practices reduce administrative burden and costs, easily find forms that are uploaded directly into the EHR, and streamline work processes. The FormDr benefits are many – find out how we can help increase your medical office’s efficiency by contacting us for a free demo.