Do the Challenges Outweigh the Potential of Using ChatGPT in CDI?
ChatGPT in Clinical Documentation Integrity (CDI) is no longer hypothetical. In the rapidly evolving realm of artificial intelligence (AI), the chatter has reached a fever pitch around the potential uses of ChatGPT, OpenAI's conversational AI model. Since its unveiling in November 2022, advocates have championed the technology's potential in sectors as varied as healthcare, marketing, law, and education. The CDI space is in the crosshairs of this discussion, especially as AI continues to reshape healthcare delivery, and concerns have already surfaced about potential malpractice risks associated with AI use. Still, the question remains: can ChatGPT solve CDI's persistent issues and challenges?

The Iodine Intelligence podcast, hosted by Iodine Software’s Marketing Manager, Lauren Hickey, grapples with this timely question. Hickey’s guest, Fran Jurcak, Chief Clinical Officer at Iodine Software, joined the conversation to dissect the implications of ChatGPT in CDI.

Hickey and Jurcak discuss the following:

  • The capabilities and limitations of ChatGPT and its underlying Natural Language Processing (NLP) technology in the context of CDI
  • The potential issues with the biases inherent in AI technologies, including ChatGPT
  • The future possibilities for CDI through a blend of AI technologies and whether ChatGPT could play a role in that solution

Fran Jurcak, a seasoned healthcare leader and the Chief Clinical Officer at Iodine Software, brings a wealth of knowledge and experience to this conversation. Known for her clinical strategy expertise, Jurcak has been instrumental in exploring and applying AI’s potential within the CDI landscape, leveraging her rich background in healthcare, nursing, and technology to chart a forward-thinking path for Iodine Software.

ChatGPT's potential in CDI is a complex issue, as the conversation reveals. While the technology's ability to generate original content from a user's request shows promise, its limitations pose significant challenges: it cannot interpret information contextually or identify missing data, and there is no guarantee that physicians will meticulously check its output for accuracy. Moreover, the biases embedded in the online textual data used to train the model could carry serious ethical implications when applied to medical documentation. According to Jurcak, while ChatGPT alone may not be the magic bullet for CDI, its integration with other AI technologies could open new avenues for innovation in the future.
