Iodine Software is joining the growing list of healthcare companies looking to harness the power of large language models to reduce clinician burnout.
The Austin-based company — which sells technology designed to improve providers’ clinical documentation, coding accuracy and revenue cycle management — recently announced that it is expanding its relationship with OpenAI. By doing so, Iodine will integrate the latest GPT-4 language model into its existing AI technologies and datasets, specifically its AwareCDI product suite.
This product suite is designed to help providers improve the performance of their clinical documentation improvement (CDI) teams, as well as drive revenue in the mid-cycle, said Iodine CEO William Chan.
“It applies AI to ‘think’ like a CDI professional, helping them solve numerous documentation integrity problems that ensure clinical record accuracy and prevent revenue leakage,” he said.
Clinicians often spend more than half of their workday on administrative responsibilities, which can include hours of clinical documentation, Chan pointed out. Under that burden, documentation may be incomplete or inaccurate, compromising the clarity of the clinical record. Such gaps can significantly affect clinical quality and reimbursement, potentially causing providers to miss out on crucial revenue.
Iodine’s software seeks to address this problem by locating inconsistencies between clinical evidence and documentation, tracking post-discharge and pre-bill records, and enabling better collaboration between clinician and CDI teams. The company is now working to improve its technology’s capabilities with the addition of GPT-4.
Iodine is exploring a few areas where the generative AI model could be useful, including predictive analytics, clinical automation and streamlining the physician query process, Chan said.
“Incorporating generative AI into AwareCDI will significantly improve the technology’s prediction accuracy and will streamline the physician query process. We are already using AI and machine learning, but training large language models like GPT-4 on very large datasets like ours will only improve our applications,” he explained.
Chan pointed out that AI’s effectiveness is generally dependent on the data on which it’s trained. Advanced AI models from companies like OpenAI can boost their predictive power by training on the most robust datasets, he added.
Iodine’s dataset contains more than 27% of all U.S. inpatient data, Chan said. In his view, this dataset is a “powerful asset” for training large language models.
Iodine counts about 900 U.S. hospitals as customers, including Baylor Scott & White and Baptist Health Jacksonville.
Photo: hirun, Getty Images