Artificial intelligence based assessment of clinical reasoning documentation: an observational study of the impact of the clinical learning environment on resident documentation quality

Bibliographic Details
Main Authors: Verity Schaye, David J. DiTullio, Daniel J. Sartori, Kevin Hauck, Matthew Haller, Ilan Reinstein, Benedict Guzman, Jesse Burk-Rafel
Format: Article
Language: English
Published: BMC, 2025-04-01
Series: BMC Medical Education
Subjects: Artificial intelligence; Clinical reasoning; Documentation; Clinical learning environment
Online Access:https://doi.org/10.1186/s12909-025-07191-x
description Abstract Background Objective measures and large datasets are needed to determine which aspects of the Clinical Learning Environment (CLE) impact the essential skill of clinical reasoning documentation. Artificial Intelligence (AI) offers a solution. Here, the authors sought to determine what aspects of the CLE might be impacting resident clinical reasoning documentation quality as assessed by AI. Methods In this observational, retrospective cross-sectional analysis of hospital admission notes from the Electronic Health Record (EHR), all categorical internal medicine (IM) residents who wrote at least one admission note during the study period July 1, 2018 – June 30, 2023 at two sites of NYU Grossman School of Medicine's IM residency program were included. Clinical reasoning documentation quality of admission notes was classified as low- or high-quality using a supervised machine learning model. From note-level data, the shift (day or night) and note index within shift (whether a note was the first, second, etc., within a shift) were calculated. These aspects of the CLE were included as potential markers of workload, which has been shown to have a strong relationship with resident performance. Patient data were also captured, including age, sex, Charlson Comorbidity Index, and primary diagnosis. The relationship between these variables and clinical reasoning documentation quality was analyzed using generalized estimating equations accounting for resident-level clustering. Results Across 37,750 notes authored by 474 residents, notes written for patients who were older, had more pre-existing comorbidities, or presented with certain primary diagnoses (e.g., infectious and pulmonary conditions) were more likely to show high clinical reasoning documentation quality. When controlling for these and other patient factors, variables associated with clinical reasoning documentation quality included academic year (adjusted odds ratio, aOR, for high-quality: 1.10; 95% CI 1.06–1.15; P <.001), night shift (aOR 1.21; 95% CI 1.13–1.30; P <.001), and note index (aOR 0.93; 95% CI 0.90–0.95; P <.001). Conclusions AI can be used to assess complex skills such as clinical reasoning in authentic clinical notes, helping elucidate the potential impact of the CLE on resident clinical reasoning documentation quality. Future work should explore residency program and systems interventions to optimize the CLE.
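The two workload markers described in the Methods (shift and note index within shift) can be derived directly from note timestamps. A minimal sketch in standard-library Python, not the authors' code: the field names, the 07:00–19:00 day-shift boundary, and the simplification that a shift does not cross midnight are all illustrative assumptions.

```python
from datetime import datetime

def shift_of(ts: datetime) -> str:
    """Label a note as day- or night-shift (assumed boundary: 07:00-18:59 = day)."""
    return "day" if 7 <= ts.hour < 19 else "night"

def add_workload_markers(notes):
    """notes: list of dicts with 'resident' and 'written_at' (datetime).

    Returns new dicts with 'shift' and 'note_index', where note_index = 1
    for the first note a resident writes in a given shift on a given date.
    Simplification: a night shift is keyed by calendar date, so notes after
    midnight would start a new count."""
    out = []
    counters = {}  # (resident, date, shift) -> notes seen so far
    for n in sorted(notes, key=lambda n: (n["resident"], n["written_at"])):
        shift = shift_of(n["written_at"])
        key = (n["resident"], n["written_at"].date(), shift)
        counters[key] = counters.get(key, 0) + 1
        out.append({**n, "shift": shift, "note_index": counters[key]})
    return out

notes = [
    {"resident": "A", "written_at": datetime(2023, 3, 1, 8, 30)},
    {"resident": "A", "written_at": datetime(2023, 3, 1, 11, 0)},
    {"resident": "A", "written_at": datetime(2023, 3, 1, 22, 15)},
]
for row in add_workload_markers(notes):
    print(row["resident"], row["shift"], row["note_index"])
# -> A day 1 / A day 2 / A night 1
```

With markers like these attached to each note, the quality outcome can then be modeled with generalized estimating equations clustered on the resident, as the abstract describes.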
issn 1472-6920
Author affiliations:
Verity Schaye, David J. DiTullio, Daniel J. Sartori, Kevin Hauck, Matthew Haller, Jesse Burk-Rafel: Department of Medicine, New York University Grossman School of Medicine
Ilan Reinstein: Institute for Innovations in Medical Education, New York University Grossman School of Medicine
Benedict Guzman: Division of Applied AI Technologies, New York University Langone Health
topic Artificial intelligence
Clinical reasoning
Documentation
Clinical learning environment