Research Article: Characterizing electronic health record usage patterns of inpatient medicine residents using event log data

Date Published: February 6, 2019

Publisher: Public Library of Science

Author(s): Jason K. Wang, David Ouyang, Jason Hom, Jeffrey Chi, Jonathan H. Chen, Brenessa Lindeman.

http://doi.org/10.1371/journal.pone.0205379

Abstract

Amid growing rates of burnout, physicians report increasing electronic health record (EHR) usage alongside decreasing clinical face time with patients. There exists a pressing need to improve physician-computer-patient interactions by streamlining EHR workflow. To identify interventions to improve EHR design and usage, we systematically characterize EHR activity among internal medicine residents at a tertiary academic hospital across various inpatient rotations and roles from June 2013 to November 2016. Logged EHR timestamps were extracted from Stanford Hospital’s EHR system (Epic) and cross-referenced against resident rotation schedules. We tracked the quantity of EHR logs across 24-hour cycles to reveal daily usage patterns. In addition, we decomposed daily EHR time into time spent on specific EHR actions (e.g. chart review, note entry and review, results review). In examining 24-hour usage cycles from general medicine day and night team rotations, we identified a prominent trend in which night team activity promptly ceased at the shift’s end, while day team activity tended to linger post-shift. Across all rotations and roles, residents spent on average 5.38 hours (standard deviation = 2.07) per day using the EHR. PGY1 (post-graduate year one) interns and PGY2+ residents spent on average 2.4 and 4.1 times as many EHR hours on information review (chart, note, and results review) as on information entry (note and order entry), respectively. Analysis of EHR event log data can enable medical educators and programs to develop more targeted interventions to improve physician-computer-patient interactions, centered on specific EHR actions.

Partial Text

As medical educators, we hope our resident trainees value direct patient care and contact. Instead, we may progressively find their attention dominated by electronic health records (EHR) that mediate their work and proxy for their patients.[1] Observational studies confirm an increasing shift from direct patient care to computer use in the wake of duty hour restrictions,[2,3,4] informing ongoing reforms to the structure of medical training.[5] Physicians report increasing time spent on paperwork and the computer,[6] alongside less time available for clinical interactions with patients.[7] A growing burden of redundant clinical notes, alert fatigue, and an overflowing inbox has led to a systemic “4000 clicks a day” problem[8] that has contributed to physician job dissatisfaction and burnout.[6,9] There exists a pressing need to understand clinical EHR usage to inform opportunities to improve effective patient care processes. Previous studies have quantified EHR usage intensity through direct observation and self-reported diaries.[2, 10] Ironically, given the amount of time trainees spend on the EHR, the computer also provides precise, reproducible, and scalable quantification of their electronic activities. Event log data, for instance, can capture a provider’s digital timeline of actions by tracking his or her clicks within an EHR interface. Published comparisons between manual observation of clinician activity and automatic analysis of logged EHR timestamps confirm that the two approaches yield similar results for workflow analysis.[11, 12] Systematic evaluation of event log data can thus elucidate provider EHR usage patterns and inform the development of more targeted interventions to improve physician-computer-patient interactions.

Our objective was to use event log timestamps to systematically characterize the intensity of EHR usage across different inpatient medicine rotations and roles.

Logged timestamps from the Epic EHR system for inpatient rotations of internal medicine residents at an academic tertiary care hospital were extracted from the STRIDE project[13] from June 2013 to November 2016. We tracked the quantity of logged EHR actions per user over 24-hour cycles, binned into half-hour intervals. Logged EHR actions correspond to behaviors performed on the EHR as clinicians navigate components (e.g. notes, orders, results) of a patient’s electronic chart. Epic-coded EHR actions were binned into broad behavioral categories by a board-certified internal medicine physician. The most common action categories included chart review (provider review of patient medical history, diagnoses, symptoms, demographics, etc.), note review, results review (provider review of lab and imaging results), note entry, order entry, and navigator use. The EHR (Epic) navigator consists of pre-curated sequences of modules that facilitate common actions such as admission, rounding, and discharge. To estimate time spent on a given action, we considered time intervals between event logs separated by five minutes or less of inactivity. For example, if a resident accessed a chart review action at time A and a results review action at time B, and the interval between A and B was five minutes or less, the interval would be attributed to action A (chart review). If the interval exceeded five minutes, idleness was assumed. Sensitivity of results to integer idleness thresholds between 5 and 10 minutes was evaluated. Daily EHR usage was estimated as the sum of all active inter-access time intervals per day. User timestamps were cross-referenced against resident year and rotation schedules to account for each user’s progression through the internal medicine residency program over the three-year window. User-days with less than one hour of activity were excluded from analysis to account for remote access during vacation days. P-values were computed using two-sample t-tests allowing unequal variances (Welch’s t-test). Analyses were performed with Python 2.7 and R 2.13.
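To make the inter-access attribution rule concrete, the sketch below implements the five-minute idle threshold described above in Python (the language used for the analyses, though the study's actual code is not reproduced here). The event layout, function name, and sample timestamps are illustrative assumptions, not the study's data or implementation.

from datetime import datetime, timedelta

# Minimal sketch of the inter-access time attribution described above,
# assuming each event log is a (timestamp, action category) pair for one
# user-day. The five-minute idle threshold follows the text; everything
# else here is an illustrative assumption.

IDLE_THRESHOLD = timedelta(minutes=5)

def attribute_active_time(events):
    """Credit each short inter-event gap to the earlier event's action category."""
    events = sorted(events, key=lambda e: e[0])
    time_per_action = {}
    for (t_a, action_a), (t_b, _) in zip(events, events[1:]):
        gap = t_b - t_a
        if gap <= IDLE_THRESHOLD:  # longer gaps are treated as idleness
            time_per_action[action_a] = time_per_action.get(action_a, timedelta()) + gap
    return time_per_action

# Example: chart review at 07:00 followed by results review at 07:03 credits
# three minutes to chart review; the 17-minute gap before 07:20 counts as idle.
logs = [
    (datetime(2016, 1, 1, 7, 0), "chart review"),
    (datetime(2016, 1, 1, 7, 3), "results review"),
    (datetime(2016, 1, 1, 7, 20), "note entry"),
]
per_action = attribute_active_time(logs)
daily_usage = sum(per_action.values(), timedelta())  # estimated active EHR time for the day

In the same spirit, daily EHR usage is the sum of all attributed intervals for that user-day, and user-days whose total falls below one hour would then be excluded, per the rule above.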

During the three-year window, 15,909,629 unique actions were logged by 101 unique residents covering 99 PGY1 (post-graduate year one) intern-years and 61 PGY2+ resident-years. Fig 1 illustrates the intensity of EHR interactions per resident day during a 24-hour cycle for PGY1 interns versus PGY2+ residents across different inpatient rotations.

With increasing reliance on EHRs to mediate patient care, direct analysis of EHR audit logs provides a granular and objective way to characterize physician EHR usage. A key limitation of this approach is the estimation of idle time between access logs; overestimating idle times could overestimate EHR usage times. However, our sensitivity analysis showed that estimates were robust to integer idle-time cutoffs between 5 and 10 minutes. Additionally, although results are aggregated across multiple years, the reported EHR usage statistics are derived from a single academic center, and EHR interfaces are often modified according to the needs of each provider system. Nonetheless, we observe that key statistics, including mean daily EHR usage time, are similar to those found in prior studies conducted at other institutions.[10, 12]

The pattern of EHR activity over 24-hour cycles provides qualitative insights into resident behavior (Fig 1). The 2011 ACGME duty hour restrictions prompted the separation of the hospital’s General Medicine rotation into Day and Night Teams. These 24-hour cycles may suggest that night teams treat patient care as discrete shift work, with clinical activities promptly ceasing at 7 AM, as noted by the steep drop-off, while general medicine (day team) EHR activity often lingers well beyond duty hour recommendations (9 PM onwards) and restrictions (11 PM onwards), defined as ten and eight hours before the subsequent 7 AM shift.

Perhaps more concerning is the large burden of physician-computer interaction across all four rotations, a phenomenon that almost certainly restricts direct physician-patient interaction. Assuming a 12-hour work day,[3] PGY1 interns and PGY2+ residents spent nearly half of their work time on the EHR (46% and 43%, respectively). Sinsky et al. confirmed a similar statistic through manual observation of provider workflow, noting that 37–49% of attending work hours were dedicated to EHR or desk tasks.[10] This trend is especially notable considering the sharp peak in EHR activity upon day team arrival (Fig 1), suggesting that the traditional model of pre-rounding at the patient bedside has been replaced by the workroom computer as the trusted source for patient information.[1] Nonetheless, time spent on EHR tasks remains highly individual among providers, as we observe substantial variability in mean daily EHR usage, with standard deviations ranging between 1.56 and 2.58 hours across the four medicine rotations.

Comparing between roles (Table 1), we see that PGY2+ residents execute a greater number of EHR actions, with disproportionately more time spent on note review, yet less daily EHR time, compared to PGY1 interns. Indeed, in the clinic, PGY2+ residents embrace a supervisory role and often oversee twice the number of patients (note the nearly 2:1 median patient ratio during general medicine rotations) compared to PGY1 interns. Conversely, PGY1 interns are delegated time-intensive note entry duties, albeit for a smaller population of patients, potentially accounting for their greater daily EHR time. Across diverse rotations, we see significant time spent on the same pattern of EHR activities: PGY1 interns and PGY2+ residents spent on average 2.4 and 4.1 times as many EHR hours on information review (chart, note, and results review) as on information entry (note and order entry), respectively (Table 1).
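As a rough check on the workday fraction cited above, dividing the overall mean of 5.38 daily EHR hours (reported in the abstract) by the assumed 12-hour work day gives roughly 45%, bracketed by the per-role figures of 46% and 43%. The one-line sketch below uses only values stated in the text, with the 12-hour day being the stated assumption.

mean_ehr_hours = 5.38   # overall mean daily EHR time reported above
workday_hours = 12.0    # assumed 12-hour work day
print(round(100 * mean_ehr_hours / workday_hours, 1))  # 44.8, i.e. roughly 45% of the work day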
Improvements in EHR design could target the burden of data retrieval by encouraging more concise documentation and by redesigning or automating content organization. By identifying specific EHR activities that consistently dominate resident computer usage across multiple inpatient rotations and roles, we hope to facilitate a more targeted, data-driven approach to improving physician-computer-patient interactions.

 

Source:

http://doi.org/10.1371/journal.pone.0205379

 
