Automated surveillance using electronically available data has been found to be accurate and to save time. An automated CDI surveillance algorithm was validated at four CDC Prevention Epicenters hospitals. Electronic surveillance was highly sensitive and specific, and showed good to excellent agreement for hospital-onset; community-onset, study facility associated; indeterminate; and recurrent CDI.
It is recommended that all US hospitals track CDI.
The study population included all adult patients ≥ 18 years of age admitted to four US hospitals participating in the CDC Epicenters Program from July 1, 2005 to June 30, 2006. These hospitals included Barnes-Jewish Hospital (St. Louis, MO), Brigham and Women’s Hospital (Boston, MA), The Ohio State University Medical Center (Columbus, OH), and University Hospital (Salt Lake City, UT).
A conceptual automated CDI surveillance algorithm was created based on recommended surveillance definitions.
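As a rough illustration, the conceptual categorization logic could be sketched as below. The specific cut-offs used here (a 3-day hospital-onset threshold, 4- and 12-week exposure windows, and an 8-week recurrence window) reflect commonly recommended CDI surveillance definitions and are assumptions for illustration, not values quoted from this study; classifying community-onset cases as associated with *other* healthcare facilities would additionally require admission-source or discharge-status data, which this sketch omits.

```python
from datetime import date, timedelta
from typing import Optional

def categorize_cdi(admit: date,
                   specimen: date,
                   last_discharge: Optional[date],
                   prior_positive: Optional[date]) -> str:
    """Classify one positive C. difficile toxin result by onset/association.

    admit          -- admission date of the current encounter
    specimen       -- collection date of the positive stool specimen
    last_discharge -- most recent prior discharge from this facility, if any
    prior_positive -- date of the patient's most recent prior positive, if any
    """
    # Recurrent: a new positive 2-8 weeks after a prior positive
    if prior_positive is not None and \
            timedelta(days=14) <= specimen - prior_positive <= timedelta(weeks=8):
        return "recurrent"
    # Hospital-onset: specimen collected more than 3 days after admission
    if (specimen - admit).days > 3:
        return "hospital-onset"
    # Community-onset: attribute by time since last discharge from this facility
    if last_discharge is not None:
        weeks_since_discharge = (specimen - last_discharge).days / 7
        if weeks_since_discharge <= 4:
            return "community-onset, study facility associated"
        if weeks_since_discharge <= 12:
            return "indeterminate"
    return "community-onset, community-associated"
```

In practice each facility would feed this from its admission-discharge-transfer and laboratory feeds, which is where the facility-specific adaptations described below come in.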
There were 1767 patients with stool positive for *C. difficile* toxin.
The overall sensitivities, specificities, and kappa values of the algorithm by CDI onset compared to the gold standard were as follows: hospital-onset: 92%, 99%, and 0.90; community-onset, study facility associated: 91%, 98%, and 0.84; community-onset, other healthcare facility associated: 57%, 99%, and 0.65; community-onset, community-associated: 96%, 94%, and 0.69; indeterminate cases: 80%, 98%, and 0.76; and recurrent cases: 94%, 99%, and 0.94.
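For reference, each per-category performance measure can be computed from a 2×2 table of algorithm versus gold-standard classifications. The sketch below shows the standard formulas with illustrative counts, not the study's actual data.

```python
def sens_spec_kappa(tp: int, fp: int, fn: int, tn: int):
    """Sensitivity, specificity, and Cohen's kappa from a 2x2 table
    comparing the algorithm (rows) against gold-standard manual review
    (columns) for a single CDI category."""
    n = tp + fp + fn + tn
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    p_observed = (tp + tn) / n
    # Chance agreement: expected agreement if the two raters were independent
    p_expected = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
    kappa = (p_observed - p_expected) / (1 - p_expected)
    return sensitivity, specificity, kappa

# Illustrative counts only: 90 true positives, 10 false positives,
# 10 false negatives, 990 true negatives
sens, spec, kappa = sens_spec_kappa(tp=90, fp=10, fn=10, tn=990)
```

With these made-up counts the function returns a sensitivity of 0.90, a specificity of 0.99, and a kappa of 0.89, the same style of triplet reported per category in the table below.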
Each hospital had to individualize the algorithm to its facility. Hospitals A, B, and C did not have discrete data on where a patient was admitted from (e.g., admitted from home or a long-term care facility), whereas hospital D did. At these three hospitals, categorization of community-onset cases therefore depended on the discharge status (e.g., discharged to home or a long-term care facility) of any hospitalization in the previous 12 weeks. Hospital A has a code for patients with frequent outpatient visits, called "recurring patients," with a start date of the first visit and an end date of December 31; many "recurring patients" with CDI were misclassified as hospital-onset CDI. To correct this problem, the medical informatics team created a new table within the database containing the visit type associated with a given encounter. Hospital B made minor modifications to the hospital-onset time cut-off to improve accuracy. Hospital C was not able to modify its algorithm because some data were available only in free-text fields. Hospital D initially included only patients admitted to one particular building, missing patients admitted to the other three buildings of its medical center; this was corrected. Three other issues were identified and resolved after the initial review of discordant cases: outpatient encounters were included when determining case categorization rather than only inpatient encounters; only the first positive
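The discharge-status workaround described above, used where no admission-source field existed, might look like the following sketch. The field names, disposition codes, and the simple rule that a non-home discharge implies association with another healthcare facility are illustrative assumptions, not the hospitals' actual implementation.

```python
from datetime import date, timedelta
from typing import List, Optional, Tuple

def community_onset_association(specimen: date,
                                prior_stays: List[Tuple[date, str]]) -> str:
    """Attribute a community-onset CDI case using the discharge disposition
    of the most recent inpatient stay in the prior 12 weeks.

    prior_stays -- (discharge_date, disposition) pairs, where disposition is
                   an illustrative code such as "home" or "ltcf"
    """
    window_start = specimen - timedelta(weeks=12)
    recent = [s for s in prior_stays if window_start <= s[0] <= specimen]
    if not recent:
        # No recent hospitalization at this facility on record
        return "community-associated"
    discharge_date, disposition = max(recent)  # most recent prior stay
    if disposition != "home":
        # e.g. discharged to a long-term care facility: exposure likely
        # occurred at another healthcare facility
        return "other healthcare facility associated"
    return "study facility associated"
```

Hospital D's discrete "admitted from" variable removes the need for this proxy, which is consistent with its stronger performance on the other-healthcare-facility category in the table below.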
The goal of this study was to develop and validate an automated CDI surveillance algorithm using existing electronically available data. Previous research indicates that electronic surveillance is more accurate and reliable than manual surveillance.
Each hospital worked with its information technology team to apply the general automated CDI surveillance algorithm to the data available at its facility. In this study, the availability and type of data varied from hospital to hospital, which affected the accuracy of the automated algorithm. This issue is illustrated by Hospital D, which performed best at categorizing community-onset CDI because a discrete variable captured where patients were admitted from.
There are potential limitations to the use of an automated CDI surveillance algorithm. Electronic surveillance requires access to an electronic health record (EHR) system, and only about 12% of US hospitals have one.
Another limitation of using an automated CDI surveillance algorithm is that chart review is not performed. While the lack of chart review is mitigated by restricting toxin testing to diarrheal stool, misclassification is still possible. A true community-onset CDI case could be misclassified as hospital-onset if stool was collected after the hospital-onset cut-off date. In addition, patients with a positive assay for *C. difficile* toxin may be colonized rather than truly infected.
This study found automated electronic CDI surveillance to be highly sensitive and specific for identifying cases of hospital-onset; community-onset, study center-associated; and recurrent CDI. Automated CDI surveillance will allow infection preventionists to devote more time to infection prevention efforts. In addition, automated CDI surveillance may facilitate a healthcare facility's ability to track community-onset CDI. Community-onset CDI likely contributes to hospital-onset CDI because patients admitted to a healthcare facility with CDI are a source of *C. difficile* transmission.
This work was supported by grants from the Centers for Disease Control and Prevention (UR8/CCU715087-06/1 and 5U01C1000333 to Washington University, 5U01CI000344 to Eastern Massachusetts, 5U01CI000328 to The Ohio State University, and 5U01CI000334 to University of Utah) and the National Institutes of Health (K23AI065806, K24AI06779401, K01AI065808 to Washington University).
ERD: research: Optimer, Merck; consulting: Optimer, Merck, Sanofi-Pasteur, and Pfizer
HAN, DSY, JM, KBS, JEM, YMK, VJF: no disclosures
Findings and conclusions in this report are those of the authors and do not necessarily represent the official position of the Centers for Disease Control and Prevention. Preliminary data were presented in part at the 21st Annual Society of Healthcare Epidemiology of America, Dallas, TX (Apr 1 – 4, 2011), abstract number 157.
Conceptual automated CDI surveillance algorithm.
Sensitivities, specificities, and kappa values by CDI onset and facility
| Case definition | A | B | C | D | Total |
|---|---|---|---|---|---|
| Healthcare facility-onset | 99, 98 (0.97) | 75, 99 (0.66) | 94, 100 (0.93) | 100, 99 (0.99) | 92, 99 (0.90) |
| Community-onset, study center-associated | 93, 96 (0.83) | 100, 97 (0.86) | 84, 99 (0.83) | 81, 100 (0.88) | 91, 98 (0.84) |
| Community-onset, other healthcare facility associated | 16, 99 (0.25) | 82, 98 (0.61) | 53, 98 (0.59) | 96, 98 (0.93) | 57, 99 (0.65) |
| Community-onset, community-associated | 91, 95 (0.71) | 100, 87 (0.63) | 100, 92 (0.44) | 100, 99 (0.91) | 96, 94 (0.69) |
| Indeterminate | 83, 98 (0.80) | 73, 98 (0.63) | 63, 97 (0.48) | 84, 99 (0.88) | 80, 98 (0.76) |
| Recurrent | 99, 99 (0.97) | 88, 99 (0.85) | 64, 100 (0.77) | 97, 100 (0.98) | 94, 99 (0.94) |

Each cell shows sensitivity (%), specificity (%), and (kappa) for facilities A–D and overall.