Volume 10, Number 1—January 2004
Research

Evaluating Detection and Diagnostic Decision Support Systems for Bioterrorism Response

Dena M. Bravata*†, Vandana Sundaram*†‡, Kathryn M. McDonald*†, Wendy M. Smith*†, Herbert Szeto*†§, Mark D. Schleinitz¶#, and Douglas K. Owens*†‡
Author affiliations: *University of California San Francisco-Stanford Evidence-based Practice Center, Stanford, California, USA; †Stanford University School of Medicine, Stanford, California, USA; ‡VA Palo Alto Healthcare System, Palo Alto, California, USA; §Kaiser Permanente, Redwood City, California, USA; ¶Rhode Island Hospital, Providence, Rhode Island, USA; #Brown University School of Medicine, Providence, Rhode Island, USA


Table 3

Evaluation data for diagnostic decision support systems for bioterrorism-related illness (a)

System name | Purpose | Evaluation data (b)
Clinical decision support system for detection and respiratory isolation of tuberculosis patients (21) | To automate the detection and respiratory isolation of patients with positive cultures and chest x-rays suspicious for TB. | In a retrospective analysis, the system increased the proportion of appropriate TB isolations among inpatients from 51% to 75% but falsely recommended isolation for 27 of 171 patients. In a prospective analysis, the system correctly identified 30 of 43 patients with TB but did not identify the remaining 13 (false-negatives). However, the decision support system identified 4 patients not identified by the clinicians (21).
Columbia–Presbyterian Medical Center Natural Language Processor (22) | To automate the identification of 6 pulmonary diseases (including pneumonia) through analysis of radiology reports. | The system had a sensitivity of 81% (95% confidence interval [CI] 73% to 87%) and a specificity of 98% (95% CI 97% to 99%), compared with physicians, who had an average sensitivity of 85% and specificity of 98% (22).
Computer Program for Diagnosing and Teaching Geographic Medicine (23) | To provide a differential diagnosis of infectious diseases matched to 22 clinical parameters for a patient; also to provide general information about infectious diseases, anti-infective agents, and vaccines. | The program correctly identified 75% (222 of 295) of actual cases and 64% (128 of 200) of hypothetical cases of patients with infectious diseases (23). The clinical diagnosis was included in the computer-generated differential diagnosis list in 94.7% of cases. Several of the cases included in this evaluation involved diseases associated with bioterrorism (23).
DERMIS (24,25) | To provide a differential diagnosis of skin lesions. | The system correctly diagnosed lesions 51% to 80% of the time and included the correct diagnosis among its top 3 choices 70% to 95% of the time (5,203 cases in total) (24,25). The system was more accurate for dermatologist users than for general practitioners.
DXplain (26) | To provide a differential diagnosis based on clinician-entered signs and symptoms. The system includes descriptions and findings for potential bioterrorism agents and is updated weekly to account for potential outbreaks. | In an evaluation of 103 consecutive internal medicine cases, DXplain correctly identified the diagnosis in 73% of cases, with an average rank of 10.7 (a diagnosis's rank is its position on the differential diagnosis list: the most likely diagnosis is ranked first, the next most likely second, and so on; a sketch of the rank calculation follows the table footnotes) (26).
Fuzzy logic program to predict source of bacterial infection (27) | To use age, blood type, gender, and race to predict the cause of bacterial infections. | The program correctly classified 27 of 32 patients into 1 of 4 groups on the basis of demographic data alone (27).
Global Infectious Disease and Epidemiology Network (GIDEON) (28) | To provide differential diagnoses for patients with diseases of infectious etiology. All potential bioterrorism agents as specified by CDC are included in the GIDEON knowledge base (28). | Whereas medical house officers listed the correct diagnosis first in their admission note 87% of the time (for 75 of 86 patients), GIDEON provided the correct diagnosis for 33% (28 of 86 patients) (28).
Iliad (and Medical HouseCall, a system for consumers derived from Iliad) (29–31) | To provide a differential diagnosis based on clinician-entered signs and symptoms. The knowledge base focuses on internal medicine and was last updated in 1997. | In a multicenter evaluation, each of 33 users analyzed 9 diagnostically difficult cases. On average, Iliad included the correct diagnosis in its list of possible diagnoses for 4 of the 9 cases and within its top 6 diagnoses for 2 of the 9 cases. The differential diagnosis generated by Iliad does not depend on the user's level of training (29–31).
Neural Network for Diagnosing Tuberculosis (32) | To predict active pulmonary TB (using clinical and radiographic information) so that patients may be appropriately isolated at the time of admission. | The neural network correctly identified 11 of 11 patients with active TB (100% sensitivity, 69% specificity), compared with clinicians, who correctly diagnosed 7 of 11 patients (64% sensitivity, 79% specificity) (32).
PNEUMON-IA (33) | To diagnose community-acquired pneumonia from clinical, radiologic, and laboratory data. | The decision support system correctly identified pneumonia in 4 of 10 cases, compared with 3 to 6 cases for the clinician experts (33).
Quick Medical Reference (QMR) (34) | To provide a differential diagnosis based on clinician-entered signs and symptoms. | One prospective study used QMR to assist in the management of 31 patients for whom the anticipated diagnoses were known to exist in the QMR knowledge base. In the 20 cases for which a diagnosis was ultimately made, QMR included the correct diagnosis in its differential in 17 cases (85%) and listed the correct diagnosis as most likely in 12 cases (60%) (34).
SymText (35,36) | To analyze radiology reports for specific clinical concepts, such as identifying patients with pneumonia. | Average sensitivity and specificity for assessing the location and extent of pneumonia were 94% and 96% for physicians and 34% and 95% for SymText. In selecting patients eligible for the pneumonia guideline, the areas under the ROC curves were 89.7% for SymText and 93.3% for physicians (35,36). (An illustrative AUC calculation appears just before the reference list.)
Texas Infectious Disease Diagnostic Decision Support System (37) | To provide a weighted differential diagnosis based on manually entered patient information. | The system was compared with a reference standard in which the diagnosis had been missed in 98 of 342 cases of brucellosis. For 86 of these 98 patients, the system listed brucellosis among the top 5 diagnoses on the differential diagnosis list, and for 69 of the 98, brucellosis was the only disease the system suggested. The system missed the diagnosis in 12 of the 98 patients. On average, the correct diagnosis was suspected after 17.9 days without the system versus 4.5 days with it (37).
University of Chicago – Artificial Neural Network for Interstitial Lung Disease (38) | To help radiologists differentiate among 11 interstitial lung diseases by using clinical parameters and radiographic findings to develop a differential diagnosis. | Areas under the ROC curve obtained with and without the system output were 0.911 and 0.826, respectively (p < 0.0001) (38).
University of Chicago – Computer Aided Diagnosis of Interstitial Lung Disease (39) | To aid in the detection of interstitial lung disease in digitized chest radiographs. | Areas under the ROC curve obtained with and without computer-aided diagnostic output were 0.970 and 0.948, respectively (p = 0.0002) (39).

(a) TB, tuberculosis; CDC, Centers for Disease Control and Prevention; ROC, receiver operating characteristic.
(b) Where possible, we report sensitivity and specificity data (and highlight them in bold); if the published reports did not provide these values directly but did provide sufficient data for them to be calculated, we performed these calculations.
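
Footnote b describes deriving sensitivity and specificity from raw counts when a report did not state them directly. The minimal Python sketch below illustrates that arithmetic; the counts are invented for illustration (they come from no study in the table), and the Wilson score interval is one common way to obtain a 95% CI of the kind reported for the Columbia–Presbyterian natural language processor, though the cited studies may have used other methods.

    import math

    def sens_spec(tp, fn, fp, tn):
        """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
        return tp / (tp + fn), tn / (tn + fp)

    def wilson_ci(successes, n, z=1.96):
        """95% Wilson score interval for a binomial proportion."""
        p = successes / n
        denom = 1 + z**2 / n
        center = (p + z**2 / (2 * n)) / denom
        half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
        return center - half, center + half

    # Invented counts for illustration only; not taken from any cited study.
    tp, fn, fp, tn = 81, 19, 2, 98
    sens, spec = sens_spec(tp, fn, fp, tn)
    lo, hi = wilson_ci(tp, tp + fn)
    print(f"Sensitivity {sens:.0%} (95% CI {lo:.0%} to {hi:.0%}); specificity {spec:.0%}")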

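Several entries above report where the correct diagnosis fell on a system's ranked differential (e.g., DXplain's average rank of 10.7). A minimal sketch of the rank calculation, assuming a hypothetical scored differential (the diseases and scores below are invented, not output from any cited system):

    # Hypothetical differential; a higher score means judged more likely.
    differential = {
        "influenza": 0.42,
        "inhalational anthrax": 0.31,
        "community-acquired pneumonia": 0.18,
        "tularemia": 0.09,
    }

    def rank_of(diagnosis, scores):
        """1-based position of a diagnosis when candidates are sorted by score."""
        ordered = sorted(scores, key=scores.get, reverse=True)
        return ordered.index(diagnosis) + 1

    print(rank_of("inhalational anthrax", differential))  # rank 2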

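Three of the evaluations above report areas under the ROC curve (footnote a). The AUC can be read as the probability that a randomly chosen case with disease is scored higher than a randomly chosen case without it; the sketch below computes it by that identity, on invented scores (nothing here reproduces the cited systems' outputs):

    def auc(scores, labels):
        """Area under the ROC curve via the Mann-Whitney identity,
        counting tied scores as one-half."""
        pos = [s for s, y in zip(scores, labels) if y == 1]
        neg = [s for s, y in zip(scores, labels) if y == 0]
        wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
        return wins / (len(pos) * len(neg))

    # Invented classifier outputs; 1 = disease present, 0 = absent.
    scores = [0.95, 0.80, 0.75, 0.60, 0.40, 0.30, 0.20, 0.10]
    labels = [1, 1, 0, 1, 0, 1, 0, 0]
    print(f"AUC = {auc(scores, labels):.3f}")  # 0.812
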
References
  1. Hughes JM, Gerberding JL. Anthrax bioterrorism: lessons learned and future directions. Emerg Infect Dis. 2002;8:1013–4.
  2. Heller MB, Bunning ML, France ME, Niemeyer DM, Peruski L, Naimi T, et al. Laboratory response to anthrax bioterrorism, New York City, 2001. Emerg Infect Dis. 2002;8:1096–102.
  3. McCullough M. Anthrax hoaxes, false alarms taxing authorities nationwide. The Seattle Times. November 10, 2001; Nation & World.
  4. Perkins BA, Popovic T, Yeskey K. Public health in the time of bioterrorism. Emerg Infect Dis. 2002;8:1015–8.
  5. Bravata DM, McDonald K, Owens DK, Smith W, Rydzak C, Szeto H, et al. Bioterrorism preparedness and response: use of information technologies and decision support systems (Evidence Report/Technology Assessment No. 59). Rockville (MD): prepared by the UCSF-Stanford Evidence-based Practice Center under Contract No. 290-97-0013 for the Agency for Healthcare Research and Quality; 2002.
  6. F.Y. 2002-F.Y. 2006 plan for combating bioterrorism. Washington: U.S. Department of Health and Human Services; 2001.
  7. Henahan S. Anthrax sensor. Access Excellence.com. [Accessed September 28, 2001]. Available from: URL: http://www.accessexcellence.com/WN/SUA12/anthrax298.html
  8. MesoSystems Products. MesoSystems Technology Inc. [Accessed October 29, 2001]. Available from: URL: http://www.mesosystems.com
  9. Holmberg M, Gustafsson F, Hornsten EG, Winquist F, Nilsson LE, Ljung L, et al. Bacteria classification based on feature extraction from sensor data. Biotechnol Tech. 1998;12:319–24.
  10. Rowe CA, Scruggs SB, Feldstein MJ, Golden JP, Ligler FS. An array immunosensor for simultaneous detection of clinical analytes. Anal Chem. 1999;71:433–9.
  11. Idaho Technologies products. Idaho Technologies. [Accessed October 29, 2001]. Available from: URL: http://www.idahotech.com
  12. Milanovich F. Reducing the threat of biological weapons. Lawrence Livermore National Laboratory, Science and Technology Review. [Accessed September 7, 2001]. Available from: URL: http://www.llnl.gov/str/Milan.html
  13. Commission on Life Sciences, National Research Council. Chemical and biological terrorism: research and development to improve civilian medical response. Washington: National Academy Press; 1999.
  14. Biological detection system technologies: technology and industrial base study: a primer on biological detection technologies. North American Technology and Industrial Base Organization; 2001.
  15. Rostker B. Close-out report: biological warfare investigation. Washington: Department of Defense; 2000.
  16. Von Bredow J, Myers M, Wagner D, Valdes J, Loomis L, Zamani K. Agroterrorism: agricultural infrastructure vulnerability. Ann N Y Acad Sci. 1999;894:168–80.
  17. New Horizons Diagnostics Corporation. New Horizons Diagnostics Corp. [Accessed August 22, 2001]. Available from: URL: http://www.nhdiag.com/
  18. SMART Ticket (biological agents). American School of Defense. [Accessed October 24, 2001]. Available from: URL: http://www.asod.org/id10.htm
  19. Centers for Disease Control and Prevention. Handheld immunoassays for detection of Bacillus anthracis spores. [Accessed October 25, 2001]. Available from: URL: http://www.bt.cdc.gov/DocumentsApp/Anthrax/10182001HealthAlertPM/10182001HealthAlertPM.asp
  20. General Services Administration. GSA Policy Advisory: Guidelines for federal mail centers in the Washington, DC, Metropolitan Area for managing possible anthrax contamination. [Accessed March 26, 2003]. Available from: URL: http://www.ostp.gov/html/GSAAnthraxGuidelines.html
  21. Knirsch CA, Jain NL, Pablos-Mendez A, Friedman C, Hripcsak G. Respiratory isolation of tuberculosis patients using clinical guidelines and an automated clinical decision support system. Infect Control Hosp Epidemiol. 1998;19:94–100.
  22. Hripcsak G, Friedman C, Alderson PO, DuMouchel W, Johnson SB, Clayton PD. Unlocking clinical data from narrative reports: a study of natural language processing. Ann Intern Med. 1995;122:681–8.
  23. Berger SA, Blackman U. Computer program for diagnosing and teaching geographic medicine. J Travel Med. 1995;2:199–203.
  24. Brooks GJ, Ashton RE, Pethybridge RJ. DERMIS: a computer system for assisting primary-care physicians with dermatological diagnosis. Br J Dermatol. 1992;127:614–9.
  25. Smith HR, Ashton RE, Brooks GJ. Initial use of a computer system for assisting dermatological diagnosis in general practice. Med Inform Internet Med. 2000;25:103–8.
  26. Hammersley JR, Cooney K. Evaluating the utility of available differential diagnosis systems. Proc Annu Symp Comput Appl Med Care 1988:229–31.
  27. Cundell DR, Silibovsky RS, Sanders R, Sztandera LM. Using fuzzy sets to analyze putative correlates between age, blood type, gender and/or race with bacterial infection. Artif Intell Med. 2001;21:235–9.
  28. Ross JJ, Shapiro DS. Evaluation of the computer program GIDEON (Global Infectious Disease and Epidemiology Network) for the diagnosis of fever in patients admitted to a medical service. Clin Infect Dis. 1998;26:766–7.
  29. Murphy GC, Friedman CP, Elstein AS, Wolf FM, Miller T, Miller JG. The influence of a decision support system on the differential diagnosis of medical practitioners at three levels of training. Proc AMIA Annu Fall Symp 1996:219–23.
  30. Berner ES, Webster GD, Shugerman AA, Jackson JR, Algina J, Baker AL, et al. Performance of four computer-based diagnostic systems. N Engl J Med. 1994;330:1792–6.
  31. Bouhaddou O, Lambert JG, Miller S. Consumer health informatics: knowledge engineering and evaluation studies of medical HouseCall. Proc AMIA Symp 1998:612–6.
  32. El-Solh AA, Hsiao CB, Goodnough S, Serghani J, Grant BJ. Predicting active pulmonary tuberculosis using an artificial neural network. Chest. 1999;116:968–73.
  33. Verdaguer A, Patak A, Sancho JJ, Sierra C, Sanz F. Validation of the medical expert system PNEUMON-IA. Comput Biomed Res. 1992;25:511–26.
  34. Bankowitz RA, McNeil MA, Challinor SM, Parker RC, Kapoor WN, Miller RA. A computer-assisted medical diagnostic consultation service. Implementation and prospective evaluation of a prototype. Ann Intern Med. 1989;110:824–32.
  35. Fiszman M, Chapman WW, Evans SR, Haug PJ. Automatic identification of pneumonia related concepts on chest x-ray reports. Proc AMIA Symp 1999:67–71.
  36. Chapman WW, Haug PJ. Comparing expert systems for identifying chest x-ray reports that support pneumonia. Proc AMIA Symp 1999:216–20.
  37. Carter CN, Ronald NC, Steele JH, Young E, Taylor JP, Russell LH, et al. Knowledge-based patient screening for rare and emerging infectious/parasitic diseases: a case study of brucellosis and murine typhus. Emerg Infect Dis. 1997;3:73–6.
  38. Ashizawa K, MacMahon H, Ishida T, Nakamura K, Vyborny CJ, Katsuragawa S, et al. Effect of an artificial neural network on radiologists' performance in the differential diagnosis of interstitial lung disease using chest radiographs. AJR Am J Roentgenol. 1999;172:1311–5.
  39. Monnier-Cholley L, MacMahon H, Katsuragawa S, Morishita J, Ishida T, Doi K. Computer-aided diagnosis for detection of interstitial opacities on chest radiographs. AJR Am J Roentgenol. 1998;171:1651–6.
  40. Sox HC, Blatt MA, Higgins MC, Marton KI. Medical decision making. Boston: Butterworth-Heinemann; 1988.

