Volume 7, Number 5—October 2001
Perspective

Implementing a Network for Electronic Surveillance Reporting from Public Health Reference Laboratories: An International Perspective

Author affiliations: Centers for Disease Control and Prevention, Atlanta, Georgia, USA

Abstract

Electronic data reporting from public health laboratories to a central site provides a mechanism for public health officials to rapidly identify problems and take action to prevent further spread of disease. However, implementation of reference laboratory systems is much more complex than simply adopting new technology, especially in international settings. We describe three major areas to be considered by international organizations for successful implementation of electronic reporting systems from public health reference laboratories: benefits of electronic reporting, planning for system implementation (e.g., support, resources, data analysis, country sovereignty), and components of system initiation (e.g., authority, disease definition, feedback, site selection, assessing readiness, problem resolution). Our experience with implementation of electronic public health laboratory data management and reporting systems in the United States and working with international organizations to initiate similar efforts demonstrates that successful reference laboratory reporting can be implemented if surveillance issues and components are planned.

The environment for infectious disease surveillance systems is rapidly changing, and the need to obtain current information about diseases in a specific population is increasing. Disease surveillance in many countries is often fragmented or out-of-date, slow, nonstandardized, inflexible, and not well integrated with respect to both laboratory and epidemiology functions. Emerging diseases (1,2), changes in antibiotic resistance (3), and threats of terrorism with biologic agents (4) have heightened the awareness of surveillance needs worldwide. In this environment, public health officials in many countries desire to improve their surveillance for infectious diseases and look to electronic reporting systems to address deficiencies. Indeed, electronic data reporting from public health laboratories to a national-level surveillance database provides a means for public health officials to identify problems rapidly and take action to prevent further spread of disease (5). However, implementing an electronic surveillance system is considerably more complex than simply applying new technology. Issues such as personnel, funding, politics, and public health policies also must be taken into account.

Well before adopting an electronic surveillance system, those involved in planning it may want to address issues regarding public health systems in general and how these apply specifically to their own country or laboratory. For example, laboratory methods for characterizing and subtyping agents are rapidly changing; new technologies for surveillance can make even recent developments appear obsolete; personnel, public health policies, and politics change, to name a few factors. With these considerations in mind, planners will want to consider flexible laboratory surveillance systems. Planners also need to determine exactly what the surveillance system is to accomplish, whether it is to detect outbreaks, analyze trends, or generate hypotheses (6-10).

Benefits of Electronic Reporting for Surveillance

Electronic surveillance systems meet three broad surveillance objectives: they generate hypotheses, monitor trends, and detect clusters and outbreaks (11). Electronic data transmission enables these objectives to be met very rapidly and often more accurately than with other reporting systems, thus extending the benefits to actually controlling the spread of illness. While the information needs for tomorrow and capacities to meet them can change overnight, the underlying surveillance principles and objectives are constant.

Generating Hypotheses

One role of surveillance is to provide hypothesis-generating data (e.g., demographic characteristics of patients, risk factors for illness, or antimicrobial resistance patterns of infecting organisms). Surveillance databases should not be expected to provide answers to all questions about a particular disease or topic but to comprise a minimum data set to suggest hypotheses about events under surveillance.

Monitoring Trends

Surveillance systems that collect consistent data over extended periods can provide valuable information about spatial, temporal, and demographic changes in disease incidence. For example, the emergence or reemergence of pathogens, changes in antimicrobial resistance, or changes in target populations can be detected rapidly by examining electronic surveillance data. Information on trends or patterns provides a reliable basis for decision making about preventing and controlling disease.

Detecting Clusters or Outbreaks

Before the initiation of electronic reporting, surveillance data may have signaled only that a cluster had occurred; this signal was often of little value in outbreak control since the outbreak may have been over before it was recognized. Detecting clusters often depended on alert laboratorians or epidemiologists recognizing increases in disease occurrence on the basis of their increased workload or their memory rather than actual data. With electronic reporting, data can be transmitted so rapidly that an outbreak can be detected and investigated while it is ongoing and interventions can be implemented. Statistical evaluations of surveillance data reported electronically can be more timely and accurate, and they will have greater value in detecting and curtailing outbreaks early (5,10,12-16).
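
To make the idea concrete, the following is a minimal sketch, not the published detection algorithm cited above, and with hypothetical field names, of the kind of simple statistical screen a central receiving site might run on electronically reported weekly counts: a serotype is flagged when its current count exceeds its historical mean by more than two standard deviations.

```python
from statistics import mean, stdev

def flag_possible_clusters(current_counts, history, min_baseline_weeks=4):
    """Flag serotypes whose current weekly count exceeds the historical
    mean by more than two standard deviations.

    current_counts: dict mapping serotype -> count for the current week
    history: dict mapping serotype -> list of weekly counts from comparable
             past weeks (e.g., the same weeks in previous years)
    """
    flagged = []
    for serotype, count in current_counts.items():
        past = history.get(serotype, [])
        if len(past) < min_baseline_weeks:
            continue  # too little history to set a baseline
        baseline = mean(past)
        spread = stdev(past)
        if count > baseline + 2 * spread:
            flagged.append((serotype, count, round(baseline, 1)))
    return flagged

# Hypothetical example: weekly serotype counts received at a central site
history = {"Enteritidis": [12, 9, 14, 11, 10], "Newport": [3, 2, 4, 3, 2]}
current = {"Enteritidis": 13, "Newport": 11}
print(flag_possible_clusters(current, history))
# -> [('Newport', 11, 2.8)]  Newport is well above its historical baseline
```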

Planning

Planning might begin with general considerations on how to address surveillance. Even at this stage, perceptions and opinions of planners are likely to differ. The difficulty of arriving at consensus on topics such as the appropriate flow of data can be demonstrated by the discussions surrounding the National Salmonella Surveillance System (17), which have been ongoing since the system converted to an electronic format in 1990. Although it may not be possible to bring closure to all general surveillance questions (6,18), several need to be asked early in planning: What is appropriate and necessary to report? Should international (regional) centers be established for receiving and distributing reports? How can regional centers obtain participation of member countries and ensure appropriate levels of discretion regarding potentially damaging information traveling along the network?

Planners should also research what will be required to support an electronic surveillance reporting system in their country. The suggestion that a country build a network for rapid reporting, analysis, and communications is usually received with enthusiasm (19). Later, when the resources (including personnel) required become apparent, enthusiasm can wane. In every country, situations inevitably exist that could undermine the ability to implement and sustain a system. Obtaining software and beginning to report without first planning and establishing the support base for the system spells failure (6). The following issues should be considered by all involved in planning electronic systems.

Availability of Data at All Levels

The purpose of laboratory disease surveillance is to collect appropriate data for each site, analyze these data, and act on the information in conjunction with other public health groups. Although summary data are occasionally collected, they allow for only limited analyses, and sites receiving only summary numbers have little, if any, way to evaluate the data quality (aggregated data are less amenable to editing at sites beyond the reporting site). In contrast, data pertaining to individual disease events provide a basis for multiple analyses, and receiving sites can develop edit routines to detect errors in data reported. Thus, data should be collected at a level of specificity that will provide decision makers the information they need to take action.
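
As a minimal sketch of the kind of edit routine a receiving site might apply to individual records (the field names and valid codes below are hypothetical), each incoming isolate report is checked for required fields, recognized codes, and plausible dates before it enters the surveillance database.

```python
from datetime import date

REQUIRED_FIELDS = {"report_id", "site", "organism", "specimen_date"}
KNOWN_ORGANISM_CODES = {"SAL", "SHIG", "ECOLI_O157"}  # hypothetical code list

def edit_check(record):
    """Return a list of problems found in one isolate record (empty if clean)."""
    problems = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        problems.append(f"missing fields: {sorted(missing)}")
    if record.get("organism") not in KNOWN_ORGANISM_CODES:
        problems.append(f"unrecognized organism code: {record.get('organism')}")
    specimen_date = record.get("specimen_date")
    if isinstance(specimen_date, date) and specimen_date > date.today():
        problems.append("specimen date is in the future")
    return problems

# Hypothetical incoming record with a typo in the organism code
record = {"report_id": "A-1", "site": "Lab-07", "organism": "SAML",
          "specimen_date": date(2001, 5, 14)}
print(edit_check(record))  # -> ["unrecognized organism code: SAML"]
```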

Analysis of laboratory data should be encouraged at all levels (11,18). At the local level, public health officials are accustomed to dealing with known levels of specified diseases and are more likely to detect an abrupt change in frequencies as their case follow-up load increases or their demands for certain laboratory services increase. At successively broader-reaching levels of the public health system, there is less front-line knowledge of specific local disease events. When public health data are analyzed and interpreted at all participating levels and appropriate databases are built to satisfy the needs for data at each surveillance level (11), the interval from disease event to problem recognition can be substantially reduced. This is particularly true in the networks of international surveillance. Multinational centers may encourage and assist ministries of health to evaluate their data for unusual clusters and trends, and then assimilate reports of national disease problems into regional assessments. In the United States, for example, reports of Salmonella isolates are examined for national outbreaks at the Centers for Disease Control and Prevention (CDC) and also by personnel in state health departments for outbreaks within states.

Continuous System Support

An electronic reporting system needs well-defined, continued support. Experts on disease are needed to establish surveillance relationships, definitions, and ongoing surveillance; and technical experts need to establish system tools and implement business rules (6,8,20). Two other levels of commitment are necessary. First, personnel at the receiving site should establish a cooperative relationship with reporting sites; reach agreement on disease definitions, reporting frequencies, and data elements; and provide feedback to reporting sites (18). Second, reporting sites should agree to use a common surveillance reporting tool that is coordinated and supported by the central site. At CDC, disease specialists (epidemiologists, laboratorians, and statisticians) work together to accomplish the first task, while system trainers and computer specialists manage the laboratory reporting "help desk" to answer system questions by telephone and visit reporting sites to provide training and answer technical questions. The Caribbean Epidemiology Center (CAREC), a Pan American Health Organization (PAHO) center comprising 21 member countries, is an example of such an international reporting infrastructure. The CAREC central data-receiving site provides disease reporting specifications, training, and support to member countries, i.e., the reporting sites.

A critical component of system support is building consensus among partners. In the United States, the Council of State and Territorial Epidemiologists (CSTE), the Association of Public Health Laboratories (APHL), CDC, local health departments, and other agencies are often active partners in developing national surveillance systems. CSTE and APHL meet annually to discuss surveillance issues, system implementation policies, and data dissemination. CAREC meets annually for a joint meeting of laboratorians and epidemiologists from each member country to define surveillance directions, set policy, and develop consensus plans.

Resource Needs

Paper-based systems require resources for data processing (encoding, editing, and data input), reporting, and storage. Electronic reporting imposes additional and larger burdens: training (21), technical assistance, and hardware requirements (Table 1). Resources to sustain a reference laboratory reporting system vary by site and depend on such factors as number of diseases reported, complexity of the system, and number of sites involved.

Resource requirements for electronic surveillance may have the largest impact on three areas: personnel, funding, and workload. Probably the most important of these is personnel since the outcome of reference laboratory reporting is largely dependent on the will of the involved personnel to make the system succeed. This highlights the need to assure acceptance of electronic reporting by everyone involved in the planning stages (22). Obtaining support and guidance for the system from organizations representing reporting sites can also encourage acceptance by participating laboratories. For example, APHL was instrumental in developing and implementing electronic reporting from state public health laboratories in the United States (9). During the initial phases, CDC was in constant contact with APHL to determine system specifications and resolve data issues.

Funding resources can vary greatly by reporting site (6,11,22-24). In the United States, state public health laboratory resources are not consistent, ranging from the underfunded state laboratory with outdated computers to the highly funded laboratory with state-of-the-art computer equipment and networks. In the public health community, competition for funding is strong; in some instances, organizations with the most resources can prepare the most effective funding proposals, thereby obtaining more funds than those with more urgent needs. Similarly, in developing countries, resources to replace antiquated equipment for basic laboratory functions may not exist, and establishing electronic reporting may be impossible (24). Obtaining hardware for a laboratory system may depend on external funding (22). For implementing a network hub, resources must be allocated for hardware, software, and personnel. Creating such networks may be very valuable in terms of increased frequency and speed of communication among participants and greater ease of data sharing.

The workload of laboratories in the United States and other countries is overwhelming, and data management and reporting are considered less urgent than simply meeting the day-to-day demands. Thus, system designers need to minimize additional work while demonstrating added value. Rapid reporting of public health laboratory data also imposes additional burdens on epidemiologic resources. As clusters are detected electronically, epidemiologic assessments are needed and interventions are often necessary (5), further increasing workloads.

High-Quality Data at All System Points

Electronic laboratory reporting provides the opportunity for large numbers of records to populate a database rapidly, regardless of data quality. Changes in data origin, acquisition, and reporting practices at reporting sites may remain invisible to the receiving sites. Such variations may cause subtle data misrepresentations, which can adversely affect data quality and cause public health officials to take erroneous action. Recently, for example, an unusual number of multiresistant pathogens were electronically reported to CDC in a single week. The report caused concern and prompted immediate public health action in the belief that a dangerous statewide outbreak was occurring. Further data evaluation revealed a data entry error by untrained personnel at the site of origin. This example highlights the need for constant evaluation of data by trained personnel at both reporting and receiving sites.
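
A minimal sketch, with hypothetical field names, of the kind of triage a receiving site might apply before acting on an apparent signal: if an apparent spike is concentrated in a single reporting site, it is queued for verification with that site (as in the data-entry example above) rather than treated immediately as an outbreak.

```python
from collections import Counter

def spike_needs_verification(reports, concentration_threshold=0.8):
    """Return the dominant reporting site if one site accounts for most of the
    reports in an apparent spike; such concentration suggests a possible
    reporting or data-entry artifact that should be verified before action.

    reports: list of dicts, each with at least a 'site' key
    """
    if not reports:
        return None
    by_site = Counter(r["site"] for r in reports)
    site, count = by_site.most_common(1)[0]
    if count / len(reports) >= concentration_threshold:
        return site  # verify with this site before taking public health action
    return None

# Hypothetical spike: 9 of 10 reports come from a single laboratory
spike = [{"site": "Lab-12"}] * 9 + [{"site": "Lab-03"}]
print(spike_needs_verification(spike))  # -> "Lab-12"
```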

Multinational surveillance centers face even more such issues, since data quality varies from country to country. One benefit of having these centers would be to improve a country's existing infrastructure or assist in building one sufficient to provide quality data. Long-term surveillance benefits are much greater when electronic reporting systems use the established public health infrastructure rather than bypassing it (19,25). Systems bypassing public health infrastructures will likely have data of no higher quality than those reported through official channels.

CAREC is an example of a multicenter international organization that has taken a leadership role in improving infrastructure in participating countries. The 21 CAREC member countries vary dramatically in their public health laboratory resources. CAREC has been instrumental in providing hardware, software, and training to these laboratories. A week-long training workshop on laboratory reporting that CAREC conducted for member country epidemiologists and laboratorians strengthened the laboratory data management infrastructure and led to a greater commitment to electronic reporting.

Balancing International Data Sharing and Country Sovereignty

Each country should maintain jurisdiction over its data and the extent to which data originating within its borders are distributed and published. Political and economic issues concerning data reporting for surveillance purposes should not be overlooked. The consequence of a regional center's disclosing data outside a country while a serious disease problem is occurring within the country might be economically and politically devastating. For example, a report of a food-associated disease can prevent acceptance of food products exported from the country and may inflict havoc on its tourism industry (14,26). Local participants in an electronic surveillance system should be notified immediately if the national site detects an unusual cluster in the area.

Considering the sensitive nature of some disease problems, a two-pronged international public health surveillance network would be acceptable in many places and is often used, i.e., a formal surveillance network and an informal communications network (12,13,27-29). The formal network comprises information in the public domain that is open via authorized access and might include controlled access by the public. The informal network, providing information and data summaries not appropriate for public knowledge but needed to alert appropriate public health officials about potential (possibly unconfirmed) disease problems, serves a restricted group of users and is open only to key public health officials participating in the surveillance effort. The formal network provides the public with limited information that has been confirmed and is considered correct and final, and the informal network, based on possibly preliminary and incomplete data, serves the function of quietly alerting national officials about a potential disease problem outside their country that might affect their own population.

Such a strategy must have exceptions or be modified in some situations. For example, a PAHO subregional center (20) may provide laboratory services for member countries that do not individually have resources to perform these functions. In this case, the data flow is reversed, since data from individual countries are generated at the center's laboratory (e.g., by Salmonella serotyping) instead of being reported to the center. The center is responsible for analyzing the data and rapidly returning appropriate results to the country. The center and the individual country become co-owners of the data, but their responsibilities differ: the center is responsible for sending data and analysis results to the country; the country is responsible for interpreting the results and acting appropriately.
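
As a minimal sketch of how the formal and informal networks described above could be separated in software (the classification labels are hypothetical, not those of any existing system), each outgoing summary carries a dissemination level, and only confirmed, public-level items are released through the formal channel.

```python
from dataclasses import dataclass

@dataclass
class SurveillanceSummary:
    topic: str
    confirmed: bool
    level: str  # "public" (formal network) or "restricted" (informal network)

def formal_release(summaries):
    """Items for the formal, publicly accessible network: confirmed and public."""
    return [s for s in summaries if s.confirmed and s.level == "public"]

def informal_alerts(summaries):
    """Items for the restricted network of key public health officials."""
    return [s for s in summaries if s.level == "restricted"]

items = [
    SurveillanceSummary("Quarterly Salmonella serotype counts", True, "public"),
    SurveillanceSummary("Unconfirmed cluster, possible food source", False, "restricted"),
]
print([s.topic for s in formal_release(items)])   # confirmed, public items only
print([s.topic for s in informal_alerts(items)])  # restricted alerts only
```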

Clearly, an international information network has the benefit of providing public health officials a worldwide perspective on disease trends and international outbreaks. This in turn provides the mechanism for public health officials to rapidly identify worldwide problems and take international action to prevent further spread of disease.

Strong Leadership

Because the topic of disease surveillance can be politically sensitive, adding electronic reference laboratory reporting tends to raise political interest (6,18). Competing organizations within the public health system may have conflicting interests that hinder effective implementation of electronic systems. These issues should be identified in the planning phase; personnel representing different interests should participate in planning and in building consensus on key issues. Even so, internal issues will arise that can substantially diminish efforts to implement an effective system. Leaders in the planning process should be persons or organizations knowledgeable about electronic systems, focused on the goals and purposes of surveillance, and able to resolve conflicting viewpoints.

In the United States, CDC in conjunction with CSTE is integrating the numerous approaches now used to report the list of notifiable and reportable diseases. Although the list itself is a unifying point of agreement among the state and federal governments (30), approaches to building databases for these reports vary widely, as do solutions for how diseases should be reported (23). By assuming the leadership and problem-solving roles and pursuing a collective effort to integrate these approaches, these groups anticipate reducing the reporting burden at local, state, and federal levels (31).

Components of Successful Electronic Reporting

A successful electronic surveillance system has many components. Surveillance systems and software for surveillance are not equivalent. The following items should be considered in initiating such a system.

Authority

Public health reference laboratory reporting should be supported from the highest office with the authority to mandate public health offices and officials to participate (6,22) (e.g., Ministry of Health). Without the mandate of official authority (including funding and infrastructure), surveillance systems may be built but operate successfully for only a short time. In a multinational surveillance system, all participating countries should have such a clear mandate; it should not be assumed.

Standardizing Disease Definitions

Standardizing disease definitions among reporting sites is a critical component for data analysis. This role is taken by the central receiving site (32). The public health community has developed standards for use in case and laboratory electronic reporting (33-35); these could be implemented in international reference laboratories and clinical laboratories. In some international communities, the public health reference laboratory may also serve as the clinical laboratory.

Although data about cases can be standardized by carefully constructed disease definitions, standardization of laboratory data may be more difficult because data quality is affected by the variability of laboratory competency, methods, workload, equipment, and other factors. Periodic quality checks of laboratory procedures may help increase data quality.
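
A minimal sketch of one way a central site might normalize laboratory results (the local-to-standard code table below is hypothetical; in practice the standard vocabularies cited above would supply the target codes): each site's local result codes are translated to a shared code, and unmapped codes are flagged for follow-up rather than silently accepted.

```python
# Hypothetical mapping from one reporting site's local codes to shared codes
LOCAL_TO_STANDARD = {
    "SAL-ENT": "SALMONELLA_ENTERITIDIS",
    "SAL-TYP": "SALMONELLA_TYPHIMURIUM",
}

def standardize(local_code):
    """Translate a site's local result code to the shared surveillance code.

    Returns (standard_code, needs_review). Unmapped codes are kept verbatim
    and flagged so the central site can follow up with the reporting site.
    """
    if local_code in LOCAL_TO_STANDARD:
        return LOCAL_TO_STANDARD[local_code], False
    return local_code, True

print(standardize("SAL-ENT"))  # -> ('SALMONELLA_ENTERITIDIS', False)
print(standardize("SAL-XYZ"))  # -> ('SAL-XYZ', True)  flagged for review
```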

Planning

Planning is a critical component in setting up a reference laboratory surveillance system. Identifying system stakeholders (both laboratory and epidemiology) is often a first step. Hosting a stakeholders' planning meeting can provide a means to address and define system goals, objectives, business rules, functions, and surveillance approaches (e.g., sentinel vs. comprehensive). These meetings also provide an avenue to evaluate and plan for political and technical issues surrounding integration; a workable timetable for system implementation including system pilot testing; data confidentiality, ownership, and dissemination rights at all reporting sites; quality control assessment; identification of personnel for system management; and creation of a system help desk.

Data Elements

Data collected for surveillance differ from research data. Surveillance should collect the minimum data fields necessary to understand the current disease situation. The tendency to ask for data about every factor potentially related to each disease condition should be curtailed, as it burdens both the reporting system and the reporters.
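
As an illustration of how small such a minimum data set can be (the fields below are a hypothetical example, not a prescribed standard), a single isolate report might carry only the elements needed to place a case in time, place, and organism.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class IsolateReport:
    """A deliberately minimal surveillance record: enough to describe the
    current disease situation, not a full research data set."""
    report_id: str
    reporting_site: str
    organism: str
    serotype: Optional[str]
    specimen_date: date
    patient_age_years: Optional[int]
    patient_sex: Optional[str]
    county_or_district: Optional[str]

example = IsolateReport("A-1", "Lab-07", "Salmonella", "Enteritidis",
                        date(2001, 5, 14), 34, "F", "District 3")
```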

System Support

Initially, a support plan may be developed for each site to address personnel turnover, personnel responsibilities, and backup procedures. After reporting has begun, all system users should have a point of contact for problem solving. If the central reporting site is also the developer of the surveillance software, user-support personnel fill this role and can be a continuing resource for identifying needs for future enhancements. User groups can also be a means for system or software developers to receive system specifications and enhancements.

Support personnel for electronic reporting can be generally grouped according to function (Table 2). Although numbers of persons will vary, personnel performing each function should be specifically identified (6,21,36). In some settings, one person may be assigned more than one task. When a data transmission system is down and surveillance is interrupted, resources and support should be available at each site to make system recovery immediate (6). Delays can deliver an implicit message to participants that timeliness is not important.

The responsibility for analyzing electronically reported data and returning results rapidly to the reporting sites may be specifically designated. The complexity of analysis and volume of data will determine the need for statistical, programming, and epidemiologic resources (36). At the central receiving site, these resources should be allocated in the initial planning stage and should not be diverted to other areas after its implementation.

The support structure in state and international public health laboratories varies according to the funding and the complexity of the data management system. For example, in smaller laboratories with limited computer resources, one person may serve many roles (e.g., site supervisor, data manager, programmer). In a larger laboratory (the central receiving site), many people perform these roles. In CDC's Division of Bacterial and Mycotic Diseases (in which approximately 100,000 isolates per year are reported electronically), one computer programmer, one epidemiologist, one statistician, and one health communicator are actively involved with data analysis, system support, and feedback to 100 reporting sites.

Assessing Country Readiness

Assessing a country's readiness for electronic surveillance before a system is implemented involves many elements. The existing laboratory infrastructure should be evaluated to ensure commitment to the project. Potential participating laboratories (reporting sites) should also be evaluated to determine technical capabilities and requirements (e.g., access to an Internet service provider, access to dedicated quality phone lines, computer knowledge, determination of hardware requirements, and availability of computer maintenance support). It is also important to assess professional staff support to identify political concerns (e.g., data sharing between laboratory and epidemiology sites) and to evaluate personnel requirements for effective maintenance of surveillance. Ongoing sources of financial and political support should be contacted to determine if long-term commitment to the surveillance system exists among those in authority to support the project.

Site Selection and Cooperation

Each public health site has a different surveillance environment, with varying levels of interest and expertise as well as differing perspectives about surveillance and software tools, funding levels and sources, political atmospheres, and disease problems affecting daily workloads. Such differences should not be overlooked in enlisting sites.

The central site should coordinate selection of reporting sites on the basis of what they can contribute to the surveillance effort and their desire to participate. Persons at the central receiving level who have particular interest and knowledge about the disease should enlist the appropriate sites, provide definitions and consultation, and assist in outbreak or other investigations. At CDC, each implementation of a new laboratory reporting module (a set of specifications for each reporting condition) is spearheaded by an epidemiologist with close assistance of technical staff. Electronic reporting will benefit substantially when two persons with distinct job functions work together to prepare reporting sites. For example, having epidemiologists prepare personnel at the participating sites for their expected role in the system before contact is made by the technical support team should help to ensure that site epidemiologists and laboratorians have been given adequate specifications and that the capabilities of the site to participate have been determined.

Site Preparation and Training

As each reporting site joins the electronic reporting system, all persons participating at that site should be trained to perform the duties associated with surveillance and the system tools (11,22,36). Various approaches to system training may be used.

In one approach, trainers from the central site visit the reporting sites to train laboratorians and epidemiologists to use the system and assist with issues specific to their environments. This strategy was demonstrated in PAHO-sponsored electronic laboratory training conferences in Trinidad, Venezuela, and Argentina. Although this approach imposes substantial travel costs on the central site, more people are usually trained than if each site sends personnel to the central receiving site, and the training is conducted in the reporting site's own environment.

A second approach is for participants from reporting sites to visit the central receiving site for training, as occurred when personnel from the Food Safety Authority of Ireland visited CDC for training. This approach is efficient for the central receiving site because training can be coordinated into a single large session without incurring travel time.

A third approach, a hybrid of the first two and especially appropriate in the international setting, provides a training workshop at the central site attended by representatives from the reporting sites, with follow-up onsite training. Bringing a group together at the central site creates an environment that fosters interactions among participants that might not otherwise occur. All these approaches, however, have a common goal: to train people who can then train others at their sites.

Often, initial training is lost as the organization experiences personnel turnover, system enhancements, or infrequent use of the system. The central receiving site may want to be alert to signs that retraining might be needed. A plan could be devised initially to provide training to representatives from all sites, follow-up to ensure that the training was effective, and retraining when necessary.

Software/Hardware Distribution

Often, surveillance systems are funded in part by the central receiving site (24) or by sources outside the public health domain. Hardware provided by these sources may arrive at sites with no provisions for its disposition. The support team should ensure that equipment from either source arrives at the correct location and is tested and operational, that software is properly installed, and that equipment is not diverted to another project.

Software selection presents additional considerations. Software designed for collecting and reporting data about only a single disease can be designed in every detail to accommodate the needs for that particular surveillance effort. However, the broader perspective for surveillance will embrace software that is designed for multiple disease reporting purposes and that can be easily adapted for many different environments and reporting requirements. In either case, software should be regarded by its users as easy to use and comprehensive in its functions (8). The tendency is to begin laboratory reporting by shopping for software that might perform data management without concern about surveillance goals. Rather, one may want to select software on the basis of the functions it can bring to the system (10,15). If a software package has been developed and successfully implemented for laboratory reporting, then implementation or modification of existing software may be more efficient.
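
A minimal sketch of the module-based, multi-disease design described above (the module definitions and field names are hypothetical): each reporting condition is expressed as a specification the core software can validate against, so new conditions can be added without rewriting the reporting tool.

```python
# Hypothetical module-based configuration: each reporting condition is a
# module listing its required fields, so new conditions can be added without
# changing the core reporting software.
MODULES = {
    "salmonella": {"required": ["report_id", "site", "serotype", "specimen_date"]},
    "shigella":   {"required": ["report_id", "site", "species", "specimen_date"]},
}

def validate_against_module(condition, record):
    """Check a record against the module defined for its reporting condition."""
    spec = MODULES.get(condition)
    if spec is None:
        raise ValueError(f"No reporting module defined for: {condition}")
    return [field for field in spec["required"] if field not in record]

print(validate_against_module("salmonella",
                              {"report_id": "A-1", "site": "Lab-07",
                               "specimen_date": "2001-05-14"}))
# -> ['serotype']  the record is missing its serotype
```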

Pilot Test

Successful laboratory implementations (9) include a pilot phase in which questionnaires, equipment, and personnel are tested. Pilot phases ensure that data being reported from sites meet system specifications. Later, as new disease conditions are added to the system, they should also be tested.

Problems identified during the pilot testing phase can range widely--from poor software performance, communication problems, and data issues to funding and resource problems. Each problem should be addressed and its solution retested before actual reporting is begun. In the United States, pilot testing of the reporting system was done in five state public health laboratories. When problems reported to CDC were resolved, these laboratories were able to assume a leadership role in implementing the system in other state laboratories.

Feedback

Often, surveillance systems are dependent on voluntary or mandatory reporting without compensation (34). To assure that the system's value is recognized, central receiving sites should provide meaningful feedback to the reporting sites (18). Feedback should provide information needed by the sites and stimulate sites to input data in a timely way. Feedback on recent trends and current multisite clusters of disease should take precedence over bulky reports and detailed tabulations, which, although useful for fiscal accounting or periodic disease assessment, provide little incentive for timely reporting.
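
A minimal sketch of the kind of brief, timely feedback a central site might return automatically to each reporting site (field names hypothetical): a short summary of the site's recent submissions alongside any multisite clusters currently under investigation, rather than a bulky tabulation.

```python
def feedback_message(site, weekly_counts, active_clusters):
    """Compose a short feedback note for one reporting site.

    weekly_counts: dict of serotype -> count reported by this site last week
    active_clusters: list of serotypes currently flagged as multisite clusters
    """
    lines = [f"Feedback for {site}: {sum(weekly_counts.values())} isolates last week."]
    for serotype, count in sorted(weekly_counts.items()):
        lines.append(f"  {serotype}: {count}")
    if active_clusters:
        lines.append("Multisite clusters under investigation: " + ", ".join(active_clusters))
    else:
        lines.append("No multisite clusters currently under investigation.")
    return "\n".join(lines)

print(feedback_message("Lab-07", {"Enteritidis": 5, "Newport": 2}, ["Newport"]))
```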

Summary

Our experience with implementing electronic public health laboratory data management and reporting systems in the United States and with working with international organizations to initiate similar efforts demonstrates that successful reference laboratory reporting can be implemented if surveillance issues and components are adequately planned for. Public health can benefit when data arrive at analysis sites so rapidly that outbreaks can be detected, responses initiated, and interventions implemented in time to prevent cases that would otherwise have occurred. Resources focused on an electronic reporting system for public health laboratories, and efforts to sustain the system, are worthwhile when the system rapidly makes data available to describe current disease situations. Initiating electronic reporting opens a new paradigm for conducting surveillance, one that is highly challenging but increasingly necessary.

Dr. Bean is the Chief of the Biostatistics and Information Management Branch and project leader of the Laboratory Information Tracking System in the National Center for Infectious Diseases, Centers for Disease Control and Prevention. Her research interests include reference laboratory infrastructures, outbreak detection, design and implementation of reference laboratory information management systems, and electronic laboratory surveillance.

Dr. Martin is the former Chief of the Biostatistics and Information Management Branch and former project leader of the Laboratory Information Tracking System at the National Center for Infectious Diseases, Centers for Disease Control and Prevention. His research interests include reference laboratory infrastructure; outbreak detection; design and implementation of reference laboratory information management systems; electronic laboratory surveillance; and Salmonella serotype trend analysis.

Acknowledgment

The authors thank Robert Tauxe and Claire Broome for their insightful suggestions and comments, Linda MacKinnon for her assistance in manuscript development, and Lynne McIntyre for her helpful editorial comments.

References

  1. Centers for Disease Control and Prevention. Addressing emerging infectious disease threats: a prevention strategy for the United States. Atlanta: U.S. Department of Health and Human Services; 1994.
  2. Centers for Disease Control and Prevention. Preventing emerging infectious diseases: a strategy for the 21st century. Atlanta: U.S. Department of Health and Human Services; 1998.
  3. Satcher D. Emerging infections: getting ahead of the curve. Emerg Infect Dis. 1995;1:1-6.
  4. Stephenson J. Confronting a biological Armageddon: experts tackle prospects of bioterrorism. JAMA. 1996;276:349-51.
  5. Mahon BE, Rohn DD, Pack SR, Tauxe RV. Electronic communication facilitates investigation of a highly dispersed foodborne outbreak: Salmonella on the superhighway. Emerg Infect Dis. 1995;1:94-5.
  6. Martin SM, Bean NH. Data management issues for emerging diseases and new tools for managing surveillance and laboratory data. Emerg Infect Dis. 1995;1:124-8.
  7. Bean NH, Martin SM, Bradford H. PHLIS: an electronic system for reporting public health data from remote sites. Am J Public Health. 1992;82:1273-6.
  8. Hutwagner LC, Maloney EK, Bean NH, Slutsker L, Martin SM. Using laboratory-based surveillance data for prevention: an algorithm for detecting Salmonella outbreaks. Emerg Infect Dis. 1997;3:395-400.
  9. Foltz AM. Modeling technology transfer in health information systems. Learning from the experience of Chad. Int J Technol Assess Health Care. 1993;9:346-59.
  10. Hull C. Observations on health information in developing countries. Methods Inf Med. 1994;33:304-5.
  11. Hutwagner LC, Maloney EK, Bean NH, Slutsker L, Martin SM. Using laboratory-based surveillance data for prevention: an algorithm for detecting Salmonella outbreaks. Emerg Infect Dis. 1997;3:395-400.
  12. Sandiford P, Annett H, Cibulskis R. What can information systems do for primary health care? An international perspective. Soc Sci Med. 1992;34:1077-87.
  13. Heymann DL, Rodier GR. Global surveillance of communicable diseases. Emerg Infect Dis. 1998;4:362-5.
  14. Dalton CB, Griffin PM, Slutsker L. Electronic communication and the rapid dissemination of public health information. Emerg Infect Dis. 1997;3:80-1.
  15. Tauxe RV. Emerging foodborne diseases: an evolving public health challenge. Emerg Infect Dis. 1997;3:425-34.
  16. Stephenson J. New approaches for detecting and curtailing foodborne microbial infections. JAMA. 1997;277:1337, 1339-40.
  17. Farrington CP, Andrews NJ, Beale AD, Catchpole MA. A statistical algorithm for the early detection of outbreaks of infectious disease. J R Stat Soc. 1996;159:547-63.
  18. Centers for Disease Control and Prevention. Salmonella surveillance annual summaries. Atlanta: The Centers; 1998.
  19. De Kadt E. Making health policy management intersectoral: issues of information analysis and use in less developed countries. Soc Sci Med. 1989;29:503-14.
  20. Groce NE, Reeve ME. Traditional healers and global surveillance strategies for emerging diseases: closing the gap. Emerg Infect Dis. 1996;2:351-3.
  21. Epstein DB. Recommendations for a regional strategy for the prevention and control of emerging infectious diseases in the Americas. Emerg Infect Dis. 1995;1:103-5.
  22. Loevinsohn BP. Data utilization and analytical skills among mid-level health programme managers in a developing country. Int J Epidemiol. 1994;23:194-200.
  23. Mendelson DN, Salinsky EM. Health information systems and the role of state government. Health Aff. 1997;16:106-19.
  24. Thacker SB, Stroup DF. Future directions for comprehensive public health surveillance and health information systems in the United States. Am J Epidemiol. 1994;140:383-97.
  25. Osiobe SA. Health information imperatives for third world countries. Soc Sci Med. 1989;28:9-12.
  26. Plianbangchang S. Southeast Asia intercountry consultative meeting on prevention and control of new, emerging, and reemerging infectious diseases. Emerg Infect Dis. 1995;1:158.
  27. Plotkin BJ, Kimball AM. Designing an international policy and legal framework for the control of emerging infectious diseases: first steps. Emerg Infect Dis. 1997;3:1-9.
  28. Vacalis TD, Bartlett CLR, Shapiro CG. Electronic communication and the future of international public health surveillance. Emerg Infect Dis. 1995;1:34-5.
  29. Kaferstein FK, Motarjemi Y, Bettcher DW. Foodborne disease control: a transnational challenge. Emerg Infect Dis. 1997;3:503-10.
  30. Fritz CL, Dennis DT, Tipple MA, Campbell GL, McCance CR, Gubler DJ. Surveillance for pneumonic plague in the United States during an international emergency: a model for control of imported emerging diseases. Emerg Infect Dis. 1996;2:30-6.
  31. Centers for Disease Control and Prevention. Case definitions for infectious conditions under public health surveillance. MMWR Morb Mortal Wkly Rep. 1997;46(RR-10):55.
  32. Centers for Disease Control and Prevention. Integrating public health information and surveillance systems: a report and recommendations for the CDC/ATSDR steering committee on public health information and surveillance system development. Atlanta: The Centers; 1995.
  33. Scheckler WE. Surveillance, foundation for the future: a historical overview and evolution of methodologies. Am J Infect Control. 1997;25:106-11.
  34. Centers for Disease Control and Prevention. Common Information for Public Health Electronic Reporting (CIPHER) Guide. Available at URL: http://www.cdc.gov/od/hissb/docs/cipher.htm
  35. White MD, Kolar LM, Steindel SJ. Evaluation of vocabularies for electronic laboratory reporting to public health agencies. J Am Med Inform Assoc. 1999;6:185-94.
  36. McDonald CJ, Overhage M, Dexter P, Takesue B, Dwyer DM. A framework for capturing clinical data sets from computerized sources. Ann Intern Med. 1997;127:675-82.
  37. Berhie G. Emerging issues in health planning in Saudi Arabia: the effects of organization and development on the health care system. Soc Sci Med. 1991;33:815-24.

Cite This Article

DOI: 10.3201/eid0705.010502

Address for correspondence: Nancy Bean, Centers for Disease Control and Prevention, 1600 Clifton Road, Mailstop C09, Atlanta, GA 30333, USA; fax: 404-639-2780

The conclusions, findings, and opinions expressed by authors contributing to this journal do not necessarily reflect the official position of the U.S. Department of Health and Human Services, the Public Health Service, the Centers for Disease Control and Prevention, or the authors' affiliated institutions. Use of trade names is for identification only and does not imply endorsement by any of the groups named above.