SPECIAL REPORT

Information for Disasters, Information Disasters, and Disastrous Information Sharon M. McDonnell, MD, MPH;1 Helen N. Perry, MA, PhD (Cand);2 Brooke McLaughlin, MPH;1 Bronwen McCurdy, MPH;1 R. Gibson Parrish, MD1

1. Institute for Health Policy and Clinical Practice, Center for Evaluative Clinical Sciences-Educational Programs, Dartmouth Medical School, Hanover, New Hampshire USA
2. Division of Emerging Infections and Surveillance Services, National Center for Prevention, Detection and Control of Infectious Diseases, Centers for Disease Control and Prevention, Atlanta, Georgia USA

Correspondence:
Sharon M. McDonnell, MD, MPH
c/o Dartmouth Medical School
PO Box 197
Peacham VT 05862 USA
E-mail: [email protected]

Keywords: ???

Abbreviations:
CDC = US Centers for Disease Control and Prevention
IDSR = Integrated Disease Surveillance and Response
NGO = Non-governmental organization
WHO = World Health Organization
WHO-AFRO = The African Regional Office of WHO

Web publication:

Abstract
Information is needed to support humanitarian response in every phase of a disaster. Participants of a multilateral working group convened to examine how best to meet these information needs. Although information systems based on routine reporting of diseases are desirable because they have the potential to identify trends, these systems usually do not deliver on their promise due to inadequate organization and management to support them. To identify organizational and management characteristics likely to be associated with successful information systems in disaster settings, evaluations of the Integrated Disease Surveillance and Response (IDSR) programs in 12 participating countries were reviewed. Characteristics that were repeatedly mentioned in the evaluations as associated with success were grouped into nine categories: (1) human resources management and supervision; (2) political support; (3) strengthened laboratory capacity; (4) communication and feedback (through many mechanisms); (5) infrastructure and resources; (6) system design and capacity; (7) coordination and partnerships with stakeholders; (8) community input; and (9) evaluation. Selected characteristics and issues within each category are discussed. Based on the review of the IDSR evaluations and selected articles in the published literature, recommendations are provided for improving the short- and long-term organization and management of information systems in humanitarian responses associated with disasters. It is suggested that information systems that follow these recommendations are more likely to yield quality information and be sustainable even in disaster settings.

McDonnell SM, Perry HN, McLaughlin B, McCurdy B, Parrish RG: Information for disasters, information disasters, and disastrous information. Prehosp Disast Med 2007;22(4):315–321.

Development of a reasonably effective primary surveillance system took time. Usually, two full years were required. Experience showed that development was best achieved by establishing, for each administrative unit of perhaps 25 million population, a surveillance team of perhaps two to four persons with transport. Each team…visited each reporting unit regularly to explain and discuss the program, to distribute forms (and often vaccine) and to check on those who were delinquent in reporting. Regularly distributed surveillance reports also helped to motivate these units. Undoubtedly, the greatest stimulus to reporting was the prompt visit of the surveillance team for outbreak investigations and control whenever cases were reported. This simple, obvious, and direct indication that the routine weekly reports were actually seen and were a cause for public health action did more, I am sure, than the multitude of government directives which were issued.
DA Henderson, 1976.1

The position of deputy headmaster was advertised at an English school. Mr. A, based on his credentials and 20 years of teaching at the school, was confident he would be chosen. On the day of the announcement, however, the headmaster named Mr. B, a man only eight years from his training, as the new deputy instead. Mr. A complained to the headmaster, saying that he had 20 years of experience and was therefore the more logical choice. The headmaster responded kindly but soberly, saying, "Mr. A, I know your work, and I believe you do not have 20 years of experience; rather, you have had one year of experience 20 times."
Story told by Working Group participant, HHC Conference, 2006

July–August 2007

http://pdm.medicine.wisc.edu

Prehospital and Disaster Medicine

Introduction
Background
Since the 1970s and 1980s, a robust literature has developed around the vital role of information in supporting disaster responses. Information is needed to assess the key contextual elements relating to the health, environment, and culture of the affected population; to select and monitor appropriate interventions; and to encourage accountability among the agencies, donors, and governments involved.2–8 Regrettably, the continued inability to set up mechanisms that will reliably obtain and disseminate this information raises questions about the capacity and the commitment to solve the underlying problems.9,10 Multi-country evaluation reports about the response to the 2004 Tsunami highlight the poor ability of agencies and donors to define the context in which a disaster occurs or to apply the standards of international disaster assistance.11–15 For example, inappropriate assistance based on inadequate information led to the delivery of vaccines, such as those for cholera and hepatitis A, that, based on technical consensus standards, are not recommended for use in disaster settings.16 Health programs were developed for Indonesia and Sri Lanka without understanding that long-standing civil conflict existed in these places. Health and vaccination programs were brought to communities without adequately involving the community or documenting the activities.
Without records of the specific vaccines used or the individuals vaccinated, the population must be viewed as unvaccinated and a new effort mounted.13,17 Lessons learned about information systems, whether the systems are in their infancy (such as those created to support emergency humanitarian relief) or well-established in stable conditions, point to a limited number of recurring weaknesses: rudimentary organizational and management support; inadequately trained and supervised personnel; and a lack of ongoing community involvement and information exchange.1,18–21 Although successful projects such as Child Survival, Integrated Disease Surveillance and Response (IDSR), the Routine Health Information Network (RHINO), and the Sphere Project demonstrate or describe how to address these weaknesses, most agencies implementing information systems in disaster settings fail to incorporate the lessons into their systems.

In this paper, the pervasive managerial and organizational challenges that constrain the ability of humanitarian agencies to gather and use information to guide disaster response are examined. The failure to meet these challenges is the most important reason for poor-quality data and the inability to demonstrate program impact. Moreover, lack of information perpetuates inadequate capacity development and hinders the ability to learn from and cope with the changing needs of communities over time.22 To further characterize these challenges, a literature review was conducted,23–28 and data were compiled from evaluations and reports on the US Centers for Disease Control and Prevention/World Health Organization (CDC/WHO)-sponsored Integrated Disease Surveillance and Response (IDSR) program.29–38 The IDSR program was chosen for two reasons: (1) it is a large information system that uses a consistent approach in multiple settings; and (2) during the 2006 Humanitarian Health Conference Technical Session, the Working Group on Monitoring and Evaluation named the IDSR as one of the few "successful" information system projects. The goal was to describe the organizational elements associated with "good" information systems and to translate them into guidance for humanitarian relief. It is hoped that this paper will encourage increased emphasis on these elements in disaster responses, particularly as such responses often last years beyond the initial crisis.

Challenges to Information Gathering and Use
During the acute crisis, the task of setting up programs to provide services dominates the collection and use of information.39 As a result, the systems set up to provide information are driven by the needs of individual agencies and often are ad hoc.24 As the situation evolves, organizational skills and managerial competency within and among agencies become critical to the ability to obtain and provide useful information that can meet changing needs. The principal challenges to gaining high-quality information in disaster and emergency settings can be grouped into two broad categories: environmental and organizational. The environmental challenges during and following a disaster relate to the circumstances and context in which the event occurred.
Damage to infrastructure or a lack of security may render some areas inaccessible or inhospitable, without adequate roads, communication, food, and shelter. These difficulties create enormous challenges for relief efforts and constrain their ability to obtain the most basic information needed to mount an appropriate response.40 In the post-tsunami response, just as in the Kurdish refugee crisis of 1991, the lack of access to geographical areas and the people living there significantly limited the ability of agencies to assess, plan, and deliver needed services.13,15 Disasters that occur in an environment with civil strife and a lack of security pose an even larger challenge to information systems, because there is neither the stability nor the security to conduct needs assessments, maintain reliable services, or develop community support.

The organizational challenges for an information system relate to the fact that the system is a fragile chain of people, events, forms, and data held together by voluntary agreements to participate in sharing information that enables action to promote health and control disease. Establishing and maintaining an information system is difficult in any context; after a disaster, it is even more difficult. No matter what the purpose or scale of an information system, it must have the capacity to collect, organize, analyze, communicate, and evaluate data.23 Creating and maintaining this fragile chain of people and voluntary agreements requires significant organizational and managerial capacity and skills.2,12 Each agency must determine how it will collect and manage information for its own needs, as well as how it might link its information with that of other agencies to develop a larger picture of the disaster situation that implementing agencies (e.g., non-governmental organizations (NGOs), United Nations agencies, host governments) can use to assess and monitor overall community needs.2,12,22 Thus, the organizational and managerial challenges relate both to the competence of individual agencies and to the existence of a supporting entity or authority structure that can collect, compile, and manage data from multiple agencies.

In the typical disaster scenario, each NGO must raise funds, recruit skilled personnel, and commence work in an unstructured and unfamiliar environment. As each agency sets up services, it simultaneously must coordinate with hundreds of other NGOs.12 For example, during the tsunami response in South India, there were 300 organizations in one district of Tamil Nadu alone.41 There is no standard, recognized entity that coordinates or oversees the information needed for disaster response. Rather, for each disaster, such an entity either must be appointed or must emerge in the humanitarian response over the days or weeks following the event.11,12 The mechanism for coordination is unique to each event and may include an appointed lead agency, the host government, UN umbrella agencies (i.e., multiple UN agencies forming a single coordination group), or NGO coordination agencies. The authority granted to the coordinating entity may be derived from voluntary consensus, its ability to grant permission to work in the area, its administration of donor funds, or enforceable legal mandates.
The demand on humanitarian agencies to rapidly develop organizational and managerial capacity in a setting devoid of predictable structure is unparalleled in any other business endeavor. In an attempt to encourage the development of high-quality information systems, voluntary minimum standards have been created and have been adopted by >200 NGOs and UN agencies.16,22,42 These standards address the principles, but not the processes, for providing information support to humanitarian relief.

Methods
The recommendations from 13 evaluation reports from the IDSR program, representing 12 countries and one region, were compiled. The African Regional Office of WHO (WHO-AFRO), in collaboration with WHO headquarters and with technical assistance from the CDC, initiated the IDSR program in 1998.26–28 This program was developed to improve the quality of information on communicable diseases available to agencies and governments and to strengthen their capacity to respond to this information. An important element of this program was to streamline multiple, disease-specific information systems into an integrated system that could be described clearly from the field to the headquarters levels.23 Some of the unique aspects of the IDSR are its emphasis on generic information system functions that transcend diseases and its identification of core activities needed for any type of information system designed to provide ongoing data. The IDSR emphasizes the activities and infrastructure required to collect, manage, and use information. These are defined in technical guidelines and have been adapted and used in 21 countries throughout Africa and in at least 15 countries in other regions.28 Using the IDSR framework, each country performed an assessment to identify gaps in existing information systems; during the past 10 years, efforts to address the gaps have been directed by WHO-AFRO and have included the provision of resources, technical support, and core staff representing bilateral donors, ministries of health, disease control programs, and NGOs.

The issues named in the evaluations as "areas for attention or improvement" were reviewed and then compiled under nine headings. These headings and the frequencies with which issues under each heading were named within the evaluation reports are listed in Table 1. Using specific recommendations from the evaluations, as well as others identified in the published literature, lessons learned for each heading and a list of core managerial and organizational elements for successful information systems were formulated; these are listed in Table 2. The extent to which an agency can use these lessons while its information systems are being designed will predict the likelihood that its data will be valued and its systems sustained. In the following sections, key issues in the tables are highlighted.

Results and Discussion
Human Resources, Training, and Supervision
In the evaluations, nearly all participants identified insufficient staff to perform core health and public health functions, including the operation of information systems, as a major problem.
However, the problem usually is not related to a lack of training.43 In the short term, any system that collects and uses information must be able to recruit and retain staff effectively. In the long term, structural issues, such as low pay for workers who quickly reach the top of their career ladder, inadequate management support, and insufficient incentives for existing workers, are the most significant contributors to problems with staff recruitment and turnover.

Supervision is a significant component of any plan to strengthen human resources in humanitarian response. Reliable technical supervision can motivate staff by providing feedback, mentoring, and simply demonstrating interest in what is reported.43–45 For example, nearly one-quarter of the staff interviewed for the evaluations reported being unclear about who was responsible for performing specific tasks within the information system, a problem that could be improved with supervision and feedback. Based on experience with the IDSR, when supervisors were put in place, a need emerged for tools, such as checklists and vehicles, and for communication skills to facilitate regular supervisory visits to field offices.

Area for attention or improvement                                          Frequency
Human resources, training, and supervision                                    13
Political support and authority structure                                     11
Strengthened laboratory capacity                                               9
Communication and feedback                                                     8
Infrastructure and resources                                                   5
System design and capacity                                                     5
Coordination and partnerships with key internal and external stakeholders      4
Community-based surveillance                                                   3
Evaluation                                                                     3

McDonnell © 2007 Prehospital and Disaster Medicine
Table 1—Summary table of recommendations from review of CDC/WHO Integrated Disease Surveillance and Response (IDSR) evaluation reports completed in 2006 or 2007

Political Support and Authority Structure
As countries, districts, and agencies worked to develop the IDSR information systems, there was a growing appreciation for the value of political support to provide a practical framework to guide multiple agencies at different levels in their efforts to integrate and improve information systems. Specific activities that were important to the development of a multi-agency, countrywide system for reporting information included:
1. Providing a consistent vision and articulating the goals of the information system. Specifically, "sensitizing" agencies, countries, and stakeholders to the value of such a system through visits, meetings, and initiatives was critical. Ongoing marketing and promotion of the goals of the program resulted in its terminology and strategies becoming familiar to, and consistently used by, all involved; and
2. Developing or adapting guidelines, laws, and/or standards, as needed, to provide a context and framework for sharing information. As an example, the 2005 International Health Regulations Update for the African Region embraced the "Integrated Disease Surveillance and Response (IDSR) strategy that the WHO Regional Committee for Africa adopted in 1998 by its resolution AFR/RC48/R2" as the context and strategy for developing information systems.46

Political support from the Regional Offices of WHO and the national health ministries for the IDSR program meant that decision-makers increasingly required existing systems to adhere to this vision and did not allow competing or non-collaborative systems to be put into place. This was illustrated when various governments required that new categorical programs (e.g., avian influenza) initiate their work within the IDSR framework and guidelines. Such discipline and stability serve to protect existing efforts from being undermined by each new, highly resourced, categorical program. At the same time, there must be a way for new agencies and new categorical programs to join the information system effort, be heard, and have their needs addressed without disrupting previous efforts. Ideally, political support should provide a stable context in which agencies can work and develop capacity over time.

The presence of active conflict in some countries (e.g., Sudan) significantly reduced political support for the collection and use of quality information.47,48 The lack of political support and governmental authority meant that NGOs in these countries had to take responsibility for the time-consuming task of assigning roles and coordinating the activities required to set up and support information systems, a job that normally would be performed by the government. This situation is present in many disaster settings and may partially explain the persistent inability to develop quality information systems. Political support and an adequate authority structure are the necessary foundation for linking large geographic areas and/or multiple agencies into coordinated information systems. Even in chronic disaster settings, years after the acute events, the lack of political authority and organizational capacity often persists, constraining the capacity of information systems to provide routine, ongoing information for surveillance, population health monitoring, and/or program evaluation.

Laboratory Capacity
Laboratory confirmation is an important source of feedback to those reporting diseases, as illustrated by the Henderson quotation at the beginning of this article on his experience in the smallpox eradication campaign.1 Providing laboratory information improves the overall accuracy of the information system and motivates quality improvement.
In the baseline country assessments conducted prior to IDSR implementation, the lack of laboratory services repeatedly was named as a problem. By 2006 and 2007, consistent support for information systems had led to enhanced abilities to connect the information collected to response (treatment and investigation) and decision-making. The maturation of IDSR systems resulted in more support for laboratories at the country level, and laboratory services expanded considerably. These improved laboratory systems identified previously unrecognized gaps related to the inadequate transportation of specimens to laboratories for outbreak confirmation and the inadequacy of networks among existing laboratories for testing specimens. This illustrates that even though the major categories of issues affecting information systems generally remained constant, the issues within some categories changed over time. This success has generated greater interest in having confirmatory information and accurate diagnostic capacity, as well as in using these laboratories to move beyond individual patient care toward broader public health functions.

I. Political support and authority are present and supportive of the system
   a. Higher-level officials or authority support the system. In humanitarian response, this may be created through the United Nations, donors, or non-governmental organizations.
   b. Sensitization and marketing of priorities and the vision for the system must be established and maintained.
   c. New monies and ideas are oriented toward the existing system. The existing system must make a place at the table for new ideas and programs as they emerge. Interests have to be represented.
   d. Authorities provide enforcement and incentives to maintain the goals, standards, and formats of the system. This varies with the situation and the extent and nature of the authority structure. Incentives can be of many types. Some systems have few incentives for honest, accurate, timely reporting.
II. Human resources, training, and supervision are available and sufficient to staff the system
   a. Sufficient staff are available to design, analyze, and maintain the system. At least one epidemiologist must be available for consultation at routine intervals. Adequate support staff must be available to maintain organized, accessible data.
   b. *Training must be provided to all users (concepts and skills).
   c. A mechanism exists to train new staff quickly.
   d. Continuing education is available, routine, and based on quality improvements.
   e. *Materials describing the system and used for training are professionally done and up-to-date.
   f. There must be a mechanism for those working within the system to further their careers.
III. Infrastructure and resources are sufficient to maintain an organized and well-managed system
   a. *Supervision (technical, routine, supportive, credible) is available.
   b. *Confirmation of cases via laboratory or other sources can be integrated into the system and communicated back to those reporting diseases or to other health workers.
   c. The system receives ongoing maintenance and needed supplies.
   d. Sufficient resources for the system (money and in-kind) are available.
   e. Small amounts of discretionary local funds are available for unexpected needs.
IV. System design is clear, well defined, and acceptable
   a. Goals and objectives of the system are stated.
   b. *The system and the data being collected are acceptable to health workers and community members (e.g., collection of data on therapeutic errors or abortions may be less acceptable to health workers or certain communities).
   c. Roles of staff at all levels and of others supporting the system are described (see the IDSR matrix as an example).
   d. *The processes and responsibility for detecting, collecting, reporting, analyzing, and storing data are described, and individual roles are defined.
   e. The reporting burden and time demands are reasonable and feasible. The number of conditions being reported and the amount of information requested per report are manageable. The system requirements fit in the workflow of those reporting conditions and those organizing data.
   f. The system and processes are designed and tested with users.
   g. The system can be altered. For example, it can be upgraded, meet new needs, and discard unnecessary data elements.
V. Data are perceived as useful and important
   a. *Collected data are useful to those collecting and reporting the data. Local staff can access data for queries, updates, cleaning, and reconciling discrepancies.
   b. The system is "owned" by those who run it and report to it.
   c. *Interest in reported data is genuine; data are professionally reviewed and used in various ways. For example, a qualified epidemiologist reviews data once a month.
   d. *Data are used to initiate investigations, to justify the need for prevention, and to guide and evaluate control measures.
   e. Decision-makers, journalists, community members, and other agencies show an interest in data and use them in reports, proposals, or analyses.
   f. *Data are used for reports in newspapers, magazines, and other lay publications.
   g. *Other users or systems need, support, and use the data. Examples include police using data for investigating deaths and environmental programs linking data from the system with their monitoring data. Administrative systems can be powerful users. Examples include the use of data for releasing benefits to widows, compensating injured workers, paying life insurance claims, and admitting children to school if they have a vaccine card or a negative laboratory test.

McDonnell © 2007 Prehospital and Disaster Medicine
Table 2—Core information system capacities: necessary managerial and organizational elements to generate useful information from quality information systems
Those items marked with an asterisk (*) have been repeatedly associated with improved reporting in the literature or in previous evaluations.

A model for developing laboratory services (including specimen collection, handling, transport, and testing) that is feasible for a given context and available resources has not been well-described for either acute humanitarian crises or chronic disaster settings in which more and different services are needed. Experience with the IDSR system might serve as a model for disaster settings, including the location of laboratories that could be part of a network to serve affected populations.

for information system activities such as reviewing reports and adequately supervising staff, and to assist with the investigation of health concerns. It is noteworthy that the evaluators repeatedly suggested that the availability of a small discretionary budget at the local or field level would provide a powerful incentive to improve the system and motivate local workers.

Communication and Feedback The evaluations found that the structural and technical aspects of communication, such as reliable and familiar channels for communication, were less important than were the content of the communication and the concept of feedback. Two types of feedback were described as especially important: 1. After some event or investigatory activity, field staff want to receive feedback from “headquarters” or the technical experts about what happened; and 2. Health workers or others reporting information to the system want routine feedback on their reporting: how were their reports used and were they useful? Field staff and others want to know how their efforts and input link into and are used by the system. Feedback is an important motivator of individual staff through supervisory feedback, as well as of the organization’s staff in general, through the release of various products in which staff can see the results of their work. The responsiveness of the information system’s leadership and the staff ’s belief that what is reported will be noticed and generate a predictable response create curiosity, job satisfaction, and professionalism. Mechanisms identified in the evaluations for providing feedback and demonstrating responsiveness include newsletters or bulletins, donor or internal reports, investigations of reported cases, newspaper articles, training, supervisory visits, reminders and receipts (acknowledgement via phone, email, or letter), personnel evaluation and rewards, and program changes. Feedback to staff about their reporting should occur for both routine and unusual events, including incomplete reports, confirmatory tests, increased cases, or simply questions. Although feedback improved from their baseline assessments within information systems in countries implementing IDSR, its importance in the proper functioning of information systems was repeatedly emphasized in the evaluations.

Community-Based Surveillance The issues in this category include the need to incorporate information from non-health system sources (e.g., agriculture, education, income generation, and transportation) and to develop inventive ways to recruit, train, and support community relationships and community-based workers and volunteers. Improving mechanisms to obtain and share information from communities using innovative and flexible survey and reporting methods would be valuable not only as an information source, but also, as a way to enhance the communities’ sense of participation and ownership in health-related program activities. Such participation was viewed as an important link to community support, system sustainability, and the perception of effectiveness among community members.

Coordination and Partnerships with Key Internal and External Stakeholders Relationships among agencies and across vertical programs may be tainted by competition for attention and resources. The evaluations indicated that coordination, cooperation, and system integration were more effective at the district and field levels than at national or international levels. As might be expected, relationships at higher levels were more complex and fraught with difficulties. Infrastructure and Resources The perception by field and district staff was that resource shortages significantly hindered the system’s capacity to collect and use information. Personnel shortages, for example, meant that existing staff were not able to dedicate time Prehospital and Disaster Medicine

Monitoring and Evaluation

The recommendations in the areas of evaluation and monitoring were for technical support for conducting evaluations and for more time for local staff to periodically assess their systems and activities. Staff wanted an opportunity to look systematically and fruitfully at their progress and to ask whether they were doing the right thing, given that those who operate information systems often lose sight of their objectives, lose focus, and try to do too many things and serve too many masters.

Specific Applications to Disasters and Humanitarian Relief

Many of the lessons learned and suggestions made in the evaluations of the IDSR are applicable to the establishment and maintenance of information systems developed in disaster settings to support humanitarian relief, including those related to human resources, communication and feedback, and infrastructure and resources. As expected, NGO resources usually are directed at the delivery of medical or other health-related services rather than at the support of information systems. Nevertheless, providing adequate human and material support to the information system, including adequate training and supervision of information system staff and of the staff reporting diseases and events to the system, can yield significant benefits in the delivery of services by identifying the most pressing health needs and facilitating better targeting of services. To maintain the interest of staff and the quality of information, regular feedback to staff and others working with the information system is critical. Obtaining community-based information is another lesson from the IDSR that could be used to great benefit in many disaster settings. Relying on information from medical clinics alone can provide a very incomplete picture of the health status and health problems of the population affected by the disaster.

http://pdm.medicine.wisc.edu

Vol. 22, No. 4


The lack of political authority and support is a greater challenge in disaster settings than for the countries implementing the IDSR. Unlike the context of the IDSR, in disaster settings NGOs must take more responsibility for coordinating their information systems, though frequently this is not done, or not done well. Nevertheless, if systems can be coordinated, and compatible forms and other data collection methods that allow data from different systems to be combined can be used, a much better picture of the impact of the disaster and of the humanitarian relief can emerge.

Obtaining laboratory information in a disaster setting is another challenge. In the early phases of a disaster, a laboratory is not considered a priority, and diagnoses of individual or population health problems often are made on the basis of symptoms and may not be very specific. Nevertheless, the ability to collect and test specimens facilitates the detection and characterization of antibiotic resistance, the identification of species of malarial parasites, the determination of the cause of outbreaks, and the location of sources of water contamination. Given the difficulties of setting up a laboratory in the disaster setting, NGOs may be able to work with the WHO, the CDC, or other government laboratories to obtain this valuable source of information.

Conclusions

Having the organizational capacity and managerial competence required to develop and maintain a useful information system is not a problem to be solved, but rather a process to be managed.19,20 Even the simplest system requires ongoing political support, resources, and organizational capacity to maintain the technical ability to collect, collate, organize, clean, analyze, and use data. These organizational realities have challenged and bedeviled information systems in every context, but especially those in disaster settings.
The primary reason that information systems fail is not simply the difficult environment that disasters create or their limited resources, but rather inadequate organizational capacity associated with a lack of managerial oversight and support. Developing the organizational and managerial capacity to provide the core capacities described in Table 2 could contribute significantly to the value of the data collected and the sustainability of the information systems.

In the early phase of a disaster response, with its chaos and lack of adequate services, the emphasis must be on immediately providing a safe environment, adequate food and shelter, and public health and medical services. Determining where these services are needed, mounting the necessary financial and logistical support for them, and putting them in place are extraordinarily difficult. During humanitarian responses, NGOs function autonomously with little or no governmental or other political support. The value of political support and authority to the development of a coherent and sustained information system was of critical importance in the IDSR, and it is the most difficult element to achieve in disaster settings. Developing an on-site coordination body (NGOs, UN, donors) and, with it, collaborative action among agencies in the field can be a powerful means to expand organizational capacity and to improve the quality of support for information collection and public health action. This body may have to act in a quasi-governmental role, defining requirements and standards and providing technical services to agencies, donors, and the UN to support their information systems; or its role may be to coordinate and support agencies in conjunction with an appointed governmental or UN body in authority. In any event, the time involved in coordination is considerable and proportional to the complexity of the situation and the number of agencies involved.

The design of information systems must accommodate constant change, and, by monitoring the situation, the system should provide the information needed to recognize those changes as they occur. To accomplish this, the vision of the system must be solidly grounded in the need for useful information about the communities affected by the disaster and in the involvement of those communities in its collection and use. The demands for data from donors and technical or multilateral actors tend to shift the center of attention away from the communities and field staff, reducing them to a means for collecting data without their benefiting from what is gathered. But true gains can be made by exploring ways to increase enthusiasm and interest among field staff, inspire community advocates and their involvement, and thereby improve the quality, completeness, and transparency of the data collected.

Current guidelines for information systems during disaster responses do not specify the minimum personnel needed to sustain basic data collection and use.16 While too much specificity about personnel might set an impossible standard, too little specificity also is risky, as it perpetuates the lack of clarity about what is needed to initiate and maintain these systems.
Thus, we need definitions of the workers and their roles, including the technical, supervisory, clerical, and administrative personnel required to collect and organize data and to note when the data do not arrive or are incomplete.23,43 Finally, it must be demonstrated that lessons are learned from experience and that they create the conditions for success. The IDSR program is informative insofar as it has approached both significant and mundane organizational and managerial challenges directly rather than trying to “get around” them with a technological “fix”. The enthusiasm to develop a broader array of indicators and a centralized mega-information system that can connect and meet the needs of both field programs and multilateral agencies21,42 must be tempered with the reality that such a system is inherently difficult to implement and operate even in its most basic form. Creating such a system requires significant time and environmental and organizational stability, both of which often are in short supply in the disaster setting. The experience gained in the IDSR in taking small but persistent steps that address political support, supervision and feedback to field workers, the value of laboratory data, and the need for continuous monitoring, evaluation, re-thinking, and innovation points us toward a more practical and ultimately successful path.


References
1. Henderson DA: Principles and lessons from the smallpox eradication program. Bull World Health Organ 1987;65(4):535–546.
2. Banatvala N, Zwi AB: Public health and humanitarian interventions: Developing the evidence base. BMJ 2000;321(7253):101–105.
3. Toole MJ, Waldman RJ: Refugees and displaced persons. JAMA 1993;270:600–605.
4. Roberts L, Hofmann CA: Assessing the impact of humanitarian assistance in the health sector. Emerg Themes Epidemiol 2004;1(1):3.
5. VanRooyen MJ, Hansch S, Curtis D, et al: Emerging issues and future needs in humanitarian assistance. Prehosp Disast Med 2001;16(4):216–222.
6. Burkle FM, McGrady KAW, Newett SL, et al: Complex humanitarian emergencies: III. Measures of effectiveness. Prehosp Disast Med 1995;10:48–56.
7. Griekspoor A, Sondorp E: Enhancing the quality of humanitarian assistance: Taking stock and future initiatives. Prehosp Disast Med 2001;16(4):209–215.
8. Salama P, Spiegel P, Talley L, et al: Lessons learned from complex emergencies over past decade. Lancet 2004;364:1801–1813.
9. Deitchman S: What have we learned? Needs assessment. Prehosp Disast Med 2005;20(6):468–470.
10. Hakewill PA: Monitoring and evaluation of relief programs. Trop Doct 1991;21(suppl 1):24–28.
11. VanRooyen M, Leaning J: After the tsunami — facing the public health challenges. N Engl J Med 2005;352(5):435–438.
12. Zwi AB: How should the health community respond to violent political conflict? PLoS Med 2004;1(1):e14.
13. Mills EJ: Sharing evidence on humanitarian relief. BMJ 2005;331:1485–1486.
14. Cosgrave J: Tsunami evaluation: Summary of issues from secondary and tertiary data. Tsunami Evaluation Coalition, 2005. Available at http://www.tsunami-evaluation.org/NR/rdonlyres/59973B2E-E34F-46BF-BB8617B1A10687EA/0/literature_survey_nov05.pdf. Accessed 28 May 2007.
15. Thomas A: Linking preparedness and performance: The tsunami experience. Humanitarian Exchange December 2005;32. Available at http://www.odihpn.org/documents/humanitarianexchange032.pdf#page=38. Accessed 01 June 2007.
16. Sphere Project: Humanitarian Charter and Minimum Standards in Disaster Response: Common Standard 6: Evaluation. Geneva: Sphere Project, 2004. Available at http://www.sphereproject.org/content/view/34/84/lang,English. Accessed 05 June 2007.
17. Humanitarian Practice Network: Humanitarian Exchange No. 32, December 2005. London: Humanitarian Policy Group, Overseas Development Institute. Available at http://www.odihpn.org/documents/humanitarianexchange032.pdf#page=38. Accessed 01 June 2007.
18. Sandiford P, Cibulskis R: What do information systems do for primary health care? An international perspective. Soc Sci Med 1992;34(10):1077–1087.
19. Timaeus I, Harpham T, Price M, et al: Health surveys in developing countries: The objectives and design of an international program. Soc Sci Med 1988;27(4):359–368.
20. Calain P: From the field side of the binoculars: A different view on global public health surveillance. Health Policy Plan 2007;22:13–20.
21. Calain P: Exploring the international arena of global public health surveillance. Health Policy Plan 2007;22:2–12.
22. Griekspoor A: Raising standards in emergency relief: How useful are Sphere minimum standards for humanitarian assistance? BMJ 2001;323(7315):740–742.
23. Perry HN, McDonnell SM, Alemu A, et al: Planning an integrated disease surveillance and response system: A matrix of skills and activities. BMC Medicine 2007; in press.
24. Nsubuga P, White M, Thacker S, et al: Public health surveillance: A tool for targeting and monitoring interventions. In: Jamison DT, Breman JG, Measham JR, et al (eds): Disease Control Priorities in Developing Countries. 2nd ed. New York: Oxford University Press, 2006.
25. McNabb SJN, Chungong S, Ryan M, et al: Conceptual framework of public health surveillance and action and its application in health sector reform. BMC Public Health 2002;2(2):1–9.
26. World Health Organization–Regional Office for Africa: AFR/RC54/12 Rev. 1, 18 June 2004. Fifty-fourth session. Brazzaville: WHO-AFRO, 2004.
27. World Health Organization–Regional Office for Africa, US Centers for Disease Control and Prevention: Technical Guidelines for Integrated Disease Surveillance and Response in the African Region. Brazzaville: WHO-AFRO, 2002. Available at http://www.cdc.gov/idsr/focus/surv_sys_strengthening/tech_guidelines-integrated-diseaseENG.pdf. Accessed 08 June 2007.


28. World Health Organization–Regional Office for Africa: A Regional Strategy for Communicable Diseases 1999–2003. Brazzaville: WHO-AFRO, 1999.
29. Partners for Health Reformplus: April 2002 Tanzania Trip Report. Bethesda, MD: Partners for Health Reformplus, 2002.
30. US Centers for Disease Control and Prevention: Integrated Disease Surveillance and Response: Four Countries’ Experience, 1998–2005: An Evaluation of the Implementation of the IDSR Strategy in Ghana, Uganda, Tanzania and Zimbabwe. Atlanta: CDC, 2005.
31. World Health Organization: Health Metrics Network: Issues in Health Information: Issue 6: Disease Surveillance. Geneva: World Health Organization. Available at http://www.who.int/healthmetrics/documents/hmnissue_diseasesurveillance.pdf. Accessed 15 July 2007.
32. US Centers for Disease Control and Prevention: Report of IDSR Malaria Data in the Mozambique Health System (unpublished trip report). Atlanta: CDC National Center for Preparedness, Detection, and Control of Infectious Diseases, 2006.
33. US Centers for Disease Control and Prevention: Documentation of the Laboratory Networking in Implementation of Integrated Disease Surveillance and Response in Rwanda (unpublished report). Atlanta: CDC National Center for Preparedness, Detection, and Control of Infectious Diseases.
34. World Health Organization: Strengthening Surveillance and Response for Epidemic-Prone and Vaccine-Preventable Diseases in Selected African and Eastern Mediterranean Countries: Report of the UNFIP Final Project Evaluation in Burkina Faso, Ghana, Guinea, Mali and Southern Sudan. Lyon: WHO Office for National Epidemic Preparedness and Response, 2005. Available at http://www.who.int/csr/resources/publications/surveillance/WHO_CDS_CSR_LYO_2005_23w.pdf. Accessed 08 June 2007.
35. US Centers for Disease Control and Prevention: Report of the Technical Meeting on the Implementation of Core Indicators for Integrated Disease Surveillance and Response in the African Region. Atlanta: CDC, 2004.
36. World Health Organization: WHO Liberia Situation Report No. 21, period 1–21 April 2006. Available at http://unjobs.org/archive/7810807862440317332499740039859214512472560. Accessed 08 June 2007.
37. Gueye D, Banke K, Mmbuji P: Follow-Up Monitoring and Evaluation of Integrated Disease Surveillance and Response in Tanzania. Bethesda, MD: Partners for Health Reformplus Project and Abt Associates Inc., 2006.
38. Support for Analysis and Research in Africa (SARA) Project: Integrated Disease Surveillance and Response: Policy Brief. Washington, DC: Academy for Educational Development, 2003. Available at http://pdf.dec.org/pdf_docs/PNACU389.pdf. Accessed April 2007.
39. UN Educational, Scientific, and Cultural Organization: Guidebook for Planning Education in Emergencies and Reconstruction: Chapter 3: Capacity Building. Paris: International Institute for Educational Planning, 2006. Available at http://www.unesco.org/iiep/eng/focus/emergency/guidebook.htm. Accessed 01 June 2007.
40. McDonnell SM, Bolton P, Sunderland N, et al: The role of the applied epidemiologist in conflict. Emerg Themes Epidemiol 2004;1:4.
41. Barenstein JD: Challenges and risks in post-tsunami housing reconstruction in Tamil Nadu. Humanitarian Exchange 2006;33.
42. Mock N, Garfield R: Health tracking for improved humanitarian performance. Prehosp Disast Med 2007;22(4): in press.
43. McDonnell SM, Yassin AS, Brown WG, et al: Measuring health program effectiveness in the field: An assessment tool. Prehosp Disast Med 2007;22(4): in press.
44. Soumerai SB, Avorn J: Principles of educational outreach (‘academic detailing’) to improve clinical decision making. JAMA 1990;263:549–556.
45. Curtale F, Sirwakoti B, Lagrosa C, et al: Improving skills and utilization of community health volunteers in Nepal. Soc Sci Med 1995;40(8):1117–1125.
46. World Health Organization–Regional Office for Africa: International Health Regulations (2005), AFR/RC56/INF.DOC/2. Addis Ababa: WHO-AFRO, 2006. Available at http://www.afro.who.int/rc56/documents/afr_rc56_inf_doc_2_international_health_regulations_final.pdf. Accessed 08 June 2007.
47. World Health Organization: Southern Sudan Health Update. Geneva: World Health Organization, 2002. Available at http://www.who.int/disasters/repo/9153.pdf. Accessed 08 June 2007.
48. World Health Organization: Health Action in Crises: Highlights No. 158, 14 to 27 May 2007. Geneva: World Health Organization, 2007. Available at http://www.who.int/hac/donorinfo/highlights/highlights_158_14_27May2007.pdf. Accessed 20 June 2007.
