Health Policy and Planning, Oxford University Press. Original Article. doi:10.1093/heapol/czs022

The implementation of Integrated Disease Surveillance and Response in Uganda: a review of progress and challenges between 2001 and 2007

Luswa Lukwago,1,2,* Miriam Nanyunja,3 Nestor Ndayimirije,4 Joseph Wamala,1 Mugaga Malimbo,1 William Mbabazi,3 Anne Gasasira,1 Immaculate N Nabukenya,1 Monica Musenero,1 Wondimagegnehu Alemu,5 Helen Perry,6 Peter Nsubuga6 and Ambrose Talisuna1

1 Uganda Ministry of Health, Kampala, Uganda; 2 Makerere University School of Public Health, Kampala, Uganda; 3 World Health Organization Country Office for Uganda, Kampala, Uganda; 4 World Health Organization, Inter-Country Communicable Diseases Surveillance Team, Kampala, Uganda; 5 World Health Organization Regional Office for Africa, Brazzaville, Republic of Congo; 6 Centers for Disease Control and Prevention, Atlanta, GA, USA

*Corresponding author: Luswa Lukwago, Epidemiological Surveillance Division, Ministry of Health, Uganda, P.O. Box 7272, Kampala, Uganda. E-mail: luswal@yahoo.com

Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine © The Author 2012; all rights reserved. This is an Open Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (http://creativecommons.org/licenses/by-nc/3.0/), which permits unrestricted non-commercial use, distribution, and reproduction in any medium, provided the original work is properly cited.

Background In 2000 Uganda adopted the Integrated Disease Surveillance and Response (IDSR) strategy, which aims to create a co-ordinated approach to the collection, analysis, interpretation, use and dissemination of surveillance data for guiding decision making on public health actions.

Methods We used a monitoring framework recommended by World Health Organization (WHO) and Centers for Disease Control and Prevention (CDC)-Atlanta to evaluate performance of the IDSR core indicators at the national level from 2001 to 2007. To determine the performance of IDSR at district and health facility levels over a 5-year period, we compared the evaluation results of a 2004 surveillance survey with findings from a baseline assessment in 2000. We also examined national-level funding for IDSR implementation during 2000–07.

Results Our findings show improvements in the performance of IDSR, including: (1) improved reporting at the district level (49% in 2001; 85% in 2007); (2) an increase and then decrease in the timeliness of reporting from districts to the central level; and (3) an increase in data analysed at the local level (from 10% to 47% of health facilities analysing at least one target disease, P < 0.01). The case fatality rate (CFR) for two target priority diseases (cholera and meningococcal meningitis) decreased during IDSR implementation (cholera: from 7% to 2%; meningitis: from 16% to 4%), most likely due to improved outbreak response. A comparison before and after implementation showed increased funding for IDSR from government and development partners. However, annual government budget allocations for surveillance declined ten-fold between 2000/01 and 2007/08. Per capita input for disease surveillance activities increased from US$0.0046 in 1996–99 to US$0.0215 in 2000–07.

Conclusion Implementation of IDSR was associated with improved surveillance and response efforts. However, decreased budgetary support from the government may be eroding these gains. Renewed efforts from government and other stakeholders are necessary to sustain and expand progress achieved through implementation of IDSR.

Keywords: Integrated disease surveillance and response, surveillance indicators, epidemic preparedness and response, infectious disease surveillance, Uganda

KEY MESSAGES

The Integrated Disease Surveillance and Response strategy, recommended by the World Health Organization Regional Office for Africa (WHO AFRO) in 1998, was successfully implemented in Uganda.

The strategy was associated with improved surveillance and response efforts as demonstrated by the process, output and outcome indicators from 2001 to 2007 at the national and sub-national levels.

However, decreased budgetary support from the government may be eroding these gains.

Background

The major burden of communicable diseases in Uganda remains attributable to preventable diseases ranging from endemic diseases such as malaria to emerging and re-emerging infections such as HIV/AIDS, viral haemorrhagic fevers, tuberculosis and cholera (MOH Uganda 1999; MOH Uganda 2000; Somda et al. 2009). Among the several strategies put in place by the Ugandan government to address this disease burden is a functional public health surveillance system with an early warning system (MOH Uganda 2001).

This system aims at timely and appropriate responses to priority disease threats that involve all stakeholders, including the government sectors (from national to community levels), the private sector and development partners (Perry et al. 2007).

Uganda adopted a decentralized health care system in 1993. Through decentralization, the country was divided into 45 districts. In 2001, the country was further divided into 56 districts, and in 2006 the number of districts increased to 80. The districts are further sub-divided into functional zones called health sub-districts (HSDs), and within each HSD are hierarchical levels of health facilities offering a range of services (health centres I, II, III and IV, and hospitals). For example, Health Centre I is the most peripheral level of health services, with no permanent facility but with community or village health workers offering mainly preventive services within their communities. The referral level starts at Health Centre IV and continues to district and referral hospitals, which offer curative and surgical services in addition to preventive services (Anokbonggo et al. 2004).

In 1998, Member States of the Africa Region of the World Health Organization (WHO AFRO) adopted the Integrated Disease Surveillance (IDS) strategy with the intent to create integrated, action-oriented, district-focused public health surveillance systems (WHO AFRO 1999; WHO AFRO 2000). In 2001, due to the importance of linking surveillance to public health action, the strategy was renamed Integrated Disease Surveillance and Response (IDSR) (WHO AFRO and CDC 2001). The WHO framework for monitoring and evaluating surveillance and response systems for communicable diseases was designed around structures, quality and process (core and support functions), from which surveillance indicators were derived and adapted by several countries including Uganda (WHO 2004).

In preparation for the implementation of this strategy in Uganda, a baseline assessment of the available vertical surveillance systems was conducted in February 2000 (CDC 2000; WHO AFRO 2000). Findings from this assessment formed the basis for implementation of IDSR through integration of existing surveillance systems, which involved: (i) establishment of a co-ordination mechanism to link various surveillance systems to create an integrated system; (ii) harmonization of data collection tools; (iii) procurement of standardized data storage; and (iv) storing data in a uniform database where it is easily accessible to users and policy-makers. In Uganda, this co-ordinated approach to data collection, analysis, interpretation, use and dissemination was designed to guide decision-making for public health action.

Following the February 2000 assessment, a 5-year strategic plan of action was developed from which annual work plans with clear objectives were developed and launched in May 2000 (MOH Uganda 2000; Perry et al. 2007). The plan aimed at reducing the negative impact of epidemic threats through an early warning system and timely response. It sought to ensure the availability of disease data to monitor the progress of interventions for control, elimination or eradication of target diseases. Achievements in these areas would contribute to the overall health sector goal of reducing morbidity and mortality due to preventable causes.

It is worth noting that, during IDSR implementation, the need to integrate laboratory surveillance was emphasized, both in terms of personnel, reagents and supplies and in terms of the confirmation process for reported epidemic threats. Evaluation of laboratory involvement in the subsequent years of IDSR implementation identified continued weaknesses in the availability of trained laboratory personnel, especially at district and sub-district levels, in the processing of laboratory results, and in the ability to link laboratory results with the weekly surveillance reporting system, which itself showed a steady increase in the timeliness of reporting priority diseases (Nsubuga et al. 2009).

This review provides information on the progress of IDSR after several years of implementation. It highlights the costs involved, successes and the challenges faced during the implementation of the IDSR strategy in Uganda. The information from this review is important for strengthening IDSR, which not only contributes to the control of communicable disease in Uganda, but also enhances the capacity to implement the World Health Organization’s International Health Regulations (2005), a legally-binding agreement that provides a new framework for co-ordinating and managing public health threats, which came into effect in June 2007.

Methods

Uganda has been following a systematic approach to planning and implementation of the IDSR strategy since 2000. Periodic reviews of the indicators monitoring core and support functions were recommended by WHO AFRO, the US Centers for Disease Control (CDC), ministries of health and all implementing partners as a means for assessing national surveillance systems (WHO AFRO 1999; WHO AFRO 2000; WHO 2002). We evaluated the progress of IDSR implementation in Uganda at national, district and health facility levels from 2001 to 2007.

At the national level we reviewed the progress of IDSR implementation through analysis of the national core indicators adapted from the standard WHO AFRO IDSR indicators listed in Table 1. The implementation of IDSR at national level was evaluated through analysis of the core IDSR indicators, funding for implementation and comparison of costs before and after IDSR implementation. Analysis of IDSR core indicators focused on completeness and timeliness of weekly epidemiological data, and on morbidity and mortality data for cholera and meningococcal meningitis as tracer conditions for IDSR performance from 2001 to 2007 (Table 2). Completeness was computed as the proportion of districts (and of health units per district) that submitted weekly and monthly reports in a calendar year. Timeliness was computed as the proportion of districts that submitted timely reports in a calendar year; timely submission was defined as receipt of district epidemiological data by the Thursday following the end of the previous epidemiological week. Attack rates (AR) and case fatality rates (CFR) were computed from the district-level weekly and monthly reports of cases, deaths and at-risk populations for each evaluation year. Cholera and meningococcal meningitis were selected from among the epidemic-prone diseases under surveillance as the tracer epidemic diseases for regular monitoring of attack and case fatality rates.
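For illustration only, the short sketch below (Python) shows one way these four quantities can be computed from simple report counts; the function names are ours, and the assumption that the epidemiological week ends on a Sunday is ours rather than the programme's. The worked check at the end uses the 2001 cholera figures reported in Tables 2 and 3.

```python
from datetime import date, timedelta

def completeness(reports_received: int, reports_expected: int) -> float:
    """Proportion (%) of expected weekly or monthly reports actually received."""
    return 100.0 * reports_received / reports_expected

def is_timely(received_on: date, week_end: date) -> bool:
    """A weekly report is timely if it reaches the national level by the Thursday
    following the end of the epidemiological week (week_end assumed to be a Sunday)."""
    return received_on <= week_end + timedelta(days=4)

def case_fatality_rate(deaths: int, confirmed_cases: int) -> float:
    """CFR (%) for an outbreak of a tracer disease."""
    return 100.0 * deaths / confirmed_cases

def attack_rate(cases: int, population_at_risk: int) -> float:
    """Attack rate per 100 000 population at risk."""
    return 100_000 * cases / population_at_risk

# Worked check against the 2001 cholera figures reported in Tables 2 and 3
print(round(case_fatality_rate(23, 340), 1))   # 6.8 %
print(round(attack_rate(340, 74_429), 1))      # 456.8 per 100 000
```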

Table 1. IDSR core indicators and their targets

Indicator | Target
1. Proportion of health facilities submitting weekly surveillance reports on time to district level | 80%
2. Proportion of districts submitting weekly surveillance reports on time to the next higher level | 80%
3. Proportion of district monthly reports that are submitted timely | 80%
4. Proportion of cases of diseases targeted for elimination or eradication reported using case-based or line listing forms | 100%
5. Proportion of suspected outbreaks of epidemic-prone diseases reported to next higher level within 48 hours of surpassing the epidemic threshold | 100%
6. Proportion of reports of investigated outbreaks that include analysed case-based data | 100%
7. Proportion of investigated outbreaks with laboratory results | 100%
8. Proportion of confirmed outbreaks with a nationally recommended public health response | 80%
9. Proportion of districts with current trend analysis (line graphs) for selected priority diseases | 100%
10. Case fatality rate for outbreaks of priority diseases (cholera, meningococcal meningitis, others – specify) | Cholera: <1%; Meningococcal meningitis: <10%
11. Attack rate for outbreaks of priority diseases (cholera, meningococcal meningitis, others – specify) | –

Table 2. Key IDSR performance indicators at national level, Uganda, 2001–07

Indicator | 2001 | 2002 | 2003 | 2004 | 2005 | 2006 | 2007
Completeness
N = number of districts required to report | 56 | 56 | 56 | 56 | 56 | 80 | 80
Median monthly % [range] | 90 [86–100] | 92 [88–100] | 97 [94–100] | 99 [99–100] | 88 [85–100] | 89 [83–100] | 94 [89–100]
Median weekly % [range] | 48 [35–72] | 88 [62–98] | 96 [88–100] | 96 [84–100] | 95 [88–100] | 76 [69–88] | 85 [73–91]
Timeliness
N = number of districts required to report | 56 | 56 | 56 | 56 | 56 | 80 | 80
Median monthly % [range] | 50 [38–66] | 59 [44–72] | 78 [66–87] | 88 [70–98] | 76 [69–84] | 64 [58–74] | 77 [68–85]
Median weekly % [range] | 49 [34–66] | 91 [81–97] | 93 [78–98] | 96 [72–98] | 93 [86–97] | 59 [46–66] | 53 [44–63]
Case fatality rate (CFR)
Cholera %a (deaths/confirmed cases) | 6.8 (23/340) | 5.6 (124/2228) | 2.9 (123/4254) | 2.4 (89/3710) | 2.5 (108/4252) | 1.9 (92/4785) | 2.1 (35/1662)
Meningococcal meningitis %b (deaths/confirmed cases) | 16.3 (172/1053) | 16.1 (353/2186) | 13.5 (294/2180) | 16.5 (224/1356) | 5.2 (52/1009) | 4.6 (66/1441) | 4.1 (320/7805)

Notes: aFor trend of cholera CFR, P < 0.01; Durbin–Watson calculation = 1.088.

bFor trend of meningococcal meningitis CFR, P < 0.003; Durbin–Watson calculation = 2.5.
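The exact regression the authors fitted for these trend statistics is not stated. The hedged sketch below shows one conventional way to obtain a trend P-value and a Durbin–Watson statistic from the annual CFR series in Table 2; it is an illustration of the method and will not necessarily reproduce the published values exactly.

```python
import numpy as np
from scipy import stats
from statsmodels.stats.stattools import durbin_watson

years = np.arange(2001, 2008)
cholera_cfr = np.array([6.8, 5.6, 2.9, 2.4, 2.5, 1.9, 2.1])        # Table 2
meningitis_cfr = np.array([16.3, 16.1, 13.5, 16.5, 5.2, 4.6, 4.1])  # Table 2

for label, cfr in [("cholera", cholera_cfr), ("meningitis", meningitis_cfr)]:
    # Simple linear trend of annual CFR on calendar year
    res = stats.linregress(years, cfr)
    residuals = cfr - (res.intercept + res.slope * years)
    print(label, "trend p =", round(res.pvalue, 3),
          "Durbin-Watson =", round(durbin_watson(residuals), 2))
```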

The adapted IDSR core indicators (Table 1) and their targets were included in the national surveillance databases in order to routinely monitor progress with essential surveillance functions, such as detection and reporting of priority diseases, analysis and interpretation of data, investigation (including laboratory confirmation of suspected outbreaks) and response to epidemic threats, and provision of feedback.

Denominators were determined by the number of units relevant to a given indicator that exist in the country. For example, the total number of health units in a district constitutes the denominator for timeliness or completeness of reporting in a given period, and the total number of outbreaks reported in the country constitutes the denominator for timely outbreak investigation and response. The denominators therefore defined the unit base, which took the form of structures such as health units, health workers or health events.

We also examined the national level funding for IDSR implementation from 2000/2001 to 2007/2008. We conducted a cost analysis to compare funding for surveillance before (1996–99) and during IDSR implementation (see Table 5) using annual budgets and financial reports. Data on funding from 1996 to 1999 are aggregated because the national level vertical programmes conducted the surveillance functions directly at district and regional levels. Decentralization of the financial management levels had not yet taken effect. However, from the year 2000, the funds for IDSR implementation were disbursed directly to national, regional and district levels. The mean costs analysed were in line with key resources involved in implementation of IDSR (as shown in Table 5).

At district and health facility level, an evaluation of the performance of IDSR core and support functions was carried out in 2004. To allow comparison and determine progress, the 2004 IDSR evaluation was based on the same parameters as the IDSR baseline assessment in 2000. In the 2000 IDSR assessment, eight of the then existing 45 districts of Uganda were sampled. Multi-stage stratified sampling was used: four regions were selected purposively to ensure regional representation; in each region, two districts were selected by simple random sampling, giving a total of eight districts; and in each district, eight health facilities were selected, again by simple random sampling. Because the 2000 baseline assessment was based on a small sample, comparison with the 2004 evaluation is a limitation.

The evaluation focused on the IDSR structures at national, district and health facility levels, core and support functions, inputs, processes, outputs and outcomes of IDSR, based on the key interventions from the 2000 IDSR Plan of Action. We used similar methods to collect data, including structured questionnaires administered to personnel at health facility, district and national levels; observation checklists and key informant interviews. The sampled districts and health facilities were part of IDSR implementation from the time of its inception in 2000 to the time the evaluation was conducted in 2004.

We analysed the 2004 evaluation survey results to compare the performance of district and health-facility levels in 2004 with the 2000 baseline (CDC 2000). The 2004 evaluation of the district and health-facility levels was based on a three-stage sampling technique. In the first stage, two districts were selected by random sampling from each of the 10 geographical regions, giving a total of 20 districts for the evaluation out of 56 in the country. In the second stage, at district level, two Health Sub-Districts (HSDs) were selected by simple random sampling, giving a total of 40 HSDs from the 20 districts. HSDs are sub-operational levels with a health facility capable of handling outpatient and inpatient health care services and an oversight role over lower-level health facilities in their catchment area. In the third stage, to assess performance at the health facility level, a total of 217 health units (both government and private, at different levels) were selected through multi-stage random sampling.
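A minimal sketch of the three-stage selection described above is given below (Python). The sampling frame, the number of facilities drawn per HSD and the seed are hypothetical placeholders, since the report does not state how the 217 facilities were allocated across the 40 HSDs.

```python
import random

def three_stage_sample(frame, districts_per_region=2, hsds_per_district=2,
                       facilities_per_hsd=5, seed=2004):
    """Three-stage selection: districts within regions, HSDs within districts,
    then health facilities within HSDs. `frame` maps
    region -> district -> HSD -> list of facilities."""
    rng = random.Random(seed)
    sampled = []
    for region, districts in frame.items():
        for district in rng.sample(sorted(districts), min(districts_per_region, len(districts))):
            hsds = districts[district]
            for hsd in rng.sample(sorted(hsds), min(hsds_per_district, len(hsds))):
                facilities = hsds[hsd]
                for facility in rng.sample(facilities, min(facilities_per_hsd, len(facilities))):
                    sampled.append((region, district, hsd, facility))
    return sampled

# Hypothetical toy frame, for illustration only
frame = {
    "Region A": {"District 1": {"HSD 1a": ["HC II x", "HC III y", "Hospital z"]},
                 "District 2": {"HSD 2a": ["HC III p", "HC IV q"]}},
    "Region B": {"District 3": {"HSD 3a": ["HC II r", "HC III s"]}},
}
print(len(three_stage_sample(frame)))
```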

The evaluation used both quantitative and qualitative methods, with a standardized questionnaire for most aspects and key informant interviews at national and district levels. The tools were administered to the staff in charge of the sampled health facilities, who were purposively selected to provide information on the IDSR organizational structure, flow of information, the list of diseases and conditions under surveillance, availability of guidelines and standards, feedback mechanisms, and training in IDSR.

Data management and analysis

Data entry screens were developed using the Epi Info Version 3.4 computer package, with consistency and logic checks. An analysis plan was developed and analysis was done using Epi Info Version 3.4 (CDC 2008). In addition, we used qualitative data from key informant interviews to supplement the quantitative data and to explain some of the issues related to the success and challenges of IDSR implementation.

The cost analysis data were entered and analysed in Microsoft Excel 2003 (Microsoft Corp., Redmond, WA) to calculate means and standard deviations and to compare costs before and after the institution of IDSR. To determine the cost of implementing IDSR, the cost of each of the different items was identified.

We calculated the costs using annual budgets and actual remissions to the regional, district and health centre levels. Annual financial resources provided by the Ugandan government for IDSR implementation were obtained from the annual government-approved budget for the Epidemiological Surveillance Division of the Ministry of Health (MOH), the focal unit for IDSR implementation. These were entered into a Microsoft Excel sheet for comparison over time. Population estimates were obtained from the Uganda Bureau of Statistics. We used these data to compute the mean annual cost per capita for all IDSR activities over the two comparison periods, using the population estimates and mean annual costs.

We analysed the costs involved in the implementation of the strategy and compared them with the costs before its introduction. The costs from 2000–07 were summed and then annualized to obtain the mean annual cost.

For comparison, the costs from 1996–99 were also summed to obtain mean annual costs. Some costs for the latter period were estimated using proxy figures from the previous year and from similar programmes at the MOH. The costs of equipment, supplies, processes and feedback on IDSR were considered at national, regional and district levels. Capital costs were depreciated at 5% annually over a 10-year useful life. The depreciated costs for vehicles and office equipment were summed to obtain their cost over the 7-year IDSR period, and the same was done for the period 1996–99.
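As a hedged illustration of this arithmetic, the sketch below (Python) annualizes period totals, converts them to per capita inputs and applies the straight-line 5% depreciation described above. The worked example uses the totals in Table 5; dividing the 2000–07 total by 7 years and the 1996–99 total by 4 years reproduces the per capita figures reported there and in the abstract.

```python
def mean_annual_cost(total_cost: float, n_years: int) -> float:
    """Annualize a period total (e.g. all IDSR costs for 2000-07)."""
    return total_cost / n_years

def per_capita_input(mean_annual: float, mean_population: float) -> float:
    """Mean annual cost per capita."""
    return mean_annual / mean_population

def annual_depreciation(capital_cost: float, rate: float = 0.05) -> float:
    """Capital items (vehicles, office equipment) depreciated at 5% per year
    over a 10-year useful life, as described above."""
    return capital_cost * rate

# Worked example using Table 5 totals (national + regional + district for 2000-07)
idsr_total = 845_927 + 556_917 + 2_366_152
print(round(per_capita_input(mean_annual_cost(idsr_total, 7), 25_000_000), 4))  # ~0.0215
print(round(per_capita_input(mean_annual_cost(371_300, 4), 20_000_000), 4))     # ~0.0046
```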

Results

IDSR implementation at the national level

Timeliness and completeness of reporting

We noted improvements in completeness (the proportion of units required to report that actually reported) and timeliness (the proportion of units that reported disease information by the date it was due) in the analysis of IDSR outputs and outcomes. For example, the completeness of weekly reporting from districts to the national level improved from 48% in 2001 to 85% in 2007, while district monthly reporting remained consistently high (90% in 2001, peaking at 99% in 2004 and standing at 94% in 2007). The qualitative results attributed this success to the integrated reporting system adopted at the inception of IDSR in 2000, the multiple communication channels (radio call, telephone, fax and internet) between the district and national levels, and the increased technical support at the start-up of IDSR. Almost all disease programmes at the MOH and in the districts gradually bought into the integrated system through technical support and reliance on the system for analysed disease data, resulting in good performance indicators. However, timeliness of district reporting, which had steadily increased to 93% by 2005, dropped to 53% in 2007 (Table 2). Key informants attributed this to the establishment of several new districts with inadequate technical capacity and a drop in funding at the national level.

Response to epidemics

From 2003 to 2007, 83 suspected outbreaks were reported to the national level, of which 78% (65/83) were investigated within 48 hours of notification and responded to by the National Rapid Response Team. These outbreaks included cholera (19), meningococcal meningitis (11), dysentery (11), measles (8), plague (6), viral haemorrhagic fevers such as Marburg and Ebola (5), and one each of hepatitis B, hepatitis E, anthrax, methanol poisoning and syphilis. The numbers of outbreaks experienced and investigated within 48 hours of notification, by year, were 16/21 (76.2%) in 2003, 15/20 (75.0%) in 2004, 14/17 (82.4%) in 2005, 11/14 (78.6%) in 2006 and 9/11 (81.8%) in 2007. We observed a gradual decrease in the number of outbreaks recorded and an increase in the proportion of outbreaks investigated within 48 hours of notification from 2003 to 2007.

Key informant interviews attributed the improvements in epidemic preparedness and response to better availability of equipment, reagents and other laboratory supplies at the Central Public Health Laboratories, the Uganda Virus Research Institute and the Faculty of Veterinary Medicine at Makerere University.

Laboratories at both the Uganda Virus Research Institute and the Faculty of Veterinary Medicine were upgraded to Biosafety Level 2, with plans for further upgrading to Biosafety Level 3. The Central Public Health bacteriological laboratories have maintained their capacity for culture and sensitivity testing of all samples shipped from a network of laboratories with an effective specimen referral system. Timely response resulted in rapid containment of the 2007 Marburg epidemic and control of meningitis epidemics in the same year. The CFR for the tracer epidemic diseases was reduced: for cholera, the CFR dropped from 7% in 2001 to 2% in 2007, and for meningococcal meningitis, it dropped from 16% in 2001 to 4% in 2007 (Table 2). However, the attack rate for reported and confirmed cholera outbreaks increased gradually over the period, from 456.8 per 100 000 in 2001 to 1019.9 per 100 000 population in 2007. The attack rate for confirmed meningococcal meningitis outbreaks peaked in 2002, declined through 2004, and then rose again from 2005 to 2007 (Table 3).

Table 3. Trend of attack rates for cholera and meningococcal meningitis for confirmed outbreaks, 2001–07

Year | Cholera: population at risk | Cholera: total no. of cases | Cholera: attack rate per 100 000 | Meningococcal meningitis: population at risk | Meningococcal meningitis: total no. of cases | Meningococcal meningitis: attack rate per 100 000
2001 | 74 429 | 340 | 456.8 | 146 822 | 1053 | 717.2
2002 | 136 513 | 315 | 230.7 | 193 866 | 4254 | 2194.3
2003 | 116 546 | 651 | 558.6 | 230 582 | 3390 | 1470.2
2004 | 58 645 | 274 | 467.2 | 265 023 | 3711 | 1400.3
2005 | 58 957 | 174 | 295.1 | 173 612 | 3060 | 1762.6
2006 | 111 938 | 1396 | 1247.1 | 198 331 | 5902 | 2975.8
2007 | 162 965 | 1662 | 1019.9 | 209 548 | 7805 | 3724.7
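As a consistency check on the per-100 000 convention used in Table 3, the 2007 rates can be recovered directly from the reported cases and populations at risk:

```latex
\[
AR_{\text{cholera},\,2007} = \frac{1662}{162\,965}\times 100\,000 \approx 1019.9,
\qquad
AR_{\text{meningitis},\,2007} = \frac{7805}{209\,548}\times 100\,000 \approx 3724.7 .
\]
```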

Data management and feedback

The surveillance unit has enhanced its capacity for data management and ensured that appropriate computers for data management and reporting, radio communication equipment, a working telephone and an internet connection are all in place. In 2002, the national weekly disease table was introduced as a regular Monday feature in the New Vision, Uganda's daily newspaper with the widest circulation, at an average cost of US$550 per weekly publication from 2002 to 2004 (Figure 1). MOH staff, managers and politicians at national and district levels noted that this weekly publication served as an advocacy tool and was useful in highlighting the MOH's response to epidemic threats. Key informants at the Epidemiological Surveillance Division attributed the increased completeness and timeliness indicators during that period to the fact that the Monday publication kept districts and HSDs on their toes to avoid being conspicuously absent from the widely publicized disease table.

Figure 1. Clip from New Vision, Uganda's main daily newspaper, showing the Ministry of Health weekly disease table for the benefit of the public and decision makers at national and district levels

Financial resources

Our examination of financial resources indicated a ten-fold decline in government funding from the 2000–01 to the 2007–08 budget, as shown in Figure 2. Key informants in the Epidemiology Surveillance Division noted that current levels of funding to the Epidemiology Surveillance Division in the Ministry of Health are rather inadequate for implementation of the routine surveillance functions. The division mobilized additional resources from different partners to make up for central budget shortfalls in order to support implementation of planned activities (Table 4).

Figure 2. Epidemiological Surveillance Division (ESD) budget allocation, financial years 2000/01 to 2007/08, showing the gradual decline in budget allocation (US$)

Table 4. Summary of funding sources other than government for IDSR implementation, 2002–04

Source | Amount | Period | Comment
Rockefeller Foundation | US$70 000 | 2002 | Funds were aimed at improving communication and coordination. They were shared between ESD, Resource Centre, UNHRO and IPH.
WHO Geneva | US$60 000 | 2002 & 2004 | This support was meant to accelerate IDSR activities and was code-named IDSR-LITE.
CDC Atlanta | US$80 000 | 2002–04 | This support was through IPH for outbreak investigation and response activities.
USAID | US$450 000 | 2003–04 | This support was given through WHO and meant to improve selected IDSR functions.

Notes: ESD = Epidemiological Surveillance Division, Ministry of Health; IPH = Institute of Public Health; UNHRO = Uganda National Health Research Organization.

Technical officers from the MOH reported that major outside support was provided by the United States Agency for International Development (USAID, through WHO in 2003 and 2004), the US CDC (through Makerere University School of Public Health and the African Field Epidemiology Network from 2004 to 2006) and WHO for IDSR training for health workers in northern Uganda, annual surveillance review meetings and epidemic investigation and response as needed. The support from these different sources has been incorporated with the government funding shown in the cost analysis. Table 5 shows the costs of surveillance activities in the periods before and during IDSR implementation. We note that the funding input for surveillance was minimal for surveillance activities before 2000. However, there was considerable input into the IDSR by the government of Uganda as well as development partners, which was shared between the national, district and health centre levels. Several key informants mentioned the successful dissemination of the 2000 baseline assessment as the main tool used to advocate for IDSR, as well as the regular IDSR meetings at national level. It was pointed out that these enabled the decision makers to pay attention to the well-developed Plan of Action and mobilized funding for implementation of IDSR both from government and from development partners. Key informants pointed out the continuous involvement of partners, especially during emergencies, and support for Master of Public Health (MPH) fellows as some of the initiatives that have provided a boost for the otherwise declining funding.

Table 5. Comparison of costs (US$) before (1996–99) and during (2000–07) IDSR implementation

Item | 2000–07: National | 2000–07: Regional | 2000–07: District | 1996–99: All levels*
Training | 30 575 | 66 630 | 69 000 | 60 000
Radio communication | 20 000 | 20 000 | 109 000 | 20 000
EPR kits | 27 000 | – | 30 000 | 20 000
Investigation and response | 255 000 | 90 000 | 198 252 | 100 000
Computers & office equipment | 14 535 | 46 445 | 20 000 | 5 300
Vehicle and maintenance | 152 915 | 160 000 | 100 000 | 92 000
Laboratory reagents | 55 000 | 30 000 | 230 000 | 15 000
Support supervision | 75 000 | 66 630 | 80 000 | 36 000
IDSR training | 23 000 | 12 000 | 95 000 | –
Feedback | 26 000 | 5 000 | 5 000 | 5 000
Technical assistance (WHO) | 12 000 | 24 212 | 4 900 | –
Personnel support (7 years) | 154 902 | 36 000 | 1 425 000 | 18 000
Total | 845 927 | 556 917 | 2 366 152 | 371 300

Computed costs | 2000–07 | 1996–99
Average annual cost | 538 428 | 92 825
Average population | 25 000 000 | 20 000 000
Per capita input | 0.02153712 | 0.00464125

Notes: EPR = Epidemic Preparedness and Response.

*Includes national, district and regional operational levels.

IDSR implementation at district and health facility levels

The results of district-level IDSR indicators, summarized in Table 6, show improvements in a range of indicators compared with the baseline indicators for 2000. Similarly, improvements were observed from the health facilities, as shown in Table 7.

Table 6. District-level IDSR performance indicators: 2000 baseline survey compared with 2004 survey, Uganda

IDSR indicator | Baseline 2000, % (n/N), N = 8 | Performance 2004, % (n/N)
Laboratory coordinator present | 0 (0/8) | 90 (18/20)
Databases observed | 0 (0/8) | 75 (15/20)
Trend line graphs observed | 75 (6/8) | 75 (15/20)
Description of data by place observed | 63 (5/8) | 40 (8/20)
Description of data by age and sex observed | 0 (0/8) | 30 (6/20)
Demographic data at site observed | 0 (0/8) | 85 (17/20)
Derived rates from demographic data observed | 38 (3/8) | 55 (11/20)
Rapid Response Teams (RRTs) | 0 (0/8) | 80 (16/20)
Suspected outbreaks investigated within 48 hours of notification | 0 (0/8) | 69 (9/13)*
Response to suspected epidemics within 48 hours of notification observed | 25 (2/8) | 46 (6/13)*
Functional epidemic preparedness committee | 0 (0/8) | 65 (13/20)
Sent feedback to lower levels | 15 (1/8) | 55 (11/20)
District medical officers trained in IDSR | 0 (0/8) | 50 (10/20)

Note: *Only 13 districts reported experience with outbreaks in the past 1 year.

Table 7. Health facility IDSR performance indicators: 2000 baseline survey compared with 2004 survey, Uganda

Indicator | Baseline 2000 (%) | Performance 2004 (%)
Outpatient department registers observed | 92 (48/52) | 98 (214/217)
Outpatient department registers correctly filled | 56 (29/52) | 61 (132/217)
Standard case definition booklets observed (new indicator in 2004) | No report | 40 (86/217)
Standard Uganda clinical guidelines observed | 35 (18/52) | 81 (175/217)
Health facility databases observed (new indicator in 2004) | No report | 75 (163/217)
Ability to confirm malaria observed* | 51 (27/52) | 55 (79/144)
Ability to confirm meningococcal meningitis observed* | 21 (10/52) | 14 (20/144)
Ability to confirm tuberculosis by Ziehl–Neelsen (ZN) stain* | 44 (23/52) | 52 (75/144)
Ability to confirm HIV by serology* | No report | 36 (51/144)
Analysed data (at least one priority disease) observed | 10 (5/52) | 47 (102/217)
Observed health facilities with demographic data at site | No report | 50 (108/217)
Proportion of health facilities with in-charge staff trained in IDSR | No report | 21 (45/217)

Note: *Indicator observed for health facilities with a functional laboratory (n = 144).

Epidemic preparedness and response

By the time of the survey in 2004, 80% (16/20) of the districts had rapid response teams. In addition, 69% (9/13) of the districts investigated suspected outbreaks with laboratory confirmation within 48 hours of notification. Response to epidemics within 48 hours improved from 25% (2/8) in 2000 to 46% (6/13) in 2004. Designated district laboratory co-ordinators were in place in 90% (18/20) of the districts. At health facility level, laboratory confirmation for malaria, meningococcal meningitis and tuberculosis remained low, at levels similar to the 2000 assessment: 55% (79/144), 14% (20/144) and 52% (75/144), respectively (Table 7). Qualitative findings attributed the continued poor IDSR indicators at health facility level in general, and poor laboratory performance indicators in particular, to a failure by districts to recruit trained personnel, which was in turn attributed to limited funding for district staff recruitment.

Data analysis and training

Results also show that 50% (10/20) of the District Health Officers from the sampled districts had been trained in IDSR skills. We observed an increase in the presence of databases from 0% in 2000 to 75% (163/217) in 2004 in the sampled health facilities. Health facilities with evidence of analysed data improved from 10% (5/52) in 2000 to 47% (102/217) in 2004 (P < 0.01). Similarly, 75% (15/20) of the districts in 2004 had databases. Regarding confirmation of meningitis, there was a decrease from 21% (10/52) in 2000 to 14% (20/144) in 2004 among the sampled health facilities. Key informants again attributed this to inadequate laboratory capacities, especially trained personnel, in several new districts.

Trend lines were observed in 75% (15/20) of the districts, and availability of derived rates from demographic data improved from 38% (3/8) in 2000 to 55% (11/20) in 2004 (P < 0.05). Availability of trend lines indicates that the districts are using the data collected to monitor the disease trends and could thus quickly detect unusual increases. Derivation of rates is evidence of some data analysis.

Reporting and feedback

We observed that feedback at district level improved from 15% (1/8) in 2000 to 55% (11/20) in 2004. Feedback involves newsletters sent to districts from the national level including some analysed data from the district, comparisons with other districts, and written communication about the data received from the district. At the health facility level, outpatient registers were observed in 98% (214/217) of the units surveyed, [compared with 92% (48/52) in 2000], and the availability of standard clinical guidelines improved from 35% (18/52) in 2000 to 81% (175/217) in 2004 (P < 0.01). Standard clinical guidelines refer to the MOH recommended treatment for different diseases and guide health workers in appropriate case management.

Qualitative results suggested that the improved IDSR indicators at district level were due to IDSR training support and the existence of surveillance and health management information system (HMIS) focal persons, whose activities were supported by the WHO country office.

Discussion

While some indicators have eroded and substantial challenges remain, this review demonstrates that significant successes were achieved during the first eight years of IDSR implementation in Uganda. There is a functional IDSR system in place and advocacy has been established, resulting in a marked improvement in IDSR performance. For example, at the national level, there is commendable progress in the implementation of IDSR, as demonstrated by several indicators such as the completeness and timeliness of data reporting from the districts. There has also been more timely detection of and response to acute outbreaks of infectious diseases. While IDSR was initially funded more fully by the government, and the subsequent decline in government input was temporarily offset by contributions from other sources, the most substantial improvements in IDSR performance indicators occurred in the context of stronger financial support than currently exists. This suggests a link between high political and financial commitment and steady improvement in IDSR performance. Similarly, at district and health-facility levels, performance on indicators such as case or event detection, laboratory surveillance, data analysis, training, and outbreak investigation and response was satisfactory.

However, it was also observed that some critical indicators, including the capacity to confirm priority diseases such as malaria and tuberculosis, were still very low, suggesting that the laboratory component of IDSR in Uganda needs further strengthening (Vogt 1996; Shears 2000a; Kariuki Njenga et al. 2008; Nsubuga et al. 2009).

The rapid increase in the number of districts, from 45 in 2000 to 80 in 2006, has created new challenges for the implementation of IDSR. Whereas decentralization aims at bringing services closer to the people, maintaining a functional surveillance system amidst such rapid changes can be very challenging and requires adequate advance planning (Anokbonggo et al. 2004). New surveillance officers had to be identified and trained for each new district, and the surveillance infrastructure reviewed to handle this change. The new surveillance officers needed logistical and transport support to supervise IDSR implementation, and in some situations such support was not available.

The review of financial resources revealed a steady decline in funding allocations at the national level. Such a decline in funding support is likely to affect the performance of the IDSR strategy in Uganda. There are already early indications that diminished funding is contributing to the erosion of gains made by IDSR, as evidenced by the decline in the timeliness and completeness of reporting in 2005–07, after initial gains from 2000 to 2004. To reverse this decline, renewed efforts are needed by the government, development partners and districts to develop new plans of action with definite financial and technical commitments (Shears 2000b).

Although the vertical surveillance systems existing before the inception of IDSR presented a challenge to integrated surveillance, the readiness of several funders to adopt the IDSR model appears to favour integration. The established structures from community to district and national level appear set to overcome this challenge. It is important to note, however, that programmes intent on a vertical surveillance approach still exist, mainly because of selected external vertical funding. The cost analysis results in Table 5 show that minimal funding trickled down to lower levels from the vertical surveillance systems operating before IDSR was started. Comparison of the mean costs for the periods before and after 2000 revealed that, although per capita input improved, funding remained low, as seen in a similar analysis by Somda et al. (2009). Even at this low cost of implementing IDSR, the benefits reflected in the results above appear to be substantial.

Despite the challenges observed, there has been a demonstrable improvement in the IDSR performance over the period 2001 to 2007 at all levels. Such a functional IDSR system was the basis for the early and quick detection, investigation and response to epidemics such as Marburg and Ebola viral haemorrhagic fevers in 2007, and likely contributed to the decreasing CFR of the tracer epidemic diseases.

WHO/AFRO has agreed to use IDSR as the vehicle for core capacity strengthening for the implementation of the revised International Health Regulations (2005) (The Lancet 2007; Kicman-Gawlowska 2008; Kicman-Gawlowska 2009). However, without sustained human, financial and logistical support, as well as institutionalization in the core agencies and budgets of public health, the successes and gains made in recent years could be lost and the core capacities for the International Health Regulations (2005) may not be achieved. Uganda has been and should continue to be a model country for IDSR in Africa (Weekly Epidemiological Record 2003). Renewed political and financial commitment is needed to sustain these successes.

We therefore conclude that improvements in IDSR resulted in better preparedness and more timely detection and response, which translated into reduced CFR. These gains were achieved through a relatively high level of funding, technical support from CDC and WHO, and in-service training in IDSR core functions. Continued support is needed to maintain and expand the gains made through IDSR implementation.

Funding

The implementation of IDSR and its performance monitoring and evaluation were funded by the Government of Uganda, the World Health Organization (WHO), the US Centers for Disease Control and Prevention (CDC), the USAID African Bureau and the Rockefeller Foundation. The review of the data and the writing of this manuscript were supported financially by the Ministry of Health, WHO and CDC. WHO and CDC funded a writing workshop where the authors started writing the manuscript; they also provided technical support for data analysis and interpretation, and contributed to the writing of the manuscript. Pre-submission review and clearance was also done by CDC-Atlanta.

Conflict of interest

None declared.

Acknowledgements

Several individuals, institutions and organizations provided technical guidance and logistical support in establishing IDSR in Uganda. In addition some participated in this study through data collection and by providing meaningful discussions during the proposal and result presentations in the different meetings. We gratefully acknowledge the Ministry of Health in Uganda, specifically Dr Sam Zaramba, Director General; Dr Sam Okware, Commissioner of Community Health; and Dr Dennis K W Lwamafa, Commissioner for National Disease Control. We also acknowledge the support and guidance of the World Health Organization (WHO) Country Office, the WHO Regional Headquarters in Africa (AFRO), and the WHO Headquarters in Geneva particularly Dr Stella Chungong and Dr Idrissa Sow.

We deeply appreciate the funding and support from USAID, in particular, Ms Mary Harvey (USAID), Rockefeller Foundation, and technical support from Dr Sambe Duale (Africa 2010, Tulane University). Many colleagues from CDC-Atlanta and CDC-Uganda contributed technical and logistical support to Uganda’s IDSR program especially Dr Mark White. Several organizations supported IDSR in general in Uganda and this study in particular, including Makerere University School of Public Health, the Africa Field Epidemiology Network (AFENET), UNICEF, African Medical and Research Foundation (AMREF), and the Uganda Virus Research Institute (UVRI).

References

Anokbonggo WW, Ogwal-Okeng J, Obua C, Aupont O, Ross-Degnan D. 2004. Impact of decentralization on health services in Uganda: a look at facility utilization, prescribing and availability of essential drugs. East African Medical Journal Suppl: S2–7.

Centers for Disease Control and Prevention (CDC). 2000. Assessment of infectious disease surveillance—Uganda, 2000. MMWR Morbidity and Mortality Weekly Report 49: 687–91.

Centers for Disease Control and Prevention (CDC). 2008. What is Epi Info? Online at: http://www.cdc.gov/epiinfo/, accessed 19 March 2012.

Kariuki Njenga M, Traicoff D, Tetteh C. 2008. Laboratory epidemiologist: skilled partner in field epidemiology and disease surveillance in Kenya. Journal of Public Health Policy 29: 149–64.

Kicman-Gawlowska A. 2008. The surveillance of communicable diseases within the International Health Regulations. Przegl Epidemiology 62: 739–49.

Kicman-Gawlowska A. 2009. National IHR Focal Point. Przegl Epidemiology 63: 143–7.

MOH Uganda. 1999. National Health Policy. Kampala: Ministry of Health. Online at: http://www.healthresearchweb.org/files/National_Health_Policy_1999.pdf, accessed 19 March 2012.

MOH Uganda. 2000. Integrated Disease Surveillance Action Plan. Unpublished. Kampala: Ministry of Health.

MOH Uganda. 2001. Health Sector Strategic Plan I. Kampala: Ministry of Health. Online at: http://www.health.go.ug/docs/HSSPfinalEdition.pdf, accessed 19 March 2012.

Nsubuga P, Brown WG, Groseclose SL. 2009. Implementing Integrated Disease Surveillance and Response: four African countries' experience, 1998–2005. Global Public Health 5: 364–80.

Perry HN, McDonnell SM, Alemu W. 2007. Planning an integrated disease surveillance and response system: a matrix of skills and activities. BMC Medicine 5: 24.

Shears P. 2000a. Emerging and reemerging infections in Africa: the need for improved laboratory services and disease surveillance. Microbes and Infection 2: 489–95.

Shears P. 2000b. Communicable disease surveillance with limited resources: the scope to link human and veterinary programmes. Acta Tropica 76: 3–7.

Somda ZC, Meltzer MI, Perry HN. 2009. Cost analysis of an integrated disease surveillance and response system: case of Burkina Faso, Eritrea, and Mali. Cost Effectiveness and Resource Allocation 7: 1.

The Lancet. 2007. International Health Regulations: the challenges ahead [Editorial]. The Lancet 369: 1763.

Uganda Bureau of Statistics. Online at: www.ubos.org/, accessed 19 March 2012.

Vogt RL. 1996. Laboratory reporting and disease surveillance. Journal of Public Health Management Practice 2: 28–30.

Weekly Epidemiological Record. 2003. Implementing integrated disease surveillance and response. Weekly Epidemiological Record 78: 232–40.

WHO. 2002. Documentation of Integrated Disease Surveillance and Response implementation in the African and Eastern Mediterranean Regions. Report of a WHO meeting, 4–15 November 2002, Harare, Zimbabwe. Geneva: World Health Organization.

WHO. 2004. Overview of the WHO framework for monitoring and evaluating surveillance and response systems for communicable diseases. Weekly Epidemiological Record 79: 322–6.

WHO AFRO. 1999. Integrated Disease Surveillance Strategy in the African Region: a regional strategy for communicable diseases 1999–2003. Harare, Zimbabwe: WHO Regional Office for Africa.

WHO AFRO. 2000. Assessment protocol for national disease surveillance systems and epidemic preparedness and response. Harare, Zimbabwe: WHO Regional Office for Africa.

WHO AFRO, CDC. 2001. Technical Guidelines for Integrated Disease Surveillance and Response in the African Region. Harare, Zimbabwe: WHO Regional Office for Africa.