Learning from other projects?
16 November, 2008
In October I visited and reviewed the progress of UNICEF’s Women and Children’s Health Program in Papua (WCHPP), as part of a wider team including staff from UNICEF, AusAID, GTZ and specialists in maternal and neonatal health. This is the third of a series of such reviews conducted annually since 2006.
How is that relevant to Katine, a very different community in a very different country, assisted by a very different organisation, funded by a very different donor?
Well, it does seem that the two projects do share some common issues, which I will list here in the hope that someone might react to them (including the good staff of AMREF Soroti).
- Both projects involve the assisting agency working closely with government structures. In both project designs there is an important role for community mobilisation and empowerment, especially in relation to health matters. In Papua, UNICEF wants to increase public awareness and use of maternal and child health services. It sees increasing demand for services as an essential complement to increasing the supply of such services. I think AMREF has a similar view in Katine. But progress with this part of the project in Papua has been slow. UNICEF has led, but the government has not been as enthusiastic about this part of the program as about others. The review team raised questions as to whether UNICEF should continue working in this area, especially given that its staff were already overburdened with work. It might be better to find or encourage an independent NGO to work on the demand side, especially one that was more independent of the government.
- Both projects are engaged in many activities, so much so that it was not easy at first glance to appreciate the strategy guiding these activities. In the WCHPP there were two overlapping strategies. One could be described as humanitarian assistance. This involved funding support for staff training, repair of health facilities, procurement of supplies, etc. All of this was useful, and appreciated, but it was essentially gap filling, filling in where government funds should have been at work. Government funds were available, but they were being invested elsewhere, in new hospital buildings and in sectors other than health. The other strategy was the introduction of new and improved ways of doing things (computerised health information systems, partnerships between midwives and traditional birth attendants, etc). This is where UNICEF was arguably doing more development work, potentially adding value to the way in which government health systems were already working.
- With both the humanitarian and development work there were some important limitations that needed to be addressed. With the humanitarian work, there were no agreed performance targets or milestones that UNICEF would help the government reach before phasing out its assistance. For example, at what point do you stop supporting the operations of a sub-district health centre and move your assistance elsewhere? With the development work, the introduction of new or improved methods was not consistently associated with an assessment of existing methods, followed by a comparison of the value of the new method. Nor was there any systematic packaging and promotion of the results of studies that were carried out. De facto "experiments" were being carried out, but they were not being systematically assessed and their results publicised. This is despite the fact that the idea of development projects as "policy experiments" has been around for decades (see Rondinelli, 1993).
- There was a third feature that seemed to be shared by both projects, relating to decentralisation. Both Uganda and Indonesia have decentralised government administration, Indonesia probably the more radically of the two. If ever there was a need for decentralised administration, it would be in Papua, which is culturally and geographically a long way from Jakarta. Yet UNICEF's approach to providing assistance to its Papua office was remarkably centralised. A KAP (Knowledge Attitude Practice) survey was centrally commissioned for the whole of Indonesia as one entity, and this will be used to measure the impact of health education and awareness-raising activities in Papua (and elsewhere), even though the issues facing women and children there are not carbon copies of those found elsewhere in Indonesia. An M&E framework was introduced, exactly as used by similar UNICEF projects elsewhere in Indonesia. Yet the capacities of local government, and of the associated health services in Papua, bear no comparison with those found in Java. All this raises the question of how much we can expect a highly centralised aid agency to help a highly decentralised government.
- The fourth feature related to the behaviour of UNICEF and GTZ staff as members of the annual review teams mentioned above. While they have now participated in reviews of each other's projects for three years in a row, it still surprises me how wedded they remain to the view that their own approach is the best possible under the circumstances, and how little acknowledgement (and even less follow-up action) they have given to potentially positive features of each other's projects. This experience resonates with what we have seen on the Guardian blog, in the way that AMREF has responded to comments by another NGO that it is possible to build schools significantly more cheaply than AMREF has done. That is, with a little interest and enthusiasm to learn, and a considerable amount of defensiveness.
In my report on my second visit to Katine in August 2008 I raised a very similar set of issues:
- The risks of overly centralised management of the project within AMREF
- Questions about the extent to which AMREF should do community empowerment work, and whether it might be better done by another party more independent of government
- The need to set targets for what AMREF was trying to achieve, in consultation with its local partners
- The need to take the idea of generating development models seriously, by building operational research into all new activities that were trying to do things differently
The issues raised here also relate to my argument in a recent posting: "How well is the KCPP doing? Compared to what?" In that posting I argued that comparisons of AMREF's performance with other projects could be as important as comparisons with its own stated intentions and objectives.