The evaluator evaluated

17 November, 2008

Here are…

AMREF REFLECTIONS ON 2ND EXTERNAL EVALUATION OF KCPP

tabled at the Quarterly Governance meeting, Barclays, 1 Churchill Place, 15th October 2.30-4.30 pm

along with my responses in red

Strengths

  • Mobilised different stakeholders in AMREF to critically reflect on progress and key priorities over the next 12 months – key development questions of what, when and why
  • Acknowledged the different pieces of work going on in Katine and the sheer volume of the work
  • Participatory and inclusive – got insights from different stakeholders and beneficiary groups
  • The process of responding to the report enabled the whole of AMREF to work together and crystallise and document our approach to development with a particular focus on KCPP
  • Useful recommendations and learning on how to improve our M&E processes

Areas for improvement

  • Very long/heavy report – really difficult to analyse especially when English is not a first language
    • There is a four page summary at the front, and a contents page, and separate annexes
    • The next report will aim to be 20 pages max (excluding Exec Summ and Annexes)
      • But this proposal is subject to discussions re plans for the AMREF Mid Term Review (MTR)
  • Intensive process/time consuming for different stakeholders in KCPP (PIT, country office team, AMREF HQ, AMREF UK, Farm Africa).
    • Following discussions with Guardian in August my aim now is to progressively reduce the frequency of visits and, where possible, to integrate these with AMREF review processes.
  • The evaluation visits are too frequent to enable staff on the ground to learn and take action before participating in another evaluation (so far we have had 2 evaluations in 7 months; most 3-year projects have 2 – midterm and end of project). KCPP will have 8 over the 3 years (6 plus midterm and end of project). This is not feasible if we want staff to value the evaluations and deliver on the specific targets in their workplans
    • Following discussions with Guardian in August my aim now is to progressively reduce the frequency of visits and, where possible, to integrate these with AMREF review processes.
  • Analysis of KCPP did not fully take into account the wider AMREF context (its policies and procedures)
    • Examples of important missing contextual information would be useful
    • Not sure how this gap could be addressed while also reducing the size of the report
  • Analysis of KCPP did not fully take into account the wider development context of Katine, Soroti district and Uganda
    • Examples of important missing contextual information would be useful
    • Not sure how this gap could be addressed while also reducing the size of the report
  • The evaluation focussed a lot on the processes. For the staff, who are working under difficult circumstances, highlighting the interim outcomes – both tangible and intangible – could have been more motivating
    • In the early stages of a project it is the work processes that are most visible and important, and outcomes tend to be less visible. Future visits should focus progressively more on outcomes and impact.
  • In some instances the report did not take into account key sensitivities about staff and the impact on the relationships that AMREF has with different stakeholders
    • Details are needed here before I can respond. But on reflection it seemed as though it was sections of AMREF who were the most sensitive, and sections of government who were quite robust (in wanting their views expressed).

Suggestions for the future

  • Make the report shorter and in simple, user-friendly language (maximum 10 pages)
    • 10 pages is too short. 20 pages is more realistic. But final agreement here will depend on the ambit of the next visit, which is under discussion.
  • Consider the implications of the evaluation recommendations for the capacity of the PIT and the practical realities on the ground – the team must deliver initial targets and address new issues within very tight deadlines
    • Noted: Recommendations should be limited in number
  • The report has multiple target audiences. It would be good to have a user-friendly summary targeting our stakeholders in Uganda, especially the district officials (i.e. beyond the UK donors and the UK public reached through the website). Otherwise we stand to be accused of using extractive evaluation processes
    • Noted: There is a need for an Executive Summary that can fulfil this function (in addition to the existing list of Recommendations). But I am not sure if I should produce customised versions for different stakeholders. This might be better done by AMREF.
  • Place analysis of findings and recommendations within the contextual realities of Katine sub-county, Soroti District and Uganda development context.  Co-evaluation with someone from Africa/Uganda would add a lot of value
    • Need explanation of what “contextual realities” are being referred to, and how they would be covered within a 10 page, or 20 page, report
    • Co-evaluation with “someone from Africa/Uganda” will happen if we integrate my next visit with the AMREF mid-term review
  • Extend the time between evaluations and, if possible, limit their number, bearing in mind that we shall also have midterm and end-of-project evaluations
    • Agreed, as noted above
  • Next evaluation should focus more on interim outcomes and the foundations we are making for sustainability
    • Sounds appropriate at that point in the lifespan of the project. As part of this next review, I would like to see some systematic documentation of what has been done / happened in all the villages of Katine sub-county. A starting point would be a spreadsheet of villages x activities (including non-AMREF activities).
  • Clarify roles and scope of the evaluation – technical and programmatic; operations and management.
    • The Terms of Reference (ToRs) for each visit should be where this is done. Draft ToRs were shared with AMREF, Barclays and the Guardian for comment prior to each of the two visits to Katine so far. The same will be the case with visits in the future.

Learning from other projects?

16 November, 2008

In October I visited and reviewed the progress of UNICEF’s Women and Children’s Health Program in Papua (WCHPP), as part of a wider team including staff from UNICEF, AusAID, GTZ and specialists in maternal and neonatal health. This is the third of a series of such reviews conducted annually since 2006.

How is that relevant to Katine, a very different community in a very different country, assisted by a very different organisation, funded by a very different donor?

Well, it does seem that the two projects do share some common issues, which I will list here in the hope that someone might react to them (including the good staff of AMREF Soroti).

  • Both projects involve the assisting agency working closely with government structures. In both project designs there is an important role for community mobilisation and empowerment, especially in relation to health matters. In Papua UNICEF wants to increase public awareness and use of maternal and child health services. It sees increasing demand for services as an essential complement to increasing the supply of such services. I think AMREF has a similar view in Katine. But the progress with this part of the project in Papua has been slow. UNICEF has led, but the government has not been that enthusiastic about this part of the program, compared to others. Questions were raised by the review team as to whether UNICEF should continue working in this area, especially given the fact that their staff were already overburdened with work. It might be better to find or encourage an independent NGO to work on the demand side, especially one that was more independent of the government.
  • Both projects are engaged in many activities, so much so that it was not easy at first glance to appreciate the strategy guiding these activities. In the WCHPP there were two overlapping strategies. One could be described as humanitarian assistance. This involved funding support for staff training, repair of health facilities, procurement of supplies, etc. All of this was useful, and appreciated, but it was essentially gap filling, filling in where government funds should have been at work. Government funds were available, but they were being invested elsewhere, in new hospital buildings and in sectors other than health. The other strategy was the introduction of new and improved ways of doing things (computerised health information systems, partnerships between midwives and traditional birth attendants, etc). This is where UNICEF was arguably doing more development work, potentially adding value to the way in which government health systems were already working.
    • With both the humanitarian and the development work there were some important limitations that needed to be addressed. With the humanitarian work, there were no agreed performance targets or milestones that UNICEF would help the government reach before phasing out its assistance. For example, at what point do you stop supporting the operations of a sub-district health centre and move your assistance elsewhere? With the development work, the introduction of new or improved methods was not consistently associated with an assessment of existing methods, followed by a comparison of the value of the new method. Nor was there any systematic packaging and promotion of the results of studies that were carried out. De facto “experiments” were being carried out, but they were not being systematically assessed and their results publicised. This is despite the fact that the idea of development projects as “policy experiments” has been around for decades (see Rondinelli, 1993).
  • There was a third feature that seemed to be shared by both projects, relating to decentralisation. Both Uganda and Indonesia have decentralised government administration, Indonesia probably the most radically. If ever there was a need for decentralised administration, it would be in Papua, which is culturally and geographically a long way from Jakarta. Yet UNICEF’s response to providing assistance to its Papua office was remarkably centralised. A KAP (Knowledge Attitude Practice) survey was centrally commissioned for the whole of Indonesia as one entity, and this will be used to measure the impact of health education and awareness-raising activities in Papua (and elsewhere), where the issues facing women and children are not carbon copies of those found elsewhere in Indonesia. An M&E framework was introduced, exactly as used by similar UNICEF projects elsewhere in Indonesia. Yet the capacities of local government and associated health services in Papua bear no comparison with those found in Java. All this raises the question of how much we can expect a highly centralised aid agency to help a highly decentralised government.
  • The fourth feature related to the behaviour of UNICEF and GTZ staff as members of the annual review teams I mentioned above. While they have now participated in reviews of each other’s projects for three years in a row, it still surprises me how wedded they remain to the view that their own approach is the best possible under the circumstances, and how little acknowledgement (and even less follow-up action) they have given to potentially positive features of each other’s projects. This experience resonates with what we have seen on the Guardian blog, in the way that AMREF has responded to comments by another NGO that it is possible to build schools significantly cheaper than the way AMREF has done. That is, with a little interest and enthusiasm to learn, and a considerable amount of defensiveness.

In my report on my second visit to Katine in August 2008 I raised a very similar set of issues:

  • The risks of overly centralised management of the project, within AMREF
  • Questions about the extent to which AMREF should do community empowerment, and whether it might be better done by another party more independent of government
  • The need to set targets for what AMREF was trying to achieve, in consultation with its local partners
  • The need to take the idea of generating development models seriously, by building operational research into all new activities that were trying to do things differently

The issues raised here also relate to my argument in a recent posting: How well is the KCPP doing? Compared to what? In that posting I argued that comparisons of AMREF’s performance with other projects could be as important as comparisons with its own stated intentions and objectives.