Evaluating the Guardian’s role in the Katine project
14 July, 2008
For understandable reasons, most development projects have objectives that are focused on changing the lives of people they want to assist: usually poor and disadvantaged communities. Yet development projects often involve partnerships between multiple organisations, located at local and national levels in the assisted country, and further afield. Donor organisations are often based in a different country altogether.
Rarely are donors asked to specify objectives about their own role, and to assess their performance in terms of the achievement of those objectives. Yet many aspects of their activities can be important, affecting how the recipient organisations are able to do their work. Examples include the speed and efficiency of aid transfers, and the scale and complexity of their information requirements.
The Guardian is an unusual donor in many respects. Unlike many more traditional donors, it is not a "hands off" donor, wanting only to receive a project proposal, then periodic progress reports, then a final evaluation report, perhaps with a brief field visit once a year. Instead, the Guardian has hired a Ugandan journalist to report from Soroti district two weeks out of every month. It has hired an external evaluator to make field visits every six months. Its own staff are making frequent visits to Soroti. And in addition, AMREF will be providing six-monthly progress reports to the Guardian. All these activities involve costs, both to the Guardian and to AMREF, both direct and indirect (e.g. staff spending time with visitors rather than on their own programme of work).
Given these costs, a useful question that can be asked of all donors (and not just the Guardian) is: Okay, so what did you do with all the information that you obtained via these various channels? Are the costs of these activities justified by some benefits? If so, what are they? Interestingly, I suspect the Guardian may be in a better position than most traditional donor organisations to answer this question. There is the Guardian Katine website and blog, which is updated at least weekly, and often almost daily, and which is the primary means of fund-raising. The scale and depth of this website stands in dramatic contrast to many donor websites, which might at best have a static description of the projects they fund, occasionally updated. Options for interactivity are normally negligible or non-existent. Overall, the level of public transparency (of the aid process) provided by most donor websites is very limited.
While the Guardian website seems to be going well, evaluating its performance at present is still quite a challenge, at least to me. One reason is that, as far as I know, no one has got around to explicitly documenting the objectives for the website, which would then enable some form of monitoring and evaluation of its performance. Websites are simply part of what the Guardian does, it would seem.
Nevertheless, it is clear that the Guardian staff do have some conceptions of what good performance looks like for this website, and others. These relate to the numbers of visitors, the numbers of comments posted, their quality, the amount of money raised via the website, etc. Other parties, like the One World Trust, also have their views on the value of the website, having awarded it the New Media Award for 2008. The jury noted:
“Katine [website] does a brilliant job of bringing ordinary people from a small African village into global conversations. This 2-way communication is the hallmark of an interesting web project. It succeeds in engaging a wide readership, as testified by the remarkable level of public donations, but above all it brings complex and subtle arguments about development and power into a public space where policy makers meet, engage and debate with both specialists and ordinary people. Katine feels like it has an impact on decision makers.
The quality of debate is remarkably high. Informative and challenging discussions, stimulated by the invitation of knowledgeable contributors, are testament to the engagement of development policy makers. The site is visually accessible, it gives the feeling of being able to interact at village level. There is great story-telling with high production values. The Katine project shows a route for other non-profits to follow. It makes real impact and conveys a feeling of real change.”
This quote is interesting primarily for the potentially useful performance criteria it introduces. How well the website is actually doing on some of these criteria is not yet well documented. Perhaps more important still is the need to come to some agreement with other Katine stakeholders (especially AMREF) about the relative importance of these performance criteria.
Is it worth paying more attention to the monitoring and evaluation of the Guardian’s Katine website? There are at least three reasons for arguing yes. Firstly, if intended achievements were more explicit and prioritised, and actual achievements more carefully monitored and documented, it seems likely that AMREF would be more accepting of some of the costs it feels it has incurred so far. The discussion could move on from a focus on costs to a focus on cost-effectiveness. Secondly, analysis of performance could help the Guardian further improve its own performance, by giving it a clearer idea of what it wants and how well it is doing so far. Thirdly, the whole Guardian experience of the Katine website could be analysed, documented and communicated to other donor organisations who, it could easily be argued, should be learning from this unique experiment so they can become better donors.