This post originally appeared on the World Bank Development Impact Blog under the title "Are Impact Evaluations Enough? The Social Observatory Approach to Doing-by-Learning" by Vijayendra Rao on the 16th of June, 2014.

Impact Evaluations are just one of many important tools to improve “adaptive capacity.” To improve implementation, they need to be integrated with monitoring and decision support systems, methods to understand mechanisms of change, and efforts to build feedback loops that pay attention to both everyday and long-term learning. While there has been some scholarly writing and advocacy on this point, it has so far been more talk than action.

Building adaptive capacity, which is premised on a fundamental change in the culture of projects towards an honest engagement with evidence and data, can be the key to improving development effectiveness. To incubate this approach, the Social Observatory (SO henceforth) - a team* of economists, sociologists, behavioral scientists, and computer systems experts - has been collaborating closely for over two years with staff from the World Bank’s South Asia Region Livelihoods Team (SASDL) and government officials from the Indian states of Bihar and Tamil Nadu. Our goal has been to improve the adaptive capacity of “Livelihoods Projects,” in particular Jeevika in Bihar and the Pudhu Vaazhvu Project (PVP) in Tamil Nadu.

Livelihoods projects, which represent about $2 billion of the Bank’s current India work-program and have a twenty-year history, are the archetypal “complex intervention.” They work in a sequential manner: they first create a hierarchical network (a “federation”) of rural women’s self-help groups (SHGs) focused on micro-credit and savings, building on the assumption (increasingly being validated by various researchers) that well-developed SHG networks have as much of a social impact as an economic one. The projects proceed on the logic that SHGs act as a “highway” on which women-centric anti-poverty interventions can be rolled out – controlled and managed by the network. Indian livelihoods projects have over 30 such “vertical” interventions that attempt to tackle important problems such as food insecurity, lack of skills, malnutrition, climate adaptation, poor sanitation, and poor agricultural yields.

The SO’s approach is to integrate IEs within a broader system of learning and engagement that addresses other essential aspects of adaptive capacity.   In this, it is motivated by the following two principles: 

A)  Embedded Research:  We are driven less by the sexiness of a research question and more by attempting to build a project’s ability to learn. Research teams therefore engage, almost continuously, with project staff over a few years. The goal is to move the staff’s perception of researchers from overqualified outsiders who fly in whenever surveys need to be done to the project’s own private think-tank. This embedded approach allows for an ongoing dialogue between project and research teams; researchers constantly learn from project staff, provide feedback to them, and tailor research efforts to the project’s needs.

B) Interdisciplinarity:  As an integrated interdisciplinary team, the SO lets the question determine the method, rather than simply insisting that one tool represents the supreme standard. This allows us to assist with a variety of issues that help build the project’s adaptive capacity (I will limit myself here to a few illustrative examples):

1) Decision Support Systems (DSS):  Projects, particularly interventions that are highly subject to contextual variation and have uncertain trajectories of change, need to constantly track where they are failing and where they are succeeding.  A DSS allows project staff, from the village-level upward, to make everyday decisions on the basis of good data. 

A focus on developing a technically competent Management Information System (MIS) is necessary but not sufficient:  staff have incentives to fudge data; data are usually not available quickly enough to be useful for everyday decision making – six-month lags are typical; and data are not reported in a manner that staff can use to make everyday decisions. As I showed in a recent Policy Research Report on Localizing Development (co-authored with Ghazala Mansuri), monitoring systems are often non-existent or, at best, neglected. In our experience with the SO, when monitoring systems exist they are developed by an outsourced consultant to fulfill the World Bank’s requirements rather than actually used for regular decision-making. Hence our focus on DSS rather than MIS.

In Tamil Nadu, the PVP has developed a community-based system of data entry and validation, in which socio-economic profiles of SHGs and their credit records are entered every week at the village level by the women themselves and made available instantly in a central database. Community-based entry and validation therefore makes good-quality, high-frequency data available on a regular basis and, because the data are locally validated, their accuracy is not questioned.

The data are then reported in simple dashboards at the village, cluster, district, and state levels, allowing project staff across the hierarchy to assess the credit performance of women, track when villages have problems with fund disbursement, and check whether SHGs are being formed or are disbanding. If problems are noticed, they can be dealt with immediately.
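To make this concrete, here is a minimal sketch, in Python, of the kind of roll-up such a dashboard performs. The file, the column names, and the “stalled disbursement” flag are hypothetical illustrations, not PVP’s actual schema or rules:

```python
# A minimal sketch (not PVP's actual system) of rolling weekly SHG credit
# entries up into a village-level dashboard summary.
import pandas as pd

# Hypothetical schema: one row per SHG per week.
records = pd.read_csv("shg_credit_records.csv")

summary = (
    records.groupby("village")
    .agg(
        total_disbursed=("amount_disbursed", "sum"),
        weeks_reported=("week", "nunique"),
        active_shgs=("shg_active", "sum"),
    )
    .reset_index()
)

# Flag villages with no disbursement in the most recent week,
# so staff can follow up before the problem compounds.
latest = records["week"].max()
recent = records[records["week"] == latest].groupby("village")["amount_disbursed"].sum()
summary["flag_stalled"] = summary["village"].map(recent).fillna(0).eq(0)

print(summary.sort_values("flag_stalled", ascending=False).head())
```

The code matters less than the loop around it: because entries are made and validated weekly by the members themselves, a flag like this can be acted on within days rather than after a six-month reporting lag.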

2) Process Monitoring and Case-Studies:  Decision Support Systems are useful for tracking key indicators of project effectiveness, but they do not tell us much about how implementation can be improved. To address this, the SO has helped Jeevika and PVP develop an effective process monitoring system. An external team of qualitative researchers visits a rotating random sample of project villages every quarter to assess the quality of implementation and to note any other issues of importance (a sketch of how such a rotation can be drawn follows below). The findings are summarized in a report with a two-page executive summary and discussed with project staff to improve implementation and make course corrections in design, which establishes a feedback loop.
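For readers curious about the mechanics, here is one simple way to draw such a rotating sample; the batch size and names are illustrative, and this is a sketch rather than the SO’s exact protocol:

```python
# A minimal sketch of a rotating random sample: each quarter gets a fresh
# random batch of villages, and no village repeats until every village in
# the frame has been visited.
import random

def rotating_sample(villages, per_quarter, seed=0):
    rng = random.Random(seed)
    pool = []  # villages not yet visited in the current rotation
    while True:
        if len(pool) < per_quarter:
            # Start a new rotation: reshuffle all villages not already queued.
            fresh = [v for v in villages if v not in pool]
            rng.shuffle(fresh)
            pool = pool + fresh  # leftover villages are visited first
        batch, pool = pool[:per_quarter], pool[per_quarter:]
        yield batch

quarters = rotating_sample([f"village_{i}" for i in range(100)], per_quarter=10)
q1 = next(quarters)  # this quarter's visits
q2 = next(quarters)  # next quarter's batch, disjoint from q1
```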

Often projects need quick feedback on topical issues that are difficult for the process monitors to cover, so we supplement process monitoring with case studies. A case study was used to assess the quality of implementation in the $1 billion National Rural Livelihoods Project, which led to significant corrections.

3) “Quick and Dirty” and “Long-Term” Evaluations:  The SO has twelve impact evaluations at various stages of completion. But which interventions to evaluate and what outcomes to track are determined in consultation with project staff and task team leaders (TTLs). The assignment method used for an evaluation is conditioned by the urgency with which the findings are needed. For instance, when the SO began its work, both the Bihar and Tamil Nadu projects needed a sense of whether Phase 1 had worked or failed. The SO consequently conducted propensity score matching (PSM) based evaluations of the Phase 1 interventions in these states to get a quick sense of the projects’ impact.
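For the technically inclined, the PSM logic reduces to three steps, sketched below with hypothetical variables rather than the SO’s actual specification:

```python
# A minimal PSM sketch: estimate each household's propensity to be in the
# program, match treated households to the nearest-score controls, and
# compare mean outcomes. Covariates and outcome are illustrative.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

df = pd.read_csv("phase1_survey.csv")
X = df[["hh_size", "land_owned", "distance_to_market"]]  # hypothetical covariates

# 1. Propensity scores: P(participates | X).
df["pscore"] = LogisticRegression(max_iter=1000).fit(X, df["treated"]).predict_proba(X)[:, 1]

# 2. Match each treated household to the control with the closest score.
controls = df[df["treated"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(controls[["pscore"]])
_, idx = nn.kneighbors(df.loc[df["treated"] == 1, ["pscore"]])
matched = controls.iloc[idx.ravel()]

# 3. Average treatment effect on the treated: mean outcome gap across matches.
att = df.loc[df["treated"] == 1, "income"].mean() - matched["income"].mean()
print(f"ATT estimate: {att:.2f}")
```

The speed comes at a price, of course: unlike an RCT, PSM only adjusts for observable differences, which is why these evaluations were treated as quick first reads rather than definitive verdicts.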

4) Rigorous Mixed-Method Evaluations:  In Phase 2 of Jeevika, the SO is conducting a mixed-method evaluation combining RCTs with surveys and behavioral games to measure outcomes, along with ethnographies of a sub-set of villages in the RCT sample, to get a detailed understanding of the mechanisms underlying the observed impact. New behavioral tools have been developed to assess difficult-to-measure outcomes such as voice and agency, and the ethnographies are providing valuable insights into processes of change – in particular, into whether and how livelihoods projects are able to equalize gender relations in these highly patriarchal societies.
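At its simplest, the quantitative leg of such an evaluation is a comparison of treatment and control villages with standard errors clustered at the unit of randomization. Here is a minimal sketch, with a hypothetical file and outcome variable standing in for the survey and behavioral-game measures:

```python
# A minimal sketch of the RCT comparison: treatment vs. control villages,
# with standard errors clustered at the village level (the unit of
# randomization). 'agency_index' and the file name are illustrative.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("jeevika_phase2_endline.csv")

model = smf.ols("agency_index ~ treated", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["village_id"]}
)
print(model.summary().tables[1])  # coefficient on 'treated' = intent-to-treat effect
```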

5) Experiments:  Given the variety of possible vertical interventions, the SO has been collaborating with project staff to conduct experimental pilots with RCTs to understand how to effectively address some important problems.  We have recently completed an evaluation of a food security intervention with Jeevika, and are working on developing a pilot intervention with PVP to change norms on domestic violence.  

6) Innovations:  Being embedded within projects helps researchers see the world from the bottom up. One insight we had was that data collection is almost always an extractive activity, rarely used by the people we are trying to help. To address this, we are working on what we call P-Tracking in Tamil Nadu. Women’s groups were asked to spend a few weeks developing a short survey that would track the indicators of well-being they considered most meaningful and important. Drawing on discussions across the network, they developed a simple half-hour questionnaire. The SO is implementing the survey using a tablet-based method that will employ the SHG network to collect data for all of PVP’s one million women members. We are working with researchers from MIT’s Datahub and Media Lab to visualize these data in a manner that is understandable by the women (who have limited numeracy), so that they can compare themselves to other women and track their progress over time.
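As one illustration of what such a visualization might look like - with invented data, and not the actual Datahub/Media Lab design - the sketch below draws a side-by-side picture rather than printing a table of numbers:

```python
# A minimal sketch of a low-numeracy-friendly display: a picture comparing
# one member's own answers to her village's average. Indicators and values
# are invented for illustration.
import matplotlib.pyplot as plt

indicators = ["Meals per day", "Children in school", "Savings this month"]
mine = [2, 1, 3]           # one member's responses
village_avg = [2.5, 2, 2]  # averages across her village's SHGs

fig, ax = plt.subplots()
y = range(len(indicators))
ax.barh([i + 0.2 for i in y], mine, height=0.4, label="Me")
ax.barh([i - 0.2 for i in y], village_avg, height=0.4, label="My village")
ax.set_yticks(list(y))
ax.set_yticklabels(indicators)
ax.legend()
fig.tight_layout()
fig.savefig("ptracking_card.png")
```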

Another important principle the SO operates under is leverage. It is primarily funded by a grant of $2 million from the South Asia Food and Nutrition Security Initiative (SAFANSI), supplemented by research grants from various sources. But for every dollar the SO receives, it has activated about $5 from the (usually dormant) M&E budgets of projects. Within this budget, in addition to the activities listed above, we are also engaged in several other activities – including setting up two new public-use panel surveys tracking rural incomes in Bihar and subjective well-being in Maharashtra.

To summarize, the goal of the Social Observatory is to supplement IEs with a diverse set of tools that build the adaptive capacity of projects. It combines the principles of good, accurate data, informed and project-relevant analysis, and a constant system of feedback, so that research informs projects at several nodes of decision-making: design, everyday implementation, and long-term evaluation. It is thus trying to turn research from merely a tool for assessment into a tool for implementation.