We must know what works in a crisis and what doesn't
Research and evaluation can identify interventions that optimize our response to a crisis, but resources for such work unfortunately tend to get squeezed at precisely such times.
An often-ignored casualty of the kind of crisis we currently face is a focus on evaluations and applied research: research that can tell us what works, for whom, how much, why and under what circumstances. One of the most commonly cited reasons for this casualty is a lack of resources, both human and monetary, and a lack of data. James Heckman, a Nobel Prize-winning economist, was quoted in a recent Science article as saying, “Never let a crisis go to waste…. We are getting new information. It’s very valuable information.”
Well-known challenges emerge during times of crisis: movement and new data collection become difficult. However, it is perhaps indicative of the stress we feel that we forget there are still many data sources available that can help us analyse the effectiveness of the money we spend at such times. Here are some examples of the possibilities that exist to procure data from elsewhere and do further research.
Use administrative data: Administrative and survey datasets are available from many other sources. In the now protracted conflict-driven crisis in the Democratic Republic of Congo, the United Nations Children’s Fund (Unicef) has supplemented the most recent Demographic and Health Survey (which is still being carried out during this crisis) to collect and analyse data on deprivation. The analysis examines the extent to which children in different age groups (0-23 months, 2-5 years, 6-11 years and 12-17 years) are deprived along different dimensions. Unicef uses these data to target its operations in the country appropriately.
Ignore geography and geographic information system data at your own peril: Physiographic data is a far better predictor of human behaviour than we often acknowledge. A paper I co-authored some years ago examined the effectiveness of agricultural policy using both physiographic and socioeconomic factors. We found that even without socioeconomic data, our estimates remained just as good. To put it differently, location and behaviour are determined to a large extent by geography (location and behaviour are “endogenous”). This can be exploited in much of the work we do to understand processes and possibilities for countries. A lot of physiographic data, such as on elevation, markets and climate, are available from free sources. In the Philippines, for example, on-the-ground data was used to understand differential responses to, and the impact of, Typhoon Haiyan, including many political economy factors such as the power of the political elite and corruption.
Think of natural experiments and data from contact tracing: In a recent Science paper, economists discussed the possible uses of the pandemic as a “natural experiment”. Indeed, my own take is that the phased and exogenously determined reach of the pandemic in different countries can help us use it as an “instrument” to understand and compare different systems. It’s also giving us a way to think about uncertainty.
Phone surveys can be a great opportunity: A webinar by Tavneet Suri of the Massachusetts Institute of Technology is a great source of the dos and don’ts of such surveys. Some key takeaways for me were: First, find ways to get phone numbers (there are different methods to do this, as documented, but one could also dial random numbers if the idea is to get a large sample). Second, keep such surveys short, at a maximum of half an hour (face-to-face surveys, in contrast, can go on for anywhere between 10 minutes and four hours). Third, settle on a maximum of 10 outcomes that you want to target. Fourth, follow up at least two or three times at different times of the day. Fifth, ensure that there is 100% verification. And sixth, keep ready a pre-analysis plan and a protocol that matches each outcome to the variable in the corresponding survey question.
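Two of the takeaways above lend themselves to a quick illustration: random-digit dialling to build a sample, and a pre-analysis plan that ties each targeted outcome to the survey question measuring it. The sketch below is purely illustrative; the phone prefix, outcome names and question wordings are hypothetical, not taken from any actual survey protocol.

```python
import random

random.seed(42)  # reproducible sample for the illustration

def random_digit_dial(n, prefix="070", length=10):
    """Generate n random phone numbers sharing a (hypothetical) mobile prefix."""
    body_len = length - len(prefix)
    return [prefix + "".join(random.choice("0123456789") for _ in range(body_len))
            for _ in range(n)]

# Pre-analysis plan: each targeted outcome maps to the question measuring it.
# Outcome names and question texts are made up for illustration.
PRE_ANALYSIS_PLAN = {
    "food_security": "Q3: How many meals did your household eat yesterday?",
    "income_loss": "Q5: Did anyone in the household lose work in the past month?",
}

def validate_plan(plan, max_outcomes=10):
    """Enforce the rule of thumb: target no more than max_outcomes outcomes."""
    if len(plan) > max_outcomes:
        raise ValueError(f"Plan targets {len(plan)} outcomes; cap is {max_outcomes}")
    return True

sample = random_digit_dial(5)          # small sample for the example
validate_plan(PRE_ANALYSIS_PLAN)       # passes: 2 outcomes, well under 10
```

In practice, of course, the sampling frame, follow-up schedule and verification steps would be specified in the protocol itself; the point here is simply that the outcome-to-question mapping can be written down and checked before a single call is made.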
Use evidence maps, systematic reviews, social media and machine learning to understand previous evidence: Last but not least, this is the time for innovation. Much has already been done to synthesize evidence. In my own office’s work to examine the effectiveness of the climate change adaptation investments of the fund we evaluate, we will be using evidence reviews that systematically and exhaustively look at all high-quality evidence related to adaptation in low- and middle-income countries over the past 20 years. Machine learning algorithms can help us search through such data sources with care. Social media can also be an excellent data source: a study of Twitter feeds in Indonesia is a good example of how they have been used to understand what influences behaviour.
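The machine-assisted search mentioned above can be pictured with a minimal sketch: scoring candidate study abstracts by keyword relevance before they reach a human reviewer. A real systematic review would use trained classifiers on full bibliographic records; the keywords and example abstracts below are invented for illustration only.

```python
# Screening keywords a review team might agree on in advance (illustrative).
KEYWORDS = {"adaptation", "climate", "resilience", "low-income"}

def relevance_score(abstract):
    """Count how many screening keywords appear in the abstract."""
    words = set(abstract.lower().replace(",", " ").replace(".", " ").split())
    return len(KEYWORDS & words)

# Two made-up candidate records: one on-topic, one off-topic.
records = [
    "Climate adaptation and resilience in low-income settings.",
    "A history of monetary policy.",
]

# Rank records so human reviewers see the most relevant abstracts first.
ranked = sorted(records, key=relevance_score, reverse=True)
```

Even a crude filter like this changes the economics of evidence synthesis: reviewers spend their scarce time on the abstracts most likely to matter, which is exactly the kind of capacity a crisis forces us to use well.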
To conclude, for people like us in the field of evaluation and applied research, this crisis is unfortunate. But if we think laterally, we can use the immense strides we have made in technology and methodology to exploit the capacities that we as a community have invested in for years. Indeed, this is not a choice. The costs we as a society have to pay for not knowing what works and how well it works are immense, especially at a time like this, and as a scientific community, we cannot afford to shirk this responsibility.
This post first appeared in the April 23rd edition of Live Mint.
Have thoughts about evaluation in the time of COVID-19, or a question for Jo? Reach out on Twitter:
Live Mint: @livemint
The Independent Evaluation Unit, Green Climate Fund: @GCF_Eval
Dr. Jyotsna Puri (Jo): @Jo_Puri
Be sure to also check out Jo's video interview with the Global Landscapes Forum on evaluation in the time of COVID-19.