(Illustration by iStock/Visual Generation)
In 2022, our colleagues at the International Rescue Committee (IRC) were struggling to address one of the world’s most profound child health crises: hundreds of thousands of children in Chad suffering from acute malnutrition—what UNICEF called a “silent emergency.”
We had dollars to spend—nearly $3 million to provide life-saving treatment to 30,000 children in one of the most vulnerable provinces. But we had a problem: we were running resource-intensive malnutrition screening campaigns but not actually finding many malnourished kids. The cost was staggering—nearly $100 per child admitted for treatment—roughly equivalent to our entire project budget.
Facing failure, we needed alternative approaches—and we needed them to be much more affordable. The team had an idea drawn from their on-the-ground experience: Why not piggyback off existing immunization outreach, leveraging infrastructure and turning malnutrition and immunization services into a one-stop approach? We quickly reallocated dollars and assessed how costs stacked up against other options. The one-stop idea worked. Since we made this shift, we’ve halved admissions costs and treated thousands more kids.
What we saw in Chad is a powerful example of a deceptively simple idea: use cost evidence to help more people. It’s common sense—but getting there has been a journey.
The humanitarian sector faces a massive funding shortfall. Over 55 percent of need, or $32 billion, went unfunded in 2023, and 88 percent of all humanitarian funding comes from just 10 government donor budgets—a precarious formula.
At the same time, the people we serve face enormous overlapping challenges. Crises like conflict, climate change, extreme poverty, and the COVID-19 pandemic have left more people forcibly displaced around the world right now than at any time since the end of World War II, often compounding one another in contexts like South Sudan, Afghanistan, and Yemen. These are further exacerbated by new crises, like conflict in Ukraine and Gaza.
Many donors still operate on funding cycles that are inadequate and inappropriate to meet these polycrises. For example, the average grant cycle in the humanitarian sector is 12-18 months, while the average conflict (and time a refugee is in exile) is roughly 10 years.
More money is one answer, but realistically there will always be a gap between need and funding. How can we do more for the nearly 300 million people worldwide who need aid? We need to maximize impact for every dollar spent.
Over a decade ago, the IRC began to stitch cost evidence into the fabric of our work. We were already conducting impact evaluations to rigorously assess the effectiveness of our work. But if we were tracking impact, why not cost? We needed data to make evidence-informed decisions on where to spend more and where to save more to maximize our impact.
It made sense, but it was controversial. We were called unethical. We were told dollar signs detracted from the moral obligation to save lives. Our view, though, was that funders and organizations were already making implicit tradeoffs; why not make them explicit, with thoughtfulness and rigor?
In 2015, the IRC created a “Best Use of Resources” (BUR) team to measure our cost-effectiveness, and established an organizational evidence-based cost methodology to compare the costs of different programs with what they achieve. We built BUR with three goals in mind: establishing efficiency guidance by generating, synthesizing, and using evidence to inform program decisions; pioneering and analyzing new approaches; and further optimizing effective programs for impact and scale.
We then embedded this work in three ways:
First, we put our money where our mouth was. We dedicated resources and staff to cost analysis, believing that this investment would pay dividends in the years to come. IRC’s cost evidence work gained real traction when we created a standalone division with unrestricted funds.
Second, we adapted key internal decision-making processes to require consideration of cost evidence. To give this work teeth and drive accountability, we initially included quarterly costing targets as key performance indicators. We have since retired this rigid and short-term system, but it set a marker and accelerated our journey to centering costing in program design and implementation.
Third, we created a public good for the sector. We recognized that the IRC was a small part of this story. The biggest gains would come from others making major budget allocation and design decisions based on comparable cost data. In 2018, we distilled our methodology into a product called Dioptra, a software tool for rapid, rigorous, and standardized cost-efficiency analyses. Dioptra uses project data and program staff knowledge to calculate the cost per output of any intervention. What started as a tool is now a consortium and a community: nine organizations with a combined $7 billion annual budget have adopted Dioptra.
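The cost-per-output idea at the heart of Dioptra can be sketched in a few lines. This is an illustrative simplification, not Dioptra’s actual methodology; the function name, allocation rule, and figures below are hypothetical.

```python
# Illustrative sketch (not Dioptra's actual methodology): a cost-efficiency
# analysis divides an intervention's full cost -- its direct spending plus a
# share of pooled support costs -- by the number of outputs delivered.

def cost_per_output(direct_cost, shared_costs, share_of_shared, outputs):
    """Return the total cost per output for one intervention.

    direct_cost     -- spending attributable only to this intervention
    shared_costs    -- pooled costs (office, logistics, management)
    share_of_shared -- fraction of shared costs allocated here (e.g. by staff time)
    outputs         -- units delivered (children treated, transfers made, etc.)
    """
    total = direct_cost + shared_costs * share_of_shared
    return total / outputs

# Hypothetical project: $180k direct, $50k shared at a 40% allocation,
# 2,500 children treated.
print(cost_per_output(180_000, 50_000, 0.40, 2_500))  # 80.0, i.e. $80 per child
```

The hard part in practice is the allocation step: deciding, with program staff, what fraction of shared costs each intervention actually consumed.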
Together, these initiatives have established a base of cost evidence and a culture of costing at IRC and beyond.
More than 10 years on, this work has shown we can stretch dollars to reach more people. To date, we have conducted over 400 cost analyses across 37 countries to directly examine more than $300 million of humanitarian spending.
Earlier this year, we examined $86 million worth of IRC projects influenced by our cost evidence work. Of that $86 million, we found approximately $28 million in cost gains—a 32 percent efficiency improvement. We achieved this on a modest budget of $1.4 million, equivalent to a 20x return on investment.
These were real savings that meaningfully transformed lives and stretched limited funds further.
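For readers who want to check the headline figures above, they reduce to two divisions (the variable names are ours):

```python
# Figures reported above: $86M of projects examined, $28M in cost gains,
# achieved on a $1.4M analysis budget.
examined = 86_000_000
gains = 28_000_000
budget = 1_400_000

efficiency_improvement = gains / examined  # ~0.326, the roughly 32 percent cited
roi_multiple = gains / budget              # 20.0, i.e. a 20x return on investment
```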
We’ve proven the concept. We’ve demonstrated that cost evidence is applicable across diverse contexts and that cost-effective programming is possible in a large organization. And we’ve shown that when done in a focused way, cost evidence can help hundreds of thousands more people.
With crises mounting and budgets dwindling, the challenge now is to dramatically scale the use of cost evidence across the humanitarian and nonprofit landscape. Here’s what comes next.
At IRC, we’re doubling down. We’ve committed to increase our cost-efficiency by $225 million over the next four years, and we are focusing analysis where it can have the most impact—by making calculated bets on where additional evidence will allow us to make the biggest gains on the biggest buckets of resources. We’re currently assessing our anticipatory action climate work—cash transfers sent to affected communities in advance of predictable climate hazards—as one example where we believe more analysis will unlock more savings.
To date, our 400 cost analyses have allowed us to build a transparent evidence base. Any nonprofit can do this by generating and sharing its own evidence, as well as incorporating thoughtful use of existing evidence into its own processes and culture.
We’re building a movement for cost-effectiveness. We believe we can unlock billions of dollars in efficiencies if there is collective action. Our Dioptra partners are working hard to drive this work within their organizations, but we cannot do it alone. We want to work with policymakers to incorporate cost evidence into global policies, such as simplified approaches to treat malnutrition. We also want to support local governments to make globally informed, locally grounded decisions. This is the reason we’ve developed a tool for ministries of health to plug in their own malnutrition treatment costs and parameters. Critically, we want to align incentives at the source of the funding stream by supporting donors to embed cost evidence in their decision-making.
Ultimately, though, we need a new compact with donors.
Today, we find ourselves at the same inflection point with cost-effectiveness that impact evaluations faced 10 years ago—everyone wants one, even if they’re not sure what they’ll do with it. Some of the most well-intentioned offenders are donors: many organizations generate piecemeal evidence for the sole purpose of donor compliance, rather than any learning objective. For the humanitarian sector to truly embed cost evidence into its decision-making, funders need to step up.
Shifting donor relationships may seem quixotic, but we’ve already seen this in action: with support from private donor Open Philanthropy, USAID partnered with us to embed an IRC cost advisor in its office and to have Dioptra members conduct 100 cost analyses on program areas where USAID believes it is leaving cost gains on the table. We believe this is a strong precedent for a sector-wide shift.
At the start of our work on cost evidence, the arguments against us were values-based, warning of a “race to the bottom” that would defund “expensive” programs and organizations. But our colleagues in Chad showed the opposite. Smarter spending is a race to reach the most people with the most impact. It is a moral imperative.
In “The Moral Case for Evidence in Policymaking,” former Hewlett Foundation development economist Ruth Levine (now at the Packard Foundation) writes:
“Values alone are not enough to achieve distributive justice—and that’s where the evidence comes in… Fairness can be achieved only if full and unbiased information is available about current conditions, and about the costs and benefits of one way of acting—one policy option—versus another.”
We agree. Embracing evidence is a choice to keep impact and people at the center of our work. The humanitarian sector needs more funding to meet devastating needs. But it could reach millions more people—with the resources currently available—if we focused on more cost-effective interventions and more cost-efficient delivery. We have proven it is possible, we know what we need, and we have a moral imperative to act.
Read more stories by Justin Labeille & Jeannie Annan.