RCT on a budget?
What’s the counterfactual? What happens to people who don’t get your program?
– name your favorite donor
We’ve all faced this dreaded question. It frequently comes from a donor or a board member, but hopefully, increasingly, from the team itself, which has a genuine interest in better understanding the impact (or lack thereof) of the interventions it works so hard to design and deliver. In many cases, the question comes from all three.
However, a good counterfactual is hard to come by. It typically involves conducting a randomized controlled trial (an RCT… dun, dun, dun). The general perception is that they are always exorbitantly expensive, can only be done if you are operating at a significant scale, and consume the entire organization’s energy.
We’re here to bust that myth.
Since 2018, we have been conducting an RCT with J-PAL to measure our program’s impact on young women’s career preparation and progression in the Hindi Heartland. If you are interested in the findings, check out the midline report.
We conducted this study on a shoestring budget for RCTs (<$60k/₹45L) and in a way that minimized its impact on our operations (less than 15% of the team was involved).
We thought it would be helpful to share how we did this with all the other resource-constrained organizations (🤛) out there. It is certainly not a miracle, or something that hasn’t been done before, but we didn’t come across many examples from the Indian non-profit world when we tried to make this happen.
1. Find a researcher who is passionate about what you do and the problem you are trying to solve.
This is a deal-breaker. Don’t move forward unless you feel confident about this.
In our case, it took many years to make this happen. We networked hard and reached out to many research firms. We shared (pitched) our work and potential research questions. Without a large grant behind us, the conversations didn’t go very far.
However, after multiple discussions with J-PAL, our partnership development form serendipitously found its way to Lori Beaman. I like to think it was like when Remy finds Gusteau’s in Ratatouille 😊.
Only after a few discussions with Lori, and some comfort that both of us were equally excited and passionate about pursuing this, did we move forward.
2. Leverage your existing team as much as possible.
There is no reason you need countless third-party enumerators (costly!) to collect the baseline data.
In our case, we had our Student Relationship Managers (SRMs) hold the lottery for randomization and facilitate the collection of baseline data through smartphones and tablets we provided (cost savings). A bare-bones sketch of what such a lottery can look like follows at the end of this section.
Since SRMs work directly with students, it was tough for them not to be able to work with all interested students that year (argh, randomization). While it doesn’t necessarily make it any easier, we held extensive training sessions with the team so they understood the ‘why’ behind an RCT design.
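For the curious, the lottery itself can be as simple as a reproducible random draw. Here is a minimal sketch in Python; the student IDs, the fixed seed, and the 50/50 split are illustrative assumptions, not our actual protocol.

```python
# Minimal randomization "lottery" sketch (illustrative only; the IDs, seed,
# and 50/50 split are assumptions, not the study's actual protocol).
import random

def assign_lottery(student_ids, seed=2018):
    """Randomly split interested students into treatment and control arms."""
    rng = random.Random(seed)       # fixed seed keeps the draw reproducible and auditable
    shuffled = list(student_ids)
    rng.shuffle(shuffled)
    midpoint = len(shuffled) // 2
    return {"treatment": shuffled[:midpoint], "control": shuffled[midpoint:]}

# Example: ten interested students at one college
groups = assign_lottery([f"S{i:03d}" for i in range(1, 11)])
print(groups["treatment"])
print(groups["control"])
```

In practice, the draw is often stratified (by college or batch, say) so each site keeps a balanced split, but the basic idea is the same.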
3. Reduce the sample size.
While there are good reasons to have a big sample (researchers will frequently cite power calculations), it is in your best interest to keep the sample as small as possible, of course without sacrificing the integrity of the research. (A bare-bones power calculation is sketched at the end of this section.)
In our case, we did a pilot sample of 600 students, followed by a second sample the following year of 1,500 students (treatment and control). This outreach was approximately 15% of our total students that year.
Reducing the sample size enabled us to insulate and minimize the study’s impact on the rest of the organization and team.
RCTs can be disruptive. They limit the number of people you can work with. And telling users that you are denying them service is never easy. If you can minimize this as much as possible, it will make life simpler.
So, while it is easier to accomplish this if you have extensive operations, as you can see, it’s still achievable at a scale of a couple of thousand users a year. In the Indian context, this is small 🤣
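If you’re curious what a power calculation actually involves, here is a minimal sketch using Python’s statsmodels library. The effect size, significance level, and power below are generic textbook defaults chosen for illustration; they are not the parameters of our study.

```python
# Back-of-the-envelope power calculation (illustrative numbers only;
# these are not the parameters used in the actual study).
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# Assumptions: minimum detectable effect of 0.2 standard deviations,
# 5% significance level (alpha), 80% power, equal-sized arms.
n_per_arm = analysis.solve_power(effect_size=0.2, alpha=0.05, power=0.8)

print(f"Required sample per arm: {round(n_per_arm)}")  # roughly 390-400 per arm
```

The takeaway: the smaller the effect you need to detect, the larger the sample, so agreeing with your researcher on a realistic minimum detectable effect is what lets you shrink the sample responsibly.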
4. Explore pilot/small grant opportunities and existing funds your researcher may have access to.
Researchers often have discretionary funds that can be used for your study (granted, the amounts are not huge).
In our case, Lori was able to apply some of these funds to recharge SIM cards for study participants and even hire a research associate when required.
If these funds are unavailable, you can explore pilot or smaller grants that do not require a significant scale. In most cases, the researcher will do the heavy lifting on these applications, saving you the time and expertise needed to secure them.
In our case, we got a pilot grant from J-PAL’s PPE Initiative and also explored small grants from the Spencer Foundation. There will likely be many such opportunities in your world.
5. Position it as a learning exercise, not an existential question of whether your program works or not.
Ok, this one may just be gyaan (read pontification), but nonetheless.
RCTs, and other empirical research methods, are just another tool in the MEL (monitoring, evaluation, and learning) toolbox to understand more about your program, users, and impact. They are not a silver bullet or perfect (apologies to all my development economist friends). Sometimes they tell you only a minimal amount about something you are doing.
One example from our case: our students are more likely to have a CV than their peers who didn’t go through a Medha program. That’s nice to know, but it doesn’t tell us anything about how good that CV is or whether it helped them secure the job they wanted.
The point is, if we approach RCTs as a learning opportunity, it reduces the pressure on the whole team and makes us focus on what we can learn from them, not whether someone sitting in Chicago (Lori 🤐) is going to tell us if the last ten years of our lives have been worth it.
Happy randomizing
So, there you have it, five ways to pull off an RCT in a low-cost and low-impact way (not that kind of low-impact).
With a bit of persistence (work hard to be resourceful and scrape together what limited funds exist), luck (finding the right researcher), and collaborative spirit (get the whole team behind the learning objective), you too can make it happen!
Additional Resources:
Low-Cost Randomized Control Trials, Brookings Institution
Rigorous Program Evaluations on a Budget, Coalition for Evidence-Based Policy
This story is a part of NonProfitWork: a collection of opinions and insights to help non-profits work better and drive greater impact.