Evidence That Travels: Standards for Rapid, Reliable Policy Analysis

Standfirst:
Fast does not have to mean flimsy. This note sets out a simple evidence ladder, minimal appraisal rules, and guidance on presenting uncertainty to busy principals.

The evidence ladder (at a glance)

  1. Systematic reviews/meta-analyses (Campbell, 3ie) → strongest synthesis.

  2. Randomized & strong quasi-experimental (DiD, synthetic control, RDD, IV).

  3. Observational with robust controls → moderate inference.

  4. Descriptive/qualitative & expert consensus → crucial for context, weaker for causality.
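As a rough heuristic, the ladder above can be encoded as a lookup that maps study designs to tiers and picks the strongest design available. This is an illustrative sketch only; the design labels and tier numbers are assumptions for demonstration, not a formal grading scheme.

```python
# Illustrative sketch: map study designs to evidence-ladder tiers (1 = strongest).
# Design names and tier assignments are assumptions, not an official taxonomy.
EVIDENCE_TIER = {
    "systematic_review": 1,
    "meta_analysis": 1,
    "rct": 2,
    "diff_in_diff": 2,
    "synthetic_control": 2,
    "regression_discontinuity": 2,
    "instrumental_variables": 2,
    "observational_controlled": 3,
    "descriptive": 4,
    "qualitative": 4,
    "expert_consensus": 4,
}

def strongest_available(designs):
    """Return the best (lowest) tier among the recognized study designs, or None."""
    tiers = [EVIDENCE_TIER[d] for d in designs if d in EVIDENCE_TIER]
    return min(tiers) if tiers else None
```

For example, a brief drawing on one descriptive study and one difference-in-differences evaluation would rest on tier-2 evidence: `strongest_available(["descriptive", "diff_in_diff"])` returns `2`.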

Minimal appraisal rules (fit for speed)

  • Relevance: Are the population, setting, and implementation context comparable?

  • Credibility: Is the identification strategy defensible? Was there a pre-analysis plan? How serious is attrition?

  • Precision: Report confidence intervals, not just point estimates.

  • Transferability: What adaptations are needed locally?

  • Cost & feasibility: Ballpark cost per outcome; institutional requirements.
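The precision rule above can be made mechanical: given a point estimate and its standard error, always report the interval alongside the estimate. A minimal sketch, assuming a normal approximation (z = 1.96 for a 95% interval); the plain-language template is illustrative, not a house style.

```python
# Sketch of the "precision" rule: pair every point estimate with a 95% CI.
# Assumes a normal approximation (z = 1.96); the phrasing template is illustrative.
def ci95(estimate, std_error):
    """Return the (lower, upper) bounds of an approximate 95% confidence interval."""
    half_width = 1.96 * std_error
    return (estimate - half_width, estimate + half_width)

def plain_language(estimate, std_error, outcome="the outcome"):
    """Format an effect size and its interval as a sentence for a policy note."""
    lo, hi = ci95(estimate, std_error)
    return (f"Expected effect on {outcome}: {estimate:.1f} percentage points "
            f"(95% CI: {lo:.1f} to {hi:.1f}).")
```

So an estimated 5-point gain with a standard error of 1 point would be reported as roughly "5.0 percentage points (95% CI: 3.0 to 7.0)", which keeps the uncertainty in front of the principal rather than buried in an annex.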

Packaging for principals

  • 2-page policy note: (1) decision question; (2) 2–3 options; (3) expected effect size with confidence ranges; (4) risks/mitigations; (5) cost & timeline; (6) implementation checklist; (7) references.

  • Use plain-language uncertainty statements (“We are moderately confident… based on three quasi-experimental evaluations in comparable settings.”).

Where to source high-quality evidence

  • J-PAL/IPA repositories; Campbell Collaboration reviews; 3ie Evidence Gap Maps; Cochrane (for health); OECD policy evaluations; World Bank Policy Research Working Papers; UK HM Treasury Green Book and Magenta Book (appraisal & evaluation).

Editor’s note (opinion): Don’t chase the “best” method; chase the best decision. Pair rigorous syntheses with local feasibility and cost.

References 

UK HM Treasury Green Book & Magenta Book; OECD Evaluation Policy; Campbell Collaboration; 3ie; J-PAL/IPA; World Bank PRWP series; Cochrane Handbook.
