|
|
Conducting and Applying International Research and Evaluation
Governments and international donors are increasingly relying on applied research to more accurately identify what works and what doesn’t, with the ultimate goal of maximizing the impact of development assistance. Here’s a look at recent work in research and evaluation spanning many technical areas, including health systems strengthening, infectious diseases, family planning and reproductive health, and maternal, newborn, and child health.
|
|
|
Special Issue Provides Tools to Open Evaluation’s ‘Black Box’
It’s not enough to know whether a program or policy works. Program administrators and policymakers also want to know what’s inside the so-called ‘black box’: just how does a program or policy make a difference, and by how much?
Now program administrators and evaluation students have methods to help them answer these questions. Laura Peck of Abt Associates brings together leading scholars and practitioners – including Howard Rolston, Jacob Klerman, and Stephen Bell – in “Social Experiments in Practice: The What, Why, When, Where, and How of Experimental Design and Analysis,” a special issue of New Directions for Evaluation. In the issue, Peck and her fellow authors provide in-depth answers to tough evaluation questions across social science research, including when to conduct experiments.
|
|
|
Internal versus External Validity in Rigorous Policy Impact Evaluations: Do We Have to Choose?
Can researchers give policymakers the right information about what is and is not working for the nation as a whole, especially when research is limited to select pockets of the country? It is not as difficult as it sounds, writes Abt’s Stephen Bell.
|
|
|
Black Boxes, the Counterfactual, and Bringing Order to RCTs
Laura Peck and Allan Porowski shared a few insights on evaluation – from tips for making the start-up period of randomized controlled trials go a little more smoothly to how getting at the “why” behind an experimental result has become an increasingly important part of experimental impact evaluations.
|
|
|
Getting Meaningful Technical Assistance from Webinars: We Can Evaluate That
The field of technical assistance is changing rapidly. Many organizations that provide national and local technical assistance have moved toward "virtual" TA. Does it work? Allison Hyra looks at how TA providers can evaluate webinar-based TA.
|
|
|
Articles and Other Resources
|
|
Abt's Randall Juras authored this article, which uses data from two large multi-school, multi-district impact evaluations conducted in the United States and calculates estimates of the design parameters required for sizing school-based nutritional studies. The large size of the trials (252 and 1,327 schools) yields precise estimates of the parameters of interest.
|
|
In this article, Abt's Stephen Bell and Laura Peck define 15 common concerns about the viability and policy reliability of social experiments, in order to assess how much these issues constrain the use of the method in providing policy evidence. The authors use their experience designing and conducting dozens of social experiments to examine the basis for and soundness of each concern.
|
|
Abt's Emily Mangone co-authored this study, which developed a preliminary evaluation framework for assessing mobile apps for adolescent and young adult pregnancy prevention. The authors used this framework to assess apps available in the Apple App Store and Google Play that target adolescents and young adults with family planning and pregnancy prevention support.
|
|
Join Our Team
|
|
Associate / Scientist - Monitoring and Evaluation Specialist, Bethesda, MD
Monitoring and Evaluation Manager - India - SHOPS Plus Project Research, India
Research Assistant, Environment and Resources, Cambridge, MA
Research, Monitoring, and Evaluation Manager, Nepal
Senior Analyst / PhD Economist, Social and Economic Policy, United States
|
|
|
|
|
|