Rank #1: Improving public policy through behavioral economics: An interview with Raj Chetty, Professor, Harvard University – Episode #78
How can tools from the behavioral sciences, such as behavioral economics, improve the design and implementation of public policies? We examine that question with a leading economist, Raj Chetty of Harvard University. In his recent keynote speech at the American Economic Association meeting, he argued that insights from the behavioral sciences can expand the scope of tools that are available to policymakers — insights such as the importance of defaults, salience and loss aversion.
Professor Chetty has been widely recognized for his research that combines empirical evidence and economic theory to help design more effective government policies. This is Part One of our conversation.
For part 2 of our conversation, on the use of administrative data (or “big data”) for research on what works in public policy, click here.
Feb 20 2015
Rank #2: The importance of administrative data for learning what works in public policy: An interview with Raj Chetty, Professor, Harvard University – Episode #83
Why is administrative data, also known as big data, important for learning what works in public policy? And what steps can help the U.S. strengthen its data infrastructure for policy-relevant research? To gain insights, we’re joined by Raj Chetty for part 2 of our conversation. A Professor of Economics at Harvard University, he conducts research that combines empirical evidence (often using administrative data) and economic theory to help design more effective government policies.
As background, administrative data means the data collected by government agencies for program administration, regulatory or law enforcement purposes. Federal and state administrative data include detailed, useful information on labor market outcomes, health care, criminal justice, housing, and other important topics. Access to administrative data for research purposes – while carefully protecting privacy – can produce important insights about what works and how to improve public sector programs and policies. For further reading, a useful resource is the chapter Building Evidence with Administrative Data from the Analytical Perspectives section of the President’s 2016 Budget.
For part 1 of our conversation, on behavioral economics, click here.
The post The importance of administrative data for learning what works in public policy: An interview with Raj Chetty, Professor, Harvard University – Episode #83 appeared first on Gov Innovator podcast.
Mar 18 2015
Rank #3: Determining if your program is having a positive impact (i.e., impact evaluation 101): An interview with David Evans, Senior Economist, The World Bank – Episode #122
Is my program or initiative having a positive impact?
It’s a question about which organizational leaders may want hard evidence, either to take stock and help improve program results, or to satisfy their authorizers or funders who may be asking for rigorous evidence of impact. Either way, how can you determine the impact of your program? And which strategies may sound useful but are unlikely to produce accurate answers?
To examine these questions and get a “101” on impact evaluation, we’re joined by David Evans (@tukopamoja). He is a Senior Economist at the World Bank and the co-author, with Bruce Wydick, of a recent post on the Bank’s Development Impact blog on this topic.
The interview covers:
- The concept of impact
- Ways that organizations could try to estimate impact that generally won’t be accurate
- Three strategies to more accurately estimate program impact:
  - Using a lottery, aka a randomized experiment
  - Using an eligibility cutoff, aka a regression discontinuity design
  - Using before and after data for both participants and nonparticipants, aka a differences-in-differences approach
- Factors to guide the choice of one impact evaluation strategy over another
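The third strategy above can be sketched in a few lines of code. This is a minimal illustration with hypothetical outcome numbers, not part of the interview: a differences-in-differences estimate compares the before-to-after change for participants with the change for nonparticipants, netting out trends that affect both groups.

```python
# Minimal differences-in-differences sketch with hypothetical outcome means.
# The impact estimate is the participants' before-to-after change minus the
# nonparticipants' change, which nets out shared trends over time.

def diff_in_diff(treated_before, treated_after, control_before, control_after):
    """Return the differences-in-differences impact estimate."""
    treated_change = treated_after - treated_before
    control_change = control_after - control_before
    return treated_change - control_change

# Hypothetical average outcomes (e.g., employment rates, in percent):
# participants rose 12 points while nonparticipants rose 5, so the
# estimated program impact is 7 points.
impact = diff_in_diff(treated_before=50.0, treated_after=62.0,
                      control_before=48.0, control_after=53.0)
print(impact)  # 7.0
```

The subtraction of the control group's change is what distinguishes this approach from a simple before/after comparison, which would wrongly attribute the shared 5-point trend to the program.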
May 02 2016
Rank #4: How Philadelphia became a leader in the use of data and evidence: An interview with Maia Jachimowicz, V.P. for Evidence-Based Policy, Results for America, and former policy director to Mayor Michael Nutter – Episode #109
Michael Nutter served as Mayor of Philadelphia from 2008 to 2016. During his eight years in office, the city became a leader in the use of data, evidence and evaluation to improve outcomes for city residents. In 2014, Governing Magazine named the Mayor one of the Public Officials of the Year, noting, “Philadelphia isn’t an easy place to govern. But Mayor Michael Nutter has undoubtedly made an outsized impact on the city, creating a Philadelphia that’s cleaner, safer, smarter and more fiscally sound than the city he began leading in 2008.”
To gain insights into some of the steps the city took to be more results-focused and effective, we’re joined by Maia Jachimowicz. She served as Deputy Director for Policy and, starting in 2013, as Policy Director for the Mayor until 2016. She recently became Vice President for Evidence-Based Policy at the nonprofit Results for America.
Web extra: Maia Jachimowicz provides an additional example of a city agency that became more results focused during the Nutter Administration. [click here]
Feb 15 2016
Rank #5: Creating a results-focused city government: An interview with Michael Nutter, former Mayor of Philadelphia – Episode #138
What is the value of evidence and data for elected city leaders, and how can those leaders create a results-focused culture within city government? We get insights from Michael Nutter, who served for eight years as the Mayor of Philadelphia, from 2008 to January 2016. Under his leadership, Philadelphia became known as a leader in the use of data and evidence.
In particular, the Nutter Administration established strategic goals with measurable targets; launched PhillyStat, Philadelphia’s performance management system; established Philadelphia’s open data policy in 2012 and launched an open data portal in 2015; and launched Philly 311, the city’s online customer service system.
Today Michael Nutter is a CNN political commentator, a professor at Columbia University, a fellow at the University of Chicago’s Institute of Politics and a senior fellow with the What Works Cities initiative, among other roles.
Aug 26 2016
Rank #6: Using randomized evaluations to address global poverty and other social policy challenges: An interview with Dean Karlan, Professor, Yale University, and President, Innovations for Poverty Action – Episode #112
Addressing the nation’s — and the world’s — biggest challenges will require learning and doing what works. A powerful tool for doing that is the randomized evaluation, also known as a randomized control trial (RCT). It is a tool that is increasingly being used in the U.S. and around the world. Well-designed and well-implemented RCTs can provide strong evidence about what works — not only whether a program works or not, but also which strategies within a program or policy work best.
As evaluation experts (including RCT proponents) will note, RCTs are one tool within public managers’ analytical tool boxes, along with performance measures, process evaluation, cost-benefit analysis or cost analysis, well-designed quasi-experiments and other approaches. The goal is to use the most rigorous method possible for the question at hand.
To learn more about the value of RCTs, as well as to address some of the concerns or criticisms of the approach, we are joined by Dean Karlan (@deankarlan), a leading expert in using randomized evaluations in social policy. He is a professor of economics at Yale University and the president and founder of Innovations for Poverty Action (IPA), a non-profit that has conducted over 500 evaluations in more than 50 countries to build evidence about effective solutions to global poverty problems. His most recent book, co-authored with Jacob Appel, is titled “More Than Good Intentions.”
Mar 09 2016
Rank #7: Doubling community college graduation rates through CUNY’s ASAP program: An interview with Donna Linderman, Dean for Student Success Initiatives, City University of New York – Episode #104
Increasing the graduation rates at community colleges is an important national challenge. Nationally, less than 40 percent of community college students attain a degree or certificate — and students who come to campus underprepared for college-level work (those needing developmental or remedial classes) have graduation rates below 30 percent.
The City University of New York (CUNY) launched the Accelerated Study in Associate Programs (ASAP) in 2007 with the goal of doubling the graduation rates of community college students and encouraging timely graduation within three years. A rigorous, random assignment evaluation by MDRC found that ASAP nearly doubled the percentage of students needing developmental courses who completed an associate’s degree (40% versus 22% for the control group), by far the largest effect MDRC has found for a community college intervention. And CUNY’s own evaluation of the overall program (not just for those needing remediation) found that the program more than doubled graduation rates.
To learn more, we’re joined by Donna Linderman. She is the Dean for Student Success Initiatives at CUNY and the Executive Director of ASAP.
Jan 08 2016
Rank #8: Lessons in applying behavioral insights to human services from the Behavioral Interventions to Advance Self-Sufficiency (BIAS) project: An interview with Lashawn Richburg-Hayes and Nadine Deshausay, MDRC – Episode #136
In 2010, the Administration for Children and Families (ACF) at the U.S. Department of Health and Human Services launched a project to explore how programs could advance their goals, and address specific challenges, by applying insights from behavioral sciences, including behavioral economics. It is called the Behavioral Interventions to Advance Self-Sufficiency (BIAS) project. Now, six years later, it has results from 15 randomized experiments conducted across seven states on the topics of employment, child support and childcare.
To get an overview and hear implementation lessons for human services agencies that might want to use these types of interventions — or “nudges,” as they are often called — we are joined by two researchers from the social policy research firm MDRC, which was a partner on the BIAS project. Lashawn Richburg-Hayes is a Director and Nadine Deshausay is a Research Associate at MDRC.
More information: For more information on the 15 projects, including their goals, strategies, results and costs, see MDRC’s PowerPoint presentation presented at the BIAS Capstone Convening in April 2016 [click here].
Aug 10 2016
Rank #9: Four fundamental principles of evidence-based policy and practice, drawing from U.S. and European experience: An interview with Howard White, Executive Director, Campbell Collaboration – Episode #150
What principles can help guide public leaders—whether policymakers or public managers—in their use of evidence-based policy to improve results? Howard White (@HowardNWhite) of the Campbell Collaboration joins us to share four fundamental principles:
- 1. Use the right evidence to answer the right question. Different types of evidence — e.g., monitoring, process evaluation, impact evaluation and systematic reviews — can all produce useful information for decision makers. But no type of evidence should be used to answer questions that are beyond its scope.
- 2. Don’t rely on single studies. When possible, leaders should avoid making important funding decisions based on single studies, especially those done in one site. That’s because the findings from one study are often different from those of further studies. The best approach is to use systematic reviews (where they exist), meaning syntheses of multiple high-quality studies.
- 3. Context matters for transferring evidence. Why do findings from one study often not replicate in another? A key reason is that context matters. For example, when a home visiting program found to be effective in the U.S. was tested in Britain, it produced no impact. Why? Likely it was the different context: Britain already provides services to low-income parents that are quite similar to the home visiting program in the U.S. It is why leaders should test out, with rigorous evaluation, programs and initiatives in their own setting, particularly if previous research was conducted in a different context.
- 4. Evidence-based policy is not a blueprint (aka cookie cutter) approach. This is a way of summarizing the previous two principles. While architects can take the blueprints for one building and build the same building elsewhere, and chefs can take a recipe from one restaurant and cook it in another, public leaders need to be careful when applying research from one place or setting to another.
Dr. White is the Chief Executive Officer of the Campbell Collaboration, a nonprofit best known for its use of systematic reviews to help policymakers and others make well-informed decisions. Previously he was the founding Executive Director of the International Initiative for Impact Evaluation (3ie) and led the impact evaluation program at the World Bank’s Independent Evaluation Group.
Photo credit: European Union
Jun 25 2017
Rank #10: Creating successful researcher-practitioner partnerships at the Federal level: An interview with Dayanand Manoli, Professor, University of Texas at Austin – Episode #128
An important and underused opportunity for public agencies to improve their results and tackle critical challenges is researcher-practitioner partnerships. When researchers and government executives team up, public agencies can get credible answers to important operational and strategic questions. That can include insights from empirical analyses as well as from field experiments.
To get insights into what it takes to create a successful researcher-practitioner partnership, we’re joined by Dayanand Manoli. He is an economist at the University of Texas at Austin whose research interests include social security and retirement policy, income tax policy, and education policy. He has collaborated with the IRS on several research projects.
Jul 01 2016
Rank #11: Making rigorous program evaluation easier with RCT-YES software: An interview with Peter Schochet, Fellow, Mathematica Policy Research – Episode #137
Public leaders — whether they’re helping run a state agency, a school system, a hospital, a set of Head Start centers or any other organization — are likely to implement changes over time, whether it’s adjusting programs or adding new services. Maybe it’s a new curriculum for students in a school district or a new intake procedure for patients in a hospital. Whatever the change, how can those leaders determine if it is actually effective?
Our focus today is new software, called RCT-YES, designed to help public leaders (and the researchers who work with them) answer that question. It was funded by the Institute of Education Sciences, the statistics, research, and evaluation arm of the U.S. Department of Education, and developed in partnership with Mathematica Policy Research. The software, available free to download online, is based on new statistical methods for analyzing data from randomized controlled trials (RCTs).
To learn more, we are joined by Peter Schochet. He is a nationally known methodological expert in program evaluation and a Senior Fellow at Mathematica. He led the team that developed RCT-YES.
Web extra: For those with deeper expertise in evaluation, Peter Schochet gives an overview of how the RCT-YES software is designed to conduct a wide range of analyses using RCT or QED data and how the software uses new statistical methods for analyzing those data. [click here]
Aug 19 2016
Rank #12: Insights from the City of New Orleans’ analytics unit, NOLAlytics, about using data to improve city services: An interview with Oliver Wise, Director, Office of Performance and Accountability, City of New Orleans – Episode #114
The City of New Orleans under Mayor Mitch Landrieu has gained a reputation as being one of the most innovative and data-driven city governments. An important element in those efforts is the Office of Performance and Accountability, launched in 2011. The mission of the office is to use data to set goals, track performance, and drive results across city government. In 2015, it launched an analytics unit called NOLAlytics that undertakes data-driven projects to improve city services.
To learn more, we are joined by Oliver Wise (@ojwise). He is the founding director of the Office of Performance and Accountability.
Mar 18 2016
Rank #13: How the Rhode Island Innovative Policy Lab (RIIPL) works: An interview with Justine Hastings, Director, RIIPL – Episode #147
In 2015, a unique collaboration was launched called the Rhode Island Innovative Policy Lab (RIIPL). It is a partnership between researchers at Brown University and the Office of the Governor of Rhode Island, with the goal of helping state agencies design evidence-based policies to better serve Rhode Island families.
RIIPL’s goal is to use data and science to improve policy, alleviate poverty and increase equity of opportunity. To do that work, it has created a new linked database of public programs, connecting more than 100 previously independent data sets.
May 26 2017
Rank #14: Strategies to sustain program impacts for children and adolescents: An interview with Greg Duncan, Professor, University of California, Irvine – Episode #159
Many interventions that aim to increase the cognitive or socioemotional skills of children and adolescents have shown positive results, but far too often their impacts quickly disappear as children get older. Some programs, in contrast, have shown longer-lasting effects. In a new study published in the Journal of Research on Educational Effectiveness, Greg Duncan and his co-authors set out to identify the key features of interventions that can be expected to sustain persistently beneficial program impacts. They include:
- Skill building: Identifying key skills and building them in an intervention, producing impacts into the future. These might include analytical thinking, delayed gratification or grit.
- Foot in the door: Designing the right intervention at the right time to help a child or adolescent through a period of risk or opportunity, such as interventions that keep young people from repeating grades.
- Sustaining environments: Providing additional interventions that build on the gains of the initial intervention, essentially creating “recharging stations” to sustain initial gains.
To learn more, we are joined by Greg Duncan. He is a Distinguished Professor in the School of Education at the University of California, Irvine.
Jul 31 2017
Rank #15: The use of impact bonds around the world: An interview with Emily Gustafsson-Wright, Fellow, Center for Universal Education, The Brookings Institution – Episode #158
Social Impact Bonds, also called Pay for Success projects in the U.S., draw on private sources of capital to fund preventive services, with governments acting as the outcome funders, paying back the money with a profit if specific targets are met. The approach started in the U.K. and is now being used in many countries. A related strategy, the Development Impact Bond, has also emerged; as the name suggests, these bonds are primarily used in developing countries. They fund social interventions and involve third parties, such as donor agencies or foundations, as the outcome funders rather than governments. Overall, an estimated $200 million in upfront private capital has been leveraged by impact bonds for social services worldwide over the last six years, an amount that is expected to triple by 2020.
To learn more about global trends in impact bonds, we are joined by Emily Gustafsson-Wright (@EGWBrookings), a Fellow at the Center for Universal Education at the Brookings Institution. She is the co-author of the recent report, The Potential and Limitations of Impact Bonds: Lessons from the First Five Years of Experience Worldwide.
Jul 26 2017
Rank #16: How the Institute of Education Sciences at the U.S. Dept. of Education is helping the education field to learn and do what works: An interview with Russ Whitehurst, Senior Fellow, The Brookings Institution – Episode #119
Over the last 15 years, the field of education has become considerably more evidence focused, including a growing number of high-quality studies about how to help students succeed in school. An important catalyst has been the Institute of Education Sciences (IES). It is the independent, non-partisan statistics, research, and evaluation arm of the U.S. Department of Education. Created in 2002 during the George W. Bush Administration, it has continued to flourish under the Obama Administration and today has a budget of about $670 million and a staff of 180.
To learn more, including lessons for other public agencies, we’re joined by Russ Whitehurst. He was the first director of IES and served in that role from 2002 to 2008. Today he is a Senior Fellow at the Brookings Institution, including serving as editor of the Evidence Speaks series.
Web extra: Russ Whitehurst describes the origins of IES, including some of the key people involved in its creation and launch. [click here]
Apr 15 2016
Rank #17: A primer on the Commission on Evidence-Based Policymaking’s recommendations: An interview with Nick Hart, Bipartisan Policy Center – Episode #160
While Democrats and Republicans can’t seem to agree on much these days, there was a bright spot for bipartisanship recently: Republican Speaker of the House Paul Ryan and Democratic Senator Patty Murray joined together to praise the recommendations of the Commission on Evidence-Based Policymaking (CEP), which Ryan and Murray launched last year. The Commission was co-chaired by Katharine Abraham of the University of Maryland and Ron Haskins of the Brookings Institution.
Some of the Commission’s key recommendations focus on making the most of the data the government already collects by giving qualified researchers—including academics as well as evaluation experts within government—greater access to data from government programs and surveys. At the same time, the CEP calls for strengthening privacy protections to ensure that those data are not misused. It also recommends ways that departments can increase their evidence capacity, meaning their ability to use and build evidence about what works.
To learn more, we are joined by Nick Hart (@NickrHart) who served as the Policy and Research Director for the Commission. Today he is the director of the Bipartisan Policy Center’s Evidence-Based Policymaking Initiative.
Oct 04 2017
Rank #18: Three strategies to promote relevance in program evaluations so that findings are useful to policymakers and practitioners: An interview with Evan Weissman, Senior Associate, MDRC – Episode #117
In program evaluation, using the most rigorous methods possible is essential for producing credible research findings. But beyond the goal of rigor, relevance is important too. In particular, the more that evaluations are able to address specific research or implementation questions that are of interest to practitioners and policymakers, the more likely that the findings will actually get used.
A rigorous evaluation (using a randomized controlled trial) of a student-aid initiative, called Aid Like a Paycheck, recently took three additional steps, beyond typical program evaluation, to ensure that the study produces information that is relevant to end users. The strategies will be of interest to other program evaluators, but also to foundations and other funders who want to support rigorous and relevant program evaluations. The strategies are:
- Implementing a pilot phase — in fact, one that ran longer than most (about 2 1/2 years);
- Forming an advisory group of stakeholders to provide input into the design of both the intervention and the research study; and
- Doing outreach to other stakeholders about both the preliminary intervention design and research design to get additional input.
To learn more, we’re joined by the evaluation’s lead researcher, Evan Weissman. He is a Senior Associate at the nonprofit research firm MDRC and has over 15 years of experience at MDRC directing projects, providing technical assistance, conducting qualitative research, and disseminating findings in a wide range of education and social policy settings.
Apr 04 2016
Rank #19: How Utah became a leader in evidence-based policymaking: An interview with Kristen Cox, Director, Governor’s Office of Planning and Budget, and Jonathan Ball, Director, Utah Fiscal Analysts Office – Episode #132
Utah is one of the top states in the U.S. in terms of evidence-based policymaking and budgeting. In particular, with efforts by the Utah State Legislature and the administration of Governor Gary Herbert, Utah has created a variety of agency-specific and cross-agency tools to incorporate evidence into policy and funding decisions. That includes:
- A requirement from the Governor’s budget office that agencies seeking new funding provide evidence of program efficiency and effectiveness and, for new programs, describe their program evaluation strategy.
- The Herbert administration’s use of a performance management framework for agencies.
- A requirement by the Legislature that proposals for new or significantly expanded programs require a performance note that describes how the program will measure its success (with followup by legislative auditors to track results).
- A statewide registry of evidence-based prevention interventions that guides the Utah Division of Substance Abuse and Mental Health in contracting decisions.
- The use of a comprehensive cost-benefit model in juvenile justice to help lawmakers identify evidence-based policies that provide the best return on taxpayers’ investment.
To learn more, we are joined by Kristen Cox, the Director of the Office of Planning and Budget for Governor Gary Herbert, and Jonathan Ball, the Director of the Utah Fiscal Analysts Office for the Legislature.
Jul 22 2016
Rank #20: Using intensive, individualized math tutoring to boost academic outcomes of disadvantaged youth: An interview with Jonathan Guryan, Professor, Northwestern University – Episode #121
Improving schooling outcomes of disadvantaged youth is a top policy priority in the United States, but few interventions have produced convincing evidence that they can improve those outcomes, especially for adolescent youth — the age at which socially costly outcomes occur, such as high school dropout. As a result, it may be conventional wisdom that, by adolescence, it is too late and too costly to improve academic outcomes of children in poverty.
A recent study (and a related Hamilton Project policy proposal), however, suggests that this conventional wisdom is wrong. The study uses a rigorous evaluation design — a randomized controlled trial — to examine the effects of intensive, individualized (two students to one tutor) math tutoring among 9th and 10th grade boys in twelve Chicago public schools.
To learn more, we are joined by one of the study’s nine authors, Jonathan Guryan. He is a professor of human development and social policy at Northwestern University and a fellow at Northwestern’s Institute for Policy Research.
Apr 26 2016