Level Up Decision-Making with Research-Based Policy Analysis: Three Approaches Consultants Use to Support Local Government
- Aaron Sparks
- Sep 11
- 5 min read
Local governments face complex policy challenges that require careful consideration of evidence, trade-offs, and community values. Elected officials and municipal staff often have limited time and capacity to conduct in-depth analysis. Consultants can fill this gap by applying structured policy analysis and program evaluation methods that strengthen decision-making. This post highlights three approaches offered by the Sparks Lab: the goals/alternatives matrix for comparing policy options, program evaluation using existing municipal data, and field experiment–style pilot programs. These approaches map to three common questions facing local decision-makers: 1) What is the best option to solve a new problem? 2) How effective are existing policies and programs? 3) How can we provide evidence that an innovative policy idea is actually effective?
1. The Goals/Alternatives Matrix: Structuring Policy Choices
One of the most widely used frameworks in policy analysis is the goals/alternatives matrix. This method clarifies the objectives (goals) a community seeks to achieve and systematically compares different policy options (alternatives) against those goals (Weimer & Vining, 2017).
Example: Urban Heat Islands in La Crosse, Wisconsin
A recent study by the La Follette School of Public Affairs explored strategies to mitigate urban heat islands in La Crosse through expanded green space (La Follette School of Public Affairs, 2023). In consultation with city leaders, the researchers identified the following goals that a potential policy solution should address:
Feasibility: defined here as monetary cost and implementation practicality
Heat mitigation: estimate of temperature decrease
Equity: impact on marginalized communities
Positive externalities: benefits beyond the direct solution, e.g., a park could also have a playground
Next, they developed several policy alternatives that could stand alone or be combined with one another. These were:
Increase tree canopy by planting more trees
Develop “pocket parks”
Incentivize green roofs and install them on city buildings
Their analysis drew on a careful and systematic review of research to project how each alternative scored on each goal. Based on those scores, they recommended expanding tree canopy cover and adding pocket parks, especially in neighborhoods that lack access to other green space.
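To make the method concrete, here is a minimal sketch of how a goals/alternatives matrix can be scored. The weights and 1–5 ratings below are illustrative placeholders, not figures from the La Follette report:

```python
# Minimal goals/alternatives matrix: score each alternative on each goal,
# then rank by a weighted sum. Scores and weights are hypothetical.

goals = {  # goal -> weight (weights sum to 1.0)
    "feasibility": 0.30,
    "heat_mitigation": 0.30,
    "equity": 0.25,
    "positive_externalities": 0.15,
}

# Ratings from 1 (poor) to 5 (strong) on each goal -- illustrative only
alternatives = {
    "expand_tree_canopy": {"feasibility": 4, "heat_mitigation": 4,
                           "equity": 4, "positive_externalities": 3},
    "pocket_parks":       {"feasibility": 3, "heat_mitigation": 3,
                           "equity": 5, "positive_externalities": 5},
    "green_roofs":        {"feasibility": 2, "heat_mitigation": 3,
                           "equity": 2, "positive_externalities": 3},
}

def weighted_score(scores: dict, weights: dict) -> float:
    """Weighted sum of an alternative's goal scores."""
    return sum(weights[g] * scores[g] for g in weights)

for name, scores in sorted(alternatives.items(),
                           key=lambda kv: weighted_score(kv[1], goals),
                           reverse=True):
    print(f"{name}: {weighted_score(scores, goals):.2f}")
```

In practice, the scores come from the evidence review and the weights from conversations with city leaders about community priorities; the matrix simply makes those judgments explicit and easy to revisit.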
2. Program Evaluation Using Existing Data
When governments already operate programs, a different analytic approach becomes valuable: program evaluation with administrative data. Rather than collecting new information, consultants can draw on data municipalities already track—budget records, service usage statistics, or performance metrics (Rossi, Lipsey, & Henry, 2019). Municipalities can sometimes suffer from DRIP (Data Rich, Information Poor) syndrome. This is where Sparks Lab can be of service: turning data into actionable information.
Example: Evaluating a Recycling Outreach Program
Consider a city that has funded a recycling education initiative for several years. A consultant can use waste tonnage data, recycling participation rates by neighborhood, and cost per household to evaluate whether the program has produced measurable results. By comparing trends over time or across areas with different levels of outreach, the evaluation can reveal whether the program is meeting expectations or requires redesign.
This approach is cost-effective and provides accountability to both officials and the public. Importantly, it demonstrates how existing data can be leveraged to inform evidence-based improvements without significant new expenditures.
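As a rough illustration of what this looks like analytically, the sketch below compares recycling participation before and after a hypothetical program launch in neighborhoods with high versus low outreach. All names, columns, and numbers are simulated stand-ins for administrative data a city would already hold:

```python
# A sketch of a pre/post comparison across outreach levels, using simulated
# rows in place of real municipal records.
import pandas as pd

df = pd.DataFrame({
    "neighborhood":   ["A", "A", "B", "B", "C", "C", "D", "D"],
    "year":           [2020, 2023, 2020, 2023, 2020, 2023, 2020, 2023],
    "outreach_level": ["high", "high", "high", "high",
                       "low", "low", "low", "low"],
    "recycling_rate": [31.0, 42.0, 28.0, 37.0, 30.0, 33.0, 27.0, 29.0],
})

PROGRAM_START = 2021  # hypothetical launch year of the outreach initiative
df["period"] = df["year"].map(lambda y: "post" if y >= PROGRAM_START else "pre")

# Mean participation rate by outreach level, before vs. after launch
summary = df.groupby(["outreach_level", "period"])["recycling_rate"].mean().unstack()
summary["change"] = summary["post"] - summary["pre"]
print(summary)

# Difference-in-differences style contrast: did high-outreach neighborhoods
# improve more than low-outreach neighborhoods over the same window?
effect = summary.loc["high", "change"] - summary.loc["low", "change"]
print(f"Estimated outreach effect: {effect:.1f} percentage points")
```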
3. Pilot Programs and Field Experiments
For new and innovative policies, municipalities often want stronger evidence of effectiveness before scaling citywide. Field experiments—or pilot programs with an experimental design—offer a rigorous way to test impact. Sparks Lab can design interventions where treatment and comparison groups are established, allowing causal inference about whether a program works (Gerber & Green, 2012).
Example 1: Reducing Residential Water Use
Researchers from the University of California, Santa Barbara teamed up with a local water district to reduce household water use during extreme drought (Hodges et al., 2020). While water managers often incorporate messaging into their policy designs, those messages are not always grounded in the psychological sciences and can be less effective as a result.
In this case, the researchers used the information-motivation-behavioral skills model to design persuasive messages. They randomly assigned 10,000 single-family households to different treatment and control groups to examine the effects of different messaging strategies, then monitored water usage. In the first month, households that received any message reduced water usage by over 500 gallons compared to the control group, and the reduction was even larger for high-usage households. Once effective messaging is identified, it can be scaled up to reduce usage across the entire community.
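The experimental logic is straightforward to sketch. The example below simulates random assignment of households to message conditions and a difference-in-means comparison against control; the condition names and numbers are made up for illustration and are not from Hodges et al. (2020):

```python
# Simulated field experiment: randomize households to conditions, then
# compare mean water use in each treated group against control.
import random
import statistics

random.seed(42)
CONDITIONS = ["control", "norm_message", "efficacy_message"]

# Randomly assign 10,000 hypothetical households to a condition
assignments = {hh: random.choice(CONDITIONS) for hh in range(10_000)}

def simulate_usage(condition: str) -> float:
    """Simulated monthly water use in gallons; treated households use less."""
    base = random.gauss(12_000, 2_000)
    return base - (550 if condition != "control" else 0)

usage = {hh: simulate_usage(cond) for hh, cond in assignments.items()}

# Difference in means: each message condition vs. control
control = [usage[hh] for hh, c in assignments.items() if c == "control"]
for cond in CONDITIONS[1:]:
    treated = [usage[hh] for hh, c in assignments.items() if c == cond]
    effect = statistics.mean(treated) - statistics.mean(control)
    print(f"{cond}: {effect:+.0f} gallons vs. control")
```

Because assignment is random, any systematic difference between a treated group and the control group can be attributed to the message itself rather than to pre-existing differences between households.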
Example 2: Universal Basic Income in Stockton, CA
As cities grapple with the potential for job losses from automation, one policy gaining traction in some areas is Universal Basic Income (UBI). While providing UBI is straightforward, its potential impacts are not well understood. Could it decrease interest in working? Could it have positive spillover effects extending beyond the people receiving the payments?
In 2019, Stockton, California became one of the first US communities to pilot this kind of program. City leaders randomly selected 125 residents of lower-income neighborhoods to receive $500 per month for two years, with no strings attached. Recipients' job prospects improved, their financial stability increased, and their overall wellbeing was enhanced. Because selection into the program was random, researchers can have high confidence that it was the UBI, not some more general factor, that caused these positive outcomes (Ghuman, 2022). Based on the success of this study, California is now providing block grants for other communities to pilot similar programs.
This approach reduces risk by ensuring that policy expansion is based on demonstrated effectiveness. It also builds public trust by showing that the city tested solutions carefully before investing at scale.
Conclusion
Policy analysis is not one-size-fits-all. By tailoring the method to the policy question, consultants can help local governments move from uncertainty to clarity, and from political debate to evidence-based decision-making. Whether through structured comparison of alternatives, careful evaluation of existing programs, or rigorous testing of new ideas, consultants add analytical capacity and credibility to local governance.
For communities weighing tough choices, engaging in systematic policy analysis ensures that decisions are transparent, accountable, and grounded in evidence.
References
Gerber, A. S., & Green, D. P. (2012). Field experiments: Design, analysis, and interpretation. W. W. Norton & Company.
Ghuman, U. (2022). A policy review of the SEED (Stockton Economic Empowerment Demonstration) project: Is the devil in the details? International Journal of Community Well-Being, 5(4), 819-830.
Hodges, H., Kuehl, C., Anderson, S. E., Ehret, P. J., & Brick, C. (2020). How managers can reduce household water use through communication: A field experiment. Journal of Policy Analysis and Management, 39(4), 1076-1099.
La Follette School of Public Affairs. (2023). La Crosse green space expansion: 2023 Workshop in Public Affairs report. University of Wisconsin–Madison. Retrieved from https://lafollette.wisc.edu/research/2023-la-crosse-green-space-expansion/
Rossi, P. H., Lipsey, M. W., & Henry, G. T. (2019). Evaluation: A systematic approach (8th ed.). SAGE Publications.
Weimer, D. L., & Vining, A. R. (2017). Policy analysis: Concepts and practice (6th ed.). Routledge.

