WAP is all the fuss about?

Posted November 29, 2015

In 2001 I found a passion: low-income program evaluations. It was the year PA Consulting (now Tetra Tech) kicked off a multi-year low-income evaluation for the Wisconsin Weatherization Assistance and Home Energy Assistance Programs, a project that I was an integral part of for all six years and then some.

The research questions were complex and included: How do these programs, separately and combined, affect customers’ sustainability? What are these low-income customers’ energy burdens, and how do they compare with those of non-low-income and just-over-income customers? What financial and other burdens do these customers experience, and how, if at all, do the assistance programs relieve some of those burdens? To that end, what are the non-energy benefits, and how can we quantify them? And, of course, what are the program’s impacts, not only in terms of energy savings but also in terms of arrearages?

To address these complex issues, we were given the latitude to complete extensive research that included thousands of participant and nonparticipant surveys, hundreds of interviews, a billing analysis, two non-energy benefits analyses, and an arrearage analysis. The multi-year time frame also allowed us to pilot approaches and conduct targeted process assessments. It was, dare I say, a fun project.

However, beyond the academic intrigue, the project left a lasting impression on me. I had the opportunity to speak directly with participants and eligible nonparticipants and hear (sometimes at length) their stories. I saw clear evidence that the programs were affecting customers’ lives, not only from an energy bill perspective but also in terms of mental assurance; customers often said the benefits gave them “peace of mind.” I heard from community action agencies and contractors about the home conditions they encountered and what the program did to improve the safety and comfort of residents’ living conditions, many of which included children and/or the elderly. And even with this research, we could only barely crack the nut of why customers didn’t participate in a free program, an answer far more complex than “lack of demand,” and one investigated in other research (such as California’s Energy Savings Assistance Needs Assessment Study (2013)).

So it should come as no surprise that the recent controversy around a paper published by E2e, “Do Energy Efficiency Investments Deliver? Evidence from the Weatherization Assistance Program,” struck a nerve with me. This paper, along with the follow-up article “Energy efficiency upgrades cost double the projected benefits,” puts forth that “residential energy efficiency investments may not deliver on all that they promise,” hitting on the need for cost-effective savings (policies should ensure greater “bang for the buck”) and concern over the reality of savings relative to the predictive modeling used to direct the measures installed. The New York Times article “For Government that Works, Call in the Auditors” also covered this topic, discussing the approach for quantifying non-energy benefits and the need for independent auditing to validate energy program results.

The results themselves are not concerning. The analysis reported savings of 10%–20%, which is consistent with other independent statewide WAP evaluations conducted around the country. Further, I agree with their finding that the NEAT audit overestimates measure-level savings. Engineering models – whether NEAT audits, simulation models, or engineering estimates – struggle to account for interactive effects and pre-existing home conditions that can affect savings (such as leaky windows). Independent evaluations have reported these findings in countless studies, and impacts have been adjusted accordingly. These findings are not new to our industry.
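For readers less familiar with how those adjustments typically work, here is a minimal sketch of the “realization rate” approach evaluators commonly use to reconcile engineering estimates with measured savings from a billing analysis. All of the numbers are made up for illustration; they are not from the E2e study or any WAP evaluation.

```python
# Minimal sketch (illustrative numbers only): adjusting engineering-model
# savings with a realization rate derived from a billing analysis.

predicted_kwh = 2_500   # hypothetical audit (engineering-model) estimate per home
evaluated_kwh = 1_600   # hypothetical savings measured in a billing analysis

# Realization rate: measured savings as a share of predicted savings.
realization_rate = evaluated_kwh / predicted_kwh   # 0.64 in this example

# Engineering estimates for similar homes can then be adjusted downward.
adjusted_estimate = predicted_kwh * realization_rate
print(f"Realization rate: {realization_rate:.0%}; adjusted savings: {adjusted_estimate:.0f} kWh")
```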

What is far more troublesome is the messaging around the results, which calls into question not only the value of WAP efforts but residential efficiency programs writ large; essentially, that these programs are not valuable components of energy efficiency policy. Further, the study failed to recognize the program’s theory and goals, as well as external non-energy benefits such as employment and health and safety (discussed further below).

Federal and most state-level WAP goals are fairly straightforward: reduce the burden of energy prices on the disadvantaged and improve households’ health and safety, with a focus on vulnerable households. A number of states have considered or are considering targeting high-energy users in an effort to optimize energy-saving opportunities, but program-level cost-effectiveness is not a stated goal of the program. When one considers that approximately 20 percent of funds can be used for the secondary goal of improving the health and safety of homes, the omission of program-level cost-effectiveness makes sense.

By all standards, and as shown in most third-party evaluations (including the E2e study), the program meets its stated goals of serving low-income customers and reducing heating and cooling bills. What the study questions, as reported in E2e and related media articles, is whether the savings justify the costs. But determining whether the savings justify the costs is a particularly complex question for a social program such as DOE’s WAP, and not simply a matter of economic costs relative to energy benefits. The goals of WAP extend far beyond this simple calculation; WAP aims to achieve non-energy benefits including health, safety, and consumer well-being.
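To make that point concrete, here is a minimal sketch, with purely illustrative dollar figures (not WAP or E2e numbers), of how a simple benefit-cost ratio shifts once monetized non-energy benefits are counted alongside lifetime energy-bill savings:

```python
# Minimal sketch (illustrative values only): a per-home benefit-cost ratio
# with and without monetized non-energy benefits (NEBs).

measure_cost = 5_000.0              # hypothetical per-home weatherization cost ($)
lifetime_energy_savings = 3_500.0   # hypothetical present value of bill savings ($)
non_energy_benefits = 2_500.0       # hypothetical value of health, safety, arrears relief, etc. ($)

bcr_energy_only = lifetime_energy_savings / measure_cost
bcr_with_nebs = (lifetime_energy_savings + non_energy_benefits) / measure_cost

print(f"Energy-only benefit-cost ratio: {bcr_energy_only:.2f}")   # 0.70
print(f"Including NEBs:                 {bcr_with_nebs:.2f}")     # 1.20
```

The point is not these particular numbers; it is that whether the costs appear “justified” depends heavily on which benefits the test is allowed to count.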

Valuing non-energy benefits, such as health and safety gains and improved sustainability, is no simple task. That does not mean, however, that the benefits do not exist or cannot be measured with some degree of confidence. We have been recognizing non-energy benefits for the Weatherization Assistance Program for well over 20 years. As an example, the Department of Energy completed a fairly extensive non-energy benefits study in 1993, with a follow-up report completed in 2002. Many states, including Wisconsin, Massachusetts, California, and Connecticut, have also attempted to value non-energy benefits and continue to do so.

Evidence shows that loss of energy service has negative implications for low-income households. Old – but good – statistics documenting the effects of limited access to energy resources appear in a 1994 study by economist Roger D. Colton, published as part of a National Housing Institute article titled Energy and Low-Income Housing: Part 1, Energy Policy Hurts the Poor. The negative impacts include:

  • Increases in children’s health issues following the coldest winter months
  • Occurrences of house fires and related fatalities associated with alternate heating methods following shutoffs for nonpayment
  • Increases in homelessness and/or building vacancies following service termination

If national studies showed these causes and effects 20 years ago, is there any reason to believe, given the state of the current economy, that the effects would be any less dramatic today? If not, and we believe these non-energy benefits exist for a program such as WAP, we need to find ways to value and/or communicate those benefits more effectively.

Socially, these programs matter. In impact evaluations we are always forced to measure the counterfactual: what would have happened if the program did not exist (see the sketch after the points below for how a billing analysis approximates this). If we assess WAP holistically against its goals and review the decades of WAP evaluations and literature, we find that:

Irrefutably, if the program did not exist, low-income customers would produce significantly greater energy demand and would incur significantly greater energy costs. Even if the savings results were 40 percent of those measured in our studies, the results would remain meaningful. Applied across millions of homes, the energy impacts are significant.

Irrefutably, if the program did not exist, utilities would have to account for larger arrears and write-offs due to high energy costs to low-income customers.

Irrefutably, if the program did not exist, there would be far more homes with poor wiring, malfunctioning and improperly fitted furnaces, and poor or dangerous air quality (among other health and safety concerns). What is the true benefit of avoided home fires? The cost of carbon monoxide poisoning?
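As promised above, here is a minimal sketch of one common way a billing analysis approximates that counterfactual: comparing the change in participants’ usage against the change for a nonparticipant comparison group over the same period. The figures are hypothetical, not drawn from any WAP evaluation.

```python
# Minimal sketch (hypothetical data): a difference-in-differences style
# comparison of participant and nonparticipant billing data.

# Average annual usage (kWh) before and after the program year.
participants = {"pre": 14_000, "post": 11_800}   # hypothetical weatherized homes
comparison   = {"pre": 13_900, "post": 13_500}   # hypothetical comparison group

change_participants = participants["post"] - participants["pre"]   # -2,200 kWh
change_comparison   = comparison["post"] - comparison["pre"]       # -400 kWh (weather, economy, etc.)

# Program savings net of what would likely have happened anyway.
net_savings = change_comparison - change_participants               # 1,800 kWh per home
print(f"Estimated program savings per home: {net_savings:,.0f} kWh")
```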

We as an industry – and as a society – need to be more careful in defining and communicating the goals of our programs and their achievements. While I am focusing on DOE WAP in this piece, this issue pertains to all types of programs. E2e’s reports on the implications of their WAP findings and residential efficiency programs in general highlight the need to:

  • Clearly state near-term and long-term program goals and the associated metrics to measure progress against those goals.
  • Be transparent about the analysis and what it does and does not measure, including the limitations of the research and opportunities for methodological improvements.
  • Engage external experts and peer reviewers outside of the study team to push against assumptions, analysis, and messaging, forcing us to think more deeply.
  • Remain responsible stewards of the study findings and ensure that they are not exaggerated or applied inappropriately to other programs in other contexts.

Yes, these are common-sense takeaways, but they are worth stating in light of how the WAP findings have been used to make claims against the program and have been misconstrued in the media. It is our responsibility to accurately and fully account for how our (and other ratepayers’) funds are being spent, and for the impacts of those funds. And that responsibility puts the onus on us to push the envelope and be more thoughtful, rigorous, and impactful in our research, and to accurately and carefully communicate our findings as part of the same public good charge.

I look forward to progressing with you all in these areas.

Ink