Building Blocks to Make Solutions Stick
Capital is not the constraint, alignment is: Catalyzing large-scale climate and energy infrastructure requires government to act as a systems integrator—synchronizing policy, de-risking commercialization, modernizing valuation, and coordinating markets so private capital can move with speed and confidence.
Implications for democratic governance
- Visible coordination builds credibility: Investors, communities, and companies need to see how policy pieces fit together. Fragmented, asynchronous policymaking erodes trust and slows deployment.
- Risk tolerance must be publicly legitimized: If democratic institutions punish every failed deal but ignore portfolio-level gains, agencies will default to paralysis. A mature democracy must distinguish responsible risk-taking from mismanagement.
- Transparency is market infrastructure: Open data, common modeling tools, and clearer capital pathways empower regulators, communities, and innovators to interrogate and improve investment decisions.
Capacity needs
- Systems-level policy choreography: Agencies capable of synchronizing rules, guidance, financing programs, and permitting reforms on coordinated timelines rather than rolling them out in isolation.
- Transaction-speed infrastructure: Staffing models, underwriting playbooks, and surge capacity that match private-sector deal timelines while maintaining integrity.
- Interstate coordination platforms: Formal mechanisms for states to harmonize standards, pool procurement, share data, and replicate successful pilots without restarting from scratch.
- Accessible technical/economic infrastructure: Publicly credible data sets, modeling tools, and valuation methodologies that lower barriers to entry and allow meaningful third-party scrutiny.
- Deal templates and archetypes: Clear, standardized financing pathways that signal how government capital will engage at different risk tiers and technology stages.
Jump to…
Executive Summary
Historic commitments. Huge demand. Massive cost reductions. Ready technologies. Yet, infrastructure deployment levels are underperforming their potential. What gives? The U.S. clean energy sector has achieved remarkable milestones: solar and wind have tripled since 2015, costs have fallen 90%, and annual clean energy investment now exceeds $280 billion. Yet deployment has arguably fallen short of what both markets and the climate moment demand. The culprit isn’t a single bottleneck: not permitting, not subsidies, not technology readiness alone. The real constraint is misalignment across the multiple interdependent factors that investors need to see in place before committing capital at scale.
Think of it like a Rubik’s Cube: solving one face means nothing if the other five stay scrambled. This paper identifies six strategic levers that, when pulled in concert, can unlock the conditions for large-scale capital deployment:
- Market defragmentation: breaking down the patchwork of ~3,000 utilities and state-by-state rules that trap promising solutions in regional silos
- Commercialization partnerships: deploying innovative public-private joint ventures to de-risk capital-intensive, first-of-a-kind energy infrastructure that traditional markets won’t move on alone
- Transaction execution speed: closing the yawning gap between private-sector deal timelines and the multi-year gauntlet of public financing processes
- Policy synchronization: coordinating the release of rules, funding programs, and guidance so investors see a complete picture, not a puzzle missing half its pieces
- Holistic valuation: building common information infrastructure and market structures that capture the full economic value of energy solutions, moving beyond narrow cost metrics that systematically undervalue transformative infrastructure
- Proactive investor engagement: later-stage investors jumping in sooner and addressing the hard questions early to help bridge the “missing middle” between venture and infrastructure finance
The good news: the capital exists, the technologies are ready, and infrastructure is a solvable problem. With over 1,000 GW of clean energy in development and electricity demand projected to grow up to 50% over the next decade, the infrastructure build-out represents one of the largest capital deployment opportunities in American history. And global demand for U.S. clean energy technology has never been higher. The barriers identified in this paper are structural and systemic, not fundamental; most of the solutions proposed are actionable in the near term, without waiting for perfect legislation or perfect markets.
The window is open, but not indefinitely. For policymakers and investors alike, the question is not whether to act, but whether to act with the clarity, coordination, and urgency the moment demands. The frameworks, partnerships, and policy tools outlined here offer a practical roadmap for unlocking decades of economic growth, cost-of-living relief, and energy security for communities across every region of the country and beyond. The energy transition is not a cost to be managed; properly coordinated, it is a generational economic opportunity.
America has experienced extraordinary momentum in the growth and transformation of the energy sector. Solar and wind generation has more than tripled since 2015. In 2024, 50 GW of solar power was added to the U.S. grid – not only a record, but the most new capacity any energy technology has added in a single year. Technology costs have plummeted: utility-scale solar and battery energy storage have each fallen 90% since 2010, making them among the lowest-cost forms of electricity in many places. Domestic manufacturing capacity has also surged, with hundreds of new clean energy manufacturing facilities, many of which have already come online. These technologies and projects have been reinvigorating communities and creating jobs across the nation, and the benefits of these advances should continue as capital flows into the sector at an unprecedented rate. Clean energy investment in the United States has more than tripled since 2018 to $280 billion annually, with multiples more in commitments, and private markets alone have raised nearly $3 trillion over the past decade. For more established technologies like utility-scale solar and onshore wind, financing has become standardized: established project finance structures, robust secondary markets, accurate energy production forecasts, and predictable returns that align with the needs of institutional capital. These asset classes now exhibit many of the hallmarks of market maturity: transparent pricing, deep liquidity, sophisticated risk assessment frameworks, and predictable transaction execution. This progress has been galvanized by unprecedented governmental support, including the Bipartisan Infrastructure Law of 2021 and the Inflation Reduction Act of 2022 (IRA), alongside bold state policies, aggressive corporate clean energy procurement, sustained advocacy, and relentless technological innovation.
Yet despite these achievements and the trillions of dollars in committed capital, the pace of deployment has arguably fallen short of what the market opportunity demands and what the climate crisis requires. Hundreds of IRA-supported energy and manufacturing projects have faced delays or cancellations, driven partly by increased economic and logistical uncertainty (e.g., in the cost and availability of equipment, permitting timelines, and import and export regulations) and partly by sharp reversals in federal funding priorities (e.g., tax incentive changes from the One Big Beautiful Bill Act (OBBBA) and direct project cancellations). Moreover, emerging solutions are still taking time to achieve true commercial liftoff. Despite billions in federal funding allocations, only a few carbon capture projects have meaningfully progressed, with others indefinitely delayed or cancelled. Growth of some demand-side energy solutions, like behind-the-meter solar and virtual power plants, has remained relatively regional, despite favorable economics. Advanced nuclear energy, though it remains a policy priority, has been challenged by long delivery timelines, and many project investors remain wary of the risk of cost overruns. Sustainable aviation fuel production capacity increased tenfold in 2024, but it still meets only a very small fraction of jet fuel demand. Furthermore, transmission capacity has not grown nearly as quickly as needed and remains a key constraint on progress.
The situation – deployment deficiencies despite historic support – can be entirely solved with just one thing: to stop acting as if there is just one thing. The relative underperformance described earlier is not attributable to any singular constraint: contrary to what some have argued, it is not solely about removing permitting roadblocks, nor about creating more subsidies. Rather than seeking silver bullets, real progress can be made by recognizing that multiple elements are involved and that misalignment among those elements has curbed the rate of progress.
Accelerating new energy project investment is somewhat like solving a Rubik’s Cube. The key to solving the puzzle lies in its interdependence: every twist of one face ripples across the others. You can solve one face perfectly, but if the other five remain scrambled, you haven’t solved anything in the grand scheme. Solving the puzzle requires coordinated progress across multiple dimensions simultaneously, in the right sequence. The cube rewards systems-thinking and algorithms over siloed, uncoordinated actions. All six of its faces must align to win.
The same is true for new energy finance. When most project investors look at a sector, they approach it as a puzzle, looking for as much alignment across the full picture as possible before becoming sufficiently comfortable to deploy capital. That alignment is the indicator that the risk-reward balance is in the right place to justify investment. Rather than defining the theory of progress around singular issues, policy and industry stakeholders need to create sufficient alignment of multiple puzzle pieces at the same time.
This paper offers a few perspectives on how to achieve the conditions for larger-scale capital deployment, drawing on both lessons learned and promising concepts from across the industry. Like the six faces of the Rubik’s cube, six priority strategy areas are articulated: market defragmentation, commercialization partnerships, transaction execution speed, policy synchronization, holistic valuation methodologies, and proactive investor engagement. Note that while some of the underlying strategies may take time and require deep structural realignment, most of the concepts discussed herein are actionable in the near term. A range of stakeholders, from policymakers to infrastructure investors to community and industry advocates, need to move in concert to solve the puzzle and unlock greater investment.
The opportunity before us is immense. With over a thousand gigawatts of clean energy in development and electricity demand projected to grow up to 50% over the next decade, the required infrastructure build-out represents one of the largest capital deployment opportunities in American history. Similarly, global demand for U.S. clean energy technologies has soared over the past few years, as many countries seek to diversify away from China or access some of the unique technologies the U.S. is developing. And solutions can’t come quickly enough in this era of fast-growing energy demand, spiking electricity bills, aging physical infrastructure, and burgeoning new industries – not to mention a plethora of old and new technology solutions and operational strategies poised to meet the moment. The question is not whether the capital exists (it does!), whether energy solutions are available (they are!), or whether there is a silver-bullet salve (there isn’t!). It is whether we can align the six faces of our energy finance cube quickly and strategically enough to channel the right types of capital where it’s needed most, when it’s needed most. The energy transition presents a real opportunity to drive economic transformation that, when properly coordinated, can unlock decades of strong economic growth, cost-of-living reductions, innovation, and prosperity across every region of the country and the planet.
Chapter 1. “Come together”: Defragmenting markets through regional coordination
For many companies across multiple sectors, the U.S. market can seem like a golden goose. Its large population, high income, diversified economy, and strong purchasing power typically mean large total addressable markets (TAM). While those drivers can be true, the reality for many energy and climate solutions, especially early on, is that the large TAM is challenging to realize because the market can be highly fragmented. In those sectors, the U.S. is less of a “market” per se and more of a loose mosaic. There are about 3,000 utilities, ranging from large publicly traded corporations to rural cooperatives, operating in both regulated and deregulated markets. States and territories not only have different market drivers, but they also have their own regulations and regulators, business processes, permitting requirements, and market rules. This complicates go-to-market strategy, as companies typically need large, locally focused commercial organizations to tap these markets, which can be expensive and time-consuming to build, especially for newer companies. It also often means a less efficient path to scalability, as each set of local customers and regulators needs to be brought up to speed and convinced of a solution’s fit (compared to having a few entities that speak for the entire country). Beyond the commercial elements, this dynamic introduces technical barriers to scalability, especially where deep integration and redesign are required to meet local requirements. Over the years, this has flummoxed both U.S. startups and experienced foreign investors, who have approached the U.S. market with high expectations, only to be confounded by these complexities.
Harmonize local requirements to avoid the piloting death spiral
To the extent possible, to promote more rapid and widespread investment in and deployment of solutions that could benefit their communities, local (and national) governments need to work more closely together to harmonize market designs and project requirements. Oftentimes, a solution provider may implement a solution in one state, but when they go to another state, that utility might make them start from the beginning and prove themselves all over again – many innovators have likened these continuously repeated pilots to death by a thousand cuts. If a good solution is successfully deployed in one place, the barriers to deploying the same solution in another market need to be lower. This concept applies to permitting and design as well. The more that permitting processes and tariff structures, or modular elements within them, can be templatized, the more time and uncertainty are reduced. Additionally, uniformity lowers development costs because the solution doesn’t have to be fully reengineered for each locale. States could also standardize equipment and project technical requirements among themselves, to minimize costly product redesigns and reengineering. Furthermore, state stakeholders seeking to deploy similar solutions should consider entering into reciprocal partnerships or MOUs, supporting collaboration that’s both technical (e.g., between their utilities and independent engineering organizations) and policy-focused (e.g., between their policymakers and regulators). Under that type of agreement, once a solution is evaluated and approved in one state, it can receive an expedited evaluation and approval process when brought forward in a partnering state.
Not just physically, but digitally
The dynamic described previously is not limited to hardware. It is present for many software solutions as well, particularly those that have to integrate with local operators’ control systems. For example, a locality may be interested in deploying a virtual power plant (VPP), a relatively low-cost, software-based approach that aggregates distributed and controllable energy resources to provide large-scale energy services. A VPP deployment would have to connect to a utility’s and/or grid operator’s distributed energy resource management system (DERMS) to talk to devices, to energy management systems (EMS), to energy dispatch and trading systems for wholesale market participation, and to customer information systems that track billing and energy usage – all while complying with cybersecurity requirements. Note, too, that several utilities have yet to fully roll out these foundational modern digital systems. Moreover, each utility and grid operator might have its own implementations (vendors, versions, configurations, rules) of these systems. Even outside of controls-oriented functions, the variety of data structures, naming conventions, and IT systems can make it difficult to access available market data (e.g., energy pricing), electricity tariff rate structures, and other highly important information. This is one reason many energy software solutions concentrate their operations in just a few markets: the costs and time associated with integrating with another market’s cadre of systems can be hard to justify and thwart efforts to scale.
This is an area where states can work together (along with their respective utilities, grid operators, technology providers, and regulators) to agree upon more uniform ways to structure data, access market information, and securely interface with market and control systems. This could include partnering with groups that are developing common standards and protocols (such as RMI’s VP3 and LF Energy), building an implementation roadmap across those states and utilities (e.g., accelerating implementation of FERC Order 2222), and taking corresponding legislative actions to ensure investments are made to build out the enabling foundational digital systems.
Aggregated demand and collaborative procurement
In a similar vein, collaboration between state and national governments can level the playing field and expand markets. When it comes to infrastructure, states and countries may often endeavor to ensure that local manufacturing capacity and supply chains are set up within their territories – this can create long term economic growth opportunities, reduce equipment delivery risks, and improve the public’s return on their investment.
However, issues can arise when multiple states duplicate efforts in the same sector. Take offshore wind in the early 2020s. Multiple eastern U.S. states were not only supporting a new wave of projects, but many funding programs also required that projects source materials and equipment from suppliers located in-state. For a small, burgeoning industry, the effect was dilutive, and factory investments slowed because the scaling economics were harder to justify. After all, there are only so many blade, monopile, vessel, and cabling factories that can be supported at a given time, especially early in the industry’s development. In response, thirteen states and the federal government signed a memorandum of understanding in which they agreed to take more regionalized, collaborative approaches to procurement and supply chain development.
Relatedly, an area where significant improvements could be made is the procurement of critical common equipment. To accommodate the load growth from new factories, data centers, and building and vehicle electrification efforts, many pieces of equipment will be needed irrespective of the underlying energy sources: things like transformers, circuit breakers, and switchgear. There are considerable production capacity shortages and long lead times for these, which raise costs and create execution risks for projects. Despite the robust market demand, manufacturers have been somewhat hesitant to invest in expanding production, worrying that the demand will not materialize and leave them with underutilized or even stranded assets.
State and local governments can respond to these challenges in multiple ways. For instance, they could pool their demand and drive standardization so that the equipment is more fungible and interchangeable, as previously highlighted by the U.S. National Infrastructure Advisory Council’s report on protecting critical infrastructure. States can also create well-defined demand guarantees, providing assurances to manufacturers and consumers that necessary equipment will be there, as needed.
For example, in 2013, the Illinois Department of Transportation led a seven-state procurement initiative to jointly acquire a standardized set of efficient locomotives and railcars, with additional funding provided by the Federal Railroad Administration to support domestic manufacturing. This effort pulled new, more efficient railway vehicles into the market and lowered lifecycle costs. These concepts can apply to secondary markets as well; for example, providing residual value guarantees on heavy-duty electric truck procurement to help mitigate risks on the initial purchase (e.g., traditional resale markets not emerging or asset residual values not being realized as projected).
Chapter 2. “That’s what friends are for”: Overcoming commercialization barriers through partnerships
The next facet of the cube pertains to early market formation and investment into technologies that are not yet fully commercialized, especially first-of-a-kind (FOAK) and early-of-kind (EOAK) infrastructure, and why capital formation has been easier for some types versus others. Differences in the ability to demonstrate, commercialize, and scale new infrastructure do not purely depend on the ultimate value of the solution; they are often driven by how the characteristics of that infrastructure affect the pathway to value realization, particularly the inherent capital-intensity and modularity of the solution.
For highly modularized solutions with lower capital requirements, the pathway can be much more straightforward. Take solar photovoltaics. Though module R&D and fabrication are far from trivial, technical demonstration and deployment are relatively simple: one can usually install and field-test new solar technology quickly and inexpensively. The advantage extends when scaling to bigger projects: once you reach megawatt scale, given the modularity of solar cells and their balance of plant (inverters, cables, trackers, etc.), you can obtain a reasonably clear picture of how even gigawatt-scale projects should fundamentally work. Other highly modular solutions like batteries and EV chargers have enjoyed similar advantages. Tesla, for instance, in order to address potential range anxiety for its customers, leveraged its own balance sheet and government funding to build out a network of standardized Superchargers, taking advantage of charger modularity to build in waves. This characteristic has enabled rapid demonstration and scaling of such solutions, as the financial community can enter the market, investigate, learn, and expand with relatively low risk.
However, the commercialization process becomes significantly more challenging with more capital-intensive and complex technologies, such as carbon capture, nuclear, and e-fuels. For some of these solutions, the early projects can require billions of dollars to construct and demonstrate, and smaller-scale systems might not provide representative technical proof points of how the larger system needs to operate. Larger sums of capital are often needed for early deployments, and the financial risk is compounded because the long-term payoff is not guaranteed: FOAK and EOAK projects typically carry more uncertainty, and the learning rates of subsequent projects may be less obvious. Furthermore, instead of the first-mover advantage often seen with new technologies, early project investors here might actually suffer from a first-mover disadvantage: they bear the risk and cost of participating in the earliest projects but don’t accrue the benefits and learnings realized in projects executed down the line.
To address the challenges often seen with capital-intensive, less modular FOAK and EOAK infrastructure projects, adopting a new suite of partnership structures can greatly accelerate market formation and improve investability.
Multi-project joint ventures
Catalyzing capital for this class of infrastructure may mean going significantly farther than providing a few incentives and mounting strong advocacy efforts. More complex and elaborate agreements, private and/or public, are often necessary to drive deployment – particularly deployment coalitions, consortiums, and joint ventures that support multiple projects. At the highest level, these can take several forms and can be originated by the private or public sector, as appropriate. For illustration, here are a few examples of commercial approaches to scaling new nuclear energy projects, roughly in increasing order of relative deployment impact:
- Demand-side vehicles/buyers’ clubs. This is where consumers band together and commit to purchase a solution, which can serve as a market-signaling mechanism to stimulate demand and attract solutions. There are example programs run by governments, such as the First Movers Coalition, and others led by the private sector, such as the Advanced Clean Electricity Initiative (Google, Nucor, Microsoft). Large corporations and governments both have a long track record of entering into energy procurement contracts for solutions with attributes they deem highly desirable, like low-carbon, rapidly deliverable, and/or resilient. Some of these entities have started to strike above-market offtake contracts and guarantees to bring those solutions to market faster: hyperscalers with nuclear, Amazon with electric delivery vans, NYSERDA with offshore wind, and the U.S. Air Force with enhanced geothermal.
- Supply-side vehicles. Here, a development consortium, public or private, is formed to deliver new solutions across multiple projects and geographies, immediately delivering sufficient scale to suppliers while also amortizing some of the deployment risk. As an example, the UK government formed Great British Energy – Nuclear as a publicly funded, but arm’s-length, national development company for small modular reactor projects. It has been charged with activities like finding sites around the country, raising capital, selecting partners, and facilitating construction. As part of partner selection, it ran a competitive solicitation inviting technology vendors from around the world to showcase their offerings and compete to deploy across a range of sites. It ultimately selected a vendor, Rolls-Royce, as its preferred partner and is moving forward with project development. It is leveraging the power of its sovereign backing to derisk the projects, navigate regulatory hurdles, attract capital, and internalize the societal benefits of the investment (job creation, manufacturing, etc.).
- Joint supply-demand venture. Here, customers and developers tightly collaborate to deploy new solutions. A version of this was created with Amazon and X-energy’s partnership to deploy small modular reactors. Building off X-energy’s DOE Advanced Reactor Demonstration Program-supported deployment grant and its partnership with Dow Chemical (serving as both industrial offtaker and infrastructure delivery expert), Amazon not only committed to purchase the energy from 5 GW of SMRs across multiple sites to meet its needs, but also invested in the company itself, meaning the two are more aligned and Amazon shares in X-energy’s future success. This could be particularly meaningful because it represents the closest thing to a project ‘orderbook’ for nuclear in the U.S. in recent years, and thereby provides an opportunity to standardize the product (particularly given the same customer and same vendor), reduce future execution risk, and accelerate cost reductions through improved learning rates. Separately, but with some parallels, the U.S. DOE’s Gen III+ SMR program, created in 2024 and awarded in 2025, also prioritized applications that formed deployment consortia.
All of these structures offer significant advantages over pursuing projects on an individual basis. They provide demand signals to supply chains to create manufacturing capacity and to labor groups to create a workforce. Both of these stakeholders generally need to see firm demand signals before they will undertake significant investments, which are in turn typically needed to reach a solution’s cost and performance entitlement (otherwise creating a chicken-and-egg problem). These structures also create more concrete opportunities to drive project standardization; this not only allows a technology to achieve faster learning curves, but also helps to derisk and justify the investment by providing a more tangible line of sight to the larger market. The point about manufacturers and workforce development groups applies equally to financiers, who often want to see a pipeline of repeatable opportunities before spinning up their underwriting teams.
Risk and reward sharing
Having an orderbook of the first several projects may not, by itself, create the activation energy needed at the project level. Though it sends a good signal to supply chains and others, it does not necessarily address the first-mover disadvantage issue.
A differentiator in partnership approaches, including those described previously, is how to think about alignment and value creation. Traditionally, governmental entities approach financial partnerships through mechanisms like subsidies, loan guarantees, offtake guarantees, backstops, and fast-tracked processes. These help reduce stakeholders’ financial exposure, but alone, they miss a key part of the story: the long-term upside that can be created by successfully deploying these solutions and opening up a market for them.
Usually, for product owners and corporate investors, this upside is more naturally accounted for and balanced against the downside risk. For instance, companies from software providers to aircraft manufacturers might sell their first products as loss-leaders, offering lower pricing to early adopters to reduce the buyer’s risk, which they justify knowing that they should be able to rapidly recoup early expenses (and even failures) across the broader market pool, if they are successful. For large infrastructure projects, this is more challenging to achieve. When projects are highly capital-intensive, the financial exposures may be too great for the product company and/or equity investor to bear – that company or fund might be entirely wiped out by an individual project failure (for example, Westinghouse had to declare bankruptcy in 2017 when its two U.S. nuclear projects faced challenges). Project stakeholders and financiers might be asymmetrically exposed to the downside risk and, therefore, be inclined to avoid investing in early projects. For a promising technology, you may often find several customers (e.g., electric utilities) lining up for the ‘third’ or ‘nth-of-a-kind’ project, which would likely be derisked and less expensive, while taking a passive wait-and-see approach with the first project – which produces a stalemate if the first project is hard to get off the ground.
This is where deployment partnerships with structures that more fully align economic incentives, sharing in both the downside risk and the upside of value creation, can be a powerful catalyst for action. Amazon's structure, for instance, creates this alignment: by participating in multiple projects, Amazon stands to benefit from improvements over time, and through its equity investment in X-energy itself, it should (depending on terms) continue to benefit down the line if the company is successful.
In addition to encouraging the formation of joint ventures and consortia as described earlier, states and/or national governments can work together to strategically invest in key solutions, run competitive tenders for prospective providers, and strike profit-sharing agreements and/or warrants (as opposed to pure equity) in situations where government investment played an outsized role in value creation.
A potential example is the recent $80 billion framework agreement between Brookfield and the US Government to deploy new large nuclear reactors. Notably, beyond packaging existing products and authorities (e.g., low-interest government loans for projects), the deal stipulates a proposed profit-sharing mechanism under which the US Government would receive a share of future profits from reactor sales. Noting that this partnership is early-stage and important details have yet to be disclosed, such a mechanism could be appropriate here: this is a hard-to-commercialize sector with strategic national and geopolitical value and effectively no competing domestic products.
State entrepreneurship
Public sector funders can also play a significant role in creating and incentivizing these kinds of deployment partnerships. Though such involvement is commonplace in countries with state-run industries, countries with free market economies have often found it more delicate to navigate. There are legitimate concerns that governments' "picking winners" may, in some cases, create adverse incentives and undermine competitive markets. It may also blur the government's role: maximizing public benefit, versus showing favoritism or extracting economic rents from corporations.
All that said, there are models for state entrepreneurship that can be very powerful here, balancing the need to pull forward solutions and capital, while protecting the public and maintaining market competition – particularly in markets that are pre-commercial, have few players, have outsized national strategic benefits, and otherwise would not develop on their own without heavy external intervention. These are cases where, though the market benefits are considerable, the activation energy may be too high to stimulate deployment without deep governmental intervention.
Furthermore, consider cases with first-mover disadvantage challenges but a strong set of prospective fast-followers. To avoid a Spiderman-meme-like situation where stakeholders (e.g., local utilities or individual states) point at each other to make the first move, downstream project investors could, for instance, co-invest (debt, equity, or backstops) in the first project, even on a minority basis. This would both mitigate risk on the first project and give them access to a cost-effective pathway to the technologies they want to build down the line. State and local entities do not traditionally invest in infrastructure projects in other jurisdictions, but doing so could be net beneficial as a faster and lower-cost way to derisk and execute their own projects.
In addition, public financing entities investing domestically (e.g., states, green banks, federal agencies) could, where appropriate, consider extending their authorities to borrow concepts from the US's international playbook. Organizations like the Development Finance Corporation (DFC) can make equity investments in strategic, high-value projects, particularly where normal capital markets would otherwise struggle to enter until the investment thesis is more clearly actionable. Such a process would need very clear scopes, firm guardrails, clear commercial competition plans, and compatibility with legal and market structures, so as to create the intended benefits without confusing or distorting markets.
In any scenario, there should be a corresponding plan for how the public profits would be used. For example, they could be used to raise capital for other governmental activities or returned directly to the public in some way. Or it could be efficient to recycle the funding into related activities and rebalancing of the governmental 'venture capital' portfolio, as a strategic sovereign wealth fund would.
Chapter 3. “Highway to the deployment zone”: Faster, risk-weighted transaction execution
There’s a common cliché in finance that time kills (or at least wounds) all deals. Increasing the speed of policy formation and deal execution is essential to unlocking growth and investment, especially for newer sectors. We will focus particularly on the public capital side of the equation, where there is a great mismatch between public and private investment decision timescales. In the private sector, deals are expected to close in months or even weeks. In the public sector, depending on the program, the process can add many months to years, with a high degree of uncertainty. There can be many idiosyncrasies associated with public funding – e.g., infrastructure projects with federal dollars may carry additional compliance requirements (e.g., for environmental regulation or domestic manufacturing). Though there are many deep policy questions here, this piece will focus on ways to accelerate the process.
Staffing for success
While it’s easy to say that the government should move faster, the reality is that individual government program officers typically work at a rapid pace. This is especially true at the political level, where the motivation to make progress in a short amount of time tends to be very high. Particularly when developing new programs, they do a massive amount of work, mostly unseen by the public, with very few resources, and are often overstretched to meet deliverables.
The other side of this is that when new programs and initiatives are rolled out, there often isn’t a similar level of flexibility in staffing levels and allocations. Staffers at the federal, state, and city level can get overwhelmed by the volume of direct work and information requests while following the relevant laws and statutes. A new capital program may be introduced, but the number of people implementing it might not change rapidly. For instance, the Inflation Reduction Act of 2022 introduced and/or influenced dozens of tax credit programs and accordingly required almost a hundred pieces of new guidance before the market could act on them. A relatively small group of people led by the Treasury Department’s Office of Tax Policy was charged with generating that official guidance (as required, to ensure consistency and fairness). In addition, a number of the programs had complex elements requiring deep technical expertise (e.g., tax law, energy markets, carbon accounting, energy technology) – skills in relatively short supply and high demand, both inside and outside government. Progress was also slowed by ambiguity in the law itself, where key technical questions (e.g., accounting methodologies and criteria) needed additional time to be addressed during implementation instead of beforehand. The associated teams ran at breakneck pace to complete all those issuances in just two years. Yet many engaged market actors with shovel-ready projects experienced challenges as they waited for guidance, slowing initial progress.
To meet the rapid needs of an eager market, particularly when governments are trying to push comprehensive reforms, agency leaders and legislators need to ensure that implementing organizations are sufficiently staffed and resourced. This should cover not only program staff (both existing and new), but also functional teams (e.g., legal, communications, stakeholder engagement), where bottlenecks often form as they support multiple programs. It can include surge capacity (short- and/or long-term, internal or external), bringing on technical and subject matter experts to enable fast and fair processing. An implementation staffing needs assessment should also ideally be conducted as part of the policy-formation and legislative process so that appropriate resources can be allocated early and efficiently. To further improve resource allocation and implementation speed, legislators should consider expedient ways to drive greater clarity and specificity at the point of legislation, where applicable.
Iterative capital deployment programs
It is tempting, and important, to try to get things right on the first try, particularly with government funding, which is highly scrutinized. That said, an approach that has delivered success is to release capital in phases. Instead of issuing all funding at once, the program office (especially for a large competitive program) might split the deployment into phases over time: the first phase is executed quickly, and subsequent phases are introduced later. While this may introduce some short-term friction, it not only gets capital and projects moving faster; perhaps as importantly, it gives both funders and the market chances to build momentum, learn from one round, and improve in the next. A good example is the DOE’s Grid Resilience and Innovation Partnerships (GRIP) program, a $10 billion program from the Bipartisan Infrastructure Law to enhance grid flexibility and improve the resilience of the power system against extreme weather. That $10 billion allocation was split into three phases issued over a few years. Over those three phases, the quality and ambition of the applications and funded projects increased significantly, as all stakeholders were able to learn and adapt with each round.
Progress over perfection
For public programs, this is a major challenge driven by misaligned risk tolerances. Many ambitious government funding programs face a strange pickle. On one hand, they have the duty, mandate, and power to drive innovation, do deals ahead of the commercial markets, and derisk promising solutions to the point where they can scale on their own and deliver broad public benefits. On the other hand, the funds being used are raised from people’s hard-earned money or a state or country’s valuable natural resources, and neither should be handled frivolously. The fear of political fallout from the latter may drown out the benefits of delivering the projects; in the eyes of an underwriter or program officer, the downside risk may often outweigh the upside, and doing no deal may feel safer than doing a ‘bad’ deal.
Take the case of two companies that received funding from the DOE’s Loan Programs Office (LPO) in the early 2010s. One was the solar cell manufacturer Solyndra, whose idea was to decrease the cost of solar energy by using cylindrical, thin-film solar cells that could capture the sun’s energy from multiple angles, compared to conventional flat panels, and thereby bring down levelized costs. Solyndra received $535 million in federal loan guarantees, which it defaulted on when market conditions changed and it went bankrupt (it, along with other promising new solar companies, was undercut by plummeting prices of silicon solar cells from China). The default filled the news cycles for months, sparked several congressional hearings and investigations, and left a profound imprint in the minds of many program funders. No government underwriter wants to be dragged to the Hill or see their name in the papers for this reason. On the other end of the spectrum, you have a then-little-known car company called Tesla, which received a $465 million loan to expand electric vehicle production. Tesla, as we know, went on to become one of the most transformational and successful automobile companies in recent history. And yet comparatively little fanfare has been made about the government’s role in that success. Two loans, about the same size, issued around the same time, by the same organization. Not only did their actual outcomes differ, but the financial upside and the political fallout to the downside were diametrically opposed.
This contrast becomes even starker when looking at the broader picture. Take again the example of the LPO (now the Office of Energy Dominance Financing (EDF)). It has historically had a loss rate of less than 3%, on par with most commercial and investment banks – entities that often invest in markets with more proven solutions and less uncertainty. Moreover, other governmental programs, like DARPA, NIH, and ARPA-E, also have strong investment track records. All that said, the perception of risk has bred a deep conservatism and a fear of doing deals that might go sideways. This creates huge process drag for the entire organization and curbs the rate of progress, as underwriting processes become elongated and difficult to navigate. For many loan applicants, it can take several years to get through the loan process; anecdotally, some applicants have complained it was slower and harder than what they could wrestle from the commercial market.
Overall, this is a situation where the tolerance for and understanding of losses in public financing needs to be reconceptualized and appropriately balanced, given the mission. Not all losses are bad; individual losses are not necessarily detrimental if outweighed by net gains. There are significant opportunity costs to not taking risks appropriately. More can be accomplished without jeopardizing the public interest.
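To illustrate the portfolio point with a back-of-the-envelope calculation: even when a few loans are written off entirely, modest income on the performing loans can keep the portfolio net positive. The 3% default rate below echoes the historical loss rate cited above; the loan count, loan size, and income rate are assumptions for illustration only.

```python
def portfolio_outcome(n_loans: int, loan_size: float,
                      default_rate: float, income_rate: float) -> float:
    """Net portfolio result: cumulative income earned on performing loans
    minus principal written off on defaulted ones (full loss assumed)."""
    defaults = n_loans * default_rate
    performing = n_loans - defaults
    return performing * loan_size * income_rate - defaults * loan_size

# Hypothetical portfolio: 100 loans of $500M each, 3% defaulting outright,
# 10% cumulative net income on the rest over the loans' lives.
net = portfolio_outcome(100, 500e6, 0.03, 0.10)
print(f"net portfolio result: ${net / 1e9:+.2f}B")  # positive despite 3 total losses
```

Under these assumed numbers, three headline-grabbing write-offs coexist with a multi-billion-dollar net gain, which is precisely the asymmetry between how individual failures and portfolio performance get perceived.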
Given governments’ historically demonstrated ability to be good stewards of capital over long periods, the inherent risk that accompanies the pre-commercial asset classes they support, and the urgent need to make progress and unlock markets, more streamlined, faster underwriting processes are critical and warranted. Furthermore, governmental funding organizations need more ‘air cover’, so that individual misses do not get over-politicized but are understood as reasonable elements of a process for progress. Accomplishing this process and cultural shift requires major work internally with staff, policymakers, and the broader public. Integrating concepts like state entrepreneurship and the more balanced, portfolio-based risk-and-reward approaches described previously can also unlock new investment and risk management strategies, greater societal benefits, and greater comfort for staff and leaders to usher in that kind of transformation.
Creating longer and more durable windows of action
A more obvious reason to move quickly on policymaking is to deliver benefits faster to project and community stakeholders – which is, of course, the main objective of the policy in the first place. Beyond that, investors understand that political windows of favorable conditions can be short. This is particularly acute for assets with long development cycles and/or high upfront costs: building new manufacturing facilities or developing interregional transmission lines can take years. Indeed, it was estimated that 60% of committed IRA-funded clean energy manufacturing projects were not slated to come online until between 2025 and 2028.
Moreover, the IRA timeline created some interesting time crunches. Though the law was thoughtfully conceived with longer time horizons for tax credits, in practice the actionable investment window ended up being incredibly short. The law passed in August 2022. It then took time for programs to be formed and guidance to be released, as described earlier. In parallel, the investment community had to come up the learning curve on the new opportunities and build ecosystem collaborations (themselves also reacting and forming). Then, as the election window ramped up and policy uncertainty increased, many investors began to park their capital and take a wait-and-see approach in early 2024, as evidenced by strong increases in fund ‘dry powder’ (raised but uncommitted capital) alongside a sharp dropoff in actual capital deployment and assets under management over the same period.
Moving quickly is critical to give investors, communities, and other associated stakeholders as much time as possible to understand the landscape, develop deployment pathways, build new solutions, and ideally iterate, with the chance to take more shots.
That was a shorter-term perspective. Beyond using speed to open the front end of the window, some thought should be given, longer term, to how to extend the investability window itself. Investors typically do not decide to invest in a project purely on the project’s merits; particularly when entering a new sector, the decision is also driven by the commercial prospect of follow-on deals. Short political windows and the associated ‘stroke-of-pen’ risks often raise major flags for risk committees at financial institutions. As mentioned earlier, many IRA programs arguably had much less than two years of impact. Deeper policy stability is critical to ensure continued, long-term investment. That type of stability has, at least historically, been a hallmark of the US regulatory and commercial system and a positive differentiator in the race to attract capital and talent from across the globe. For sectors with high strategic value, high early capital requirements, and long investment cycles, policymakers should consider mechanisms to provide longer-term policy guarantees, giving investors assurance of windows long enough to justify their business cases.
Chapter 4. “Okay, now let’s get in formation”: programmatic policy synchronization for fast market formation
Catalyzing the deployment of new infrastructure is usually enabled by a bevy of policy actions. This matters because transforming a sector may require several changes in economics, behaviors, and processes. Especially when expansive new legislation and/or executive actions pass, the government may be required to deliver a host of new policy programs, including new rules (e.g., permitting reforms, categorical exclusions), funding allocations and programs, implementation guidance (e.g., for tax rules), and informational reports (e.g., National Lab technical studies, commercialization reports). These activities are highly valuable, as they tackle different aspects of the deployment challenge, and they take huge amounts of effort to get right. However, they often get rolled out through separate, independent processes. This can actually stall and frustrate deployment efforts: most investors want to see the major policy puzzle pieces locked in place before getting comfortable enough to deploy capital, and for most risk organizations, ‘stroke-of-pen’ risks are red flags. This hesitation can in turn cause consternation for policymakers and advocates, who may feel that they have done the heavy lifting by passing new legislation but don’t see a corresponding flood of serious commitments immediately after.
Policy deliverable schedule alignment
One way to address this is a visible, synchronized schedule showing all the related policy efforts and programs for an initiative. The interdependencies between activities would be easier to identify, and relevant stakeholders could see when all the major puzzle pieces would be in place and, in turn, align their investment and advocacy efforts accordingly.
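As a minimal sketch of what such a synchronized schedule could look like in practice, the dependencies between policy deliverables can be modeled as a graph and ordered topologically, making visible which pieces gate others and when the last puzzle piece lands. The deliverable names below are hypothetical placeholders, not any real program's schedule.

```python
# Python 3.9+ standard library
from graphlib import TopologicalSorter

# Each deliverable maps to the set of deliverables it depends on.
dependencies = {
    "tax guidance": {"enabling legislation"},
    "grant solicitation": {"enabling legislation", "staffing plan"},
    "permitting reform rule": {"enabling legislation"},
    "market liftoff report": set(),
    "first funding awards": {"grant solicitation", "tax guidance"},
}

# A valid release order: every deliverable appears after everything gating it.
order = list(TopologicalSorter(dependencies).static_order())
for step in order:
    print(step)
```

Beyond ordering, the same graph structure could drive a public dashboard: any slipped deliverable immediately shows which downstream pieces (and therefore which investment decisions) it delays.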
An example comes from carbon capture: the federal tax credit for carbon dioxide sequestration (45Q) was first issued in 2008, but the first set of tax guidance was not issued until 2020, as the IRS, Treasury, EPA, and other agencies had to build a suite of complex regulations around reporting and verification, address stakeholder comments, and more. As a consequence, though the tax credit was in place (and though strong complementary financing capabilities from renewables tax equity and thermal power plant development already existed), little to no investment went into the sector, effectively ‘wasting’ many years of eligibility and frustrating many interested stakeholders.
By contrast, take advanced transmission technologies: despite being rapidly-deployable and cost-effective solutions to increase transmission and distribution system capacity and performance, they have been historically underutilized. To increase awareness and deployment, the federal government developed a suite of products, including the formation of the Federal-State Modern Grid Deployment Initiative, grant funding via the DOE’s Grid Resilience and Innovation Partnerships (GRIP) program, loan funding from the LPO’s Energy Infrastructure Reinvestment program, categorical exclusions in federal environmental permitting for upgrading existing transmission lines, a Pathways to Commercial Liftoff report on grid modernization, new national deployment goals, technical reports and new assistance programs from the National Labs, and more. These were all released within a couple of months of each other in 2024, giving the market a fuller picture to react to and build on. Since then, dozens of states have passed new laws, and the number of projects being pursued and funded has been on the rise.
Capital source navigators
Relatedly, new legislation may create several new governmental funding programs or change the missions of existing ones. Many of these efforts might go unnoticed or be disproportionately utilized. Take energy- and climate-tech startups seeking capital to grow or transform their businesses. Government capital tends to be attractive: it is often willing to embrace early technology risk (unlike most commercial capital), is often non-dilutive to the company’s capital stack, and can give the company extra visibility. Most people in the energy sector will know of programs like DOE’s Advanced Research Projects Agency–Energy (ARPA-E) or the Loan Programs Office. Far fewer may know that funding may be available through ‘non-energy’ agencies like the US Department of Agriculture, the Small Business Administration (SBA), the General Services Administration (GSA), or the Department of Defense (DOD). These increase the pool of capital available and provide a wider array of financing products, raising the chances that the right kinds of capital are available to serve the spectrum of company needs.
Initiatives like the Climate Capital Guidebook, published in 2024, can be helpful to make these types of programs less opaque and easier to access, especially for startups and small businesses. At the state level, databases like DSIRE USA have been providing a beneficial service aggregating information on state incentive programs for years.
Making information on federal, state, and/or municipal funding programs highly accessible and searchable from a centralized, common location is key; otherwise, programs may get lost, buried in webpages few know how to find. This process can be further enhanced with cross-cutting discovery tools. For example, AI-based agents could continuously and automatically map these programs and keep the information organized, and large language models could help stakeholders more readily identify and compare the programs of best fit (matching user capital needs against program ‘ticket sizes’, usage restrictions, and eligibility requirements).
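As a hypothetical sketch of the deterministic core such a matching tool might sit on, the code below filters a program catalog by ticket size and stage eligibility (an LLM layer could translate natural-language queries into these filters). All program names and figures are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Program:
    name: str
    min_ticket: float       # smallest award/loan size, in dollars
    max_ticket: float       # largest award/loan size, in dollars
    eligible_stages: set    # e.g. {"lab", "pilot", "demonstration", "commercial"}

# Invented catalog entries, standing in for a real centralized database.
CATALOG = [
    Program("Applied R&D grant", 0.5e6, 5e6, {"lab", "pilot"}),
    Program("Demonstration cost-share", 5e6, 50e6, {"pilot", "demonstration"}),
    Program("Infrastructure loan guarantee", 100e6, 2e9, {"commercial"}),
]

def match_programs(need: float, stage: str) -> list:
    """Return names of programs whose ticket range covers `need`
    and whose eligibility includes the given technology stage."""
    return [p.name for p in CATALOG
            if p.min_ticket <= need <= p.max_ticket and stage in p.eligible_stages]

print(match_programs(10e6, "demonstration"))
```

Even this trivial structure makes the gaps visible: a query for, say, a $60M demonstration-stage need would return nothing, flagging exactly the kind of missing rung in the capital ladder that policymakers would want to see.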
Zooming out from individual needs, this would also help solution developers and investors understand more comprehensively how governmental capital programs relate to one another and the roles they play in energy solution commercialization and deployment. For instance, it would make it easier to chart which programs are available to technologies at different stages of maturity: from the National Science Foundation (NSF) for fundamental research, to the Advanced Research Projects Agency–Energy (ARPA-E) for more applied technology development and early manufacturability demonstration, to planning grants from various agencies, to federal tax credits for infrastructure projects.
Similarly, funding programs could be mapped against project development phases. In areas like international project finance, this exercise would be valuable for demystifying which programs and institutions suit each phase of project development. In some cases, an international energy project developer using American technology might need to navigate a gauntlet of different funding institutions: the US Trade and Development Agency (USTDA) for grants for front-end engineering design (FEED) studies, the Export-Import Bank (EXIM) for domestic manufacturing loans, and the Development Finance Corporation (DFC) for equity co-investment and political risk insurance. Not to mention multilateral development banks like the World Bank and the International Finance Corporation (IFC), which themselves have an array of funding programs and instruments. Providing clearer, more cohesive representations of how this patchwork of funding sources can work in tandem and be packaged together can deliver outsized strategic competitiveness for American companies, helping level the playing field against competitors backed by governments that can provide fully wrapped financing solutions.
Chapter 5. “C.R.E.A.M.”: More holistic valuation tools and methodologies
This facet addresses a challenge still too common in solution valuation: not valuing the companies themselves, but proving to customers and investors that the proposed energy solution is worth adopting. Regulation alone is usually not a salve for driving energy transition activities in free market economies: while regulation may steer what should happen, costs and economics are often bigger drivers of how quickly it happens. Borrowing a chemistry analogy, economics determines the activation energy and kinetics of transition policy. Solutions need to demonstrate their fit and attractiveness in often economically competitive and constrained environments. In addition, stakeholders with shared interests (not just federal, but also at the industry and state & city levels) should invest in building common valuation infrastructure (e.g., resource characterization data, system models, and more) that lowers the barriers to deployment and investment. Doing so will also make it easier to appropriately size any associated financial programs (like subsidies and grants), ensuring sufficient catalysis to get multiple stakeholder groups moving and investing.
Understanding end-use unit economics
This means that solution providers, policymakers, and advocates need to develop a very deep understanding of the commercial drivers and realities of the markets they aim to serve. They need to put themselves in the shoes of their customers and related stakeholders. This is particularly important when selling solutions into new or competing markets or applications that are not otherwise required to change (e.g., by regulations on fuel use or emissions). This should seem obvious and should always have been a primary focus, yet it’s a step that some innovators, policymakers, and advocates have not adequately prioritized.
Skipping this step is a recipe for failure, particularly in the infrastructure space. Saying a solution is good for the world is not sufficient to get traction; a need does not mean there’s a market. A strong, detailed, and accurate understanding of customer unit economics is foundational to the success of any infrastructure solution. This should encompass rigorous estimation of how much a solution costs to produce and deliver (often underestimated in early stages, leaving stakeholders surprised later by cost overruns during implementation). It should likewise reflect an understanding of the customer’s cost and value drivers, as these affect project revenue and adoption readiness. This question sometimes gets missed in the early stages, but in later stages, particularly when seeking significant capital to fund projects, it becomes highly pertinent, as investors take a much more critical view of the project’s economic potential, both to the upside and the downside. As part of developing detailed assumptions, teams should build reasonable sensitivities and scenarios that illustrate how the financial performance of the project may vary with changing internal and macroeconomic conditions.
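The kind of sensitivity run described above can be sketched in a few lines: compute a project's net present value under a base case and a couple of stress scenarios. All inputs below (capex, margins, discount rate) are hypothetical illustrations, not figures from any real project.

```python
def npv(capex: float, annual_margin: float, years: int, rate: float) -> float:
    """Net present value: upfront capex (as a cost) plus the discounted
    stream of annual operating margins over the project's life."""
    return -capex + sum(annual_margin / (1 + rate) ** t for t in range(1, years + 1))

base = dict(capex=100e6, annual_margin=12e6, years=20, rate=0.08)
scenarios = {
    "base case": base,
    "capex overrun +30%": {**base, "capex": base["capex"] * 1.3},
    "weaker prices -20% margin": {**base, "annual_margin": base["annual_margin"] * 0.8},
}
for name, s in scenarios.items():
    print(f"{name}: NPV ${npv(**s) / 1e6:+.1f}M")
```

Under these assumed inputs, the base case is positive while either stress scenario flips it negative, which is exactly the fragility a serious underwriter will probe for and that early diagnosis lets developers design against.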
Diagnosing this early not only helps the solution providers to be better positioned for commercial success with their customers but also enables them to catch potential flaws early enough, make different design choices, and ensure the product’s value proposition is more robust and resilient. In turn, this helps reduce project risk and gives comfort to financial investors along the way.
Governments can help by driving easier price discovery and transparency, collaborating with project stakeholders (especially developers and customers) to compile and share relevant cost and value data in more public forums. Reports by government agencies (e.g., the Pathways to Commercial Liftoff series by the US Department of Energy’s Office of Technology Transitions), national laboratories, and private third-party analysts (e.g., BNEF, S&P, Lazard) have made strong contributions toward filling those information gaps, and continuing to support such efforts would be valuable. Involving state and regional actors (e.g., groups of state economic development organizations) can make the data more granular and local, driving even more actionability. Governments could also compel information disclosure through legislation (akin to recent healthcare and drug pricing transparency efforts) or require greater disclosure as part of government-funded programs, especially in new industries.
Development of accessible, trustworthy techno-economic analysis tools
Intending to perform the types of deep techno-economic analyses described above is one thing; having the ability to do so is another. In many situations, analysts suffer from data unavailability or asymmetric access. In the electricity system, for example, very few stakeholders (usually utilities and grid operators) have deep access to information about how the system and its underlying assets are performing. Sometimes this is intentional, due to concerns around security and market manipulation. Sometimes, as is often the case with customers requesting their own historical hourly usage data, the process is simply archaic and difficult.
A major downside of this situation is that third parties are often not in a position to interrogate resource plans, challenge priorities, or test and validate new ideas. Third-party analytics tools exist, but they often lack the requisite data, fidelity, or credibility to prevent their results from being readily dismissed. That also makes it too easy for grid operators to wave off new ideas without adequately considering them. Without accessible data and models, beneficial solutions can be excluded from the menu of options or never make it to market.
This has been a common battle for an array of more ‘disruptive’ energy technologies, like distributed energy systems and advanced transmission technologies. But it also occurs with generation. One example is the prospective transition of the Brandon Shores coal plant in Maryland. The plant’s owner, Talen Energy, filed to retire the facility because it was no longer economically viable to operate (following a trend among coal plants around the country). However, the grid operator, PJM, sought to force an extension of its operation for four years (via a reliability-must-run (RMR) contract) until new transmission capacity could be brought online, citing potential reliability concerns. State and congressional officials strongly opposed this plan, as extending operation via an RMR contract would raise costs to ratepayers and increase local pollution. A group of advocates and energy experts, led by the Sierra Club and GridLab, proposed replacing the plant with a mix of energy storage, reconductoring, and voltage supports, which they claimed would be not just cleaner but more cost-effective than what was proposed – even more so in the likely event that the transmission project is delayed. A similar concept was deployed in New York City at the Ravenswood power plant. Yet PJM dismissed the suggestion without real allowance for iteration, arguably because the advocates did not use the ‘right’ modeling methodology and were not a project sponsor. As an aside, the decision also reflects the inability of PJM’s energy storage market structure to effectively value storage’s benefits as both an energy and a transmission asset.
Though an agreement was ultimately reached, many stakeholders view the settlement as suboptimal – not just because of its outcome (even more so as it does not address wider issues like high capacity market prices), but because the advocates lacked the modeling tools to meaningfully evaluate the options and force a more substantive dialogue with the grid operator.
Rebalancing this situation is essential to allow additional key stakeholders – policymakers, regulators, project developers, solution providers, and other experts – to interrogate the opportunity space and propose actionable new ideas. This should include creating common, accessible infrastructure for the data, models, and evaluation methodologies that an interested stakeholder needs in order to assess potential project options, which are otherwise difficult or prohibitively expensive to access. Government endorsement of these tools also greatly bolsters the credibility of the resulting analyses.
The Australian Energy Market Operator (AEMO), for example, did exactly this by implementing the world’s first connection simulation tool: a digital twin of the country’s electric grid that project developers use to rapidly evaluate their prospective solutions in an accurate, safe, and trustworthy environment.
Also consider two approaches to accelerating geothermal project development, where insufficient quantification of the resource can add significant development costs and project risks. One is Project InnerSpace, a collaborative effort funded by philanthropy, the US federal government, and Google to provide a common, open set of surface and subsurface characterization data. Another is the Geothermal Development Company, a special purpose vehicle fully owned by the Kenyan government, which performs resource characterization and steam development itself and shares the information with prospective geothermal power producers – significantly lowering barriers to entry and making Kenya a global leader in geothermal power production. Both approaches help project investors be more targeted, capital-efficient, and prolific.
Similarly, Virginia recently passed a grid utilization law requiring its utilities to measure the utilization of their transmission and distribution systems, including establishing metrics as well as plans to improve them. If implemented well, and if the associated data is made available, it can drive more targeted investments, manage customer energy bills more cost-effectively, and allow new solutions like virtual power plants, distributed energy, and grid-enhancing technologies to be appropriately valued and play bigger roles in the energy solution mix.
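To make this concrete, here is a minimal sketch (in Python, with entirely hypothetical loads, ratings, and metric names – not the metrics any actual law prescribes) of the kind of utilization measures a utility might compute from hourly load data:

```python
# Illustrative sketch (all numbers hypothetical): simple utilization
# metrics for a distribution feeder, computed from hourly load data,
# of the kind a grid utilization law might require utilities to report.

def utilization_metrics(hourly_load_mw, rated_capacity_mw):
    """Return peak and average utilization as fractions of rated capacity."""
    peak = max(hourly_load_mw)
    avg = sum(hourly_load_mw) / len(hourly_load_mw)
    return {
        "peak_utilization": peak / rated_capacity_mw,
        "average_utilization": avg / rated_capacity_mw,
        # A low average relative to peak signals headroom that solutions like
        # virtual power plants or grid-enhancing technologies could exploit.
        "load_factor": avg / peak,
    }

# Hypothetical day of hourly loads (MW) on a feeder rated at 100 MW
loads = [40, 38, 35, 34, 36, 45, 60, 72, 80, 85, 88, 90,
         92, 95, 93, 90, 88, 86, 82, 75, 65, 55, 48, 42]
metrics = utilization_metrics(loads, rated_capacity_mw=100)
```

Even a crude gap between peak and average utilization, published consistently, gives third parties a quantitative hook for proposing non-wires alternatives.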
Quantify, aggregate, and internalize external benefits and costs
Many solutions labeled as “climate” solutions have a wide array of other benefits – lowering costs, boosting reliability, creating jobs, and improving health, to name a few. In some cases, reducing emissions might be a secondary or even tertiary benefit. This often results in the cost-benefit of a potential solution being understated and capital being underallocated. Alternatively, it can lead to greenwashing, where benefits are overstated relative to impact and capital is misallocated.
In some cases, as in electricity markets and industrials, decisions are made on a narrow set of financial criteria that ignore the broader value proposition – e.g., which solution has the lowest upfront cost? What is the least costly way to meet power demand on an hourly basis? Does the solution pay for itself in three years? There have been some directional approaches that at least help with the first issue of benefit underquantification.
An example is FERC Order 1920, a rulemaking that covers new approaches to transmission planning and cost allocation. It called for decision criteria to be expanded beyond cost and reliability to a consideration of seven benefits: avoided or deferred infrastructure costs, reduced power outages, reduced production costs, reduced energy losses, reduced congestion, mitigated extreme weather impacts, and reduced peak capacity costs. As it is implemented across the country, it should provide a considerably fairer and more holistic basis for assessing the potential benefits of transmission projects, and will likely increase the viability of game-changing concepts (e.g., reconductoring, interstate/interregional transmission).
In other cases, as in some larger governmental grantmaking or policy efforts, a suite of benefits may be quantified but estimated and presented in silos, as if the benefits were orthogonal and nonadditive. The key to addressing that is developing valuation frameworks that do the difficult work of weighing the benefits together in a clear manner that is directly relatable to the investment thesis. Translating those benefits commensurately into a project’s financial terms is critical to ensuring they get prioritized and realized. The IRA had elements of this, at least conceptually – for instance, applying bonus credits to low-carbon energy projects that paid fair wages, were sited in economically disadvantaged areas, or used domestically manufactured equipment. At the state level, laws like Montana’s transmission law established a new, elevated cost-recovery mechanism for transmission projects that use more efficient, high-performance conductors.
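As a toy illustration of pulling siloed benefits into one financial view, the sketch below (Python; the benefit categories and dollar figures are hypothetical, and the credit stack is only loosely patterned on IRA-style adders – actual statutory rules are more involved) sums annualized benefit estimates and layers bonus adders onto a base credit rate:

```python
# Illustrative sketch (hypothetical figures; credit stack loosely patterned
# on IRA-style adders, not the actual statutory rules).

def effective_credit_rate(prevailing_wage, energy_community, domestic_content):
    """Stylized investment-credit stack: a base rate plus bonus adders."""
    rate = 0.30 if prevailing_wage else 0.06  # stylized base rate
    if energy_community:
        rate += 0.10  # stylized siting bonus
    if domestic_content:
        rate += 0.10  # stylized domestic-content bonus
    return rate

# Hypothetical annualized benefit estimates ($M/yr), usually reported in silos
benefits = {
    "avoided_infrastructure": 12.0,
    "reduced_outages": 4.5,
    "reduced_congestion": 7.0,
    "health_and_jobs": 3.0,
}
annual_value = sum(benefits.values())              # one number, not four silos
credit = effective_credit_rate(True, True, False)  # benefits in financial terms
```

The point is not the particular numbers but the shape of the exercise: benefits that are aggregated and expressed as financing terms get weighed in the investment decision; benefits left in separate reports do not.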
Going further on the point of internalization, there are more structural issues where markets simply are not designed to solve for the outcomes stakeholders seek. In electricity, for example, power markets generally solve for meeting demand at least cost over a short time period (following a narrowly defined reliability scheme). This not only ignores solutions that may save more money over longer periods, but also does not explicitly solve for attributes like resilience, sustainability, or flexibility. That often means external, out-of-market mechanisms are needed to create the desired outcomes (e.g., reliability-must-run contracts, tax credits, renewable energy credits). While those have advanced their specific goals, they are imperfect and may have unintended consequences, like distorting market behavior or disincentivizing cost-cutting innovations. Solving this at greater scale and on a more fundamental basis in some segments may require deeper reforms, like redesigning electricity market structures, revisiting the Energy Policy Act and Federal Power Act, and more.
Chapter 6. “Take it to the bridge”: Rethinking the ‘missing middle’ problem
For the last face of the cube, it is incumbent on us to describe the role that a whole cadre of investors needs to play – not just venture and early-stage investors, but particularly later-stage capital providers such as project financiers (equity and debt), institutional investors, pension funds, insurance funds, and even utility balance sheets. They have a massively important role and arguably need to be more proactively involved in ensuring the maturation of promising earlier-stage solutions. The ‘missing middle’ problem in energy, where a lack of transition and demonstration capital prevents promising venture-backed solutions from progressing to mainstream infrastructure, is well known. While continued innovation is needed to form new capital solutions to fill that gap, there is a lot that investors can do to shrink the gap and make the chasm easier to traverse.
Engaging earlier to pull companies to maturity
Though they control the majority of assets under management and can support bigger ticket sizes, late-stage investors’ risk tolerances, by the nature of their investment mandates, tend to skew conservative. They concentrate their efforts on solutions with established track records and large addressable markets that provide greater certainty of execution and relatively consistent returns – and they typically have more than enough deal volume to justify that focus. Consequently, though these investors usually at least follow major new trends, they tend to be hesitant to enter newer markets; in fact, many are content to simply wait for the market to come to them before they engage. This introduces several challenges.
First, at the most basic level, it means that many lower-cost sources of capital may be hard to access for newer solutions, climate-related or otherwise, which makes it harder for them to compete on a level playing field early on. Second, and as importantly, it means that companies may miss critical opportunities to get sharper earlier. The attributes that investors value change dramatically over the life cycle.
For instance, many early-stage products are judged on attributes like uniqueness, differentiation, disruptiveness, and total addressable market – the attributes that tend to attract venture capital and garner the most visibility in the media. By contrast, at later stages, and especially in project finance, uniqueness and differentiation might actually be seen as sources of risk – risks that get compounded if the solution is supplied by a new market entrant. Fungibility and supplier optionality may hold even greater weight: to mitigate the risk that things go wrong with a project’s vendor, project investors take comfort in knowing there are substitutes that can be brought in as part of a contingency plan. In addition, though addressable market matters to both groups of investors, early-stage investment tends to focus on alignment with macroeconomic trends, while at later stages micro arguably supersedes macro, as diligence becomes more deeply and narrowly focused on project-specific questions like contracts, pricing, and execution. Furthermore, late-stage underwriters may need to conduct deeper diligence and acquire more data to get comfortable with new technical attributes, features, and vendors, which adds time and cost to their process. For earlier-stage investors, by contrast, that deep focus on new features is already an integral part of the diligence and value-creation process, and they are rewarded for it accordingly.
Overall, not understanding these differences can create shocks for new companies that have had great success attracting capital early but get stopped in their tracks when graduating to the next level of maturity. It also often means that prospective solution providers miss the opportunity to sharpen their pencils, address more detailed questions, and have their key assumptions stress-tested. Even if they are not prepared to transact, later-stage investors, especially in infrastructure project finance, should devote additional time to engaging with promising technologies early on and bringing them along. This is also in the investors’ interest: it allows them to get up the learning curve faster, be better positioned to take advantage of opportunities when the markets come around, and ensure that the solutions that do reach later stages are of higher quality and more likely to yield successful transactions.
Formulate deal templates and archetypes
The previous steer toward early engagement comes with a conundrum. For many investors, it is hard to meaningfully engage until there is a complete deal on the table – by complete, I mean a fully fleshed-out representation of all core theses, puzzle pieces, assumptions, and more, mapped to a specific, actionable situation. This is the scaffolding onto which financing packages are built and the basis on which most risk managers are trained to evaluate the financeability of a solution. This holds for governmental and commercial financing programs alike; “bring us deals to look at” is a common refrain.
The conundrum arises because many of those details may not be fully known early on, so there may not be a fully formed deal to bring, and there might not be a clear set of underwriting criteria for the company to aim for. Even offices like the DOE Loan Programs Office (LPO), which was proactive and engaged, struggled to get significant market traction for a few years, to both their and the market’s frustration. Instead of both sides staring at each other like the Spider-Man meme, the impasse can be broken by creating deal templates and archetypes, which take a more hypothetical representation of assumptions, including reasonable scenarios, and frame what the financial structure and execution pathway would look like in each case.
The LPO, for instance, did just that, creating several deal archetypes based on customer type and technology, with terms and execution timing aligned to the associated risk. Deals involving more established energy solutions (e.g., solar, storage, transmission) with investment-grade utilities providing corporate guarantees were allowed faster execution processes, commensurate with the comparatively low level of credit risk involved. Deals with a narrow focus, like the Advanced Technology Vehicle Manufacturing program, tended to have more well-defined execution processes. Deals carrying more default risk (e.g., newer technologies, non-investment-grade counterparties) might take more time to diligence and underwrite. This benefitted both the Office and the applicants, as it created clear, agreeable expectations for each. Providing this clarity greatly increased deal volume and traction, as more clients brought more loan applications and had greater confidence in the transaction process they were entering.
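The archetype idea can be sketched as a simple routing table. The code below (Python; the archetype names, risk axes, and timelines are all hypothetical, not the LPO’s actual criteria) maps two risk dimensions to an indicative underwriting track:

```python
# Illustrative sketch (names and timelines hypothetical): a template that
# routes a prospective deal to an underwriting archetype based on risk,
# in the spirit of the LPO-style archetypes described above.

from dataclasses import dataclass

@dataclass
class Archetype:
    name: str
    diligence_months: int  # indicative underwriting timeline
    notes: str

def classify(technology_proven: bool, counterparty_investment_grade: bool) -> Archetype:
    """Route a deal to an archetype along two hypothetical risk axes."""
    if technology_proven and counterparty_investment_grade:
        return Archetype("fast-track", 6,
                         "Established tech + IG guarantee: expedited process")
    if technology_proven:
        return Archetype("standard", 12,
                         "Established tech, deeper credit diligence")
    return Archetype("first-of-a-kind", 18,
                     "Novel tech: extended technical and market diligence")

deal = classify(technology_proven=True, counterparty_investment_grade=True)
```

Publishing even a coarse table like this tells applicants which track they are on and what evidence moves them to a faster one – which is precisely the expectation-setting the archetypes provided.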
More broadly, this is an area where the companies, industry associations, and other advocates can play an active role, independently and in collaboration with governments. Formulating early pictures and archetypes for financial stakeholders and investors can significantly enhance feedback and capital formation.
Collaborate, Celebrate, and Replicate
Finally, energy investors, especially in less mature sectors, need to find ways to be more open, as appropriate, about their investments and investment strategy. For product companies this usually comes more naturally, since it is necessary to market their products, but later-stage investor communications about individual deals tend to be more guarded and high-level. This is usually not because the information is unavailable. Sometimes investors are protecting sensitive information; sometimes the move is to protect market share and avoid tipping other players off to the same strategy (particularly if they worked hard to open a new market). The richer information transfer usually happens privately during deal execution (e.g., as part of due diligence) or during project- or fund-level capital raises.
In newer spaces, however, progress itself is often catalytic (a rising tide floats all boats). The easiest way to persuade a risk committee to invest is to show precedents and comparables. An underwriter can stand up with greater confidence when they can show that someone else has done it before – and it is even more validating when that ‘someone else’ is a competitor. Project investors should endeavor to share more about how they got comfortable with the deals, markets, and technologies involved, as appropriate. This is not altruism. Especially in emerging sectors, rapidly expanding the market and creating a foundational flywheel can be commercially more beneficial for a firm than purely protecting market share. Getting more investors comfortable makes the pie bigger and encourages others to pursue their own projects. That in turn sends actionable signals to ecosystem stakeholders (e.g., supply chains) to invest and create production and delivery efficiencies. Those efficiencies improve unit economics, reduce risk, and increase returns both down the line and potentially even on the early projects (e.g., lower operating expenses and reduced replacement-part scarcity). These scale efficiencies and flywheels should help investors generate more deal flow and revenue, building on the expertise and leadership position they have established. The advantages can be extended where significant public funds were allocated to the project, perhaps in exchange for preferable financing terms from the public funding institution.
Similar ideas apply to public-sector funding programs. Announcements about projects and ribbon cuttings, though important, are shorter-term, quick-hitting communications strategies that tend to be formulaic. Instead, policymakers should take a page from commercial product marketing and treat their deployment policy efforts as products. As such, they should create consistent, thematic narratives within which individual initiatives, projects, legislation, and rules can all be framed. Even though they may be exhausted after delivering the policy itself, government officials and program officers should not undervalue the uplift phase: it is crucial to spend ample time and resources explaining and repeating the micro- and macro-significance of each product to investors and community stakeholders, especially in today’s competitive information environment. Building greater public buy-in, both nationally and with communities, is crucial, especially for longer-term, transformational projects. Government funders should work hard to bring along additional local governments, nonprofits, and investors to collaborate, celebrate, and replicate the successes.
Akin to the common information infrastructure discussed in the valuation section, working closely with investors, industry, and other stakeholders to collate and amplify key investment theses, lessons learned, and other insights will be key to building investor confidence and creating more of a flywheel effect for follow-on investments.
Conclusion. “The Next Episode”
Taking a step back, we have laid out many ideas and concepts in this paper: harmonization, collaboration, acceleration, synchronization, valuation, and amplification, to name a few. It may seem daunting to approach policymaking across so many vectors, particularly where many of the puzzle pieces have to align and move in sync to unlock significant, consistent investment. That said, the power of a strong and well-intentioned administrative state, at both the national and local levels, lies in its very ability to wrap its arms around big challenges, partner with private industry, and leverage its resources to create high-value solutions with outsized benefits. This has been proven repeatedly, in the US and globally, across time and sectors: going to the moon, inventing life-saving medical treatments, building massive infrastructure, delivering nanoscale electronics. States and towns should roll up their sleeves, find creative ways to collaborate, develop foundational information tools, and remove unnecessary market barriers. Investors should take an even more active role: making their needs known to early-stage companies and policymakers, building consortia to pull new opportunities forward, and creating an actionable set of commercial opportunities they would find attractive. What’s more, acting now to design and implement new, actionable administrative structures, especially at the state and local level, will not only create more high-value pathways for progress today; if well coordinated, it can also lay a foundation for federal actions in both the nearer and longer term. Though the challenges and the journey are complex, the opportunity before us is massive, the imperatives are clear, the transformations are tractable, and success is achievable. This can be done, so let’s get busy!