Explosion of power-hungry data centers could derail California clean energy goals

News Feed
Monday, August 12, 2024

Experts warn that a frenzy of data center construction could delay California's transition away from fossil fuels and raise everyone's electric bills.

Near the Salton Sea, a company plans to build a data center to support artificial intelligence that would cover land the size of 15 football fields and require power that could support 425,000 homes.

In Santa Clara — the heart of Silicon Valley — electric rates are rising as the municipal utility spends heavily on transmission lines and other infrastructure to accommodate the voracious power demand from more than 50 data centers, which now consume 60% of the city’s electricity.

And earlier this year, Pacific Gas & Electric told investors that its customers have proposed more than two dozen data centers, requiring 3.5 gigawatts of power — the output of three new nuclear reactors.

Vantage Data Center in Santa Clara is equipped with its own electrical substations.

(Paul Kuroda / For The Times)

While the benefits and risks of AI continue to be debated, one thing is clear: The technology is rapacious for power. Experts warn that the frenzy of data center construction could delay California’s transition away from fossil fuels and raise electric bills for everyone else. The data centers’ insatiable appetite for electricity, they say, also increases the risk of blackouts.

Even now, California is on the verge of not having enough power. An analysis of public data by the nonprofit GridClue ranks California 49th of the 50 states in resilience — or the ability to avoid blackouts by having more electricity available than homes and businesses need at peak hours.

“California is working itself into a precarious position,” said Thomas Popik, president of the Foundation for Resilient Societies, which created GridClue to educate the public on threats posed by increasing power use.

The state has already extended the lives of Pacific Gas & Electric Co.’s Diablo Canyon nuclear plant as well as some natural gas-fueled plants in an attempt to avoid blackouts on sweltering days when power use surges.

Worried that California could no longer predict its need for power because of fast-rising use, an association of locally run electricity providers called on state officials in May to immediately analyze how quickly demand was increasing.

The California Community Choice Assn. sent its letter to the state energy commission after officials had to revise their annual forecast of power demand upward because of skyrocketing use by Santa Clara’s dozens of data centers.

A large NTT data center rises in a Santa Clara neighborhood.

(Paul Kuroda / For The Times)

The facilities, giant warehouses of computer servers, have long been big power users. They support all that Americans do on the internet — from online shopping to streaming Netflix to watching influencers on TikTok.

But the specialized chips required for generative AI use far more electricity — and water — than those that support the typical internet search because they are designed to read through vast amounts of data.

A ChatGPT-powered search, according to the International Energy Agency, consumes 10 times as much power as a search on Google without AI.

And because those new chips generate so much heat, more power and water are required to keep them cool.

“I’m just surprised that the state isn’t tracking this, with so much attention on power and water use here in California,” said Shaolei Ren, associate professor of electrical and computer engineering at UC Riverside.

Ren and his colleagues calculated that the global use of AI could require as much fresh water in 2027 as that now used by four to six countries the size of Denmark.

Driving the data center construction is money. Today’s stock market rewards companies that say they are investing in AI. Electric utilities profit as power use rises. And local governments benefit from the property taxes paid by data centers.

Transmission lines are reflected on the side of the NTT data center in Santa Clara.

(Paul Kuroda / For The Times)

Silicon Valley is the world’s epicenter of AI, with some of the biggest developers headquartered there, including Alphabet, Apple and Meta. OpenAI, the creator of ChatGPT, is based in San Francisco. Nvidia, the maker of chips needed for AI, operates from Santa Clara.

The big tech companies leading in AI, which also include Microsoft and Amazon, are spending billions to build new data centers around the world. They are also paying to rent space for their servers in so-called co-location data centers built by other companies.

In a Chicago suburb, a developer recently bought 55 homes so they could be razed to build a sprawling data center campus.

Energy officials in northern Virginia, which has more data centers than any other region in the world, have proposed a transmission line to shore up the grid that would depend on coal plants that had been expected to be shuttered.

In Oregon, Google and the city of The Dalles fought for 13 months to prevent the Oregonian from getting records of how much water the company’s data centers were consuming. The newspaper won the court case, learning the facilities drank up 29% of the city’s water.

By 2030, data centers could account for as much as 11% of U.S. power demand — up from 3% now, according to analysts at Goldman Sachs.

“We must demand more efficient data centers or else their continued growth will place an unsustainable strain on energy resources, impact new home building, and increase both carbon emissions and California residents’ cost of electricity,” wrote Charles Giancarlo, chief executive of the Santa Clara IT firm Pure Storage.

Santa Clara a top market for data centers

Boys ride their bikes on Main Street near a large data center in Santa Clara.

(Paul Kuroda / For The Times)

California has more than 270 data centers, with the biggest concentration in Santa Clara. The city is an attractive location because its electric rates are 40% lower than those charged by PG&E.

But the lower rates come with a higher cost to the climate. The city’s utility, Silicon Valley Power, emits more greenhouse gas than the average California electric utility because 23% of its power for commercial customers comes from gas-fired plants. Another 35% is purchased on the open market where the electricity’s origin can’t be traced.

The utility also gives data centers and other big industrial customers a discount on electric rates.

While Santa Clara households pay more for each kilowatt hour beyond a certain threshold, the rate for data centers declines as they use more power.

The city receives millions of dollars of property taxes from the data centers. And 5% of the utility’s revenue goes to the city’s general fund, where it pays for services such as road maintenance and police.

An analysis last year by the Silicon Valley Voice newspaper questioned the lower rates data centers pay compared with residents.

“What impetus do Santa Clarans have to foot the bill for these environmentally unfriendly behemoth buildings?” wrote managing editor Erika Towne.

In October, Manuel Pineda, the utility’s top official, told the City Council that his team was working to double power delivery over the next 10 years. “We prioritize growth as a strategic opportunity,” he said.

He said usage by data centers was continuing to escalate, but the utility was nearing its power limit. He said 13 new data centers were under construction and 12 more were moving forward with plans.

“We cannot currently serve all data centers that would like to be in Santa Clara,” he said.

Dozens of data centers have been built for artificial intelligence and the internet in Santa Clara.

(Paul Kuroda / For The Times)

To accommodate increasing power use, the city is now spending heavily on transmission lines, substations and other infrastructure. At the same time, electric rates are rising. Rates had been increasing by 2% to 3% a year, but they jumped by 8% in January 2023, another 5% in July 2023 and 10% last January.

Pineda told The Times that it wasn’t just the new infrastructure that pushed rates up. The biggest factor, he said, was a spike in natural gas prices in 2022, which increased power costs.

He said residential customers pay higher rates because the distribution system to homes requires more poles, wires and transformers than the system serving data centers, which increases maintenance costs.

Pineda said the city’s decisions to approve new data centers “are generally based on land use factors, not on revenue generation.”

Loretta Lynch, former chair of the state’s public utilities commission, noted that big commercial customers such as data centers pay lower rates for electricity across the state. That means when transmission lines and other infrastructure must be built to handle the increasing power needs, residential customers pick up more of the bill.

“Why aren’t data centers paying their fair share for infrastructure? That’s my question,” she said.

PG&E eyes profits from boom

The grid’s limited capacity has not stopped PG&E from wooing companies that want to build data centers.

“I think we will definitely be one of the big ancillary winners of the demand growth for data centers,” Patricia Poppe, PG&E’s chief executive, told Wall Street analysts on an April conference call.

Poppe said she recently invited the company’s tech customers to an event at a San José substation.

“When I got there, I was pleasantly surprised to see AWS, Microsoft, Apple, Google, Equinix, Cisco, Western Digital Semiconductors, Tesla, all in attendance. These are our customers that we serve who want us to serve more,” she said on the call. “They were very clear: they would build … if we can provide.”

In June, PG&E revealed it had received 26 applications for new data centers, including three that need at least 500 megawatts of power, 24 hours a day. In all, the proposed data centers would use 3.5 gigawatts. That amount of power could support nearly 5 million homes, based on the average usage of a California household of 6,174 kilowatt-hours a year.
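PG&E’s household comparison can be reproduced with back-of-the-envelope arithmetic. The sketch below uses only the figures reported above (3.5 gigawatts of load and 6,174 kilowatt-hours per household per year); it is an illustration, not part of the utility’s own analysis, and it assumes the data centers draw their full load around the clock.

# Rough check of PG&E's "nearly 5 million homes" comparison (illustrative only).
# Assumes the proposed 3.5 GW of data center load runs around the clock.
proposed_load_kw = 3.5e6            # 3.5 gigawatts expressed in kilowatts
hours_per_year = 24 * 365           # 8,760 hours
household_kwh_per_year = 6_174      # average California household usage cited above

annual_energy_kwh = proposed_load_kw * hours_per_year
equivalent_homes = annual_energy_kwh / household_kwh_per_year
print(f"{equivalent_homes / 1e6:.2f} million homes")   # prints roughly 4.97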

In the June presentation, PG&E said the new data centers would require it to spend billions of dollars on new infrastructure.

Already PG&E can’t keep up with connecting customers to the grid. It has fallen so far behind on connecting new housing developments that last year legislators passed a law to try to shorten the delays. At that time, the company told Politico that the delays stemmed from rising electricity demand, including from data centers.

In a statement to The Times, PG&E said its system was “ready for data centers.”

The company said its analysis showed that adding the data centers would not increase bills for other customers.

Most of the year, excluding extreme hot weather, its grid “is only 45% utilized on average,” the company said.

“Data centers’ baseload will enable us to utilize more of this percentage and deliver more per customer dollar,” the company said. “For every 1,000 MW load from data centers we anticipate our customers could expect 1-2% saving on their monthly electricity bill.”

The company added that it was “developing tools to ensure that every customer can cost-effectively connect new loads to the system with minimal delay.”

Lynch questioned the company’s analysis that adding data centers could reduce bills for other customers. She pointed out that utilities earn profits by investing in new infrastructure. That’s because they get to recover that cost — plus an annual rate of return — through rates billed to all customers.

“The more they spend, the more they make,” she said.
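Lynch is describing cost-of-service regulation, in which a utility’s allowed revenue includes a return on the capital it has invested. The numbers in the sketch below are entirely hypothetical, chosen only to illustrate why added infrastructure spending tends to raise the total collected from ratepayers; they do not reflect PG&E’s or Silicon Valley Power’s actual rate cases.

# Illustrative-only sketch of a simplified utility revenue requirement.
# All dollar figures and rates are hypothetical.
def revenue_requirement(operating_costs, rate_base, allowed_return, depreciation_rate):
    # expenses + depreciation on capital + allowed profit on undepreciated capital
    return operating_costs + rate_base * depreciation_rate + rate_base * allowed_return

before = revenue_requirement(operating_costs=2.0e9, rate_base=10.0e9,
                             allowed_return=0.09, depreciation_rate=0.03)
after = revenue_requirement(operating_costs=2.0e9, rate_base=12.0e9,   # +$2B of new lines and substations
                            allowed_return=0.09, depreciation_rate=0.03)
print(f"added annual recovery from customers: ${after - before:,.0f}")  # $240,000,000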

In the desert, cheap land and green energy

A geothermal plant viewed from across the Salton Sea in December 2022.

(Gina Ferazzi / Los Angeles Times)

The power and land constraints in Santa Clara and other cities have data center developers looking for new frontiers.

“On the edge of the Southern California desert in Imperial County sits an abundance of land,” begins the sales brochure for the data center that a company called CalEthos is building near the south shore of the Salton Sea.

Electricity for the data center’s servers would come from the geothermal and solar plants built near the site in an area that has become known as Lithium Valley.

The company is negotiating to purchase as much as 500 megawatts of power, the brochure said.

Water for the project would come from the state’s much-fought-over allotment from the Colorado River.

Imperial County is one of California’s poorest counties. More than 80% of its population are Latino. Many residents are farmworkers.

Executives from Tustin-based CalEthos told The Times that drawing power from the nearby geothermal plants would help the local community.

“By creating demand for local energy, CalEthos will help accelerate the development of Lithium Valley and its associated economic benefits,” Joel Stone, the company’s president, wrote in an email.

“We recognize the importance of responsible energy and water use in California,” Stone said. “Our data centers will be designed to be as efficient as possible.”

For example, Stone said that in order to minimize water use, CalEthos plans a cooling system where water is recirculated and “requires minimal replenishment due to evaporation.”

Already, a local community group, Comite Civico del Valle, has raised concerns about the environmental and health risks of one of the nearby geothermal plants that plans to produce lithium from the brine brought up in the energy production process.

One of the group’s concerns about the geothermal plant is that its water use will leave less to replenish the Salton Sea. The lake has been decreasing in size, creating a larger dry shoreline that is laden with bacteria and chemicals left from decades of agricultural runoff. Scientists have tied the high rate of childhood asthma in the area to dust from the shrinking lake’s shores.

James Blair, associate professor of geography and anthropology at Cal Poly Pomona, questioned whether the area was the right place for a mammoth data center.

“Data centers drain massive volumes of energy and water for chillers and cooling towers to prevent servers from overheating,” he said.

Blair said that while the company can tell customers its data center is supported by environmentally friendly solar and geothermal power, it will take that renewable energy away from the rest of California’s grid, making it harder for the state to meet its climate goals.

Read the full story here.

South Texas coal-fired power plant to switch to clean energy after receiving more than $1 billion in federal money

San Miguel Electric Cooperative’s plan to convert its coal plant to a solar and battery facility will leave only 14 coal-fired power plants in the state.

A South Texas coal-fired power plant will receive more than $1 billion in funding from the U.S. Department of Agriculture to convert into a solar and battery facility, according to the agency.

The switch by San Miguel Electric Cooperative, located in Christine in Atascosa County, to a solar and battery plant will be funded by more than $1.4 billion of a $4.37 billion federal grant to support clean energy while maintaining rural jobs. With the co-op’s transition to a renewable energy plant, only 14 coal-fired power plants will be left in the state.

In September, the CEO of San Miguel Electric Cooperative, Craig Courter, told a local newspaper that with federal funding, the co-op can “virtually eliminate our greenhouse gas emissions while continuing to provide affordable and reliable power to rural Texans.”

“We take pride in our attention to detail in safety, environmental compliance, community service and mined land reclamation,” Courter told the Pleasanton Express.

According to the USDA’s Thursday announcement, the transformation will reduce climate pollution by more than 1.8 million tons yearly and support as many as 600 jobs.

In 2019, a Texas Tribune investigation showed that state agencies allowed San Miguel Cooperative to contaminate acres with toxic chemicals. These chemicals can leach into groundwater and soil and endanger people’s health. According to 2023 EPA data, the plant is the fourth-largest mercury polluter of all power plants in the state.

“For years, folks in my county have been worried about water contamination from San Miguel’s lignite mine, so with this announcement, we are hopeful that McMullen County’s water will be clean long into the future,” McMullen County Judge James Teal told the Sierra Club, a grassroots environmental group.

Teal said that county government officials are looking forward to a benefits plan that will “implement a quality remediation process for the existing plant and mine and provide us with peace of mind that the mess has been cleaned up.”

San Miguel will still need to establish a timeline for shutting down the coal plant. Still, it’s a “historic victory” for South Texas, said James Perkins, a Sierra Club Texas campaign organizer. Other co-ops in Arizona, Colorado, Florida, Georgia, Minnesota, and Nebraska received similar federal funding.

“Texans want healthy air and water and affordable, reliable energy — and we’re ready to come together to get it done,” said Perkins.

Hawaiian Electric Company's Shaky Credit Prompts Proposal for Help From State

Still reeling financially from the devastating wildfires that destroyed much of Lahaina in 2023, Hawaiian Electric Co. wants the state to back the utility’s contracts with wind and solar farms.

Still reeling financially from the devastating wildfires that killed at least 102 people and destroyed much of Lahaina in 2023, Hawaiian Electric Co. wants the state to back the utility’s contracts with wind and solar farms.

The idea is to make sure new projects can come online despite a cloud of uncertainty in financial markets over HECO.

Rebecca Dayhuff Matsushima, HECO’s vice president for resource procurement, said the company hasn’t finished revising proposed legislation for lawmakers to introduce. But she acknowledged the company has been briefing key lawmakers on its proposal ahead of the legislative session that starts in January.

“We’re still refining that draft and we hope to get close to a final version later this week,” she said.

The idea is for the state to step into HECO’s shoes if the company were to default on payment obligations to wind and solar farms.

At stake, Matsushima said, is the ability for HECO to seamlessly bring online large-scale renewable projects to replace aging fossil-fuel burning generators targeted to shut down in the next several years.

“Utility scale projects are being put on hold left and right,” said Isaac Moriwake, managing attorney for Earthjustice’s regional office in Honolulu. “Right now, we’re completely stalled out.”

Hawaii Rep. Nicole Lowen, chair of the House Energy and Environmental Protection Committee, said HECO’s proposal makes sense conceptually as a solution and should pose little or no risk to utility customers or taxpayers. “But,” Lowen said, “the devil is always in the details.”

Contracts Are Key Part Of Hawaii’s Energy Policy

Hawaii’s energy policy calls for all electricity sold in the state to be produced from renewable resources by 2045. To achieve that goal, HECO relies on third-party “independent power producers” to build large-scale projects — chiefly wind and solar farms, which require massive investments recouped over decades.

To pay for the projects, the power producers enter long-term contracts with HECO to buy electricity for a certain price. The producers then borrow money to pay for the projects up front, with a promise to use payments from HECO to repay the loans.

The problem is HECO’s credit profile, which was battered after the August 2023 wildfire. The company faces hundreds of lawsuits related to the fire, which was started when a downed HECO power line ignited dry grasses, according to official investigations. As a result, the company’s stock price has plummeted, and its credit rating has been cut to junk status.

That’s made it hard for the power producers to borrow money when they go to credit markets saying their customer is a utility facing billions of dollars in potential liability.

“Independent Power Producers (‘IPPs’) have expressed concerns with the Hawaiian Electric’s credit rating and the inability of the IPPs to finance projects or to finance them at reasonable rates given the Company’s current credit rating and financial situation,” the company explains in a document shared with lawmakers and others.

The problem has lingered since last session, when it started becoming clear that fallout from the fires was affecting Hawaii’s progress toward its renewable energy goals.

At that time, lawmakers proposed a bill to enable HECO to strengthen its credit profile by letting it issue a new type of bond. Unlike other types of corporate debt, the new bonds could have been secured by a new fee charged directly to utility customers. Such bonds are viewed as carrying little risk and are frequently used by utilities to raise money because they bear lower interest rates than standard corporate debt.

The securitization bill along with other measures theoretically would have shored up HECO’s credit profile and could have made it easier for the power producers to borrow money at low rates to finance their projects. Supporters included producers like Longroad Energy and Clearway Energy, as well as the Ulupono Initiative, which invests in renewables.

But some lawmakers viewed the securitization bill as an open-ended bailout for HECO and sought sweeping changes from the utility in return. The measure took another political hit when HECO’s chief executive, Shelee Kimura, testified that HECO might use funds from securitization to pay wildfire claims as a last resort. The measure ultimately stalled.

The new idea is a narrower proposal to backstop HECO’s renewable energy contracts using the state’s creditworthiness.

“With the state’s ability to step into the utility’s place, it is likely that financing parties will view contracts with the utility as being supported by the investment grade credit rating of the state instead of the utility, avoiding higher bills and risks to reliability,” the company says in its presentation.

As envisioned, the proposal would mean little risk to the state if it had to step into HECO’s shoes, Lowen said.

Electricity generated by the power producers would go to customers who would pay for it. But instead of that money flowing through HECO to the power producers, the money would flow through the state.

But Lowen said it’s unlikely the state would have to step up for HECO.

And HECO’s fortunes soon may change dramatically. The utility and its parent, Hawaiian Electric Industries, have joined other defendants in the massive wildfire litigation to craft a $4 billion offer designed to settle all wildfire claims. While the fire victims have agreed to settle, the insurance industry remains a major holdout. Having paid more than $2 billion in wildfire claims to victims, the insurers want to sue HECO and others allegedly responsible for starting the fires to recoup their claims.

The Hawaii Supreme Court is expected to rule next month on whether the parties can settle without the insurers signing on.

In the meantime, HECO’s Matsushima said it’s important to give the power producers confidence to invest in Hawaii. Permits for existing fossil fuel generators on Maui and the Big Island are set to expire in 2028 and additional projects on Maui are heading toward obsolescence in 2030 and 2031. Oahu generators face no deadlines, but there is room for expansion, she said.

It benefits customers to get renewable projects on track to ensure customers reliable access to electricity from clean resources at good prices, Matsushima said.

“This definitely is something we should be looking at,” Earthjustice’s Moriwake said.

This story was originally published by Honolulu Civil Beat and distributed through a partnership with The Associated Press.

Oil and gas firms operating in Colorado falsified environmental impact reports

State’s energy and carbon management commission said fraudulent pollution data was reported for at least 344 wells.

Oil and gas companies operating in Colorado have submitted hundreds of environmental impact reports with “falsified” laboratory data since 2021, according to state regulators.

Colorado’s energy and carbon management commission (ECMC) said on 13 December that contractors for Chevron and Oxy had submitted reports with fraudulent data for at least 344 oil and gas wells across the state, painting a misleading picture of their pollution levels. Consultants for a third company, Civitas, had also filed forms with falsified information for an unspecified number of wells, regulators said.

Some of the reports, which were conducted and filed by the consulting groups Eagle Environmental Consulting and Tasman Geosciences, obscured the levels of dangerous contaminants in nearby soils, including arsenic, which is linked to heart disease and a variety of cancers, and benzene, which is linked to leukemia and other blood disorders, among other pollutants, according to the commission.

“I do believe that the degree of alleged fraud warrants some criminal investigation,” said Julie Murphy, the ECMC director, in November.

Regulators first revealed in November that widespread data fabrication had occurred, noting that the companies had voluntarily disclosed the issue months earlier. Last week, as officials specified which sites were known to be affected, the New Mexico attorney general’s office said it was also gathering information about the consulting groups’ testing methods.

“This highlights the whole problem of our regulatory agency relying on operator-reported data,” said Heidi Leathwood, climate policy analyst for 350 Colorado, an environmental non-profit. “The public needs to know that they are really being put at risk by these carcinogens.”

Paula Beasley, a Chevron spokesperson, wrote via email that an independent contractor – which ECMC identified as Denver-based Eagle Environmental Consulting – notified the company in July that an employee had manipulated laboratory data.

“When Chevron became aware of this fraud, it immediately launched an investigation into these incidents and continues to cooperate fully and work closely with the Colorado Energy and Carbon Management Commission,” Beasley wrote. “Chevron is shocked and appalled that any third-party contractor would intentionally falsify data and file it with state officials.”

Jennifer Price, an Oxy spokesperson, also wrote via email that a third-party environmental consultant informed the company about employee-altered lab reports and associated forms. “Upon notification, we reported the issue to Colorado’s Energy and Carbon Management Commission and are reassessing the identified sites to confirm they meet state environmental and health standards,” she added.

In emailed responses, Tasman Geosciences spokesperson Andy Boian said that Tasman’s data alterations were the work of a single employee and were “minor” in nature, and presented “no human health risk”. But Kristin Kemp, the ECMC’s community relations manager, said the commission’s investigation had not yet confirmed whether that was true.

“What we can say already is that the degree of falsified data is vast, from seemingly benign to more significant impact,” she said.

Boian also said Tasman “has filed legal action” against its former employee.

Civitas and Eagle Environmental Consulting did not respond to requests for comment.

Across the US, cash-strapped state regulators have long outsourced environmental analysis to fossil fuel companies, who self-report their own ground-level impacts. But the revelations about widespread data fabrication in Colorado – the fourth-largest oil- and gas-producing state in the US – raise questions about whether operators and their consultants can truly self-police.

“It’s obvious: if you want the oil and gas industry to pay you money for a service, you better not find any big problems, or they’re not going to pay you,” said Sharon Wilson, a former consultant for the oil and gas industry who is now an anti-fracking activist in Texas. She said she left her post after her employer’s findings, which she described as trustworthy, were routinely ignored by industry.

It is not uncommon for hired consultants to misreport numbers in a way that benefits their clients in the fossil fuel industry, said Anthony Ingraffea, emeritus professor of civil engineering at Cornell University. In 2020, he published a study that found widespread anomalies in how methane emissions were reported across fracking sites in Pennsylvania.

“Make sure that the responsibility – the regulatory responsibility, the moral responsibility – is as uncertain as your lawyers can set it up to be,” he said of the practice of outsourcing environmental impact studies. “In other words, point to somebody else.”

In an email, Kemp said that companies, contractors and regulators support one another like legs on a three-legged stool, with each trusting the other to pull its weight. She explained that regulators like the ECMC will always be at least somewhat dependent on self-reported data, due to the impracticality of monitoring hundreds of operators at thousands of sites – but that existing processes may need reconsideration.

“ECMC’s regulatory workflow is grounded in an expectation that people abide by the law, with reasonable measures in place to ensure that to be the case,” she wrote. “But if we determine we can no longer rely broadly on receiving accurate information, we’d need action – and the scope and scale of that action will be determined by what we learn during the ongoing investigation.”

According to the commission, 278 of the wells disclosed so far to have falsified information are operated by Chevron, which contracted with Eagle Environmental. Sixty-six belong to Oxy, a Houston-based energy firm which contracted with Tasman Geosciences. Civitas, which also worked with Eagle Environmental Consulting, disclosed it too had filed falsified data, but has not yet shared information about which of its sites were affected.

Most of the wells in question are in rural Weld county, in north-eastern Colorado, which is home to 82% of the state’s oil production and contains more than half of its gas wells. However, regulators revealed that some of the sites with falsified data are close to cities such as Fort Collins, Greeley and Boulder. About half are no longer operational and had been deemed safely remediated by the state.

So far, the only sites shared with the public have been those self-reported by the operators, rather than discovered by the ECMC.

“It’s likely more sites will become known as the ongoing investigation unfolds,” Kemp wrote.

Eagle and Tasman, the consultants who allegedly provided false data, also work outside the state, raising concerns their employees may have submitted fraudulent data elsewhere.

“We believe that this is potentially of such danger and magnitude that the situation warrants further inquiry,” said Mariel Nanasi, executive director of the Santa Fe-based non-profit New Energy Economy.

Lauren Rodriguez, director of communications for New Mexico’s office of the attorney general, confirmed on 16 December that the office was indeed looking into the allegations around the consulting groups’ work.

“The single Tasman individual involved in the data alteration did not do any work for Tasman in [New Mexico], or any other states,” Boian said by email.

Kemp, the ECMC spokesperson, said it was still unclear why two independent third-party consultants came forward to self-report data falsification around the same time. But the consequences could be serious: forging an official document filed to a public office is a class 5 felony in Colorado, punishable by one to three years in prison and up to $100,000 in fines. The ECMC will also consider fines and other enforcement actions, she said.

The Colorado attorney general’s office declined to comment on the ongoing investigation. And while Kemp said it wasn’t yet clear why the environmental consultants admitted the falsification when they did, she noted that the buck ultimately stops with the oil and gas operators.

“Regardless of who’s at fault, the burden of responsibility falls to the operator,” she said.

Feds to assess environmental risks of proposed Northwest Hydrogen Hub

Companies have proposed 10 projects for the Northwest hub so far, including several hydrogen production facilities, hydrogen distribution pipelines and storage projects, and projects that would spur adoption of hydrogen-powered trucks, buses and hydrogen refueling stations, according to the U.S. Department of Energy.

A year after naming the Northwest one of seven new “regional hydrogen hubs” in a nationwide competition, the U.S. Department of Energy is beginning its review of possible environmental risks of developing certain hydrogen projects and is inviting the public into the process.

The review, announced Wednesday, will analyze any adverse effects from developing hydrogen projects and the impact of potential infrastructure, their scope, design and construction. But the assessments are only a first step and do not necessarily mean the projects will go forward and receive funding, the agency said. It is holding a virtual meeting for the public in January and will take comments until spring.

The projects involve the development and distribution of “green” hydrogen energy and its end users. Green hydrogen can be produced with water and used without emitting greenhouse gases. Green hydrogen energy is seen as a key source of clean energy to help reduce climate-warming emissions from sectors that currently rely on fossil fuels and are hard to electrify because of the huge amounts of energy they demand.

The Pacific Northwest Hydrogen Hub, which includes Washington, Oregon and Montana, was chosen in 2023 to receive about $1 billion in federal funding during the next decade. Companies have proposed 10 projects for the Northwest hub so far, including several hydrogen production facilities, hydrogen distribution pipelines and storage projects, and projects that would spur adoption of hydrogen-powered trucks, buses and hydrogen refueling stations, according to the U.S. Department of Energy.

The hydrogen produced in the Northwest could also be used to make fertilizer and power energy-demanding processes like semiconductor manufacturing.

By replacing fossil fuels in some transportation and in hard to electrify sectors, the hub could divert up to 1.7 million metric tons of carbon dioxide from entering the atmosphere each year, according to the Pacific Northwest Hydrogen Association. That’s equivalent to removing about 400,000 gasoline-powered cars from roads annually.

But the Northwest Hub has faced challenges getting off the ground, with project developers pausing plans due to unaffordable renewable energy prices as regional rates for electricity — needed to make green hydrogen — skyrocket. They’re also facing a lack of demand along with delays and confusion over a federal tax credit that was meant to spur investment and jump-start the industry.

Learn more and submit comments

Register here to attend a virtual meeting about the hydrogen hub environmental assessment on Wednesday, Jan. 22 from 6 to 8 p.m. Submit comments on the environmental assessment process through March 23, 2025 here.

‘Green hydrogen’

Green hydrogen starts with water, which is made up of hydrogen and oxygen. Using a device called an electrolyzer, an electric current is passed through the water, causing a reaction that splits the hydrogen and oxygen from one another. The hydrogen is captured and stored. The production process requires a lot of electricity. But as long as that electricity comes from a renewable source, such as wind or solar power, the hydrogen is “green” and carbon neutral. When burned as fuel, hydrogen emits no carbon dioxide or greenhouse gases, just water.

-- Alex Baumhardt, Oregon Capital Chronicle, abaumhardt@oregoncapitalchronicle.com

Oregon Capital Chronicle is part of States Newsroom, the nation’s largest state-focused nonprofit news organization.
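The cars-off-the-road comparison above can be checked with a single division. The per-vehicle figure in the sketch below is an assumption (roughly 4.6 metric tons of CO2 per typical gasoline-powered car per year, a commonly cited EPA estimate) rather than a number from the hub’s own analysis.

# Quick check of the "about 400,000 cars" equivalence for the Pacific Northwest Hydrogen Hub.
co2_diverted_tons = 1_700_000    # metric tons per year, per the Pacific Northwest Hydrogen Association
co2_per_car_tons = 4.6           # assumed EPA estimate for a typical gasoline passenger car, per year
equivalent_cars = co2_diverted_tons / co2_per_car_tons
print(f"~{equivalent_cars:,.0f} cars")   # roughly 370,000, consistent with "about 400,000"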

Need a research hypothesis? Ask AI.

MIT engineers developed AI frameworks to identify evidence-driven hypotheses that could advance biologically inspired materials.

Crafting a unique and promising research hypothesis is a fundamental skill for any scientist. It can also be time consuming: New PhD candidates might spend the first year of their program trying to decide exactly what to explore in their experiments. What if artificial intelligence could help?

MIT researchers have created a way to autonomously generate and evaluate promising research hypotheses across fields, through human-AI collaboration. In a new paper, they describe how they used this framework to create evidence-driven hypotheses that align with unmet research needs in the field of biologically inspired materials.

Published Wednesday in Advanced Materials, the study was co-authored by Alireza Ghafarollahi, a postdoc in the Laboratory for Atomistic and Molecular Mechanics (LAMM), and Markus Buehler, the Jerry McAfee Professor in Engineering in MIT’s departments of Civil and Environmental Engineering and of Mechanical Engineering and director of LAMM.

The framework, which the researchers call SciAgents, consists of multiple AI agents, each with specific capabilities and access to data, that leverage “graph reasoning” methods, where AI models utilize a knowledge graph that organizes and defines relationships between diverse scientific concepts. The multi-agent approach mimics the way biological systems organize themselves as groups of elementary building blocks. Buehler notes that this “divide and conquer” principle is a prominent paradigm in biology at many levels, from materials to swarms of insects to civilizations — all examples where the total intelligence is much greater than the sum of individuals’ abilities.

“By using multiple AI agents, we’re trying to simulate the process by which communities of scientists make discoveries,” says Buehler. “At MIT, we do that by having a bunch of people with different backgrounds working together and bumping into each other at coffee shops or in MIT’s Infinite Corridor. But that's very coincidental and slow. Our quest is to simulate the process of discovery by exploring whether AI systems can be creative and make discoveries.”

Automating good ideas

As recent developments have demonstrated, large language models (LLMs) have shown an impressive ability to answer questions, summarize information, and execute simple tasks. But they are quite limited when it comes to generating new ideas from scratch. The MIT researchers wanted to design a system that enabled AI models to perform a more sophisticated, multistep process that goes beyond recalling information learned during training, to extrapolate and create new knowledge.

The foundation of their approach is an ontological knowledge graph, which organizes and makes connections between diverse scientific concepts. To make the graphs, the researchers feed a set of scientific papers into a generative AI model. In previous work, Buehler used a field of math known as category theory to help the AI model develop abstractions of scientific concepts as graphs, rooted in defining relationships between components, in a way that could be analyzed by other models through a process called graph reasoning. This focuses AI models on developing a more principled way to understand concepts; it also allows them to generalize better across domains.

“This is really important for us to create science-focused AI models, as scientific theories are typically rooted in generalizable principles rather than just knowledge recall,” Buehler says. “By focusing AI models on ‘thinking’ in such a manner, we can leapfrog beyond conventional methods and explore more creative uses of AI.”

For the most recent paper, the researchers used about 1,000 scientific studies on biological materials, but Buehler says the knowledge graphs could be generated using far more or fewer research papers from any field.

With the graph established, the researchers developed an AI system for scientific discovery, with multiple models specialized to play specific roles in the system. Most of the components were built off of OpenAI’s ChatGPT-4 series models and made use of a technique known as in-context learning, in which prompts provide contextual information about the model’s role in the system while allowing it to learn from data provided.

The individual agents in the framework interact with each other to collectively solve a complex problem that none of them would be able to do alone. The first task they are given is to generate the research hypothesis. The LLM interactions start after a subgraph has been defined from the knowledge graph, which can happen randomly or by manually entering a pair of keywords discussed in the papers.

In the framework, a language model the researchers named the “Ontologist” is tasked with defining scientific terms in the papers and examining the connections between them, fleshing out the knowledge graph. A model named “Scientist 1” then crafts a research proposal based on factors like its ability to uncover unexpected properties and novelty. The proposal includes a discussion of potential findings, the impact of the research, and a guess at the underlying mechanisms of action. A “Scientist 2” model expands on the idea, suggesting specific experimental and simulation approaches and making other improvements. Finally, a “Critic” model highlights its strengths and weaknesses and suggests further improvements.

“It’s about building a team of experts that are not all thinking the same way,” Buehler says. “They have to think differently and have different capabilities. The Critic agent is deliberately programmed to critique the others, so you don't have everybody agreeing and saying it’s a great idea. You have an agent saying, ‘There’s a weakness here, can you explain it better?’ That makes the output much different from single models.”

Other agents in the system are able to search existing literature, which provides the system with a way to not only assess feasibility but also create and assess the novelty of each idea.

Making the system stronger

To validate their approach, Buehler and Ghafarollahi built a knowledge graph based on the words “silk” and “energy intensive.” Using the framework, the “Scientist 1” model proposed integrating silk with dandelion-based pigments to create biomaterials with enhanced optical and mechanical properties. The model predicted the material would be significantly stronger than traditional silk materials and require less energy to process.

Scientist 2 then made suggestions, such as using specific molecular dynamic simulation tools to explore how the proposed materials would interact, adding that a good application for the material would be a bioinspired adhesive. The Critic model then highlighted several strengths of the proposed material and areas for improvement, such as its scalability, long-term stability, and the environmental impacts of solvent use. To address those concerns, the Critic suggested conducting pilot studies for process validation and performing rigorous analyses of material durability.

The researchers also conducted other experiments with randomly chosen keywords, which produced various original hypotheses about more efficient biomimetic microfluidic chips, enhancing the mechanical properties of collagen-based scaffolds, and the interaction between graphene and amyloid fibrils to create bioelectronic devices.

“The system was able to come up with these new, rigorous ideas based on the path from the knowledge graph,” Ghafarollahi says. “In terms of novelty and applicability, the materials seemed robust and novel. In future work, we’re going to generate thousands, or tens of thousands, of new research ideas, and then we can categorize them, try to understand better how these materials are generated and how they could be improved further.”

Going forward, the researchers hope to incorporate new tools for retrieving information and running simulations into their frameworks. They can also easily swap out the foundation models in their frameworks for more advanced models, allowing the system to adapt with the latest innovations in AI.

“Because of the way these agents interact, an improvement in one model, even if it’s slight, has a huge impact on the overall behaviors and output of the system,” Buehler says.

Since releasing a preprint with open-source details of their approach, the researchers have been contacted by hundreds of people interested in using the frameworks in diverse scientific fields and even areas like finance and cybersecurity.

“There’s a lot of stuff you can do without having to go to the lab,” Buehler says. “You want to basically go to the lab at the very end of the process. The lab is expensive and takes a long time, so you want a system that can drill very deep into the best ideas, formulating the best hypotheses and accurately predicting emergent behaviors. Our vision is to make this easy to use, so you can use an app to bring in other ideas or drag in datasets to really challenge the model to make new discoveries.”
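For readers who want a concrete picture of the hand-offs described above, here is a minimal, hypothetical sketch of that kind of agent chain in Python. The role names follow the article, but everything else (the call_llm placeholder, the prompts, the data structures) is illustrative and is not the researchers’ SciAgents code.

# Hypothetical sketch of a SciAgents-style chain: each agent is a prompted language model,
# and the output of one agent becomes the input of the next. Illustrative only.
def call_llm(role_instructions: str, user_prompt: str) -> str:
    # Placeholder for a real language-model API call; swap in an actual client here.
    raise NotImplementedError

def generate_hypothesis(keyword_pair):
    definitions = call_llm(
        "You are the Ontologist. Define each concept and the relationships between them.",
        f"Concepts on the knowledge-graph path: {keyword_pair}",
    )
    proposal = call_llm(
        "You are Scientist 1. Draft a research proposal stressing novelty and unexpected "
        "properties, with predicted findings and likely mechanisms.",
        definitions,
    )
    plan = call_llm(
        "You are Scientist 2. Expand the proposal with specific experiments and simulations.",
        proposal,
    )
    critique = call_llm(
        "You are the Critic. List strengths, weaknesses, and concrete improvements.",
        plan,
    )
    return {"definitions": definitions, "proposal": proposal, "plan": plan, "critique": critique}

# Example keyword pair from the paper's validation run:
# result = generate_hypothesis(("silk", "energy intensive"))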
