
Why the government won’t let you see its best tool for forecasting hurricanes

News Feed
Thursday, September 26, 2024

The lack of access to this model is spurring concerns that NOAA is holding back information that could help people prepare for deadly storms.

The National Oceanic and Atmospheric Administration for four years has used a hurricane forecasting tool that often surpasses all others in its accuracy, but it won’t release its predictions to the public, spurring concerns that it is holding back information that could help people prepare for deadly storms.

The tool, known as the HCCA model, was developed by NOAA as part of a program to reduce errors in hurricane forecasts. Statistics published by NOAA’s National Hurricane Center show that from 2020 to 2023, HCCA was one of the two best models for forecasting a storm’s track and intensity. In 2022, HCCA provided the most accurate track forecasts for all lead times out to four days, even beating the Hurricane Center’s official forecast.

During Ian, the devastating Category 4 hurricane that struck Florida in late September 2022, the HCCA model produced two-day and three-day track forecasts superior to the Hurricane Center's. That hurricane was particularly hard to predict, and better track forecasts could have improved evacuation decisions and saved lives.

But because of agreements with a vendor, NOAA has refused to release the model’s results to the public. With a massive storm headed toward a U.S. landfall this week, critics of the agreement argue taxpayer-funded forecasts should be freely and openly available. They say the model’s forecast could be highlighted in television and online graphics as one of the more reliable scenarios given its track record.

“The HCCA is the gold standard in modern consensus modeling, and if it were available, we would show it,” Bryan Norcross, Fox Weather hurricane specialist, said in an email.

The HCCA, or Hurricane Forecast Improvement Program (HFIP) Corrected Consensus Approach model, is one of more than 25 models used by the National Hurricane Center and is often referenced in its forecast discussions. It uses a proprietary technique, obtained from the private weather risk firm now known as RenaissanceRe Risk Sciences, to blend forecasts from other hurricane forecast models.

“HCCA combines input from a number of models in a way that is weighted by their past performance,” Mark DeMaria, senior research scientist at Colorado State University and co-author of a research article describing the model, said in an email. “That allows biases from individual models to cancel each other and provide a more accurate forecast.”
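
DeMaria's description points to a standard consensus technique: remove each member model's known systematic bias, then average the members with weights tied to their historical skill. The sketch below is a minimal, hypothetical Python illustration of that idea; the member models, error statistics and inverse-error weighting are invented for the example and are not the proprietary HCCA formulation.

```python
# Minimal, hypothetical sketch of a bias-corrected, performance-weighted consensus
# forecast, in the spirit of DeMaria's description. The member models, their error
# statistics and the inverse-error weighting are illustrative assumptions; the real
# HCCA technique is proprietary and is not reproduced here.

def corrected_consensus(forecasts, hist_bias, hist_mae):
    """Blend member-model forecasts of one quantity (e.g. 48-hour storm latitude).

    forecasts: model name -> current forecast value
    hist_bias: model name -> mean past error (forecast minus observed)
    hist_mae:  model name -> mean absolute past error
    """
    # Subtract each model's systematic bias so individual errors tend to cancel.
    corrected = {m: f - hist_bias[m] for m, f in forecasts.items()}
    # Weight members by inverse historical error: better past performers count more.
    weights = {m: 1.0 / hist_mae[m] for m in forecasts}
    total = sum(weights.values())
    return sum(weights[m] * corrected[m] for m in forecasts) / total


# Hypothetical example: three members forecasting 48-hour storm latitude (degrees N).
forecasts = {"ModelA": 27.9, "ModelB": 28.4, "ModelC": 28.1}
hist_bias = {"ModelA": -0.2, "ModelB": 0.3, "ModelC": 0.0}
hist_mae = {"ModelA": 0.5, "ModelB": 0.9, "ModelC": 0.7}

print(round(corrected_consensus(forecasts, hist_bias, hist_mae), 2))  # blended latitude
```

The key design choice in this kind of scheme is that the weights and bias corrections come from past verification statistics, so a member that has recently performed well pulls the blend toward its solution without letting any single model's systematic error dominate.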

The agreement signed in 2020 by NOAA and the company enabled the agency to collaborate with the firm but does not allow the government to provide compensation. It states HCCA forecasts are “trade secrets and confidential information” that “shall not be publicly disclosed or disseminated” for a period of five years from the effective date of the agreement. The terms of the agreement were released to The Washington Post in response to a Freedom of Information Act request.

Some worry the model’s inaccessibility sets a bad precedent for future partnerships between the government and private industry if it keeps potentially lifesaving information from the public.

Maureen O’Leary, a National Weather Service spokesperson, said the agency strives for unrestricted public access to data and models but that “we must honor legal agreements made.”

She added that NOAA is “constantly evaluating new opportunities to improve our products and services and seeks to find the appropriate balance to share that information publicly.”

A company spokesperson for RenaissanceRe said in an email that its collaboration with NOAA is “one of [its] many public-private partnerships … which encourages risk knowledge sharing so communities around the world can better protect themselves.”

Some private weather providers, however, have voiced concerns about the lack of access to the model’s forecasts.

“This HCCA model … was developed at NOAA obviously using taxpayer resources,” Jonathan Porter, senior vice president at the forecasting services company AccuWeather, said in an interview. “This is an urgent public safety issue. It’s about ensuring … that we all have access to the same critical data as the Hurricane Center to effectively understand and communicate risks to people in harm’s way.”

Baron Weather, a longtime provider of weather content to broadcast media, also supports wider access to the model.

“It would certainly be a welcome addition for all broadcast meteorologists and assist them in communicating tropical forecast information and hazards to their viewers,” Bob Dreisewerd, the company’s chief executive, said in an email to The Post.

Open data policies challenged by commercial business models

NOAA plans to start making HCCA forecasts publicly available after its five-year agreement with RenaissanceRe, previously known as WeatherPredict Consulting, expires in March. “It is our intent to publicly release real-time HCCA model output and the source code before the start of the 2025 hurricane season,” O’Leary said in an email.

Porter said AccuWeather is “delighted that NOAA … will make HCCA forecast guidance available to meteorologists across the country so that they can better understand the rationale behind the National Hurricane Center’s forecast and warnings.”

But, he argues, restricted access to the model during this and previous hurricane seasons has been “a major setback” that “goes against the basic principles of … free and open distribution of government-based data.”

“It’s setting a very precarious precedent … threatening to unravel and reverse over 50 years of progress that’s been achieved through the cooperation of the government, academic and private sectors,” Porter said. It “raises the question of what won’t be distributed next.”

U.S. weather forecasting has long been a collaborative endeavor. Historically, NOAA and its international government partners have provided the foundational sensors and systems for making forecasts while the private sector helps to widely disseminate predictions and creates specialized products and services. The 2003 National Academies’ “Fair Weather” report helped define the roles of the U.S. government, private sector and academia at a time of growing friction between the sectors due to their increasingly overlapping roles.

The report noted “the government’s obligation to make its information as widely available as possible to those who paid for it — the taxpayers,” but also recognized the challenges of government-industry partnerships and the desire for policy “that permits commercial objectives to be achieved.”

The lines between the U.S. weather sectors have become even more blurred in recent years as the private sector has built up capabilities that were once exclusively undertaken by governments. NOAA now buys commercial satellite, aircraft and ground data and is collaborating with private companies that have recently built powerful AI weather models.

“The weather enterprise has become a lot more complicated in the past decade, with more observations and modeling being done by private-sector entities,” Keith Seitter, executive director emeritus at the American Meteorological Society, said in an email. “This has challenged the historical approach having all the data being openly and freely distributed … because the private-sector producers often need to protect their intellectual property as part of their business model.”

Seitter and Mary Glackin, former deputy undersecretary for operations at NOAA, are among those leading an American Meteorological Society study looking at the state of the weather enterprise two decades after the “Fair Weather” report. Glackin, now chair of the National Academies’ Board of Atmospheric Sciences and Climate, said policy around commercial weather data and technology acquisition presents a growing challenge.

“Plans must be a balance of public good and costs while also considering maintaining a vibrant U.S. private sector,” Glackin said in an email. “I suspect each opportunity will need to be weighed independently — at least until we have more experience.”

New policy guidance published in July by NOAA addresses the challenge of balancing public and commercial interests, stating that its “programs and offices should seek to maximize the public benefit derived from environmental data and data products obtained through commercial solutions by negotiating the least restrictive terms of use possible.”

Andrew Rosenberg, a former NOAA official and a senior fellow at the University of New Hampshire’s Carsey School of Public Policy, said the confidentiality requirements that come along with NOAA’s commercial partnerships can sometimes be too broad, at the expense of transparency that is designed to instill trust in its work.

That is especially concerning when it comes to weather forecasts that are meant to serve public health and safety, Rosenberg added. “I do think it’s problematic,” he said. “That isn’t really the way you want to serve the public interest.”

Can Lula still save the Amazon?

The power imbalance in Brazil's government that keeps environmental protections and Indigenous rights under threat.

When Brazilian president Luiz Inácio Lula da Silva took office in January 2023, he inherited environmental protection agencies in shambles and deforestation at a 15-year high. His predecessor, Jair Bolsonaro, had dismantled regulations and gutted institutions tasked with enforcing environmental laws. Lula set out to reverse these policies and to put Brazil on a path to end deforestation by 2030.

Environmental protection agencies have been allowed to resume their work. Between January and November of 2023, the government issued 40 percent more infractions against illegal deforestation in the Amazon when compared to the same period in 2022, when Bolsonaro was still in office. Lula's government has confiscated and destroyed heavy equipment used by illegal loggers and miners, and placed embargoes on production on illegally cleared land. Lula also reestablished the Amazon Fund, an international pool of money used to support conservation efforts in the rainforest. Just this week, at the G20 Summit, outgoing U.S. President Joe Biden pledged $50 million to the fund.

Indeed, almost two years into Lula's administration, the upward trend in deforestation has been reversed. In 2023, deforestation rates fell by 62 percent in the Amazon and 12 percent in Brazil overall (though deforestation in the Cerrado, Brazil's tropical savannahs, increased). So far in 2024, deforestation in the Amazon has fallen by another 32 percent.

Throughout this year, Brazilians also bore witness to the effects of climate change in a new way. In May, unprecedented floods in the south of the country impacted over 2 million people, displacing hundreds of thousands and leaving at least 183 dead. Other regions are now into their second year of extreme drought, which led to yet another intense wildfire season. In September, São Paulo and Brasília were shrouded in smoke coming from fires in the Amazon and the Cerrado.

And yet, despite the government's actions, environmental protections and Indigenous rights are still under threat. Lula is governing alongside the most pro-agribusiness congress in Brazilian history, which leaves his ability to protect Brazil's forests and Indigenous peoples in the long term severely constrained.

"I do believe that the Lula administration really cares about climate change," said Belen Fernandez Milmanda, assistant professor of political science and international studies at Trinity College and author of Agrarian Elites and Democracy in Latin America. "But on the other side, part of their governing coalition is also the agribusiness, and so far I feel like the agribusiness is winning."

Brazilian politics has always been fragmented, with weak parties. The current Chamber of Deputies, Brazil's equivalent to the House of Representatives, is made up of politicians from 19 different parties. "It makes it really difficult to govern without some kind of coordination device," said Fernandez Milmanda. Weak party cohesion makes it easier for interest groups to step into the vacuum and act as this coordination device.

Agribusiness has long been one of the most powerful interest groups in Brazilian politics, but its influence has grown steadily over the past decade as the electorate shifted to the right and the sector developed more sophisticated strategies to affect politics.

In Congress, agribusiness is represented by the bancada ruralista, or agrarian caucus, a well-organized, multi-party coalition of landowning and agribusiness-linked legislators that controls a majority in both houses of congress. Of the 513 representatives in the Chamber of Deputies, 290 are members of the agrarian caucus. In the senate, they make up 50 of 81 legislators.

Today, the agrarian caucus is larger than any single party in the Brazilian legislature. "Members of the agrarian caucus vote together. They have high discipline and most Brazilian parties don't," said Fernandez Milmanda. "This gives them immense leverage towards any president."

Much of the coordination around the legislative agenda takes place away from congress, at the headquarters of Instituto Pensar Agropecuária (IPA), a think tank founded in 2011 and financed largely by major agribusiness corporations, including some in the US and the European Union. Among IPA's main backers are Brazilian beef giant JBS, German pesticide producer BASF, and the US-based corporation Cargill, the world's largest agribusiness. Core members of the agrarian caucus reportedly meet weekly at IPA headquarters in Brasília's embassy row to discuss the week's legislative agenda.

"IPA is really important because they are the ones doing all the work, all the technical work," said Fernandez Milmanda. "They are drafting the bills that they then give to the legislators, and the legislators will present it as their own."

The agrarian caucus has tallied several long-awaited victories in the current congress, which took office alongside Lula in January 2023. Late last year, it overhauled Brazil's main law governing the use of pesticides. The new legislation, which Human Rights Watch called a "serious threat to the environment and the right to health," removes barriers for previously banned substances and reduces the regulatory oversight of the health and environment agencies. Instead, the Ministry of Agriculture, which has traditionally been led by a member of the agrarian caucus, now has the final say in determining which pesticides are cleared for use. Lula attempted to veto parts of the bill, but was overruled by congress. In the Brazilian system, an absolute majority in each chamber is enough to overrule a presidential veto.

Another recent victory for the agrarian caucus came as a major blow to Indigenous rights. Agribusiness has long been fighting in the courts for a legal theory called marco temporal ("time frame," in English), which posits that Indigenous groups can only claim their traditional lands if they were occupying them in 1988, the year the current Brazilian constitution was drafted. Opponents of the theory argue it disregards the fact that many Indigenous groups were expelled from their native lands long before that date. It has dire implications for the hundreds of Indigenous territories in Brazil currently awaiting demarcation, and could even impact territories that have already been recognized by law.

The theory had been making its way through the Brazilian justice system for 16 years, until it was ruled unconstitutional by the Supreme Court last year. Blatantly flouting the court's ruling, congress passed a bill codifying marco temporal into law. Lula tried to veto the bill, but he was overruled by the agrarian caucus again.

The bill is currently being discussed in conciliation hearings overseen by the Supreme Court, which is tasked with figuring out how the new law will work in light of the court's 2023 decision. The legal grey area in which many Indigenous groups occupying disputed lands now find themselves has contributed to a wave of attacks by land-grabbers and farmers in recent months.

These are only two examples of legislation that are part of what environmentalists have come to call the "destruction package," a group of at least 20 bills and three constitutional amendments currently proposed in congress that take aim at Indigenous rights and environmental protections.

"The executive has to put a stop to this, because otherwise the tendency will be towards very serious setbacks," said Suely Araújo, public policy coordinator at Observatório do Clima, a coalition of climate-focused civil society organizations.

But the government has limited tools at its disposal to block anti-environmental legislation. In the past, the executive branch had greater control over discretionary spending and was able to use this to its advantage while negotiating with congress. The past decade has seen a major power shift in Brazilian politics: congress has managed, through a series of legislative maneuvers, to capture a significant portion of the federal budget, weakening the hand of the executive.

Among the projects with a high likelihood of passing, according to analysis by Observatório do Clima, are bills that weaken Brazil's Forest Code, the key piece of legislation governing the use and management of forests. "It would make control much more difficult because illegal forms of deforestation would become legal," said Araújo.

One such bill reduces the amount of land farmers in the Amazon must preserve within their property from 80 to 50 percent. The move could open almost 18 million hectares of forest to agricultural development, according to a recent analysis that the deforestation mapping organization MapBiomas conducted for the Brazilian magazine Piauí. That is an area roughly the size of New York state, New Jersey, and Massachusetts combined. In a similar vein, another bill in the package removes protections for native grasslands, including large parts of the Cerrado and the Pantanal (the world's largest tropical wetland). In theory this would affect 48 million hectares of native vegetation. Yet another bill, which has already been approved in the Chamber of Deputies, overhauls the process of environmental licensing, essentially reducing it to a rubber stamp. "It does away with 40 years of environmental licensing in Brazil," said Araújo. "You might as well not have licensing legislation."

Part of the reason many of these bills have a chance of passing is the Lula government's limited leverage. With little support in congress and less control over the budget, bargaining with the agricultural caucus becomes a necessary tool to pass even legislation unrelated to the environment, such as economic reforms. During these negotiations, some environmentalists believe, concerns over Brazil's forests fall by the wayside.

"Perhaps there is a lack of leadership from the president himself, with a stronger stance in response to the demands of the ruralistas," said Araújo. "There are political agreements and negotiations that must be made. The bargaining chip cannot be environmental legislation."

This story was originally published by Grist with the headline "Can Lula still save the Amazon?" on Nov 21, 2024.

Overwhelmed by ever more clothing donations, charities are exporting the problem. Local governments must step up

We give or throw away more and more clothes every year, overwhelming charities and triggering large exports of secondhand clothing. There’s a better way

What happens to your clothes after you don't want them any more? Chances are, you will donate them to op shops run by a charity organisation.

There are more and more clothes in circulation, and they are getting cheaper and lower quality. That means the clothes you give away are worth less and less. For charities, this means donated clothes are less gift, more rubbish.

Our new research explores what happens to clothes and other textiles after we don't want them across nine cities in Europe, North America and Australia. The pattern was the same in most cities: the sheer volume is overwhelming many shops. In Geneva, donations to charity shops have surged 1,200% in three decades, from 250 tonnes in 1990 to 3,000 tonnes in 2021. Worldwide, we now dump 92 million tonnes of clothes and textiles a year, double the figure of 20 years earlier.

There's less and less value in managing these clothes locally. As a result, charities are forced to send more clothes to landfill – or sell bale after bale of clothing to resellers, who ship them to nations in the Global South.

Local governments usually handle other waste streams. But on clothes and textiles, they often leave it to charitable organisations and commercial resellers. This system is inherited from a time when used clothing was a more valuable resource, but the rising quantity of clothing has pushed it towards collapse.

From January, all EU states have to begin rolling out collection services for used textiles. But in Australia and the United States there are no moves to do the same – even though these two countries consume the most textiles per capita in the world. As we work towards creating circular economies, where products are continually reused, this will have to change.

Textile waste is a new problem

Historically, textiles were hard to make and hence valuable. They were used for as long as possible and reused as rags or for other purposes before becoming waste. These natural fibres would biodegrade or be burned for energy.

But synthetic fibres and chemical finishings have made more and more clothes unable to biodegrade. Fast fashion – in which high fashion trends are copied and sold at low cost – is only possible because of synthetic fibres such as polyester. These clothes are often worn for a brief period and then given or thrown away.

What happens to this waste? To find out, we looked at textile waste in Amsterdam, Austin, Berlin, Geneva, Luxembourg, Manchester, Melbourne, Oslo and Toronto. In eight of these cities, charities and commercial resellers dealt with the lion's share of clothing waste. But in Amsterdam, local governments manage the problem.

Across the nine cities, most donated clothes go not to charity shop shelves but to export. In Oslo, 97% of clothes are exported.

The flood of clothes is producing strange outcomes in some places. In Melbourne, charities are now exporting higher quality secondhand clothes to Europe. But we found this forces independent secondhand clothing outlets to import similar clothes back from Europe or the US.

Charity organisations usually export the clothes across the Global South. But shipping container loads of secondhand clothes and textiles can do real damage environmentally. Clothing that can't be sold becomes waste. In Ghana, there are now 20-metre-high hills made of clothing waste. Synthetic clothes clog up rivers, trap animals and spread plastics as they break apart. This practice has been dubbed "waste colonialism". Uganda has recently banned imports.

The secondhand clothing export industry provides work, but its social and environmental impacts have been devastating.

What should we do?

At present, charities and resellers are struggling to manage the rising volume of donations, but they have little room to change. Exporting excess clothing is getting less profitable, but moving to local sorting and resale would be even less so, due to higher costs and too many clothes collected relative to demand for secondhand items.

These clothes are disposed of by consumers who live in cities in wealthier countries. The actions city leaders take can reduce the problem globally, such as encouraging residents to buy fewer new clothes and boosting local reuse, repair and recycling efforts.

We are already seeing grassroots initiatives emerging in all nine studied cities promoting local reuse and repair, with some receiving government support and others operating independently. To make real change, municipal governments will have to take on a larger role.

At present, municipal governments in most of the cities we studied have limited roles, ranging from sending clothing waste to landfill to sharing data on clothing recycling bins, letting charities and resellers set up collection points and supporting repair and swapping. Here are three ways local governments could take the lead:

1. Curb overconsumption

Dealing with waste is a major role for municipal governments, and comes with major costs. To reduce clothing waste, cities could launch campaigns against overconsumption by focusing on the environmental damage done by fast fashion – or even ban ads for clothing retailers in city centres.

2. Boost reuse

Local governments could stop charging charities for the right to collect clothing and instead offer compensation for every kilo of collected textiles, to help replace the money they get from sending clothing bales to resellers for export. Cities can also train and support circular economy practitioners, such as those involved in repair and upcycling.

3. Reduce exports of clothing waste

City leaders could reduce textile exports by recognising them as a waste stream and including textiles in their waste management planning.

One thing is certain – if we keep going as we are, flows of clothing waste will only grow, leading to more waste locally and greatly increasing the waste problem overseas.

The authors do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.

Construction is the world’s biggest polluter, yet Labour still refuses to tackle it | Simon Jenkins

Refurbishing an old building is subject to full VAT, but it isn't if you build a polluting new one. The government's priorities are all wrong.

You can damn oil companies, abuse cars, insult nimbys, kill cows, befoul art galleries. But you must never, ever criticise the worst offender of all. The construction industry is sacred to both the left and the right. It may be the world's greatest polluter, but it is not to be criticised. It is the elephant in the global-heating room.

It's hard not to feel as though we have a blind spot when it comes to cement, steel and concrete. A year has now passed since the UN's environment programme stated baldly that "the building and construction sector is by far the largest emitter of greenhouse gases". The industry accounts for "a staggering 37% of global emissions", more than any other single source. Yet it rarely gets the same attention as oil or car companies.

Everywhere, building anew is revered. In her budget last month, Rachel Reeves refused to end the 20% VAT imposed on the refurbishment of old buildings, while continuing to exempt new ones from VAT. This amounts to a subsidy for polluting, carbon-intensive activities, much like her failure to index fuel duty. She is doing so because she and the prime minister, like the Tories before them, are putty in the hands of Britain's powerful construction and oil lobbies. In his party conference speech, Keir Starmer promised to "bulldoze" local planners who impeded developers. He wants to build 1.5m new homes, and defies locals who oppose him. He even harks back to the 1940s and 50s in planning new towns in the countryside, the most carbon-costly, car-reliant form of development imaginable.

Of course, Britain needs more homes, and many of these are likely to be new ones. Yet there is plenty the government could do to free up houses. Britain's regulation of its existing building stock is atrocious. Council tax bands have been frozen since 1991, meaning people rattling around in homes that are too big for them are discouraged from moving. A million homes in England are now estimated to be lying empty and millions more are underoccupied. Deterred by the VAT that applies to refurbishing existing buildings, developers are allowed to demolish 50,000 mostly reusable buildings a year. The release of their embodied carbon is colossal.

A much greater conversation is needed about the merits of constructing new homes versus retrofitting existing housing stock. While Labour has pledged to build 300,000 eco-friendly new homes each year, and is expected to tighten environmental standards for construction projects, the housebuilding lobby will fiercely resist any changes. The dominant cry is still build, baby, build. Rarely is it "convert", "retrofit" or "reuse". There are plenty of reasons we should be wary of this simple mantra, aside from the climate costs. Building lots more houses in south-east England would only worsen the widening gap between north and south, for example. Towns awash in brownfield sites need restoring, while villages should be allowed to grow organically, rather than being forced to swallow giant housing developments.

Starmer is relying on his energy secretary, Ed Miliband, to mitigate the emissions from his construction boom. The latest astonishing proposal is for western Europe's biggest solar panel farm in the Oxfordshire countryside. It will utterly destroy this area, passing through 15 villages. Britain needs renewable energy, but this should be planned according to national priorities, not allowed to deface the countryside wherever a landowner chooses.

And if we are to reduce our impact on the environment, we surely also need to be taking a closer look at properly valuing and zoning the rural landscape in general. Bulldozing all sense of town and country planning to appease a commercial lobby is a shocking abuse of the last shred of local democracy in Britain – the public's right to some say in the physical future of its communities.

Satellites spot methane leaks – but ‘super-emitters’ don’t fix them

Governments and companies almost never take action when satellites alert them about large methane leaks coming from oil and gas infrastructure

A methane plume at least 4.8 kilometres long billows into the atmosphere south of Tehran, Iran (NASA/JPL-Caltech)

The world has more ways than ever to spot the invisible methane emissions responsible for a third of global warming so far. But according to a report released at the COP29 climate summit, methane "super-emitters" rarely take action when alerted that they are leaking large amounts of the potent greenhouse gas.

"We're not seeing the transparency and the sense of urgency that we require," says Manfredi Caltagirone, director of the United Nations Environment Programme's International Methane Emissions Observatory, which recently launched a system that uses satellite data to alert methane emitters about leaks.

Methane is the second most important greenhouse gas to address, behind carbon dioxide, and a rising number of countries have promised to slash methane emissions in order to avoid near-term warming. At last year's COP28 climate summit, many of the world's largest oil and gas companies also pledged to "eliminate" methane emissions from their operations.

Today, a growing number of satellites are beginning to detect methane leaks from the biggest sources of such emissions: oil and gas infrastructure, coal mines, landfill and agriculture. That data is critical to holding emitters to account, says Mark Brownstein at the Environmental Defense Fund, an environmental advocacy group that recently launched its own methane-sensing satellite. "But data by itself doesn't solve the problem," he says.

The first year of the UN methane alert system illustrates the yawning gap between data and action. Over the past year, the programme issued 1225 alerts to governments and companies when it identified plumes of methane from oil and gas infrastructure large enough to be detected from space. It now reports that emitters took steps to control those leaks only 15 times, a response rate of roughly 1 per cent.

There are a number of possible reasons for this, says Caltagirone. Emitters might lack technical or financial resources, and some sources of methane can be difficult to cut off, although emissions from oil and gas infrastructure are widely seen as the easiest to deal with. "It's plumbing. It's not rocket science," he says.

Another explanation might be that emitters are still getting used to the new alert system. However, other methane monitors have reported a similar lack of response. "Our success rate is not much better," says Jean-Francois Gauthier at GHGSat, a Canadian company that has issued similar satellite alerts for years. "It's on the order of 2 or 3 per cent."

All 2974 methane super-emitter plumes that the Copernicus Sentinel-5P satellite detected in 2021 (ESA/SRON)

There have been some successes. For instance, the UN issued several alerts this year to the Algerian government about a methane source that had been leaking continuously since at least 1999, with a global warming effect equivalent to half a million cars driven for a year. By October, satellite data showed it had disappeared.

But the overall picture suggests monitoring isn't yet translating into emissions reductions. "Simply showing methane plumes is not enough to generate action," says Rob Jackson at Stanford University in California. A core problem he sees is that satellites rarely reveal who owns the leaky pipeline or the well emitting methane, making accountability difficult.

Methane is a major topic of discussion at the COP29 meeting, now under way in Baku, Azerbaijan. A summit this week on "non-CO2 greenhouse gases", convened by the US and China, saw countries announce several actions on methane emissions. They include a fee on methane in the US, which is aimed at oil and gas emitters – although many expect the incoming Trump administration to undo that rule.

A Federal Court Just Upended Decades of Environmental Regulation

The D.C. Circuit Court of Appeals ruled this week that federal agencies and courts have been misinterpreting a major environmental law for the last half-century, casting doubt on whether a key White House agency can actually write binding regulations on environmental policy.

In an unsigned 2-1 decision, a three-judge panel concluded that the Council on Environmental Quality, or CEQ, had been issuing binding regulations in error since the late 1970s. It held that the National Environmental Policy Act of 1969, also known as NEPA, did not grant rulemaking authority to the agency—which it had nonetheless wielded since the Carter administration.

The panel sounded almost surprised that it had to reach this conclusion in the first place. "The separation of powers and statutory interpretation issue that CEQ's regulations present is thus unremarkable," it noted in its ruling. "What is quite remarkable is that this issue has remained largely undetected and undecided for so many years in so many cases."

Tuesday's ruling in Marin Audubon Society v. Federal Aviation Administration is a complicated one, and it will likely come under intense scrutiny on appeal. Even the panel majority acknowledged that it is somewhat at odds with the Supreme Court's own rulings on the matter. (More on that later.) Nonetheless, its conclusions could have far-reaching implications for how the federal government writes new regulations—and how it considers environmental issues when doing so.

Congress enacted NEPA in 1969 after the environmentalist movement emerged as a major force in American politics. In its most basic form, NEPA created environmental guardrails for other federal agencies to follow when carrying out their duties. When those agencies wrote or rewrote federal regulations, for example, the law generally required them to assess the environmental impact of their changes along the way.

NEPA also established the Council on Environmental Quality, a White House council that coordinates with the rest of the executive branch on crafting environmental-impact assessments and weighing their impact on policymaking. CEQ is led by a three-member commission whose members are appointed by the president and confirmed by the Senate, giving it an unusually strong imprimatur among White House organs.

Since the Jimmy Carter administration, CEQ has also been issuing regulations of its own that bind other federal agencies. These are not merely advisory opinions that it renders on behalf of the White House. To the contrary, they are submitted through the notice-and-comment process like other federal regulations and are published in the Federal Register. Courts have often used them to resolve legal disputes over an agency's actions and answer questions about whether and when they comply with NEPA.

That framework was supposed to decide this case as well. A coalition of local environmental groups in the San Francisco Bay Area sued the FAA over its plans to allow tour flights over four national parks in the area. The FAA had argued that it did not need to conduct an environmental-impact assessment under NEPA when authorizing a formal plan for the flights.

In theory, the dispute turned on whether the FAA was correct to argue that it had a "categorical exemption" from NEPA, or if the environmental groups were correct to argue that the plan did not fall into the exemption based on CEQ's regulations. Neither the groups nor the agency itself had argued that the NEPA regulations at issue were invalid, or that CEQ lacked the authority it had claimed for itself under the law.

The problem, according to the panel, is that NEPA does not grant CEQ the power to issue those regulations in the first place. The misconception apparently stemmed from an executive order issued by Jimmy Carter in 1977. After the Justice Department had previously told the Supreme Court that CEQ's guidelines were non-binding, Carter's order stated that federal agencies had to "comply" with "regulations" issued by CEQ to the extent allowed by law, citing NEPA and other environmental laws.

After Carter's order, CEQ issued a wave of mandatory regulations for other agencies to follow, sweeping aside their existing practices and procedures for complying with NEPA. Succeeding generations of lawyers and judges assumed that CEQ had the power to issue those regulations, apparently without looking much further into Carter's executive order or the text of NEPA itself to locate the actual legal authority.

Complicating matters is that the Supreme Court itself has upheld these regulations from time to time, operating under the presumption that CEQ had the lawful authority to issue them. In one case, for example, the justices said outright that CEQ had been "established by NEPA with authority to issue regulations interpreting it." Judges in the lower federal courts are bound by Supreme Court precedent at all times.

This time, however, the panel majority concluded it was not bound by this description or other stray remarks in the high court's rulings over the years. "The statement appeared without any accompanying legal analysis," the panel noted, quoting from past rulings. "We must obey 'carefully considered language of the Supreme Court, even if technically dictum.' But we are not bound by every stray remark on an issue the parties [in those cases] neither raised nor discussed in any meaningful way." In other words, because the Supreme Court had never directly considered CEQ's authority, the panel was not bound by its assumption that it existed.

Judge Sri Srinivasan, who dissented in part from the court's ruling, criticized his colleagues for ruling on CEQ's authority even though no party had asked for it. He pointed to a doctrine known as the party-presentation principle, which generally holds that judges are only supposed to decide legal questions that are raised by litigants and briefed and argued by them. "Time and again, we have refrained from questioning the CEQ's authority to adopt binding NEPA regulations because the parties did not raise the challenge," he noted, pointing to past D.C. Circuit cases where the court had assumed CEQ's authority was valid.

Srinivasan also criticized the majority for adopting a remedy that the environmental groups had not sought. Because the FAA had relied on CEQ regulations when adopting its current plan for air tours over the parks, the panel vacated the current plan and ordered it to start anew. Srinivasan noted that the D.C. Circuit's usual practice is to avoid remedies that would paradoxically lead to lesser environmental protections than if the environmental groups hadn't challenged the action at all. "When confronted with similar circumstances, our court has repeatedly remanded to an agency without vacating a flawed but environmentally protective agency action," he noted.

Where this ruling goes from here is unclear. The FAA could, in theory, ask for the ruling to be reviewed by the entire D.C. Circuit, where Democratic appointees hold a majority. (The two judges who ruled against CEQ's authority were appointed by Republican presidents.) The environmental groups could also ask the entire D.C. Circuit to review the remedy itself. If the D.C. Circuit reverses the panel's ruling, neither party may be interested in taking it any further to the Supreme Court.

But the justices may already be aware of the issue with CEQ's rulemaking authority. Next month, the Supreme Court will hear oral arguments in Seven County Infrastructure Coalition v. Eagle County, Colorado, a NEPA case about when and how federal agencies must consider environmental impacts. None of the parties in that case questioned CEQ's rulemaking authority. But a group of conservative law professors filed a friend-of-the-court brief in September that argued it might be invalid.

"As a component of the White House, there is no doubt that CEQ has authority to promulgate rules of administration to guide agencies in their implementation of NEPA's procedural requirements," the law professors told the justices. "But there is no basis for those rules' being judicially enforceable, and the D.C. Circuit's enforcement of them was another source of reversible error." It would be stunning if the justices took the same blunt approach in that case as the D.C. Circuit did this week. But it would not be a surprise if at least one or more justices publicly called for a future case to be heard on the matter.

It is worth emphasizing here that overturning CEQ's regulatory power would not eliminate NEPA or its environmental-impact requirements for federal agencies. Instead, each federal agency would likely adopt its own practices and procedures for following the law, just as they did before Carter's executive order in 1977. That could lead to greater regulatory confusion if different agencies take different approaches to the law's requirements, especially in the short term. For a Supreme Court that has already taken major swings at the administrative state, throwing NEPA into chaos would be one of its most far-reaching moves yet.
