
Enormous glaciers on Snowball Earth helped life evolve

News Feed
Sunday, March 16, 2025

A new study says that glaciers covering our planet during a Snowball Earth period likely helped give life an evolutionary boost.


Around 700 million years ago, enormous ancient glaciers gave life on Earth an evolutionary boost.

Glaciers helped life evolve

For billions of years, the moving sheets of ice we call glaciers have helped form our landscapes. And new research suggests they might also have helped form us. On February 25, 2025, a team of researchers said that 700 million years ago, during one of our planet’s frigid Snowball Earth periods, enormous glaciers carved through our planet’s crust, drawing up minerals previously locked beneath the ice. And when the glaciers eventually melted, these minerals flooded Earth’s oceans, creating conditions that allowed life on our planet to surge in complexity.

The researchers published their study in the peer-reviewed journal Geology on February 25, 2025.

A hiker overlooks the Aletsch Glacier, the largest glacier in the Alps. Image via Robert J. Heath/ Wikimedia Commons (CC BY 2.0).


Glaciers: Nature’s bulldozers

Also known as rivers of ice, glaciers are dense ice bodies that move downhill under their own weight. They form when many layers of snowfall compress each other, squeezing out any pockets of air and forming dense ice crystals.

This process means that glacial ice is technically a form of metamorphic rock! But it flows downhill like slow-moving water, shaped by the land while also shaping the land itself. Glaciers have the power to bulldoze through forests with ease, and they carve out distinctive U-shaped valleys as they slide downhill.

Currently, glaciers cover about 10% of Earth’s surface and hold around 68% of our freshwater. That includes the great glacial ice sheets at the poles. But scientists think that around 700 million years ago, these ice sheets covered the entire globe. This was during one of the hypothesized Snowball Earth periods, when glaciers up to 1 km thick spread from pole to pole, likely due to a drop in atmospheric carbon dioxide.

The researchers compared the chemical makeup of sedimentary rocks dating to before and after this glacial period. And they concluded that these enormous glaciers must have scraped deep into Earth’s crust as they moved, picking up masses of sediment and minerals.

Artist’s concept of a Snowball Earth, when our planet’s surface was completely covered in ice and snow. Image via NASA/ University of Washington.

Unlocking the Earth’s potential

Within a few tens of millions of years, carbon dioxide levels increased to the point at which the greenhouse effect started to warm Earth out of its ice age. And as the glaciers melted, these minerals – which had previously been locked below masses of ice – were released.

Lead author Chris Kirkland of Curtin University explained:

When these giant ice sheets melted, they triggered enormous floods that flushed minerals and their chemicals, including uranium, into the oceans. This influx of elements changed ocean chemistry, at a time when more complex life was starting to evolve.

At this time, life on Earth was very simple, and mostly – if not entirely – confined to the oceans. And it seems that this cycling of key minerals into the oceans might have given life an evolutionary boost. Kirkland said:

We see different forms of life, more complex life, developing after this period of time.

This wasn’t the first time in Earth’s history that widespread glaciers melted into the oceans. So you might question: why did an evolutionary explosion occur after this ice age in particular? One reason could be that it coincided with the Neoproterozoic oxygenation event, which saw a huge increase in oxygen in Earth’s atmosphere and oceans. Scientists believe this increased availability of oxygen could have contributed to the sudden development of complex life in this period, perhaps in tandem with the influx of minerals into the ocean.

The mountain in the distance is Aoraki/ Mount Cook, the tallest peak in New Zealand. Long ago, the glacier running down the mountain carved out this now-verdant valley (the Hooker Valley). Image via Will Triggs.

An interconnected, ever-changing world

To Kirkland, the study’s findings are a testament to the interconnectedness of Earth’s natural processes. He said:

This study highlights how Earth’s land, oceans, atmosphere and climate are intimately connected; where even ancient glacial activity set off chemical chain reactions that reshaped the planet.

And this is especially important, Kirkland suggested, in light of our currently changing climate. He explained:

This research is a stark reminder that while Earth itself will endure, the conditions that make it habitable can change dramatically. These ancient climate shifts demonstrate that environmental changes, whether natural or human-driven, have profound and lasting impacts.

Bottom line: A new study says that glaciers covering our planet during a Snowball Earth period likely helped give life an evolutionary boost.

Source: The Neoproterozoic glacial broom

Via Curtin University

Read more: What drove Snowball Earth? A drop in a greenhouse gas

Read more: Salt glaciers on Mercury could harbor habitable niches

The post Enormous glaciers on Snowball Earth helped life evolve first appeared on EarthSky.


As Starlink and Other Satellites Proliferate, Astronomers Learn to Manage Interference

Swarms of satellites launched by SpaceX and other companies are disrupting astronomical observations. Here's how scientists are coping

In the next few months, from its perch atop a mountain in Chile, the Vera C. Rubin Observatory will begin surveying the cosmos with the largest camera ever built. Every three nights, it will produce a map of the entire southern sky filled with stars, galaxies, asteroids and supernovae — and swarms of bright satellites ruining some of the view.

Astronomers didn’t worry much about satellites photobombing Rubin’s images when they started drawing up plans for the observatory more than two decades ago. But as the space around Earth becomes increasingly congested, researchers are having to find fresh ways to cope — or else lose precious data from Rubin and hundreds of other observatories.

The number of working satellites has soared in the past five years to around 11,000, mostly because of constellations of orbiters that provide Internet connectivity around the globe. Just one company, SpaceX in Hawthorne, California, has more than 7,000 operational Starlink satellites, all launched since 2019; OneWeb, a space communications company in London, has more than 630 satellites in its constellation. On paper, tens to hundreds of thousands more are planned by a variety of companies and nations, although probably not all of these will be launched.

Satellites play a crucial part in connecting people, including bringing Internet to remote communities and emergency responders. But the rising numbers are a problem for scientists, because satellites interfere with ground-based astronomical observations: they create bright streaks on images and cause electromagnetic interference with radio telescopes. The satellite boom also poses other threats, including adding pollution to the atmosphere.

When the first Starlinks launched, some astronomers warned of existential threats to their discipline. Now, researchers in astronomy and other fields are working with satellite companies to help quantify and mitigate the impacts on science — and society. “There is growing interest in collaborating and finding solutions together,” says Giuliana Rotola, a space-policy researcher at the Sant’Anna School of Advanced Studies in Pisa, Italy.

Timing things right

The first step to reduce satellite interference is knowing when and where a satellite will pass above an observatory. “The aim is to minimize the surprise,” says Mike Peel, an astronomer at Imperial College London.

Before the launch of Starlinks, astronomers had no centralized reference for tracking satellites. Now, the International Astronomical Union (IAU) has a virtual Centre for the Protection of the Dark and Quiet Sky from Satellite Constellation Interference (CPS), which serves as an information hub and to which researchers, including Peel and Rotola, volunteer their time.

One of the centre’s tools, called SatChecker, draws on a public database of satellite orbits, fed by information from observers and companies that track objects in space. Astronomers can use SatChecker to confirm which satellite is passing overhead during their observations. The tool isn’t perfect; atmospheric drag and intentional manoeuvring can affect a satellite’s position, and the public database doesn’t always reflect the latest information.
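The basic calculation behind this kind of pass prediction can be reproduced with publicly available orbital elements. The sketch below is not the SatChecker implementation itself: it assumes the open-source Skyfield library, the public CelesTrak Starlink element feed (URL assumed here), and rough coordinates for the Rubin site, and it simply lists which satellites climb above 30 degrees during a one-hour observing window.

```python
# Minimal pass-prediction sketch (not the SatChecker code): find which
# Starlink satellites rise above 30 degrees altitude over an observatory
# during an observing window. Assumes the Skyfield library and the public
# CelesTrak TLE feed; TLE-based positions drift, so results are only as
# fresh as the orbital elements.
from skyfield.api import load, wgs84

# Approximate coordinates of the Rubin Observatory site (Cerro Pachón, Chile).
observatory = wgs84.latlon(-30.24, -70.74, elevation_m=2700)

ts = load.timescale()
satellites = load.tle_file(
    "https://celestrak.org/NORAD/elements/gp.php?GROUP=starlink&FORMAT=tle"
)

t0 = ts.utc(2025, 3, 16, 3)   # start of the observing window (UTC)
t1 = ts.utc(2025, 3, 16, 4)   # end of the window

for sat in satellites[:50]:   # first 50 objects, to keep the demo quick
    times, events = sat.find_events(observatory, t0, t1, altitude_degrees=30.0)
    for t, event in zip(times, events):
        if event == 0:        # 0 = rises above the altitude threshold
            print(f"{sat.name} rises above 30° at {t.utc_strftime('%H:%M:%S')} UTC")
```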
Those limitations matter in practice. The BlueWalker 3 satellite from telecommunications firm AST SpaceMobile in Midland, Texas, for instance, launched in 2022 and was sometimes brighter than most stars; yet the uncertainty in its position was at times so great that astronomers had difficulty predicting whether it would be in their field of view during night-time observations.

Starlink satellites leave streaks in a 2019 image taken by a 4-meter telescope at the Cerro Tololo Inter-American Observatory in Chile.

Tools such as SatChecker help telescope operators to avoid problems by allowing them to target a different part of the sky when a satellite passes overhead, or by simply pausing observations as it flies by. It would aid astronomers if SatChecker had even more accurate information about satellite positions, but there are constraints on improving the system. SatChecker data come from the US Space Force, which draws on a global network of sensors that tracks objects in orbit and issues updates on satellite locations as often as several times a day. The frequency of these updates is limited by factors such as how often a sensor can observe an object and whether the sensor can distinguish what it’s looking at.

Currently, satellite streaks are a relatively minor issue for telescope operators. But the problem will grow as satellite numbers continue to increase drastically, meaning more observation time will be lost, and this issue will be magnified for Rubin.

Fixing the streaks

Rubin, which cost US$810 million to build, is a unique case because it scans large swathes of the sky frequently — meaning it can detect rapidly changing phenomena such as incoming asteroids or cosmic explosions. Astronomers don’t want to be fooled by passing satellites, as happened in 2017 when researchers spotted what they thought was a γ-ray burst — a high-energy flash of light — from a distant galaxy, but which turned out to be sunlight reflecting off a piece of space junk.

Rubin’s powerful camera, coupled with its 8.4-metre telescope, will take about 1,000 nightly exposures of the sky, each covering about 45 times the area of the full Moon. That’s more wide-field pictures of the sky than any optical observatory has ever taken. Simulations suggest that if satellite numbers in low Earth orbit rise to around 40,000 over the 10 years of Rubin’s survey — a not-impossible forecast — then at least 10% of its images, and the majority of those taken during twilight, will contain a satellite trail.

SpaceX took early steps to try to mitigate the problem. Working with Rubin astronomers, the company tested changes to the design and positions of Starlinks to try to keep their brightness beneath a target threshold. Amazon, the retail and technology giant based in Seattle, Washington, is also testing mitigations on prototype satellites for its planned Kuiper constellation. Such changes reduce, but don’t eliminate, the problem.

To limit satellite interference, Rubin astronomers are creating observation schedules that help researchers avoid certain parts of the sky (for example, near the horizon) and certain times (such as around twilight). When they can’t avoid the satellites, Rubin researchers have incorporated steps into their data-processing pipeline to detect and remove satellite streaks; a toy illustration of streak detection appears just below. All these changes mean less time doing science and more time processing data, but they need to be done, astronomers say.
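The streak-removal step can be pictured with a toy example. The following sketch is not the Rubin pipeline’s actual algorithm; it assumes NumPy and scikit-image, builds a fake exposure containing one bright trail, and flags the trail with a Hough transform, a classic way of finding straight lines in an image.

```python
# Toy streak detection (not the Rubin pipeline): flag a straight, bright
# trail in a synthetic exposure using a Hough transform.
import numpy as np
from skimage.draw import line
from skimage.transform import hough_line, hough_line_peaks

# Build a fake 512x512 "exposure": sky background noise plus one bright trail.
rng = np.random.default_rng(42)
image = rng.normal(loc=100.0, scale=5.0, size=(512, 512))
rr, cc = line(40, 10, 480, 500)   # endpoints of the synthetic trail
image[rr, cc] += 200.0            # the trail is far brighter than the sky

# Threshold unusually bright pixels, then look for straight lines among them.
mask = image > image.mean() + 5 * image.std()
hspace, angles, dists = hough_line(mask)
_, peak_angles, peak_dists = hough_line_peaks(hspace, angles, dists)

for angle, dist in zip(peak_angles, peak_dists):
    # A real pipeline would mask pixels near this line before measuring stars.
    print(f"streak candidate: angle={np.degrees(angle):.1f}°, offset={dist:.1f} px")
```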
“We are really looking forward to getting data from Rubin and seeing how it turns out,” Peel says.

For other observatories, the IAU CPS is working on tools to help astronomers identify and correct satellite streaks in their data. One is a new database of crowdsourced observations of satellite brightnesses called SCORE, which is currently being beta tested and is planned for wider release in the coming months. This will help scientists to work backwards — they might see something puzzling in their past observations and be able to work out what it was, Peel says.

The database “is definitely a very valuable tool” because it’s one of the few that make their data freely available, says Marco Langbroek, a space-tracking specialist at Delft University of Technology in the Netherlands. As a beta tester, Langbroek has added a number of entries to SCORE, including measurements of a NASA solar sail that changes in brightness as it tumbles through space. Going forwards, he says, SCORE will be most useful if many astronomers contribute high-quality observations, building up a resource over time.

Tuning things out

Astronomers who work in the radio portion of the electromagnetic spectrum face extra challenges when it comes to satellites.

Big radio telescopes are typically located in remote regions, to be as far as possible from mobile-phone masts and other technological infrastructure that leak radio emissions. But satellites can’t be avoided. “If signals are coming from the sky, they’re always there,” says Federico Di Vruno, an astronomer at the Square Kilometre Array Observatory in Jodrell Bank, UK, and co-director of the IAU CPS.

When satellites transmit signals, the electromagnetic interference can overwhelm faint radio signals coming from the cosmos. One solution is to redirect or temporarily turn off satellite transmissions. The US National Radio Astronomy Observatory and SpaceX have been working on ways to accomplish this, and the company now momentarily redirects or disables transmissions when Starlinks pass above sensitive telescopes, including the Green Bank Telescope in West Virginia. The method requires voluntary buy-in by all partners, plus a lot of data sharing and intensive programming by the companies and the astronomers, but it does reduce interference. It has been successful enough that a small group of radio astronomers visited China last month to discuss the strategy with satellite operators and scientists there.

An image made from multiple exposures shows streaks from Starlink satellites, the International Space Station and other satellites over a site in Wales.

But as soon as one solution is found, fresh challenges appear. One is the rise of ‘direct-to-cell’ satellites, which function like mobile-phone towers in space and can transmit to areas on the ground that otherwise don’t have coverage. Optical astronomers worry about these because they are physically large and therefore bright, and they are a big problem for radio astronomers because direct-to-cell transmissions are extremely powerful. If one of those hits a radio observatory, “the telescope might be blind for a little bit”, Di Vruno says. So astronomers and satellite operators are discussing how they can share information about these satellites as well, to avoid each other when one passes over an observatory.

Another emerging challenge is ‘unintended’ emissions, which happen when satellites ‘leak’ radiation at wavelengths far outside the bands typically used for transmissions and other tasks.
Early tests for the Square Kilometre Array radio telescopes, which are under construction in Australia and South Africa, discovered such leakage coming from Starlinks and other satellites.

Many of these unintended emissions are at the low frequencies used in some studies, including those of the early Universe. So far, astronomers haven’t come up with a good solution, other than scheduling telescopes not to record data when a satellite passes through the part of the sky being observed. In the future, it is possible that authorities such as the International Telecommunication Union might be able to issue regulations on this, as it already does for other shared uses of the electromagnetic spectrum.

Cleaning up the atmosphere

Astronomers aren’t the only researchers concerned about the impacts of satellite constellations. In the past few years, a growing number of atmospheric scientists have been warning that these fleets will pollute Earth’s upper atmosphere during launches and then again when their orbits decay and they burn up. Researchers are just starting to get to grips with the scope of this pollution, says Connor Barker, an atmospheric chemist at University College London (UCL).

The point of satellite constellations is to have lots of satellites in orbit, but refreshing them when new technology comes along means that the pace of launches and re-entries will accelerate. In February alone, an average of four Starlink satellites a day re-entered the atmosphere and burned up.

Each re-entry adds chemicals to the upper atmosphere. In a 2023 study, researchers reported that measurements made during high-altitude aeroplane flights detected more than 20 chemical elements in Earth’s upper atmosphere that probably came from satellite re-entries, including aluminium, copper and lead. Other work has found that satellite constellations contributed around 40% of many types of carbon emission from the space industry in 2022, including black carbon particles and carbon dioxide, which could contribute to warming the atmosphere. It’s not yet clear how much this warms the planet or contributes to other environmental problems. Some early analyses suggest that satellite launches could contribute a small but measurable amount of ozone destruction.

There are no regulations on satellite atmospheric pollution. Barker and his colleagues at UCL say a good first step towards a solution is to get better estimates of the scope of the problem. They have been building an emissions inventory for rocket launches and satellite re-entries, carefully tallying up the contaminants involved and estimating the altitudes at which they enter the atmosphere. “Even though this is currently a relatively small industry that’s having a relatively small impact on the atmosphere, we should still be aware of it,” says Eloise Marais, an atmospheric chemist at UCL.

Researchers are trying to raise the profile of these and other concerns linked to satellite fleets. Some of these issues were discussed in February in Vienna, at a meeting of the United Nations Committee on the Peaceful Uses of Outer Space. It was the first time the committee had formally discussed the impacts of satellite constellations on astronomy.

No major actions were taken, as expected for these early discussions. But “now all of the member states know of dark and quiet skies”, Di Vruno says. That in itself, he says, is a success.

This article is reproduced with permission and was first published on March 18, 2025.

Urban Wildfire Smoke Sensors Miss Harmful Chemicals

As fires burned in Los Angeles this year, newer toxin monitors found contaminants that aren’t measured by standard methods. Now scientists and officials are pushing for better detection

When the catastrophic Los Angeles fires broke out in January of 2025, John Volckens suspected firefighters and residents were breathing toxic air from the burning homes, buildings, and cars, but it was unclear how much risk the public faced. So the professor of environmental health at Colorado State University devised a plan to get answers.

Volckens shipped 10 air pollution detectors to Los Angeles to measure the amounts of heavy metals, benzene, and other chemicals released by the flames, which burned more than 16,000 homes, businesses, and other structures, making it one of the country’s costliest natural disasters.

“These disaster events keep happening. They release pollution into the environment and to the surrounding community,” said Volckens, who shared his results with local air regulators. “We have this kind of traumatic experience, and then we’re left with: Well, what did we just breathe in?”

Scientists and public health officials have long tracked the pollutants that cause smog, acid rain, and other environmental health hazards and shared them with the public through the local Air Quality Index. But that monitoring system misses hundreds of harmful chemicals released in urban fires, and the Los Angeles fires have led to a renewed push for state and federal regulators to do more as climate change drives up the frequency of these natural disasters.

It’s questionable whether the Trump administration will act, however. Earlier this month, Environmental Protection Agency Administrator Lee Zeldin announced what he described as the “biggest deregulatory” action in history, which critics warn will lead to a rollback of environmental health regulations.

While Air Quality Index values are a good starting place for knowing what’s in the air, they don’t provide a full picture of pollutants, especially during disasters, said Yifang Zhu, a professor of environmental health sciences at UCLA. In fact, the AQI could be in a healthy range, “but you could still be exposed to higher air toxins from the fires,” she added. (A minimal sketch of how a PM2.5 concentration maps to an AQI value appears at the end of this article.)

Heavy smoke from the Eaton fire in Los Angeles. Josh Edelson/AFP via Getty Images

In February, nearly a dozen lawmakers from California called on the EPA to create a task force of local and federal authorities to better monitor what’s in the air and inform the public. Locals are “unsure of the actual risks they face and confused by conflicting reports about how safe it is to breathe the air outside, which may lead to families not taking adequate protective measures,” the lawmakers wrote in a letter to James Payne, who was then the acting EPA administrator. The EPA press office declined to comment in an e-mail to KFF Health News.

Lawmakers have also introduced bills in Congress and in the California legislature to address the gap. A measure by U.S. Rep. Mike Thompson (D-Calif.) and U.S. Sen. Jeff Merkley (D-Ore.) would direct the EPA to allocate grant money to local air pollution agencies to communicate the risks of wildfire smoke, including deploying air monitors.
Meanwhile, a bill by Democratic state Assembly member Lisa Calderon would create a “Wildfire Smoke Research and Education Fund” to study the health impacts of wildfire smoke, especially on firefighters and residents affected by fires.

The South Coast Air Quality Management District, a regional air pollution control agency, operates about 35 air monitoring stations across nearly 11,000 square miles of the Los Angeles region to measure pollutants like ozone and carbon monoxide.

During the fires, the agency, which is responsible for the air quality of 16.8 million residents, relied on its network of stations to monitor five common pollutants, including PM2.5, the fine particles that make up smoke and can travel deep inside the body. After the fires, the South Coast AQMD deployed two mobile monitoring vans to assess air quality in cleanup areas and expanded neighborhood-level monitoring during debris removal, said Jason Low, head of the agency’s monitoring and analysis division.

Local officials also received the data collected by Volckens’ devices, which arrived on-site four days after the fires broke out. The monitors — about the size of a television remote control and housed in a plastic cover the size of a bread loaf — were placed at air monitoring stations around the fires’ perimeters, as well as at other sites, including in West Los Angeles and Santa Clarita. The devices, called AirPens, monitored dozens of air contaminants in real time and collected precise chemical measurements of smoke composition.

Researchers replaced the sensors every week, sending the filters to a lab that analyzed them for volatile organic compounds like benzene, along with lead, black carbon, and other carcinogens. Volckens’ devices provided public health officials with data for a month as cleanup started. The hope is that the information can help guide future health policies in fire-prone areas.

“There’s not one device that can measure everything in real time,” Low said. “So, we have to rely on different tools for each different type of purpose of monitoring.”

ASCENT, a national monitoring network funded by the National Science Foundation, registered big changes after the fires. One monitor, about 11 miles south of the Eaton fire in the foothills of the San Gabriel Mountains, detected 40 times the normal amount of chlorine in the air and 110 times the typical amount of lead in the days following the fires. It was clear the chemical spikes came from urban wildfire smoke, which is more dangerous than what would be emitted when trees and bushes burn in rural areas, said Richard Flagan, the co-principal investigator at the network’s site in Los Angeles.

“Ultimately, the purpose is to get the data out there in real time, both for the public to see but also for people who are doing other aspects of research,” said Flagan, adding that chemical measurements are critical for epidemiologists who are developing health statistics or doing long-term studies of the impact of air pollution on people’s health.

Small, low-cost sensors could fill in gaps as government networks age or fail to adequately capture the full picture of what’s in the air. Such sensors can identify pollution hot spots and improve wildfire smoke warnings, according to a March 2024 U.S. Government Accountability Office report.

Although the devices have become smaller and more accurate in the past decade, some pollutants require analysis with X-ray scans and other costly high-level equipment, said J. Alfredo Gómez, director of the Natural Resources and Environment team at the GAO.
Gómez cautioned that the quality of the data can vary depending on what the devices monitor. “Low-cost sensors do a good job of measuring PM2.5 but not such a good job for some of these other air toxins, where they still need to do more work,” Gómez said.

UCLA’s Zhu said the emerging technology of portable pollution monitors means residents — not just government and scientists — might be able to install equipment in their backyards and broaden the picture of what’s happening in the air at the most local level.

“If the fires are predicted to be worse in the future, it might be a worthwhile investment to have some ability to capture specific types of pollutants that are not routinely measured by government stations,” Zhu said.

KFF Health News, formerly known as Kaiser Health News (KHN), is a national newsroom that produces in-depth journalism about health issues and is one of the core operating programs at KFF — the independent source for health policy research, polling, and journalism.
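To make the AQI point above concrete, here is a minimal sketch of how a 24-hour PM2.5 concentration is converted to an index value by linear interpolation between breakpoints. The pre-2024 EPA breakpoint table is assumed purely for illustration; EPA revised the values in 2024. Because the index is computed only from criteria pollutants such as PM2.5, it carries no information about lead, benzene, chlorine or other toxics that urban-fire smoke can contain.

```python
# Illustrative AQI calculation for PM2.5, using the pre-2024 EPA
# breakpoints (an assumption for this sketch). The point: the index
# interpolates one pollutant's concentration, so it says nothing about
# toxics it was never built from.
import math

PM25_BREAKPOINTS = [
    # (C_lo, C_hi in µg/m³, I_lo, I_hi in index points)
    (0.0, 12.0, 0, 50),        # Good
    (12.1, 35.4, 51, 100),     # Moderate
    (35.5, 55.4, 101, 150),    # Unhealthy for Sensitive Groups
    (55.5, 150.4, 151, 200),   # Unhealthy
    (150.5, 250.4, 201, 300),  # Very Unhealthy
    (250.5, 350.4, 301, 400),  # Hazardous
    (350.5, 500.4, 401, 500),  # Hazardous
]

def pm25_aqi(concentration_ug_m3: float) -> int:
    """Convert a 24-hour average PM2.5 concentration to an AQI value."""
    c = math.floor(concentration_ug_m3 * 10) / 10  # truncate to 0.1 µg/m³
    for c_lo, c_hi, i_lo, i_hi in PM25_BREAKPOINTS:
        if c_lo <= c <= c_hi:
            return round((i_hi - i_lo) / (c_hi - c_lo) * (c - c_lo) + i_lo)
    raise ValueError("concentration outside the AQI scale")

if __name__ == "__main__":
    # 10 µg/m³ of PM2.5 reads as "Good" (AQI ≈ 42) even if the same air
    # also carries elevated lead or benzene, which the index ignores.
    print(pm25_aqi(10.0))
```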

New Mexico set to become third state to implement full PFAS product ban

New Mexico is poised to become the third state to institute a full-fledged ban on products that contain toxic "forever chemicals," as two key bills head to the governor's desk. The concurrent pieces of legislation, which have both passed through the state Legislature, would prohibit most items that contain these compounds, while also deeming specific...

New Mexico is poised to become the third state to institute a full-fledged ban on products that contain toxic "forever chemicals," as two key bills head to the governor's desk. The concurrent pieces of legislation, which have both passed through the state Legislature, would prohibit most items that contain these compounds, while also deeming specific types of discarded firefighting foam as hazardous waste.

The chemicals in question, called per- and polyfluoroalkyl substances (PFAS), are notorious for their propensity to linger in the human body and in the environment. Linked to various types of cancers and other serious illnesses, PFAS are found in numerous household products, as well as in certain kinds of firefighting foam.

The first bill, H.B. 212, would ban products that contain intentionally added PFAS and would authorize the state's Environmental Improvement Board to adopt relevant rules. Starting on Jan. 1, 2027, manufacturers would not be able to sell or distribute cookware, food packaging, dental floss and juvenile products that have intentionally added PFAS. The same would apply to carpets, cleaning fluids, cosmetics, fabric treatments, menstrual products, textiles, ski wax and upholstered furniture on Jan. 1, 2028. With respect to pesticides, fertilizers and other agricultural materials — many of which contain PFAS — the Environmental Improvement Board would consult with the New Mexico Department of Agriculture before setting rules on these subjects.

There would be some exceptions to the prospective ban, such as those items that the board has determined "to be essential for health, safety or the functioning of society and for which alternatives are not reasonably available."

The accompanying piece of legislation, H.B. 140, focuses on a range of chemicals, including PFAS, and provides new clarity on the meaning of the term "hazardous waste." The bill would redefine hazardous waste as "any solid waste or combination of solid wastes" that because of its amount, concentration or characteristics could "cause or significantly contribute to an increase in mortality or an increase in serious irreversible or incapacitating reversible illness." Among the specific hazardous waste items mentioned is "discarded aqueous film-forming foam containing intentionally added [PFAS]."

If Gov. Michelle Lujan Grisham (D) signs the two bills as anticipated, New Mexico would become the third state to implement a near-complete prohibition on PFAS-containing products. The state would follow in the footsteps of Maine and Minnesota, which passed their respective laws in 2021 and 2023. Numerous other states have approved legislation forbidding PFAS in certain product categories, rather than instituting bans across the board.

While the three-year price of preventing PFAS pollution via such a ban would climb to about $2.8 million, the cost of removing and destroying just one pound of PFAS from water could be up to $18 million, according to an analysis of the legislation from the New Mexico Environment Department. "With approximately 1,100 public drinking water systems in New Mexico serving 94 percent of our residents, preventing contamination is the only affordable means of securing our drinking water supply," the analysis concluded.

Communities are rebuilding after L.A. fires despite lack of soil testing for toxic substances

Rebuilding in Altadena and Pacific Palisades has begun, despite the lack of official requirements to test soil for heavy metals and other toxic substances.

In Altadena and the Pacific Palisades neighborhood of L.A., reconstruction has begun even though the soil on affected properties has not been tested for toxic substances. The Federal Emergency Management Agency’s controversial decision to forgo soil testing in communities burned in the Eaton and Palisades wildfires sparked pushback Wednesday as California lawmakers questioned whether the practice will prevent residents from knowing if there are toxic substances on the land before rebuilding begins.

Federally hired cleanup crews have been removing ash and debris, in addition to a 6-inch layer of topsoil, from buildings burned by the wildfires. But, asked last month by The Times, FEMA and the U.S. Army Corps of Engineers confirmed they won’t test the soil at these properties after they finish their cleanup, breaking with a long-standing practice intended to ensure that homes and schools don’t still contain excessive levels of harmful chemicals after environmental disasters such as a wildfire.

Led by U.S. Rep. Laura Friedman (D-Glendale), a contingent of eight federal lawmakers from California objected to FEMA’s decision to forgo soil testing in a letter to Cameron Hamilton, the agency’s acting administrator. The lawmakers pressed Hamilton to explain the change in strategy. One key question was how FEMA could ensure that removing 6 inches of soil would be sufficient to rid properties of toxic substances.

“The residents of greater Los Angeles should be informed of any potential toxins in the soil as they navigate the complicated recovery process,” the letter reads. “Wildfire survivors deserve to return to safe, toxin-free properties.”

The Eaton and Palisades wildfires — among the most destructive in California history — damaged or destroyed more than 13,500 properties across Los Angeles County. The resulting public health risks are too great to skimp on environmental testing, Friedman said.

“FEMA’s refusal to test for toxins in the soil after wildfire cleanup in Los Angeles County is unacceptable,” Friedman said in a statement. “Families deserve to know their homes are safe and free of dangerous chemicals. This is a break from decades of FEMA precedent — and it risks exposing entire communities to long-term health threats.”

The letter comes as rebuilding efforts are swiftly moving forward. So far, federal cleanup crews have cleared ash and rubble from more than 860 properties, according to the U.S. Army Corps of Engineers. About 200 rebuilding permits have been filed with local agencies — and a few have already been approved, although it is unclear how many at this point.

Los Angeles city and county officials say they won’t require soil testing before issuing most rebuilding permits. Without soil testing, many residents worry that new buildings could be built on contaminated land, increasing the likelihood that residents and workers may be exposed to toxic chemicals by inhaling airborne dust. Environmental and health officials have warned that wildfire ash from burned buildings can contain hazardous substances, including cancer-causing arsenic and brain-damaging lead. Experts warn that the pace of rebuilding shouldn’t outpace necessary safety precautions.

“The nation is captivated by how and when L.A. will rebound,” said Mohamed Sharif, co-chair of the local chapter of the American Institute of Architects’ wildfire disaster response task force. “We know fire is not the only source of catastrophe and disaster in California.
We have a multiplex of things, whether it’s seismic events or landslides or rain events. But fire has really illuminated just how fragile we are as a society.”

Soil testing in the aftermath of previous wildfires found that a significant portion of properties still had excessive levels of heavy metals even after cleanup crews removed a 3-to-6-inch layer of topsoil. In those cases — such as the 2018 Camp fire in Northern California and the Woolsey fire near Malibu in the same year — cleanup crews returned to properties where contaminants exceeded California’s standards to remove another layer of soil, and additional soil testing was conducted.

But now FEMA officials insist that excavating 6 inches of soil from properties is enough to remove fire-related contamination. Anything deeper, they argue, is likely to be preexisting contamination, which is beyond the agency’s purview. FEMA encouraged state and local officials to pay for soil testing if they believe it’s necessary.

So far, no state or local plans for soil testing have been unveiled.

“You’re going to have to show me definitive testing that shows that material below 6 inches is attributed to the fire or debris caused by the fire,” FEMA Region 9 administrator Robert Fenton told The Times in a recent interview. “I have not found that yet.”

But FEMA’s decision to skip soil sampling has left many homeowners unsure about what’s next. Abigail Greydanus, her husband and their 1-year-old son evacuated their Altadena home shortly after the Eaton fire broke out. When a neighbor returned to check on their home, the property was unrecognizable.

“It was a pile of smoldering ashes,” Greydanus said. “You could still see the shell of the oven, the weight rack my husband had in the garage. But everything else was just melted or destroyed.”

The couple signed up for the Army Corps debris removal program. But even after crews cleared rubble and debris from their property, they are wary of rebuilding without confirming whether lingering pollutants may still be in the soil.

“No one wants to go back to a home if it’s going to be unsafe, if their children will be [exposed to] lead from playing in the backyard,” Greydanus said.

In lieu of government-led soil testing, homeowners and school districts may have to pay for soil sampling if they want answers. Some research institutions are stepping into the breach, including USC, which is providing free lead testing, and a coalition of researchers from UCLA, Loyola Marymount and Purdue universities, who are offering a full panel of soil tests for those in affected areas. Meanwhile, some school officials in these areas are already hiring companies — and paying out of pocket — to test for toxic chemicals.

Three Los Angeles Unified School District schools were damaged or destroyed in the Palisades fire: Marquez Charter Elementary, Palisades Charter Elementary and Palisades Charter High School. The Army Corps of Engineers oversaw the cleanup of these campuses earlier this month.

An LAUSD spokesperson said the school district “will conduct a full environmental assessment throughout the entire campus — including soil sampling of existing landscaping as well as areas to be uncovered that will be a part of the buildout of the interim campus.” The district hired environmental consultants to assess the soil at the elementary schools. Because Palisades Charter High School is an independent charter school, LAUSD referred requests for comment to its administration; a representative for the high school did not respond to a request for comment.
Pasadena Unified School District also saw extensive fire damage at several of its campuses, including public and charter schools: Franklin Elementary, Eliot Arts Magnet Middle School, Odyssey Charter School, Pasadena Rosebud Academy, Oak Knoll Montessori School and Aveson School of Leaders. School district officials would not confirm whether the district would perform soil testing on its properties.

“Pasadena Unified is actively working across all levels of government to further examine whether there are any remaining risks,” a spokesperson said. “Discussions are ongoing. Our commitment is to keep our school community safe and informed throughout this entire process.”

Under state law, the California Department of Toxic Substances Control is required to oversee soil sampling at newly constructed schools or campus expansions to ensure they comply with state standards. But when asked how it would approach rebuilding schools in Altadena and Pacific Palisades, the state agency was noncommittal.

“Sampling plans are required by law in limited circumstances, like when new property is purchased to build a school with state funds,” a DTSC representative told The Times. “For schools in the Altadena and Pacific Palisades communities, DTSC will provide technical assistance to school districts by request, which includes helping them prepare sampling plans and reviewing results of the samples that they collect.”

The agency would not say whether testing would be required before schools began to rebuild.

Meanwhile, even if government regulators don’t get involved, property owners may find it difficult to hire contractors to rebuild.

“Any professional geotechnical engineer will not go to test for the foundation strength unless they know that site is free of toxins,” said Sharif, of the American Institute of Architects.

Rebuilding is complex, he noted, involving many economic, environmental and safety considerations. It’s unwise to leave the decision to thousands of individual property owners. After all, contamination on one property can affect neighboring homes.

“I shudder to think what owners of the lots next door to a hypothetical owner aren’t doing,” Sharif said. “This is to say that while the majority of the damage is on private land, it’s insane to entrust private citizens with public health.”

Mystery solved: our tests reveal the tiny algae killing fish and harming surfers on SA beaches

A harmful algal bloom of Karenia mikimotoi made dozens of surfers sick and killed seadragons, fish and octopuses on two South Australian beaches.

Confronting images of dead seadragons, fish and octopuses washed up on South Australian beaches – and disturbing reports of “more than 100” surfers and beachgoers suffering flu-like symptoms after swimming or merely breathing in sea spray – attracted international concern last week. Speculation about the likely cause ranged from pollution and algae to unusual bacterial infections or viruses.

Today we can reveal the culprit was a tiny – but harmful – type of planktonic algae called Karenia mikimotoi.

The SA government sent us water samples from Waitpinga Beach, Petrel Cove Beach, Encounter Bay Boat Ramp and Parsons Headland on Tuesday. We studied the water under the microscope and extracted DNA for genetic analysis. Our results revealed high numbers of the tiny harmful algal species – each just 20 microns in diameter (where one micron is one thousandth of a millimetre).

While relatively common in Australian coastal waters, blooms of K. mikimotoi occur only sporadically. But similar harmful algal blooms and fish kills due to K. mikimotoi have happened in the past, such as the 2014 bloom in Coffin Bay, SA. And this latest one won’t be the last.

Sick surfers and dead marine life from strange sea foam (ABC News)

Harmful algal blooms

Single-celled, microbial algae occur naturally in seawater all over the world. They are also called phytoplankton, because they float in the water column and photosynthesise like plants. “Phyto” comes from the Greek word for plant and “plankton” comes from the Greek word for wanderer, which relates to their floating movement with ocean currents and tides.

Like plants on land, the microalgae or phytoplankton in the ocean capture sunlight and produce up to half the oxygen in our atmosphere. There are more than 100,000 different species of microalgae. Every litre of seawater will normally contain a mixed group of these different microalgae species.

But under certain conditions, just a single species of microalgae can accumulate in one area and dominate over the others. If we are unlucky, the dominant species may be one that produces a toxin or has a harmful effect. This so-called “harmful algal bloom” can cause problems for people and for marine life such as fish, invertebrates such as crabs, and even marine mammals such as whales and seals.

There are hundreds of different species of harmful algae. Each produces its own type of toxin with a particular toxic effect. Most of these toxic chemical compounds produced by harmful algae are quite well known, including neurotoxins that affect the brain. But others are more complicated, and the mechanisms of toxicity are poorly understood. This can make it more difficult to understand the factors leading to the deaths of fish and other marine life. Unfortunately, the toxins from K. mikimotoi fall into this latter category.

Introducing Karenia mikimotoi

Karenia mikimotoi under the microscope. Shauna Murray

The species responsible for the recent events on SA beaches, K. mikimotoi, causes harmful algal blooms in Asia, Europe, South Africa and South America, as well as Australia and New Zealand. These blooms all caused fish deaths, and some also caused breathing difficulties among local beachgoers. The most drastic of these K. mikimotoi blooms have occurred in China over the past two decades. In 2012, more than 300 square kilometres of abalone farms were affected, causing about A$525 million in lost production.
Explaining the toxic effects

Microalgae can damage the gills of fish and shellfish, preventing them from breathing. This is the main cause of death. But some studies have also found damage to the gastrointestinal tracts and livers of fish.

Tests using fish gill cells clearly show the dramatic toxic effect of K. mikimotoi. When fish gill cells were exposed to intact K. mikimotoi cells, more than 80% of the fish cells had died after 3.5 hours. Fortunately, the toxin does not persist in the environment after the K. mikimotoi cells are dead. So once the bloom is over, the marine environment can recover relatively quickly.

Its toxicity is partly due to the algae’s production of “reactive oxygen species”, reactive forms of oxygen molecules which can cause cell death in high doses. K. mikimotoi cells may also produce lipid (fat) molecules that cause some toxic effects. Finally, a very dense bloom of microalgae can sometimes reduce the amount of dissolved oxygen in the water column, which means there is less oxygen for other marine life.

The human health effects are not very well known but probably relate to the reactive oxygen species being an irritant. K. mikimotoi cells can also produce “mucilage”, a type of thick, gluey substance made of complex sugars, which can accumulate bacteria inside it. This can cause “sea foam”, which was evident on beaches last week.

South Australia’s marine emblem, the leafy seadragon, washed up dead on the beach. Anthony Rowland

Unanswered questions remain

A question for many people is whether increasing water temperatures make blooms of K. mikimotoi more likely. Another concern is whether nutrient runoff from farms, cities and aquaculture could cause more harmful algal blooms. Unfortunately, for Australia at least, the answer to these questions is we don’t know yet. While we know some harmful algal blooms do increase when nutrient runoff is higher, others actually prefer fewer nutrients or colder temperatures. We do know warmer water species seem to be moving further south along the Australian coastline, changing phytoplankton species abundance and distribution.

While some microalgal blooms can cause bioluminescence that is beautiful to watch, others such as K. mikimotoi can cause skin and respiratory irritations. If you notice discoloured water, fish deaths or excessive sea foam along the coast or in an estuary, avoid fishing or swimming in the area and notify local primary industry or environmental authorities in your state.

Shauna Murray receives funding from the Fisheries Research and Development Corporation, the New South Wales Recreational Fisheries Trust, the Australian Centre for International Agricultural Research, and the Storm and Flood Industry Recovery Program. She is President of the Australasian Society of Phycology and Aquatic Botany and past chair of the NSW Shellfish Committee.

Greta Gaiani does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
