
GoGreenNation News

Learn more about the issues presented in our films

Predicting Chaos With AI: The New Frontier in Autonomous Control


Advanced machine learning algorithms have shown potential in efficiently controlling complex systems, promising significant improvements in autonomous technology and digital infrastructure.
Recent research highlights the development of advanced machine learning algorithms capable of controlling complex systems efficiently. These new algorithms, tested on digital twins of chaotic electronic circuits, not only predict and control these systems effectively but also offer significant improvements in power consumption and computational demands.
Systems controlled by next-generation computing algorithms could give rise to better and more efficient machine learning products, a new study suggests.
Using machine learning tools to create a digital twin, or a virtual copy, of an electronic circuit that exhibits chaotic behavior, researchers found that they were successful at predicting how it would behave and using that information to control it.
The Limitations of Linear Controllers
Many everyday devices, like thermostats and cruise control, utilize linear controllers, which use simple rules to direct a system to a desired value. Thermostats, for example, employ such rules to determine how much to heat or cool a space based on the difference between the current and desired temperatures.
Yet because of how straightforward these algorithms are, they struggle to control systems that display complex behavior, like chaos.
As a result, advanced devices like self-driving cars and aircraft often rely on machine learning-based controllers, which use intricate networks to learn the optimal control algorithm needed to best operate. However, these algorithms have significant drawbacks, the most demanding of which is that they can be extremely challenging and computationally expensive to implement.
The Impact of Efficient Digital Twins
Now, having access to an efficient digital twin is likely to have a sweeping impact on how scientists develop future autonomous technologies, said Robert Kent, lead author of the study and a graduate student in physics at The Ohio State University.
“The problem with most machine learning-based controllers is that they use a lot of energy or power and they take a long time to evaluate,” said Kent. “Developing traditional controllers for them has also been difficult because chaotic systems are extremely sensitive to small changes.”
These issues, he said, are critical in situations where milliseconds can make a difference between life and death, such as when self-driving vehicles must decide to brake to prevent an accident.
The study was published recently in Nature Communications.
Advancements in Machine Learning Architecture
Compact enough to fit on an inexpensive computer chip capable of balancing on your fingertip and able to run without an internet connection, the team’s digital twin was built to optimize a controller’s efficiency and performance, which researchers found resulted in a reduction of power consumption. It achieves this quite easily, mainly because it was trained using a type of machine learning approach called reservoir computing.
“The great thing about the machine learning architecture we used is that it’s very good at learning the behavior of systems that evolve in time,” Kent said. “It’s inspired by how connections spark in the human brain.”
Practical Applications and Future Directions
Although similarly sized computer chips have been used in devices like smart fridges, according to the study, this novel computing ability makes the new model especially well-equipped to handle dynamic systems such as self-driving vehicles as well as heart monitors, which must be able to quickly adapt to a patient’s heartbeat.
“Big machine learning models have to consume lots of power to crunch data and come out with the right parameters, whereas our model and training is so extremely simple that you could have systems learning on the fly,” he said.
To test this theory, researchers directed their model to complete complex control tasks and compared its results to those from previous control techniques. The study revealed that their approach achieved a higher accuracy at the tasks than its linear counterpart and is significantly less computationally complex than a previous machine learning-based controller.
“The increase in accuracy was pretty significant in some cases,” said Kent. Though the outcome showed that their algorithm does require more energy than a linear controller to operate, this tradeoff means that when it is powered up, the team’s model lasts longer and is considerably more efficient than current machine learning-based controllers on the market.
“People will find good use out of it just based on how efficient it is,” Kent said. “You can implement it on pretty much any platform and it’s very simple to understand.” The algorithm was recently made available to scientists.
Economic and Environmental Considerations
Outside of inspiring potential advances in engineering, there’s also an equally important economic and environmental incentive for creating more power-friendly algorithms, said Kent.
As society becomes more dependent on computers and AI for nearly all aspects of daily life, demand for data centers is soaring, leading many experts to worry over digital systems’ enormous power appetite and what future industries will need to do to keep up with it.
And because building these data centers as well as large-scale computing experiments can generate a large carbon footprint, scientists are looking for ways to curb carbon emissions from this technology.
To advance their results, future work will likely be steered toward training the model to explore other applications like quantum information processing, Kent said. In the meantime, he expects that these new elements will reach far into the scientific community.
“Not enough people know about these types of algorithms in the industry and engineering, and one of the big goals of this project is to get more people to learn about them,” said Kent. “This work is a great first step toward reaching that potential.”
Reference: “Controlling chaos using edge computing hardware” by Robert M. Kent, Wendson A. S. Barbosa and Daniel J. Gauthier, 8 May 2024, Nature Communications. DOI: 10.1038/s41467-024-48133-3
This study was supported by the U.S. Air Force’s Office of Scientific Research. Other Ohio State co-authors include Wendson A.S. Barbosa and Daniel J. Gauthier.
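The study itself trained a reservoir-computing digital twin of a chaotic electronic circuit on edge hardware; the details are in the Nature Communications paper. As a rough illustration of the general idea only, the sketch below trains a tiny echo state network, a common form of reservoir computing, to predict a chaotic Lorenz trajectory one step ahead. Every parameter here (reservoir size, leak rate, ridge penalty) is an arbitrary choice for the demonstration, not a value from the study.

```python
# A minimal echo state network (reservoir computer) that learns to predict a
# chaotic trajectory one step ahead. Purely illustrative: this is not the
# paper's edge-hardware controller, and all parameters are arbitrary choices.
import numpy as np

rng = np.random.default_rng(0)

def lorenz(n_steps, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Generate a chaotic Lorenz trajectory with simple Euler integration."""
    xyz = np.empty((n_steps, 3))
    x, y, z = 1.0, 1.0, 1.0
    for i in range(n_steps):
        x, y, z = (x + dt * sigma * (y - x),
                   y + dt * (x * (rho - z) - y),
                   z + dt * (x * y - beta * z))
        xyz[i] = (x, y, z)
    return xyz

data = lorenz(6000)
data = (data - data.mean(axis=0)) / data.std(axis=0)   # normalize each coordinate
train_in, train_out = data[:4000], data[1:4001]        # one-step-ahead targets
test_in, test_out = data[4000:5999], data[4001:6000]

# Fixed random reservoir: only the linear readout below is ever trained.
n_res, leak = 300, 0.3
W_in = rng.uniform(-0.5, 0.5, (n_res, 3))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))        # keep spectral radius below 1

def run_reservoir(inputs):
    """Drive the reservoir with an input sequence and record its states."""
    states, s = np.zeros((len(inputs), n_res)), np.zeros(n_res)
    for t, u in enumerate(inputs):
        s = (1 - leak) * s + leak * np.tanh(W_in @ u + W @ s)
        states[t] = s
    return states

washout = 100                                           # discard the initial transient
S = run_reservoir(train_in)[washout:]
targets = train_out[washout:]

# Ridge-regression readout: a single linear solve, hence the low training cost.
ridge = 1e-6
W_out = np.linalg.solve(S.T @ S + ridge * np.eye(n_res), S.T @ targets).T

pred = run_reservoir(test_in)[washout:] @ W_out.T
rmse = np.sqrt(np.mean((pred - test_out[washout:]) ** 2))
print(f"one-step prediction RMSE on held-out data: {rmse:.3f}")
```

The point the article emphasizes carries over to the sketch: the recurrent network is fixed and random, and only a linear readout is fit, which is part of why reservoir-based models can be cheap enough to train and run on a small chip.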

The pivotal role of a tiny hydropower plant in preserving the Colorado River's future


A sprawling water district that serves residents, ranchers and recreators on the Western Slope of the Rocky Mountains is preparing to invest a mammoth $98.5 million on a tiny hydropower plant in a bipartisan, multi-sectorial effort to help secure the Colorado River's future.
In the coming weeks, Colorado Gov. Jared Polis (D) is expected to sign into law a comprehensive water conservation bill that would include a $20 million state contribution through the Colorado Water Conservation Board to the overall purchase of the water rights associated with the Shoshone Generating Station. This sum would cover a significant share of the total purchase and sale agreement signed in December by the Colorado River Water Conservation District, which represents 15 counties on the Western Slope, with the Public Service Company of Colorado, a subsidiary of Xcel Energy.
The deal has earned vast support from voices that would not usually come together as a united front: the farmers, whitewater rafters and environmental advocates who may have diverse motivations, but whose efforts could safeguard the river for its 40 million users downstream.
"Every interest group out there in western Colorado who cares about the greater river — they see great advantages to preserving the flows on the river," Andy Mueller, general manager of the Colorado River Water Conservation District, told The Hill.
The Shoshone station, a Glenwood Springs, Colo., mainstay since 1909, has a small capacity of just 15 megawatts but is "one of the oldest hydroelectric plants in western Colorado that relies on the river flow rather than water stored in a reservoir," according to Xcel. As a basis of comparison, the Glen Canyon Dam at Lake Powell has a capacity of 1,320 megawatts, while the Hoover Dam at Lake Mead has a capacity of 2,080 megawatts, according to the federal Bureau of Reclamation. Both of those facilities do rely on water stored in reservoirs.
The importance of tiny Shoshone is tied to the Western concept of "water rights" that stems from the mid-19th century homesteading and gold rush era. Also known as "prior appropriation," this system was built upon a first-come, first-served hierarchy and is not based on proximity to a river — meaning that those with more "junior" status are the first to give up water during a shortage. This approach allowed farmers, miners and other landowners to claim and divert water for "beneficial use," such as irrigation, industry and power production. But if a decade passes without putting that right to beneficial use, the owner loses the title in what the Colorado Department of Water Resources deems a "water rights abandonment."
Despite its small size, the Shoshone Generating Station also holds the Upper Colorado River's most senior "non-consumptive" water right, which dictates that every drop used by the power facility must go back into the river. But because Shoshone's maintenance is expensive, Western Slope stakeholders have long feared that its rights could be sold to an upstream entity in the higher-populated Eastern Slope, resulting in a diversion of water that would otherwise flow downstream.
However, if the purchase agreement is finalized, it would place the power station's rights in the public's hands: a senior appropriation from 1902 and a second, more junior allocation from 1929. The retention of Shoshone's senior status, Mueller explained, would prevent the river from being "siphoned out of the headwaters."
Secure river flow would also strengthen the fish population, supporting the survival of both sport fishing and endangered animals, he said. Mueller emphasized the need to further the success of a federal, state and community partnership initiative called the Upper Colorado River Endangered Fish Recovery Program, which has helped revive floundering fish populations.
A robust river would also give reliable access to high-quality resources to both farmers who irrigate their lands and cities that withdraw and then discharge treated water into the system, Mueller explained. The knock-on effects, he added, would persist downstream, by preventing cuts from the physical amount of water flowing "from the headwaters all the way to Lake Powell."
"By preserving this right, we are assisting the functioning of the entire Colorado River system," he said.
The deadline for closing the transaction is Dec. 31, 2027, by which time the Colorado River District must not only secure all the necessary funding, but it also must negotiate what's called "an instream flow agreement" with the Colorado Water Conservation Board. In simpler terms, the parties need to redefine what constitutes a "beneficial use" for the Shoshone water rights. While the Colorado River District plans to lease the water to Xcel for now, cessation of power generation for more than a decade would currently lead to a water rights abandonment.
The board has been approving instream flows, natural flows for environmental purposes, as a beneficial use since 1973. But such authorizations must occur in one of Colorado's "water courts," specialized forums that preside over each of the state's seven river basins. Mueller explained that the Colorado River District is now working with the state and plans to file a request in water court to add this new beneficiary to the existing decree, while maintaining the same 1902 senior appropriation date. Doing so, Mueller reiterated, would be vital for any future incident in which hydropower production is suspended.
"This instream flow will remain in place and will keep the river functioning and flowing the exact same way that it has for the last 120 years," he said.
Hattie Johnson, Southern Rockies restoration director for American Whitewater, echoed these sentiments, noting that "having a flowing, functioning, healthy river helps everybody." Johnson, whose organization promotes river conservation and safe recreation, noted that such a waterway is "a fun one to paddle on," emphasizing how those who do recreate on rivers "are empowered and excited to protect" them.
As far as funding is concerned, the Colorado River District has collected a sizable number of pledges toward the $98.5 million total sales price — plus an additional $500,000 in transaction costs — but still has some fundraising to do. By the end of last month, the partners had raised $48.05 million, including the $20 million from the Colorado Water Conservation Board, $20 million from the district's Community Funding Partnership and the remainder from Western Slope communities. The buyers are hoping to secure the remaining $49 million through the Bureau of Reclamation, with $4 million possibly coming from Inflation Reduction Act funds designated for drought mitigation.
"We think there's incredible value to the federal government from this transaction," Mueller said. Johnson, meanwhile, expressed hopes that the message of cross-sectoral support "is received across the board."
Recognizing that there is still much to accomplish, she described the efforts to date as "a really cool example of what folks can do when they come together on something."
The partnership to protect the Shoshone water rights developed following more than a century of debate and looming uncertainty over Eastern versus Western slope usage of the Colorado River. "The legal right to appropriate and transport water from one watershed to another has been attacked since statehood," water rights lawyer Jim Lochhead said in a 1987 article on the subject. Such challenges, he explained, stem back to an 1882 case in which judges recognized "Colorado's arid nature and the 'imperative necessity' of allowing diversion of water for beneficial use elsewhere."
"The Eastern Slope is relatively arid, whereas the Western Slope provides a snowpack which sustains the entire Colorado River," Lochhead wrote. Because the Eastern Slope also "holds the bulk of the state's population and economic activity," it has "outstripped its local water supply" with growth, causing officials to look toward the Western Slope for more resources.
Part of the reason it is so important to the Colorado River District to secure the Shoshone rights is due precisely to these circumstances — concerns that in the case of a future sale, Front Range communities on the Eastern Slope might rush to ramp up their water security, Mueller explained. While Mueller credited Denver for already developing a robust water portfolio, he looked toward areas south of the capital, such as Douglas County, that are actively seeking alternate supplies.
"Let me be really clear, we don't want to deprive any of our population centers or cities or industries of water — we understand how connected we are," Mueller said. "We also think there are responsible ways we can all develop, as we continue to grow to make sure that we live within the means of the Colorado River."

Thousands in Devon no longer have to boil drinking water, says supplier

But authorities say households in some areas need to continue safety measures amid the waterborne parasitic disease outbreak.

Thousands of households in the Brixham area of Devon can now safely use and drink their tap water without having to boil it first, South West Water (SWW) has said.
The water company said about 14,500 households in the Alston supply area can now use their tap water safely, although about 2,500 properties in Hillhead, upper parts of Brixham and Kingswear should continue to boil their supply before drinking it.
The decision was made in consultation with the UK Health Security Agency (UKHSA) and the local authority's environmental health department.
UKHSA has confirmed 46 cases of cryptosporidium infection in the Brixham area, while more than 100 other people have reported symptoms which include diarrhoea, stomach pains and dehydration.
SWW's chief customer officer, Laura Flowerdew, said: "Following rigorous testing this week, it is now safe to lift the boil water notice in the Alston water supply area. This decision has been supported by the government's public health experts and the local authority's environmental health department.
"This situation has caused an immense amount of disruption, distress and anxiety. We are truly sorry this has happened.
"The public rightly expect a safe, clean and reliable source of drinking water and on this occasion, we have fallen significantly short of expectations. We will not stop working until this has been fully resolved.
"With the boil water notice still in place in Hillhead, upper parts of Brixham and Kingswear, we are urging customers who are unsure if they are still affected to visit the postcode checker on our website or call us so we can check for them."

What Is Pasteurization, and How Does It Keep Milk Safe?

The pasteurization process was invented in the 1860s and continues to keep people safe from a range of foodborne illnesses

By Kerry E. Kaylegian & The Conversation US
The following essay is reprinted with permission from The Conversation, an online publication covering the latest research.
Recent reports that the H5N1 avian flu virus has been found in cow's milk have raised questions about whether the U.S. milk supply is safe to drink. According to the federal Food and Drug Administration, the answer is yes, as long as the milk is pasteurized.
Nonetheless, raw (unpasteurized) milk sales are up, despite health experts' warning that raw milk could contain high levels of the virus, along with many other pathogens.
As an extension food scientist in a state where raw milk sales are legal, I provide technical support to help processors produce high-quality, safe dairy foods. I also like to help people understand the confusing world of pasteurization methods on their milk labels, and why experts strongly discourage consuming raw milk and products made from it.
What can make milk unsafe
Dairy products, like many foods, have inherent risks that can cause a variety of illnesses and even death. Our milk comes from animals that graze outdoors and live in barns. Milk is picked up from the farm in tanker trucks and delivered to the processing plant. These environments offer numerous opportunities for contamination by pathogens that cause illness and organisms that make food spoil.
For example, Listeria monocytogenes comes from environmental sources like soil and water. Mild infections with listeriosis cause flu-like symptoms. More serious cases are, unfortunately, too common and can cause miscarriages in pregnant women and even death in extreme cases.
Other pathogens commonly associated with dairy animals and raw milk include E. coli, which can cause severe gastrointestinal infections and may lead to kidney damage; Campylobacter, the most common cause of diarrheal illness in the U.S.; and Salmonella, which causes abdominal pain, diarrhea and other symptoms.
Keeping beverages safe with heat
In the 1860s, French microbiologist Louis Pasteur discovered that heating wine and beer killed the organisms that caused spoilage, which then was a significant problem in France.
This heating process, which became known as pasteurization, was adopted in the U.S. prior to World War II, at a time when milk was responsible for 25% of all U.S. outbreaks of foodborne illnesses. In 1973, the federal government required that all milk sold across state lines in the U.S. be pasteurized, and in 1987, it banned interstate sales of raw milk.
Pasteurization heats every particle of a food to a specific temperature for a continuous length of time in order to kill the most heat-resistant pathogen associated with that product. Different organisms have different responses to heat, so controlled scientific studies are required to determine what length of time at a given temperature will kill a specific organism.
Since 1924, pasteurization in the U.S. has been guided by the Grade "A" Pasteurized Milk Ordinance, a federal guidance document that is updated every two years to reflect current science and has been adopted by all 50 states. Pasteurization equipment in the U.S. must meet stringent requirements that include sanitary design, safety controls and material standards.
Pasteurization methods
Dairy processors can choose among several different types of pasteurization. When executed properly, all of these methods produce the same result: pathogen-free milk. Processors may treat milk beyond minimum times or temperatures to provide an extra margin of safety, or to reduce bacteria that can cause milk to spoil, thus increasing the product's shelf life.
Vat pasteurizers, also known as batch pasteurizers, often are used by smaller-scale processors who handle limited volumes. The milk is pumped into a temperature-controlled tank with a stirrer, heated to a minimum of 145 degrees Fahrenheit (63 Celsius) and held there continuously for 30 minutes. Then it is cooled and pumped out of the vat.
The most common method used for commercial milk is high-temperature short-time pasteurization, which can treat large volumes of milk. The milk is pumped through a series of thin plates at high speed to reach a minimum temperature of 161 F (71 C). Then it travels through a holding tube for 15 seconds; the temperature is checked automatically for safety, and the milk is then cooled.
The most complex and expensive systems are ultra-pasteurizers and ultra-high-temperature pasteurizers, which pasteurize milk in just a few seconds at temperatures above 285 F (140 C). This approach destroys many spoilage organisms, giving the milk a significantly longer shelf life than with other methods, although sometimes products made this way have more of a "cooked" flavor.
Ultra-high-temperature products are processed in a sterile environment and packaged in sterile packaging, such as lined cartons and pouches. They can be shelf-stable for up to a year before they are opened. Ultra-high-temperature packaging makes taking milk to school for lunch safe for kids every day.
Avian flu in milk
The detection of avian flu virus fragments in milk is a new challenge for the dairy industry. Scientists do not have a full picture of the risks to humans but are learning.
Research so far has shown that virus particles end up in the milk of infected cows, but that pasteurization will inactivate the virus. However, the FDA is advising consumers not to drink raw milk because there is limited information about whether it may transmit avian flu.
The agency also is urging producers not to manufacture or sell raw milk or raw milk products, including cheese, made with milk from cows showing symptoms of illness.
It's never a good time to get a foodborne illness, and this is the beginning of ice cream season. At a time when avian flu is showing up in new species and scientists are still learning about how it is transmitted, I agree with the FDA that raw milk poses risks not worth taking.
This article was originally published on The Conversation. Read the original article.
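For readers who find the quoted numbers easier to compare side by side, here is a small, purely illustrative Python snippet that encodes the time and temperature minimums described above. It is not regulatory guidance; the Grade "A" Pasteurized Milk Ordinance is the authoritative source, and the two-second hold assumed for the ultra-high-temperature entry is a placeholder for the article's "just a few seconds."

```python
# Purely illustrative lookup of the time/temperature minimums quoted in the
# article, not regulatory guidance. The 2-second hold for UHT is an assumed
# placeholder for "just a few seconds."
PASTEURIZATION_MINIMUMS = {
    # method: (minimum temperature in degrees F, minimum hold time in seconds)
    "vat (batch)": (145.0, 30 * 60),                 # 145 F held for 30 minutes
    "high-temperature short-time": (161.0, 15.0),    # 161 F for 15 seconds
    "ultra-high-temperature": (285.0, 2.0),          # above 285 F for a few seconds
}

def meets_minimum(method: str, temp_f: float, hold_seconds: float) -> bool:
    """Check whether a temperature/hold-time pair meets the quoted minimum."""
    min_temp, min_hold = PASTEURIZATION_MINIMUMS[method]
    return temp_f >= min_temp and hold_seconds >= min_hold

if __name__ == "__main__":
    print(meets_minimum("high-temperature short-time", 162.0, 16.0))  # True
    print(meets_minimum("vat (batch)", 145.0, 20 * 60))               # False: held only 20 minutes
```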

As fossil fuel plants face retirement, a Puerto Rico community pushes for rooftop solar

Land for large solar arrays is limited on the island. Rooftop panels can provide electricity during blackouts and bring the island closer to its clean energy goals.

The coastal communities of Guayama and Salinas in southern Puerto Rico feature acres of vibrant green farmland, and a rich, biodiverse estuary, the protected Jobos Bay, which stretches between the neighboring townships. But this would-be tropical paradise is also the home of both a 52-year-old oil-fired power plant and a 22-year-old coal-fired power plant, which local residents say contaminate their drinking water and air, and harm people's health.
"It's a classic sacrifice zone," said Ruth Santiago, a lawyer and community activist who has fought against environmental injustice in Puerto Rico for more than 20 years. "A friend calls this 'the beautiful place with serious problems.'"
Local residents envision a cleaner future as these fossil fuel plants are scheduled to retire within the next several years. They see rooftop solar as the best alternative as the island transitions to renewable energy.
In November 2023, the federal government allocated $440 million in funding for rooftop solar energy in Puerto Rico, part of a billion-dollar energy investment in the island. Officials, in recent years, have acknowledged that the region has suffered as the home of polluting power plants. After a 2022 visit to Salinas and Guayama, Environmental Protection Agency Administrator Michael Regan announced a plan to spend $100,000 to improve monitoring of the air and water pollution from the coal-fired power plant, which is owned by Virginia-based Applied Energy Services Corporation, or AES.
"For too long, communities in Puerto Rico have suffered untold inequities — from challenges with access to clean drinking water to fragile infrastructure that cannot withstand the increase and intensity of storms brought on by climate change," Regan said in a press release.
The EPA examined a drinking water sample in May 2023 from groundwater near the power plants that supplies drinking water to the region and found that metal levels did not exceed federal criteria. EPA public information officer Carlos Vega said more samples will be analyzed and the EPA will continue to inform the community. No timeline for the additional testing has been established.
For decades, most of Puerto Rico's electricity has been generated in the southern part of the island. The Puerto Rico Electric Power Authority uses over 30,000 miles of distribution lines to send energy generated in the south to more urban areas, primarily in the north, like San Juan.
Coastal power plants in the south have posed health risks for community members, Santiago said; according to a 2022 report from the environmental law nonprofit Earthjustice, the AES plant produces 800 tons of coal ash waste per day that contaminates the air and nearby waters. Many low-income residents in the south struggle to pay electricity bills that are more than 30 percent higher than in the U.S. as a whole. Nearly half of Guayama's residents were below the poverty line in 2022.
AES did not respond to multiple email requests for comment. AES Puerto Rico said that its plants are in compliance with regulations in a 2020 press release.
The Aguirre Power Complex oil-fired plant in Salinas, Puerto Rico, is scheduled to retire by 2030. The plant neighbors Jobos Bay, a protected estuary home to multiple endangered species, according to the National Oceanic and Atmospheric Administration. (Esther Frances/Medill News Service)
An EPA inspection in 2021 revealed the coal facility was not in compliance with the Clean Water Act for releasing polluted stormwater without a permit. In 2022, the EPA found the coal plant exceeded legal emission limits for pollutants like carbon monoxide and mercury, according to an Earthjustice analysis. The EPA issued several other violations for the coal plant dating back to 2019, citing it for inadequate disposal of coal ash and endangering residents, according to the Environmental Integrity Project, an environmental watchdog group.
"People know that it's a terrible impact, but it's not easy to move to find somewhere else to live," Santiago said. Many local residents cannot move because average home prices have increased across the island since Hurricane Maria hit in 2017, according to activists and researchers in Puerto Rico.
The coal plant is scheduled to retire in 2027, when a 25-year contract expires between AES and the Puerto Rico Electric Power Authority. To replace coal, AES has turned to utility-scale solar power. AES Puerto Rico began construction of the 135-acre Ilumina Solar PV Park in Guayama in 2011. AES Puerto Rico's coal plant, solar farm and some smaller projects together supplied up to 25 percent of Puerto Rico's electricity.
A few solar farms have already been built on the South Coast, and in February 2022, the Puerto Rican Energy Bureau approved 18 new utility-scale solar panel projects across the island. Critics say the solar farms are using dwindling agricultural land, and a group of environmental and public health organizations including Earthjustice and the Sierra Club Puerto Rico filed a lawsuit in August 2023 to stop the government of Puerto Rico from allowing the solar farms to be built on ecologically important land.
A 2019 law mandated that the Puerto Rico Electric Power Authority reduce the use of fossil fuels for electrical generation on the island and generate 100 percent renewable energy by 2050. In addition, the Puerto Rico Electric Power Authority issued an Integrated Resource Plan in 2020 that includes a plan to retire the Aguirre Power Complex oil-fired plant by 2030.
Instead of large solar farms, many local organizations in Puerto Rico see a better solution for their region's electricity production: rooftop solar panels. They prefer this kind of solar energy for communities because unlike large solar facilities, rooftop solar installations do not use up farmland, which in Puerto Rico decreased by 37.5 percent between 2012 and 2018, according to the Census of Agriculture.
In February, the Department of Energy released results of its study of Puerto Rican renewable energy, named PR100. The study reported the huge potential for rooftop solar in Puerto Rico, up to 6,100 MW by 2050 under the most aggressive scenario, but said utility-scale renewable energy would still be needed.
The study also noted the challenges in deploying rooftop solar, including unstable roofs and lack of property titles. But Puerto Rico has a long way to go to reach the 2050 green energy goal; as of 2022, only 6 percent of electricity generated in Puerto Rico was renewable.
Ruth Santiago's son Jose and other electricians helped install rooftop solar panels in Salinas neighborhoods through Coquí Solar, a community-based organization working to help low-income and vulnerable residents access solar energy.
The solar kits from Coquí Solar provided homes with solar panels and batteries, which could provide electricity during a blackout. Rooftop solar arrays often cannot meet a home's entire electricity demand, but the battery storage the solar array charges can run crucial things like refrigerators, lights and medical equipment in case of an electrical blackout, while also reducing a household's energy bills significantly.
The kits cost about $7,000, which Ruth Santiago said Coquí Solar purchased using grants from various Puerto Rico-based organizations and foundations. Coquí Solar, working with other organizations in the area, also installed the equipment in the homes of vulnerable community members for free. Jose Santiago said the elderly and people living with chronic illnesses and disabilities in the area suffer during blackouts, which are frequent on the island.
"Every year, the power leaves for five, six days," Jose Santiago said. "Sometimes more, sometimes several times, and you don't want to see the old people in the line at the gas station trying to get ice to put in their fridge. So, [rooftop solar energy] helps them."
After Hurricane Maria caused structural damage to the island's electrical infrastructure in 2017, the Puerto Rico Electric Power Authority reported that all of its electric consumers, over 1.5 million customers, were without power. Some Puerto Rico residents spent close to 11 months without power, according to climate change and development specialist Ramón Bueno.
"That just sounds like a number, but all we have to think about is how do we deal with losing power for two, three days? That's radical," Bueno said. "So, two, three months is very radical. And five times that is even more."
Ruth Santiago said Coquí Solar's rooftop solar installations have empowered the community by giving residents "agency" over their electricity generation. The desire for electricity independence has grown in Puerto Rico after recent destructive hurricanes and other impacts of climate change. Organizations like Coquí Solar have spent years working toward decentralizing solar energy across the island, and Bueno said many are strong and independent. "They're pretty articulate framers of an alternative way to move forward with energy systems," Bueno said.
Ruth Santiago worried that the retirement dates of the coal and oil plants could be delayed, or that new infrastructure would depend heavily on utility-scale solar that would rely on a centralized grid and expose communities to blackouts during and after storms. She hoped that concerns about the community's health and environment would be enough to force the plants to close on schedule and that rooftop solar would be prioritized over large-scale solar.
"We need to really go beyond resilience, we need to go toward energy security and sovereignty, and that's what we're trying to do, at least create and do these pilot projects, these community-based examples of what that transformation would look like," Ruth Santiago said. "If we don't do it now, then when?"
This story was originally published by Grist with the headline "As fossil fuel plants face retirement, a Puerto Rico community pushes for rooftop solar" on May 18, 2024.

Toxic ‘forever chemicals’ ubiquitous in Great Lakes basin, study finds

PFAS chemicals present in air, rain, atmosphere and water in basin, which holds nearly 95% of US freshwater.

Toxic PFAS "forever chemicals" are ubiquitous in the Great Lakes basin's air, rain, atmosphere and water, new peer-reviewed research shows.
The first-of-its-kind, comprehensive picture of PFAS levels for the basin, which holds nearly 95% of the nation's freshwater, also reveals that precipitation is probably a major contributor to the lakes' contamination.
"We didn't think the air and rain were significant sources of PFAS in the Great Lakes' environment, but it's not something that has been studied that much," said Marta Venier, a co-author with Indiana University.
PFAS are a class of 15,000 chemicals used across dozens of industries to make products resistant to water, stains and heat. The chemicals are linked to cancer, kidney disease, birth defects, decreased immunity, liver problems and a range of other serious diseases.
They are dubbed "forever chemicals" because they do not naturally break down and are highly mobile once in the environment, so they continuously move through the ground, water and air. PFAS have been detected in all corners of the globe, from penguin eggs in Antarctica to polar bears in the Arctic.
The new paper is part of a growing body of evidence showing how the chemicals move through the atmosphere and water.
Measurements found PFAS levels in the air varied throughout the basin: they were much higher in urban locations such as Chicago than in rural spots in northern Michigan. That tracks with how other chemical pollutants, like PCBs, are detected, Venier said.
But levels in rain were consistent throughout the basin, virtually the same in industrialized areas such as Chicago and Cleveland as in Sleeping Bear Dunes, a remote region in northern Michigan. The finding was a bit "puzzling", Venier said, adding that it probably speaks to the chemicals' ubiquity.
PFAS "background levels" are now so high and the environmental contamination so widespread that the atmospheric counts, including in rain, are relatively consistent. The PFAS in rain could be carried from local sources, or have traveled long distances from other regions. Regardless, it is a major source of pollution that contributes to the lakes' levels, Venier added.
Water contamination levels were highest in Lake Ontario, which holds the most major urban areas, such as Toronto and Buffalo, and is last in line in the lake system's west-to-east flow. Lake Superior, which is the largest and deepest body with few urban areas on its shores, showed the lowest levels.
PFAS tend to accumulate in Lake Superior and Lake Huron because there's little water exchange, while Lake Ontario relatively quickly moves the chemicals into the Saint Lawrence Seaway and Atlantic Ocean.
The study did not address what the levels mean for human health and exposure, but fish consumption advisories are in place across the region, and many cities have contaminated drinking water.
The levels found in water and the atmosphere will probably increase as scientists are able to identify more PFAS, most of which cannot be detected by currently reliable technology.
"We need to take a broad approach to control sources that release PFAS into the atmosphere and into bodies of water … since they eventually all end up in the lakes," Venier said.

MIT Researchers Identify Genetic Markers That Could Revolutionize ALS Treatment


MIT researchers have discovered significant epigenetic modifications in ALS patients that could lead to targeted therapies. These modifications, identified in motor neurons from 380 ALS patients, indicate that ALS might consist of various subtypes, each with distinct genetic influences on disease progression.
In a study of cells from nearly 400 ALS patients, MIT researchers identified genomic regions with chemical modifications linked to disease progression. An analysis revealed a strong differential signal associated with a known subtype of ALS, and about 30 locations with modifications that appear to be linked to rates of disease progression in ALS patients.
For most patients, it's unknown exactly what causes amyotrophic lateral sclerosis (ALS), a disease characterized by degeneration of motor neurons that impairs muscle control and eventually leads to death. Studies have identified certain genes that confer a higher risk of the disease, but scientists believe there are many more genetic risk factors that have yet to be discovered. One reason why these drivers have been hard to find is that some are found in very few patients, making it hard to pick them out without a very large sample of patients. Additionally, some of the risk may be driven by epigenomic factors, rather than mutations in protein-coding genes.
Working with the Answer ALS consortium, a team of MIT researchers has analyzed epigenetic modifications — tags that determine which genes are turned on in a cell — in motor neurons derived from induced pluripotent stem (IPS) cells from 380 ALS patients.
This analysis revealed a strong differential signal associated with a known subtype of ALS, and about 30 locations with modifications that appear to be linked to rates of disease progression in ALS patients. The findings may help scientists develop new treatments that are targeted to patients with certain genetic risk factors.
"If the root causes are different for all these different versions of the disease, the drugs will be very different and the signals in IPS cells will be very different," says Ernest Fraenkel, the Grover M. Hermann Professor in Health Sciences and Technology in MIT's Department of Biological Engineering and the senior author of the study. "We may get to a point in a decade or so where we don't even think of ALS as one disease, where there are drugs that are treating specific types of ALS that only work for one group of patients and not for another."
MIT postdoc Stanislav Tsitkov is the lead author of the paper, which was published on May 2 in the journal Nature Communications.
What is amyotrophic lateral sclerosis (ALS)?
Amyotrophic lateral sclerosis (ALS), also known as Lou Gehrig's disease, is a neurological disorder that targets motor neurons — nerve cells in the brain and spinal cord responsible for controlling voluntary muscle movement and breathing. As these motor neurons deteriorate and die, they cease transmitting messages to muscles, leading to muscle weakening, twitching (fasciculations), and wasting away (atrophy).
Over time, ALS progresses, and individuals affected by the disease gradually lose the ability to initiate and control voluntary movements such as walking, talking, and chewing, including the ability to breathe. The symptoms of ALS worsen progressively.
Finding Risk Factors
ALS is a rare disease that is estimated to affect about 30,000 people in the United States. One of the challenges in studying the disease is that while genetic variants are believed to account for about 50 percent of ALS risk (with environmental factors making up the rest), most of the variants that contribute to that risk have not been identified.
Similar to Alzheimer's disease, there may be a large number of genetic variants that can confer risk, but each individual patient may carry only a small number of those. This makes it difficult to identify the risk factors unless scientists have a very large population of patients to analyze.
"Because we expect the disease to be heterogeneous, you need to have large numbers of patients before you can pick up on signals like this. To really be able to classify the subtypes of disease, we're going to need to look at a lot of people," Fraenkel says.
About 10 years ago, the Answer ALS consortium began to collect large numbers of patient samples, which could allow for larger-scale studies that might reveal some of the genetic drivers of the disease. From blood samples, researchers can create induced pluripotent stem cells and then induce them to differentiate into motor neurons, the cells most affected by ALS.
"We don't think all ALS patients are going to be the same, just like all cancers are not the same. And the goal is being able to find drivers of the disease that could be therapeutic targets," Fraenkel says.
In this study, Fraenkel and his colleagues wanted to see if patient-derived cells could offer any information about molecular differences that are relevant to ALS. They focused on epigenomic modifications, using a method called ATAC-seq to measure chromatin density across the genome of each cell. Chromatin is a complex of DNA and proteins that determines which genes are accessible to be transcribed by the cell, depending on how densely packed the chromatin is.
In data that were collected and analyzed over several years, the researchers did not find any global signal that clearly differentiated the 380 ALS patients in their study from 80 healthy control subjects. However, they did find a strong differential signal associated with a subtype of ALS, characterized by a genetic mutation in the C9orf72 gene.
Additionally, they identified about 30 regions that were associated with slower rates of disease progression in ALS patients. Many of these regions are located near genes related to the cellular inflammatory response; interestingly, several of the identified genes have also been implicated in other neurodegenerative diseases, such as Parkinson's disease.
"You can use a small number of these epigenomic regions and look at the intensity of the signal there, and predict how quickly someone's disease will progress. That really validates the hypothesis that the epigenomics can be used as a filter to better understand the contribution of the person's genome," Fraenkel says.
"By harnessing the very large number of participant samples and extensive data collected by the Answer ALS Consortium, these studies were able to rigorously test whether the observed changes might be artifacts related to the techniques of sample collection, storage, processing, and analysis, or truly reflective of important biology," says Lyle Ostrow, an associate professor of neurology at the Lewis Katz School of Medicine at Temple University, who was not involved in the study. "They developed standard ways to control for these variables, to make sure the results can be accurately compared. Such studies are incredibly important for accelerating ALS therapy development, as they will enable data and samples collected from different studies to be analyzed together."
Targeted Drugs
The researchers now hope to further investigate these genomic regions and see how they might drive different aspects of ALS progression in different subsets of patients. This could help scientists develop drugs that might work in different groups of patients, and help them identify which patients should be chosen for clinical trials of those drugs, based on genetic or epigenetic markers.
Last year, the U.S. Food and Drug Administration approved a drug called tofersen, which can be used in ALS patients with a mutation in a gene called SOD1. This drug is very effective for those patients, who make up about 1 percent of the total population of people with ALS. Fraenkel's hope is that more drugs can be developed for, and tested in, people with other genetic drivers of ALS.
"If you had a drug like tofersen that works for 1 percent of patients and you just gave it to a typical phase two clinical trial, you probably wouldn't have anybody with that mutation in the trial, and it would've failed. And so that drug, which is a lifesaver for people, would never have gotten through," Fraenkel says.
The MIT team is now using an approach called quantitative trait locus (QTL) analysis to try to identify subgroups of ALS patients whose disease is driven by specific genomic variants.
"We can integrate the genomics, the transcriptomics, and the epigenomics, as a way to find subgroups of ALS patients who have distinct phenotypic signatures from other ALS patients and healthy controls," Tsitkov says. "We have already found a few potential hits in that direction."
Reference: "Disease related changes in ATAC-seq of iPSC-derived motor neuron lines from ALS patients and controls" by Stanislav Tsitkov, Kelsey Valentine, Velina Kozareva, Aneesh Donde, Aaron Frank, Susan Lei, the Answer ALS Consortium, Jennifer E. Van Eyk, Steve Finkbeiner, Jeffrey D. Rothstein, Leslie M. Thompson, Dhruv Sareen, Clive N. Svendsen and Ernest Fraenkel, 2 May 2024, Nature Communications. DOI: 10.1038/s41467-024-47758-8
The research was funded by the Answer ALS program, which is supported by the Robert Packard Center for ALS Research at Johns Hopkins University, Travelers Insurance, ALS Finding a Cure Foundation, Stay Strong Vs. ALS, Answer ALS Foundation, Microsoft, Caterpillar Foundation, American Airlines, Team Gleason, the U.S. National Institutes of Health, Fishman Family Foundation, Aviators Against ALS, AbbVie Foundation, Chan Zuckerberg Initiative, ALS Association, National Football League, F. Prime, M. Armstrong, Bruce Edwards Foundation, the Judith and Jean Pape Adams Charitable Foundation, Muscular Dystrophy Association, Les Turner ALS Foundation, PGA Tour, Gates Ventures, and Bari Lipp Foundation. This work was also supported, in part, by grants from the National Institutes of Health and the MIT-GSK Gertrude B. Elion Research Fellowship Program for Drug Discovery and Disease.
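To make the idea of "using a small number of epigenomic regions to predict how quickly someone's disease will progress" concrete, the sketch below runs a toy ridge-regression analysis on synthetic data. It is not the Answer ALS pipeline or the study's statistics; the patient and region counts simply echo the article, and the "ATAC signal" values, effect sizes and progression rates are randomly generated for illustration.

```python
# Toy illustration with synthetic data: relating chromatin-accessibility signal
# at a small set of regions to a per-patient progression rate. This is NOT the
# Answer ALS pipeline; the patient/region counts echo the article, but the data,
# effect sizes and model here are invented for demonstration.
import numpy as np

rng = np.random.default_rng(42)

n_patients, n_regions = 380, 30
atac = rng.normal(size=(n_patients, n_regions))        # stand-in for normalized ATAC-seq signal
true_effects = rng.normal(scale=0.3, size=n_regions)   # hypothetical per-region effects
progression = atac @ true_effects + rng.normal(scale=0.5, size=n_patients)

train, test = slice(0, 300), slice(300, n_patients)    # simple train / held-out split

# Ridge regression: predict progression rate from accessibility at the 30 regions.
lam = 1.0
X, y = atac[train], progression[train]
w = np.linalg.solve(X.T @ X + lam * np.eye(n_regions), X.T @ y)

pred = atac[test] @ w
corr = np.corrcoef(pred, progression[test])[0, 1]
print(f"held-out correlation between predicted and observed progression: {corr:.2f}")
```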

California's effort to plug abandoned, chemical-spewing oil wells gets $35-million boost

The Biden administration funding is among the "largest ever in American history to address legacy pollution," U.S. Secretary of the Interior Deb Haaland said.

California will receive more than $35 million in federal funding to help address the scourge of abandoned oil wells that are leaking dangerous chemicals and planet-warming methane in areas across the state, including many in Los Angeles.
The investment from the Biden-Harris administration is among the "largest ever in American history to address legacy pollution," U.S. Secretary of the Interior Deb Haaland said Friday during a joint announcement with Los Angeles Mayor Karen Bass and California Deputy Secretary for Energy Le-Quyen Nguyen.
California will use the funding to plug and remediate 206 high-risk orphaned oil and gas wells and decommission 47 attendant production facilities with about 70,000 feet of associated pipelines.
"Capping hazardous orphaned wells and addressing legacy pollution across our country will have a profound impact on our environment, our water quality, and the health and well-being of our communities," Haaland said.
The Golden State is home to at least 5,300 abandoned or orphaned oil wells — or wells for which there are no legally liable parties to plug them — according to estimates from the California Geologic Energy Management Division. There are more than 35,000 known idle wells, with thousands more that will soon come to the end of their lives.
Many are located in and around communities where residents have been sickened by their toxic emissions. What's more, many unplugged wells leak methane, a planet-warming gas that is more than 25 times as potent as carbon dioxide at trapping heat in the atmosphere.
California Deputy Secretary for Energy Le-Quyen Nguyen, left, U.S. Secretary of the Interior Deb Haaland and Los Angeles Mayor Karen Bass announce federal funding to plug and remediate orphaned oil wells. (Hayley Smith / Los Angeles Times)
"We have thousands of orphaned wells in California, and each well poses a risk to public health, safety and the environment, as well as further contributes to climate change," Nguyen said. "The funding that was announced today by Secretary Haaland will continue our momentum in plugging these orphaned wells in California, as well as remediating those sites and removing that legacy pollution. It will also make a meaningful, positive impact to our communities, as well as creating good jobs."
California's award is part of a larger, $660-million formula grant pot that will be released to states on a rolling basis, Haaland said. As part of its award, California will also work to detect and measure methane emissions from orphaned oil and gas wells, screen for groundwater and surface water impacts, and prioritize cleaning up wells near disadvantaged communities.
The grant program stems from an overall $4.7-billion investment from President Biden's Bipartisan Infrastructure Law to plug orphaned wells nationwide. Other buckets of funding include more than $565 million in initial grant funding that has already been awarded to 25 states, including $25 million to California. A planned matching grants program will also award up to $30 million apiece to states that commit to increasing their spending on cleaning up orphaned wells.
Bass said it was too soon to specify how much of the state's latest award will go to Los Angeles. However, state officials said some of the initial funding is being used to plug 19 wells that remain uncapped at the AllenCo drill site in South Los Angeles, which stand among more than 370 high-priority wells identified in the first round of planning.
Residents who live near the AllenCo site have complained for years about headaches, nosebleeds, respiratory diseases and other health issues. Among them is Nalleli Cobo, who grew up about 30 feet from the site and was diagnosed with reproductive cancer at age 19.
"I've lost my childhood to the fossil fuel industry and I've also lost my future to the fossil fuel industry, and that's not the reality that our community should be facing," Cobo said. "When you ask a person what belongs in a community, not a lot of people will say an oil well."
She noted that about 18 million Americans live one mile or less from an active oil or gas well. Friday's federal investment announcement is "definitely a step in the right direction," she said, "but we need to make sure we are prioritizing communities like sacrifice zones, because we are the front-line communities that live day in and day out breathing these toxic emissions."
Officials said the latest round of funding advances Biden's Justice40 Initiative, which aims to deliver at least 40% of benefits from certain climate, housing and energy investments to disadvantaged communities.
"This is an issue of environmental justice," Bass said. "Today we are locking arms across the city, state and federal governments to continue our work to end neighborhood oil drilling in the city of Los Angeles to protect the health of Angelenos and advance our vision of environmental justice."
Since the enactment of Biden's Bipartisan Infrastructure Law in 2021, states have plugged more than 7,700 orphaned wells and reduced approximately 11,530 metric tons of potential methane emissions, according to the Department of the Interior.
Gov. Gavin Newsom in October also approved AB 1167, legislation that will require companies that acquire oil wells to secure bonds to properly seal the wells once their use has ended. Some local communities, such as Culver City, have banned new drilling and are moving to phase out existing wells.
"California is one of the states that is leading the way in putting these new resources to work, because it's going to take all of us working together to ensure that we are making the kind of enduring impact that will last for generations to come," Haaland said.
But while the federal support is encouraging, there is still much work that remains, said Brenda Valdivia, a lifelong resident of the Vista Hermosa Heights neighborhood in L.A. Valdivia said she developed an autoimmune disease and had two strokes following her exposure to nearby wells. "We could always do more," she said.
Times staff writer Tony Briscoe contributed to this report.

‘The Interview’: Dr. Ayana Elizabeth Johnson

Ayana Elizabeth Johnson on how to overcome the “soft” climate denial that keeps us buying junk.

I don't think it's an exaggeration to say that the Intergovernmental Panel on Climate Change's 2018 report on global warming drastically changed the way many people thought — or felt — about the climate crisis. That report laid out, with grim clarity, both the importance and extreme difficulty of preventing global warming from reaching 1.5 degrees Celsius above preindustrial levels. Its warnings about what was likely to happen to our planet if we didn't turn things around were severe.
The starkness of the I.P.C.C.'s report led to a surge of pessimism, fear and, in response to those emotions, climate activism that hasn't really abated. But recently there has been a growing counterresponse to those darker feelings, including from some experts who have a clear view on what's coming — and that response is a cautious optimism.
Though she doesn't go so far as to call herself hopeful, Dr. Ayana Elizabeth Johnson is one of those experts trying to change the mood. She's a marine biologist and a founder of the Urban Ocean Lab, a think tank focusing on climate and coastal cities. She has also worked with the Environmental Protection Agency and advised lawmakers on climate policy. Additionally, Johnson, who is 43, is a leading climate activist and communicator. She was an editor of the best-selling climate anthology "All We Can Save," and her next book, "What if We Get It Right?," which will be published this summer, is a collection of interviews with leaders from various fields about promising climate possibilities.
The question posed by that book's title — what if we get it right on climate? — is one I think about often, and skeptically. I'm not quite convinced that people are motivated more by positivity than fear. But I would like to be, and I was hoping Johnson could help.


Join us to forge
a sustainable future

Our team is always growing.
Become a partner, volunteer, sponsor, or intern today.
Let us know how you would like to get involved!

CONTACT US

Sign up for our mailing list to stay informed on the latest films and environmental headlines.

Subscribers receive a free day pass for streaming Cinema Verde.