
GoGreenNation News


3 Questions: Addressing the world’s most pressing challenges

Mihaela Papa discusses the BRICS Lab, her role at the Center for International Studies, and the center's ongoing ambition to tackle the world's most complex challenges in new and creative ways.

The Center for International Studies (CIS) empowers students, faculty, and scholars to bring MIT's interdisciplinary style of research and scholarship to address complex global challenges. In this Q&A, Mihaela Papa, the center's director of research and a principal research scientist at MIT, describes her role as well as research within the BRICS Lab at MIT — a reference to the BRICS intergovernmental organization, which comprises Brazil, Russia, India, China, South Africa, Egypt, Ethiopia, Indonesia, Iran, and the United Arab Emirates. She also discusses the ongoing mission of CIS to tackle the world's most complex challenges in new and creative ways.

Q: What is your role at CIS, and what are some of your key accomplishments since joining the center just over a year ago?

A: I serve as director of research and principal research scientist at CIS, a role that bridges management and scholarship. I oversee grant and fellowship programs, spearhead new research initiatives, build research communities across our center's area programs and MIT schools, and mentor the next generation of scholars. My academic expertise is in international relations, and I publish on global governance and sustainable development, particularly through my new BRICS Lab.

This past year, I focused on building collaborative platforms that highlight CIS' role as an interdisciplinary hub and expand its research reach. With Evan Lieberman, the director of CIS, I launched the CIS Global Research and Policy Seminar series to address current challenges in global development and governance, foster cross-disciplinary dialogue, and connect theoretical insights to policy solutions. We also convened a Climate Adaptation Workshop, which examined promising strategies for financing adaptation and advancing policy innovation. We documented the outcomes in a workshop report that outlines a broader research agenda contributing to MIT's larger climate mission.

In parallel, I have been reviewing CIS' grant-making programs to improve how we serve our community, while also supporting regional initiatives such as research planning related to Ukraine. Together with the center's MIT-Brazil faculty director Brad Olsen, I secured a MITHIC [MIT Human Insight Collaboration] Connectivity grant to build an MIT Amazonia research community that connects MIT scholars with regional partners and strengthens collaboration across the Amazon. Finally, I launched the BRICS Lab to analyze transformations in global governance, with ongoing research on BRICS and food security and on data centers in BRICS.

Q: Tell us more about the BRICS Lab.

A: The BRICS countries comprise the majority of the world's population and an expanding share of the global economy. [Originally comprising Brazil, Russia, India, and China, BRICS currently includes 11 nations.] As a group, they carry the collective weight to shape international rules, influence global markets, and redefine norms — yet the question remains: Will they use this power effectively? The BRICS Lab explores the implications of the bloc's rise for international cooperation and its role in reshaping global politics. Our work focuses on three areas: the design and strategic use of informal groups like BRICS in world affairs; the coalition's potential to address major challenges such as food security, climate change, and artificial intelligence; and the implications of U.S. policy toward BRICS for the future of multilateralism.

Q: What are the center's biggest research priorities right now?

A: Our center was founded in response to rising geopolitical tensions and the urgent need for policy rooted in rigorous, evidence-based research. Since then, we have grown into a hub that combines interdisciplinary scholarship and actively engages with policymakers and the public. Today, as in our early years, the center brings together exceptional researchers with the ambition to address the world's most pressing challenges in new and creative ways.

Our core focus spans security, development, and human dignity. Security studies have been a priority for the center, and our new nuclear security programming advances this work while training the next generation of scholars in this critical field. On the development front, our work has explored how societies manage diverse populations, navigate international migration, and engage with human rights and the changing patterns of regime dynamics.

We are pursuing new research in three areas. First, on climate change, we seek to understand how societies confront environmental risks and harms, from insurance to water and food security in the international context. Second, we examine shifting patterns of global governance as rising powers set new agendas and take on greater responsibilities in the international system. Finally, we are initiating research on the impact of AI — how it reshapes governance across international relations, what role AI corporations play, and how AI-related risks can be managed.

As we approach our 75th anniversary in 2026, we are excited to bring researchers together to spark bold ideas that open new possibilities for the future.

Is there such a thing as a ‘problem shark’? Plan to catch repeat biters divides scientists

Some experts think a few sharks may be responsible for a disproportionate number of attacks. Should they be hunted down?

First was the French tourist, killed while swimming off Saint-Martin in December 2020. The manager of a nearby water sports club raced out in a dinghy to help, only to find her lifeless body floating face down, a gaping wound where part of her right thigh should have been. Then, a month later, another victim. Several Caribbean islands away, a woman snorkelling off St Kitts and Nevis was badly bitten on her left leg by a shark. Fortunately, she survived.

Soon after the fatal incident in December, Eric Clua, a marine biologist at the École Pratique des Hautes Études in Paris, got a phone call. Island nations often ask for his help after a shark bite, he says, "because I am actually presenting a new vision … I say, 'You don't have a problem with sharks, you have a problem with one shark.'"

Human-shark conflicts are not solely the result of accidents or happenstance, Clua says. Instead, he says there are such things as problem sharks: bold individuals that may have learned, perhaps while still young, that humans are prey. It's a controversial stance, but Clua thinks that if it's true – and if he can identify and remove these problem sharks – it might dissuade authorities from taking even more extreme forms of retribution, including culls.

A shark killed a man at Long Reef beach in Dee Why, Sydney, on 6 September 2025. Photograph: Dean Lewins/AAP

Though culls of sharks after human-shark conflict are becoming less common and are generally regarded by scientists as ineffective, they do still happen. One of the last big culls took place near Réunion, a French island in the Indian Ocean, between 2011 and 2013, resulting in the deaths of more than 500 sharks. Even that was not enough for some – four years later, a professional surfer called for daily shark culls near the island.

And so, in the immediate aftermath of the French tourist's death in Saint-Martin, when one of Clua's contacts called to explain what had happened, he recalls telling them: "Just go there on the beach … I want swabbing of the wounds."

After that bite and the one that occurred a month later, medical professionals collected samples of mucus that the shark had left behind to send off for analysis, though it took weeks for the results to come back. But as Clua and colleagues describe in a study published last year, the DNA analysis confirmed that the same tiger shark was responsible for both incidents.

Even before the DNA test was complete, however, analysis of the teeth marks left on the Saint-Martin victim, and of the tooth fragment collected from her leg, suggested the perpetrator was a tiger shark (Galeocerdo cuvier) roughly 3 metres (10ft) long. Armed with this knowledge, Clua and his colleagues set out to catch the killer.

During January and February 2021, Clua and his team hauled 24 tiger sharks from the water off Saint-Martin and analysed a further 25 sharks that they caught either around St Barts or St Kitts and Nevis.

Eric Clua and his colleagues took DNA samples from nearly 50 tiger sharks to try to find one that had bitten two women. Photograph: Courtesy of Eric Clua

Because both of the women who were bitten had lost a substantial amount of flesh, the scientists saw this as a chance to find the shark responsible. Each time they dragged a tiger shark out of the water they flipped it upside down, flooded its innards with water, and pressed firmly on its stomach to make it vomit. A shark is, generally, "a very easy puker", Clua says. The team's examinations turned up no evidence of human remains.

Clua and his colleagues also took DNA samples from each of the tiger sharks, as well as from dead sharks landed by fishers in St Kitts and Nevis. None matched the DNA swabbed from the wounds suffered by the two women.

But the team has not given up. Clua is now waiting for DNA analysis of mucus samples recovered from a third shark bite that happened off Saint-Martin in May 2024. If that matches samples from the earlier bites, Clua says, that would suggest it "might be possible" to catch the culprit shark in the future.

That some specific sharks have developed a propensity for biting people is controversial among marine scientists, though Lucille Chapuis, a marine sensory ecologist at La Trobe University in Australia, is not entirely sure why. The concept of problem animals is well established on land, she says. Terrestrial land managers routinely contend with problem lions, tigers and bears. "Why not a fish?" asks Chapuis. "We know that fishes, including sharks, have amazing cognitive abilities."

Yet among the range of opinions on Clua's ideas, some marine scientists reject the concept of problem sharks outright.

A tiger shark. Some scientists fear that merely talking about problem sharks could perpetuate the preconception of human-eating monsters. Photograph: Jeff Milisen/Alamy

Clua is aware that his approach is divisive: "I have many colleagues – experts – that are against the work I'm doing."

The biggest pushback is from scientists who say there is no concrete evidence for the idea that there are extra-dangerous, human-biting sharks roaming the seas. Merely talking about problem sharks, they say, could perpetuate the idea that some sharks are hungry, human-eating monsters such as the beast from the wildly unscientific movie Jaws.

Clua says the monster from Jaws and his definition of a problem shark are completely different. A problem shark is not savage or extreme; it's just a shark that learned at some point that humans are among the things it might prey on. Environmental factors, as well as personality, might trigger or aggravate such behaviour.

Besides the tiger shark that struck off Saint-Martin and St Kitts and Nevis, Clua's 2024 study detailed the case of another tiger shark involved in multiple bites in Costa Rica. A third case focused on an oceanic whitetip shark in Egypt that killed a female swimmer by biting off her right leg. The same shark later attempted to bite the shoulder of one of Clua's colleagues during a dive.

Pilot fish follow an oceanic whitetip shark. A woman was killed when an oceanic whitetip bit off her right leg in Egypt. Photograph: Amar and Isabelle Guillen/Guillen Photo LLC/Alamy

Toby Daly-Engel, a shark expert at the Florida Institute of Technology, says the genetic analysis connecting the same tiger shark to two bite victims in the Caribbean is robust. However, she says such behaviour must be rare. "They're just opportunistic. I mean, these things eat tyres."

Diego Biston Vaz, curator of fishes at the Natural History Museum in London, also praises Clua's work, calling it "really forensic". He, too, emphasises it should not be taken as an excuse to demonise sharks. "They're not villains; they're just trying to survive," he says.

Chapuis adds that the small number of animals involved in Clua's recent studies means the research does not prove problem sharks are real. And while some sharks might learn to bite humans, she questions whether they would continue to do so long term. People tend to defend themselves well and, given there are only a few dozen unprovoked shark bites recorded around the world each year, she says there is no data to support the idea that even the boldest sharks benefit from biting people.

Plus, Clua's plan – to capture problem sharks and bring them to justice – is unrealistic, says David Shiffman, a marine conservation biologist based in Washington DC. Even if scientists can prove beyond doubt that a few specific sharks are responsible for a string of incidents – "which I do not believe he has done", Shiffman adds – he thinks finding those sharks is not viable.

Any resources used to track down problem sharks would be better spent on preventive measures such as lifeguards, who could spot sharks approaching a busy beach, says Catherine Macdonald, a conservation biologist at the University of Miami in Florida.

While identifying and removing a problem shark is better than culling large numbers, she urges people to answer harder questions about coexisting with predators. "For people who don't want to risk interacting with sharks, I have great news," she says. "Swimming pools exist."

Identifying and removing a problem shark is often regarded as better than culling large numbers. Photograph: Humane Society International/AAP

Clua, for his part, intends to carry on. He's working with colleagues on Saint-Martin to swab shark-bite injuries when they occur, and to track down potential problem sharks.

Asked whether he has ever experienced a dangerous encounter with a large shark himself, Clua says that in 58 years of diving it has happened only once, while spear fishing off New Caledonia. Poised underwater, waiting for a fish to appear, he turned his head. "There was a bull shark coming [toward] my back," he says.

He got the feeling at that moment that he was about to become prey. But there was no violence. Clua looked at the bull shark as it turned and swam away.

This story was originally published in bioGraphic, an independent magazine about nature and regeneration from the California Academy of Sciences.

Biomethane not viable for widespread use in UK home heating, report finds

Gas derived from farm waste can meet only 18% of current gas demand by 2050, despite claims of fossil fuel lobbyists, study finds

Gas derived from farm waste will never be an alternative to the widespread adoption of heat pumps, research shows, despite the claims of fossil fuel lobbyists.

Biomethane, which comes mainly from "digesting" manure, sewage and other organic waste, has been touted as a low-carbon substitute for fossil fuel gas, for use in home heating. Proponents say it would be less disruptive than ripping out the UK's current gas infrastructure and installing heat pumps.

But research seen by the Guardian shows that while there may be a role for biomethane in some industries and on farms, it will not make a viable alternative for the vast majority of homes.

A study by the analyst company Regen, commissioned by the MCS Foundation charity, found that biomethane could account for only up to 18% of the UK's current gas demand by 2050. That is because the available sources – manure, farm waste and sewage – cannot be scaled up to the extent needed without distorting the UK's economy, or using unsustainable sources.

Faced with the limitations of biomethane, ministers would do better to rule out its widespread use in home heating and concentrate on heat pumps, MCS concluded.

Garry Felgate, the chief executive of the MCS Foundation, said: "Biomethane has an important role to play in decarbonisation – but not in homes. If we are to meet our climate targets and ensure that every household has access to secure, affordable energy, there is simply no viable way that we can continue to heat homes using the gas grid, whether that is using fossil gas, hydrogen, or biomethane."

Gas companies have a strong vested interest in the future of biomethane because its widespread use would allow them to keep the current gas infrastructure of pipelines, distribution technology and home boilers in operation. If the UK shifts most homes to heat pumps, those networks will become redundant.

The same arguments are made by gas companies, and by some trade unions, in favour of hydrogen, which has also been touted as a low-carbon alternative to heat pumps, but which numerous studies have shown will not be economically viable at the scale required.

At the Labour party conference this week, delegates were bombarded by lobbyists claiming that biomethane could take the place of 6m gas boilers, delaying their phase-out.

Felgate said ministers must require the decommissioning of the gas grid by 2050, and set a clear deadline for phasing out boilers. "Failure to plan for the decommissioning of the gas grid will result in it becoming a stranded asset," he said. "Consumers and industry need certainty: biomethane will not replace fossil fuel gas in homes, and electric heating such as heat pumps is the only viable way to decarbonise homes."

Tamsyn Lonsdale-Smith, the Regen energy analyst who wrote the report, said there were uses for biomethane in industry, but that it was not suitable for widespread consumer use. "Biomethane can be a green gas with minimal environmental and land use impacts – but only if produced from the right sources, in the right way and at an appropriate scale," she said. "The government is right to be focusing on scaling up biomethane production, but as sustainable supplies are likely to be limited, it is critical that its use is prioritised for only the highest value uses where carbon reductions are greatest."

A government spokesperson said: "Biomethane can play an important role in reducing our reliance on imported gas, increasing our country's energy security and helping to deliver net zero. We are looking at how we can further support the sector and plan to publish a consultation on biomethane early next year."

Responding to the climate impact of generative AI

Explosive growth of AI data centers is expected to increase greenhouse gas emissions. Researchers are now seeking solutions to reduce these environmental harms.

In part 2 of our two-part series on generative artificial intelligence's environmental impacts, MIT News explores some of the ways experts are working to reduce the technology's carbon footprint.

The energy demands of generative AI are expected to continue increasing dramatically over the next decade.

For instance, an April 2025 report from the International Energy Agency predicts that the global electricity demand from data centers, which house the computing infrastructure to train and deploy AI models, will more than double by 2030, to around 945 terawatt-hours. While not all operations performed in a data center are AI-related, this total is slightly more than the energy consumption of Japan.

Moreover, an August 2025 analysis from Goldman Sachs Research forecasts that about 60 percent of the increasing electricity demands from data centers will be met by burning fossil fuels, increasing global carbon emissions by about 220 million tons. In comparison, driving a gas-powered car for 5,000 miles produces about 1 ton of carbon dioxide.

These statistics are staggering, but at the same time, scientists and engineers at MIT and around the world are studying innovations and interventions to mitigate AI's ballooning carbon footprint, from boosting the efficiency of algorithms to rethinking the design of data centers.

Considering carbon emissions

Talk of reducing generative AI's carbon footprint is typically centered on "operational carbon" — the emissions from powering the processors, known as GPUs, inside a data center. It often ignores "embodied carbon" — the emissions created by building the data center in the first place — says Vijay Gadepally, senior scientist at MIT Lincoln Laboratory, who leads research projects in the Lincoln Laboratory Supercomputing Center.

Constructing and retrofitting a data center, built from tons of steel and concrete and filled with air conditioning units, computing hardware, and miles of cable, consumes a huge amount of carbon. In fact, the environmental impact of building data centers is one reason companies like Meta and Google are exploring more sustainable building materials. (Cost is another factor.)

Plus, data centers are enormous buildings — the world's largest, the China Telecom-Inner Mongolia Information Park, engulfs roughly 10 million square feet — with about 10 to 50 times the energy density of a normal office building, Gadepally adds.

"The operational side is only part of the story. Some things we are working on to reduce operational emissions may lend themselves to reducing embodied carbon, too, but we need to do more on that front in the future," he says.

Reducing operational carbon emissions

When it comes to reducing the operational carbon emissions of AI data centers, there are many parallels with home energy-saving measures. For one, we can simply turn down the lights.

"Even if you have the worst lightbulbs in your house from an efficiency standpoint, turning them off or dimming them will always use less energy than leaving them running at full blast," Gadepally says.

In the same fashion, research from the Supercomputing Center has shown that "turning down" the GPUs in a data center so they consume about three-tenths the energy has minimal impacts on the performance of AI models, while also making the hardware easier to cool.

Another strategy is to use less energy-intensive computing hardware. Demanding generative AI workloads, such as training new reasoning models like GPT-5, usually need many GPUs working simultaneously.
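In practice, "turning down" a GPU means capping its power draw. Here is a minimal sketch of that measure, assuming NVIDIA hardware and the vendor's nvidia-smi utility; the 250-watt cap and single-GPU target are illustrative choices, not the Supercomputing Center's actual settings:

```python
# Sketch: cap GPU power draw ("turning down" a GPU), assuming NVIDIA's
# nvidia-smi tool is installed. Requires administrator privileges.
import subprocess

def set_power_limit(watts: int, gpu_index: int = 0) -> None:
    """Apply a power cap, in watts, to one GPU via nvidia-smi."""
    subprocess.run(
        ["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)],
        check=True,
    )

if __name__ == "__main__":
    set_power_limit(250)  # illustrative value; tune per GPU model and workload
```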
The Goldman Sachs analysis estimates that a state-of-the-art system could soon have as many as 576 connected GPUs operating at once. But engineers can sometimes achieve similar results by reducing the precision of computing hardware, perhaps by switching to less powerful processors that have been tuned to handle a specific AI workload.

There are also measures that boost the efficiency of training power-hungry deep-learning models before they are deployed. Gadepally's group found that about half the electricity used for training an AI model is spent to get the last 2 or 3 percentage points in accuracy. Stopping the training process early can save a lot of that energy.

"There might be cases where 70 percent accuracy is good enough for one particular application, like a recommender system for e-commerce," he says.

Researchers can also take advantage of efficiency-boosting measures. For instance, a postdoc in the Supercomputing Center realized the group might run a thousand simulations during the training process to pick the two or three best AI models for their project. By building a tool that allowed them to avoid about 80 percent of those wasted computing cycles, they dramatically reduced the energy demands of training with no reduction in model accuracy, Gadepally says.

Leveraging efficiency improvements

Constant innovation in computing hardware, such as denser arrays of transistors on semiconductor chips, is still enabling dramatic improvements in the energy efficiency of AI models. Even though energy efficiency improvements have been slowing for most chips since about 2005, the amount of computation that GPUs can do per joule of energy has been improving by 50 to 60 percent each year, says Neil Thompson, director of the FutureTech Research Project at MIT's Computer Science and Artificial Intelligence Laboratory and a principal investigator at MIT's Initiative on the Digital Economy.

"The still-ongoing 'Moore's Law' trend of getting more and more transistors on chip still matters for a lot of these AI systems, since running operations in parallel is still very valuable for improving efficiency," says Thompson.

Even more significant, his group's research indicates that efficiency gains from new model architectures that can solve complex problems faster, consuming less energy to achieve the same or better results, are doubling every eight or nine months.

Thompson coined the term "negaflop" to describe this effect. The same way a "negawatt" represents electricity saved due to energy-saving measures, a "negaflop" is a computing operation that doesn't need to be performed due to algorithmic improvements. These could be things like "pruning" away unnecessary components of a neural network or employing compression techniques that enable users to do more with less computation.

"If you need to use a really powerful model today to complete your task, in just a few years, you might be able to use a significantly smaller model to do the same thing, which would carry much less environmental burden. Making these models more efficient is the single-most important thing you can do to reduce the environmental costs of AI," Thompson says.
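Pruning, one source of "negaflops," can be illustrated in a few lines. The sketch below uses PyTorch's built-in pruning utilities; the toy model and the 30 percent sparsity level are arbitrary choices for demonstration, not figures from Thompson's research:

```python
# Sketch: magnitude pruning with PyTorch. Weights with the smallest
# magnitudes are zeroed out, so sparsity-aware runtimes can skip their
# multiply-adds ("negaflops").
import torch.nn as nn
import torch.nn.utils.prune as prune

model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))

for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.3)  # zero 30%
        prune.remove(module, "weight")  # bake the pruning mask into the weights

zeros = sum((p == 0).sum().item() for p in model.parameters())
total = sum(p.numel() for p in model.parameters())
print(f"overall sparsity: {zeros / total:.1%}")
```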
Maximizing energy savings

While reducing the overall energy use of AI algorithms and computing hardware will cut greenhouse gas emissions, not all energy is the same, Gadepally adds.

"The amount of carbon emissions in 1 kilowatt hour varies quite significantly, even just during the day, as well as over the month and year," he says.

Engineers can take advantage of these variations by leveraging the flexibility of AI workloads and data center operations to maximize emissions reductions. For instance, some generative AI workloads don't need to be performed in their entirety at the same time. Splitting computing operations so some are performed later, when more of the electricity fed into the grid is from renewable sources like solar and wind, can go a long way toward reducing a data center's carbon footprint, says Deepjyoti Deka, a research scientist in the MIT Energy Initiative (MITEI).

Deka and his team are also studying "smarter" data centers where the AI workloads of multiple companies using the same computing equipment are flexibly adjusted to improve energy efficiency.

"By looking at the system as a whole, our hope is to minimize energy use as well as dependence on fossil fuels, while still maintaining reliability standards for AI companies and users," Deka says.

He and others at MITEI are building a flexibility model of a data center that considers the differing energy demands of training a deep-learning model versus deploying that model. Their hope is to uncover the best strategies for scheduling and streamlining computing operations to improve energy efficiency.

The researchers are also exploring the use of long-duration energy storage units at data centers, which store excess energy for times when it is needed. With these systems in place, a data center could use stored energy that was generated by renewable sources during a high-demand period, or avoid the use of diesel backup generators if there are fluctuations in the grid.

"Long-duration energy storage could be a game-changer here because we can design operations that really change the emission mix of the system to rely more on renewable energy," Deka says.

In addition, researchers at MIT and Princeton University are developing a software tool for investment planning in the power sector, called GenX, which could be used to help companies determine the ideal place to locate a data center to minimize environmental impacts and costs.
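The time-shifting Deka describes reduces to a simple optimization: run flexible jobs when forecast grid carbon intensity is lowest. A minimal sketch follows; the hourly intensity figures are invented for illustration, and a real scheduler would pull a forecast from the local grid operator:

```python
# Sketch: carbon-aware scheduling of a flexible AI workload.
# Hour of day -> forecast grid carbon intensity in gCO2 per kWh (made-up data).
forecast = {0: 420, 4: 390, 8: 310, 12: 180, 16: 260, 20: 410}

def best_start_hour(forecast: dict[int, int]) -> int:
    """Return the start hour with the lowest forecast carbon intensity."""
    return min(forecast, key=forecast.get)

hour = best_start_hour(forecast)
print(f"run deferrable workload at {hour:02d}:00 ({forecast[hour]} gCO2/kWh)")
```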
Location can have a big impact on reducing a data center's carbon footprint. For instance, Meta operates a data center in Lulea, a city on the coast of northern Sweden where cooler temperatures reduce the amount of electricity needed to cool computing hardware.

Thinking farther outside the box (way farther), some governments are even exploring the construction of data centers on the moon, where they could potentially be operated with nearly all renewable energy.

AI-based solutions

Currently, the expansion of renewable energy generation here on Earth isn't keeping pace with the rapid growth of AI, which is one major roadblock to reducing its carbon footprint, says Jennifer Turliuk MBA '25, a short-term lecturer, former Sloan Fellow, and former practice leader of climate and energy AI at the Martin Trust Center for MIT Entrepreneurship. The local, state, and federal review processes required for new renewable energy projects can take years.

Researchers at MIT and elsewhere are exploring the use of AI to speed up the process of connecting new renewable energy systems to the power grid. For instance, a generative AI model could streamline interconnection studies that determine how a new project will impact the power grid, a step that often takes years to complete.

And when it comes to accelerating the development and implementation of clean energy technologies, AI could play a major role. "Machine learning is great for tackling complex situations, and the electrical grid is said to be one of the largest and most complex machines in the world," Turliuk adds.

For instance, AI could help optimize the prediction of solar and wind energy generation or identify ideal locations for new facilities. It could also be used to perform predictive maintenance and fault detection for solar panels or other green energy infrastructure, or to monitor the capacity of transmission wires to maximize efficiency.

By helping researchers gather and analyze huge amounts of data, AI could also inform targeted policy interventions aimed at getting the biggest "bang for the buck" from areas such as renewable energy, Turliuk says.

To help policymakers, scientists, and enterprises consider the multifaceted costs and benefits of AI systems, she and her collaborators developed the Net Climate Impact Score, a framework that can be used to help determine the net climate impact of AI projects, considering emissions and other environmental costs along with potential environmental benefits in the future.

At the end of the day, the most effective solutions will likely result from collaborations among companies, regulators, and researchers, with academia leading the way, Turliuk adds.

"Every day counts. We are on a path where the effects of climate change won't be fully known until it is too late to do anything about it. This is a once-in-a-lifetime opportunity to innovate and make AI systems less carbon-intense," she says.

A Revolution in Tracking Life on Earth

A suite of technologies is helping taxonomists speed up species identification.

Across a Swiss meadow and into its forested edges, the drone dragged a jumbo-size cotton swab from a 13-foot tether. Along its path, the moistened swab collected scraps of life: some combination of sloughed skin and hair; mucus, saliva, and blood splatters; pollen flecks and fungal spores.

Later, biologists used a sequencer about the size of a phone to stream the landscape's DNA into code, revealing dozens upon dozens of species, some endangered, some invasive. The researchers never saw the wasps, stink bugs, or hawk moths whose genetic signatures they collected. But all of those, and many more, were out there.

The researchers, from the Swiss Federal Institute for Forest, Snow and Landscape Research, were field-testing a new approach to biodiversity monitoring, in this case to map insect life across different kinds of vegetation. They make up one of many teams now deploying a suite of technologies to track nature at a resolution and pace once unimaginable for taxonomists. "We know a lot more about what's happening," Camille Albouy, an environmental scientist at ETH Zurich and a member of the team, told me, "even if a lot still escapes us."

Today, autonomous robots collect DNA while state-of-the-art sequencers process genetic samples quickly and cheaply, and machine-learning algorithms detect life by sound or shape. These technologies are revolutionizing humanity's ability to catalog Earth's species, which are estimated to number 8 million—though perhaps far, far more—by illuminating the teeming life that so often eludes human observation. Only about 2.3 million species have been formally described. The rest are nameless and unstudied—part of what biologists call dark taxa.

Insects, for example, likely compose more than half of all animal species, yet most (an estimated four out of five) have never been recorded by science. From the tropics to the poles, on land and in water, they pollinate, prey, scavenge, burrow, and parasitize—an unobserved majority of life on Earth. "It is difficult to relate to nonspecialists how vast our ignorance truly is," an international consortium of insect scientists lamented in 2018. Valerio Caruso, an entomologist at the University of Padua, in Italy, studies scuttle flies, a skittering family containing an estimated 30,000 to 50,000 species. Only about 4,000 have been described, Caruso told me. "One lifetime is not enough to understand them all."

The minute distinctions within even one family of flies matter more than they might seem to: Species that look identical can occupy entirely different ecological niches—evading different predators and hunting different prey, parasitizing different hosts, pollinating different plants, decomposing different materials, or carrying different diseases. Each is a unique evolutionary experiment that might give rise to compounds that unlock new medicines, behaviors that offer agricultural solutions, and other adaptations that could further our understanding of how life persists.

Only with today's machines and technology do scientists stand a chance of keeping up with life's abundance. For most of history, humans have relied primarily on their eyes to classify the natural world: Observations of shape, size, and color helped Carl Linnaeus catalog about 12,000 species in the 18th century—a monumental undertaking, but a laughable fraction of reality.
Accounting for each creature demanded the meticulous labor of dehydrating, dissecting, mounting, pinning, labeling—essentially the main techniques available until the turn of the 21st century, when genetic sequencing allowed taxonomists to zoom in on DNA bar codes. Even then, those might not have identified specimens beyond genus or family.

Now technologies such as eDNA, high-throughput sequencing, autonomous robotics, and AI have broadened our vision of the natural world. They decode the genomes of fungi, bacteria, and yeasts that are difficult or impossible to culture in a lab. Specialized AI isolates species' calls from noisy recordings, translating air vibrations into an acoustic field guide. Others parse photo pixels to tease out variations in wing veins or bristles as fine as a dust mote to identify and classify closely related species. High-resolution 3-D scans allow researchers to visualize minuscule anatomies without lifting a scalpel. Other tools can map dynamic ecosystems as they transform in real time, tracking how wetlands contract and expand season by season or harnessing hundreds of millions of observations from citizen-science databases to identify species and map their shifting ranges.

One unassuming setup in a lush Panamanian rainforest involved a UV light luring moths to a white panel and a solar-powered camera that snapped a photo every 10 seconds, from dusk to dawn. In a single week, AI processed many thousands of images each night, in which experts detected 2,000 moth species—half of them unknown to science. "It breaks my heart to see people think science is about wrapping up the last details of understanding, and that all the big discoveries are done," David Rolnick, a computer scientist at McGill University and Mila - Quebec AI Institute, who was part of the expedition, told me. In Colombia, one of the world's most biodiverse countries, the combination of drone-collected data and machine learning has helped describe tens of thousands of species, 200 of which are new to science.

These tools' field of view is still finite. AI algorithms see only as far as their training data, and taxonomical data overrepresent the global North and charismatic organisms. In a major open-access biodiversity database, for example, less than 5 percent of the entries in recent years pertained to insects, while more than 80 percent related to birds (which account for less than 1 percent of named species). Because many dark taxa are absent from training data sets, even the most advanced image-recognition models work best as triage—rapidly sorting through familiar taxa and flagging likely new discoveries for human taxonomists to investigate, as in the sketch below.

AI systems "don't have intuition; they don't have creativity," said Rolnick, whose team co-created Antenna, a ready-to-use AI platform for ecologists. Human taxonomists are still better at imagining how a rare feature arose evolutionarily, or exploring the slight differences that can mark an entirely new species. And ultimately, every identification—whether by algorithm or DNA or human expert—still depends on people.

That human labor is also a dwindling resource, especially in entomology. "The number of people who are paid to be taxonomists in the world is practically nil," Rolnick said. And time is against them.
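That triage pattern is easy to express in code. Below is a hedged sketch, not the workflow of Antenna or any specific platform; the taxa, softmax outputs, and 0.9 confidence threshold are all hypothetical:

```python
# Sketch: confidence-based triage of classifier outputs. High-confidence
# specimens are labeled automatically; uncertain ones are flagged for a
# human taxonomist (possibly a new or "dark" taxon).
import numpy as np

def triage(probs: np.ndarray, labels: list[str], threshold: float = 0.9):
    """Route each specimen to auto-labeling or human review."""
    routed = []
    for p in probs:
        top = int(np.argmax(p))
        if p[top] >= threshold:
            routed.append(("auto", labels[top], float(p[top])))
        else:
            routed.append(("human review", None, float(p[top])))
    return routed

# Softmax outputs for three specimens over three known taxa (made-up numbers)
probs = np.array([[0.97, 0.02, 0.01],
                  [0.40, 0.35, 0.25],   # ambiguous: flag for review
                  [0.05, 0.93, 0.02]])
print(triage(probs, ["taxon A", "taxon B", "taxon C"]))
```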
The world’s largest natural-history museums hold a wealth of specimens and objects (more than 1 billion, according to one study) yet only a fraction of those have digitally accessible records, and genomic records are accessible for just 0.2 percent of biological specimens. Many historical collections—all those drawers packed with pinned, flattened, and stuffed specimens; all those jars of floating beings—are chronically underfunded, and their contents are vulnerable to the physical consequences of neglect. Preservation fluids evaporate, poor storage conditions invite pests and mold, and DNA degrades until it is unsequenceable.Today’s tools are still far from fully capturing the extent and complexity of Earth’s biodiversity, and much of that could vanish before anyone catalogs it. “We are too few, studying too many things,” Caruso, the Padua entomologist, said. Many liken taxonomy to cataloging an already burning library. As Mehrdad Hajibabaei, chief scientific officer for the Center for Biodiversity Genomics at the University of Guelph, in Canada, told me: “We’re not stamp-collecting here.” Taxonomists are instead working to preserve a planetary memory—an archive of life—and to decode which traits help creatures adapt, migrate, or otherwise survive in a rapidly changing climate.The climate crisis is unraveling the life cycles of wildlife around the world—by one estimate, for about half of all species. Flowers now bloom weeks before pollinators stir; fruit withers before migrating birds can reach it. Butterflies attuned to rainfall falter in drought. Tropical birds and alpine plants climb toward cooler, though finite, mountaintops. Fish slip farther out to sea; disease-carrying mosquitoes ride the heat into new territories. Extreme weather at the poles stresses crucial moss and lichen, and shreds entire habitats in hours. Mass die-offs are now routine.“Once you lose one species, you’ll probably lose more species,” Caruso said. “Over time, everything is going to collapse.” One in eight could vanish by century’s end—many of them dark taxa, lost before we ever meet them. Most countries—and global bodies such as the International Union for Conservation of Nature—cannot assess, and therefore cannot protect, unnamed organisms. As Edward O. Wilson told Time in 1986: “It’s like having astronomy without knowing where the stars are.”Today’s machine-assisted taxonomy faces the same problem Linnaeus did: Nature’s complexity still far outstrips human insight, even with machines’ assistance. “We don’t perceive the world as it is in all its chaotic glory,” the biologist Carol Kaesuk Yoon wrote in her 2010 book, Naming Nature. “We sense a very particular subset of what surrounds us, and we see it in a particularly human way.” On the flip side, every new data point sharpens the predictive models guiding conservation, says Evgeny Zakharov, genomics director for the Center for Biodiversity Genomics. “The more we know about the world, the more power we have to properly manage and protect it,” he told me. With tools, the speed of taxonomists’ work is accelerating, but so is the countdown—they will take all the help they can get.

A beacon of light

A lantern created in the Design Intelligence Lab demonstrates sustainable alternatives for consumer electronics.

Placing a lit candle in a window to welcome friends and strangers is an old Irish tradition that took on greater significance when Mary Robinson was elected president of Ireland in 1990. At the time, Robinson placed a lamp in Áras an Uachtaráin — the official residence of Ireland's presidents — noting that the Irish diaspora and all others are always welcome in Ireland. Decades later, a lit lamp remains in a window in Áras an Uachtaráin.

The symbolism of Robinson's lamp was shared by Hashim Sarkis, dean of the MIT School of Architecture and Planning (SA+P), at the school's graduation ceremony in May, where Robinson addressed the class of 2025. To replicate the generous intentions of Robinson's lamp and commemorate her visit to MIT, Sarkis commissioned a unique lantern as a gift for Robinson. He commissioned an identical one for his office, which is in the front portico of MIT at 77 Massachusetts Ave.

"The lamp will welcome all citizens of the world to MIT," says Sarkis.

No ordinary lantern

The bespoke lantern was created by Marcelo Coelho SM '08, PhD '12, director of the Design Intelligence Lab and associate professor of the practice in the Department of Architecture. One of several projects in the Geolectric research at the Design Intelligence Lab, the lantern showcases the use of geopolymers as a sustainable material alternative for embedded computers and consumer electronics.

"The materials that we use to make computers have a negative impact on climate, so we're rethinking how we make products with embedded electronics — such as a lamp or lantern — from a climate perspective," says Coelho.

Consumer electronics rely on materials that are high in carbon emissions and difficult to recycle. As the demand for embedded computing increases, so too does the need for alternative materials that have a reduced environmental impact while supporting electronic functionality.

The Geolectric lantern advances the formulation and application of geopolymers — a class of inorganic materials that form covalently bonded, non-crystalline networks. Unlike traditional ceramics, geopolymers do not require high-temperature firing, allowing electronic components to be embedded seamlessly during production. Geopolymers are similar to ceramics, but have a lower carbon footprint and present a sustainable alternative for consumer electronics, product design, and architecture. The minerals Coelho uses to make the geopolymers — aluminum silicate and sodium silicate — are those regularly used to make ceramics.

"Geopolymers aren't particularly new, but are becoming more popular," says Coelho. "They have high strength in both tension and compression, superior durability, fire resistance, and thermal insulation. Compared to concrete, geopolymers don't release carbon dioxide. Compared to ceramics, you don't have to worry about firing them. What's even more interesting is that they can be made from industrial byproducts and waste materials, contributing to a circular economy and reducing waste."

The lantern is embedded with custom electronics that serve as a proximity and touch sensor. When a hand is placed over the top, light shines down the glass tubes.

The timeless design of the Geolectric lantern — minimalist, composed of natural materials — belies its future-forward function. Coelho's academic background is in fine arts and computer science. Much of his work, he says, "bridges these two worlds."

Working with Coelho on the lanterns at the Design Intelligence Lab are Jacob Payne, a graduate architecture student, and Jean-Baptiste Labrune, a research affiliate.

A light for MIT

A few weeks before commencement, Sarkis saw the Geolectric lantern in Palazzo Diedo Berggruen Arts and Culture in Venice, Italy. The exhibition, a collateral event of the Venice Biennale's 19th International Architecture Exhibition, featured the work of 40 MIT architecture faculty.

The sustainability of Geolectric is the key reason Sarkis regarded the lantern as the perfect gift for Robinson. After her career in politics, Robinson founded the Mary Robinson Foundation — Climate Justice, an international center addressing the impacts of climate change on marginalized communities.

The third iteration of Geolectric for Sarkis' office is currently underway. While the lantern was a technical prototype and an opportunity to showcase his lab's research, Coelho — an immigrant from Brazil — was profoundly touched by how Sarkis created the perfect symbolism to both embody the welcoming spirit of the school and honor President Robinson.

"When the world feels most fragile, we need to urgently find sustainable and resilient solutions for our built environment. It's in the darkest times when we need light the most," says Coelho.

74 countries have now ratified a landmark treaty to protect the high seas. Why hasn’t NZ?

The High Seas Treaty comes into force in January. New Zealand lags behind on several fronts, including marine protection and recognition of Māori customary rights.

The ratification by more than 60 states, the minimum required to turn the Agreement on Biodiversity Beyond National Jurisdiction (better known as the High Seas Treaty) into law, means it will enter into force on January 17. The treaty covers nearly two-thirds of the ocean – an area of sea and seabed outside the national jurisdiction of any country, which has come under growing pressure from mining, fishing and geoengineering interests, with climate change a compounding factor.

The High Seas Treaty sits under the United Nations Convention on the Law of the Sea, which New Zealand ratified in 1996. This established the international legal framework governing the marine environment within each country's jurisdiction, including the territorial sea, exclusive economic zone (EEZ) and continental shelf. New Zealand's EEZ is the fifth largest in the world and 15 times its landmass.

The objective of the High Seas Treaty is to ensure the conservation and sustainable use of marine biological diversity beyond national jurisdiction – where the seabed and its resources are the "common heritage of humankind". It addresses four main issues: marine genetic resources and benefit sharing, marine protection, environmental impact assessments, and technology transfer.

New Zealand is the last country reported to be bottom trawling in the South Pacific high seas for species such as the long-lived orange roughy. It also has ambitions to allow seabed mining in its own waters. The High Seas Treaty is drawing much-needed attention to New Zealand's approach to ocean governance, both at home and on the world stage.

What this means for NZ

New Zealand was an active participant in the drafting of the High Seas Treaty and an early signatory in September 2023. A total of 74 nations have now ratified it, but New Zealand is not one of them.

The deep seafloor beneath much of the high seas includes various habitats with rich biodiversity, much of it undescribed. Bottom trawling uses large nets to scrape the seafloor. The bycatch can include deepwater corals and sponges, which destroys the habitat of fish and other species.

While the High Seas Treaty doesn't directly regulate extractive activities such as fishing and mining in the high seas and deep seabed, it has implications for their exercise. International organisations such as the International Seabed Authority and regional fisheries management groups regulate mining and fisheries, respectively. But new international institutions will be established to enforce compliance with the High Seas Treaty, including to establish marine protected areas in support of the Global Biodiversity Framework's goal of protecting 30% of the ocean by 2030. The treaty also requires new activities in the high seas and deep seabed – aquaculture, geoengineering or seabed mining – to undergo an evaluation of environmental impacts.

A beacon for best-practice ocean governance

The High Seas Treaty reflects contemporary international legal consensus on best-practice ocean governance. Its guiding principles include:

- those who pollute marine areas should bear the costs of managing the issue
- any benefits flowing from marine resources should be shared equitably (including with Indigenous peoples)
- states should take a precautionary approach to marine uses where their effects are not well understood
- states should take an ecosystem-based and integrated approach to ocean management
- states should use an ocean-governance approach that builds resilience to climate change and recognises the ocean's role in the global carbon cycle
- states should use the best available science and traditional knowledge in ocean governance and respect the rights of Indigenous peoples.

These principles align with broader ocean-focused initiatives as part of the UN Decade of Ocean Science for Sustainable Development and the sustainable development goals, which signals a growing awareness of the need to improve how ocean resources are managed.

In New Zealand, international law is not directly enforceable in the courts unless incorporated into domestic legislation. But the courts can refer to international treaties when interpreting domestic legislation. This happened when the Supreme Court used the Law of the Sea Convention to direct decision-makers to take a precautionary and ecosystem-based approach to approving seabed mining within New Zealand's EEZ, based on science, tikanga and mātauranga Māori.

The High Seas Treaty also reflects the unequivocal international recognition that states, including New Zealand, have obligations under international law to reduce the impacts of climate change on marine areas, reduce pollution and support the restoration of the ocean.

However, New Zealand lags behind other countries in the protection of marine biodiversity. The government has delayed marine protection legislation in the Hauraki Gulf and proposed the removal of a requirement for cameras on fishing industry boats. It has also increased catch limits for some commercial fish species, but reduced them for orange roughy after being taken to court by environmental advocates. It has also opened up seabed mining to the fast-track consenting regime, despite a failure to meet basic standards for environmental impact assessment. And it is proposing to rework the coastal policy statement to enable the use and development of the coastal environment for "priority activities" such as aquaculture, resource extraction and energy generation.

Time for NZ to show ocean leadership

Ocean advocates and scientists have repeatedly called for reform of New Zealand's highly fragmented and outdated oceans governance frameworks.

The international call to states to uphold the rights of Indigenous peoples stands in stark contrast to the New Zealand government's recent track record on Māori marine and coastal rights and interests. The courts recently overturned government policies that failed to uphold Māori fishing rights protected by Treaty of Waitangi settlements. But the government nevertheless plans legal changes that would further undermine Māori customary rights in marine and coastal areas.

Upholding Māori rights in line with international law is not just an obligation but an opportunity. Iwi and hapū Māori have significant knowledge to contribute to the management of the ocean.

It is high time for New Zealand to show leadership on oceans policy on the global stage by ratifying the High Seas Treaty. But it is just as important to look after matters within domestic waters, aligning fragmented and outdated marine laws with global best practice in ocean governance.

Elizabeth Macpherson receives funding from Te Apārangi the Royal Society of New Zealand. Conrad Pilditch receives funding from the Department of Conservation, MBIE, regional councils and PROs. He is affiliated with the Mussel Reef Restoration Trust and the Whangateau Catchment Collective. Karen Fisher receives funding from MBIE and the Government of Canada's New Frontiers in Research Fund (NFRF). Simon Francis Thrush receives funding from MBIE, the Marsden Fund, the EU and philanthropic sources.

Bills Target Crucitas Gold Mining Mess in Costa Rica


Crucitas ranks among Costa Rica's most severe environmental setbacks. Illegal gold mining has ravaged the area for years, bringing crime, community unrest, water pollution, and deaths among those risking their lives in unauthorized operations. The once-rich natural zone now shows clear signs of decline, with forests cleared and rivers tainted by chemicals.

Recent events highlight the ongoing trouble. Just this month, authorities detained five Nicaraguans for illegal mining, and earlier, two young brothers from Nicaragua died when a tunnel collapsed on them. Rescue teams recovered their bodies after hours of work, a grim reminder of the dangers. These incidents add to a long list of fatalities, as people cross borders chasing gold amid poverty.

Lawmakers in the Legislative Assembly are pushing several bills to tackle this mess. The government's plan stands out—it would permit gold exploration and extraction in Crucitas to curb the chaos from illegal activities. The Alajuela Commission gave it a green light on September 11 with an 8-1 vote, sending it to the full assembly for debate. It awaits scheduling, and motions could still alter it. Supporters argue that regulated mining would bring order, generate jobs, and fund cleanup, but critics question the fit with Costa Rica's eco-friendly reputation.

Open-pit methods, which the bill would allow under strict rules, carry heavy costs. They strip away land, wipe out habitats, and reduce plant and animal diversity. Air gets dusty, water sources shift or get contaminated, and noise drives away wildlife. Communities nearby face health risks from pollutants, as seen already in Crucitas, where mercury and cyanide have seeped into streams. Despite bans since 2010, illegal digs persist, often tied to organized groups, making the site a hotspot for violence and smuggling.

Another bill, backed by the Frente Amplio party and the Civic Environmental Parliament, takes a different path. It proposes a Sustainable Development Hub for the Huetar Norte region, focusing on recovery without mining. At its core is the Crucitas International Environmental Geopark, covering wooded hills between Fortuna and Botija. A natural and historical museum would join it, highlighting the area's past and ecology. This approach draws from UNESCO geoparks, with 13 already in Latin America, including one in Nicaragua. Costa Rica's planning ministry has approved a similar site in Rio Cuarto. The idea is to protect resources while allowing research and low-key recreation. No gold digging would be permitted—an approach that aligns with the country's green identity.

The hub would put the National System of Conservation Areas in charge of oversight. Locals could run small-scale businesses with support from the Development Bank and rural agencies. Educational programs through the National Learning Institute and universities would train people, creating opportunities on the ground. Tax breaks aim to attract private projects that fit the goals, like eco-tourism or studies.

A key part involves cleaning up the damage. Remediation targets the toxins left behind, aiming to restore soil and water. Some still push for mining as the fix, claiming it would stop illegal operations and boost the economy, but that ignores the added harm to an already battered spot.

The debate boils down to priorities: quick cash from gold versus long-term protection. Costa Rica has built its image on sustainability, drawing tourists to parks and beaches. Reopening to mining could shift that, while the hub option builds on strengths in conservation.
As bills move forward, locals watch closely, hoping for a solution that heals rather than harms.

The first animals on Earth may have been sea sponges, study suggests

MIT researchers traced chemical fossils in ancient rocks to the ancestors of modern-day demosponges.

A team of MIT geochemists has unearthed new evidence in very old rocks suggesting that some of the first animals on Earth were likely ancestors of the modern sea sponge.

In a study appearing today in the Proceedings of the National Academy of Sciences, the researchers report that they have identified “chemical fossils” that may have been left by ancient sponges in rocks that are more than 541 million years old. A chemical fossil is a remnant of a biomolecule that originated from a living organism and has since been buried, transformed, and preserved in sediment, sometimes for hundreds of millions of years.

The newly identified chemical fossils are special types of steranes, which are the geologically stable form of sterols, such as cholesterol, that are found in the cell membranes of complex organisms. The researchers traced these special steranes to a class of sea sponges known as demosponges. Today, demosponges come in a huge variety of sizes and colors, and live throughout the oceans as soft and squishy filter feeders. Their ancient counterparts may have shared similar characteristics.

“We don’t know exactly what these organisms would have looked like back then, but they absolutely would have lived in the ocean, they would have been soft-bodied, and we presume they didn’t have a silica skeleton,” says Roger Summons, the Schlumberger Professor of Geobiology Emeritus in MIT’s Department of Earth, Atmospheric and Planetary Sciences (EAPS).

The group’s discovery of sponge-specific chemical fossils offers strong evidence that the ancestors of demosponges were among the first animals to evolve, and that they likely did so much earlier than the rest of Earth’s major animal groups.

The study’s authors, including Summons, are lead author and former MIT EAPS Crosby Postdoctoral Fellow Lubna Shawar, who is now a research scientist at Caltech, along with Gordon Love of the University of California at Riverside, Benjamin Uveges of Cornell University, Alex Zumberge of GeoMark Research in Houston, Paco Cárdenas of Uppsala University in Sweden, and José-Luis Giner of the State University of New York College of Environmental Science and Forestry.

Sponges on steroids

The new study builds on findings that the group first reported in 2009. In that study, the team identified the first chemical fossils that appeared to derive from ancient sponges. They analyzed rock samples from an outcrop in Oman and found a surprising abundance of steranes that they determined were the preserved remnants of 30-carbon (C30) sterols, a rare form of steroid that they showed was likely derived from ancient sea sponges.

The steranes were found in very old rocks that formed during the Ediacaran Period, which spans from roughly 635 million to about 541 million years ago. This period took place just before the Cambrian, when the Earth experienced a sudden and global explosion of complex multicellular life.
The team’s discovery suggested that ancient sponges appeared much earlier than most multicellular life, and were possibly among Earth’s first animals.

However, soon after these findings were released, alternative hypotheses swirled to explain the C30 steranes’ origins, including that the chemicals could have been generated by other groups of organisms or by nonliving geological processes.

The team says the new study reinforces their earlier hypothesis that ancient sponges left behind this special chemical record, as they have identified a new chemical fossil in the same Precambrian rocks that is almost certainly biological in origin.

Building evidence

Just as in their previous work, the researchers looked for chemical fossils in rocks dating back to the Ediacaran Period. They acquired samples from drill cores and outcrops in Oman, western India, and Siberia, and analyzed the rocks for signatures of steranes, the geologically stable form of sterols found in all eukaryotes (plants, animals, and any organism with a nucleus and membrane-bound organelles).

“You’re not a eukaryote if you don’t have sterols or comparable membrane lipids,” Summons says.

A sterol’s core structure consists of four fused carbon rings. Additional carbon side chains and chemical add-ons can attach to and extend a sterol’s structure, depending on what an organism’s particular genes can produce. In humans, for instance, the sterol cholesterol contains 27 carbon atoms, while the sterols in plants generally have 29 carbon atoms.

“It’s very unusual to find a sterol with 30 carbons,” Shawar says.

The chemical fossil the researchers identified in 2009 was a 30-carbon sterol. What’s more, the team determined that the compound could be synthesized thanks to a distinctive enzyme encoded by a gene that is common to demosponges.

In their new study, the team focused on the chemistry of these compounds and realized the same sponge-derived gene could produce an even rarer sterol, with 31 carbon atoms (C31). When they analyzed their rock samples for C31 steranes, they found them in surprising abundance, along with the aforementioned C30 steranes.

“These special steranes were there all along,” Shawar says. “It took asking the right questions to seek them out and to really understand their meaning and from where they come.”

The researchers also obtained samples of modern-day demosponges and analyzed them for C31 sterols. They found that the sterols, the biological precursors of the C31 steranes found in rocks, are indeed present in some species of contemporary demosponges.

Going a step further, they chemically synthesized eight different C31 sterols in the lab as reference standards to verify their chemical structures. Then they processed the molecules in ways that simulate how the sterols would change when deposited, buried, and pressurized over hundreds of millions of years. They found that the products of only two of these sterols were an exact match with the form of the C31 steranes found in the ancient rock samples. The presence of those two, and the absence of the other six, demonstrates that the compounds were not produced by a random nonbiological process, which would be expected to leave a mixture of all eight.

The findings, reinforced by multiple lines of inquiry, strongly support the idea that the steranes found in the ancient rocks were indeed produced by living organisms, rather than through geological processes.
What’s more, those organisms were likely the ancestors of demosponges, which to this day have retained the ability to produce the same series of compounds.

“It’s a combination of what’s in the rock, what’s in the sponge, and what you can make in a chemistry laboratory,” Summons says. “You’ve got three supportive, mutually agreeing lines of evidence, pointing to these sponges being among the earliest animals on Earth.”

“In this study we show how to authenticate a biomarker, verifying that a signal truly comes from life rather than contamination or non-biological chemistry,” Shawar adds.

Now that the team has shown C30 and C31 sterols are reliable signals of ancient sponges, they plan to look for these chemical fossils in ancient rocks from other regions of the world. From the rocks sampled so far, they can tell only that the sediments, and the sponges, formed sometime during the Ediacaran Period. With more samples, they will have a chance to home in on when some of the first animals took form.

This research was supported, in part, by the MIT Crosby Fund, the Distinguished Postdoctoral Fellowship program, the Simons Foundation Collaboration on the Origins of Life, and the NASA Exobiology Program.

