
Too Much Trust in AI Poses Unexpected Threats to the Scientific Process

News Feed
Monday, March 18, 2024


It’s vital to “keep humans in the loop” to avoid humanizing machine-learning models in research

Machine-learning models are quickly becoming common tools in scientific research. These artificial intelligence systems are helping bioengineers discover new potential antibiotics, veterinarians interpret animals’ facial expressions, papyrologists read words on ancient scrolls, mathematicians solve baffling problems and climatologists predict sea-ice movements. Some scientists are even probing large language models’ potential as proxies or replacements for human participants in psychology and behavioral research. In one recent example, computer scientists ran ChatGPT through the conditions of the Milgram shock experiment—the famous study on obedience in which people gave what they believed were increasingly painful electric shocks to an unseen person when told to do so by an authority figure—and other well-known psychology studies. The artificial intelligence model responded much as humans did—75 percent of simulated participants administered shocks of 300 volts and above.

But relying on these machine-learning algorithms also carries risks. Some of those risks are commonly acknowledged, such as generative AI’s tendency to spit out occasional “hallucinations” (factual inaccuracies or nonsense). Artificial intelligence tools can also replicate and even amplify human biases about characteristics such as race and gender. And the AI boom, which has given rise to complex, trillion-variable models, requires water- and energy-hungry data centers that likely have high environmental costs.

One big risk is less obvious, though potentially very consequential: humans tend to automatically attribute a great deal of authority and trust to machines. This misplaced faith could cause serious problems when AI systems are used for research, according to a paper published in early March in Nature.


“These tools are being anthropomorphized and framed as humanlike and superhuman. We risk inappropriately extending trust to the information produced by AI,” says the new paper’s co-author Molly Crockett, a cognitive psychologist and neuroscientist at Princeton University. AI models are human-made products, and they “represent the views and positions of the people who developed them,” says Lisa Messeri, a Yale University sociocultural anthropologist who worked with Crockett on the paper. Scientific American spoke with both researchers to learn more about the ways scientists use AI—and the potential effects of trusting this technology too much.

[An edited transcript of the interview follows.]

Why did you write this paper?

LISA MESSERI: [Crockett] and I started seeing and sharing all sorts of large, lofty promises of what AI could offer the scientific pipeline and scientific community. When we really started to think we needed to write something was when we saw claims that large language models could become substitutions for human subjects in research. These claims, given our years of conversation, seemed wrong-footed.

MOLLY CROCKETT: I have been using machine learning in my own research for several years, [and] advances in AI are enabling scientists to ask questions we couldn’t ask before. But, as I’ve been doing this research and observing that excitement among colleagues, I have developed a sense of uneasiness that’s been difficult to shake.

Beyond using large language models to replace human participants, how are scientists thinking about deploying AI?

CROCKETT: Previously we helped write a response to a study in [Proceedings of the National Academy of Sciences USA] that claimed machine learning could be used to predict whether research would [be replicable] just from the words in a paper.... That struck us as technically implausible. But more broadly, we’ve discovered that scientists are talking about using AI tools to make their work more objective and to be more productive.

We found that both of those goals are quite risky and open up scientists to producing more while understanding less. The worry is that we’re going to think that these tools are helping us to understand the world better, when in reality they might actually be distorting our view.

MESSERI: We categorize the AI uses we observed in our review into four categories: the Surrogate, the Oracle, the Quant and the Arbiter. The Surrogate is what we’ve already discussed—it replaces human subjects. The Oracle is an AI tool that is asked to synthesize the existing corpus of research and produce something, such as a review or new hypotheses. The Quant is AI that is used by scientists to process the intense amount of data out there—maybe produced by those machine surrogates. AI Arbiters are like [the tools described] in the [PNAS] replication study [Crockett] mentioned, tools for evaluating and adjudicating research. We call these visions for AI because they’re not necessarily being executed today in a successful or clean way, but they’re all being explored and proposed.

For each of these uses, you’ve pointed out that even if AI’s hallucinations and other technical problems are solved, risks remain. What are those risks?

CROCKETT: The overarching metaphor we use is this idea of monoculture, which comes from agriculture. Monocultures are very efficient. They improve productivity. But they’re vulnerable to being invaded by pests or disease; you’re more likely to lose the whole crop when you have a monoculture versus a diversity of what you’re growing. Scientific monocultures, too, are vulnerable to risks such as errors propagating throughout the whole system. This is especially the case with the foundation models in AI research, where one infrastructure is being used and applied across many domains. If there’s some error in that system, it can have widespread effects.

We identify two kinds of scientific monocultures that can arise with widespread AI adoption. The first is the monoculture of knowing. AI tools are only suited to answer certain kinds of questions. Because these tools boost productivity, the overall set of research questions being explored could become tailored to what AI is good at.

Then there’s the monoculture of the knower, where AI tools come to replace human thinkers. And because AI tools have a specific standpoint, this eliminates the diversity of different human perspectives from research production. When you have many different kinds of minds working on a scientific problem, you’re more likely to spot false assumptions or missed opportunities.

Both monocultures could lead to cognitive illusions.

What do you mean by illusions?

MESSERI: One example that’s already out there in psychology is the illusion of explanatory depth. Basically, when someone in your community claims they know something, you tend to assume you know that thing as well.

In your paper you cite research demonstrating that using a search engine can trick someone into believing they know something—when really they only have online access to that knowledge. And students who use AI assistant tools to respond to test questions end up thinking they understand a topic better than they do.

MESSERI: Exactly. Building off that one illusion of explanatory depth, we also identify two others. First, the illusion of exploratory breadth, where someone thinks they’re examining more than they are: There are an infinite number of questions we could ask about science and about the world. We worry that with the expansion of AI, the questions that AI is well suited to answer will be mistaken for the entire field of questions one could ask. Then there’s the risk of an illusion of objectivity. Either there’s an assumption that AI represents all standpoints or there’s an assumption that AI has no standpoint at all. But at the end of the day, AI tools are created by humans coming from a particular perspective.

How can scientists avoid falling into these traps? How can we mitigate these risks?

MESSERI: There’s the institutional level where universities and publishers dictate research. These institutions are developing partnerships with AI companies. We have to be very circumspect about the motivations behind that.... One mitigation strategy is just to be incredibly forthright about where the funding for AI is coming from and who benefits from the work being done on it.

CROCKETT: At the institutional level, funders, journal editors and universities can be mindful of developing a diverse portfolio of research to ensure that they’re not putting all the resources into research that uses a single AI approach. In the future, it might be necessary to consciously protect resources for the kinds of research that can’t be addressed with AI tools.

And what sort of research is that?

CROCKETT: Well, as of right now, AI cannot think like a human. Any research about human thought and behavior, and also qualitative research, is not addressable with AI tools.

Would you say that in the worst-case scenario, AI poses an existential threat to human scientific knowledge production? Or is that an overstatement?

CROCKETT: I don’t think that it’s an overstatement. I think we are at a crossroads around how we decide what knowledge is and how we proceed in the endeavor of knowledge production.

Is there anything else you think is important for the public to really understand about what’s happening with AI and scientific research?

MESSERI: From the perspective of reading media coverage of AI, it seems as though this is some preordained, inevitable “evolution” of scientific and technical development. But as an anthropologist of science and technology, I would really like to emphasize that science and tech don’t proceed in an inevitable direction. It is always human-driven. These narratives of inevitability are themselves a product of human imagination and come from mistaking the desire by some to be a prophecy for all. Everyone, even nonscientists, can be part of questioning this narrative of inevitability by imagining the different futures that might come true instead.

CROCKETT: Being skeptical about AI in science doesn’t require being a hater of AI in science and technology. We love science. I’m excited about AI and its potential for science. But just because an AI tool is being used in science does not mean that it is automatically better science.

As scientists, we are trained to deny our humanness. We’re trained that human experience, bias and opinion have no place in the scientific method. The future of autonomous, AI “self-driving” labs is the pinnacle of realizing that sort of training. But increasingly we are seeing evidence that diversity of thought, experience and training in humans that do the science is vital for producing robust, innovative and creative knowledge. We don’t want to lose that. To keep the vitality of scientific knowledge production, we need to keep humans in the loop.


How communities are giving new life to polluted land

Cleaning up contaminated land is a struggle. Meet some of the community leaders who are taking matters into their own hands.

The vision

The tarp shade snaps and flutters in the breeze above the harvest volunteers. The CSA is bountiful this spring — crunchy lettuce, sweet strawberries, and even some cherries from the new windbreak. Stores around here sell produce this tasty, organic, and local for a fortune, but our volunteers feed families on it for just an hour of work a week and some dirty fingernails. The model is spreading. Vacant lots and brownfield sites all over the city have started sprouting biodigesters and sunflower fields, compost vessels and prairie plantings, communities of care: the phytoremediating foot soldiers of food sovereignty in recovery.

— a drabble by Looking Forward reader Betsy Ruckman

The spotlight

There are more than 450,000 brownfield sites across the U.S. — previously developed parcels of land that have been left abandoned, with some form of contamination. They may be former industrial facilities, gas stations, mines, landfills, dumping sites. And before they can be reused, they need to be cleaned, or remediated.

So, what does that mean exactly? It’s not as if a bunch of volunteers can go out with sponges and buckets of soapy water to rid the land of pollution. Brownfield remediation may involve a number of tactics — like digging up contaminated soil and carting it offsite for safe disposal or treatment; putting some sort of barrier between the contaminated ground and whatever’s going to be built on top of it; injecting chemicals or microbes into the soil that can break down harmful substances; or planting plants that can suck them up. Using plants to treat pollution is known as phytoremediation, and it’s been used successfully to remediate heavy metals, petroleum, fertilizer runoff, and even radioactive elements in the wake of the Chernobyl disaster.

Any of these efforts can be costly and time-consuming, which is why brownfields often sit idle for years or even decades (although President Biden’s bipartisan infrastructure law mobilized billions of dollars to address a backlog of Superfund sites, brownfields, and abandoned oil and gas wells). Still, despite high barriers, some communities have taken their own initiative to remediate brownfields and return the land to use.

Looking Forward reader Betsy Ruckman, who submitted the drabble above, described “phytoremediating foot soldiers” — envisioning a world in which small-scale, community-led remediation efforts give rise to a patchwork of healthy and communal green spaces. “I was inspired by the Chicago-area Green Era Campus,” Betsy said, which is “turning brownfields into productive organics-recycling hubs, teaching gardens, and productive fields.” The Green Era Campus is just one example of how community leaders are taking remediation efforts into their own hands, investing in returning land to beneficial uses and testing strategies for dealing with some of the toughest soil contaminants.

Revitalizing an abandoned lot on Chicago’s South Side

Erika Allen has been working in urban agriculture for over two decades. “I was at that time, and still, really focused on juvenile justice diversion,” she said. She began with art therapy as an intervention that could keep young people from going down a path toward incarceration, but quickly realized that the food system offered a more promising opportunity — “because it’s also economic,” she said. “We can all grow food and consume it and sell it and create other products.”

In 2002, she founded the Chicago chapter of Growing Power, which has since reorganized as Urban Growers Collective — an organization focused on food security, job training, and community engagement through farming. While working on other growing projects, Allen and other partners began to develop a vision for a multidimensional site that could be a hub for energy development, composting, education, community events, and of course, growing food: the Green Era Campus.

In 2015, the team acquired a 9-acre piece of abandoned land in the neighborhood of Auburn Gresham that had formerly been used as an impound lot. They bought it from the city for just $1. “Everybody on the South Side knows the space because if you had your car towed, it was usually towed to this place,” Allen said. Prior to that, it was owned by a manufacturer of agricultural equipment.

[Image: The site of the Green Era Campus, before remediation efforts began. Courtesy of Green Era Chicago]

The site had a mixture of contaminants, including petroleum and motor oil from the cars that had been held there and debris from illegal dumping. The crew also discovered a submerged tank that was still filled with linseed oil, dating back to the manufacturing days. “That was a surprise,” Allen said.

The remediation process took years. After being denied once, the team was awarded an EPA grant for brownfield remediation in 2017. They contracted the environmental firm Terracon to lead the remediation efforts, which included a variety of strategies. Some of the remediation tactics were tailored to the variety of intended uses for the site. “We built on top of some of the contamination, so a lot of it was treated on-site,” Allen said. For instance, one of the key elements of the campus is a commercial-scale anaerobic digester to process food waste from local restaurants, manufacturers, and residents (which may have inspired the “biodigesters” referenced in Betsy’s drabble). That facility is built on top of concrete, which acts as a barrier. In other places, the team excavated contaminated soil and brought in clean soil to replace it.

The price tag was ultimately in the millions. “It was absolutely astronomical,” said Jason Feldman, co-founder of Green Era Sustainability, an investment entity that is one of several partner organizations working on the project. “The cost of the cleanup was more than the value of the land, even if it was clean.” But, Feldman and Allen stressed, they see that investment in the land as a key value that the project is bringing to the community, flipping a narrative of chronic disinvestment. “We were able to figure it out and innovate and educate the community, but also take away the extreme expense of what the community is required to do to be able to address environmental racism that created those issues in the first place,” Allen said.

[Image: The new-and-improved Green Era Campus. Courtesy of Green Era Chicago]

Although phytoremediation was the focus of Betsy’s vision, it wouldn’t have been appropriate for the Green Era site — plant roots, while amazing, can’t clear away debris or a submerged tank. But Feldman says he’s interested in that approach for future projects and partnerships. The Green Era team also hopes that the compost produced at their facility might be able to help out other remediation projects in the future. “One of the biggest costs of the whole remediation process at the Green Era Campus was actually bringing in the clean soil,” Feldman said. “Which is a resource that’s being taken from one place and brought to another place. We could use this beautiful, nutrient-rich material that we’ve got to help remediate other sites.” He’d also like to see the compost be used to support urban reforestation efforts, which he views as a form of phytoremediation of city soil, even in places that aren’t designated as brownfields.

As far as the campus goes, phase one is now officially complete. The site is clean and the digester is built and operating, creating compost and capturing gas that is already being sent to the grid as energy. “I’m in the process of raising the funds to build the rest of the campus,” Allen said, which will include a vertical farm, a community education center, a plant nursery and produce store, and a stormwater mitigation area.

Cleaning up PFAS in the northeast corner of Maine

Halfway across the country, another group of community partners is testing the limits of phytoremediation on one of the most pernicious substances in the environment. In 2009, the Mi’kmaq Nation in Maine acquired around 800 acres of land that had been part of the Loring Air Force Base. Due to contamination from fuel, pesticides, on-site landfills, and other hazards, the base was declared a Superfund site in 1990 (four years before it officially closed, and one year before the Mi’kmaq tribe received federal recognition). Superfund sites differ from brownfields in their level of contamination, and because of the hazard they pose to human health, the federal government is obligated to clean them. Loring Air Force Base remains on the EPA’s National Priorities List, but some efforts to date have included capping the former landfills and removing low-level radioactive waste from nuclear weapons operations.

But the tribe discovered there was another contaminant on their new land: per- and polyfluoroalkyl substances, or PFAS. Sometimes known as “forever chemicals,” PFAS are a class of toxic chemicals that have been used in a huge variety of industrial and household items since the 1940s, though in recent years, governments have taken steps to regulate them due to mounting evidence linking the chemicals to health issues. PFAS are often found on military bases, in the residue of firefighting foams.

Meanwhile, Chelli Stanley, the founder of a small environmental remediation organization called Upland Grassroots, was studying ways that hemp might be used to clean toxic substances from polluted ground. “It’s a bioaccumulator, it’s very versatile in phytoremediation, in that it can take up a lot of different chemicals,” Stanley said of the plant. “Once it became legalized, I just started reaching out to people to see if we could start testing its abilities on different chemicals.”

Stanley reached out to Richard Silliboy, vice-chief of the Mi’kmaq Nation. “He was very interested in finding solutions to cleaning land — that we could do it ourselves, and we didn’t have to wait on anybody,” Stanley said. “And we could further the science so that it could help to further the ability to clean the land in the future.” In 2019, the tribe began a research project to find out if hemp could help rid the land of PFAS contamination, working with Stanley and Upland Grassroots as well as researchers at the Connecticut Agricultural Experiment Station, and now also at the University of Virginia.

[Image: Left: Richard Silliboy plants hemp seeds on the land the Mi’kmaq tribe owns at the site of the former Loring Air Force Base. Right: Chelli Stanley tends the experimental hemp plot. Courtesy of Upland Grassroots]

Five years in, their experiments have shown that hemp does remove PFAS from the soil. But Sara Nason, one of the lead researchers from the Connecticut Agricultural Experiment Station, said that it stops short of being a total solution. “For most organic chemicals, an important aspect of phytoremediation is that plants and the soil bacteria around them help to break down the contaminants and detoxify them,” Nason said. But PFAS are synthetic chemicals, and as their nickname, “forever,” would suggest, they can’t easily be broken down. “Even if the plants remove PFAS from the soil, we still need other methods to destroy the PFAS in the plants.”

That has been one of the greatest challenges to date, Stanley said. “There’s no way to destroy the PFAS at this point, and we don’t want to put it in a landfill or just have a bunch of hemp sitting around that’s full of PFAS.” Currently, all of the hemp from the site is going to labs where scientists are working on a variety of techniques that might help to break the chemicals down.

This fall, the group received a four-year grant from the EPA to continue the research. All of the partners involved are taking a long view of this work, with the goals of continuing to clean the land as much as possible, contributing to the scientific understanding of PFAS, and, for the tribe, being able to someday harvest plants from the land without fear of what may lurk inside them. “Our actual phytoremediation results have not been as impressive as we would like them to be,” Nason said, “but in some ways, that has not mattered as much as I would have thought. There are very few ways for communities to take action on PFAS-contaminated soil right now, and doing something that helps in a small way has been very motivating for the people participating.”

Although questions remain about how to fully remediate the persistent chemicals, Stanley noted that working with hemp or other bioaccumulating plants is a low-cost, low-tech option available to any land steward dealing with different forms of contamination in soil and water. “Phytoremediation is very accessible. You don’t need a degree, you don’t need specialized training,” Stanley said. “As long as you know how to grow plants, then you would be able to do it” — much like the grassroots vision Betsy shared in her drabble.

— Claire Elise Thompson

More exposure

Read: more about the story of Green Era Campus and the progress of its various projects (Block Club Chicago)
Read: more about the PFAS remediation work on the former Loring Air Force Base in Maine (Grist)
Read: about how the bipartisan infrastructure law has quadrupled spending on brownfield remediation — and what could happen to future funding under the new administration (The Guardian)
Read: a Q&A with a toxicologist who has been cleaning up contaminated sites with fungi — “mycoremediation” (Yale Environment 360)
Browse: a citizen’s guide to phytoremediation, from the EPA

A parting shot

After their successful use at Chernobyl, sunflowers were planted in Japan in the wake of the Fukushima nuclear accident in 2011. In this case, the efforts were not as effective, likely due to the variety of sunflowers planted. But as Reuters reported at the time, the cheerful yellow flowers stood for more than literal phytoremediation, bringing a sense of hope and agency to residents in impacted areas. This photo from 2011 shows a sunflower farm in full bloom in Fukui, Japan, about 300 miles from the disaster.

IMAGE CREDITS
Vision: Grist
Spotlight: Courtesy of Green Era Chicago; Courtesy of Upland Grassroots
Parting shot: Buddhika Weerasinghe / Getty Images

This story was originally published by Grist with the headline How communities are giving new life to polluted land on Nov 20, 2024.

The World Bank has a factory-farm climate problem

Development banks sent $2.3 billion to industrial animal agriculture last year, according to a new analysis.

Recent data analysis conducted by a human rights advocacy organization found that nearly a dozen international finance institutions directed over $3 billion to animal agriculture in 2023. The majority of those funds — upwards of $2.27 billion — came from development banks and went towards projects that support factory farming, a practice that contributes to greenhouse gas emissions as well as biodiversity loss.

The researchers behind the analysis are calling on the development banks — which include the International Finance Corporation, or IFC, part of the World Bank — to scrutinize the climate and environmental impacts of the projects they fund, especially in light of the World Bank’s climate pledges.

The analysis comes from the International Accountability Project, which reviewed disclosure documents from 15 development banks and the Green Climate Fund, established in 2010 at COP16 to support climate action in developing countries. Researchers found that 10 of those development banks, as well as the Green Climate Fund, financed projects directly supporting animal agriculture. The data serves as the basis for a new white paper, released last month, from Stop Financing Factory Farming, or S3F, a coalition of advocacy groups that seeks to block development banks from funding agribusiness.

The International Accountability Project, which advocates for human and environmental rights, hopes that its findings will pressure international financial institutions like the World Bank to see the contradiction in financing industrial animal agriculture projects while also promising to help reduce harmful greenhouse gas emissions. Agriculture accounts for a significant portion of global greenhouse gas emissions, so much so that research has suggested limiting global warming to 1.5 degrees Celsius (2.7 degrees Fahrenheit) is not possible without changing how we grow food and what we eat. Within the agricultural sector, livestock production is the main source of greenhouse gas emissions — as ruminants like cows and sheep release methane into the atmosphere whenever they burp.

Factory farms, which aim to produce large amounts of meat and dairy as quickly and cheaply as possible, present a problem for the climate and the environment. They can hold anywhere from hundreds to hundreds of thousands of animals (fewer if the animals are bigger in size, like cattle, and more if they’re smaller, like chickens). These operations result in tremendous amounts of manure, which, depending on how it’s stored, can pollute waterways or release ammonia into the air. They also contribute to global warming: A nonprofit research organization once looked at the 20 meat and dairy companies responsible for the greatest amount of greenhouse gas emissions and found that, put together, their emissions surpassed those of countries like Australia, Germany, and the U.K.

The primary goal of development banks is to provide funding for projects in developing countries that help achieve some social or economic good. In recent years, these banks have included climate among their considerations when selecting initiatives to support. In its Climate Change Action Plan for 2021 to 2025, the World Bank stated its commitment to funding “climate-smart agriculture,” with the goal of nudging the agricultural sector towards lower emissions without sacrificing productivity. The plan says the IFC, a member of the World Bank Group that funds private-sector projects in developing countries, will seek to finance “precision farming and regenerative” agriculture, but also “make livestock production more sustainable.”

But this framework has not precluded development banks from supporting industrial animal agriculture — despite the abundance of information about how factory farming harms people, the planet, and animals themselves. “I think it’s just business as usual,” said Alessandro Ramazzotti, the International Accountability Project researcher who spearheaded the data analysis.

In order to determine how much money development banks are sending to industrial animal agriculture projects in the form of loans, investments, and technical and advisory services, Ramazzotti utilized a tool that scrapes bank websites for public disclosures. From there, he and a team of researchers analyzed the information collected, identifying 62 animal agriculture projects and reading the disclosures closely for detail on each one. They found that, of the $3.3 billion spent on animal agriculture, $2.27 billion — or 68 percent — was put towards projects that support industrial animal agriculture, or factory farming. Only $77 million — or 2 percent — went towards non-industrial operations or small-scale animal agriculture. The remaining project disclosures did not contain sufficient information for the researchers to determine one way or another what sort of initiative was being funded.

[Image: A livestock farmer who implemented a pasture rotation system to raise livestock without deforesting the Amazon under a program of the French Development Agency. RAUL ARBOLEDA / Contributor / Getty Images]

Ramazzotti noted that the analysis was subjective, based on interpreting the language of bank disclosures — which, he pointed out, can largely be considered marketing material for the banks. As a result, sometimes projects that sound small in scale may still feed into industrial animal operations. He gave the example of the World Bank, which over the years has sought to connect smallholder farmers in Latin America to greater market opportunities. Depending on the exact context of such investments, that can be “quite concerning,” he said. Supporting small-scale cattle ranchers in Brazil could, for example, end up increasing beef supply for the Brazilian-based meat processing giant JBS S.A., which works with suppliers in the region. Such a development would be concerning to environmentalists, as cattle ranching is considered a major driver of deforestation in the Amazon. JBS, along with three smaller slaughterhouses, was sued last year by Brazilian authorities for allegedly purchasing cattle raised illegally on protected lands in the Amazon. JBS declined to comment for this article but has previously said it is “committed to a sustainable beef supply chain.”

The IFC told Grist that its goals concern “food security, livelihoods, and climate change.” “There are 1.3 billion people whose livelihoods are tied to livestock and we also know this sector is responsible for over 30 percent of the global GHG emissions,” a spokesperson for the bank said, using an abbreviation for greenhouse gas. The spokesperson added that the bank seeks to fund projects that increase both livestock production and efficiency while reducing greenhouse gas emissions. The IFC also noted that as of July 1, 2025, all its investments will be required to align with the Paris Agreement target of limiting global warming to 1.5 degrees Celsius (2.7 degrees Fahrenheit), and that it asks recipients of livestock financing to adhere to their countries’ commitments under the Paris Agreement.

Lara Fornbaio, a senior legal researcher at Columbia University’s Center for Sustainable Investment, said she was “not surprised at all” by the S3F report’s findings. She argued that development banks, rather than being profit-motivated, should consider the big picture when choosing the kinds of projects they fund. But she also emphasized that banks need to be rigorous when discerning whether an initiative fits into their stated climate goals.

Even the rubric of “reducing emissions” is likely not rigorous enough to ensure banks are not inadvertently supporting industrial agriculture, Fornbaio said. In some factory farming settings, because growers are so efficient at producing livestock, “the emissions per animal are probably lower than the emissions of a cow [grazing] on a field,” she said. That’s partly because, in industrial agriculture operations, farmers can control every aspect of an animal’s feed — and certain feed choices can help reduce ruminants’ methane emissions. But because large animal agriculture operations grow so many animals, their cumulative emission footprints can be enormous. This big picture lines up with research that says shifting diets away from meat is crucial to curtail global warming.

Ramazzotti says his team will continue to monitor bank disclosures for new financing of animal agriculture and hopes to release updated findings on a regular basis going forward. He mentioned that the S3F coalition would consider supporting animal agriculture in the developing world if done on a local, small scale — such as by family farms or pastoral or Indigenous communities. However, he said, the coalition would prefer to see development banks putting money towards vegetable farming and plant-based diets.

Ramazzotti is hopeful about the possibility of pressuring financial institutions to stop supporting factory farming. Recently, the team found that the IFC was considering a loan of up to $60 million to expand a company’s meat-processing operations in Mongolia. “It’s a direct investment in the expansion of factory farming,” he said. “And that’s exactly the [type of] investments we don’t want to see anymore because we believe that the impacts on the local level, but also on the global climate, are very deep.” The coalition reached out to the IFC team considering this project with concerns, and the team has since postponed a discussion of the project with their board of directors twice, according to Ramazzotti. Following up, Ramazzotti said, has “not always” proven to be effective, but he’s optimistic that engaging with financial institutions can still lead to change.

Fornbaio agreed. “If I didn’t believe in change, I wouldn’t do this work, probably. … There’s always a way to put pressure,” she said. “I think this type of work is key.”

This story was originally published by Grist with the headline The World Bank has a factory-farm climate problem on Nov 20, 2024.

The Invention of Agriculture

Should we be jealous of our hunter-gatherer ancestors?

Subscribe here: Apple Podcasts | Spotify | YouTube | Overcast | Pocket CastsAfter 200,000 years of hunting and gathering, a history-defining decision was made. Starting roughly 12,000 years ago, at least seven different groups of humans independently began to settle down and begin farming. In so doing, they planted the seeds for modern civilization. This is traditionally told as a straightforward story of human progress. After humans made the switch, population growth increased, spurring innovative and creative endeavors that our ancestors couldn’t even imagine.One counterintuitive strain of thought has treated this decision as “the worst mistake in the history of the human race,” as the popular author Jared Diamond once put it. The argument largely rests on research that shows our nomadic forebears were healthier and had more leisure time than those who chose to farm. Diamond, who wrote this article in 1987, when overpopulation concerns were rampant within the American environmental movement, argued that, “forced to choose between limiting population or trying to increase food production, we chose the latter and ended up with starvation, warfare, and tyranny.”Sometimes unorthodox ideas are unorthodox for a reason. On today’s episode of Good on Paper, I’m joined by Andrea Matranga, an economist whose recent paper “The Ant and the Grasshopper: Seasonality and the Invention of Agriculture” argues that the Neolithic revolution happened as a result of climactic changes that necessitated storing food for the winter. Matranga rejects the idea that the past 12,000 years of human development were a mistake, one that underrates the threats of famine and starvation endemic to nomadic life.“There’s a sense in which theories of the Neolithic tend to mirror the political anxieties and the social anxieties of the time in which people came up with them and in which they found favor,” Matranga tells me. “So, you know, obviously in the ’80s—WWF, environmentalism, Earth Day—people are worried about runaway population growth. So obviously in the Neolithic, they must also have had runaway population growth. There’s this interesting mix of the current events bleeding into history.”The following is a transcript of the episode:[Music]Jerusalem Demsas: One of Aesop’s Fables is called “The Ant and the Grasshopper.” As the story goes, a hungry grasshopper comes up to a group of ants in the wintertime and asks them for some food to eat. They are shocked and ask him why he hasn’t stored anything up before the weather got cold. And he replies that he’d eaten well during the summer and made music while the weather was warm. The moral of the story is pretty straightforward: Save while times are good.But for most of human history, for 200,000 years, humanity was much more like the grasshopper than the ants. As hunter-gatherers, we ate well when resources were plentiful but didn’t save for winter, making us susceptible to starvation and death.But then something changed. Around the world, within a relatively short period of time, a bunch of humans independently began farming—and kept farming. How did this happen?[Music]My name’s Jerusalem Demsas. I’m a staff writer here at The Atlantic, and this is Good on Paper. 
It’s a policy show that questions what we really know about popular narratives.In a paper called “The Ant and the Grasshopper: Seasonality and the Invention of Agriculture,” economist Andrea Matranga formalizes a theory for how humans went from grasshoppers to ants—for how we went from hunting and gathering to settled farming.Climate seasonality increased, meaning winters got harsher and summers got drier. Hunter-gatherers couldn’t keep up with wildlife that fled for warmer climates. Birds can fly south for winters. Humans can’t. So they realized they needed to start storing food during good times. That meant the end of our nomadic lifestyles because people had to remain near those stores.This paper intervenes in the literature in a couple of important ways I explored with Andrea: First, it helps untangle the mystery of how humans became farmers to begin with. But second, it pushes back against the strangely nostalgic idea that our nomadic existence was somehow better than farming—an idea that holds sway among a surprising number of people.Let’s dive in.[Music] Demsas: Andrea, welcome to the show.Andrea Matranga: Thank you so much. Thank you for having me.Demsas: Yeah. So we’re here to talk about a very fun new paper that you’ve recently published at [The Quarterly Journal of Economics], and it’s about the Neolithic Revolution. We’re trying to go all the way back in time. We’ve done some development episodes, but this is further back than I think we’ve ever, ever gone.I want to start with what the Neolithic Revolution was. Can you set the stage for us?Matranga: Yes, absolutely. Neolithic means “new stone,” and it was first detected as a change in the shape of the stone tools that they were using. And then, eventually, they realized that the reason they changed the shape of the tools was also because they changed the subsistence method, meaning that before that—in the Paleolithic, in the Old Stone Age—everybody was a hunter-gatherer, meaning that they were subsisting on foods that grew wild, which they would collect, process, and consume. And then in the Neolithic, or New Stone Age, they started to grow their own food. So the origins of agriculture is what distinguishes the paleo from the Neolithic.Demsas: And around how long ago are we talking?Matranga: It was about 11,500 years ago in the Middle East. And there were seven of these places, and the two latest ones were in sub-Saharan Africa and in eastern North America, where it was about 4,500 years ago.Demsas: You’ve just laid out the span of a few thousand years here where, in a bunch of different places across the world, people are independently inventing farming and agriculture. You said in sub-Saharan Africa but also in the Middle East, north and south China, the Andes, Mexico, North America. How do we know that these developments were independent? And what sorts of evidence do we have from archaeology or otherwise that signal when farming began?Matranga: Yeah, absolutely. For some, it’s very easy. Obviously, if you look at eastern North America versus sub-Saharan Africa, these are two populations which had not had any cultural mixture, so clearly those two have to be independent.When you go from, let’s say, south China and north China or the Middle East and sub-Saharan Africa, then it becomes a little bit murkier. Usually, the arguments that are made are that there’s no other signs of cultural contact, in the sense that the pottery styles are different; the crops that are being grown are different. 
Usually, you would expect that if they start doing barley and emmer wheat in the Middle East—if you thought that farmers had arrived with this knowledge of it into the Sahel region of Africa, you’d expect them to try to do some of those crops first, and then maybe they find some other crops that work better. And instead, it’s sort of completely disjoint. And that’s usually the way that they think about it.Now, could it be that the idea that somebody was farming some distance away made their way through it? It’s possible, though one of the things is that there’s so many populations today, or in the recent past, that when they were contacted, they had knowledge of plant biology. So they understood perfectly well that if you plant a seed, a plant would grow. But they still hadn’t started farming. They were still hunting and gatherers. And so just knowing that it’s possible to do it doesn’t mean that you have a coherent sort of structure and a strategy for doing it as a population. So for example, I know that if you plant a seed, something grows, but that doesn’t mean I could sustain myself as a farmer.Demsas: (Laughs.) Yes.Matranga: So it still seems that, at least in the sort of making all the parts fit together, these things definitely happened independently in these places.Demsas: But for most of human history, we’re talking about—I mean, 12,000 years ago is obviously a long time ago—but we’re talking about hundreds of thousands of years before then, where we’re a hunter-gathering and are nomads. And so this question of why we make this switch as a species is really, really interesting.And before we get into your research, I’m hoping that we can talk through how the field was thinking about the advent of the Neolithic Revolution. And what were some of the prevailing theories about why agriculture emerged after the last Ice Age?Matranga: Absolutely. So one of the things that happened was that, obviously, we didn’t have as many excavations done, let’s say, in 1900 as we have today. And we didn’t have as good, for example, DNA—we didn’t have DNA at all—but DNA sequencing and other stuff like that. We have much more tools. So obviously, then the theories have also kept pace with the new information that was uncovered.If you look at the earliest theories—Darwin talks about this a little bit but also, let’s say, Braidwood—there are mainly theories about the Middle East because that one was the one where people knew that there had been a Neolithic transition. It was the first one to be excavated. And so most of the explanations are particular to the Middle East. And so one of the arguments that was made was that there might have been a climate desiccation, so it became drier around those years. And when it became drier, people were forced into these oases where there was still water. Therefore, once they were sort of constrained to these small areas, then it was easier to start farming and also necessary because there was just much less land that was fertile enough for hunting and gathering. And so you start taking better care about the land that you already have.And then as you go forward, then one of the things that appeared in the 1960s was the fact that, actually, very often the hunter-gatherers seemed to live a life that seemed enviable in a certain way compared to farmers. And what I mean by that is that they didn’t work very long hours. They seemed that most of the time they only had to gather for a few hours a day. 
And, obviously, there’s an issue there, which is, What do you consider work? So is walking around hoping you’ll find something, but you don’t actually hunt—is that work? Or is it only while you’re actually chasing the animal? So that was one of the issues.

And then another issue is: When you’re a hunter-gatherer, the real problem you have is that it’s not so much about how much you eat during average periods of time, but it’s what you do when things are very bad. And so for an anthropologist who happens to be there in a regular year, it looks like everything is great. But every 10 years, maybe, there’s a really bad year, and there’s a famine, and everybody’s starving. Now, if you don’t happen to be there in the year in which they’re having a famine, then you don’t understand why anybody would like to switch. That would be one of the caveats I would put to that hunter-gatherer issue.

And then we get to the 1980s. There was this very important book, very important also for my research, that was [by] Cohen and Armelagos. It was an edited volume from a conference, in which they called hunter-gatherers the original affluent society, in the sense that what they find from many studies from many places around the world is that the farmers are actually shorter. The first farmers are shorter, much shorter, sometimes up to 10 centimeters shorter than the last hunter-gatherers. And basically, the first farmers, they get short, and they stay as short as subsistence farmers are to this day, while the hunter-gatherers—only in the last 50 to 100 years have a lot of people become as tall as the last hunter-gatherers.

Demsas: Mmm.

Matranga: And this was obviously very surprising to them. Basically, it was a continuation of this theory that perhaps it was better when we were hunter-gatherers. And then the question was: Why did they start farming if hunting and gathering was so great? And so there is this other article by Jared Diamond, and he called agriculture “the worst mistake in the history of the human race.” What he thought was that it was runaway population growth.

And basically, what happens is that people think they know how the world works, so they start farming. And because you start farming, you become sedentary. And once you’re sedentary, you can have a lot more kids. Because if you’re nomadic, of course, if you have to carry your kids around, you’re going to naturally have to space out the births. And so once we become sedentary, we start having so many kids that, actually, we end up worse off than the way that we started. And what I think is interesting there is that there’s a sense in which theories of the Neolithic tend to mirror the political anxieties and the social anxieties of the time in which people came up with them and in which they found favor.

So, you know, obviously in the ’80s—WWF, environmentalism, Earth Day—people are worried about runaway population growth. So obviously, in the Neolithic, they must also have had runaway population growth. There’s this interesting mix of current events bleeding into history, which you can also see with the Roman Empire. So in the ’70s, everybody used to think it was because they debased the currency, because there was inflation in the U.S.

Demsas: And now it’s immigration. (Laughs.)

Matranga: Exactly. And now it’s immigration.

Demsas: It’s so surprising, all of a sudden.

Matranga: Exactly. So I sometimes tell students that there’s no such thing as history.
There’s current events in period costume.

Demsas: (Laughs.)

Matranga: I’m exaggerating.

Demsas: It’s also a way in which our time period allows us to reflect on similarities with previous times. The Diamond one, I think, is particularly interesting. I was reading that 1987 paper. I looked it up. He was born in, like, the 1930s, so he’s in his 20s when the environmentalist and population-ethics concerns really take off. And it’s really striking. He writes, “Recent discoveries suggest that the adoption of agriculture, supposedly our most decisive step toward a better life, was in many ways a catastrophe from which we have never recovered. With agriculture came the gross social and sexual inequality, the disease and despotism, that curse our existence.”

It’s one of those things—I think it’s useful to think through the ways that it’s possible that you may have made a very early mistake. Like, you may have reached an optimal point on a mountain, but you climbed a shorter mountain, and you have to go all the way back down to find a taller mountain.

But can you walk us through some of the evidence for seeing farming as a decline in living standards? Where is that coming from? I know you mentioned the height thing, but I remember reading there’s something also about an increase in violence and other sorts of problems.

Matranga: Yes, absolutely. First of all, one of the things I want to just say to begin with is that it’s hard, for example, for things such as violence, in the sense that the selection pressure on archaeological remains from nomads is very different from the selection pressure on the remains of sedentary populations, simply because it’s a lot more likely that remains from nomads might be very shallowly buried or not elaborately buried, while, instead, once you become settled, perhaps you have slightly more ornate tombs, which are easier to find, and other things that tend to preserve the remains.

Demsas: So we might be missing a bunch of information about the hunter-gatherer period.

Matranga: Yes, especially with things like child mortality. A lot of nomadic groups seem to not have considered kids fully human, basically, until they were a few years old, just because the child mortality was so horrendously high, just through diseases and other things, that perhaps we have very different selective pressures.

But one of the other things that for sure we have is joint diseases. So it looks like the farmers were working more, because they tended to have more arthritis. And the joints on which they have arthritis are the ones that we would expect them to have if they were doing a lot of general farm work, digging, that sort of thing.

And also the grinding—the daily grind, right? It’s sort of an idiomatic expression, because once you have these seeds—if I give you just, like, Oh, you’re hungry? Here’s a bag of unpopped popcorn.

Demsas: (Laughs.) Yeah.

Matranga: It’s like, What are you going to do with them? You have to put them on a rock and just grind them for hours and hours every day.

Demsas: I had no idea where the “daily grind” came from. I didn’t know that’s where it came from.

Matranga: Well, I didn’t until I said it.

Demsas: (Laughs.) So maybe you made it up. Okay.

Matranga: So maybe we can get one of the producers to check it. But it just came to me as I was saying it. I was like, Oh, I guess that’s where that’s from.

But to process that in an efficient way is also incredibly labor-intensive, and so their joint diseases reflect that, as well.
And they also have something called porotic hyperostosis, which is, like—you get spongy bone tissue. And that is connected to anemia. So it looks like they were missing iron. And so these are some of the ways in which people have assumed that, basically, from almost everything that you could find, it looks like the farmers were actually eating less, on average, than the hunter-gatherers that came before them.

[Music]

Demsas: After the break: how the history of agriculture is actually a story about low construction costs.

[Break]

Demsas: So it’s against this backdrop that your research kind of comes in, right? People are assuming that it has to be a forced choice, because it’s obviously worse to have been a farmer than to be a hunter-gatherer.

But you have a paper that, I think, really explores and lays out a different way of thinking about things. Before you get into the meat of it, can you tell me about the genesis for the idea? What got you looking at seasonality as a predictor of the Neolithic Revolution?

Matranga: Absolutely. This is a little bit like that scene in Forrest Gump where he starts running, and he says, Well, I thought I’d run until the end of the street, and then I got to the end of the street, so I thought I’d just run into town. And 10 years later, he’s still running.

So the origin of this was that I visited my mom, and my mom was teaching Italian, as it was, at the University of Isfahan in Iran. And we went to see this ziggurat in Chogha Zanbil, which is one of those step pyramids. And I was trying to take a picture of this pyramid, and it was very hard to get a good contrast between the pyramid and the ground. And then what happened was I realized, Oh, of course, it’s hard because it’s made out of mud bricks. It’s made out of literally the same stuff that it’s sitting on. They compress it into bricks and dry them and then stack them, and that’s the pyramid.

And so then I was thinking, Well, why would agriculture originate from an area with very low construction costs? And so the idea was, Well, the reason why you would need low construction costs is because, once you farm, you’re going to get all of your food in one room at one point of the year, after the harvest.

Demsas: And sorry—you’re saying that it was already established that farming had begun in places with low construction costs, or you came across this idea yourself?

Matranga: No. So the typical idea is: It starts here because it’s the Fertile Crescent, and it’s so fertile. And the problem with this theory is that the Fertile Crescent is very fertile if you compare it to, you know, the deserts north and south of it. But it’s not very fertile compared to any other place in the world. So it’s not very different, in terms of the types of soil and the rainfall patterns, from any other place, for example, on the north coast of the Mediterranean.

So the idea was: That’s why it started. That’s why it’s called the Fertile Crescent. And then I realized, Well, isn’t it weird that it happens to also be the place in the world with the lowest construction costs? Because it’s on an alluvial plain, so everything is clay that you can make mud bricks out of. And it doesn’t rain, so that means it’s not gonna erode them, so you don’t have to bake the bricks, which takes a lot of energy. You just need to sun-dry them, stack them, and that’s your building.

Then the idea was, Well, why would it be connected that you start agriculture, or at least it blossoms, in a place that has low construction costs?
And I thought, One possibility is that they needed to defend their grain stores. So once you’ve harvested all this food, you’ve put it all in a room. Well, now that’s very attractive to any would-be thieves that would like to come and perhaps kill you and steal it. And that was my undergraduate thesis.

Then fast-forward a couple of years, and I’m doing a master’s, which later morphed into my Ph.D., at Universitat Pompeu Fabra in Barcelona. And I take this course with Hans-Joachim Voth, who later became my advisor, and it’s an Economic History course. And I tell him about what my undergraduate thesis was. And he tells me, It’s interesting, this idea of storage, because there’s this literature that says that hunter-gatherers were actually better off than farmers. And so it could be that maybe you can do some model where there’s some shocks from year to year, and having the granary helps you smooth out the consumption. And so the granary is also important for this reason. See what you can do with it. And I wrote a little paper for the course, and that was that.

From there, the question was whether this was just a proof of concept—that one of the advantages when you start farming could be that you’re able to smooth your consumption. And so that’s why you accept a lower average standard of living, but you don’t get killed by famines when they happen every 10 years or so.

Demsas: It’s like insurance. You, as a human, are like, Okay, I’ll accept less food now, but I know I won’t starve in some forthcoming year.

Matranga: Exactly. And at that point, the story was still about variation from year to year, so I’m worried about famines. And a while after that, I came upon this paper by a French anthropologist called Alain Testart, and what this paper was about—it wasn’t really about farming. It was more about hunter-gatherers that become sedentary. And this happens.

Usually, we associate hunting and gathering with being nomadic, because you’re chasing the game around, or you’re moving up and down the mountains, depending on the seasons. And he said this isn’t really always the case. There are many cases around the world of hunter-gatherers who are sedentary and have remained sedentary for centuries and millennia without progressing to agriculture.

And a classic example of this is the Native American cultures of the Pacific Northwest. And so there they exploited the salmon run. And so there’s all these millions and millions of salmon that want to reach their breeding grounds in the upland streams. And to do that, they have to pass through these rivers. And the Native Americans, they had these elaborate traps with which they captured a large but sustainable number of salmon—obviously, they wanted to let some through so that they’d reproduce. They’d skin them, and they’d smoke them, and they’d dry them. And that way, they had stores of food that would last them until the next salmon run, which is in the fall.

And he said, If you look at these groups, they have very hierarchical societies. They have elaborate material cultures. And so they had almost everything that we would associate with a farming community except the farming itself. Of course, the reason why they didn’t develop farming was because—it’s important, you know—the salmon is just a salmon. It’s gonna lay the eggs where it wants, and then it’s gonna go into the sea and live in the North Pacific for a few years, and then it’s gonna come back.
That was an example of nomads who became sedentary and remained sedentary for hundreds or thousands of years without farming.

And he said, What’s crucial is that there’s food which is abundant and seasonal. And that was, like, my aha moment, because we take this story about sedentary hunter-gatherers, and then we could say, Maybe this is the stepping stone between being a nomadic hunter-gatherer and being a sedentary farmer. Because there’s this chicken-and-egg problem. Because if you’re always moving around, then how can you learn how to farm? And instead, if you don’t know how to farm, then why would you become sedentary? Because all the food is moving away. The game is moving away. You’re exhausting your local area, the plants. Why wouldn’t you just move to some other place where there’s more food?

Demsas: And you’d need multiple seasons to figure out how to farm appropriately for your region.

Matranga: Exactly. And so the idea was, by taking this Testart paper, I could say they would become sedentary first because they want to store food. And once they are sedentary and they’re storing food, then they’re preadapted for discovering agriculture, because storing food and being sedentary are two things you need to know how to do if you’re going to be a farmer. So at least you figured out that part of it before. And you do this because you’re trying to avoid seasonality, as Testart said. And this is when I switched: instead of the problem being a famine every 15 years or whatever, the problem is this periodic, predictable famine, which happens every year, which we call winter.

And so in order to avoid all starving in winter, we can just sit in one place, gather all these abundant foods in the places where they exist, store them, and then we can process them and eat them as we go along throughout the year, and then the next year we can do the whole thing again. And it was funny, because I found this paper—it was a friend of mine’s birthday, and I had to call her and tell her, I’m sorry, but I can’t come, because I found the paper that sort of unlocks everything for me. And I’m just too excited about it, and I wouldn’t be much company.

Demsas: Did she forgive you?

Matranga: Yes. I mean, she already knew. It was baked into the pie. You know, we’d known each other a while.

And so that’s when the story moved from, you know, once-in-a-while famine to predictable scarcity, which is seasonality. And from there, my next step was, why would it be—because one of the things that’s been observed is that the Neolithic Revolution happens right after the end of the Ice Age. And so the traditional interpretation by a bunch of people was: The Ice Age ends. Before, it’s just too cold to farm in the Middle East, and so nobody was farming there. And then when the Ice Age ends, then there’s the right climate for farming. And then you can farm.

And there’s two issues here, I think. And one of them was that if the climate is really good for farming, then it could also be really good for hunting and gathering. There might also be more wild animals. There might also be more wild plants. So it’s not entirely clear to me that a better climate automatically makes things better for farming. So that would be my first point.

And the second point is that if all you needed was a warm climate, then why couldn’t you farm during the Ice Age but, like, a thousand miles south of where you farmed when the Ice Age ended? Because it’s not like it was a snowball Earth.
If you went to the equator, you know, it was still warm. And so my idea was: What was missing during the Ice Age were locations that were really good in summer but really bad in winter, because the issue with the equator isn’t that it’s too warm. The problem is that it’s warm the whole year round—

Demsas: Yeah, so you would never start farming.

Matranga: —and therefore you don’t need to store. And the important thing is you never become sedentary in order to store, which then leads you to not starting to farm. What happens when the Ice Age ends? Now, there are places where at first it was, let’s say, –20 [degrees] in the winter and –5 in the summer. So there is seasonality, but all of the seasonality is below freezing. So it doesn’t really matter. It’s just a frozen hellscape year-round.

Well, now, if you think that that moves, you know, sort of in parallel, both the summer and the winter become warmer. Now you’re going to have a winter which is like –5, which is really bad. But now in the summer, let’s say it’s +15. Sorry—this is Celsius. I should have prefaced that. And so, basically, what happens is that now the summer is quite good, while the winter is abysmal.

And the question is: How can we exploit these very good summer conditions without getting stuck here in the winter, or without all dying in the winter? And of course, if you’re a stork, then that’s really not a problem, right? You can fly. You can go to this really warm place in the summer, have your nest there, and then in the winter, you just go back to Africa, and that’s perfect. But if you’re humans, and you’re carrying kids with you, then obviously that’s not going to work.

And so you cannot migrate your way out of a Northern Hemisphere winter. So their solution was to store food. And so they say, We can move to these places first. During the summer, we gather all the food, and then we can store it and consume it throughout the long winter. And then the next summer, we do that again. And that was sort of, like, my first idea of why it happens right after the end of the Ice Age.

Demsas: Okay, so the theory is, basically: The Ice Age ends. There’s more seasonality, meaning that the difference between summer and winter increases, so you have these kind of highly variable seasons that we’re used to now. Then people are incentivized to store, so that they have food for the winter. And as they remain settled, they discover farming in order to supplement their diets.

Matranga: Exactly. And so the basic idea is: Once you’re sedentary, then, you know, for sure, like—I mean, what is farming? Farming is you’re expending labor in order to increase the amount of food that the land produces. So farming is really on a spectrum. Because a very simple thing you could do is chase away grazing animals so that they don’t eat the fields that you’re going to need in order to get the seeds from during the harvest season. And so that’s, in a sense, farming, because you’re expending labor just chasing away the animals, and perhaps then you fence them. And then the next thing you could do is say, Well, last year, a lot of this area was flooded. So I’m going to dig a drainage ditch. And this way, when it rains, you don’t have standing water. The crops don’t rot. And we’re going to have more food the next harvest season.
And then you can start doing all of these little things, which, put together, then amount to farming.

But I’ll just go back for a second to the seasonality issue, because what I later found out was that, actually, according to this theory by a Serbian physicist called Milanković, it’s actually increases in seasonality which make the Ice Age end. And so what happens is that Earth’s axis is tilted—and famously, this is what causes the seasons—but sometimes it’s more tilted, and sometimes it’s less tilted. And there are also other variations in Earth’s orbital parameters, and these influence the amount of seasonality that you have in the Northern Hemisphere and in the Southern Hemisphere. And so it’s not really that it was just the end of the Ice Age which caused seasonality to increase, but really there was this big increase in seasonality, which caused the Ice Age to end and also caused the start of agriculture.

Demsas: I would expect that there would have been farming that could come in and out of vogue. I’m curious why we don’t see that in your findings.

Matranga: I completely think that farming probably happened on some hillside 70,000 years ago and on some other hillside 30,000 years ago and some other place 15,000 years ago. And, you know, what I find really interesting and important about farming isn’t so much the fact that they did it once. It’s the fact that it’s a model which is capable of spreading.

If it was just something that happened once on one hillside and then stayed there—or perhaps, you know, like the salmon run in the Pacific Northwest—that’s a fantastic accomplishment by the population that does it, but it doesn’t transform the world. Because you cannot take those salmon, bring them to a river in Iowa, and then, you know, just replicate your community in some other place. What’s special about farming is that it does sort of spread, and that it does eventually occupy most of the landmass of the world. And so it’s sort of what I call a franchisable model. It’s not just something that works in one place. You can copy-paste it all over the place.

And so I think it probably happened on some hill, but that’s not super interesting. It would be super interesting, of course, from an anthropological perspective, to find that one hillside where it happened 30,000 years ago. But that didn’t change the history of the world, clearly.

I think, in order to have that, you have to have a wide area in which there’s a lot of seasonality, so that when somebody invents, first, you know, storage and sedentarism and then agriculture, then they’re able to take this packet of seeds, bring it to another place, give it to their kids. Their kids can found a colony. Perhaps they displace the local population. Perhaps they intermarry with it, perhaps not a lot of people. You know, I’m sure all three happened in different places at different times. And then their kids can do it in another place, and so you can colonize other places with this technology, or other people can copy this technology and do it in other places. And in order to have this, I think you need both the seasonality but, also, it needs to be over a wide enough area that it’s instantly appealing to everybody, because they think, This is just what we’ve been waiting for, a chance to not all starve every February.

Demsas: Hopefully you can unpack why it was such a dominant strategy, right?
Because you write in your paper, “Our ancestors traded a risky but abundant lifestyle for a more stable but less prosperous one, driven by risk aversion, particularly among populations near subsistence levels.” And I would imagine that you would expect to see variation based on different populations’ risk tolerance and also desire to kind of smooth their consumption. And also, it seems like there’d be a real free-rider problem. Like, nomads could just go around attacking sedentary populations, taking their food, and moving on. So it’s interesting to me that it ended up being such a dominant strategy to stay put.

Matranga: Yeah, so in terms of, obviously, the risk of raids, I think that would go back to my undergraduate thesis of sort of the importance of having some way of defending. So the first places that do this are actually, like, these hillsides—Jarmo, for example, was an early one—that are very steep on all sides. And, you know, the point is that with that, you kind of need a very specific land conformation, where it’s just the right shape of a hill, and there’s water, and there’s fields close to it, and there’s a way to get from the fields to the hill. And, you know, how many hillsides like that can you find? So the convenient thing is: Once you invent fortifications, then you can build a wall, and so build your own quote-unquote hill in the middle of the fertile plain, which is sort of what they do in Mesopotamia.

So that’s one aspect to it. The other one is that some people remain nomadic for a very long time, usually because either it’s too cold, the growing season is too short, or otherwise the rainfall is too low, and so they’re not able to farm, and the only way that they can survive in viable numbers is by constantly moving around. But what they usually do, at that point, is they become pastoralists.

One way of seeing this is that it’s not just a matter of risk aversion, because in the end, whatever your risk aversion, high or low—let’s say that you’re a complete nervous Nellie. You don’t want to take any risk, and you just eat grubs from under a stone, because you never want to leave your immediate area. Well, you’re probably not going to reproduce very fast, which means that either some neighbors that accepted a little bit more risk and have a much higher average amount of food have more kids than you, and they can displace you, or even if you somehow intermarry with them, probably they’re not gonna accept your viewpoint on risk aversion. So the risk aversion, in the end, is something which leads you to make some choices, and these choices have some effects on the viability of your group.

Demsas: Yeah. I feel like the fertility question is really interesting here, because it’s both that once you begin farming, you have to send your kids out to go farm themselves, but it increases the number of children that are born, too, that survive?

Matranga: I would say both. When you’re walking, when you’re nomadic, in principle, you cannot have more than one kid per parent, because somebody has to carry them, at least when they are, you know, below 6—because 6-year-olds can walk, but they can’t walk as fast as grown-ups.

The second aspect of this is that there are so many diseases where if you could just stay in a place that’s warm for a couple of weeks, the kid would be fine. But if you’re in the middle of your migration, that’s it.

And the other thing is that when you’re constantly breastfeeding and moving around, you’re probably not gonna put on a lot of weight.
And it would appear that a lot of hunter-gatherer women would take a few years to even be fertile again. Because they just would not achieve that—I forget if it’s 15 or 18 percent or—whatever the number is of body fat where your body can even conceive.

So absolutely, when you become sedentary, you can have more kids. And I think even if you want to remain nomadic, if there’s these farmers who are having way more kids than you that survive, then if there’s ever any conflict—maybe now, maybe in two centuries—then very likely, the farmers are going to get their way.

Demsas: I’m curious about us returning to what you started this conversation with, which is the question about whether or not it was a good idea for us to move out of the hunter-gatherer stage to the farming stage. Because your paper has something to say, also, about whether we’re over-reading the evidence about humans being worse off nutritionally when they become farmers. So what’s your pushback on this question about, Maybe nutrition was actually improved once you become a farmer?

Matranga: Yes, absolutely. It’s interesting, because the first concrete evidence of anything that I found in support of my thesis was what I’m about to tell you. And it’s something called “Harris lines.” So Harris was a pathologist. I believe that one of his kids had a pretty severe disease. For some reason, he saw an X-ray of his kid, and he noticed that the kid had this line in their bones, sort of a transverse line. So, you know, like, not in the direction of the bone—kind of like a tree ring along the growth.

And so then he explored this more, and he found out that when there is an episode of growth arrest in a child that is growing normally—for example, this could be a disease, or it could be that you’re not eating—then there’s what’s called a “metabolic insult.” Your metabolism is not producing enough energy to both keep you alive and keep you growing, and so then you have growth arrest.

And then when you start eating well again, or the disease passes, then there’s something called catch-up growth. So the body actually grows faster, because it’s trying to get back on the growth curve that it was on originally. And as it’s growing faster, it deposits this different kind of bone, which you can see from X-rays. And so it’s a little bit like a tree ring, but for mammals.

And the interesting thing is that from that same Cohen and Armelagos 1984 book, Paleopathology at the Origins of Agriculture, it also looks like the hunter-gatherers had—they were taller, up to 10 centimeters taller, but they also had—way more of these Harris lines, or growth-arrest lines, in their bones, sometimes as many as six per individual, on average, in some populations. And they also appeared to be evenly spaced, just like tree rings. And so to them, this suggested that almost every year, there would be a period of famine.

And so it really looks like it was this insurance trade-off that you mentioned before, which is that, you know: The hunter-gatherers, they ate a lot, but for a few months, at least, every year, it looks like they were starving, while the farmers, they ate less, on average, but they always ate. They were able to smooth their consumption from summer to winter, logically.

And I think that one of the reasons this had not been proposed before was because as sedentary people with bank accounts and granaries, you know, usually our problems are not about, like, I’m eating a lot this week, but what am I going to eat next week?
But if you are a nomad, and you’re not able to store food, then that, I think, would be the dominant concern, and I think that’s why we accepted this trade-off. Like, Sure, we’re just going to be shorter. That’s fine. But, you know, at least we don’t starve for a couple of months every year.

Demsas: So you think this is the correct trade-off? You don’t buy the thesis that we made a mistake?

Matranga: No, no. I think we made a great trade-off. In fact, I think that part of the problem with, even, development goals—they tend to be phrased in terms of averages. We would like people to make, at least, $5 or $10 a day, on average, throughout the year. And then how can we get them to invest? Or how can we get them to become entrepreneurial and so on? But when you’re this close to starvation, I think that the average, obviously, you think about it, as well. But what you’re really worried about is, What am I going to eat in the worst possible case that could happen to me within the next 30 years?

Because the way that they survived as a population through the centuries was by taking the worst case into account. If I take the statistics of a country, and I measure its income every year, and for 25 years, it’s quite good, and then they all die in the 26th year, the average income is still very good, but that’s a complete disaster for the population involved.

And so if anything, I think that our way of measuring success is, again, predicated on the fact that we do have insurance, and we do have bank accounts, and we do have granaries. And so our worries are more about averages, while if you are a hunter-gatherer, your life is dominated by the worst outcome. And I think it was a correct choice. In fact, it was so correct that we forgot how awful it is to be eating a whole wildebeest that you killed and still be worried about what you’re going to eat next week.

Demsas: Well, Andrea, always our final question: What is something that you thought was a good idea at the time but ended up only being good on paper?

Matranga: As a personal anecdote, I’d spent a lot of time figuring out a good way to move to the U.S. And I loved my time in the U.S., but then I realized that moving continents is very difficult when you still have family back home. And the things that you like and that you think you’re going to enjoy when you’re 25 and don’t have kids, then once you have a family, and you have to move backwards and forwards for all the summer stuff, it starts to wear on you.

So I just realized, after being incredibly internationally minded, I still love traveling and visiting places, but I became much more homeward bound in my aspirations as time went by.

Demsas: Yeah. Well, Andrea, thank you so much for coming on the show.

Matranga: Absolutely. My absolute pleasure. Anytime.

[Music]

Demsas: Good on Paper is produced by Jinae West. It was edited by Dave Shaw, fact-checked by Ena Alvarado, and engineered by Erica Huang. Our theme music is composed by Rob Smierciak. Claudine Ebeid is the executive producer of Atlantic audio. Andrea Valdez is our managing editor.

And hey, if you like what you’re hearing, please leave us a rating and review on Apple Podcasts.

I’m Jerusalem Demsas, and we’ll see you next week.

[Music]

Matranga: So it was funny, because if you had asked me, What would you say is good on paper? And I was ready to say, Well, for all my office and copier paper needs, I use Dunder Mifflin, the paper supplier. But the setup—

Demsas: The setup was too different? You were going to go with Dunder Mifflin? That’s so funny.

French Farmers Mobilize for Protests Over EU-Mercosur Trade Deal

French farmers are mobilizing for widespread protests Monday, targeting the controversial EU-Mercosur trade agreement

PARIS (AP) — French farmers are mobilizing for widespread protests called for Monday targeting the EU-Mercosur trade agreement. They argue the deal threatens their livelihoods by allowing a surge of South American agricultural imports produced under less stringent environmental standards.

Protests are planned nationwide, including gatherings at prefectures and traffic circles. One group blocked a highway south of Paris on Sunday night with tractors, and scattered actions have been held recently, building up to this week’s protests.

The European Union and the Mercosur trade bloc, composed of Brazil, Argentina, Paraguay, Uruguay and Bolivia, reached an initial agreement in 2019, but negotiations stumbled due to opposition from farmers and some European governments.

The new protests come amid fears the agreement could be finalized at the G20 summit in Brazil on Nov. 18-19, or in the coming weeks. EU farm ministers are also meeting in Brussels on Monday.

Leading the charge in the new protests are unions like FNSEA and Young Farmers, who oppose provisions such as duty-free imports of beef, poultry and sugar, which they say create unfair competition. Coordination Rurale, a union linked to the far right, has promised an “agricultural revolt,” including food freight blockades beginning Tuesday in Auch and Agen, in southwestern France.

Proponents of the EU-Mercosur trade agreement argue that it would significantly boost economic ties between Europe and South America by eliminating tariffs on European exports, notably for machinery, chemicals and cars, thereby enhancing market access and creating lucrative opportunities for European businesses.

French Agriculture Minister Annie Genevard has publicly opposed the agreement, citing risks of deforestation and health concerns linked to hormone-treated meat. In an interview with TF1, she said: “We don’t want this agreement because it’s harmful. It will bring in products, including substances banned in Europe, at the cost of deforestation. It will unfairly compete with our domestic production.”

President Emmanuel Macron has also said France will not back the agreement unless South American producers meet EU standards.

Farmers say they are further frustrated by a European Commission audit that flagged hormone use in Brazilian beef exports. Their demonstrations aim to pressure the French government and EU officials to block or renegotiate the agreement.

Copyright 2024 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.

How to Overcome Solastalgia, the Feeling of Profound Loss of Your Environment

Damage to your environment can bring a profound sense of loss; that feeling, called solastalgia, can also provide inspiration

November 13, 2024

How I Overcame Solastalgia

By Queen Essang

As I sit in my backyard in Abuja, Nigeria, looking out at the open landscape around me, I can’t help but feel a deep sense of loss. The rolling hills that were once vibrant with a rich carpet of wild ferns, daisies, lupines and goldenrods are now dotted with invasive species that have choked out the native flora. The river that once flowed crystal clear, reflecting the azure sky and teeming with darting fish and dragonflies gliding gracefully by, is now muddied by sediments and pollutants from nearby construction and agriculture.

This feeling of loss and dislocation, a combination of nostalgia for what once was and a profound sadness for what has been irretrievably altered, has a name: solastalgia. Coined by philosopher Glenn Albrecht, solastalgia describes the emotional distress caused by environmental change, particularly when it affects the place we call home. Essentially, it is the feeling of being homesick while at home.

But despite this feeling, there is hope. Solastalgia has inspired me. It serves as a strong motivator to push for the protection and rejuvenation of our environments. It reminds us of the intrinsic value of nature and the importance of stewardship. When we acknowledge our grief and channel it into positive action, we empower ourselves to fight for the landscapes we love and to safeguard biodiversity, transforming our sorrow into tangible steps for change. Our bonds with nature are resilient and worth nurturing for future generations.

Growing up, I spent countless hours in the woods behind my childhood home, surrounded by majestic oaks with their sprawling canopies, towering pines reaching for the heavens and graceful willows swaying gently by the river’s edge. I would often find myself in the embrace of the ancient pines, their earthy scent grounding me as I wandered beneath their branches. The woods were my sanctuary. Each tree had a story, a memory attached to it. I remember the laughter of friends echoing throughout the canopy as we played hide and seek, the sun filtering through the foliage, casting dappled shadows on the forest floor, and the quiet moments spent sitting up against a tree trunk, feeling at one with nature.

When I returned home after five years in college, I was struck by how much the ecosystem had changed. As climate change accelerates and development encroaches on familiar spaces, I find myself grappling with an unsettling reality. The vibrant tapestry of my childhood is unraveling. In its place lies a landscape marked by change—change that feels invasive and alien.

Today, in my backyard, I find myself thinking about the day years ago that I encountered a friendly female waterbuck while wandering through the lush Stubbs Creeks forests. The forest was alive with playful squirrels and the occasional fox darting through the underbrush. Chirping robins and warblers and buzzing insects created a symphony that felt like home. Now I realize many of those trees had been felled, replaced by sterile housing developments devoid of life and character.

Nestled within this vibrant landscape was Ibeno Lake.
I had taken pride in its clear water, where families of ducks and geese often swam gracefully by. The lake was joy: a place for summer swims, lazy afternoons spent floating on rafts, evenings filled with the laughter of friends gathered around bonfires. It was here that I learned the rhythm of nature. Now I watch in dismay as algae blooms choke the water, turning it murky green.

The emotional turmoil is not mine alone; it resonates with many people who are witnessing similar transformations in their environments. The deep sense of solastalgia manifests as a grief that is often overlooked—a sorrow not for a person but for a place. It is a longing for a connection that feels increasingly out of reach, as the landscapes we once knew and loved are irrevocably altered.

Every time I see a familiar landmark disappear or a beloved habitat shrink, I can’t help but reflect on how a once-vivid tapestry of biodiversity is transforming into a homogenized landscape. This transformation induces a precarious tipping of nature’s equilibrium. Climate change is a fundamental cause, but pollution from nearby industrial complexes has contributed significantly to the degradation of the natural environment. Deforestation spurred by the relentless pursuit of urban development continues to erode extensive forestland, and unsustainable resource extraction has stripped the land of its natural resources, leaving scars that are slow to heal.

I cannot stand idly by. I began to educate myself about conservation efforts shortly after I returned home, driven by the changes I witnessed in my environment. I have joined local conservation groups, participating in tree-planting initiatives to restore native species and combat the invasion of non-native flora. I have also engaged in cleanup efforts at Ibeno Lake, rallying friends and family to help remove litter and debris from the shorelines, to help restore its natural beauty. Education is vital, too; I strive to raise awareness in my community about the importance of preserving our natural spaces.

In my conversations with family and friends, I find that solastalgia is a common experience. We often reminisce about the landscapes of our youth, remembering the places that influenced our lives. There is a somber tone in these discussions, as we realize that our memories are becoming more associated with what we are losing rather than what is left. The world is changing, and as a result, so are we.

As I reflect on my journey with solastalgia, I realize that it is not merely a feeling of loss but also a call to reconnect. It urges us to find new ways to engage with our surroundings, to create memories in the face of change and to honor the beauty that still exists, despite the challenges. Although the landscape may shift, our appreciation for it can remain steadfast, reminding us that our bond with nature is resilient and worth nurturing for future generations.

In an era where environmental challenges loom large, solastalgia serves as a poignant reminder of the stakes involved. It is an invitation to cherish our homes, to advocate for their protection and to cultivate a deep-rooted sense of responsibility for the world we inhabit.
As we confront the realities of a changing climate, may we find solace not only in our memories but also in our collective capacity to create a thriving future for both people and the planet, in a harmonious balance that nurtures the vibrant tapestry of life.

This is an opinion article; the views expressed by the author are not necessarily those of Scientific American.
