
Iowa Farmers Are Restoring Tiny Prairies for Sustainability Boons

News Feed
Thursday, October 3, 2024


Farmers in the heartland are restoring swaths of the prairie with government help. The aim is to reduce nutrient runoff from cropland, and help birds and bees.

The little tracts of wilderness grow on Maple Edge Farm in southwest Iowa, where the Bakehouse family cultivates 700 acres of corn, soybeans and alfalfa. Set against uniform rows of cropland, the scraps of land look like tiny Edens, colorful and frowzy. Purple bergamot and yellow coneflowers sway alongside big bluestem and other grasses, alive with birdsong and bees.

The Bakehouses planted the strips of wild land after floodwaters reduced many fields to moonscapes three years ago, prompting the family to embark on a once-unthinkable path.

They took nearly 11 acres of their fields out of crop production, fragments of farmland that ran alongside fields and in gullies. Instead of crops, they sowed native flowering plants and grasses, all species that once filled the prairie.

The restored swaths of land are called prairie strips, and they are part of a growing movement to reduce the environmental harms of farming and help draw down greenhouse gas emissions, while giving fauna a much-needed boost and helping to restore the land.

As the little wildernesses grew, more and more meadowlarks, dickcissels, pheasants and quail showed up, along with beneficial insects. Underground, root networks formed to quietly perform heroic feats, filtering dangerous nutrient runoff from crops, keeping soil in place and bringing new health to the land.

“We’re thinking about our farm as a small piece of the overall good puzzle,” said Jon Bakehouse, on a visit to the family’s fields one sunny morning earlier this summer. “On a larger scale, we’re all in this together.”


Read the full story here.

Interoception Is Our Sixth Sense, and It May Be Key to Mental Health

Disruptions in interoception may underlie anxiety, eating disorders, and other mental health ailments

By the time Maggie May, an Arkansas resident in her 30s, was admitted to a psychiatric clinic in 2024, she had been struggling for years with atypical anorexia nervosa, an eating disorder that leads to severe food restriction and profound disturbances in body image. (Her name has been changed for privacy.) She had already tried traditional interventions with a psychotherapist and a dietitian, but they had failed to improve her condition. So when May heard about a trial of a new and unconventional therapy, she jumped at the opportunity.

The treatment was unusual in that alongside talk therapy, May underwent several sessions in a sensory-deprivation chamber: a dark, soundproof room where she floated in a shallow pool of water heated to match the temperature of her skin and saturated with Epsom salts to make her more buoyant. The goal was to blunt May’s external senses, enabling her to feel from within—focusing on the steady thudding of her heart, the gentle flow of air in and out of her lungs, and other internal bodily signals.

The ability to connect with the body’s inner signals is called interoception. Some people are better at it than others, and one’s aptitude for it may change. Life events can also bolster or damage a person’s interoceptive skills. Sahib Khalsa, a psychiatrist and neuroscientist at the University of California, Los Angeles, and his colleagues think a disrupted interoception system might be one of the driving forces behind anorexia nervosa. So they decided to repurpose a decades-old therapy called flotation-REST (for “reduced environmental stimulation therapy”) and launched a trial with it in 2018. They hypothesized that in people with anorexia and some other disorders, an underreliance on internal signals may lead to an overreliance on external ones, such as how one looks in the mirror, that ultimately causes distorted body image, one of the key factors underlying these conditions.
“When they’re in the float environment, they experience internal signals more strongly,” Khalsa says. “And having that experience may then confer a different understanding of the brain-body relationship that they have.”

Studies have implicated problems with this inner sense in a wide variety of conditions, including anxiety disorders, post-traumatic stress disorder and borderline personality disorder. Some researchers and clinicians now think that problems in interoception might contribute to many mental illnesses. Alongside this research, which itself is complicated by challenges in testing design and by a less than clear understanding of interoception, other groups are also developing therapies that aim to target this inner sense and boost psychological well-being.

This work is circling in on a central message: the body and mind are inextricably intertwined. “We have always thought about [mental health conditions] as being in the brain or the mind,” says Camilla Nord, a professor of cognitive neuroscience at the University of Cambridge. But clinicians have long noted that people with mental illness frequently report physical symptoms such as abnormalities in heartbeats, breathing and appetite, she adds.

The idea that the body can influence the mind dates back centuries. In the 1800s two psychologists on opposite ends of the globe independently proposed a then novel idea: emotions are the result of bodily reactions to a specific event.
Called the James-Lange theory after its founders, American psychologist William James and Danish doctor Carl Lange, this view ran counter to the long-dominant belief that emotions were the cause, not a consequence, of corresponding physiological changes.

Although this notion has garnered critics, it inspired a slew of studies. The 1980s saw a surge of interest in the role of physiological signals in panic disorders. Researchers discovered that they could bring on panic attacks by asking people to inhale carbon dioxide–enriched air, which can increase breathing rates, or by injecting them with isoproterenol, a drug that increases heart rate.

These findings led some psychologists to suggest that physical sensations were the primary trigger of panic attacks. In the early 1990s Anke Ehlers, a psychologist then at the University of Göttingen in Germany, and her team examined dozens of people with panic disorders and reported that these patients were better able to perceive their heartbeats than healthy individuals—and that this greater awareness was linked to more severe symptoms. On top of that, a small, preliminary study by Ehlers of 17 patients revealed that those who were more skilled at this task were more likely to relapse and start having panic attacks again. These observations hinted at a two-way dynamic: not only could physical sensations within the body cause psychological effects, but the ability to perceive and interpret those signals—in other words, one’s interoceptive ability—could have a profound influence on mental health.

Over the years a growing body of evidence has indicated that interoception plays an important role in shaping both emotions and psychological health. A large chunk of this work has focused on the heart.
With every heartbeat, blood rushes into the arteries and triggers sensors known as baroreceptors, which shoot off messages to the brain conveying information about how strongly and rapidly the heart is beating.

In one pivotal 2014 study, Hugo Critchley, a neuropsychiatrist at Brighton and Sussex Medical School in England, and his team reported that this process can affect a person’s sensitivity to fear. By monitoring volunteers’ heartbeats while they viewed fearful or neutral faces, they found that people detected fearful faces more easily and judged them as more intense when their heart was pumping out blood than when it was relaxing and refilling. But participants with higher levels of anxiety often perceived fear even when their hearts relaxed.

Researchers have also demonstrated that bodily signals such as breathing patterns and gut rhythms can influence emotional reactions. People are quicker to react to fearful faces while breathing in than while breathing out, and breathing rate can affect how someone perceives the intensity and unpleasantness of pain.

In more recent work, some neuroscientists have turned their attention to the gastrointestinal system. In 2021 Nord and her colleagues discovered that people given a dose of an antinausea drug that affects gut rhythms—processes within the stomach that help digestion—were less likely to look away from pictures of feces than they normally would have been. These disgust-related visceral signals, Nord speculates, may be relevant to eating disorders. “It’s possible that some of these signals contribute to feeling aversion to signals of satiety, making satiety very uncomfortable, a feeling you don’t want to feel,” she says.

But how, exactly, do disruptions in interoception come about? Many researchers suspect it may have to do with our brain’s predictions going awry.
Interoception, like our other senses, feeds information to the brain, which some neuroscientists suggest is a prediction machine: it constantly uses our prior knowledge of the world to make inferences about incoming signals. In the case of interoception, the brain attempts to decode the cause of internal sensations. If its interpretations are incorrect, they may lead to negative psychological effects—for example, if a person erroneously assumes their heart rate is elevated, they may begin to feel anxious in the absence of a threat. And if a person has learned to associate pangs of hunger with disgust, they might severely restrict how much food they consume.

Inner signals can be much more ambiguous than the external input from other senses such as sight and hearing. So the brain’s prior information about these internal signals becomes especially important, says André Schulz, a professor of psychology at the University of Luxembourg.

To better understand and assess potential mismatches in subjective and objective measures of our bodily signals, researchers have developed a framework that captures the different dimensions in which interoceptive processing occurs. In 2015 Sarah Garfinkel, then a postdoctoral researcher in Critchley’s group at Brighton and Sussex, and her colleagues proposed a model to clearly differentiate three categories of interoceptive processing: interoceptive accuracy (how well someone performs, objectively, on relevant tasks such as heartbeat detection), interoceptive sensibility (a person’s subjective evaluation of their interoceptive abilities), and interoceptive awareness (how well that self-assessment matches their actual abilities).

Along with their collaborators, Garfinkel, now at University College London, and Critchley have found that in autistic adults there is a link between anxiety and a poor ability to predict one’s interoceptive skill—in this case, one’s sensitivity to heartbeat.
In a study of 40 people (20 of whom had autism), they and their colleagues discovered that individuals with autism performed worse on a heartbeat-detection task and were more likely to overestimate their interoceptive abilities than those without autism. This disconnect was more pronounced in people with higher levels of anxiety, suggesting that errors in the ability to predict bodily signals may contribute to feelings of anxiety, Critchley says.

In recent years the list of psychiatric conditions linked to interoceptive dysfunctions has grown. Some, such as panic and anxiety disorders, are associated with heightened attention to one’s internal processes; others, such as borderline personality disorder and schizophrenia, may be tied to a blunting of one’s ability to connect with these signals. In a review of interoception research, published in 2021, Nord and her colleagues examined 33 studies that collectively involved more than 1,200 participants. They found that people with a range of psychiatric disorders, including anxiety disorders and schizophrenia, shared similar alterations in the insula, a key brain region linked with interoception, during body-sensing-related tasks.

Overall, though, studies show mixed results. “If you look across the literature, [however many] studies have found an association with, say, anxiety, [a roughly] equal amount will have not found a relationship or found it in the other direction,” says Jennifer Murphy, a psychologist at the University of Surrey in England.

The varying results could stem from the challenges associated with studying interoception, which can be difficult to both manipulate and measure. Take cardiac interoception. In most early studies in this domain, participants counted their pulses, but this test may measure people’s estimate of their heart rate rather than how well they can feel their heartbeat.
This flaw was perhaps most clearly demonstrated in a 1999 study in which people with pacemakers reported their heart rates while experimenters (with the participants’ consent) secretly tuned their pacemakers’ timing up or down. Participants’ self-reported heart rates didn’t follow the shifts in the actual pulses; their beliefs about how their heart rates should be changing had a much stronger influence.

To address these limitations, scientists have been devising better study methods. Micah Allen, a neuroscientist at Aarhus University in Denmark, and his team have developed a heart-rate-discrimination task in which people are asked to report whether a series of tones is faster or slower than their current pulse, allowing researchers to quantify how sensitive an individual is to their heartbeats. Allen and his colleagues are now testing breath interoception in a similar way. Using a computer-controlled device, the researchers can make precise changes to the air resistance someone feels when they inhale through a tube. By doing so, they can quantify how well the person can detect changes in their breathing.

Using these new techniques, Allen’s team has learned that an individual’s interoceptive chops don’t translate across all domains. In a recent preprint study of 241 people, they found that a person’s ability to perceive their heart rate wasn’t correlated to their performance in a breathing-resistance task.

Researchers have also been combining these behavioral tests with measurements of brain activity. One example is the heartbeat-evoked potential, a spike in brain signaling that occurs each time the heart beats. Scientists have found that changes in these signals, which can be detected with noninvasive brain-imaging techniques such as electroencephalography, are linked to accuracy in heartbeat-detection tasks and to the ability to process emotions.
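In its simplest form, a heart-rate-discrimination task like the one described above is scored as the proportion of trials on which a listener correctly judges a tone sequence as faster or slower than their own pulse. The sketch below is purely illustrative (the function name, trial format, and scoring are assumptions for this example, not Allen's actual protocol, which uses adaptive psychophysical methods):

```python
def discrimination_accuracy(true_hr, trials):
    """Score a simplified heart-rate-discrimination task.

    true_hr: the participant's actual heart rate in beats per minute.
    trials: list of (tone_bpm, response) pairs, where response is the
    participant's judgment ("faster" or "slower") of the tones
    relative to their own pulse. Returns the proportion of trials
    judged correctly, a crude index of cardiac interoceptive accuracy.
    """
    correct = 0
    for tone_bpm, response in trials:
        truth = "faster" if tone_bpm > true_hr else "slower"
        if response == truth:
            correct += 1
    return correct / len(trials)

# A perfectly sensitive observer gets every trial right.
trials = [(78, "faster"), (62, "slower"), (75, "faster"), (65, "slower")]
print(discrimination_accuracy(70, trials))  # 1.0
```

In real studies the tone rates are varied systematically around the measured pulse, so researchers can estimate the smallest difference a person reliably detects rather than a single proportion-correct score.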
Similar brain signals related to organs such as the gut and those of the respiratory system have been linked to the ability to perceive sensations within those organs.

These studies indicate that interoception abilities don’t align across a person’s bodily functions, from breathing to heart rate to gut rhythms. It’s therefore difficult to know whether the conflicting findings about the role of interoception disruptions in mental disorders mean there is no meaningful relation to be found or whether the issue is that researchers have simply not been using the right task or studying the most relevant system or level of interoception, Murphy says. “It’s very unlikely that every condition will have the same bit of interoception disrupted.”

Untangling how, exactly, interoception is disrupted in people with mental illness remains an active area of investigation. Some experts say answers may come from treatment trials investigating whether interventions that target disturbances in this inner sense might boost mental health. Many such studies are currently underway.

“To understand what interoception is, we need to manipulate it,” Allen says. “And to understand its role as a biomarker, as something that is related to mental health, we also need to manipulate it.”

Jane Green knows stressful situations can have immediate effects on her body. For Green, who has autism, reading a piece of bad news or dealing with a face-to-face confrontation may set off a chain reaction in her body: a rush of adrenaline followed by a pounding heart, bloating and itchiness, among other physical reactions.

Such responses may be linked to an inability to read one’s inner body. In 2019 she took part in a clinical trial in which Critchley, Garfinkel and their colleagues sought to test just that—how resolving a discrepancy between a person’s perceived interoceptive abilities and reality could improve anxiety levels in adults with autism spectrum disorder.
The intervention in the study focused on tasks that involved heartbeat detection. After training and testing 121 participants (half of whom were randomly assigned to receive a noninteroception-based control task) across six sessions, the team reported in a 2021 paper that this treatment successfully reduced anxiety in their participants and that this effect persisted for at least three months.

Participating in the trial was a “real turning point” in Green’s ability to deal with anxiety, she says. “I recognize now that when I’m stressed, whether I like it or not, my body reacts,” she explains. Although she still experiences physical reactions to emotionally charged situations, they are often less severe than they were prior to the treatment. And her knowledge of what’s happening in her body has made it easier to cope, she adds.

Green is chair and founder of SEDSConnective, a charity dedicated to neurodivergent people with connective tissue disorders such as Ehlers-Danlos syndromes. These conditions tend to overlap with anxiety disorders, and Green is now advocating for interoception-based therapies to help affected people.

For May, who participated in the flotation-REST trial, what she learned from being cut off from the external world helped her to get through an inpatient stay at an eating-disorder clinic where she was being forced to eat—and, as a result, gain weight. “You’re working on the things that drove you to come in the first place, but at the same time, your distress with your body is getting worse and worse,” she says. When she was in the flotation chamber, however, May’s awareness of her physical body would slip away, reducing some of the negative feelings she had about herself and quieting the worries that swirled in her mind. “You can’t tell where your body stops and the water begins,” May says.
“Because you’re completely buoyant, you also have no sense of the ways that your body distresses you.”

Flotation-REST shows promise: in a clinical trial of 68 people hospitalized for anorexia nervosa who were randomly assigned to the therapy or a placebo, Khalsa’s team found that six months after treatment, those who received therapy reported less body dissatisfaction than those who did not. The researchers have also created a version of this therapy for anxiety and depression. In early-stage clinical trials, this intervention appeared to reduce the symptoms of those disorders. Now they are investigating whether this therapy might also benefit people with amphetamine use disorder.

Other interoception-based treatments are also under investigation. At Emory University, a group led by clinical psychologist Negar Fani has been examining the effects of combining a mindfulness-based intervention with a wearable device that delivers vibrations corresponding to a person’s breaths. In a group of trauma-exposed individuals, this intervention increased the participants’ confidence in their bodily signals more than the mindfulness-based intervention alone. Even long after these sessions, people report being able to recall the feeling of breath-synced vibrations, Fani says. “It helps to ground them, brings them back into the present moment. They can access their body signals and figure out what they want to do with them.” The team is now conducting a follow-up study to see whether this treatment can improve mental well-being in people who have experienced trauma.

In yet another ongoing trial, Nord is collaborating with Garfinkel on a series of studies aimed at understanding in which body systems—and in which of the three dimensions (accuracy, sensibility and awareness)—interoception is altered in people with various mental disorders, among them anxiety and depression.
As part of that effort, the researchers are testing interventions, including interoceptive training, mindfulness therapy—to help improve the mind-body connection—and stimulation of the insula with focused ultrasound.

Scientists still have plenty of questions to answer about interoception. One major open question is how differences in interoception arise. Some of our interoceptive abilities may begin taking shape during early infancy. Scientists have discovered that babies as young as three months show differences in the amount of time they spend looking at colored shapes moving either in or out of sync with their heartbeats—a finding that suggests our ability to sense heart rhythms is present at this young age.

Interactions with caregivers during one’s first years may play a crucial role in determining how in tune a person becomes with their body. The way a parent responds to an infant’s cues about being hungry, tired or in pain, for instance, may shape how well the child is able to interpret those signals later in life. Although direct evidence for this hypothesis is still lacking, studies have shown that an individual’s early caregiving environment can shape how their body responds to stress.

Other factors such as a person’s sex or various environmental conditions, including adversity in early life, may also influence how interoception develops. Some research suggests that adverse experiences, especially chronic, interpersonal trauma early in life, may be key contributors. Clinicians have long observed that traumatic events can lead people to detach or “dissociate” from the body, and some researchers have proposed that this disconnect can disrupt interoceptive processes over time.
For a subset of people, these alterations might be linked to an increased likelihood of self-harm and suicide: one 2020 study, for example, found that people with a history of suicide attempts and a diagnosed mental illness, such as anxiety, PTSD or depression, were worse at an interoceptive heartbeat-detection task than those who had the same ailments but had not attempted to take their own life.

A person’s interoceptive abilities may change over time. Interoceptive capabilities might be especially malleable during certain life stages: periods such as early childhood, when a person is just learning how to interpret their bodily signals, or adolescence, when puberty is creating a whirlwind of physical changes. It might be one mechanism, among many, that explains why “these times tend to be risk periods for the development of mental illness,” Murphy says.

The boundaries of interoception are also only beginning to be understood. In recent years some scientists have become interested in probing the links between the immune system and the brain, which are in constant conversation. An emerging body of work suggests that the brain both keeps tabs on and influences what happens in the immune system, and the immune system can in turn affect the brain. Studies have linked dysfunction in the immune system—namely, inflammation—to mental illnesses such as depression, psychosis and trauma-related disorders. The immune system may affect our mental states over much longer time scales than, say, the heart, which can influence our emotional experiences in real time.

Understanding the mysteries of interoception may lead to better therapies for illnesses of the mind—and the body. Some researchers believe that understanding interoception may ultimately be helpful for treating physical symptoms as well.
Schulz and his team, for instance, are currently evaluating interoception-based treatments for chronic fatigue syndrome (also known as myalgic encephalomyelitis), a complicated disorder that causes a range of symptoms, including severe tiredness. “Interoception has so much relevance to health in general,” Fani says. “We can’t ignore it anymore.”

IF YOU NEED HELP

If you or someone you know is struggling or having thoughts of suicide, help is available. Call or text the 988 Suicide & Crisis Lifeline at 988 or use the online Lifeline Chat at chat.988lifeline.org.

Deep-learning model predicts how fruit flies form, cell by cell

The approach could apply to more complex tissues and organs, helping researchers to identify early signs of disease.

During early development, tissues and organs begin to bloom through the shifting, splitting, and growing of many thousands of cells.

A team of MIT engineers has now developed a way to predict, minute by minute, how individual cells will fold, divide, and rearrange during a fruit fly’s earliest stage of growth. The new method may one day be applied to predict the development of more complex tissues, organs, and organisms. It could also help scientists identify cell patterns that correspond to early-onset diseases, such as asthma and cancer.

In a study appearing today in the journal Nature Methods, the team presents a new deep-learning model that learns, then predicts, how certain geometric properties of individual cells will change as a fruit fly develops. The model records and tracks properties such as a cell’s position, and whether it is touching a neighboring cell at a given moment.

The team applied the model to videos of developing fruit fly embryos, each of which starts as a cluster of about 5,000 cells. They found the model could predict, with 90 percent accuracy, how each of the 5,000 cells would fold, shift, and rearrange, minute by minute, during the first hour of development, as the embryo morphs from a smooth, uniform shape into more defined structures and features.

“This very initial phase is known as gastrulation, which takes place over roughly one hour, when individual cells are rearranging on a time scale of minutes,” says study author Ming Guo, associate professor of mechanical engineering at MIT. “By accurately modeling this early period, we can start to uncover how local cell interactions give rise to global tissues and organisms.”

The researchers hope to apply the model to predict the cell-by-cell development in other species, such as zebrafish and mice. Then, they can begin to identify patterns that are common across species. The team also envisions that the method could be used to discern early patterns of disease, such as in asthma.
Lung tissue in people with asthma looks markedly different from healthy lung tissue. How asthma-prone tissue initially develops is an unknown process that the team’s new method could potentially reveal.

“Asthmatic tissues show different cell dynamics when imaged live,” says co-author and MIT graduate student Haiqian Yang. “We envision that our model could capture these subtle dynamical differences and provide a more comprehensive representation of tissue behavior, potentially improving diagnostics or drug-screening assays.”

The study’s co-authors are Markus Buehler, the McAfee Professor of Engineering in MIT’s Department of Civil and Environmental Engineering; George Roy and Tomer Stern of the University of Michigan; and Anh Nguyen and Dapeng Bi of Northeastern University.

Points and foams

Scientists typically model how an embryo develops in one of two ways: as a point cloud, where each point represents an individual cell that moves over time; or as a “foam,” which represents individual cells as bubbles that shift and slide against each other, similar to the bubbles in shaving foam.

Rather than choose between the two approaches, Guo and Yang embraced both.

“There’s a debate about whether to model as a point cloud or a foam,” Yang says. “But both of them are essentially different ways of modeling the same underlying graph, which is an elegant way to represent living tissues. By combining these as one graph, we can highlight more structural information, like how cells are connected to each other as they rearrange over time.”

At the heart of the new model is a “dual-graph” structure that represents a developing embryo as both moving points and bubbles.
Through this dual representation, the researchers hoped to capture more detailed geometric properties of individual cells, such as the location of a cell’s nucleus, whether a cell is touching a neighboring cell, and whether it is folding or dividing at a given moment in time.

As a proof of principle, the team trained the new model to “learn” how individual cells change over time during fruit fly gastrulation.

“The overall shape of the fruit fly at this stage is roughly an ellipsoid, but there are gigantic dynamics going on at the surface during gastrulation,” Guo says. “It goes from entirely smooth to forming a number of folds at different angles. And we want to predict all of those dynamics, moment to moment, and cell by cell.”

Where and when

For their new study, the researchers applied the new model to high-quality videos of fruit fly gastrulation taken by their collaborators at the University of Michigan. The videos are one-hour recordings of developing fruit flies, taken at single-cell resolution. What’s more, the videos contain labels of individual cells’ edges and nuclei — data that are incredibly detailed and difficult to come by.

“These videos are of extremely high quality,” Yang says. “This data is very rare, where you get submicron resolution of the whole 3D volume at a pretty fast frame rate.”

The team trained the new model with data from three of four fruit fly embryo videos, such that the model might “learn” how individual cells interact and change as an embryo develops. They then tested the model on an entirely new fruit fly video, and found that it was able to predict with high accuracy how most of the embryo’s 5,000 cells changed from minute to minute.

Specifically, the model could predict properties of individual cells, such as whether they will fold, divide, or continue sharing an edge with a neighboring cell, with about 90 percent accuracy.

“We end up predicting not only whether these things will happen, but also when,” Guo says.
“For instance, will this cell detach from this cell seven minutes from now, or eight? We can tell when that will happen.”

The team believes that, in principle, the new model and the dual-graph approach should be able to predict the cell-by-cell development of other multicellular systems, such as more complex species, and even some human tissues and organs. The limiting factor is the availability of high-quality video data.

“From the model perspective, I think it’s ready,” Guo says. “The real bottleneck is the data. If we have good-quality data of specific tissues, the model could be directly applied to predict the development of many more structures.”

This work is supported, in part, by the U.S. National Institutes of Health.
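The evaluation protocol in the study, training on some embryos and scoring per-cell event predictions on a held-out one, boils down to a simple accuracy computation. In this sketch the labels are synthetic and the "model" is a stand-in that is deliberately right about 90 percent of the time; nothing here comes from the actual system:

```python
import random

random.seed(0)
EVENTS = ["fold", "divide", "keep_edge"]

# Synthetic ground-truth events for each of 5,000 cells in a held-out embryo.
truth = [random.choice(EVENTS) for _ in range(5000)]

# Stand-in "model": correct ~90% of the time, otherwise picks a wrong event.
preds = [
    t if random.random() < 0.9 else random.choice([e for e in EVENTS if e != t])
    for t in truth
]

# Per-cell event accuracy: the fraction of cells whose event was predicted right.
accuracy = sum(p == t for p, t in zip(preds, truth)) / len(truth)
print(f"per-cell event accuracy: {accuracy:.1%}")
```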

Ignore the Influencers: Simple Showers Are Still Best


By Carole Tanzer Miller, HealthDay Reporter

SATURDAY, Dec. 13, 2025 (HealthDay News) — Listen to the influencers, skin-care specialists say, and your daily shower could do more harm than good.

“Your skin is a barrier,” Dr. Nicole Negbenebor, a dermatologic surgeon at University of Iowa Health Care, told The Associated Press. “So you want to treat it right, and then sometimes there can be too much of a good thing.”

If you’re double-cleansing, exfoliating, piling on scented body rubs and shower oils and spending a lot of time in the water, you’re probably going overboard, she and other skin-care experts agree.

A daily shower with lukewarm water and a hypoallergenic, preferably fragrance-free cleanser, followed by a slather of lotion or oil, is all you need, they say.

Here’s a guide from dermatologists to sudsing up without getting carried away:

Pay attention to time and temperature. Staying in the shower too long or cranking the temperature up too high can strip away the natural oils your skin needs. The upshot: you’ll be dry and irritated.

Pick the right soap. Choose one for sensitive skin, dermatologists suggest, and avoid antibacterial soaps, which can cause dryness. (Antibacterial soaps can, however, be beneficial for people with hidradenitis suppurativa, a chronic inflammatory skin condition that causes abscesses and boils, they point out.)

Despite the influencers, double-cleansing isn’t necessary. There’s no need, doctors say, to use an oil-based cleanser to break down makeup and excess oil and then a water-based cleanser to remove any residue. And, they add, you sure don’t need to do that to your whole body.

“People overuse soap all the time,” Dr. Olga Bunimovich, an assistant professor of dermatology at the University of Pittsburgh, told The AP. “You should not be soaping up all of your skin, period.” Instead, she advised, use soap to wash skin folds and your privates.

Oil up.
Once you’re out of the shower but still damp, an oil will lock in moisture and hydrate the skin, Negbenebor said. Just remember: oil itself is a sealant, not a moisturizer.

Don’t go overboard with exfoliating. Using a body scrub or loofah to remove dead cells is good for the skin, but not every day, especially if you have dry skin, acne or eczema. Products that contain lactic or glycolic acid are a gentler way to exfoliate, but again, not all the time.

While you’re being kind to your skin, think about the environment, too. Nearly 17% of U.S. indoor water use happens in the shower, according to the U.S. Environmental Protection Agency. Shorter showers are good for the earth — and a lukewarm one that lasts just long enough to clean your body should be sufficient most of the time.

The University of Nebraska-Lincoln has more about showering.

SOURCE: The Associated Press, July 10, 2025

Copyright © 2025 HealthDay. All rights reserved.

New method improves the reliability of statistical estimations

The technique can help scientists in economics, public health, and other fields understand whether to trust the results of their experiments.

Let’s say an environmental scientist is studying whether exposure to air pollution is associated with lower birth weights in a particular county. They might train a machine-learning model to estimate the magnitude of this association, since machine-learning methods are especially good at learning complex relationships.

Standard machine-learning methods excel at making predictions and sometimes provide uncertainties, like confidence intervals, for those predictions. However, they generally don’t provide estimates or confidence intervals when determining whether two variables are related. Other methods have been developed specifically to address this association problem and provide confidence intervals. But in spatial settings, MIT researchers found, these confidence intervals can be completely off the mark.

When variables like air pollution levels or precipitation change across different locations, common methods for generating confidence intervals may claim a high level of confidence when, in fact, the estimation completely failed to capture the actual value. These faulty confidence intervals can mislead the user into trusting a model that failed.

After identifying this shortfall, the researchers developed a new method designed to generate valid confidence intervals for problems involving data that vary across space. In simulations and experiments with real data, their method was the only technique that consistently generated accurate confidence intervals.

This work could help researchers in fields like environmental science, economics, and epidemiology better understand when to trust the results of certain experiments.

“There are so many problems where people are interested in understanding phenomena over space, like weather or forest management.
We’ve shown that, for this broad class of problems, there are more appropriate methods that can get us better performance, a better understanding of what is going on, and results that are more trustworthy,” says Tamara Broderick, an associate professor in MIT’s Department of Electrical Engineering and Computer Science (EECS), a member of the Laboratory for Information and Decision Systems (LIDS) and the Institute for Data, Systems, and Society, an affiliate of the Computer Science and Artificial Intelligence Laboratory (CSAIL), and senior author of this study.

Broderick is joined on the paper by co-lead authors David R. Burt, a postdoc, and Renato Berlinghieri, an EECS graduate student; and by Stephen Bates, an assistant professor in EECS and member of LIDS. The research was recently presented at the Conference on Neural Information Processing Systems.

Invalid assumptions

Spatial association involves studying how a variable and a certain outcome are related over a geographic area. For instance, one might want to study how tree cover in the United States relates to elevation.

To solve this type of problem, a scientist could gather observational data from many locations and use it to estimate the association at a different location where they do not have data.

The MIT researchers realized that, in this case, existing methods often generate confidence intervals that are completely wrong. A model might say it is 95 percent confident its estimation captures the true relationship between tree cover and elevation, when it didn’t capture that relationship at all.

After exploring this problem, the researchers determined that the assumptions these confidence-interval methods rely on don’t hold up when data vary spatially.

Assumptions are like rules that must be followed to ensure the results of a statistical analysis are valid.
Common methods for generating confidence intervals operate under several assumptions.

First, they assume that the source data, the observational data gathered to train the model, are independent and identically distributed. This assumption implies that the chance of including one location in the data has no bearing on whether another is included. But, for example, U.S. Environmental Protection Agency (EPA) air sensors are placed with other air sensor locations in mind.

Second, existing methods often assume that the model is perfectly correct, an assumption that is never true in practice. Finally, they assume the source data are similar to the target data, where one wants to estimate.

But in spatial settings, the source data can be fundamentally different from the target data, because the target data are in a different location than where the source data were gathered.

For instance, a scientist might use data from EPA pollution monitors to train a machine-learning model that can predict health outcomes in a rural area where there are no monitors. But the EPA pollution monitors are likely placed in urban areas, where there is more traffic and heavy industry, so the air quality data will be much different from the air quality data in the rural area.

In this case, estimates of association using the urban data suffer from bias, because the target data are systematically different from the source data.

A smooth solution

The new method for generating confidence intervals explicitly accounts for this potential bias. Instead of assuming the source and target data are similar, the researchers assume the data vary smoothly over space.

For instance, with fine particulate air pollution, one wouldn’t expect the pollution level on one city block to be starkly different from the pollution level on the next city block.
Instead, pollution levels would smoothly taper off as one moves away from a pollution source.

“For these types of problems, this spatial smoothness assumption is more appropriate. It is a better match for what is actually going on in the data,” Broderick says.

When they compared their method with other common techniques, the researchers found it was the only one that consistently produced reliable confidence intervals for spatial analyses. In addition, their method remains reliable even when the observational data are distorted by random errors.

In the future, the researchers want to apply this analysis to different types of variables and explore other applications where it could provide more reliable results.

This research was funded, in part, by an MIT Social and Ethical Responsibilities of Computing (SERC) seed grant, the Office of Naval Research, Generali, Microsoft, and the National Science Foundation (NSF).
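The failure mode this article describes can be reproduced in a toy simulation: sample "monitor" data from one region of a smoothly varying pollution field, build a standard i.i.d. confidence interval, and check how often it covers the true value at a distant target site. The trend function and all numbers are invented; this illustrates the problem only, not the researchers' method:

```python
import math
import random

random.seed(0)

# Toy 1-D map: pollution varies smoothly with location x in [0, 1].
def pollution(x):
    return 1.0 + 2.0 * x  # invented smooth spatial trend

TARGET_X = 0.9   # "rural" site where we actually want the estimate
TRIALS, N = 500, 200

covered = 0
for _ in range(TRIALS):
    # Monitors cluster near x = 0 (the "urban" end), as sensor networks often do.
    xs = [random.uniform(0.0, 0.3) for _ in range(N)]
    ys = [pollution(x) + random.gauss(0.0, 0.1) for x in xs]
    mean = sum(ys) / N
    sd = math.sqrt(sum((y - mean) ** 2 for y in ys) / (N - 1))
    # Naive i.i.d. 95% confidence interval, reused at the shifted target site.
    half = 1.96 * sd / math.sqrt(N)
    covered += (mean - half) <= pollution(TARGET_X) <= (mean + half)

coverage = covered / TRIALS
print(f"claimed 95% coverage, actual: {coverage:.0%}")  # far below 95%
```

Because the interval is centered on the urban mean, it essentially never contains the rural value; an interval that instead exploits the smoothness of the field across space is the kind of fix the article describes.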

Gas Stoves Are Poisoning Americans by Releasing Toxic Fumes Associated With Asthma and Lung Cancer

In the United States, gas stoves are the main source of indoor nitrogen dioxide—a toxic gas tied to many health problems—according to a new study

Sarah Kuta, Daily Correspondent
December 11, 2025, 9:13 a.m.

Gas stoves are responsible for more than half of some Americans’ total exposure to toxic nitrogen dioxide, a new study suggests. (Image: Pexels)

A hidden danger may be lurking in your kitchen. Many Americans are breathing in nitrogen dioxide—a harmful pollutant that’s been linked with asthma and lung cancer—from fumes emitted by their gas stoves.

A new study, published this month in the journal PNAS Nexus, suggests that gas stoves are the main source of indoor nitrogen dioxide pollution in the United States, responsible for more than half of some Americans’ total exposure to the gas.

“We’ve spent billions of dollars cleaning up our air outdoors and nothing to clean up our air indoors,” study co-author Robert Jackson, an environmental scientist at Stanford University, tells SFGATE’s Anna FitzGerald Guth. “As our air outdoors gets cleaner and cleaner, a higher proportion of the pollution we breathe comes from indoor sources.”

Scientists and public health experts have long known that nitrogen dioxide is bad for human health. The reddish-brown gas can irritate airways and worsen, or even contribute to the development of, respiratory diseases like asthma. Children and older individuals are particularly susceptible to its effects.

Nitrogen dioxide is a byproduct of burning fuel, so most emissions come from vehicles, power plants and off-road equipment. Indoors, however, the primary culprit is the gas stove, the household appliance that burns natural gas or propane to produce controlled flames under individual burners.
It’s relatively easy to keep tabs on outdoor nitrogen dioxide concentrations and estimate the corresponding exposure risks, thanks to satellites and ground-level stations located across the country. By contrast, indoor sources are “neither systematically monitored nor estimated,” the researchers write in the paper.

Did you know? Bans on gas: Berkeley, California, became the first city to prohibit gas hookups in most new buildings in 2019, although the ordinance was halted in 2024 after the California Restaurant Association sued. Still, 130 local governments have now implemented zero-emission building ordinances, according to the Building Decarbonization Coalition.

For the study, Jackson and his colleagues performed a ZIP-code-level estimate of how much total nitrogen dioxide communities are exposed to. Information came from two databases tracking outdoor nitrogen dioxide concentrations and a building energy use database, which helped the team reconstruct the characteristics of 133 million residential dwellings across the country, along with their home appliances.

Among individuals who use gas stoves, the appliances are responsible for roughly a quarter of overall nitrogen dioxide exposure on average, the team found. For those who cook more frequently or for longer durations, gas stoves can be responsible for as much as 57 percent of total exposure.

“Our research shows that if you use a gas stove, you’re often breathing as much nitrogen dioxide pollution indoors from your stove as you are from all outdoor sources combined,” Jackson says in a Stanford statement.

Individuals who use gas stoves are exposed to roughly 25 percent more total residential nitrogen dioxide over the long term than those who use electric stoves, which do not emit the gas. Total exposure tends to be highest in big cities, where people often have small living spaces and outdoor levels are also high.
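The exposure shares reported above follow from simple bookkeeping. The raw exposure values in this sketch are invented, chosen only so the resulting shares line up with the study's figures:

```python
# Fraction of total residential NO2 exposure attributable to the gas stove.
# All exposure values are invented, in arbitrary units.
def stove_share(from_stove, from_everything_else):
    return from_stove / (from_stove + from_everything_else)

typical = stove_share(1.0, 3.0)        # stove is roughly a quarter of the total
heavy_cooking = stove_share(4.0, 3.0)  # long, frequent cooking: over half
print(f"typical: {typical:.0%}, heavy cooking: {heavy_cooking:.0%}")
```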
Switching from a gas to an electric stove would help roughly 22 million Americans dip below the maximum nitrogen dioxide exposure levels recommended by the World Health Organization, the analyses suggest. The authors recommend replacing gas stoves with electric models whenever possible.

“You would never willingly stand over the tailpipe of your car, breathing in pollution,” Jackson tells Women’s Health’s Korin Miller. “Why breathe the same toxins every day in your kitchen?”

Dylan Plummer, acting deputy director for building electrification at the Sierra Club, a nonprofit environmental organization, agrees. Plummer, who was not involved with the research, tells Inside Climate News’ Phil McKenna that “years from now, we will look back at the common practice of burning fossil fuels in our homes with horror.”

If swapping stoves is not possible, experts have some other tips for reducing nitrogen dioxide exposure. “One thing people could do is to minimize the time the stoves are on,” Jamie Alan, a toxicologist at Michigan State University who was not involved with the research, tells Women’s Health. “Another suggestion would be to increase ventilation,” such as by turning on the range hood and opening a window.

Other suggestions, from the New York Times’ Rachel Wharton, include using a portable induction countertop unit or electric kitchen gadgets like tea kettles, toaster ovens and slow cookers.
