
TOP SCIENCE

Large wildfires create weather that favors more fire



A new UC Riverside study shows that soot from large wildfires in California traps sunlight, making days warmer and drier than they would otherwise be.

Many studies look at the effect of climate change on wildfires. However, this study sought to understand the reverse — whether large fires are also changing the climate.

“I wanted to learn how the weather is affected by aerosols emitted by wildfires as they’re burning,” said lead study author and UCR doctoral candidate James Gomez.

To find his answers, Gomez analyzed peak fire days and emissions from every fire season over the past 20 years. Of these fire days, he examined a subset that occurred when temperatures were lower, and humidity was higher. “I looked at abnormally cool or wet days during fire season, both with and without fires. This mostly takes out the fire weather effects,” Gomez said.
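The article doesn’t reproduce the study’s data or code; purely as an illustration of the comparison Gomez describes, the sketch below uses invented numbers to filter fire-season days to abnormally cool or wet ones and then compares temperatures on days with and without large fires (all variable names and values are hypothetical).

```python
import numpy as np
import pandas as pd

# Synthetic stand-in for the study's data: one row per fire-season day,
# with daily temperature, relative humidity, and wildfire aerosol emissions.
# (All values are invented; the real analysis used 20 years of California data.)
rng = np.random.default_rng(1)
n_days = 2000
days = pd.DataFrame({
    "temp_c": rng.normal(30, 5, n_days),
    "rel_humidity": rng.normal(35, 10, n_days),
    "fire_emissions": rng.choice([0.0, 1.0], n_days, p=[0.7, 0.3]),
})

# Keep only abnormally cool or wet days, which largely removes
# ordinary "fire weather" from the comparison.
cool_or_wet = (days["temp_c"] < days["temp_c"].median()) | (
    days["rel_humidity"] > days["rel_humidity"].median()
)
subset = days[cool_or_wet]

# Compare those days with and without large fires burning.
has_fire = subset["fire_emissions"] > 0
delta_t = subset.loc[has_fire, "temp_c"].mean() - subset.loc[~has_fire, "temp_c"].mean()
print(f"Temperature difference, fire vs. non-fire days: {delta_t:.2f} °C")
```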

Published in the journal Atmospheric Chemistry and Physics, the study found that large fires did have an effect. They made it hotter and drier than usual on the days the fires burned. The extra heat and aridity may then make conditions favorable for more fire.

“It appears these fires are creating their own fire weather,” Gomez said.

The most intense fires occurred in Northern California, where fire-fueling vegetation is denser than elsewhere in the state. On average, daily temperatures were about 1 degree Celsius warmer while the fires burned.

There are likely two reasons for this. First, soot traps heat. Second, the extra heat reduces humidity in the atmosphere, making it more difficult for clouds to form.

“Fires emit smoke with black carbon, or soot. Since it is very dark, the soot absorbs sunlight more readily than bright or reflective things,” Gomez said.

There are two types of aerosols: reflective and absorptive. Sulfate aerosols, which are byproducts of fossil fuel burning, are reflective and can cool the environment. These particles reflect the sun’s energy back into space, keeping it out of the atmosphere.

Recent UCR research points to an unfortunate byproduct of improving air quality by reducing sulfate aerosols. Since these particles have a cooling effect, removing them makes climate change more severe and leads to an increase in wildfires, especially in northern hemisphere forests.

Sulfate aerosols can also help make clouds brighter, more reflective, and more effective at cooling the planet.

The researchers note that the only way to prevent additional wildfires when cleaning up reflective sulfate air pollution is to simultaneously reduce emissions of greenhouse gases like carbon dioxide and methane.

Absorptive aerosols have the opposite effect. They trap light and heat in the atmosphere, which can raise temperatures. Black carbon, the most common aerosol emitted by wildfires, is an absorptive aerosol. It raises temperatures not only directly but also indirectly, by discouraging cloud formation and precipitation.

“What I found is that the black carbon emitted from these California wildfires is not increasing the number of clouds,” Gomez said. “It’s hydrophobic.” Fewer clouds mean less precipitation, which is problematic for drought-prone states.

While some studies have shown an association between fires and brighter, more numerous clouds, this one did not.

Notably, the study found that days with fewer fire emissions had a more muted effect on the weather. “If the aerosols are coming out in smaller amounts and more slowly, the heating effect is not as pronounced,” Gomez said.

Gomez is hopeful that mitigating CO2 emissions, alongside better land management practices, can help reduce the number of large wildfires.

“There is a buildup of vegetation here in California. We need to allow more frequent small fires to reduce the amount of fuel available to burn,” Gomez said. “With more forest management and more prescribed burns, we could have fewer giant fires. That is in our control.”




TOP SCIENCE

Waste Styrofoam can now be converted into polymers for electronics



Researchers at the University of Delaware and Argonne National Laboratory have developed a chemical reaction that can convert Styrofoam into a high-value conducting polymer known as PEDOT:PSS. In a new paper published in JACS Au, they demonstrate how the upgraded plastic waste can be successfully incorporated into functional electronic devices, including silicon-based hybrid solar cells and organic electrochemical transistors.

The research group of corresponding author Laure Kayser, assistant professor in the Department of Materials Science and Engineering in UD’s College of Engineering with a joint appointment in the Department of Chemistry and Biochemistry in the College of Arts and Sciences, regularly works with PEDOT:PSS, a polymer with both electronic and ionic conductivity. The group was interested in finding ways to synthesize this material from plastic waste.

After connecting with Argonne chemist David Kaphan during an event hosted by UD’s research office, the research teams at UD and Argonne began evaluating the hypothesis that PEDOT:PSS could be made by sulfonating polystyrene, a synthetic plastic found in many types of disposable containers and packing materials.

Sulfonation is a common chemical reaction in which a hydrogen atom is replaced by a sulfonic acid group; the process is used to create a variety of products such as dyes, drugs and ion exchange resins. These reactions can be either “hard” (more efficient at conversion but requiring caustic reagents) or “soft” (less efficient but using milder materials).

In this paper, the researchers wanted to find something in the middle: “A reagent that is efficient enough to get really high degrees of functionalization but that doesn’t mess up your polymer chain,” Kayser explained.

The researchers first turned to a method described in a previous study for sulfonating small molecules using 1,3-disulfonic acid imidazolium chloride ([Dsim]Cl), one that showed promising results in terms of efficiency and yield. But adding functional groups onto a polymer is more challenging than onto a small molecule, the researchers explained, because unwanted byproducts are harder to separate and even small errors in the polymer chain can change its overall properties.

To address this challenge, the researchers embarked on many months of trial and error to find the optimal conditions that minimized side reactions, said Kelsey Koutsoukos, a materials science doctoral candidate and second author of this paper.

“We screened different organic solvents, different molar ratios of the sulfonating agent, and evaluated different temperatures and times to see which conditions were the best for achieving high degrees of sulfonation,” he said.
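The article doesn’t list the specific solvents, ratios, temperatures, or times that were screened, so the values below are placeholders; the sketch only shows how such a condition screen can be laid out as a grid of combinations.

```python
from itertools import product

# Placeholder screening grid; the actual conditions tested in the study
# are not given in the article.
solvents = ["solvent A", "solvent B", "solvent C"]
molar_ratios = [1.0, 1.5, 2.0]      # sulfonating agent : styrene repeat unit
temperatures_c = [25, 40, 60]
times_h = [2, 6, 24]

# Enumerate every combination to run and evaluate for degree of
# sulfonation and extent of side reactions.
conditions = list(product(solvents, molar_ratios, temperatures_c, times_h))
print(f"{len(conditions)} reaction conditions to screen")
for solvent, ratio, temp, hours in conditions[:3]:
    print(f"solvent={solvent}, ratio={ratio}, T={temp} °C, t={hours} h")
```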

The researchers were able to find reaction conditions that resulted in high polymer sulfonation, minimal defects and high efficiency, all while using a mild sulfonating agent. And because the researchers were able to use polystyrene, specifically waste Styrofoam, as a starting material, their method also represents an efficient way to convert plastic waste into PEDOT:PSS.

Once the researchers had PEDOT:PSS in hand, they compared how their waste-derived polymer performed against commercially available PEDOT:PSS.

“In this paper, we looked at two devices — an organic electronic transistor and a solar cell,” said Chun-Yuan Lo, a chemistry doctoral candidate and the paper’s first author. “The performance of both types of conductive polymers was comparable, and shows that our method is a very eco-friendly approach for converting polystyrene waste into high-value electronic materials.”

Specific analyses conducted at UD included X-ray photoelectron spectroscopy (XPS) at the surface analysis facility, film thickness analysis at the UD Nanofabrication Facility, and solar cell evaluation at the Institute of Energy Conversion. Argonne’s advanced spectroscopy equipment, such as carbon NMR, was used for detailed polymer characterization. Additional support was provided by materials science and engineering professor Robert Opila for solar cell analysis and by David C. Martin, the Karl W. and Renate Böer Chaired Professor of Materials Science and Engineering, for the electronic device performance analyses.

One unexpected finding related to the chemistry, the researchers added, is the ability to use stoichiometric ratios during the reaction.

“Typically, for sulfonation of polystyrene, you have to use an excess of really harsh reagents. Here, being able to use a stoichiometric ratio means that we can minimize the amount of waste being generated,” Koutsoukos said.

This finding is something the Kayser group will be looking into further as a way to “fine-tune” the degree of sulfonation. So far, they’ve found that by varying the ratio of starting materials, they can change the degree of sulfonation on the polymer. Along with studying how this degree of sulfonation impacts the electrical properties of PEDOT:PSS, the team is interested in seeing how this fine-tuning capability can be used for other applications, such as fuel cells or water filtration devices, where the degree of sulfonation greatly impacts a material’s properties.

“For the electronic devices community, the key takeaway is that you can make electronic materials from trash, and they perform just as well as what you would purchase commercially,” Kayser said. “For the more traditional polymer scientists, the fact that you can very efficiently and precisely control the degree of sulfonation is going to be of interest to a lot of different communities and applications.”

The researchers also see great potential for how this research can contribute to ongoing global sustainability efforts by providing a new way to convert waste products into value-added materials.

“Many scientists and researchers are working hard on upcycling and recycling efforts, either by chemical or mechanical means, and our study provides another example of how we can address this challenge,” Lo said.

The complete list of co-authors includes Chun-Yuan Lo, Kelsey Koutsoukos, Dan My Nguyen, Yuhang Wu, David Angel Trujillo, Tulaja Shrestha, Ethan Mackey, Vidhika Damani, Robert Opila, David Martin, and Laure Kayser from the University of Delaware and Tabitha Miller, Uddhav Kanbur, and David Kaphan from Argonne National Laboratory.




TOP SCIENCE

New snake discovery rewrites history, points to North America’s role in snake evolution



A new species of fossil snake unearthed in Wyoming is rewriting our understanding of snake evolution. The discovery, based on four remarkably well-preserved specimens found curled together in a burrow, reveals a new species named Hibernophis breithaupti. This snake lived in North America 34 million years ago and sheds light on the origin and diversification of boas and pythons.

Hibernophis breithaupti has unique anatomical features, in part because the specimens are articulated, meaning they were found all in one piece with the bones still arranged in the proper order, which is unusual for fossil snakes. Researchers believe it may be an early member of Booidea, a group that includes modern boas and pythons. Modern boas are widespread in the Americas, but their early evolution is not well understood. These new and very complete fossils add important new information, particularly on the evolution of small, burrowing boas known as rubber boas.

Traditionally, there has been much debate on the evolution of small burrowing boas. Hibernophis breithaupti shows that northern and more central parts of North America might have been a key hub for their development. The discovery of these snakes curled together also hints at the oldest potential evidence for a behavior familiar to us today — hibernation in groups.

“Modern garter snakes are famous for gathering by the thousands to hibernate together in dens and burrows,” says Michael Caldwell, a U of A paleontologist who co-led the research along with his former graduate student Jasmine Croghan, and collaborators from Australia and Brazil. “They do this to conserve heat through the effect created by the ball of hibernating animals. It’s fascinating to see possible evidence of such social behavior or hibernation dating back 34 million years.”




TOP SCIENCE

Good timing: Study unravels how our brains track time



Ever hear the old adage that time flies when you’re having fun? A new study by a team of UNLV researchers suggests that there’s a lot of truth to the trope.

Many people think of their brains as being intrinsically synced to the human-made clocks on their electronic devices, counting time in very specific, minute-by-minute increments. But the study, published this month in the latest issue of the peer-reviewed Cell Press journal Current Biology, showed that our brains don’t work that way.

By analyzing changes in brain activity patterns, the research team found that we perceive the passage of time based on the number of experiences we have — not some kind of internal clock. What’s more, increasing speed or output during an activity appears to affect how our brains perceive time.

“We tell time in our own experience by things we do, things that happen to us,” said James Hyman, a UNLV associate professor of psychology and the study’s senior author. “When we’re still and we’re bored, time goes very slowly because we’re not doing anything or nothing is happening. On the contrary, when a lot of events happen, each one of those activities is advancing our brains forward. And if this is how our brains objectively tell time, then the more that we do and the more that happens to us, the faster time goes.”

Methodology and Findings

The findings are based on analysis of activity in the anterior cingulate cortex (ACC), a portion of the brain important for monitoring activity and tracking experiences. To study this, the researchers tasked rodents with using their noses to respond to a prompt 200 times.

Scientists already knew that brain patterns are similar, but slightly different, each time you do a repetitive motion, so they set out to answer: Is it possible to detect whether these slight differences in brain patterns correspond with doing the first versus the 200th motion in a series? And does the amount of time it takes to complete a series of motions affect brain wave activity?

By comparing pattern changes throughout the course of the task, researchers observed that there are indeed detectable changes in brain activity that occur as one moves from the beginning to middle to end of carrying out a task. And regardless of how slowly or quickly the animals moved, the brain patterns followed the same path. The patterns were consistent when researchers applied a machine learning-based mathematical model to predict the flow of brain activity, bolstering evidence that it’s experiences — not time, or a prescribed number of minutes, as you would measure it on a clock — that produce changes in our neurons’ activity patterns.
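The article doesn’t specify the model the team applied; the sketch below only conveys the general decoding idea on synthetic data, using an off-the-shelf regressor to predict each trial’s position in the sequence from population firing rates (array shapes and values are invented).

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_predict

# Hypothetical data: firing rates of ACC neurons on each of 200 trials.
# rates[i, j] = activity of neuron j during trial i (values are made up).
rng = np.random.default_rng(0)
n_trials, n_neurons = 200, 50
progress = np.arange(n_trials)                      # position in the trial sequence
drift = np.outer(progress / n_trials, rng.normal(size=n_neurons))
rates = drift + rng.normal(scale=0.5, size=(n_trials, n_neurons))

# Decode where in the task each trial falls from its population activity.
# If activity patterns track experience count, decoded positions follow trial
# order regardless of how much clock time the trials took.
model = Ridge(alpha=1.0)
predicted_progress = cross_val_predict(model, rates, progress, cv=5)
corr = np.corrcoef(predicted_progress, progress)[0, 1]
print(f"Correlation between decoded and actual trial position: {corr:.2f}")
```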

Hyman drove home the crux of the findings by sharing an anecdote of two factory workers tasked with making 100 widgets during their shift, with one worker completing the task in 30 minutes and the other in 90 minutes.

“The length of time it took to complete the task didn’t impact the brain patterns. The brain is not a clock; it acts like a counter,” Hyman explained. “Our brains register a vibe, a feeling about time. …And what that means for our workers making widgets is that you can tell the difference between making widget No. 85 and widget No. 60, but not necessarily between No. 85 and No. 88.”

But exactly “how” does the brain count? Researchers discovered that as the brain progresses through a task involving a series of motions, various small groups of firing cells begin to collaborate — essentially passing off the task to a different group of neurons every few repetitions, similar to runners passing the baton in a relay race.

“So, the cells are working together and over time randomly align to get the job done: one cell will take a few tasks and then another takes a few tasks,” Hyman said. “The cells are tracking motions and, thus, chunks of activities and time over the course of the task.”

And the study’s findings about our brains’ perception of time apply to activity-based experiences other than physical motions, too.

“This is the part of the brain we use for tracking something like a conversation through dinner,” Hyman said. “Think of the flow of conversation and you can recall things earlier and later in the dinner. But to pick apart one sentence from the next in your memory, it’s impossible. But you know you talked about one topic at the start, another topic during dessert, and another at the end.”

By observing the rodents that worked quickly, scientists also concluded that keeping up a good pace helps influence time perception: “The more we do, the faster time moves. They say that time flies when you’re having fun. As opposed to having fun, maybe it should be ‘time flies when you’re doing a lot’.”

Takeaways

While there’s already a wealth of information on brain processes over very short time scales of less than a second, Hyman said that the UNLV study is groundbreaking in its examination of brain patterns and perception of time over a span of just a few minutes to hours, “which is how we live much of our life: one hour at a time.”

“This is among the first studies looking at behavioral time scales in this particular part of the brain called the ACC, which we know is so important for our behavior and our emotions,” Hyman said.

The ACC is implicated in most psychiatric and neurodegenerative disorders, and is a concentration area for mood disorders, PTSD, addiction, and anxiety. ACC function is also central to various dementias including Alzheimer’s disease, which is characterized by distortions in time. The ACC has long been linked to helping humans with sequencing events or tasks such as following recipes, and the research team speculates that their findings about time perception might fall within this realm.

While the findings are a breakthrough, more research is needed. Still, Hyman said, the preliminary findings offer some potentially helpful insights about time perception and its likely connection to memory processes in everyday life. For example, researchers speculate that it could help with navigating things like school assignments or even breakups.

“If we want to remember something, we may want to slow down by studying in short bouts and take time before engaging in the next activity. Give yourself quiet times to not move,” Hyman said. “Conversely, if you want to move on from something quickly, get involved in an activity right away.”

Hyman said there’s also a huge relationship between the ACC, emotion, and cognition. Thinking of the brain as a physical entity that one can take ownership over might help us control our subjective experiences.

“When things move faster, we tend to think it’s more fun — or sometimes overwhelming. But we don’t need to think of it as being a purely psychological experience, as fun or overwhelming; rather, if you view it as a physical process, it can be helpful,” he said. “If it’s overwhelming, slow down or if you’re bored, add activities. People already do this, but it’s empowering to know it’s a way to work your own mental health, since our brains are working like this already.”


