TOP SCIENCE

Climate change likely drove the extinction of North America’s largest animals

A new study published in Nature Communications suggests that the extinction of North America’s largest mammals was not driven by overhunting by rapidly expanding human populations following their entrance into the Americas. Instead, the findings, based on a new statistical modelling approach, suggest that populations of large mammals fluctuated in response to climate change, with a drastic drop in temperature around 13,000 years ago initiating the decline and extinction of these massive creatures. Still, humans may have been involved in more complex and indirect ways than simple models of overhunting suggest.

Before around 10,000 years ago, North America was home to many large and exotic creatures, such as mammoths, gigantic ground-dwelling sloths, larger-than-life beavers, and huge armadillo-like creatures known as glyptodons. But by around 10,000 years ago, most of North America’s animals weighing over 44 kg, also known as megafauna, had disappeared. Researchers from the Max Planck Extreme Events Research Group in Jena, Germany, wanted to find out what led to these extinctions. The topic has been intensely debated for decades, with most researchers arguing that human overhunting, climate change, or some combination of the two was responsible. With a new statistical approach, the researchers found strong evidence that climate change was the main driver of extinction.

Overhunting vs. climate change

Since the 1960s, it has been hypothesized that, as human populations grew and expanded across the continents, the arrival of specialized “big-game” hunters in the Americas some 14,000 years ago rapidly drove many giant mammals to extinction. The large animals did not possess the appropriate anti-predator behaviors to deal with a novel, highly social, tool-wielding predator, which made them particularly easy to hunt. According to proponents of this “overkill hypothesis,” humans took full advantage of the easy-to-hunt prey, devastating the animal populations and carelessly driving the giant creatures to extinction.

Not everyone agrees with this idea, however. Many scientists have argued that there is too little archaeological evidence to support the idea that megafauna hunting was persistent or widespread enough to cause extinctions. Instead, significant climatic and ecological changes may have been to blame.

Around the time of the extinctions (between 15,000 and 12,000 years ago), there were two major climatic changes. The first was a period of abrupt warming that began around 14,700 years ago, and the second was a cold snap around 12,900 years ago during which the Northern Hemisphere returned to near-glacial conditions. One or both of these important temperature swings, and their ecological ramifications, have been implicated in the megafauna extinctions.


“A common approach has been to try to determine the timing of megafauna extinctions and to see how they align with human arrival in the Americas or some climatic event,” says Mathew Stewart, co-lead author of the study. “However, extinction is a process — meaning that it unfolds over some span of time — and so to understand what caused the demise of North America’s megafauna, it’s crucial that we understand how their populations fluctuated in the lead up to extinction. Without those long-term patterns, all we can see are rough coincidences.”

‘Dates as data’

To test these conflicting hypotheses, the authors used a new statistical approach developed by W. Christopher Carleton, the study’s other co-lead author, and published last year in the Journal of Quaternary Science. Estimating population sizes of prehistoric hunter-gatherer groups and long-extinct animals cannot be done by counting heads or hooves. Instead, archaeologists and palaeontologists use the radiocarbon record as a proxy for past population sizes. The rationale is that the more animals and humans present in a landscape, the more datable carbon is left behind after they are gone, which is then reflected in the archaeological and fossil records. Unlike established approaches, the new method better accounts for uncertainty in fossil dates.

The major problem with the previous approach is that it blends the uncertainty associated with radiocarbon dates with the process scientists are trying to identify.

“As a result, you can end up seeing trends in the data that don’t really exist, making this method rather unsuitable for capturing changes in past population levels. Using simulation studies where we know what the real patterns in the data are, we have been able to show that the new method does not have the same problems. As a result, our method is able to do a much better job capturing through-time changes in population levels using the radiocarbon record,” explains Carleton.
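To make the “dates as data” logic and the pitfall Carleton describes concrete, here is a minimal sketch (an illustration only, not the event-count model used in the study): it simulates radiocarbon dates from a population that never changes, builds the kind of naive summed-density curve older approaches rely on, and shows that dating uncertainty alone makes the curve wobble, producing apparent “trends” that are not really there. All quantities below are hypothetical.

```python
import numpy as np

# Illustration only -- not the study's event-count model. Each radiocarbon-
# dated specimen records an organism alive at some point in time, so the
# density of dates through time is used as a proxy for relative population
# size. Here the true population is perfectly constant, yet the naive
# summed-density curve still wobbles because of dating uncertainty.

rng = np.random.default_rng(42)

# Hypothetical scenario: a constant population between 15,000 and 12,000
# calendar years before present (BP), sampled by 150 dated specimens.
true_ages = rng.uniform(12_000, 15_000, size=150)

# Toy measurement error of +/- 200 years (1 sigma); real radiocarbon
# calibration is far more complicated than a single Gaussian.
sigma = 200.0
measured_ages = true_ages + rng.normal(0.0, sigma, size=true_ages.size)

# Naive proxy: sum a Gaussian density for every dated specimen on a grid.
grid = np.linspace(15_000, 12_000, 301)
density = np.zeros_like(grid)
for age in measured_ages:
    density += np.exp(-0.5 * ((grid - age) / sigma) ** 2)
density /= density.mean()

# Look away from the grid edges (edge truncation adds its own artefacts)
# and report how much the curve varies despite a flat true population.
interior = density[30:-30]
print("flat population, yet the naive proxy ranges from "
      f"{interior.min():.2f} to {interior.max():.2f}")
```

The new method described above is designed to keep that dating noise separate from genuine through-time changes in population levels, rather than reading it as demographic signal.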

North American megafauna extinctions

The authors applied this new approach to the question of the Late Quaternary North American megafauna extinctions. In contrast to previous studies, the new findings show that megafauna populations fluctuated in response to climate change.

“Megafauna populations appear to have been increasing as North America began to warm around 14,700 years ago,” states Stewart. “But we then see a shift in this trend around 12,900 years ago as North America began to drastically cool, and shortly after this we begin to see the extinctions of megafauna occur.”

And while these findings suggest that the return to near glacial conditions around 12,900 years ago was the proximate cause for the extinctions, the story is likely to be more complicated than this.

“We must consider the ecological changes associated with these climate changes at both a continental and regional scale if we want to have a proper understanding of what drove these extinctions,” explains group leader Huw Groucutt, senior author of the study. “Humans also aren’t completely off the hook, as it remains possible that they played a more nuanced role in the megafauna extinctions than simple overkill models suggest.”

Many researchers have argued that it is an impossible coincidence that megafauna extinctions around the world often happened around the time of human arrival. However, it is important to scientifically demonstrate that such a relationship existed, and even if it did, the causes may have been much more indirect (such as habitat modification) than a killing frenzy as humans arrived in a region.

The authors end their article with a call to arms, urging researchers to develop bigger, more reliable records and robust methods for interpreting them. Only then will we develop a comprehensive understanding of the Late Quaternary megafauna extinction event.


TOP SCIENCE

Early dark energy could resolve cosmology’s two biggest puzzles

A new study by MIT physicists proposes that a mysterious force known as early dark energy could solve two of the biggest puzzles in cosmology and fill in some major gaps in our understanding of how the early universe evolved.

One puzzle in question is the “Hubble tension,” which refers to a mismatch in measurements of how fast the universe is expanding. The other involves observations of numerous early, bright galaxies that existed at a time when the early universe should have been much less populated.

Now, the MIT team has found that both puzzles could be resolved if the early universe had one extra, fleeting ingredient: early dark energy. Dark energy is an unknown form of energy that physicists suspect is driving the expansion of the universe today. Early dark energy is a similar, hypothetical phenomenon that may have made only a brief appearance, influencing the expansion of the universe in its first moments before disappearing entirely.

Some physicists have suspected that early dark energy could be the key to solving the Hubble tension, as the mysterious force could accelerate the early expansion of the universe by an amount that would resolve the measurement mismatch.

The MIT researchers have now found that early dark energy could also explain the baffling number of bright galaxies that astronomers have observed in the early universe. In their new study, reported in the Monthly Notices of the Royal Astronomical Society, the team modeled the formation of galaxies in the universe’s first few hundred million years. When they incorporated a dark energy component only in that earliest sliver of time, they found the number of galaxies that arose from the primordial environment bloomed to fit astronomers’ observations.

“You have these two looming open-ended puzzles,” says study co-author Rohan Naidu, a postdoc in MIT’s Kavli Institute for Astrophysics and Space Research. “We find that in fact, early dark energy is a very elegant and sparse solution to two of the most pressing problems in cosmology.”

The study’s co-authors include lead author and Kavli postdoc Xuejian (Jacob) Shen, and MIT professor of physics Mark Vogelsberger, along with Michael Boylan-Kolchin at the University of Texas at Austin, and Sandro Tacchella at the University of Cambridge.

Big city lights

Based on standard cosmological and galaxy formation models, the universe should have taken its time spinning up the first galaxies. It would have taken billions of years for primordial gas to coalesce into galaxies as large and bright as the Milky Way.

But in 2023, NASA’s James Webb Space Telescope (JWST) made a startling observation. With an ability to peer farther back in time than any observatory to date, the telescope uncovered a surprising number of bright galaxies as large as the modern Milky Way within the first 500 million years, when the universe was just 3 percent of its current age.

“The bright galaxies that JWST saw would be like seeing a clustering of lights around big cities, whereas theory predicts something like the light around more rural settings like Yellowstone National Park,” Shen says. “And we don’t expect that clustering of light so early on.”

For physicists, the observations imply that there is either something fundamentally wrong with the physics underlying the models or a missing ingredient in the early universe that scientists have not accounted for. The MIT team explored the possibility of the latter, and whether the missing ingredient might be early dark energy.

Physicists have proposed that early dark energy is a sort of antigravitational force that is turned on only at very early times. This force would counteract gravity’s inward pull and accelerate the early expansion of the universe, in a way that would resolve the mismatch in measurements. Early dark energy, therefore, is considered the most likely solution to the Hubble tension.
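As a rough numerical illustration of that idea (not the model used in the study), the sketch below adds a short-lived, dark-energy-like term to a standard expansion-rate calculation. The parametrization and every number in it are placeholders, chosen only to show how such a component briefly boosts the early expansion rate and then fades.

```python
import numpy as np

# Toy illustration (not the study's actual model): how a short-lived "early
# dark energy" (EDE) component can boost the expansion rate around the time
# of recombination. All parameter values below are illustrative placeholders.

H0 = 67.4            # Hubble constant, km/s/Mpc (illustrative Planck-like value)
omega_m = 0.315      # matter density parameter
omega_r = 9.2e-5     # radiation density parameter
omega_l = 1.0 - omega_m - omega_r  # flat universe: dark energy today

def e2_lcdm(z):
    """Dimensionless expansion rate squared, (H(z)/H0)^2, in plain LambdaCDM."""
    return omega_r * (1 + z) ** 4 + omega_m * (1 + z) ** 3 + omega_l

# Toy EDE fluid: acts like a cosmological constant at redshifts above a
# critical value z_c, then dilutes faster than radiation (here ~ (1+z)^6)
# and quickly becomes negligible. f_peak sets its peak share of the budget.
z_c = 3500.0
f_peak = 0.1

def omega_ede(z):
    amplitude = 2.0 * f_peak * e2_lcdm(z_c)   # fraction peaks near f_peak at z_c
    return amplitude / (1.0 + ((1.0 + z_c) / (1.0 + z)) ** 6)

def hubble(z, with_ede=True):
    """Expansion rate H(z) in km/s/Mpc, with or without the toy EDE term."""
    e2 = e2_lcdm(z) + (omega_ede(z) if with_ede else 0.0)
    return H0 * np.sqrt(e2)

for z in (0, 1100, 3500, 10_000):
    print(f"z = {z:>6}: H = {hubble(z, with_ede=False):14.1f} km/s/Mpc (LCDM)   "
          f"{hubble(z, with_ede=True):14.1f} km/s/Mpc (with toy EDE)")
```

Running the sketch shows the expansion rate nudged upward around z ~ 3500 while today's value is essentially untouched, which is the qualitative behaviour that makes early dark energy attractive for the Hubble tension.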

Galaxy skeleton

The MIT team explored whether early dark energy could also be the key to explaining the unexpected population of large, bright galaxies detected by JWST. In their new study, the physicists considered how early dark energy might affect the early structure of the universe that gave rise to the first galaxies. They focused on the formation of dark matter halos — regions of space where gravity happens to be stronger, and where matter begins to accumulate.

“We believe that dark matter halos are the invisible skeleton of the universe,” Shen explains. “Dark matter structures form first, and then galaxies form within these structures. So, we expect the number of bright galaxies should be proportional to the number of big dark matter halos.”

The team developed an empirical framework for early galaxy formation, which predicts the number, luminosity, and size of galaxies that should form in the early universe, given some measures of “cosmological parameters.” Cosmological parameters are the basic ingredients, or mathematical terms, that describe the evolution of the universe.

Physicists have determined that there are at least six main cosmological parameters, one of which is the Hubble constant — a term that describes the universe’s rate of expansion. Other parameters describe density fluctuations in the primordial soup, immediately after the Big Bang, from which dark matter halos eventually form.

The MIT team reasoned that if early dark energy affects the universe’s early expansion rate, in a way that resolves the Hubble tension, then it could affect the balance of the other cosmological parameters, in a way that might increase the number of bright galaxies that appear at early times. To test their theory, they incorporated a model of early dark energy (the same one that happens to resolve the Hubble tension) into an empirical galaxy formation framework to see how the earliest dark matter structures evolve and give rise to the first galaxies.

“What we show is, the skeletal structure of the early universe is altered in a subtle way where the amplitude of fluctuations goes up, and you get bigger halos, and brighter galaxies that are in place at earlier times, more so than in our more vanilla models,” Naidu says. “It means things were more abundant, and more clustered in the early universe.”
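To see why a subtle boost in the fluctuation amplitude matters so much, consider the steep tail of the halo abundance. The sketch below is a rough Press-Schechter-style estimate with illustrative numbers (it is not the paper's empirical framework): it shows that raising the rms fluctuation on a given mass scale by just 10 percent multiplies the collapsed fraction, and hence the abundance of rare, galaxy-hosting halos, many times over.

```python
import math

# Rough Press-Schechter-style estimate (illustrative only, not the paper's
# framework): the fraction of matter collapsed into halos above a given
# peak height nu = delta_c / sigma(M) is erfc(nu / sqrt(2)). Because this
# tail is exponentially steep, a small increase in the fluctuation amplitude
# sigma(M) greatly increases the abundance of massive, galaxy-hosting halos.

DELTA_C = 1.686  # critical overdensity for spherical collapse

def collapsed_fraction(sigma_m):
    """Press-Schechter collapsed fraction for halos of mass scale M,
    where sigma_m is the rms density fluctuation on that scale."""
    nu = DELTA_C / sigma_m
    return math.erfc(nu / math.sqrt(2.0))

# Hypothetical value for the rms fluctuation on the relevant mass scale at
# the relevant early epoch, and a 10% boost standing in for the EDE shift.
sigma_baseline = 0.25
sigma_boosted = 0.25 * 1.10

f0 = collapsed_fraction(sigma_baseline)
f1 = collapsed_fraction(sigma_boosted)
print(f"collapsed fraction, baseline : {f0:.2e}")
print(f"collapsed fraction, boosted  : {f1:.2e}")
print(f"abundance ratio              : {f1 / f0:.1f}x")
```

In this toy example a 10 percent amplitude boost yields tens of times more rare halos, which is the sense in which a subtle change to the skeletal structure can translate into many more bright early galaxies.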

“A priori, I would not have expected the abundance of JWST’s early bright galaxies to have anything to do with early dark energy, but their observation that EDE pushes cosmological parameters in a direction that boosts the early-galaxy abundance is interesting,” says Marc Kamionkowski, professor of theoretical physics at Johns Hopkins University, who was not involved with the study. “I think more work will need to be done to establish a link between early galaxies and EDE, but regardless of how things turn out, it’s a clever — and hopefully ultimately fruitful — thing to try.”

“We demonstrated the potential of early dark energy as a unified solution to the two major issues faced by cosmology. This might be evidence for its existence if the observational findings of JWST are further consolidated,” Vogelsberger concludes. “In the future, we can incorporate this into large cosmological simulations to see what detailed predictions we get.”

This research was supported, in part, by NASA and the National Science Foundation.




TOP SCIENCE

Plant-derived secondary organic aerosols can act as mediators of plant-plant interactions

A new study published in Science reveals that plant-derived secondary organic aerosols (SOAs) can act as mediators of plant-plant interactions. This research was conducted through the cooperation of chemical ecologists, plant ecophysiologists and atmospheric physicists at the University of Eastern Finland.

It is well known that plants release volatile organic compounds (VOCs) into the atmosphere when damaged by herbivores. These VOCs play a crucial role in plant-plant interactions, whereby undamaged plants may detect warning signals from their damaged neighbours and prepare their defences. “Reactive plant VOCs undergo oxidative chemical reactions, resulting in the formation of secondary organic aerosols (SOAs). We wondered whether the ecological functions mediated by VOCs persist after they are oxidized to form SOAs,” said Dr. Hao Yu, formerly a PhD student at UEF and now at the University of Bern.

The study showed that Scots pine seedlings, when damaged by large pine weevils, release VOCs that activate defences in nearby plants of the same species. Interestingly, the biological activity persisted after VOCs were oxidized to form SOAs. The results indicated that the elemental composition and quantity of SOAs likely determine their biological functions.

“A key novelty of the study is the finding that plants adopt subtly different defence strategies when receiving signals as VOCs or as SOAs, yet they exhibit similar degrees of resistance to herbivore feeding,” said Professor James Blande, head of the Environmental Ecology Research Group. This observation opens up the possibility that plants have sophisticated sensing systems that enable them to tailor their defences to information derived from different types of chemical cue.

“Considering the formation rate of SOAs from their precursor VOCs, their longer lifetime compared to VOCs, and the atmospheric air mass transport, we expect that the ecologically effective distance for interactions mediated by SOAs is longer than that for plant interactions mediated by VOCs,” said Professor Annele Virtanen, head of the Aerosol Physics Research Group. This could be interpreted as plants being able to detect cues representing close versus distant threats from herbivores.

The study is expected to open up a whole new complex research area to environmental ecologists and their collaborators, which could lead to new insights on the chemical cues structuring interactions between plants.




TOP SCIENCE

Folded or cut, this lithium-sulfur battery keeps going

Most rechargeable batteries that power portable devices, such as toys, handheld vacuums and e-bikes, use lithium-ion technology. But these batteries can have short lifetimes and may catch fire when damaged. To address stability and safety issues, researchers reporting in ACS Energy Letters have designed a lithium-sulfur (Li-S) battery that features an improved iron sulfide cathode. One prototype remains highly stable over 300 charge-discharge cycles, and another provides power even after being folded or cut.

Sulfur has been suggested as a material for lithium-ion batteries because of its low cost and potential to hold more energy than lithium-metal oxides and other materials used in traditional ion-based versions. To make Li-S batteries stable at high temperatures, researchers have previously proposed using a carbonate-based electrolyte to separate the two electrodes (an iron sulfide cathode and a lithium metal-containing anode). However, as the sulfide in the cathode dissolves into the electrolyte, it forms an impenetrable precipitate, causing the cell to quickly lose capacity. Liping Wang and colleagues wondered if they could add a layer between the cathode and electrolyte to reduce this corrosion without reducing functionality and rechargeability.

The team coated iron sulfide cathodes in different polymers and found in initial electrochemical performance tests that polyacrylic acid (PAA) performed best, retaining the electrode’s discharge capacity after 300 charge-discharge cycles. Next, the researchers incorporated a PAA-coated iron sulfide cathode into a prototype battery design, which also included a carbonate-based electrolyte, a lithium metal foil as an ion source, and a graphite-based anode. They produced and then tested both pouch cell and coin cell battery prototypes.

After more than 100 charge-discharge cycles, Wang and colleagues observed no substantial capacity decay in the pouch cell. Additional experiments showed that the pouch cell still worked after being folded and cut in half. The coin cell retained 72% of its capacity after 300 charge-discharge cycles. They next applied the polymer coating to cathodes made from other metals, creating lithium-molybdenum and lithium-vanadium batteries. These cells also had stable capacity over 300 charge-discharge cycles. Overall, the results indicate that coated cathodes could produce not only safer Li-S batteries with long lifespans, but also efficient batteries with other metal sulfides, according to Wang’s team.

The authors acknowledge funding from the National Natural Science Foundation of China; the Natural Science Foundation of Sichuan, China; and the Beijing National Laboratory for Condensed Matter Physics.


