
TOP SCIENCE

Climate change likely drove the extinction of North America’s largest animals


A new study published in Nature Communications suggests that the extinction of North America’s largest mammals was not driven by overhunting by rapidly expanding human populations following their entrance into the Americas. Instead, the findings, based on a new statistical modelling approach, suggest that populations of large mammals fluctuated in response to climate change, with drastic decreases of temperatures around 13,000 years ago initiating the decline and extinction of these massive creatures. Still, humans may have been involved in more complex and indirect ways than simple models of overhunting suggest.

Before around 10,000 years ago, North America was home to many large and exotic creatures, such as mammoths, gigantic ground-dwelling sloths, larger-than-life beavers, and huge armadillo-like creatures known as glyptodons. But by around 10,000 years ago, most of North America’s animals weighing over 44 kg, also known as megafauna, had disappeared. Researchers from the Max Planck Extreme Events Research Group in Jena, Germany, wanted to find out what led to these extinctions. The topic has been intensely debated for decades, with most researchers arguing that human overhunting, climate change, or some combination of the two was responsible. With a new statistical approach, the researchers found strong evidence that climate change was the main driver of extinction.

Overhunting vs. climate change

Since the 1960s, it has been hypothesized that, as human populations grew and expanded across the continents, the arrival of specialized “big-game” hunters in the Americas some 14,000 years ago rapidly drove many giant mammals to extinction. The large animals did not possess the appropriate anti-predator behaviors to deal with a novel, highly social, tool-wielding predator, which made them particularly easy to hunt. According to proponents of this “overkill hypothesis,” humans took full advantage of the easy-to-hunt prey, devastating the animal populations and carelessly driving the giant creatures to extinction.

Not everyone agrees with this idea, however. Many scientists have argued that there is too little archaeological evidence to support the idea that megafauna hunting was persistent or widespread enough to cause extinctions. Instead, significant climatic and ecological changes may have been to blame.

Around the time of the extinctions (between 15,000 and 12,000 years ago), there were two major climatic changes. The first was a period of abrupt warming that began around 14,700 years ago, and the second was a cold snap around 12,900 years ago during which the Northern Hemisphere returned to near-glacial conditions. One or both of these important temperature swings, and their ecological ramifications, have been implicated in the megafauna extinctions.


“A common approach has been to try to determine the timing of megafauna extinctions and to see how they align with human arrival in the Americas or some climatic event,” says Mathew Stewart, co-lead author of the study. “However, extinction is a process — meaning that it unfolds over some span of time — and so to understand what caused the demise of North America’s megafauna, it’s crucial that we understand how their populations fluctuated in the lead up to extinction. Without those long-term patterns, all we can see are rough coincidences.”

‘Dates as data’

To test these conflicting hypotheses, the authors used a new statistical approach developed by W. Christopher Carleton, the study’s other co-lead author, and published last year in the Journal of Quaternary Science. Estimating population sizes of prehistoric hunter-gatherer groups and long-extinct animals cannot be done by counting heads or hooves. Instead, archaeologists and palaeontologists use the radiocarbon record as a proxy for past population sizes: the more animals and humans present in a landscape, the more datable carbon they leave behind after they are gone, which is then reflected in the archaeological and fossil records. Unlike established approaches, the new method better accounts for uncertainty in fossil dates.

The major problem with the previous approach is that it blends the uncertainty associated with radiocarbon dates with the process scientists are trying to identify.

“As a result, you can end up seeing trends in the data that don’t really exist, making this method rather unsuitable for capturing changes in past population levels. Using simulation studies where we know what the real patterns in the data are, we have been able to show that the new method does not have the same problems. As a result, our method is able to do a much better job capturing through-time changes in population levels using the radiocarbon record,” explains Carleton.
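As a rough illustration of why this matters, the toy simulation below is a minimal sketch, not Carleton’s published method, and every number in it is invented. It generates a known population curve, samples dated finds in proportion to it, blurs each date with radiocarbon-style measurement error, and then builds a conventional summed-density proxy, which comes out as a smeared and noisy version of the true curve.

```python
# Illustrative sketch only, not the method from the Journal of Quaternary
# Science paper: a toy "dates as data" simulation showing how dating
# uncertainty gets blended into the apparent population signal.

import numpy as np

rng = np.random.default_rng(42)

years = np.arange(15_000, 10_000, -10)                    # calendar years BP, 10-yr grid
true_pop = 1.0 + 0.5 * np.sin((15_000 - years) / 800.0)   # invented "true" population curve

# Draw 300 dated finds; more finds where the population is larger.
p = true_pop / true_pop.sum()
find_ages = rng.choice(years, size=300, p=p)

# Each find gets a dating error (1-sigma of 150 years, an assumed figure).
sigma = 150.0
measured_ages = find_ages + rng.normal(0.0, sigma, size=find_ages.size)

# Conventional proxy: sum one normal density per dated find over the time grid.
grid = years[:, None]                                     # shape (n_grid, 1)
summed = np.exp(-0.5 * ((grid - measured_ages) / sigma) ** 2).sum(axis=1)
summed /= summed.max()

# The summed curve is a smeared, noisy version of the input: apparent wiggles
# can reflect dating uncertainty rather than real population change.
print("correlation with the true curve:", np.corrcoef(summed, true_pop)[0, 1])
```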

North American megafauna extinctions

The authors applied this new approach to the question of the Late Quaternary North American megafauna extinctions. In contrast to previous studies, the new findings show that megafauna populations fluctuated in response to climate change.

“Megafauna populations appear to have been increasing as North America began to warm around 14,700 years ago,” states Stewart. “But we then see a shift in this trend around 12,900 years ago as North America began to drastically cool, and shortly after this we begin to see the extinctions of megafauna occur.”

And while these findings suggest that the return to near glacial conditions around 12,900 years ago was the proximate cause for the extinctions, the story is likely to be more complicated than this.

“We must consider the ecological changes associated with these climate changes at both a continental and regional scale if we want to have a proper understanding of what drove these extinctions,” explains group leader Huw Groucutt, senior author of the study. “Humans also aren’t completely off the hook, as it remains possible that they played a more nuanced role in the megafauna extinctions than simple overkill models suggest.”

Many researchers have argued that it is an impossible coincidence that megafauna extinctions around the world often happened around the time of human arrival. However, it is important to scientifically demonstrate that such a relationship existed, and even if it did, the causes may have been much more indirect (such as through habitat modification) than a killing frenzy as humans arrived in a region.

The authors end their article with a call to arms, urging researchers to develop bigger, more reliable records and robust methods for interpreting them. Only then will we develop a comprehensive understanding of the Late Quaternary megafauna extinction event.


TOP SCIENCE

Charge your laptop in a minute or your EV in 10? Supercapacitors can help



Imagine if your dead laptop or phone could charge in a minute or if an electric car could be fully powered in 10 minutes.

While not possible yet, new research by a team of CU Boulder scientists could potentially lead to such advances.

In a study published today in the Proceedings of the National Academy of Sciences, researchers in Ankur Gupta’s lab describe how tiny charged particles, called ions, move within a complex network of minuscule pores. The breakthrough could lead to the development of more efficient energy storage devices, such as supercapacitors, said Gupta, an assistant professor of chemical and biological engineering.

“Given the critical role of energy in the future of the planet, I felt inspired to apply my chemical engineering knowledge to advancing energy storage devices,” Gupta said. “It felt like the topic was somewhat underexplored and as such, the perfect opportunity.”

Gupta explained that several chemical engineering techniques are used to study flow in porous materials such as oil reservoirs and water filtration, but they have not been fully utilized in some energy storage systems.

The discovery is significant not only for storing energy in vehicles and electronic devices but also for power grids, where fluctuating energy demand requires efficient storage to avoid waste during periods of low demand and to ensure rapid supply during high demand.

Supercapacitors, energy storage devices that rely on ion accumulation in their pores, have rapid charging times and longer life spans compared to batteries.

“The primary appeal of supercapacitors lies in their speed,” Gupta said. “So how can we make their charging and release of energy faster? By the more efficient movement of ions.”

Their findings modify Kirchhoff’s law, which has governed current flow in electrical circuits since 1845 and is a staple in high school students’ science classes. Unlike electrons, ions move due to both electric fields and diffusion, and the researchers determined that their movements at pore intersections are different from what was described in Kirchhoff’s law.

Prior to the study, ion movements were described in the literature only for a single straight pore. Through this research, ion movement in a complex network of thousands of interconnected pores can be simulated and predicted in a few minutes.
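The picture the researchers describe maps onto the classical drift-diffusion (Nernst-Planck) view of ionic transport: the flux along a pore has a migration term driven by the electric field and a diffusion term driven by concentration gradients, and any Kirchhoff-style balance at a pore junction has to account for both. The sketch below is only a textbook illustration of that balance with made-up numbers, not the model from the PNAS paper.

```python
# Textbook illustration only, not the model from the PNAS paper: ionic flux in
# a pore has both a diffusion term and a migration (drift) term, so a
# Kirchhoff-style balance at a pore junction must include both. All geometry
# and gradient values below are invented.

import numpy as np

F = 96485.0   # Faraday constant, C/mol
R = 8.314     # gas constant, J/(mol K)
T = 298.0     # temperature, K

def nernst_planck_flux(D, z, c, dc_dx, dphi_dx):
    """Molar flux (mol m^-2 s^-1) of an ion along a pore:
    diffusion: -D * dc/dx
    migration: -(D*z*F/(R*T)) * c * dphi/dx  (drift in the electric field)."""
    return -D * dc_dx - (D * z * F / (R * T)) * c * dphi_dx

# Three pores meeting at one junction (hypothetical cross sections and gradients).
areas   = np.array([1.0e-18, 0.6e-18, 0.8e-18])   # m^2
dc_dx   = np.array([ 2.0e6, -1.5e6, -0.4e6])      # mol/m^4
dphi_dx = np.array([-1.0e3,  0.7e3,  0.5e3])      # V/m
D, z, c = 1.0e-9, +1, 100.0                       # diffusivity (m^2/s), valence, conc. (mol/m^3)

fluxes = nernst_planck_flux(D, z, c, dc_dx, dphi_dx)

# Balancing only the migration (current-like) part, as one would for electrons,
# would miss the diffusive contribution to the flow of ions into the junction.
print("net molar flow into junction:", np.sum(areas * fluxes), "mol/s")
```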

“That’s the leap of the work,” Gupta said. “We found the missing link.”




TOP SCIENCE

AI headphones let wearer listen to a single person in a crowd, by looking at them just once



Noise-canceling headphones have gotten very good at creating an auditory blank slate. But allowing certain sounds from a wearer’s environment to pass through that erasure still challenges researchers. The latest edition of Apple’s AirPods Pro, for instance, automatically adjusts sound levels for wearers, sensing when they’re in conversation, but the user has little control over whom to listen to or when this happens.

A University of Washington team has developed an artificial intelligence system that lets a user wearing headphones look at a person speaking for three to five seconds to “enroll” them. The system, called “Target Speech Hearing,” then cancels all other sounds in the environment and plays just the enrolled speaker’s voice in real time even as the listener moves around in noisy places and no longer faces the speaker.

The team presented its findings May 14 in Honolulu at the ACM CHI Conference on Human Factors in Computing Systems. The code for the proof-of-concept device is available for others to build on. The system is not commercially available.

“We tend to think of AI now as web-based chatbots that answer questions,” said senior author Shyam Gollakota, a UW professor in the Paul G. Allen School of Computer Science & Engineering. “But in this project, we develop AI to modify the auditory perception of anyone wearing headphones, given their preferences. With our devices you can now hear a single speaker clearly even if you are in a noisy environment with lots of other people talking.”

To use the system, a person wearing off-the-shelf headphones fitted with microphones taps a button while directing their head at someone talking. The sound waves from that speaker’s voice then should reach the microphones on both sides of the headset simultaneously; there’s a 16-degree margin of error. The headphones send that signal to an on-board embedded computer, where the team’s machine learning software learns the desired speaker’s vocal patterns. The system latches onto that speaker’s voice and continues to play it back to the listener, even as the pair moves around. The system’s ability to focus on the enrolled voice improves as the speaker keeps talking, giving the system more training data.
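As a loose illustration of the enrollment check only, the sketch below is not the UW team’s Target Speech Hearing code; the microphone spacing, sample rate, thresholds and helper names are all assumptions. It uses the cross-correlation between the left and right headset channels to test whether the dominant voice arrives at both microphones nearly simultaneously, i.e. whether the wearer is facing the speaker within roughly the stated 16-degree tolerance.

```python
# Illustrative sketch only, not the published Target Speech Hearing system:
# a crude "is the wearer facing the speaker?" test based on the delay between
# the two headset microphones. Constants below are assumptions.

import numpy as np

SAMPLE_RATE = 16_000      # Hz, assumed
MIC_SPACING = 0.18        # m, assumed distance between ear-mounted microphones
SPEED_OF_SOUND = 343.0    # m/s

def inter_mic_delay(left, right):
    """Estimate the arrival-time difference (seconds) between the two channels
    from the peak of their cross-correlation; 0 means simultaneous arrival."""
    corr = np.correlate(left, right, mode="full")
    lag = np.argmax(corr) - (len(right) - 1)      # lag in samples
    return lag / SAMPLE_RATE

def facing_speaker(left, right, tolerance_deg=16.0):
    """True if the dominant source lies within +/- tolerance_deg of straight
    ahead, i.e. the measured delay is small relative to the maximum possible."""
    max_delay = MIC_SPACING / SPEED_OF_SOUND
    allowed = max_delay * np.sin(np.radians(tolerance_deg))
    return abs(inter_mic_delay(left, right)) <= allowed

# Toy usage with synthetic broadband "speech": a source straight ahead arrives
# with near-zero delay and would be accepted for enrollment.
rng = np.random.default_rng(0)
voice = rng.standard_normal(SAMPLE_RATE // 10)        # 100 ms of signal
print(facing_speaker(voice, voice))                   # True: zero delay
print(facing_speaker(voice, np.roll(voice, 40)))      # False: ~2.5 ms delay, off-axis
```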

The team tested its system on 21 subjects, who rated the clarity of the enrolled speaker’s voice nearly twice as high as the unfiltered audio on average.

This work builds on the team’s previous “semantic hearing” research, which allowed users to select specific sound classes — such as birds or voices — that they wanted to hear and canceled other sounds in the environment.

Currently the TSH system can enroll only one speaker at a time, and it’s only able to enroll a speaker when there is not another loud voice coming from the same direction as the target speaker’s voice. If a user isn’t happy with the sound quality, they can run another enrollment on the speaker to improve the clarity.

The team is working to expand the system to earbuds and hearing aids in the future.

Additional co-authors on the paper were Bandhav Veluri, Malek Itani and Tuochao Chen, UW doctoral students in the Allen School, and Takuya Yoshioka, director of research at AssemblyAI. This research was funded by a Moore Inventor Fellow award, a Thomas J. Cabel Endowed Professorship and a UW CoMotion Innovation Gap Fund.




TOP SCIENCE

Theory and experiment combine to shine a new light on proton spin



Nuclear physicists have long been working to reveal how the proton gets its spin. Now, a new method that combines experimental data with state-of-the-art calculations has revealed a more detailed picture of spin contributions from the very glue that holds protons together. It also paves the way toward imaging the proton’s 3D structure.

The work was led by Joseph Karpie, a postdoctoral associate in the Center for Theoretical and Computational Physics (Theory Center) at the U.S. Department of Energy’s Thomas Jefferson National Accelerator Facility.

He said that this decades-old mystery began with measurements of the sources of the proton’s spin in 1987. Physicists originally thought that the proton’s building blocks, its quarks, would be the main source of the proton’s spin. But that’s not what they found. It turned out that the proton’s quarks only provide about 30% of the proton’s total measured spin. The rest comes from two other sources that have so far proven more difficult to measure.

One is the mysterious but powerful strong force. The strong force is one of the four fundamental forces in the universe. It’s what “glues” quarks together to make up other subatomic particles, such as protons or neutrons. The carriers of this strong force are particles called gluons, which are thought to contribute to the proton’s spin. The last bit of spin is thought to come from the orbital motion of the proton’s quarks and gluons.
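In the decomposition commonly used in this field (the Jaffe-Manohar spin sum rule), these pieces are written out explicitly; the roughly 30% quark contribution quoted above corresponds to the quark-spin term:

```latex
% Proton spin sum rule (Jaffe-Manohar decomposition), in units of \hbar:
%   \Delta\Sigma : net quark spin (measured to account for roughly 0.3)
%   \Delta G     : net gluon spin
%   L_q, L_g     : orbital angular momentum of quarks and gluons
\frac{1}{2} \;=\; \frac{1}{2}\,\Delta\Sigma \;+\; \Delta G \;+\; L_q \;+\; L_g
```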

“This paper is sort of a bringing together of two groups in the Theory Center who have been working toward trying to understand the same bit of physics, which is how do the gluons that are inside of it contribute to how much the proton is spinning around,” he said.

He said this study was inspired by a puzzling result that came from initial experimental measurements of the gluons’ spin. The measurements were made at the Relativistic Heavy Ion Collider, a DOE Office of Science user facility based at Brookhaven National Laboratory in New York. The data at first seemed to indicate that the gluons may be contributing to the proton’s spin. They showed a positive result.

But as the data analysis was improved, a further possibility appeared.

“When they improved their analysis, they started to get two sets of results that seemed quite different, one was positive and the other was negative,” Karpie explained.

While the earlier positive result indicated that the gluons’ spins are aligned with that of the proton, the improved analysis allowed for the possibility that the gluons’ spins have an overall negative contribution. In that case, more of the proton spin would come from the movement of the quarks and gluons, or from the spin of the quarks themselves.

This puzzling result was published by the Jefferson Lab Angular Momentum (JAM) collaboration.

Meanwhile, the HadStruc collaboration had been addressing the same measurements in a different way. They were using supercomputers to calculate the underlying theory that describes the interactions among quarks and gluons in the proton, Quantum Chromodynamics (QCD).

To make this intense calculation tractable on supercomputers, theorists simplify some aspects of the theory. This simplified version, formulated for computers, is called lattice QCD.

Karpie led the work to bring together the data from both groups. He started with the combined data from experiments taken in facilities around the world. He then added the results from the lattice QCD calculation into his analysis.

“This is putting everything together that we know about quark and gluon spin and how gluons contribute to the spin of the proton in one dimension,” said David Richards, a Jefferson Lab senior staff scientist who worked on the study.

“When we did, we saw that the negative things didn’t go away, but they changed dramatically. That meant that there’s something funny going on with those,” Karpie said.

Karpie is lead author on the study that was recently published in Physical Review D. He said the main takeaway is that combining the data from both approaches provided a more informed result.

“We’re combining both of our datasets together and getting a better result out than either of us could get independently. It’s really showing that we learn a lot more by combining lattice QCD and experiment together in one problem analysis,” said Karpie. “This is the first step, and we hope to keep doing this with more and more observables as well as we make more lattice data.”

The next step is to further improve the datasets. As more powerful experiments provide more detailed information on the proton, these data begin painting a picture that goes beyond one dimension. And as theorists learn how to improve their calculations on ever-more powerful supercomputers, their solutions also become more precise and inclusive.

The goal is to eventually produce a three-dimensional understanding of the proton’s structure.

“So, we learn our tools do work on the simpler one-dimension scenario. By testing our methods now, we hopefully will know what we need to do when we want to move up to do 3D structure,” Richards said. “This work will contribute to this 3D image of what a proton should look like. So it’s all about building our way up to the heart of the problem by doing this easier stuff now.”


