Ghostlike dusty galaxy reappears in James Webb Space Telescope image

It first appeared as a glowing blob from ground-based telescopes and then vanished completely in images from the Hubble Space Telescope. Now, the ghostly object has reappeared as a faint, yet distinct galaxy in an image from the James Webb Space Telescope (JWST).

Astronomers with the COSMOS-Web collaboration have identified the object AzTECC71 as a dusty star-forming galaxy seen nearly 1 billion years after the Big Bang: in other words, a galaxy that is busy forming many new stars but is shrouded in a veil of dust that is hard to see through. These galaxies were once thought to be extremely rare in the early universe, but this discovery, along with more than a dozen additional candidates in the first half of the COSMOS-Web data that have yet to be described in the scientific literature, suggests they might be three to 10 times as common as expected.

“This thing is a real monster,” said Jed McKinney, a postdoctoral researcher at The University of Texas at Austin. “Even though it looks like a little blob, it’s actually forming hundreds of new stars every year. And the fact that even something that extreme is barely visible in the most sensitive imaging from our newest telescope is so exciting to me. It’s potentially telling us there’s a whole population of galaxies that have been hiding from us.”

If that conclusion is confirmed, it suggests the early universe was much dustier than previously thought.

The team published its findings in The Astrophysical Journal.

The COSMOS-Web project — the largest initial JWST research initiative, co-led by Caitlin Casey, an associate professor at UT Austin — aims to map up to 1 million galaxies from a part of the sky the size of three full moons. The goal in part is to study the earliest structures of the universe. The team of more than 50 researchers was awarded 250 hours of observing time in JWST’s first year and received a first batch of data in December 2022, with more coming in through January 2024.

A dusty star-forming galaxy is hard to see in optical light because much of the light from its stars is absorbed by a veil of dust and then re-emitted at redder (or longer) wavelengths. Before JWST, astronomers sometimes referred to them as “Hubble-dark galaxies,” after the most sensitive space telescope available until then.

“Until now, the only way we’ve been able to see galaxies in the early universe is from an optical perspective with Hubble,” McKinney said. “That means our understanding of the history of galaxy evolution is biased because we’re only seeing the unobscured, less dusty galaxies.”

This galaxy, AzTECC71, was first detected as an indistinct blob of dust emission by a camera on the James Clerk Maxwell Telescope in Hawaii that sees in wavelengths between far infrared and microwave. The COSMOS-Web team next spotted the object in data collected by another team using the ALMA telescope in Chile, which has higher spatial resolution and can see in infrared. That allowed them to narrow down the location of the source. When they looked in the JWST data in the infrared at a wavelength of 4.44 microns, they found a faint galaxy in exactly the same place. In shorter wavelengths of light, below 2.7 microns, it was invisible.

Now, the team is working to uncover more of these JWST-faint galaxies.

“With JWST, we can study for the first time the optical and infrared properties of this heavily dust-obscured, hidden population of galaxies,” McKinney said, “because it’s so sensitive that not only can it stare back into the farthest reaches of the universe, but it can also pierce the thickest of dusty veils.”

The team estimates that the galaxy is being viewed at a redshift of about 6, which translates to about 900 million years after the Big Bang.
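As a quick check of that conversion (a minimal sketch using astropy and a standard Planck 2018 cosmology, which may differ slightly from the parameters the COSMOS-Web team adopted), the age of the universe at redshift 6 indeed comes out to roughly 900 million years:

```python
# Minimal sketch: convert a redshift to a cosmic age with astropy,
# assuming the Planck 2018 cosmology (the study's exact parameters may differ).
from astropy.cosmology import Planck18

z = 6.0
age = Planck18.age(z)  # age of the universe at redshift z, as a Quantity in Gyr
print(f"Age of the universe at z = {z}: {age:.2f}")
# Prints roughly 0.93 Gyr, i.e. about 900 million years after the Big Bang.
```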

Study authors from UT Austin are McKinney, Casey, Olivia Cooper (a National Science Foundation graduate research fellow), Arianna Long (a NASA Hubble fellow), Hollis Akins and Maximilien Franco.

Support was provided by NASA through a grant from the Space Telescope Science Institute.



AI headphones let wearer listen to a single person in a crowd, by looking at them just once

Noise-canceling headphones have gotten very good at creating an auditory blank slate. But allowing certain sounds from a wearer’s environment back through that erasure still challenges researchers. The latest edition of Apple’s AirPods Pro, for instance, automatically adjusts sound levels for wearers, sensing when they’re in conversation, but the user has little control over whom to listen to or when this happens.

A University of Washington team has developed an artificial intelligence system that lets a user wearing headphones look at a person speaking for three to five seconds to “enroll” them. The system, called “Target Speech Hearing,” then cancels all other sounds in the environment and plays just the enrolled speaker’s voice in real time even as the listener moves around in noisy places and no longer faces the speaker.

The team presented its findings May 14 in Honolulu at the ACM CHI Conference on Human Factors in Computing Systems. The code for the proof-of-concept device is available for others to build on. The system is not commercially available.

“We tend to think of AI now as web-based chatbots that answer questions,” said senior author Shyam Gollakota, a UW professor in the Paul G. Allen School of Computer Science & Engineering. “But in this project, we develop AI to modify the auditory perception of anyone wearing headphones, given their preferences. With our devices you can now hear a single speaker clearly even if you are in a noisy environment with lots of other people talking.”

To use the system, a person wearing off-the-shelf headphones fitted with microphones taps a button while directing their head at someone talking. The sound waves from that speaker’s voice should then reach the microphones on both sides of the headset simultaneously; there’s a 16-degree margin of error. The headphones send that signal to an on-board embedded computer, where the team’s machine learning software learns the desired speaker’s vocal patterns. The system latches onto that speaker’s voice and continues to play it back to the listener, even as the pair moves around. Its ability to focus on the enrolled voice improves as the speaker keeps talking, giving the model more training data.
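The article does not include code, but the “arrives at both microphones at the same time” condition is essentially a time-difference-of-arrival check. The sketch below illustrates that geometric gate with a simple cross-correlation delay estimate; the microphone spacing, sample rate, and function name are illustrative assumptions rather than details of the UW system.

```python
# Hypothetical sketch of the enrollment gate, not the UW team's code:
# accept enrollment only if the dominant voice reaches the left and right
# headset microphones nearly simultaneously, i.e. the wearer is facing it.
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s
MIC_SPACING = 0.18       # assumed distance between the two earcup mics (metres)
SAMPLE_RATE = 16_000     # assumed audio sample rate (Hz)
MAX_ANGLE_DEG = 16.0     # angular tolerance quoted in the article

def facing_speaker(left: np.ndarray, right: np.ndarray) -> bool:
    """Return True if the dominant source lies within ~16 degrees of straight ahead."""
    # Inter-channel delay from the peak of the cross-correlation of the two mics.
    corr = np.correlate(left, right, mode="full")
    lag_samples = np.argmax(corr) - (len(right) - 1)
    delay_s = lag_samples / SAMPLE_RATE

    # Far-field approximation: delay = (spacing / speed_of_sound) * sin(angle).
    sin_theta = np.clip(SPEED_OF_SOUND * delay_s / MIC_SPACING, -1.0, 1.0)
    angle_deg = np.degrees(np.arcsin(sin_theta))
    return abs(angle_deg) <= MAX_ANGLE_DEG
```

In the system described above, the audio captured while this condition holds is what the machine learning model uses to learn the target speaker’s vocal characteristics before separating that voice from the surrounding noise.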

The team tested its system on 21 subjects, who rated the clarity of the enrolled speaker’s voice nearly twice as high as the unfiltered audio on average.

This work builds on the team’s previous “semantic hearing” research, which allowed users to select specific sound classes — such as birds or voices — that they wanted to hear and canceled other sounds in the environment.

Currently the TSH system can enroll only one speaker at a time, and it’s only able to enroll a speaker when there is not another loud voice coming from the same direction as the target speaker’s voice. If a user isn’t happy with the sound quality, they can run another enrollment on the speaker to improve the clarity.

The team is working to expand the system to earbuds and hearing aids in the future.

Additional co-authors on the paper were Bandhav Veluri, Malek Itani and Tuochao Chen, UW doctoral students in the Allen School, and Takuya Yoshioka, director of research at AssemblyAI. This research was funded by a Moore Inventor Fellow award, a Thomas J. Cable Endowed Professorship and a UW CoMotion Innovation Gap Fund.



Birth of universe’s earliest galaxies observed for first time

Using the James Webb Space Telescope, University of Copenhagen researchers have become the first to see the formation of three of the earliest galaxies in the universe, more than 13 billion years ago. The sensational discovery contributes important knowledge about the universe and is now published in the journal Science.

For the first time in the history of astronomy, researchers at the Niels Bohr Institute have witnessed the birth of three of the universe’s absolute earliest galaxies, somewhere between 13.3 and 13.4 billion years ago.

The discovery was made using the James Webb Space Telescope, which brought these first ‘live observations’ of formative galaxies down to us here on Earth.

Through the telescope, researchers were able to see signals from large amounts of gas that accumulate and accrete onto a mini-galaxy in the process of being built. While this is how galaxies are formed according to theories and computer simulations, it had never actually been witnessed.

“You could say that these are the first ‘direct’ images of galaxy formation that we’ve ever seen. Whereas the James Webb has previously shown us early galaxies at later stages of evolution, here we witness their very birth, and thus, the construction of the first star systems in the universe,” says Assistant Professor Kasper Elm Heintz from the Niels Bohr Institute, who led the new study.

Galaxies born shortly after the Big Bang

The researchers estimate the birth of the three galaxies to have occurred roughly 400-600 million years after the Big Bang, the explosion that began it all. While that sounds like a long time, it corresponds to galaxies forming during the first three to four percent of the universe’s 13.8-billion-year overall lifetime.

Shortly after the Big Bang, the universe was an enormous opaque gas of hydrogen atoms — unlike today, where the night sky is speckled with a blanket of well-defined stars.

“During the few hundred million years after the Big Bang, the first stars formed, before stars and gas began to coalesce into galaxies. This is the process that we see the beginning of in our observations,” explains Associate Professor Darach Watson.

The birth of galaxies took place at a time in the history of the universe known as the Epoch of Reionization, when the energy and light of some of the first galaxies broke through the mists of hydrogen gas.

It is precisely these large amounts of hydrogen gas that the researchers captured using the James Webb Space Telescope’s infrared vision. It is the most distant measurement to date of the cold, neutral hydrogen gas that is the building block of stars and galaxies.

Adds to the understanding of our origins

The study was conducted by Kasper Elm Heintz, in close collaboration with, among others, research colleagues Darach Watson, Gabriel Brammer and PhD student Simone Vejlgaard from the Cosmic Dawn Center at the University of Copenhagen’s Niels Bohr Institute — a center whose stated goal is to investigate and understand the dawn of the universe. This latest result brings them much closer to doing just that.

The research team has already applied for more observation time with the James Webb Space Telescope, with hopes of expanding upon their new result and learning more about the earliest epoch in the formation of galaxies.

“For now, this is about mapping our new observations of galaxies being formed in even greater detail than before. At the same time, we are constantly trying to push the limit of how far out into the universe we can see. So, perhaps we’ll reach even further,” says Simone Vejlgaard.

According to the researcher, the new knowledge contributes to answering one of humanity’s most basic questions.

“One of the most fundamental questions that we humans have always asked is: ‘Where do we come from?’. Here, we piece together a bit more of the answer by shedding light on the moment that some of the universe’s first structures were created. It is a process that we’ll investigate further, until hopefully, we are able to fit even more pieces of the puzzle together,” concludes Associate Professor Gabriel Brammer.

The study was conducted by researchers Kasper E. Heintz, Darach Watson, Gabriel Brammer, Simone Vejlgaard, Anne Hutter, Victoria B. Strait, Jorryt Matthee, Pascal A. Oesch, Pall Jakobsson, Nial R. Tanvir, Peter Laursen, Rohan P. Naidu, Charlotte A. Mason, Meghana Killi, Intae Jung, Tiger Yu-Yang Hsiao, Abdurro’uf, Dan Coe, Pablo Arrabal Haro, Steven L. Finkelstein, & Sune Toft.

The Danish portion of the research is funded by the Danish National Research Foundation and the Carlsberg Foundation.

HOW THEY DID IT

Researchers were able to measure the formation of the universe’s first galaxies by using sophisticated models of how light from these galaxies was absorbed by the neutral hydrogen gas located in and around them. The absorption occurs at a characteristic wavelength of hydrogen known as the Lyman-alpha transition.

By measuring the light, the researchers were able to distinguish gas from the newly formed galaxies from other gas. These measurements were only possible thanks to the James Webb Space Telescope’s incredibly sensitive infrared spectrograph capabilities.
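As a rough illustration of why infrared sensitivity matters here (a sketch using astropy with a standard Planck 2018 cosmology, not the study’s own modelling): the Lyman-alpha transition sits at about 121.6 nanometres in the rest frame, and at the cosmic ages quoted above the expansion of the universe stretches that light well into the near-infrared by the time it reaches the telescope.

```python
# Rough illustration, assuming the Planck 2018 cosmology (not the study's own
# modelling): where the rest-frame 121.6 nm Lyman-alpha transition lands for
# galaxies seen 400-600 million years after the Big Bang.
import astropy.units as u
from astropy.cosmology import Planck18, z_at_value

LYA_REST_NM = 121.567  # rest-frame Lyman-alpha wavelength (nanometres)

for age_gyr in (0.4, 0.6):  # 400 and 600 million years after the Big Bang
    z = float(z_at_value(Planck18.age, age_gyr * u.Gyr))  # redshift at that age
    observed_um = LYA_REST_NM * (1 + z) / 1000.0          # observed wavelength (microns)
    print(f"{age_gyr * 1000:.0f} Myr after the Big Bang: "
          f"z ~ {z:.1f}, Lyman-alpha observed near {observed_um:.2f} microns")
# Roughly z ~ 8-11, so the line is shifted to about 1.1-1.5 microns,
# within the infrared range covered by JWST's spectrographs.
```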

ABOUT THE EARLY UNIVERSE

The universe began its “life” about 13.8 billion years ago in an enormous explosion — the Big Bang. The event gave rise to an abundance of subatomic particles such as quarks and electrons. These particles aggregated to form protons and neutrons, which later coalesced into atomic nuclei. Roughly 380,000 years after the Big Bang, electrons began to orbit atomic nuclei, and the simplest atoms of the universe gradually formed.

The first stars were formed after a few hundred million years. And within the hearts of these stars, the larger and more complex atoms that we have around us were formed.

Later, stars coalesced into galaxies. The oldest galaxies known to us were formed about 300-400 million years after the Big Bang. Our own solar system came into being about 4.6 billion years ago, more than 9 billion years after the Big Bang.



Scientists map networks regulating gene function in the human brain

A consortium of researchers has produced the largest and most advanced multidimensional maps of gene regulation networks in the brains of people with and without mental disorders. These maps detail the many regulatory elements that coordinate the brain’s biological pathways and cellular functions. The research, supported by the National Institutes of Health (NIH), used postmortem brain tissue from more than 2,500 donors to map gene regulation networks across different stages of brain development and multiple brain-related disorders.

“These groundbreaking findings advance our understanding of where, how, and when genetic risk contributes to mental disorders such as schizophrenia, post-traumatic stress disorder, and depression,” said Joshua A. Gordon, M.D., Ph.D., director of NIH’s National Institute of Mental Health (NIMH). “Moreover, the critical resources, shared freely, will help researchers pinpoint genetic variants that are likely to play a causal role in mental illnesses and identify potential molecular targets for new therapeutics.”

The research is published across 15 papers in Science, Science Advances, and Scientific Reports. The papers report findings along several key themes:

  • Population-level analyses that link genetic variants, regulatory elements, and different molecular forms of expressed genes to regulatory networks at the cellular level, in both the developing brain and adult brain
  • Single-cell-level maps of the prefrontal cortex from individuals diagnosed with mental disorders and neurodevelopmental disorders
  • Experimental analyses validating the function of regulatory elements and genetic variants associated with quantitative trait loci (segments of DNA that are linked with observable traits)

The analyses expand on previous findings, exploring multiple cortical and subcortical regions of the human brain. These brain areas play key roles in a range of essential processes, including decision-making, memory, learning, emotion, reward processing, and motor control.

Approximately 2% of the human genome is composed of genes that code for proteins. The remaining 98% includes DNA segments that help regulate the activity of those genes. To better understand how brain structure and function contribute to mental disorders, researchers in the NIMH-funded PsychENCODE Consortium are using standardized methods and data analysis approaches to build a comprehensive picture of these regulatory elements in the human brain.

In addition to these discoveries, the papers also highlight new methods and tools to help researchers analyze and explore the wealth of data produced by this effort. These resources include PsychSCREEN, a web-based platform offering interactive visualization of data from diverse brain cell types in individuals with and without mental disorders. Together, these methods and tools provide a comprehensive, integrated data resource for the broader research community.

The papers focus on the second phase of findings from the PsychENCODE Consortium. This effort aims to advance our understanding of how gene regulation impacts brain function and dysfunction.

“These PsychENCODE Consortium findings shed new light on how gene risk maps onto brain function across developmental stages, brain regions, and disorders,” said Jonathan Pevsner, Ph.D., chief of the NIMH Genomics Research Branch. “The work lays a strong foundation for ongoing efforts to characterize regulatory pathways across disorders, elucidate the role of epigenetic mechanisms, and increase the ancestral diversity represented in studies.”

The PsychENCODE papers published in Science and Science Advances are presented as a collection on the Science website.


