
TOP SCIENCE

First X-rays from Uranus discovered


Astronomers have detected X-rays from Uranus for the first time, using NASA’s Chandra X-ray Observatory. This result may help scientists learn more about this enigmatic ice giant planet in our solar system.

Uranus is the seventh planet from the Sun and has two sets of rings around its equator. The planet, which has four times the diameter of Earth, rotates on its side, making it different from all other planets in the solar system. Since Voyager 2 was the only spacecraft to ever fly by Uranus, astronomers currently rely on telescopes much closer to Earth, like Chandra and the Hubble Space Telescope, to learn about this distant and cold planet that is made up almost entirely of hydrogen and helium.

In the new study, researchers used Chandra observations of Uranus taken in 2002 and again in 2017. They saw a clear detection of X-rays in the first observation, only recently analyzed, and a possible flare of X-rays in the observations obtained fifteen years later. The main graphic shows a Chandra X-ray image of Uranus from 2002 (in pink) superimposed on an optical image from the Keck-I Telescope obtained in a separate study in 2004. The latter shows the planet at approximately the same orientation as it was during the 2002 Chandra observations.

What could cause Uranus to emit X-rays? The answer: mainly the Sun. Astronomers have observed that both Jupiter and Saturn scatter X-ray light given off by the Sun, similar to how Earth’s atmosphere scatters the Sun’s light. While the authors of the new Uranus study initially expected that most of the X-rays detected would also be from scattering, there are tantalizing hints that at least one other source of X-rays is present. If further observations confirm this, it could have intriguing implications for understanding Uranus.

One possibility is that the rings of Uranus are producing X-rays themselves, which is the case for Saturn’s rings. Uranus is surrounded by charged particles such as electrons and protons in its nearby space environment. If these energetic particles collide with the rings, they could cause the rings to glow in X-rays. Another possibility is that at least some of the X-rays come from auroras on Uranus, a phenomenon that has previously been observed on this planet at other wavelengths.

On Earth, we can see colorful light shows in the sky called auroras, which happen when high-energy particles interact with the atmosphere. X-rays are emitted in Earth’s auroras, produced by energetic electrons after they travel down the planet’s magnetic field lines to its poles and are slowed down by the atmosphere. Jupiter has auroras, too. The X-rays from auroras on Jupiter come from two sources: electrons traveling down magnetic field lines, as on Earth, and positively charged atoms and molecules raining down at Jupiter’s polar regions. However, scientists are less certain about what causes auroras on Uranus. Chandra’s observations may help figure out this mystery.

Uranus is an especially interesting target for X-ray observations because of the unusual orientations of its spin axis and its magnetic field. While the rotation and magnetic field axes of the other planets of the solar system are almost perpendicular to the plane of their orbit, the rotation axis of Uranus is nearly parallel to its path around the Sun. Furthermore, while Uranus is tilted on its side, its magnetic field is tilted by a different amount, and offset from the planet’s center. This may cause its auroras to be unusually complex and variable. Determining the sources of the X-rays from Uranus could help astronomers better understand how more exotic objects in space, such as growing black holes and neutron stars, emit X-rays.

A paper describing these results appears in the most recent issue of the Journal of Geophysical Research. The authors are William Dunn (University College London, United Kingdom), Jan-Uwe Ness (University of Marseille, France), Laurent Lamy (Paris Observatory, France), Grant Tremblay (Center for Astrophysics | Harvard & Smithsonian), Graziella Branduardi-Raymont (University College London), Bradford Snios (CfA), Ralph Kraft (CfA), Z. Yao (Chinese Academy of Sciences, Beijing), and Affelia Wibisono (University College London).

NASA’s Marshall Space Flight Center manages the Chandra program. The Smithsonian Astrophysical Observatory’s Chandra X-ray Center controls science operations from Cambridge, Massachusetts, and flight operations from Burlington, Massachusetts.



AI headphones let wearer listen to a single person in a crowd, by looking at them just once



Noise-canceling headphones have gotten very good at creating an auditory blank slate. But allowing certain sounds from a wearer’s environment through the erasure still challenges researchers. The latest edition of Apple’s AirPods Pro, for instance, automatically adjusts sound levels for wearers — sensing when they’re in conversation, for instance — but the user has little control over whom to listen to or when this happens.

A University of Washington team has developed an artificial intelligence system that lets a user wearing headphones look at a person speaking for three to five seconds to “enroll” them. The system, called “Target Speech Hearing,” then cancels all other sounds in the environment and plays just the enrolled speaker’s voice in real time even as the listener moves around in noisy places and no longer faces the speaker.

The team presented its findings May 14 in Honolulu at the ACM CHI Conference on Human Factors in Computing Systems. The code for the proof-of-concept device is available for others to build on. The system is not commercially available.

“We tend to think of AI now as web-based chatbots that answer questions,” said senior author Shyam Gollakota, a UW professor in the Paul G. Allen School of Computer Science & Engineering. “But in this project, we develop AI to modify the auditory perception of anyone wearing headphones, given their preferences. With our devices you can now hear a single speaker clearly even if you are in a noisy environment with lots of other people talking.”

To use the system, a person wearing off-the-shelf headphones fitted with microphones taps a button while directing their head at someone talking. The sound waves from that speaker’s voice then should reach the microphones on both sides of the headset simultaneously; there’s a 16-degree margin of error. The headphones send that signal to an on-board embedded computer, where the team’s machine learning software learns the desired speaker’s vocal patterns. The system latches onto that speaker’s voice and continues to play it back to the listener, even as the pair moves around. The system’s ability to focus on the enrolled voice improves as the speaker keeps talking, giving the system more training data.
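The direction check described above can be sketched with simple geometry: a distant speaker at an angle off the wearer's facing direction reaches the two ear-mounted microphones with a time difference of roughly d·sin(θ)/c. This is an illustrative simplification, not the team's actual code; the microphone spacing and the use of a simple delay threshold are assumptions for the sketch.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees Celsius
MIC_SPACING = 0.18      # m; assumed distance between the two headset microphones

def interaural_delay(angle_deg: float) -> float:
    """Time-difference-of-arrival (seconds) between the two microphones
    for a distant speaker at angle_deg off the wearer's facing direction."""
    return MIC_SPACING * math.sin(math.radians(angle_deg)) / SPEED_OF_SOUND

def within_enrollment_cone(angle_deg: float, margin_deg: float = 16.0) -> bool:
    """A speaker counts as 'looked at' if the arrival-time difference is no
    larger than the delay at the margin angle (the paper reports 16 degrees)."""
    return abs(interaural_delay(angle_deg)) <= interaural_delay(margin_deg)

print(within_enrollment_cone(5.0))   # speaker nearly dead ahead
print(within_enrollment_cone(30.0))  # speaker well off-axis
```

In practice the real system goes beyond this geometric gate: once enrolled, it tracks the speaker by learned vocal patterns, which is why it keeps working after the wearer looks away.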

The team tested its system on 21 subjects, who rated the clarity of the enrolled speaker’s voice nearly twice as high as the unfiltered audio on average.

This work builds on the team’s previous “semantic hearing” research, which allowed users to select specific sound classes — such as birds or voices — that they wanted to hear and canceled other sounds in the environment.

Currently the TSH system can enroll only one speaker at a time, and it’s only able to enroll a speaker when there is not another loud voice coming from the same direction as the target speaker’s voice. If a user isn’t happy with the sound quality, they can run another enrollment on the speaker to improve the clarity.

The team is working to expand the system to earbuds and hearing aids in the future.

Additional co-authors on the paper were Bandhav Veluri, Malek Itani and Tuochao Chen, UW doctoral students in the Allen School, and Takuya Yoshioka, director of research at AssemblyAI. This research was funded by a Moore Inventor Fellow award, a Thomas J. Cable Endowed Professorship and a UW CoMotion Innovation Gap Fund.





Birth of universe’s earliest galaxies observed for first time



Using the James Webb Space Telescope, University of Copenhagen researchers have become the first to see the formation of three of the earliest galaxies in the universe, more than 13 billion years ago. The sensational discovery contributes important knowledge about the universe and is now published in the journal Science.

For the first time in the history of astronomy, researchers at the Niels Bohr Institute have witnessed the birth of three of the universe’s absolute earliest galaxies, somewhere between 13.3 and 13.4 billion years ago.

The discovery was made using the James Webb Space Telescope, which brought these first ‘live observations’ of formative galaxies down to us here on Earth.

Through the telescope, researchers were able to see signals from large amounts of gas that accumulate and accrete onto a mini-galaxy in the process of being built. While this is how galaxies are formed according to theories and computer simulations, it had never actually been witnessed.

“You could say that these are the first ‘direct’ images of galaxy formation that we’ve ever seen. Whereas the James Webb has previously shown us early galaxies at later stages of evolution, here we witness their very birth, and thus, the construction of the first star systems in the universe,” says Assistant Professor Kasper Elm Heintz from the Niels Bohr Institute, who led the new study.

Galaxies born shortly after the Big Bang

The researchers estimate the birth of the three galaxies to have occurred roughly 400-600 million years after the Big Bang, the explosion that began it all. While that sounds like a long time, it corresponds to galaxies forming during the first three to four percent of the universe’s 13.8-billion-year overall lifetime.
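The quoted fraction is easy to verify: dividing the estimated birth times by the universe's 13.8-billion-year age gives roughly three to four percent.

```python
# Quick arithmetic check of the fraction quoted above.
AGE_OF_UNIVERSE_GYR = 13.8

for birth_myr in (400, 600):
    fraction = (birth_myr / 1000.0) / AGE_OF_UNIVERSE_GYR
    print(f"{birth_myr} million years after the Big Bang is "
          f"{fraction:.1%} of the universe's age")
```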

Shortly after the Big Bang, the universe was an enormous opaque gas of hydrogen atoms — unlike today, where the night sky is speckled with a blanket of well-defined stars.

“During the few hundred million years after the Big Bang, the first stars formed, before stars and gas began to coalesce into galaxies. This is the process that we see the beginning of in our observations,” explains Associate Professor Darach Watson.

The birth of galaxies took place at a time in the history of the universe known as the Epoch of Reionization, when the energy and light of some of the first galaxies broke through the mists of hydrogen gas.

It is precisely these large amounts of hydrogen gas that the researchers captured using the James Webb Space Telescope’s infrared vision. This is the most distant measurement to date of the cold, neutral hydrogen gas that is the building block of stars and galaxies.

Adds to the understanding of our origins

The study was conducted by Kasper Elm Heintz, in close collaboration with, among others, research colleagues Darach Watson, Gabriel Brammer and PhD student Simone Vejlgaard from the Cosmic Dawn Center at the University of Copenhagen’s Niels Bohr Institute — a center whose stated goal is to investigate and understand the dawn of the universe. This latest result brings them much closer to doing just that.

The research team has already applied for more observation time with the James Webb Space Telescope, with hopes of expanding upon their new result and learning more about the earliest epoch in the formation of galaxies.

“For now, this is about mapping our new observations of galaxies being formed in even greater detail than before. At the same time, we are constantly trying to push the limit of how far out into the universe we can see. So, perhaps we’ll reach even further,” says Simone Vejlgaard.

According to the researcher, the new knowledge contributes to answering one of humanity’s most basic questions.

“One of the most fundamental questions that we humans have always asked is: ‘Where do we come from?’. Here, we piece together a bit more of the answer by shedding light on the moment that some of the universe’s first structures were created. It is a process that we’ll investigate further, until hopefully, we are able to fit even more pieces of the puzzle together,” concludes Associate Professor Gabriel Brammer.

The study was conducted by researchers Kasper E. Heintz, Darach Watson, Gabriel Brammer, Simone Vejlgaard, Anne Hutter, Victoria B. Strait, Jorryt Matthee, Pascal A. Oesch, Pall Jakobsson, Nial R. Tanvir, Peter Laursen, Rohan P. Naidu, Charlotte A. Mason, Meghana Killi, Intae Jung, Tiger Yu-Yang Hsiao, Abdurro’uf, Dan Coe, Pablo Arrabal Haro, Steven L. Finkelstein, & Sune Toft.

The Danish portion of the research is funded by the Danish National Research Foundation and the Carlsberg Foundation.

HOW THEY DID IT

Researchers were able to measure the formation of the universe’s first galaxies by using sophisticated models of how light from these galaxies was absorbed by the neutral gas located in and around them. This absorption occurs at a characteristic wavelength known as the Lyman-alpha transition.

By measuring the light, the researchers were able to distinguish gas from the newly formed galaxies from other gas. These measurements were only possible thanks to the James Webb Space Telescope’s incredibly sensitive infrared spectrograph capabilities.
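Why an infrared instrument sees this ultraviolet transition follows from cosmological redshift: the Lyman-alpha line, emitted and absorbed at a rest wavelength of 121.567 nanometers, is stretched by a factor of (1 + z) on its way to us. A minimal sketch, assuming redshifts of roughly 10 to 13 to illustrate the quoted epoch (the paper reports the actual measured values):

```python
LYA_REST_NM = 121.567  # Lyman-alpha rest wavelength in nanometers

def observed_wavelength_nm(z: float) -> float:
    """Cosmological redshift stretches a rest wavelength by a factor (1 + z)."""
    return LYA_REST_NM * (1.0 + z)

# Redshifts of roughly 10-13 are an assumption used here for illustration.
for z in (10, 13):
    micrometers = observed_wavelength_nm(z) / 1000.0
    print(f"z = {z}: Lyman-alpha observed near {micrometers:.2f} micrometers")
```

At these redshifts the line lands at roughly 1.3 to 1.7 micrometers, well inside the near-infrared range that Webb's spectrographs were built to observe.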

ABOUT THE EARLY UNIVERSE

The universe began its “life” about 13.8 billion years ago in an enormous explosion — the Big Bang. The event gave rise to an abundance of subatomic particles such as quarks and electrons. These particles aggregated to form protons and neutrons, which later coalesced into atomic nuclei. Roughly 380,000 years after the Big Bang, electrons began to orbit atomic nuclei, and the simplest atoms of the universe gradually formed.

The first stars were formed after a few hundred million years. And within the hearts of these stars, the larger and more complex atoms that we have around us were formed.

Later, stars coalesced into galaxies. The oldest galaxies known to us were formed about 300-400 million years after the Big Bang. Our own solar system came into being about 4.6 billion years ago — more than 9 billion years after the Big Bang.





Scientists map networks regulating gene function in the human brain



A consortium of researchers has produced the largest and most advanced multidimensional maps of gene regulation networks in the brains of people with and without mental disorders. These maps detail the many regulatory elements that coordinate the brain’s biological pathways and cellular functions. The research, supported by the National Institutes of Health (NIH), used postmortem brain tissue from over 2,500 donors to map gene regulation networks across different stages of brain development and multiple brain-related disorders.

“These groundbreaking findings advance our understanding of where, how, and when genetic risk contributes to mental disorders such as schizophrenia, post-traumatic stress disorder, and depression,” said Joshua A. Gordon, M.D., Ph.D., director of NIH’s National Institute of Mental Health (NIMH). “Moreover, the critical resources, shared freely, will help researchers pinpoint genetic variants that are likely to play a causal role in mental illnesses and identify potential molecular targets for new therapeutics.”

The research is published across 15 papers in Science, Science Advances, and Scientific Reports. The papers report findings along several key themes:

  • Population-level analyses that link genetic variants, regulatory elements, and different molecular forms of expressed genes to regulatory networks at the cellular level, in both the developing brain and adult brain
  • Single-cell-level maps of the prefrontal cortex from individuals diagnosed with mental disorders and neurodevelopmental disorders
  • Experimental analyses validating the function of regulatory elements and genetic variants associated with quantitative trait loci (segments of DNA that are linked with observable traits)

The analyses expand on previous findings, exploring multiple cortical and subcortical regions of the human brain. These brain areas play key roles in a range of essential processes, including decision-making, memory, learning, emotion, reward processing, and motor control.

Approximately 2% of the human genome is composed of genes that code for proteins. The remaining 98% includes DNA segments that help regulate the activity of those genes. To better understand how brain structure and function contribute to mental disorders, researchers in the NIMH-funded PsychENCODE Consortium are using standardized methods and data analysis approaches to build a comprehensive picture of these regulatory elements in the human brain.

In addition to these discoveries, the papers also highlight new methods and tools to help researchers analyze and explore the wealth of data produced by this effort. These resources include PsychSCREEN, a web-based platform offering interactive visualizations of data from diverse brain cell types in individuals with and without mental disorders. Together, these methods and tools provide a comprehensive, integrated data resource for the broader research community.

The papers focus on the second phase of findings from the PsychENCODE Consortium. This effort aims to advance our understanding of how gene regulation impacts brain function and dysfunction.

“These PsychENCODE Consortium findings shed new light on how gene risk maps onto brain function across developmental stages, brain regions, and disorders,” said Jonathan Pevsner, Ph.D., chief of the NIMH Genomics Research Branch. “The work lays a strong foundation for ongoing efforts to characterize regulatory pathways across disorders, elucidate the role of epigenetic mechanisms, and increase the ancestral diversity represented in studies.”

The PsychENCODE papers published in Science and Science Advances are presented as a collection on the Science website.


