Neuroscientists roll out first comprehensive atlas of brain cells: BRAIN initiative consortium takes census of motor cortex cells in mice, marmoset and humans

When you clicked to read this story, a band of cells across the top of your brain sent signals down your spine and out to your hand to tell the muscles in your index finger to press down with just the right amount of pressure to activate your mouse or track pad.

A slew of new studies now shows that the area of the brain responsible for initiating this action — the primary motor cortex, which controls movement — has as many as 116 different types of cells that work together to make this happen.

The 17 studies, appearing online Oct. 6 in the journal Nature, are the result of five years of work by a huge consortium of researchers supported by the National Institutes of Health’s Brain Research Through Advancing Innovative Neurotechnologies (BRAIN) Initiative to identify the myriad of different cell types in one portion of the brain. It is the first step in a long-term project to generate an atlas of the entire brain to help understand how the neural networks in our head control our body and mind and how they are disrupted in cases of mental and physical problems.

“If you think of the brain as an extremely complex machine, how could we understand it without first breaking it down and knowing the parts?” asked cellular neuroscientist Helen Bateup, a University of California, Berkeley, associate professor of molecular and cell biology and co-author of the flagship paper that synthesizes the results of the other papers. “The first page of any manual of how the brain works should read: Here are all the cellular components, this is how many of them there are, here is where they are located and who they connect to.”

Individual researchers have previously identified dozens of cell types based on their shape, size, electrical properties and which genes are expressed in them. The new studies identify about five times more cell types, though many are subtypes of well-known cell types. For example, cells that release specific neurotransmitters, like gamma-aminobutyric acid (GABA) or glutamate, each have more than a dozen subtypes distinguishable from one another by their gene expression and electrical firing patterns.

While the current papers address only the motor cortex, the BRAIN Initiative Cell Census Network (BICCN) — created in 2017 — endeavors to map all the different cell types throughout the brain, which consists of more than 160 billion individual cells, both neurons and support cells called glia. The BRAIN Initiative was launched in 2013 by then-President Barack Obama.


“Once we have all those parts defined, we can then go up a level and start to understand how those parts work together, how they form a functional circuit, how that ultimately gives rise to perceptions and behavior and much more complex things,” Bateup said.

Together with former UC Berkeley professor John Ngai, Bateup and UC Berkeley colleague Dirk Hockemeyer have already used CRISPR-Cas9 to create mice in which a specific cell type is labeled with a fluorescent marker, allowing them to track the connections these cells make throughout the brain. For the flagship journal paper, the Berkeley team created two strains of “knock-in” reporter mice that provided novel tools for illuminating the connections of the newly identified cell types, she said.

“One of our many limitations in developing effective therapies for human brain disorders is that we just don’t know enough about which cells and connections are being affected by a particular disease and therefore can’t pinpoint with precision what and where we need to target,” said Ngai, who led UC Berkeley’s Brain Initiative efforts before being tapped last year to direct the entire national initiative. “Detailed information about the types of cells that make up the brain and their properties will ultimately enable the development of new therapies for neurologic and neuropsychiatric diseases.”

Ngai is one of 13 corresponding authors of the flagship paper, which has more than 250 co-authors in all.

Bateup, Hockemeyer and Ngai collaborated on an earlier study to profile all the active genes in single dopamine-producing cells in the mouse’s midbrain, which has structures similar to those in the human brain. This same profiling technique, which involves identifying all the specific messenger RNA molecules and their levels in each cell, was employed by other BICCN researchers to profile cells in the motor cortex. This type of analysis, using a technique called single-cell RNA sequencing, or scRNA-seq, is referred to as transcriptomics.
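
For readers who want a concrete sense of what a transcriptomic census involves, the sketch below shows a generic single-cell RNA-seq clustering workflow using the open-source scanpy library. It is a minimal illustration of the general technique, not the BICCN pipeline; the input file name is hypothetical and the parameter values are common defaults rather than settings from the studies.

```python
# Minimal, generic scRNA-seq clustering sketch (not the BICCN pipeline).
# Assumes a cell-by-gene count matrix stored as an AnnData .h5ad file.
import scanpy as sc

adata = sc.read_h5ad("motor_cortex_counts.h5ad")      # hypothetical input file

sc.pp.filter_cells(adata, min_genes=200)              # drop near-empty cells
sc.pp.filter_genes(adata, min_cells=3)                # drop rarely detected genes
sc.pp.normalize_total(adata, target_sum=1e4)          # depth-normalize each cell
sc.pp.log1p(adata)                                    # variance-stabilizing transform
sc.pp.highly_variable_genes(adata, n_top_genes=2000)
adata = adata[:, adata.var.highly_variable].copy()

sc.pp.scale(adata, max_value=10)
sc.tl.pca(adata, n_comps=50)                          # compress expression profiles
sc.pp.neighbors(adata, n_neighbors=15)                # cell-cell similarity graph
sc.tl.leiden(adata, resolution=1.0)                   # graph clusters = putative cell types
sc.tl.umap(adata)
sc.pl.umap(adata, color="leiden")                     # 2-D map colored by cluster
```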


The scRNA-seq technique was one of nearly a dozen separate experimental methods used by the BICCN team to characterize the different cell types in three different mammals: mice, marmosets and humans. Four of these involved different ways of identifying gene expression levels and determining the genome’s chromatin architecture and DNA methylation status, features collectively referred to as the epigenome. Other techniques included classical electrophysiological patch clamp recordings to distinguish cells by how they fire action potentials, categorizing cells by shape, determining their connectivity, and looking at where the cells are spatially located within the brain. Several of these used machine learning or artificial intelligence to distinguish cell types.

“This was the most comprehensive description of these cell types, and with high resolution and different methodologies,” Hockemeyer said. “The conclusion of the paper is that there’s remarkable overlap and consistency in determining cell types with these different methods.”

A team of statisticians combined data from all these experimental methods to determine how best to classify or cluster cells into different types and, presumably, different functions based on the observed differences in expression and epigenetic profiles among these cells. While there are many statistical algorithms for analyzing such data and identifying clusters, the challenge was to determine which clusters were truly different from one another — truly different cell types — said Sandrine Dudoit, a UC Berkeley professor and chair of the Department of Statistics. She and biostatistician Elizabeth Purdom, UC Berkeley associate professor of statistics, were key members of the statistical team and co-authors of the flagship paper.

“The idea is not to create yet another new clustering method, but to find ways of leveraging the strengths of different methods and combining methods and to assess the stability of the results, the reproducibility of the clusters you get,” Dudoit said. “That’s really a key message about all these studies that look for novel cell types or novel categories of cells: No matter what algorithm you try, you’ll get clusters, so it is key to really have confidence in your results.”
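
As a toy illustration of the stability check Dudoit describes, the sketch below re-clusters random subsamples of the data and asks how well the resulting labels agree with a reference clustering. K-means and the adjusted Rand index from scikit-learn are used purely as stand-ins; the consortium's actual statistical methods are not specified here.

```python
# Toy cluster-stability check: re-cluster random subsamples and measure
# agreement with a reference clustering. K-means and the adjusted Rand
# index are stand-ins, not the consortium's actual methods.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_rand_score

def cluster_stability(X, k, n_trials=20, frac=0.8, seed=0):
    rng = np.random.default_rng(seed)
    ref = KMeans(n_clusters=k, n_init=10, random_state=seed).fit_predict(X)
    scores = []
    for _ in range(n_trials):
        idx = rng.choice(len(X), size=int(frac * len(X)), replace=False)
        sub = KMeans(n_clusters=k, n_init=10).fit_predict(X[idx])
        scores.append(adjusted_rand_score(ref[idx], sub))  # agreement on shared cells
    return float(np.mean(scores))

# Stand-in data: rows = cells, columns = expression-derived features.
X = np.random.default_rng(1).normal(size=(1000, 50))
for k in (5, 10, 20):
    print(f"k={k}: mean ARI = {cluster_stability(X, k):.2f}")
# Clusters that persist under resampling (high ARI) are better candidates for
# real cell types than clusters that dissolve when the data are perturbed.
```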

Bateup noted that the number of individual cell types identified in the new study depended on the technique used and ranged from dozens to 116. One finding, for example, was that humans have about twice as many different types of inhibitory neurons as excitatory neurons in this region of the brain, while mice have five times as many.

“Before, we had something like 10 or 20 different cell types that had been defined, but we had no idea if the cells we were defining by their patterns of gene expression were the same ones as those defined based on their electrophysiological properties, or the same as the neuron types defined by their morphology,” Bateup said.

“The big advance by the BICCN is that we combined many different ways of defining a cell type and integrated them to come up with a consensus taxonomy that’s not just based on gene expression or on physiology or morphology, but takes all of those properties into account,” Hockemeyer said. “So, now we can say this particular cell type expresses these genes, has this morphology, has these physiological properties, and is located in this particular region of the cortex. So, you have a much deeper, granular understanding of what that cell type is and its basic properties.”

Dudoit cautioned that future studies could show that the number of cell types identified in the motor cortex is an overestimate, but the current studies are a good start in assembling a cell atlas of the whole brain.

“Even among biologists, there are vastly different opinions as to how much resolution you should have for these systems, whether there is this very, very fine clustering structure or whether you really have higher level cell types that are more stable,” she said. “Nevertheless, these results show the power of collaboration and pulling together efforts across different groups. We’re starting with a biological question, but a biologist alone could not have solved that problem. To address a big challenging problem like that, you want a team of experts in a bunch of different disciplines that are able to communicate well and work well with each other.”

Other members of the UC Berkeley team included postdoctoral scientists Rebecca Chance and David Stafford, graduate student Daniel Kramer, research technician Shona Allen of the Department of Molecular and Cell Biology, doctoral student Hector Roux de Bézieux of the School of Public Health and postdoctoral fellow Koen Van den Berge of the Department of Statistics. Bateup is a member of the Helen Wills Neuroscience Institute, Hockemeyer is a member of the Innovative Genomics Institute, and both are investigators funded by the Chan Zuckerberg Biohub.

Mechanism found to determine which memories last

Neuroscientists have established in recent decades the idea that some of each day’s experiences are converted by the brain into permanent memories during sleep the same night. Now, a new study proposes a mechanism that determines which memories are tagged as important enough to linger in the brain until sleep makes them permanent.

Led by researchers from NYU Grossman School of Medicine, the study revolves around brain cells called neurons that “fire” — or bring about swings in the balance of their positive and negative charges — to transmit electrical signals that encode memories. Large groups of neurons in a brain region called the hippocampus fire together in rhythmic cycles, creating sequences of signals within milliseconds of each other that can encode complex information.

Called “sharp wave-ripples,” these “shouts” to the rest of the brain represent the near-simultaneous firing of 15 percent of hippocampal neurons, and are named for the shape they take when their activity is captured by electrodes and recorded on a graph.

While past studies had linked ripples with memory formation during sleep, the new study, published online in the journal Science on March 28, found that daytime events followed immediately by five to 20 sharp wave-ripples are replayed more during sleep and so consolidated into permanent memories. Events followed by very few or no sharp wave-ripples failed to form lasting memories.
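
To make the recording side of this concrete, the sketch below shows one common way such events are flagged in a hippocampal local field potential trace: band-pass filter in the ripple band, take the signal envelope, and keep stretches that exceed a threshold. The 150 to 250 Hz band and the 3-standard-deviation threshold are conventional choices for illustration, not the criteria used in the Science paper.

```python
# Illustrative sharp wave-ripple detector for a local field potential (LFP)
# trace. Band and threshold are conventional choices, not the study's criteria.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def detect_ripples(lfp, fs, band=(150.0, 250.0), thresh_sd=3.0):
    nyq = fs / 2.0
    b, a = butter(3, [band[0] / nyq, band[1] / nyq], btype="band")
    filtered = filtfilt(b, a, lfp)                 # isolate the ripple band
    envelope = np.abs(hilbert(filtered))           # instantaneous amplitude
    z = (envelope - envelope.mean()) / envelope.std()
    above = z > thresh_sd
    edges = np.diff(above.astype(int))             # +1 = event start, -1 = event end
    starts = np.where(edges == 1)[0]
    stops = np.where(edges == -1)[0]
    return [(s / fs, e / fs) for s, e in zip(starts, stops)]  # times in seconds

fs = 1250.0                                        # a typical LFP sampling rate
lfp = np.random.default_rng(0).normal(size=int(60 * fs))  # stand-in for 1 min of data
print(len(detect_ripples(lfp, fs)), "candidate events")
```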

“Our study finds that sharp wave-ripples are the physiological mechanism used by the brain to ‘decide’ what to keep and what to discard,” said senior study author György Buzsáki, MD, PhD, the Biggs Professor of Neuroscience in the Department of Neuroscience and Physiology at NYU Langone Health.

Walk and Pause

The new study is based on a known pattern: mammals including humans experience the world for a few moments, then pause, then experience a little more, then pause again. After we pay attention to something, say the study authors, brain computation often switches into an “idle” re-assessment mode. Such momentary pauses occur throughout the day, but the longest idling periods occur during sleep.

Buzsáki and colleagues had previously established that no sharp wave-ripples occur as we actively explore sensory information or move, but only during the idle pauses before or after. The current study found that sharp wave-ripples represent the natural tagging mechanism during such pauses after waking experiences, with the tagged neuronal patterns reactivated during post-task sleep.

Importantly, sharp wave-ripples are known to be made up of the firing of hippocampal “place cells” in a specific order that encodes every room we enter, and each arm of a maze entered by a mouse. For memories that are retained, those same cells fire at high speed as we sleep, “playing back the recorded event thousands of times per night.” The process strengthens the connections between the cells involved.

For the current study, the researchers used electrodes to track the populations of hippocampal cells that fired during successive maze runs; these populations constantly changed over time even though the runs captured very similar experiences. This revealed, for the first time, which maze runs were followed by ripples during waking pauses and were then replayed during sleep.

Sharp wave-ripples were typically recorded when a mouse paused to enjoy a sugary treat after each maze run. The consumption of the reward, say the authors, prepared the brain to switch from an exploratory to an idle pattern so that sharp wave-ripples could occur.

Using dual-sided silicon probes, the research team was able to record up to 500 neurons simultaneously in the hippocampus of animals during maze runs. This in turn created a challenge, because the data become exceedingly complex as more neurons are recorded independently. To gain an intuitive understanding of the data, visualize neuronal activity, and form hypotheses, the team successfully reduced the number of dimensions in the data, in some ways like converting a three-dimensional image into a flat one, without losing the data’s integrity.
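
The article does not say which dimensionality-reduction method the team used, so the sketch below uses principal component analysis as a generic stand-in: binned firing rates from hundreds of simultaneously recorded neurons are projected onto a handful of components that capture most of the population's shared structure.

```python
# Generic dimensionality reduction of population activity (PCA as a stand-in;
# the study's actual method is not specified in this article).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
rates = rng.poisson(lam=2.0, size=(5000, 500)).astype(float)  # time bins x ~500 neurons

pca = PCA(n_components=3)
trajectory = pca.fit_transform(rates - rates.mean(axis=0))    # low-D population trajectory

print(trajectory.shape)                 # (5000, 3): activity as a path in 3-D
print(pca.explained_variance_ratio_)    # variance captured by each component
```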

“We worked to take the external world out of the equation, and looked at the mechanisms by which the mammalian brain innately and subconsciously tags some memories to become permanent,” said first author Wannan (Winnie) Yang, PhD, a graduate student in Buzsáki’s lab. “Why such a system evolved is still a mystery, but future research may reveal devices or therapies that can adjust sharp wave-ripples to improve memory, or even lessen recall of traumatic events.”

Along with Drs. Buzsáki and Yang, study authors from the Neuroscience Institute at NYU Langone Health were Roman Huszár and Thomas Hainmueller. Kirill Kiselev of the Center for Neural Science at New York University was also an author, as was Chen Sun of Mila, the Quebec Artificial Intelligence Institute, in Montréal. The work was supported by National Institutes of Health grants R01MH122391 and U19NS107616.



Long-period oscillations control the Sun’s differential rotation

The interior of the Sun does not rotate at the same rate at all latitudes. The physical origin of this differential rotation is not fully understood. A team of scientists at the Max Planck Institute for Solar System Research (MPS) in Germany has made a ground-breaking discovery. As the team reports today in the journal Science Advances, the long-period solar oscillations discovered by MPS scientists in 2021 play a crucial role in controlling the Sun’s rotational pattern. The long-period oscillations are analogous to the baroclinically unstable waves in Earth’s atmosphere that shape the weather. In the Sun, these oscillations carry heat from the slightly hotter poles to the slightly cooler equator. To obtain their new results, the scientists interpreted observations from NASA’s Solar Dynamics Observatory using cutting-edge numerical simulations of the solar interior. They found that the difference in temperature between the poles and the equator is about seven degrees.

The Sun’s differential rotation pattern has puzzled scientists for decades: while the poles rotate with a period of approximately 34 days, mid-latitudes rotate faster and the equatorial region requires only approximately 24 days for a full rotation. In addition, in past years advances in helioseismology, i.e. probing the solar interior with the help of solar acoustic waves, have established that this rotational profile is nearly constant throughout the entire convection zone. This layer of the Sun stretches from a depth of approximately 200,000 kilometers to the visible solar surface and is home to violent upheavals of hot plasma which play a crucial role in driving solar magnetism and activity.

While theoretical models have long postulated a slight temperature difference between solar poles and equator to maintain the Sun’s rotational pattern, it has proven notoriously difficult to measure. After all, observations have to “look through” the background of the Sun’s deep interior, which reaches temperatures of millions of degrees. However, as the researchers from MPS show, it is now possible to determine the temperature difference from the observations of the long-period oscillations of the Sun.
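
For orientation, the rotation periods quoted above are often summarized with an empirical surface rotation law of the following form (a standard parameterization used here for illustration, not necessarily the profile fitted in the Science Advances paper), where λ is heliographic latitude:

```latex
\[
  \Omega(\lambda) \;=\; \Omega_{\mathrm{eq}} + B\,\sin^{2}\lambda + C\,\sin^{4}\lambda ,
  \qquad
  T(\lambda) \;=\; \frac{2\pi}{\Omega(\lambda)} ,
\]
\[
  T(0^{\circ}) \approx 24\ \text{days (equator)},
  \qquad
  T(90^{\circ}) \approx 34\ \text{days (poles)} .
\]
```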

In their analysis of observational data obtained by the Helioseismic and Magnetic Imager (HMI) onboard NASA’s Solar Dynamics Observatory from 2017 to 2021, the scientists turned to global solar oscillations with long periods that can be discerned as swirling motions at the solar surface. Scientists from MPS reported their discovery of these inertial oscillations three years ago. Among these observed modes, the high-latitude modes, with velocities of up to 70 km per hour, proved to be especially influential.

To study the nonlinear nature of these high-latitude oscillations, a set of three-dimensional numerical simulations was conducted. In their simulations, the high-latitude oscillations carry heat from the solar poles to the equator, which limits the temperature difference between the Sun’s poles and the equator to less than seven degrees. “This very small temperature difference between the poles and the equator controls the angular momentum balance in the Sun and thus is an important feedback mechanism for the Sun’s global dynamics,” says MPS Director Prof. Dr. Laurent Gizon.

In their simulations, the researchers for the first time described the crucial processes in a fully three-dimensional model. Former endeavors had been limited to two-dimensional approaches that assumed symmetry about the Sun’s rotation axis. “Matching the nonlinear simulations to the observations allowed us to understand the physics of the long-period oscillations and their role in controlling the Sun’s differential rotation,” says MPS postdoc and the lead author of the study, Dr. Yuto Bekki.

The solar high-latitude oscillations are driven by a temperature gradient in a similar way to extratropical cyclones on Earth. The physics is similar, though the details are different: “In the Sun, the solar pole is about seven degrees hotter than the equator, and this is enough to drive flows of about 70 kilometers per hour over a large fraction of the Sun. The process is somewhat similar to the driving of cyclones,” says MPS scientist Dr. Robert Cameron.

Probing the physics of the Sun’s deep interior is difficult. This study is important as it shows that the long-period oscillations of the Sun are not only useful probes of the solar interior, but that they play an active role in the way the Sun works. Future work, which will be carried out in the context of the ERC Synergy Grant WHOLESUN and the DFG Collaborative Research Center 1456 Mathematics of Experiments, will be aimed at better understanding the role of these oscillations and their diagnostic potential.



Artificial reef designed by MIT engineers could protect marine life, reduce storm damage

The beautiful, gnarled, nooked-and-crannied reefs that surround tropical islands serve as a marine refuge and natural buffer against stormy seas. But as the effects of climate change bleach and break down coral reefs around the world, and extreme weather events become more common, coastal communities are left increasingly vulnerable to frequent flooding and erosion.

An MIT team is now hoping to fortify coastlines with “architected” reefs — sustainable, offshore structures engineered to mimic the wave-buffering effects of natural reefs while also providing pockets for fish and other marine life.

The team’s reef design centers on a cylindrical structure surrounded by four rudder-like slats. The engineers found that when this structure stands up against a wave, it efficiently breaks the wave into turbulent jets that ultimately dissipate most of the wave’s total energy. The team has calculated that the new design could reduce as much wave energy as existing artificial reefs, using 10 times less material.

The researchers plan to fabricate each cylindrical structure from sustainable cement, which they would mold in a pattern of “voxels” that could be automatically assembled, and would provide pockets for fish to explore and other marine life to settle in. The cylinders could be connected to form a long, semipermeable wall, which the engineers could erect along a coastline, about half a mile from shore. Based on the team’s initial experiments with lab-scale prototypes, the architected reef could reduce the energy of incoming waves by more than 95 percent.

“This would be like a long wave-breaker,” says Michael Triantafyllou, the Henry L. and Grace Doherty Professor in Ocean Science and Engineering in the Department of Mechanical Engineering. “If waves are 6 meters high coming toward this reef structure, they would be ultimately less than a meter high on the other side. So, this kills the impact of the waves, which could prevent erosion and flooding.”
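
To connect the two figures quoted above, linear wave theory (a textbook relation used here for illustration, not a derivation given by the MIT team) ties the energy carried per unit sea-surface area to the square of the wave height, so a dissipated energy fraction D translates into a transmitted height H_t as follows, where ρ is the water density, g the gravitational acceleration, and H_i the incident height:

```latex
\[
  E \;=\; \tfrac{1}{8}\,\rho\, g\, H^{2}
  \qquad\Longrightarrow\qquad
  \frac{H_{t}}{H_{i}} \;=\; \sqrt{1 - D} .
\]
\[
  D = 0.95 \;\Rightarrow\; H_{t} \approx 0.22\,H_{i}
  \;\;(6\ \mathrm{m} \rightarrow \approx 1.3\ \mathrm{m}),
  \qquad
  6\ \mathrm{m} \rightarrow 1\ \mathrm{m} \;\Rightarrow\; D \gtrsim 0.97 .
\]
```

In other words, the quoted 95 percent figure and the 6-meter-to-under-1-meter example both correspond to dissipating the overwhelming majority of the incident wave energy.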

Details of the architected reef design are reported today in a study appearing in the open-access journal PNAS Nexus. Triantafyllou’s MIT co-authors are Edvard Ronglan SM ’23; graduate students Alfonso Parra Rubio, Jose del Auila Ferrandis, and Erik Strand; research scientists Patricia Maria Stathatou and Carolina Bastidas; and Professor Neil Gershenfeld, director of the Center for Bits and Atoms; along with Alexis Oliveira Da Silva at the Polytechnic Institute of Paris, Dixia Fan of Westlake University, and Jeffrey Gair Jr. of Scinetics, Inc.

Leveraging turbulence

Some regions have already erected artificial reefs to protect their coastlines from encroaching storms. These structures are typically sunken ships, retired oil and gas platforms, and even assembled configurations of concrete, metal, tires, and stones. However, there’s variability in the types of artificial reefs that are currently in place, and no standard for engineering such structures. What’s more, the designs that are deployed tend to have a low wave dissipation per unit volume of material used. That is, it takes a huge amount of material to break enough wave energy to adequately protect coastal communities.

The MIT team instead looked for ways to engineer an artificial reef that would efficiently dissipate wave energy with less material, while also providing a refuge for fish living along any vulnerable coast.

“Remember, natural coral reefs are only found in tropical waters,” says Triantafyllou, who is director of the MIT Sea Grant. “We cannot have these reefs, for instance, in Massachusetts. But architected reefs don’t depend on temperature, so they can be placed in any water, to protect more coastal areas.”

The new effort is the result of a collaboration between researchers in MIT Sea Grant, who developed the reef structure’s hydrodynamic design, and researchers at the Center for Bits and Atoms (CBA), who worked to make the structure modular and easy to fabricate on location. The team’s architected reef design grew out of two seemingly unrelated problems. CBA researchers were developing ultralight cellular structures for the aerospace industry, while Sea Grant researchers were assessing the performance of blowout preventers in offshore oil structures — cylindrical valves that are used to seal off oil and gas wells and prevent them from leaking.

The team’s tests showed that the structure’s cylindrical arrangement generated a high amount of drag. In other words, the structure appeared to be especially efficient in dissipating high-force flows of oil and gas. They wondered: Could the same arrangement dissipate another type of flow, in ocean waves?

The researchers began to play with the general structure in simulations of water flow, tweaking its dimensions and adding certain elements to see whether and how waves changed as they crashed against each simulated design. This iterative process ultimately landed on an optimized geometry: a vertical cylinder flanked by four long slats, each attached to the cylinder in a way that leaves space for water to flow through the resulting structure. They found this setup essentially breaks up any incoming wave energy, causing parts of the wave-induced flow to spiral to the sides rather than crashing ahead.

“We’re leveraging this turbulence and these powerful jets to ultimately dissipate wave energy,” Ferrandis says.

Standing up to storms

Once the researchers identified an optimal wave-dissipating structure, they fabricated a laboratory-scale version of an architected reef made from a series of the cylindrical structures, which they 3D-printed from plastic. Each test cylinder measured about 1 foot wide and 4 feet tall. They assembled a number of cylinders, each spaced about a foot apart, to form a fence-like structure, which they then lowered into a wave tank at MIT. They then generated waves of various heights and measured them before and after passing through the architected reef.

“We saw the waves reduce substantially, as the reef destroyed their energy,” Triantafyllou says.

The team has also looked into making the structures more porous, and friendly to fish. They found that, rather than making each structure from a solid slab of plastic, they could use a more affordable and sustainable type of cement.

“We’ve worked with biologists to test the cement we intend to use, and it’s benign to fish, and ready to go,” he adds.

They identified an ideal pattern of “voxels,” or microstructures, that cement could be molded into, in order to fabricate the reefs while creating pockets in which fish could live. This voxel geometry resembles individual egg cartons, stacked end to end, and appears to not affect the structure’s overall wave-dissipating power.

“These voxels still maintain a big drag while allowing fish to move inside,” Ferrandis says.

The team is currently fabricating cement voxel structures and assembling them into a lab-scale architected reef, which they will test under various wave conditions. They envision that the voxel design could be modular, and scalable to any desired size, and easy to transport and install in various offshore locations. “Now we’re simulating actual sea patterns, and testing how these models will perform when we eventually have to deploy them,” says Anjali Sinha, a graduate student at MIT who recently joined the group.

Going forward, the team hopes to work with beach towns in Massachusetts to test the structures on a pilot scale.

“These test structures would not be small,” Triantafyllou emphasizes. “They would be about a mile long, and about 5 meters tall, and would cost something like 6 million dollars per mile. So it’s not cheap. But it could prevent billions of dollars in storm damage. And with climate change, protecting the coasts will become a big issue.”

This work was funded, in part, by the U.S. Defense Advanced Research Projects Agency.


