
TOP SCIENCE

A technique for more effective multipurpose robots

Let’s say you want to train a robot so it understands how to use tools and can then quickly learn to make repairs around your house with a hammer, wrench, and screwdriver. To do that, you would need an enormous amount of data demonstrating tool use.

Existing robotic datasets vary widely in modality — some include color images while others are composed of tactile imprints, for instance. Data could also be collected in different domains, like simulation or human demos. And each dataset may capture a unique task and environment.

It is difficult to efficiently incorporate data from so many sources in one machine-learning model, so many methods use just one type of data to train a robot. But robots trained this way, with a relatively small amount of task-specific data, are often unable to perform new tasks in unfamiliar environments.

In an effort to train better multipurpose robots, MIT researchers developed a technique to combine multiple sources of data across domains, modalities, and tasks using a type of generative AI known as diffusion models.

They train a separate diffusion model to learn a strategy, or policy, for completing one task using one specific dataset. Then they combine the policies learned by the diffusion models into a general policy that enables a robot to perform multiple tasks in various settings.

In simulations and real-world experiments, this training approach enabled a robot to perform multiple tool-use tasks and adapt to new tasks it did not see during training. The method, known as Policy Composition (PoCo), led to a 20 percent improvement in task performance when compared to baseline techniques.

“Addressing heterogeneity in robotic datasets is like a chicken-and-egg problem. If we want to use a lot of data to train general robot policies, then we first need deployable robots to get all this data. I think that leveraging all the heterogeneous data available, similar to what researchers have done with ChatGPT, is an important step for the robotics field,” says Lirui Wang, an electrical engineering and computer science (EECS) graduate student and lead author of a paper on PoCo.

Wang’s coauthors include Jialiang Zhao, a mechanical engineering graduate student; Yilun Du, an EECS graduate student; Edward Adelson, the John and Dorothy Wilson Professor of Vision Science in the Department of Brain and Cognitive Sciences and a member of the Computer Science and Artificial Intelligence Laboratory (CSAIL); and senior author Russ Tedrake, the Toyota Professor of EECS, Aeronautics and Astronautics, and Mechanical Engineering, and a member of CSAIL. The research will be presented at the Robotics: Science and Systems Conference.

Combining disparate datasets

A robotic policy is a machine-learning model that takes inputs and uses them to perform an action. One way to think about a policy is as a strategy. In the case of a robotic arm, that strategy might be a trajectory, or a series of poses that move the arm so it picks up a hammer and uses it to pound a nail.
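To make the idea concrete, a policy can be sketched as a function that maps an observation to a short trajectory of arm poses. The shapes, the pose interpolation, and the function name below are hypothetical illustrations, not anything from the paper:

```python
import numpy as np

# Illustrative sketch only: a "policy" as a callable mapping an observation
# to a trajectory (a sequence of arm poses). The dimensions and hard-coded
# behavior are hypothetical, not from the researchers' system.

def hammer_policy(observation):
    """Given an observation vector, return a trajectory of 8 poses,
    each a 7-dimensional joint configuration."""
    obs = np.asarray(observation, dtype=float)
    # Dummy strategy: linearly interpolate from the current pose (first 7
    # observation entries) toward a fixed "strike the nail" pose.
    start = obs[:7]
    goal = np.full(7, 0.5)
    steps = np.linspace(0.0, 1.0, 8)[:, None]     # (8, 1) blend factors
    return (1.0 - steps) * start + steps * goal   # (8, 7) trajectory

traj = hammer_policy(np.zeros(10))
print(traj.shape)  # (8, 7)
```

A learned policy replaces the hand-coded interpolation with a trained model, but the interface — observation in, trajectory out — is the same.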

Datasets used to learn robotic policies are typically small and focused on one particular task and environment, like packing items into boxes in a warehouse.

“Every single robotic warehouse is generating terabytes of data, but it only belongs to that specific robot installation working on those packages. It is not ideal if you want to use all of these data to train a general machine,” Wang says.

The MIT researchers developed a technique that can take a series of smaller datasets, like those gathered from many robotic warehouses, learn separate policies from each one, and combine the policies in a way that enables a robot to generalize to many tasks.

They represent each policy using a type of generative AI model known as a diffusion model. Diffusion models, often used for image generation, learn to create new data samples that resemble samples in a training dataset by iteratively refining their output.

But rather than teaching a diffusion model to generate images, the researchers teach it to generate a trajectory for a robot. They do this by adding noise to the trajectories in a training dataset. The diffusion model gradually removes the noise and refines its output into a trajectory.

This technique, known as Diffusion Policy, was previously introduced by researchers at MIT, Columbia University, and the Toyota Research Institute. PoCo builds on that Diffusion Policy work.
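The noising-and-denoising loop described above can be sketched in a few lines. This is a minimal illustration of the general diffusion mechanism applied to a trajectory array; the noise schedule, step counts, and dimensions are assumptions, not the Diffusion Policy authors' implementation:

```python
import numpy as np

# Minimal sketch of the diffusion idea on robot trajectories: corrupt a
# clean trajectory with Gaussian noise, then remove it. All constants and
# shapes here are illustrative assumptions.

rng = np.random.default_rng(0)

T_STEPS = 50                              # number of diffusion steps
betas = np.linspace(1e-4, 0.02, T_STEPS)  # noise schedule
alphas = 1.0 - betas
alpha_bars = np.cumprod(alphas)           # cumulative signal retention

def add_noise(traj, t, noise):
    """Forward process: noise a clean trajectory at diffusion step t."""
    a_bar = alpha_bars[t]
    return np.sqrt(a_bar) * traj + np.sqrt(1.0 - a_bar) * noise

def denoise(noisy, t, predicted_noise):
    """Recover an estimate of the clean trajectory from predicted noise."""
    a_bar = alpha_bars[t]
    return (noisy - np.sqrt(1.0 - a_bar) * predicted_noise) / np.sqrt(a_bar)

# A "trajectory": 16 timesteps of a 7-joint arm's positions.
clean = rng.normal(size=(16, 7))
noise = rng.normal(size=clean.shape)

noisy = add_noise(clean, t=40, noise=noise)
# If the model predicted the noise perfectly, denoising recovers the trajectory.
recovered = denoise(noisy, t=40, predicted_noise=noise)
print(np.allclose(recovered, clean))  # True
```

In practice a neural network is trained to predict the added noise from the noisy trajectory, and sampling runs the denoising step many times from pure noise down to a usable trajectory.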

The team trains each diffusion model with a different type of dataset, such as one with human video demonstrations and another gleaned from teleoperation of a robotic arm.

Then the researchers perform a weighted combination of the individual policies learned by all the diffusion models, iteratively refining the output so the combined policy satisfies the objectives of each individual policy.
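A weighted combination of per-dataset policies might look like the following sketch, where each "policy" contributes a noise prediction at a refinement step and the predictions are blended. The function name, weights, and shapes are hypothetical, not PoCo's exact procedure:

```python
import numpy as np

rng = np.random.default_rng(1)

def compose(predictions, weights):
    """Weighted combination of per-policy noise predictions."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()  # normalize so the weights sum to 1
    # Contract the weight vector against the stacked predictions.
    return np.tensordot(w, np.stack(predictions), axes=1)

# Two per-dataset stand-ins for trained diffusion policies, each predicting
# the noise to remove from the current trajectory estimate (16 steps, 7 joints).
pred_sim  = rng.normal(size=(16, 7))   # policy trained on simulation data
pred_real = rng.normal(size=(16, 7))   # policy trained on real-robot data

combined = compose([pred_sim, pred_real], weights=[0.3, 0.7])
print(combined.shape)  # (16, 7)
```

Running this blend inside every denoising iteration is what lets the combined policy steer its output toward trajectories that all the individual policies consider plausible.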

Greater than the sum of its parts

“One of the benefits of this approach is that we can combine policies to get the best of both worlds. For instance, a policy trained on real-world data might be able to achieve more dexterity, while a policy trained on simulation might be able to achieve more generalization,” Wang says.

Because the policies are trained separately, one could mix and match diffusion policies to achieve better results for a certain task. A user could also add data in a new modality or domain by training an additional Diffusion Policy with that dataset, rather than starting the entire process from scratch.

The researchers tested PoCo in simulation and on real robotic arms that performed a variety of tool-use tasks, such as using a hammer to pound a nail and flipping an object with a spatula. PoCo led to a 20 percent improvement in task performance compared to baseline methods.

“The striking thing was that when we finished tuning and visualized it, we could clearly see that the composed trajectory looked much better than either one of them individually,” Wang says.

In the future, the researchers want to apply this technique to long-horizon tasks where a robot would pick up one tool, use it, then switch to another tool. They also want to incorporate larger robotics datasets to improve performance.

“We will need all three kinds of data to succeed for robotics: internet data, simulation data, and real robot data. How to combine them effectively will be the million-dollar question. PoCo is a solid step on the right track,” says Jim Fan, senior research scientist at NVIDIA and leader of the AI Agents Initiative, who was not involved with this work.

This research is funded, in part, by Amazon, the Singapore Defense Science and Technology Agency, the U.S. National Science Foundation, and the Toyota Research Institute.





New drug shows promise in clearing HIV from brain



An experimental drug originally developed to treat cancer may help clear HIV from infected cells in the brain, according to a new Tulane University study.

For the first time, researchers at Tulane National Primate Research Center found that a cancer drug significantly reduced levels of SIV, the nonhuman primate equivalent of HIV, in the brain by targeting and depleting certain immune cells that harbor the virus.

The discovery, published in the journal Brain, marks a significant step toward eliminating HIV from hard-to-reach reservoirs where the virus evades otherwise effective treatment.

“This research is an important step in tackling brain-related issues caused by HIV, which still affect people even when they are on effective HIV medication,” said lead study author Woong-Ki Kim, PhD, associate director for research at Tulane National Primate Research Center. “By specifically targeting the infected cells in the brain, we may be able to clear the virus from these hidden areas, which has been a major challenge in HIV treatment.”

Antiretroviral therapy (ART) is an essential component of successful HIV treatment, maintaining the virus at undetectable levels in the blood and transforming HIV from a terminal illness into a manageable condition. However, ART does not completely eradicate HIV, necessitating lifelong treatment. The virus persists in “viral reservoirs” in the brain, liver, and lymph nodes, where it remains out of reach of ART.

The brain has been a particularly challenging area for treatment due to the blood-brain barrier — a protective membrane that shields it from harmful substances but also blocks treatments, allowing the virus to persist. In addition, cells in the brain known as macrophages are extremely long-lived, making them difficult to eradicate once they become infected.

Infection of macrophages is thought to contribute to neurocognitive dysfunction, experienced by nearly half of those living with HIV. Eradicating the virus from the brain is critical for comprehensive HIV treatment and could significantly improve the quality of life for those with HIV-related neurocognitive problems.

Researchers focused on macrophages, a type of white blood cell that harbors HIV in the brain. By using a small molecule inhibitor to block a receptor that increases in HIV-infected macrophages, the team successfully reduced the viral load in the brain. This approach essentially cleared the virus from brain tissue, providing a potential new treatment avenue for HIV.

The small molecule inhibitor used, BLZ945, has previously been studied for therapeutic use in amyotrophic lateral sclerosis (ALS) and brain cancer, but never before in the context of clearing HIV from the brain.

The study, which took place at the Tulane National Primate Research Center, used three groups to model human HIV infection and treatment: an untreated control group, and two groups treated with either a low or high dose of the small molecule inhibitor for 30 days. The high-dose treatment led to a notable reduction in cells expressing HIV receptor sites, as well as a 95-99% decrease in viral DNA loads in the brain.

Importantly, the treatment did not significantly affect microglia, the brain’s resident immune cells, which are essential for maintaining a healthy neuroimmune environment. Nor did it show signs of liver toxicity at the doses tested.

The next step for the research team is to test this therapy in conjunction with ART to assess its efficacy in a combined treatment approach. This could pave the way for more comprehensive strategies to eradicate HIV from the body entirely.

This research was funded by the National Institutes of Health, including grants from the National Institute of Mental Health and the National Institute of Neurological Disorders and Stroke, and was supported with resources from the Tulane National Primate Research Center base grant of the National Institutes of Health, P51 OD011104.





Chemical analyses find hidden elements from renaissance astronomer Tycho Brahe’s alchemy laboratory



In the Middle Ages, alchemists were notoriously secretive and didn’t share their knowledge with others. The Danish astronomer Tycho Brahe was no exception. Consequently, we don’t know precisely what he did in the alchemical laboratory located beneath his combined residence and observatory, Uraniborg, on the now Swedish island of Ven.

Only a few of his alchemical recipes have survived, and today, there are very few remnants of his laboratory. Uraniborg was demolished after his death in 1601, and the building materials were scattered for reuse.

However, during an excavation in 1988-1990, some pottery and glass shards were found in Uraniborg’s old garden. These shards were believed to originate from the basement’s alchemical laboratory. Five of these shards — four glass and one ceramic — have now undergone chemical analyses to determine which elements the original glass and ceramic containers came into contact with.

The chemical analyses were conducted by Kaare Lund Rasmussen, professor emeritus and expert in archaeometry at the Department of Physics, Chemistry, and Pharmacy, University of Southern Denmark. Poul Grinder-Hansen, senior researcher and museum curator at the National Museum of Denmark, placed the analyses in their historical context.

Enriched levels of trace elements were found on four of the five shards, while one glass shard showed no specific enrichment. The study has been published in the journal Heritage Science.

“Most intriguing are the elements found in higher concentrations than expected — indicating enrichment and providing insight into the substances used in Tycho Brahe’s alchemical laboratory,” said Kaare Lund Rasmussen.

The enriched elements are nickel, copper, zinc, tin, antimony, tungsten, gold, mercury, and lead, and they have been found on either the inside or outside of the shards.

Most of them are not surprising for an alchemist’s laboratory. Gold and mercury were — at least among the upper echelons of society — commonly known and used against a wide range of diseases.

“But tungsten is very mysterious. Tungsten had not even been described at that time, so what should we infer from its presence on a shard from Tycho Brahe’s alchemy workshop?” said Kaare Lund Rasmussen.

Tungsten was first described and produced in pure form more than 180 years later by the Swedish chemist Carl Wilhelm Scheele. Tungsten occurs naturally in certain minerals, and perhaps the element found its way to Tycho Brahe’s laboratory through one of these minerals. In the laboratory, the mineral might have undergone some processing that separated the tungsten, without Tycho Brahe ever realizing it.

However, there is also another possibility that Professor Kaare Lund Rasmussen emphasizes has no evidence whatsoever — but which could be plausible.

Already in the first half of the 1500s, the German mineralogist Georgius Agricola described something strange in tin ore from Saxony, which caused problems when he tried to smelt tin. Agricola called this strange substance in the tin ore “Wolfram” (German for Wolf’s froth, later renamed to tungsten in English).

“Maybe Tycho Brahe had heard about this and thus knew of tungsten’s existence. But this is not something we know or can say based on the analyses I have done. It is merely a possible theoretical explanation for why we find tungsten in the samples,” said Kaare Lund Rasmussen.

Tycho Brahe belonged to the branch of alchemists who, inspired by the German physician Paracelsus, tried to develop medicine for various diseases of the time: plague, syphilis, leprosy, fever, stomach aches, etc. But he distanced himself from the branch that tried to create gold from less valuable minerals and metals.

In line with the other medical alchemists of the time, he kept his recipes close to his chest and shared them only with a few selected individuals, such as his patron, Emperor Rudolph II, who allegedly received Tycho Brahe’s prescriptions for plague medicine.

We know that Tycho Brahe’s plague medicine was complicated to produce. It contained theriac, which was one of the standard remedies for almost everything at the time and could have up to 60 ingredients, including snake flesh and opium. It also contained copper or iron vitriol (sulphates), various oils, and herbs.

After various filtrations and distillations, the first of Brahe’s three recipes against plague was obtained. This could be made even more potent by adding tinctures of, for example, coral, sapphires, hyacinths, or potable gold.

“It may seem strange that Tycho Brahe was involved in both astronomy and alchemy, but when one understands his worldview, it makes sense. He believed that there were obvious connections between the heavenly bodies, earthly substances, and the body’s organs. Thus, the Sun, gold, and the heart were connected, and the same applied to the Moon, silver, and the brain; Jupiter, tin, and the liver; Venus, copper, and the kidneys; Saturn, lead, and the spleen; Mars, iron, and the gallbladder; and Mercury, mercury, and the lungs. Minerals and gemstones could also be linked to this system, so emeralds, for example, belonged to Mercury,” explained Poul Grinder-Hansen.

Kaare Lund Rasmussen has previously analyzed hair and bones from Tycho Brahe and found, among other elements, gold. This could indicate that Tycho Brahe himself had taken medicine that contained potable gold.





Nitrogen emissions have a net cooling effect: But researchers warn against a climate solution



An international team of researchers has found that nitrogen emissions from fertilisers and fossil fuels have a net cooling effect on the climate. But they warn increasing atmospheric nitrogen has further damaging effects on the environment, calling for an urgent reduction in greenhouse gas emissions to halt global warming.

Published today in Nature, the paper found that reactive nitrogen released into the environment through human activities has a net cooling effect on the climate, amounting to a radiative forcing of minus 0.34 watts per square metre. While global warming would have advanced further without the input of human-generated nitrogen, this cooling does not offset the warming caused by the greenhouse gases heating the atmosphere.

The paper was led by the Max Planck Institute in Germany and included authors from the University of Sydney. It comes one day after new data from the European Union’s Copernicus Climate Change Service indicated that Sunday, 21 July was the hottest day recorded in recent history.

The net cooling effect occurs in four ways:

  • Short-lived nitrogen oxides produced by the combustion of fossil fuels pollute the atmosphere by forming fine suspended particles that reflect sunlight, in turn cooling the climate;

  • ammonia (a nitrogen and hydrogen-based compound) released into the atmosphere from the application of manure and artificial fertilisers has a similar effect;

  • nitrogen applied to crops allows plants to grow more abundantly, absorbing more CO2 from the atmosphere, enabling a cooling effect;

  • nitrogen oxides also play a role in the breakdown of atmospheric methane, a potent greenhouse gas.

The researchers warned that increasing atmospheric nitrogen was not a solution for combating climate change.

“Nitrogen fertilisers pollute water and nitrogen oxides from fossil fuels pollute the air. Therefore, increasing rates of nitrogen in the atmosphere to combat climate change is not an acceptable compromise, nor is it a solution,” said Professor Federico Maggi from the University of Sydney’s School of Civil Engineering.

Sönke Zaehle from the Max Planck Institute said: “This may sound like good news, but you have to bear in mind that nitrogen emissions have many harmful effects, for example on health, biodiversity and the ozone layer. The current findings, therefore, are no reason to gloss over the harmful effects, let alone see additional nitrogen input as a means of combatting global warming.”

Elemental nitrogen, which makes up around 78 percent of the air, is climate-neutral, but other reactive nitrogen compounds can have direct or indirect effects on the global climate — sometimes warming and at other times cooling. Nitrous oxide (N2O) is an almost 300 times more potent greenhouse gas than CO2. Other forms of nitrogen stimulate the formation of ozone in the troposphere, which is a potent greenhouse gas and enhances global warming.

Professor Maggi said the research was important as it helped the team gain an understanding of the net-effect of the distribution of nitrogen emissions from agriculture.

“This work is an extraordinary example of how complex interactions at planetary scales cannot be captured with simplistic assessment tools. It shows the importance of developing mathematical models that can show the emergence of nonlinear — or unproportional — effects across soil, land, and atmosphere,” he said.

“Even if it appears counter-intuitive, reactive nitrogen introduced in the environment, mostly as agricultural fertilisers, can reduce total warming. However, this is minor compared with the reduction in greenhouse gas emissions required to keep the planet within safe and just operational boundaries.

“New generation computational tools are helping drive new learnings in climate change science, but understanding is not enough — we must act with great urgency to reduce greenhouse gas emissions.”

Gaining a holistic understanding of the impacts of nitrogen

The scientists determined the overall impact of nitrogen from human sources by first analysing the quantities of the various nitrogen compounds that end up in soil, water and air.

They then fed this data into models that depict the global nitrogen cycle and the effects on the carbon cycle, for example the stimulation of plant growth and ultimately the CO2 and methane content of the atmosphere. From the results of these simulations, they used another atmospheric chemistry model to calculate the effect of man-made nitrogen emissions on radiative forcing, that is the radiant energy that hits one square metre of the Earth’s surface per unit of time.


