Neanderthals May Have Been Early Risers, New Study Finds

When the ancestors of modern Eurasians migrated out of Africa and interbred with the archaic humans of Eurasia, namely Neanderthals and Denisovans, DNA from those archaic ancestors was integrated into the genomes of modern Homo sapiens. This process could accelerate adaptation to Eurasian environmental factors, such as reduced UV radiation and greater seasonal variation. In a new study, scientists from Vanderbilt University, the University of Pennsylvania, and the University of California, San Francisco identified lineage-specific genetic differences in circadian genes and their regulatory elements between modern humans and Neanderthals. They found that the introgressed genetic variants were enriched for effects on circadian regulation and consistently increased morningness tendencies in Europeans. The results expand our understanding of how the genomes of humans and our closest relatives responded to environments with different light-dark cycles.

Velazquez-Arcelay et al. found that genetic material inherited from Neanderthal ancestors may contribute to the tendency of some people today to be early risers: the type of person who wakes up early and falls asleep easily. Image credit: Holger Neumann / Neanderthal Museum.

All anatomically modern humans trace their origins to the African continent about 300,000 years ago, where environmental factors shaped many of their biological characteristics.

They arrived in Eurasia around 70,000 years ago, but other humans, the Neanderthals and Denisovans, had lived there for more than 400,000 years.

These archaic humans diverged from anatomically modern humans about 700,000 years ago, and as a result, modern humans and archaic hominins evolved under different environmental conditions.

“Although there was considerable variation in the latitudinal range of each group, Eurasian hominins primarily lived at consistently higher latitudes and were therefore exposed to larger-amplitude seasonal fluctuations in photoperiod,” said Dr. John Capra of the University of California, San Francisco, and his colleagues.

“Given the influence of environmental cues on circadian biology, we hypothesized that these separate evolutionary histories produced differences in circadian traits adapted to different environments.”

Although previous studies have shown that much of the archaic ancestry in modern human genomes was not beneficial and was removed by natural selection, some archaic hominin variants that remain in human populations show evidence of adaptation.

For example, archaic genetic variants are thought to contribute to differences in hemoglobin levels at high altitude in Tibetans, immune resistance to novel pathogens, levels of skin pigmentation, and fat composition.

Changes in patterns and levels of light exposure have biological and behavioral effects that lead to evolutionary adaptations.

Scientists have extensively studied the evolution of circadian adaptations in insects, plants, and fish, but circadian adaptation in humans has been less well studied.

The Eurasian environment where Neanderthals and Denisovans lived for hundreds of thousands of years is located at higher latitudes and has more variable daylight hours than where modern humans evolved before leaving Africa.

Dr. Capra and his co-authors therefore investigated whether there was genetic evidence for differences in circadian clocks between Neanderthals and modern humans.

Using a combination of literature searches and expert knowledge, they defined a set of 246 circadian genes.

They found hundreds of genetic variants specific to each lineage with the potential to affect genes involved in the circadian clock.

Using artificial intelligence techniques, they identified 28 circadian genes containing variants with the potential to alter splicing in archaic hominins, and 16 circadian genes likely to be differentially regulated between modern and archaic humans.

This indicates that there may be functional differences between the circadian clocks of ancient and modern humans.

Because Eurasian modern humans interbred with Neanderthals, some modern humans may have acquired circadian variants from their Neanderthal ancestors.

To test this, the researchers investigated whether introgressed genetic variants were associated with preferences for sleep and wake timing in a cohort of hundreds of thousands of people from UK Biobank.

They found many introgressed variants that affect sleep preference, and, most strikingly, these variants consistently increased morningness, the tendency to rise early.
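The "consistent direction" pattern is essentially a sign test: if introgressed variants had no directional influence, roughly half would increase morningness and half would decrease it. The toy Python sketch below illustrates that logic with invented effect sizes; these are placeholders, not values from the study or from UK Biobank.

```python
import math

# Hypothetical signed effect sizes of introgressed variants on morningness
# (positive = increases the tendency to rise early). Invented for illustration.
effects = [0.021, 0.015, -0.004, 0.032, 0.011, 0.007, 0.018, 0.009, 0.026, 0.013]

n = len(effects)
k = sum(e > 0 for e in effects)  # variants pushing toward morningness

# Under no directional bias, each variant is equally likely to push the trait
# either way, so the one-sided binomial tail P(X >= k | p = 0.5) measures how
# surprising the observed excess of morningness-increasing variants is.
p_value = sum(math.comb(n, i) for i in range(k, n + 1)) / 2 ** n

print(f"{k}/{n} variants increase morningness, one-sided p = {p_value:.4f}")
```

A small p-value in this kind of test would indicate a consistent directional effect, the pattern the researchers interpreted as evidence of adaptation.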

This suggests a directional influence on this trait and is consistent with adaptations to high latitudes observed in other animals.

Increased morningness in humans is associated with a shortened circadian period. This may be beneficial at high latitudes, because a shorter period has been shown to allow faster alignment of sleep and wake timing with external light cues.

In Drosophila, a shortened circadian period is required to synchronize with the long summer photoperiods at high latitudes, and natural Drosophila populations show a latitudinal cline in which the circadian period decreases as latitude increases, consistent with selection for shorter periods.

Therefore, the bias toward morningness among introgressed variants may reflect selection for shortened circadian periods in populations living at high latitudes.

The tendency to be a morning person may have been evolutionarily beneficial to our ancestors who lived in the high latitudes of Europe, and would have been a Neanderthal genetic trait worth preserving.

“By combining ancient DNA, large-scale genetic studies in modern humans, and artificial intelligence, we discovered substantial genetic differences in the circadian systems of Neanderthals and modern humans,” Dr. Capra said.

“And by analyzing the fragments of Neanderthal DNA that remain in modern human genomes, we discovered a striking trend: many of them influence the regulation of circadian genes in modern humans, and these effects are predominantly in a consistent direction of increasing the propensity to be a morning person.”

“This change is consistent with the effects of living at high latitudes observed in the circadian clocks of other animals, and it may enable more rapid alignment of the circadian clock with changing seasonal light patterns.”

“Our next steps include applying these analyses to more diverse modern human populations, investigating the effects of the Neanderthal variants we identified on the circadian clock in model systems, and applying similar analyses to other potentially adaptive traits.”

The team’s paper was published in the journal Genome Biology and Evolution.

_____

Keila Velazquez-Arcelay et al. 2023. Archaic Introgression Shaped Human Circadian Traits. Genome Biology and Evolution 15 (12): evad203; doi: 10.1093/gbe/evad203

Source: www.sci.news

Webb’s fresh perspective on supernovae, laser connections between space stations, and the Europa Clipper mission

New high-definition images from NIRCam (Near-Infrared Camera) on NASA’s James Webb Space Telescope reveal intricate details of the supernova remnant Cassiopeia A (Cas A), showing the expanding shell of material slamming into the gas shed by the star before it exploded. Credits: NASA, ESA, CSA, STScI, Danny Milisavljevic (Purdue University), Ilse De Looze (UGent), Tea Temim (Princeton University)

NASA’s Webb Space Telescope observes newly exploded star…

The team prepares to install the moon rocket hardware…

And we completed NASA’s first bidirectional end-to-end laser relay system…

Some of the stories we want to share with you – this week at NASA!

Watch Webb’s new high-definition view of an exploded star

NASA’s James Webb Space Telescope recently captured this new image of the supernova remnant Cassiopeia A. The image, taken with Webb’s near-infrared camera, shows the stellar explosion at a resolution previously unattainable at these wavelengths, giving astronomers hints about the dynamic processes occurring inside the supernova remnant.

NASA’s Artemis II mission is making final preparations for its SLS rocket at Kennedy Space Center. The Orion stage adapter, a critical component that connects Orion to the SLS, recently underwent diaphragm installation work at Marshall Space Flight Center. The adapter plays an important role in preventing hydrogen gas buildup and ensuring safety during launch. Credit: NASA/Sam Lott

Team prepares to assemble moon rocket and spacecraft connectors

A team at NASA’s Marshall Space Flight Center recently flipped the Orion stage adapter over to prepare it for diaphragm installation.

The stage adapter connects the Orion spacecraft to the Space Launch System rocket’s interim cryogenic propulsion stage (ICPS). The diaphragm helps prevent highly flammable hydrogen gas, which could leak from the rocket’s propellant tanks, from accumulating beneath Orion and its crew before and during launch.

NASA’s ILLUMA-T payload communicates with the LCRD via laser signals. Credit: NASA/Dave Ryan

Space station laser communication terminal achieves first link

NASA’s LCRD and the new space station technology experiment ILLUMA-T successfully exchanged data for the first time, establishing the first laser link between ILLUMA-T and an on-orbit laser relay system. Together, LCRD and ILLUMA-T complete NASA’s first two-way, end-to-end laser relay system.

Laser communications uses infrared light rather than traditional radio waves to send and receive signals, allowing spacecraft to pack more data into each transmission.

The “Message in a Bottle” campaign offers anyone the opportunity to have their name stenciled onto a microchip inscribed with U.S. Poet Laureate Ada Limón’s “In Praise of Mystery: A Poem for Europa.” The chip will be mounted on NASA’s Europa Clipper spacecraft, bound for Jupiter and its moon Europa. Credit: NASA

Add your name to join the Europa Clipper mission

The deadline to participate in the Message in a Bottle campaign for NASA’s Europa Clipper mission is 11:59 p.m. EST on December 31, 2023. You can join the mission and have your name carried on the Clipper spacecraft as it travels 1.8 billion miles to explore Jupiter’s icy moon, Europa.

For more information, visit go.nasa.gov/MessageInABottle.

What’s happening this week at @NASA!

Source: scitechdaily.com

Nighttime exposure to high levels of light linked with higher risk of anxiety and depression

A large-scale study involving 87,000 participants found that excessive light exposure at night increases the risk of mental illness, while greater daytime light exposure can reduce those risks. The study highlights the importance of balancing light exposure for mental health and suggests simple lifestyle adjustments for better wellbeing.

Exposure to artificial light at night increases the risk of developing mental illnesses such as anxiety, bipolar disorder, and post-traumatic stress disorder (PTSD), as well as the tendency toward self-harm.

The world’s largest study of the effects of light exposure on mental health, involving nearly 87,000 people, found that increased exposure to light at night not only raises the risk of mental health conditions such as anxiety, bipolar disorder, and PTSD, but also increases the likelihood of self-harm. Importantly, the study also found that increasing exposure to natural light during the day may serve as a non-pharmacological approach to reducing the risk of psychosis.

Day and night light exposure: a balancing act

People exposed to high amounts of light at night had a 30 percent increased risk of depression, while those exposed to high amounts of light during the day had a 20 percent decreased risk of depression. A similar pattern was found for self-harm, psychosis, bipolar disorder, generalized anxiety disorder, and PTSD. These findings show that the simple practice of avoiding light at night and seeking brighter light during the day may be an effective non-pharmacological means of alleviating serious mental health problems.

The study, led by Associate Professor Sean Cain of the Monash School of Psychological Sciences and the Turner Institute for Brain and Mental Health in Melbourne, Australia, was published today in the journal Nature Mental Health.

“Our findings have potentially significant societal impact,” said Associate Professor Cain.

“Once people understand that their light exposure patterns have a powerful influence on their mental health, they can take some simple steps to optimize their wellbeing: get bright light during the day, and darkness at night.”

The study’s 86,772 participants, all from UK Biobank, were assessed for light exposure, sleep, physical activity, and mental health. Associate Professor Cain said the effects of nighttime light exposure were independent of demographics, physical activity, season, and employment.

“And our findings were consistent when considering shift work, sleep, urban versus rural living, and cardiometabolic health,” he said.

Challenging human biology with modern lighting

Modern, industrialized society has turned our biology upside down. According to Associate Professor Cain, our brains evolved to function best with bright light during the day and little to no light at night.

“Humans today challenge this biology, spending about 90% of the day indoors under electric lighting that is too dim during the day and too bright at night compared to natural light-dark cycles. It confuses our bodies and makes us unwell,” he said.

Reference: “Day and night light exposure are associated with psychiatric disorders: an objective light study in over 85,000 people” by Angus C. Burns, Daniel P. Windred, Martin K. Rutter, Patrick Olivier, Celine Vetter, Richa Saxena, Jacqueline M. Lane, Andrew J. K. Phillips and Sean W. Cain, October 9, 2023, Nature Mental Health.
DOI: 10.1038/s44220-023-00135-8

Source: scitechdaily.com

Recent study challenges previous beliefs about forest resilience

Recent research shows that trees in humid regions are more vulnerable to drought, challenging previous beliefs about tree resilience. The study, which analyzed more than 6.6 million tree rings, revealed that trees in arid regions are surprisingly drought tolerant. The findings highlight the widespread effects of climate change on forests and suggest that the genetic diversity found in drier regions may be important for adapting to changing conditions. Credit: SciTechDaily.com

Scientists have flipped the script and revealed that trees in humid regions are more sensitive to drought.

This holiday season brings some surprising news about Christmas trees. Scientists have found that, globally, trees growing in wetter regions are more sensitive to drought. This means that if your tree was grown in a humid climate, it likely comes from generations of trees unaccustomed to drought.

Debate over drought tolerance of trees

Scientists have long debated whether arid environments make trees more or less tolerant of drought. It seems intuitive that trees living at the margins of their biological limits would be the most vulnerable to climate change, because even a little extra stress could push them over the edge. On the other hand, these populations may be better able to withstand drought because they are adapted to harsher environments.

The trees of this lush temperate forest in the Cascade Mountains of Washington state may be less drought tolerant than trees in drier regions to the south. Credit: Joan Dudney

Insights from new research

According to a new study in the journal Science by researchers from the University of California, Santa Barbara and the University of California, Davis, greater water availability can effectively “spoil” trees, reducing their ability to cope with drought. “And that’s really important to understand when we think about the global vulnerability of forest carbon stocks and forest health,” said Joan Dudney, an assistant professor and ecologist. “You don’t want to be a ‘spoiled’ tree when faced with a major drought.”

Dudney and her co-authors had predicted that trees growing in the driest regions would be more sensitive to drought because they were already living at the edge of their limits. Furthermore, climate models predict that these regions will dry out more rapidly than wetter regions, a change that could expose trees to conditions beyond their capacity to adapt.

Methodology: Tree ring analysis

To measure drought sensitivity, the authors analyzed 6.6 million tree ring samples from 122 species worldwide. Based on the width of its growth rings, they measured whether a tree was growing faster or slower than average, and they correlated these trends with historical climate data such as precipitation and temperature.
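The ring-width workflow described above can be sketched in a few lines of Python. This is a simplified illustration with invented numbers, not the study’s actual pipeline (which handles detrending, species effects, and millions of samples):

```python
# Invented ring widths (mm) for one tree and matching annual precipitation (mm).
ring_widths_mm = [1.8, 2.1, 1.2, 0.9, 2.0, 1.6, 0.8, 1.9, 2.2, 1.1]
precip_mm = [620, 700, 410, 350, 680, 560, 300, 640, 720, 390]

# Growth index: >1 means above-average growth that year, <1 below average.
mean_width = sum(ring_widths_mm) / len(ring_widths_mm)
growth_index = [w / mean_width for w in ring_widths_mm]

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# A strongly positive correlation means growth tracks rainfall closely,
# i.e. the tree is sensitive to dry years.
r = pearson(growth_index, precip_mm)
print(f"growth-precipitation correlation r = {r:.2f}")
```

In this framing, a drought-sensitive population shows a high correlation between growth and rainfall, while a drought-tolerant one keeps growing through dry years, giving a correlation near zero.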

The team then compared how trees in different regions responded to drought. “As you move toward the drier edge of a species’ range, trees become less and less sensitive to drought,” said lead author Robert Heilmayer, an environmental economist in UCSB’s Environmental Studies Program and Bren School. “Those trees are actually very resilient.”

Dudney, Heilmayer, and their co-author Frances Moore were partly inspired by UCSB professor Tamma Carleton’s research on the effects of climate change on humanity. “This paper highlights the value of interdisciplinary science,” added Moore, an associate professor at the University of California, Davis. “We took methods originally developed in economics to study how people and businesses adapt to a changing climate and applied them in an ecological context to study the sensitivity of forests to drought.”

“A heat wave is likely to kill more people in a cool place like Seattle than in a hot city like Phoenix,” Heilmayer said. The Southwest is already quite hot and scorching heat waves do occur there, but cities in the region are adapted to climate extremes, he pointed out. We now know that forests show a similar pattern.

Impact on wetter regions

Unfortunately, wetter temperate regions are expected to become disproportionately drier in the coming decades. “Significant parts of species’ ranges will face entirely novel climates, conditions these species do not experience anywhere in their ranges today,” Heilmayer explained. The authors found that by 2100, on average 11% of a species’ range will be drier than the driest part of its historical range; for some species, this rises to 50% or more.

“Broadly speaking, our study highlights that very few forests will be immune to the effects of climate change,” Dudney said. “Even wet forests are under more threat than we realize.”

But there is another side to the coin: the drought-tolerant genetic resources stored in the drier parts of a species’ range have the potential to strengthen forests in wetter regions. Previous research by UCSB scientists has revealed that many species harbor the capacity to adapt to environmental change. But those researchers also discovered that trees migrate slowly, generation by generation, which means human interventions, such as assisted migration, may be needed to take advantage of this genetic diversity.

Christmas tree and the fate of the forest

Whether your Christmas tree comes from a dry or a humid region, its growth may decline in the future. But understanding how trees respond to climate change can help secure the future of the Tannenbaum and its wild relatives.

Reference: “Drought sensitivity of mesic forests increases vulnerability to climate change” by Robert Heilmayer, Joan Dudney and Frances C. Moore, December 7, 2023, Science.
DOI: 10.1126/science.adi1071

Source: scitechdaily.com

Astrophysical mysteries unraveled by new dark matter theory

Researchers have advanced our understanding of dark matter through simulations that support the self-interacting dark matter (SIDM) theory. This theory has the potential to resolve the discrepancy in dark matter density observed in different galaxies, poses a challenge to traditional cold dark matter (CDM) models, and highlights the dynamic nature of dark matter. Credit: SciTechDaily.com

Dark matter may be more active than previously thought, reports a study from the University of California, Riverside.

Dark matter, which is thought to make up 85% of the matter in the universe, does not emit light and its properties are still poorly understood. Normal matter absorbs, reflects, and emits light, but dark matter cannot be seen directly, making it difficult to detect. A theory called “self-interacting dark matter” (SIDM) claims that dark matter particles self-interact with each other due to dark forces, causing them to collide strongly with each other near the centers of galaxies.

In a paper published in The Astrophysical Journal Letters, a research team led by Hai-Bo Yu, a professor of physics and astronomy at the University of California, Riverside, reports that SIDM can simultaneously explain two extreme astrophysical puzzles.

Understanding dark matter halos and gravitational lenses

“The first is a high-density dark matter halo in a massive elliptical galaxy,” Yu said. “The halo was detected through observations of strong gravitational lensing, and its density is so high that it is extremely unlikely under the prevailing cold dark matter theory. The second is that the dark matter halos of ultra-diffuse galaxies have extremely low densities, which are difficult to explain with cold dark matter theory.”

A dark matter halo is an invisible halo of matter that permeates and surrounds a galaxy or galaxy cluster. Gravitational lensing occurs when light traveling across space from a distant galaxy is bent around a massive object. The cold dark matter (CDM) paradigm assumes that dark matter particles are collisionless. As the name suggests, ultra-diffuse galaxies have extremely low luminosity and a dispersed distribution of stars and gas.

Hai-Bo Yu is a theoretical physicist at the University of California, Riverside, with expertise in the particle properties of dark matter. Credit: Samantha Tiu

Yu was joined in the study by Ethan Nadler, a postdoctoral fellow at the Carnegie Observatories and the University of Southern California, and Daneng Yang, a postdoctoral fellow at UCR.

To show that SIDM can explain the two puzzles, the research team carried out the first high-resolution cosmological structure-formation simulations with strong dark matter self-interactions at the mass scales relevant to strong-lensing halos and ultra-diffuse galaxies.

“These self-interactions cause heat transfer within the halo and diversify the halo density in the central region of the galaxy,” Nadler said. “In other words, some halos have higher center densities and others have lower center densities compared to their CDM counterparts, the details of which depend on the evolutionary history of the Universe and the environment of the individual halo.”

Challenges to the CDM paradigm and future research

According to the research team, these two puzzles pose a formidable challenge to the standard CDM paradigm.

“It would be challenging for CDM to explain these puzzles,” Yang said. “SIDM is probably a good candidate for reconciling the two opposite extremes; there is no other explanation in the literature. This raises the intriguing possibility that dark matter may be more complex and active than we expected.”

The study also demonstrates the ability to investigate dark matter through astrophysical observations using computer simulation tools of cosmic structure formation.

“We hope that our study will encourage further research in this promising research area,” Yu said. “This is a particularly timely development given the expected influx of data in the near future from observatories such as the James Webb Space Telescope and the upcoming Rubin Observatory.”

Since around 2009, the work of Yu and his collaborators has popularized SIDM in the particle physics and astrophysics communities.

Reference: “A Self-interacting Dark Matter Solution to the Extreme Diversity of Low-mass Halo Properties” by Ethan O. Nadler, Daneng Yang and Hai-Bo Yu, November 30, 2023, The Astrophysical Journal Letters.
DOI: 10.3847/2041-8213/ad0e09

This research was supported by the John Templeton Foundation and the U.S. Department of Energy.

Source: scitechdaily.com

Chia Genome Sequenced by Researchers, New Study Finds

Chia (Salvia hispanica) is one of the most popular nutrient-dense foods, a pseudocereal belonging to the mint family Lamiaceae. Chia seeds are rich in protein, polyunsaturated fatty acids, dietary fiber, and antioxidants. A team of scientists at Oregon State University has sequenced the chia genome, providing a blueprint for future research to exploit the nutritional and human-health benefits of the chia plant.

Chia seeds. Image credit: Valeria Lu.

Chia is an annual herbaceous plant in the Lamiaceae family, which also includes popular culinary herbs.

It is grown in southern Mexico and Central America for its nutrient-rich seeds containing protein, polyunsaturated fatty acids, dietary fiber, antioxidants, and minerals.

Chia seeds contain approximately 54 g of dietary fiber per 100 g, more than dietary fiber sources such as soy, wheat, and corn; about 93% of that fiber is insoluble.

Similarly, 60% of the total fatty acids are composed of polyunsaturated fatty acids, and proteins constitute 18–24% of the seed mass.

Additionally, the health-beneficial effects of chia seeds on improving muscle lipid content, cardiovascular health, total cholesterol ratio, triglyceride content, and anti-carcinogenic properties have been demonstrated in humans and animals.

The high fiber content of chia seeds also confers hypoglycemic effects, helping to stabilize blood sugar levels in people with type 2 diabetes.

“Our study opens up possibilities for scientists to study chia with a view to improving human health, while continuing to deepen our understanding of chia’s full range of nutritional benefits,” said Professor Pankaj Jaiswal of Oregon State University.

“We have reached a stage where long-term food and nutritional security requires diversifying the human diet through the breeding and genetic improvement of nutrient-rich, so-called minor crops like chia,” said Dr. Sushma Naithani of Oregon State University.

In the study, the authors assembled a haploid chia genome with an estimated genome size of 356 Mb.

They identified genes and genetic markers in chia that could help agricultural researchers breed plants to amplify plant traits valuable to human health.

They discovered 29 genes involved in the biosynthesis of polyunsaturated fatty acids and 93 genes that aid chia seeds’ gel-forming properties.

They also found 2,707 genes highly expressed in the seeds that are likely to produce small biologically active peptides (biopeptides) derived from proteins.

When seed proteins are digested in the gut, these small biopeptides are released and absorbed into the body; they have potential properties that may help alleviate human health conditions such as type 2 diabetes and high blood pressure.

“This is the first report of in silico annotation of a plant genome for protein-derived small biopeptides associated with improved human health,” the researchers said.

The findings were published in the journal Frontiers in Plant Science.

_____

Parul Gupta et al. 2023. Reference genome of the nutrient-rich orphan crop chia (Salvia hispanica) and its implications for future breeding. Front. Plant Sci. 14; doi: 10.3389/fpls.2023.1272966

Source: www.sci.news

New technology uses magnetism to control light sources

Researchers have developed a new method to create transparent magnetic materials using laser heating. The breakthrough is crucial for integrating magneto-optic materials with optical circuits, a major challenge in the field, and is expected to lead to advances in compact magneto-optical isolators, miniature lasers, high-resolution displays, and miniature optical devices. Credit: SciTechDaily.com

A new laser heating technique by a Japanese research team enables the integration of transparent magnetic materials into optical circuits, paving the way for advanced optical communication devices.

In a major advance in optical technology, researchers at Tohoku University and Toyohashi University of Technology have developed a new method to create transparent magnetic materials using laser heating. This breakthrough, recently published in the journal Optical Materials, presents a new approach to integrating magneto-optic materials and optical devices, a long-standing challenge in the field.

“The key to this result is that we used a special laser heating technique to create a transparent magnetic material called cerium-substituted yttrium iron garnet (Ce:YIG),” said Taichi Goto, associate professor at Tohoku University’s Research Institute of Electrical Communication (RIEC) and study co-author. “This method addresses the critical challenge of integrating magneto-optic materials into optical circuits without causing damage, an issue that has hindered progress in miniaturizing optical communication devices.”

Laser heating setup for preparing transparent magnetic materials. Credit: Taichi Goto et al.

Magneto-optical isolators in optical communications

Magneto-optical isolators are essential for stable optical communications. They act like one-way streets for light, allowing it to pass in one direction but not the other. Integrating these isolators into silicon-based photonic circuits is difficult because they typically require high-temperature processing.

To address this challenge, Goto and his colleagues turned to laser annealing, a technique that selectively heats specific areas of a material with a laser, giving precise control that affects only the target area while leaving the surroundings untouched.

Previous work has exploited this to selectively heat bismuth-substituted yttrium iron garnet (Bi:YIG) films deposited on dielectric mirrors. This allows Bi:YIG to be crystallized without affecting the dielectric mirror.

However, problems arise when working with Ce:YIG, whose magnetic and optical properties make it an ideal material for optical devices, as exposure to air causes undesirable chemical reactions.

To get around this, the researchers designed a new device that uses a laser to heat the material in a vacuum, meaning without air. This made it possible to precisely heat small areas (approximately 60 micrometers) without changing the surrounding material.

Impact on optical technology

“Transparent magnetic materials created using this method are expected to greatly facilitate the development of compact magneto-optical isolators that are essential for stable optical communications,” Goto added. “It also opens the door to creating powerful miniature lasers, high-resolution displays, and miniature optical devices.”

Reference: “Vacuum laser annealing of magneto-optical cerium-substituted yttrium iron garnet films” by Hibiki Miyashita, Yuki Yoshihara, Kanta Mori, Takumi Oguchi, Pang Boey Lim, Mitsuteru Inoue, Kazushi Ishiyama and Taichi Goto, November 14, 2023, Optical Materials.
DOI: 10.1016/j.optmat.2023.114530

Source: scitechdaily.com

Hydrogen Cyanide Detected in Enceladus’ Plume by Planetary Researchers

Using data from NASA’s Cassini mission, planetary scientists have detected several compounds critical to the habitability of Saturn’s icy moon Enceladus, including hydrogen cyanide, acetylene, propylene, and ethane. These compounds could support extant microbial communities or drive the complex organic synthesis leading to the origin of life.

Diagram of Enceladus’ plume activity. Image credit: Peter et al., doi: 10.1038/s41550-023-02160-0.

“Our study provides further evidence that Enceladus hosts some of the most important molecules for both producing the building blocks of life and sustaining life through metabolic reactions,” said Jonah Peter, a Ph.D. student at Harvard University.

“Not only does Enceladus appear to meet the basic requirements for habitability, we now have an idea of how complex biomolecules could form there, and what sort of chemical pathways might be involved.”

“The discovery of hydrogen cyanide was particularly exciting because it is the starting point for most theories about the origin of life.”

Life as we know it requires building blocks such as amino acids, and hydrogen cyanide is one of the most important and versatile molecules needed for the formation of amino acids.

Peter and his colleagues refer to hydrogen cyanide as the Swiss Army knife of amino acid precursors because its molecules can be stacked together in many different ways.

“The more we tested alternative models and tried to poke holes in the results, the stronger the evidence became,” Peter said.

“Ultimately, it became clear that there was no way to match the plume composition without including hydrogen cyanide.”

Saturn’s moon Enceladus with plumes. Image credit: NASA / JPL-Caltech / SSI / Kevin M. Gill.

In 2017, planetary scientists discovered evidence of chemistry on Enceladus that could help sustain life in the ocean, if it exists.

The combination of carbon dioxide, methane, and hydrogen in the plume suggested methanogenesis, a metabolic process that produces methane.

This process is widespread on Earth and may have been important for the origin of life on Earth.

Peter and his co-authors found evidence for additional energetic chemical sources that are far more powerful and diverse than methane production.

They discovered an array of organic compounds that were oxidized, an indication that Enceladus’ subsurface ocean may offer many chemical pathways for sustaining life, because oxidation promotes the release of chemical energy.

“If methanogenesis is like a small watch battery in terms of energy, then our results suggest that Enceladus’ ocean might offer something more akin to a car battery, capable of providing a large amount of energy to any life that might be present,” said Dr. Kevin Hand, a researcher at NASA’s Jet Propulsion Laboratory.

Unlike previous studies that used laboratory experiments and geochemical modeling to recreate the conditions Cassini found on Enceladus, the authors relied on detailed statistical analysis.

They examined data collected by Cassini’s ion and neutral mass spectrometers, which study the gas, ions, and ice grains around Saturn.

By quantifying the amount of information contained in the data, the authors were able to uncover subtle differences in how well different compounds explain the Cassini signal.

“There are a lot of potential puzzle pieces that can be put together when trying to reconcile observed data,” Peter said.

“We used mathematics and statistical modeling to identify the combination of puzzle pieces that best matched the plume’s composition and made the most of the data without over-interpreting the limited data set.”
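The model-comparison approach described here can be illustrated with a toy sketch: fit a simulated “plume spectrum” as a mixture of candidate library spectra and ask whether adding hydrogen cyanide improves the fit enough to justify the extra parameter. Everything below (the library values, the mixture, and the AIC-style score) is an illustrative assumption, not the authors’ actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical "fragmentation spectra" for candidate compounds over 20 mass
# channels -- purely illustrative stand-ins for a real spectral library.
channels = 20
library = {name: rng.random(channels) for name in ["H2O", "CO2", "CH4", "HCN"]}

# Simulated plume spectrum: a mixture that really does contain some HCN,
# plus a little instrument noise.
true_mix = {"H2O": 5.0, "CO2": 1.0, "CH4": 0.5, "HCN": 0.8}
spectrum = sum(a * library[c] for c, a in true_mix.items())
spectrum = spectrum + rng.normal(0.0, 0.01, channels)

def fit(compounds):
    """Least-squares fit of a candidate compound set.

    Returns an AIC-style score (lower is better: rewards good fit,
    penalizes extra compounds) and the fitted abundances.
    """
    A = np.column_stack([library[c] for c in compounds])
    coef, *_ = np.linalg.lstsq(A, spectrum, rcond=None)
    rss = float(np.sum((spectrum - A @ coef) ** 2))
    score = channels * np.log(rss / channels) + 2 * len(compounds)
    return score, coef

# Compare a model with HCN against one without it.
score_with, coef_with = fit(["H2O", "CO2", "CH4", "HCN"])
score_without, _ = fit(["H2O", "CO2", "CH4"])
```

In this toy version, the model that includes HCN scores far better and recovers roughly the true abundance, mirroring the paper’s conclusion that no combination of compounds matched the data without hydrogen cyanide.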

The findings were published in the journal Nature Astronomy.

_____

J.S. Peter et al. Detection of HCN and diverse redox chemistry in the plume of Enceladus. Nat Astron, published online December 14, 2023; doi: 10.1038/s41550-023-02160-0

Source: www.sci.news

Tracing the Sea Ice Highway: The Arrival of North America’s First Immigrants

New findings suggest that early humans arrived in North America earlier than 13,000 years ago, likely taking advantage of the “sea ice highway” along the Pacific coast. This theory is supported by paleoclimate data, challenges traditional migration theories, and emphasizes the adaptability of early humans. Credit: SciTechDaily.com

A new study suggests that some early Americans may have traveled down the coast from Beringia 24,000 years ago on winter sea ice.

One of the hottest debates in archeology is when and how humans first arrived in North America. Archaeologists have traditionally argued that people walked through temporary ice-free passages between ice sheets an estimated 13,000 years ago.

New evidence casts doubt on traditional theory

But a growing number of archaeological and genetic discoveries, such as human footprints in New Mexico dating back some 23,000 years, suggest that humans were on the continent much earlier. These early Americans likely migrated from Beringia along the Pacific coastline. Beringia is a land bridge between Asia and North America that appeared during the last ice age maximum when ice sheets trapped large amounts of water and caused sea levels to drop.

Now, in a study presented on Friday, December 15th at the American Geophysical Union Annual Meeting (AGU23) in San Francisco, paleoclimate reconstructions of the Pacific Northwest suggest that winter sea ice extended far enough south that it may have served as a means of transport for these early migrants.

Coastal migration theory

The idea that early Americans may have traveled along the Pacific coast is not new. People may have been south of the giant ice sheets that once covered much of the continent by at least 16,000 years ago. Given that the ice-free corridor would not open for thousands of years after these early arrivals, scientists instead proposed that people migrated along a “kelp highway,” slowly making their way into North America by boat, sustained by the abundant resources of coastal waters.

Archaeologists have discovered evidence of coastal settlements in western Canada dating back 14,000 years. But in 2020, researchers noted that freshwater from melting glaciers at the time may have created strong currents, making it difficult for people to travel along the coast.

Sea ice in Nunavut, Canada. Credit: GRID-Arendal, CC BY-NC-SA

An icy highway crossing a dangerous sea

To get a more complete picture of ocean conditions during these key periods of human migration, Summer Praetorius of the U.S. Geological Survey and her colleagues examined climate proxies in marine sediments along the coast. Most of the data came from tiny fossilized plankton, whose abundance and chemistry help scientists reconstruct ocean temperatures, salinity, and sea ice cover.

Praetorius’ presentation is part of a session at AGU23 on the climate history and geology of Beringia and the North Pacific during the Pleistocene. The week-long conference brought together 24,000 experts from all areas of the earth and space sciences in San Francisco, along with 3,000 online participants.

Using climate models, Praetorius’ team found that at the height of the Last Glacial Maximum, about 20,000 years ago, ocean currents were more than twice as strong as they are today due to glacial winds and lower sea levels. While rowing against them was not impossible, Praetorius said, traveling by boat in these conditions would have been very difficult.

However, records show that much of the region had winter sea ice until about 15,000 years ago. As a cold-adapted people, “they may have been using the sea ice as a foothold instead of having to row against this terrible glacial current,” Praetorius said.

Sea ice as a migration path

People in the Arctic today travel along the sea ice on dog sleds and snowmobiles. Praetorius said early Americans may also have used the “sea ice highway” to travel and hunt marine mammals, slowly making their way into North America in the process. Climate data suggest that conditions along the coastal route may have been favorable for migration between 24,500 and 22,000 years ago and between 16,400 and 14,800 years ago, possibly owing to the presence of winter sea ice.

Integration of old and new theories

It’s difficult to prove that people were using sea ice to travel, given that most of the relevant archaeological sites are now underwater, but the idea provides a new framework for understanding how humans could have reached North America without a land bridge or easy ocean travel.

And the sea ice highway is not mutually exclusive with other, later human migrations, Praetorius said. The researchers’ model shows that by 14,000 years ago, the Alaska Current had calmed down, making it easier for people to travel by boat along the coast.

“There’s nothing wrong with it,” she said. “We are always amazed by the ingenuity of ancient humans.”

Source: scitechdaily.com

Getting the Gateway to the Moon Ready

NASA’s Artemis II mission is making final preparations for its SLS rocket at Kennedy Space Center. The Orion stage adapter, a critical component that connects Orion to the SLS, recently underwent installation of its diaphragm at Marshall Space Flight Center. The adapter plays an important role in preventing hydrogen gas buildup and ensuring safety during launch. Credit: NASA/Sam Lott

NASA’s Artemis II mission is making final preparations of its SLS rocket. The Orion stage adapter, essential for connecting Orion to the SLS and ensuring launch safety, has reached a key milestone. SLS is central to NASA’s lunar exploration goals.

Elements of the super-heavy-lift SLS (Space Launch System) rocket for NASA’s Artemis II mission are undergoing final preparations before being shipped to NASA’s Kennedy Space Center in Florida for stacking and pre-launch activities in 2024.

Orion stage adapter

A team at NASA’s Marshall Space Flight Center in Huntsville, Alabama, recently rotated the Orion stage adapter, a ring structure that connects NASA’s Orion spacecraft to the SLS rocket’s Interim Cryogenic Propulsion Stage (ICPS), in preparation for installation of its diaphragm. The Nov. 30 installation was one of the final steps for the adapter before it is ready to be shipped to Kennedy aboard NASA’s Super Guppy cargo plane.

Diaphragm safety and functionality

“The diaphragm is a dome-shaped composite structure that isolates the volume above the ICPS from the volume below Orion,” said Brent Gaddes, Orion stage adapter lead in the SLS Program’s Spacecraft/Payload Integration and Evolution Office at Marshall. “It acts as a barrier between the two, preventing any highly flammable hydrogen gas that might leak from the rocket’s propellant tanks from accumulating beneath the Orion spacecraft and its crew before and during launch.”

Engineers at NASA’s Marshall Space Flight Center in Huntsville, Alabama, on Nov. 30 rotated, or “flipped,” the smallest key element of NASA’s SLS (Space Launch System) rocket in order to attach a critical component. The 5-foot-tall, 1,800-pound Orion stage adapter connects NASA’s Orion spacecraft to the SLS rocket’s interim cryogenic propulsion stage and is manufactured entirely at Marshall. The newly installed diaphragm will act as a barrier, preventing gases generated during Artemis II’s launch from entering the spacecraft. Credit: NASA’s Marshall Space Flight Center

The role of the adapter in the SLS rocket

At 5 feet tall and 1,800 pounds, the adapter is the smallest key element of the SLS rocket, which will generate more than 8.8 million pounds of thrust to launch the four Artemis astronauts in the Orion spacecraft around the Moon. The adapter is manufactured entirely by Marshall’s engineering team.

SLS: Pillar of deep space exploration

NASA is working to land the first woman and first person of color on the Moon under Artemis. SLS is part of NASA’s backbone for deep space exploration, along with Orion, the Gateway in lunar orbit, and commercial human landing systems. SLS is the only rocket that can send Orion, astronauts, and supplies to the Moon in a single launch.

Source: scitechdaily.com

Revealing an Innovative Approach to Cooling

Schematic diagram showing cooling of a nanopore by charge-selective ion transport. Credit: 2023 Tsutsui et al., “Peltier Cooling for Thermal Management in Nanofluidic Devices,” Device.

Groundbreaking work by Japanese researchers demonstrates nanopore-mediated cooling, revolutionizing temperature control in microfluidic systems and deepening our understanding of cellular ion channels.

Have you ever wondered how water boils in an electric kettle? Most people would say that electricity simply heats a metal coil inside the kettle, which transfers that heat to the water. But electricity can do much more. When electricity drives ions to flow through a solution, heat is generated. If all the ions and surrounding molecules are free to move, this heating effect is uniform throughout the solution. Now, researchers in Japan have investigated what happens if this flow is blocked in one direction.

Cooling with nanopore technology

In a study recently published in Device, a team led by researchers at SANKEN (The Institute of Scientific and Industrial Research) at Osaka University has shown that cooling can be achieved by using nanopores, extremely small holes in a membrane, as gateways that allow only certain ions to pass through.

In general, when electricity is used to drive ions in a solution, positively charged ions and negatively charged ions are attracted in opposite directions. Therefore, the thermal energy carried by the ions travels in both directions.

Understanding ion flow and temperature control

If the ions’ path is blocked by a membrane that they can cross only through nanopores, this flow can be controlled. For example, if the pore surface is negatively charged, negative ions are held back by electrostatic interactions, and only positive ions flow through, carrying their thermal energy with them.

“At high ion concentrations, we measured an increase in temperature as the power increased,” explains lead author Makusu Tsutsui. “At low concentrations, however, the available negative ions interact with the negatively charged nanopore walls. Therefore, only positively charged ions passed through the nanopore, and a decrease in temperature was observed.”

Applications in microfluidics and cell biology

The demonstrated ionic cooling could potentially be used to cool microfluidic systems, setups used to move, mix, or interrogate very small volumes of liquids. Such systems are important across many fields, from microelectronics to nanomedicine.

Additionally, this discovery could help further our understanding of ion channels, which play a key role in the delicate balance mechanisms of cells. Such insights could be key to understanding function and disease and designing treatments.

Broader implications and future prospects

“We are excited about the breadth of the potential impact of our findings,” says senior author Tomoji Kawai. “There is considerable scope to tune the nanopore materials to tune the cooling. Additionally, arrays of nanopores could be created to amplify the effect.”

The list of areas that could be enhanced by this discovery is indeed considerable, extending to the use of temperature gradients to generate electrical potentials. This has potential applications in temperature sensing and blue power generation.

Reference: “Peltier Cooling for Thermal Management in Nanofluidic Devices” by Makusu Tsutsui, Kazumichi Yokota, Wei-Lun Hsu, Denis Garoli, Hirofumi Daiguji and Tomoji Kawai, 5 December 2023, Device.
DOI: 10.1016/j.device.2023.100188

Source: scitechdaily.com

Next Phase of Human Clinical Trials for Revolutionary Sepsis Treatment Commences

Scientists have developed a promising treatment for sepsis, and clinical trials using sodium ascorbate, a vitamin C preparation, have shown effective results. The treatment has progressed into extensive clinical trials across Australia and demonstrated significant improvements in sepsis patients, including improved kidney function and reduced dependence on other drugs. This breakthrough, the result of decades of research, brings hope to a disease that is the leading cause of death in intensive care units around the world.

Florey Institute researchers, in collaboration with hospital intensivists, have demonstrated that sodium ascorbate, a pH-balanced formulation of vitamin C, is effective in treating sepsis.

Researchers at the Florey Institute have demonstrated that the formulation they have developed reduces deadly sepsis, and the next phase of clinical trials is set to begin across Australia next month.

Promising results from an early clinical trial conducted at Melbourne’s Austin Hospital, published in the journal Critical Care, have shown that sodium ascorbate, a pH-balanced formulation of vitamin C, is effective in treating sepsis.

Lead researcher Associate Professor Yugeesh Lankadeva said sepsis is notoriously difficult to treat and is often fatal.

L-R: Florey Professor Clive May, Austin Health intensivist Professor Rinaldo Bellomo and Florey Associate Professor Yugeesh Lankadeva discovered that sodium ascorbate can be used to treat sepsis. Credit: Florey

Challenges in sepsis treatment

“Sepsis accounts for 35 to 50 percent of all hospital deaths. It occurs when the immune system is unable to fight an underlying infection, causing a life-threatening drop in blood pressure, multiple organ failure, and death,” said Associate Professor Lankadeva. “In our clinical trial at Austin Hospital, sodium ascorbate was administered into patients’ bloodstreams, resulting in promising improvements across multiple organs.”

Associate Professor Lankadeva, Florey’s research director for Systems Neuroscience, said the next step is a $4.9 million government-funded research project delivered in intensive care units in Adelaide, Melbourne, Perth, Brisbane, Alice Springs and Sydney.

“We will recruit 300 adult sepsis patients, who will receive either our formulation or a placebo in addition to their usual hospital care. The results will help provide the additional data needed to determine the formulation’s efficacy,” said Associate Professor Lankadeva.

Florey scientists have created a special formulation of sodium ascorbate to treat sepsis. Credit: Florey

Insights into previous trials

Professor Rinaldo Bellomo, director of intensive care research at Austin Hospital, said the first part of the trial, conducted in his department, involved 30 adult sepsis patients between October 2020 and November 2022.

While in intensive care in the hospital, half of the patients were randomly assigned to receive sodium ascorbate, and the other half received a placebo.

This study found that patients with sepsis treated with sodium ascorbate:

  • produced more urine, a sign of improved kidney function
  • needed less noradrenaline, a drug used clinically to restore blood pressure
  • showed signs of improved function across multiple organs.

“Sepsis is the number one cause of death in intensive care units in Australia and around the world,” Professor Bellomo said. “In many cases, the disease progresses so rapidly that by the time patients reach us, they are already seriously ill. An effective treatment would be a huge change.”

Decades of research bear fruit

Professor Clive May, Florey Senior Research Fellow on the project, has been researching how sepsis causes organ failure, particularly damage to the brain and kidneys, for more than 20 years.

“By showing that sepsis decreases oxygen levels in tissues, we identified sodium ascorbate as a possible treatment.

“We have seen dramatic results in preclinical studies, where very high doses of sodium ascorbate produced complete recovery within just three hours with no side effects. It’s heartening to see that work paying off and bringing a treatment into the hands of patients,” said Professor Clive May.

Surviving sepsis: The patient’s perspective

Longtime Florey staffer Brett Purcell serves as the consumer representative for the MEGASCORES research program, providing a valuable perspective as a sepsis survivor.

“In 2011 I was taken to hospital by ambulance with a high fever and delirium; I was suffering from the early stages of sepsis. My condition gradually worsened, and after 12 days I was transferred to a larger hospital. By that time my heart was severely infected and I was in septic shock. Six months earlier I had had a successful aortic valve replacement; unfortunately, it was this valve that had become infected.

“The surgical team repaired the damage in a six-hour operation, but my condition deteriorated to critical; at one point I was told it could be a matter of hours. It was the good decision-making of the surgical team and ICU intensivists that saved me. I was put on life support with an ECMO machine and dialysis, and my condition rapidly improved.

“After almost eight weeks in hospital, I went home. I’m really lucky to be alive, and I hope this new research using sodium ascorbate gives hospitals a new, less invasive, faster and extremely effective life-saving tool in the fight against sepsis.”

Reference: “Mega-dose sodium ascorbate: a pilot, single-dose, physiological effect, double-blind, randomized, controlled trial” by Fumitaka Yanase, Sofia Spano, Akinori Maeda, Anis Chaba, Thummaporn Naorungroj, Connie Pei Chen Ow, Yugeesh R. Lankadeva, Clive N. May, Ashenafi H. Betley, Darius J.R. Lane, Glenn M. Eastwood, Mark P. Plummer and Rinaldo Bellomo, 12 October 2023, Critical Care.
DOI: 10.1186/s13054-023-04644-x

Source: scitechdaily.com

Volcanoes: Not as calm as they seem – Explosive secrets unveiled

A long-dormant volcano’s quiet can be interrupted by a rapid and hazardous eruption. To understand how such volcanoes suddenly reawaken, Hungarian researchers have been analyzing Ciomadul, a volcano in the Carpathian-Pannonian region, focusing on the chemical and mineral composition of its magma for insight into volcanic reactivation and eruption forecasting. The work underscores the potential hazards posed by seemingly inactive volcanoes.

Using mineralogical and chemical composition data, the team traced the evolution of the magma and inferred the structure of the magma storage system beneath the volcano. The study revealed that volcanic activity during Ciomadul’s last active period was mainly explosive. Through a detailed study of the rock-forming minerals, particularly the key mineral amphibole, the researchers identified the causes and processes that control eruption style, finding that water-rich recharge magmas played an important role in triggering explosive eruptions.

The research also highlights the value of quantitative volcanic petrology for recognizing pre-eruption signals and improving eruption forecasting. The study of Ciomadul has attracted international attention and helps identify the potential dangers associated with long-dormant volcanoes.

Source: scitechdaily.com

Breathing Easy: The World Can Now Relax.

Washington University in St. Louis published a new study on December 17, 2023, examining the health risks of PM2.5 and global reduction efforts. The study found that global PM2.5 exposure has decreased since 2011, mainly due to China’s efforts. The researchers highlight the health benefits of reduced exposure and stress the need for continued monitoring and mitigation, especially in densely populated areas.

The study, conducted by researchers at Washington University, quantified changes in air pollution from 1998 to 2019 and concluded that further mitigation efforts are still needed.

PM2.5, airborne particulate matter measuring 2.5 micrometers or smaller, poses a significant global environmental health risk. It can lead to respiratory diseases such as asthma and bronchitis, cardiovascular diseases such as heart attacks and high blood pressure, and permanent developmental problems in children. Exposure to PM2.5 is also associated with an increased risk of premature death.

To address these negative effects, several countries, including China, have worked to reduce exposure to PM2.5. The study examined the effectiveness of these efforts and asked which regions are making the most progress in driving PM2.5 reductions.

The research, led by Randall Martin, examined PM2.5 data from 1998 to 2019 and found that China’s strict air quality controls were the biggest cause of the global reversal in PM2.5 exposure. This reduced exposure resulted in 1.1 million fewer premature deaths in China between 2011 and 2019, as well as improved health outcomes more generally.

Overall, the study underscores the need for continued reductions in PM2.5 exposure globally and emphasizes the importance of sustained monitoring, especially in poorly monitored but highly populated regions such as South Asia and the Middle East. The success in PM2.5 reduction demonstrates the benefits of mitigation efforts and provides motivation for further progress.

Source: scitechdaily.com

Unraveling Subtle Mysteries with “Donut” Rays

Researchers at the University of Colorado Boulder have advanced the field of ptychography with a new imaging method using donut-shaped beams of light. The technique enables detailed imaging of small, regularly patterned structures such as semiconductors, overcoming previous limitations of conventional microscopy. The advance promises significant improvements in nanoelectronics and biological imaging. (Artist’s concept) Credit: SciTechDaily.com

In a new study, researchers at the University of Colorado Boulder used donut-shaped beams of light to take detailed images of objects too small to view with traditional microscopes.

Advances in Nanoelectronic Imaging

The new technique could help scientists study the inner workings of a variety of nanoelectronics, including the miniature semiconductors in computer chips. The discovery was featured in Optics in 2023, a Dec. 1 special issue of Optics & Photonics News.

Ptychography: A Lens into the Microscopic World

The research is the latest advance in the field of ptychography, a tricky but powerful technique for viewing very small things. Unlike traditional microscopes, ptychography tools do not observe tiny objects directly. Instead, they shine a laser at a target and measure how the light scatters, a bit like making shadow puppets on a wall, but at a microscopic scale.

A scattering pattern produced by donut-shaped rays of light reflecting off an object with a regularly repeating structure. Credit: Wang et al., 2023, Optica

Overcoming Ptychography Challenges

So far, the approach has worked surprisingly well, with one major exception, said Margaret Murnane, senior author of the study and a distinguished professor of physics.

“Until recently, it has completely failed for highly periodic samples, or objects with regularly repeating patterns,” said Murnane, a fellow at JILA, a joint research institute of CU Boulder and the National Institute of Standards and Technology (NIST). “That’s a problem, because that includes a lot of nanoelectronics.”

She pointed out that many important technologies, such as some semiconductors, are made up of atoms such as silicon and carbon bonded in regular patterns, like small grids or meshes. So far, it has proven difficult for scientists to observe these structures up close using ptychography.

Donut-shaped beams of light scatter from incredibly small structures. Credit: Wang et al., 2023, Optica

A Breakthrough in Donut-Shaped Light

But in the new study, Murnane and her colleagues devised a solution. Instead of using a traditional laser, they generated donut-shaped beams of extreme-ultraviolet light.

The researchers’ new approach can collect precise images of small, delicate structures roughly 10 to 100 nanometers in size, many times smaller than a millionth of an inch. In the future, the researchers expect to be able to zoom in on even smaller structures. Unlike existing imaging tools such as electron microscopes, the donut beams, which carry orbital angular momentum, also do not damage tiny electronics in the process.

“In the future, this method could be used to inspect the polymers used in semiconductor manufacturing and printing for defects, without damaging the structures in the process,” Murnane said.

Bin Wang and Nathan Brooks, who received their PhDs from JILA in 2023, are the lead authors of this new study.

Pushing the Limits of Microscopy

Murnane said the research pushes the fundamental limits of microscopy. Because of the physics of light, lens-based imaging tools can only resolve the world down to about 200 nanometers, not precise enough to capture many viruses, for example, those that infect humans. Scientists can freeze viruses and view them with powerful cryo-electron microscopes, but they still cannot capture the activity of these pathogens in real time.

Ptychography, developed in the mid-2000s, could help researchers break through that limit.

How ptychography works
To understand how, go back to the shadow puppets. Imagine that scientists want to collect stylized images of very small structures, perhaps letters spelling out “CU.” To do that, they first shine a laser beam on the text, scanning over it multiple times. When light hits the “C” and “U” (in this case, the puppets), the rays break apart and scatter, creating a complex pattern (the shadows). Scientists record those patterns using sensitive detectors and analyze them with a series of mathematical formulas. Given enough time, they can perfectly recreate the shape of the puppets from the shadows they cast, Murnane explained.
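The record-and-reconstruct loop described above can be sketched as a toy simulation: scan a known illumination over a small complex-valued “object,” keep only the diffraction intensities (the “shadows”), and iteratively recover the object. The sizes, the binary circular probe, and the simple PIE-style update rule below are all illustrative assumptions, not the actual reconstruction used in the study:

```python
import numpy as np

N, P = 16, 8  # object and probe sizes, in pixels (illustrative)

# Ground-truth object: a square patch with reduced transmission and a phase shift
obj_true = np.ones((N, N), dtype=complex)
obj_true[4:12, 4:12] = 0.5 * np.exp(0.5j)

# Known probe: a circular patch of illumination
yy, xx = np.mgrid[-P // 2:P // 2, -P // 2:P // 2]
probe = ((xx**2 + yy**2) < (P // 2) ** 2).astype(complex)

# Heavily overlapping scan positions (step of 2 pixels)
positions = [(r, c) for r in range(0, N - P + 1, 2) for c in range(0, N - P + 1, 2)]

# "Measure": far-field diffraction intensities; all phase information is discarded
data = {(r, c): np.abs(np.fft.fft2(obj_true[r:r + P, c:c + P] * probe)) ** 2
        for (r, c) in positions}

# Reconstruct the object from the intensities alone
obj = np.ones((N, N), dtype=complex)
for _ in range(200):
    for (r, c) in positions:
        ew = obj[r:r + P, c:c + P] * probe           # current exit-wave guess
        F = np.fft.fft2(ew)
        F = np.sqrt(data[(r, c)]) * np.exp(1j * np.angle(F))  # enforce measured modulus
        # Update the object where the probe illuminates it (max |probe| is 1 here)
        obj[r:r + P, c:c + P] += np.conj(probe) * (np.fft.ifft2(F) - ew)

# Mean amplitude error over the patterned interior region
err = float(np.mean(np.abs(np.abs(obj[4:12, 4:12]) - np.abs(obj_true[4:12, 4:12]))))
```

After a few hundred passes the recovered amplitudes match the hidden object closely; the strong overlap between neighboring scan positions is what makes the phase problem solvable.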

Evolution to Finer Details

Other co-authors of the new study include physics professor and JILA fellow Henry Kapteyn and current and former JILA graduate students Peter Johnsen, Nicholas Jenkins, Yuka Esashi, Iona Binnie, and Michael Tanksalvala.

Reference: “High-fidelity ptychographic imaging of highly periodic structures enabled by vortex high harmonic beams” by Bin Wang, Nathan J. Brooks, Peter Johnsen, Nicholas W. Jenkins, Yuka Esashi, Iona Binnie, Michael Tanksalvala, Henry C. Kapteyn and Margaret M. Murnane, 19 September 2023, Optica.
DOI: 10.1364/OPTICA.498619

Source: scitechdaily.com

NASA’s Exciting Test Phase for Dream Chaser

NASA and Sierra Space are making progress toward the maiden flight of the company’s Dream Chaser spacecraft to the International Space Station. The uncrewed cargo spaceplane is scheduled to begin a demonstration mission to the orbiting complex in 2024 as part of NASA’s commercial resupply services. Credit: Sierra Space

NASA and Sierra Space are testing the Dream Chaser spacecraft at the Neil Armstrong Test Facility, with a focus on environmental simulation for future ISS missions. After testing, the spacecraft will head to Kennedy Space Center for launch in 2024.

NASA and Sierra Space are preparing for the maiden flight of the Dream Chaser spacecraft to the International Space Station. Dream Chaser and its accompanying cargo module, Shooting Star, will travel to NASA’s Neil Armstrong Test Facility in Sandusky, Ohio, for environmental testing scheduled to begin in mid-December, ahead of a first flight planned for early 2024.

State-of-the-art testing equipment

The Neil Armstrong Test Facility, part of NASA’s Glenn Research Center in Cleveland, has multiple test facilities including the Space Environment Complex and the Space Propulsion Facility, both of which will be home to Dream Chaser. The complex includes a mechanical vibration facility that exposes test articles to the harsh conditions of launch.

During its stay at Armstrong, the winged Dream Chaser spacecraft will be stacked atop the Shooting Star cargo module on a vibration table and subjected to vibrations similar to those experienced during liftoff and atmospheric re-entry.

NASA and Sierra Space are making progress toward the company’s Dream Chaser spacecraft’s maiden flight to the International Space Station. The unmanned cargo spaceplane is scheduled to begin demonstration missions to orbital complexes in 2024 as part of NASA’s commercial resupply services. Credit: Sierra Space/Shay Saldana

Rigorous space simulation

After vibration testing, Dream Chaser will be moved to the Space Propulsion Facility for thermal vacuum testing. There the spacecraft will be placed in a vacuum and exposed to low ambient pressure, low background temperature, and simulated dynamic solar heating, recreating the environment it will encounter during its mission. The facility is the only one capable of testing full-scale upper-stage rockets and rocket engines under simulated space conditions, including hot-fire testing.

After testing at Armstrong, Dream Chaser will be transported to NASA’s Kennedy Space Center in Florida for further launch preparations, and is currently scheduled to launch in the first half of 2024.

Source: scitechdaily.com

A cluster of stripped helium stars found in the Magellanic Cloud by astronomers

Removing the hydrogen-rich envelope from a main-sequence star exposes its helium-rich core. Such stripped helium stars are known at high and low masses, but not at intermediate masses, despite theoretical predictions that they should be common. In a new study, astronomers at the University of Toronto and elsewhere used ultraviolet photometry to identify candidate stripped helium stars in two nearby dwarf galaxies, the Large and Small Magellanic Clouds, and observed 25 of these candidates with optical spectroscopy. Most of the systems turned out to be binaries, in which a companion star likely stripped the helium star of its outer hydrogen-rich layer.

An artist’s impression of a large-scale binary system. Image credit: ESO / M. Kornmesser / SE de Mink.

The hydrogen-rich outer layers of massive stars can be removed by interactions with binary companions.

Theoretical models predict that this stripping should produce a population of hot helium stars with masses between two and eight times that of the Sun, but until now only one such system had been identified.

“This was a very large and noticeable hole. If these stars turn out to be rare, then our whole theoretical framework for all the phenomena they underpin, including supernovae, gravitational waves, and the light from distant galaxies, would be wrong,” said Dr. Maria Drout, an astronomer at the University of Toronto.

“This discovery shows that these stars actually exist.”

“In the future, we will be able to perform even more detailed physics on these stars.”

“For example, predictions of how many neutron star mergers we will see depend on the properties of these stars, such as how much material is ejected by stellar winds.”

“In the past, people have estimated it, but now for the first time they will be able to measure it.”

Dr. Drout and her colleagues designed a new study to look at the ultraviolet part of the spectrum, where very hot stars emit most of their light.

Astronomers used data from the Swift Ultraviolet/Optical Telescope to collect the brightness of millions of stars in the Large and Small Magellanic Clouds, the two closest galaxies to Earth.

They developed the first wide-field UV catalog of the Magellanic Clouds and used UV photometry to detect systems with unusual UV emissions indicating the possible presence of stripped stars.
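The selection step can be illustrated with a toy blackbody calculation (our own sketch, not the team’s actual pipeline; the temperatures, bands, and magnitude zero point are illustrative assumptions): a star at 80,000 K is far bluer in a crude UV-minus-optical color than a Sun-like star, so systems with an unexpected UV excess stand out in such a catalog.

```python
import math

H = 6.626e-34   # Planck constant (J s)
C = 2.998e8     # speed of light (m/s)
KB = 1.381e-23  # Boltzmann constant (J/K)

def planck(wavelength_m, temp_k):
    """Blackbody spectral radiance B_lambda (W m^-3 sr^-1)."""
    a = 2 * H * C**2 / wavelength_m**5
    return a / math.expm1(H * C / (wavelength_m * KB * temp_k))

def uv_minus_optical(temp_k, uv_nm=200.0, opt_nm=550.0):
    """Crude UV-optical color in magnitudes for a blackbody; zero point arbitrary."""
    return -2.5 * math.log10(planck(uv_nm * 1e-9, temp_k) / planck(opt_nm * 1e-9, temp_k))

hot = uv_minus_optical(80_000)   # a stripped-helium-star candidate
cool = uv_minus_optical(6_000)   # an ordinary Sun-like star
# The hot star's UV-optical color is far more negative (bluer), flagging it as a candidate.
print(f"80,000 K color: {hot:.2f} mag, 6,000 K color: {cool:.2f} mag")
```

A real selection must also account for reddening and for the light of any cool companion, which is exactly why spectroscopic follow-up was needed.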

They acquired optical spectroscopy with the Magellan Telescope at the Las Campanas Observatory from 2018 to 2022 and conducted pilot studies on 25 objects.

These stripped stars had high temperatures (60,000 to 100,000 K), high surface gravity, and hydrogen-depleted surfaces. Sixteen stars also showed binary motion.

Drout and her co-authors propose that these stars will eventually explode as hydrogen-poor supernovae.

These objects, like the gravitational wave-emitting objects detected from Earth by the LIGO experiment, are also thought to be necessary for the formation of neutron star mergers.

In fact, researchers believe that some of the objects in the current sample are neutron stars or stripped stars with black hole companions.

These objects are on the verge of becoming double neutron stars or neutron star and black hole systems that may eventually merge.

“Many stars are part of a cosmic dance with partners, orbiting each other in binary star systems,” said Bethany Ludwig, a graduate student at the University of Toronto.

“They are not solitary giants, but part of a dynamic duo, interacting and influencing each other throughout their lives.”

“Our research sheds light on these fascinating relationships, revealing a universe far more interconnected and active than previously imagined.”

“Just as humans are social beings, stars, especially massive stars, are rarely lonely.”

The results appear in the journal Science.

_____

M.R. Drout et al. 2023. An observed population of intermediate-mass helium stars that have been stripped in binaries. Science 382 (6676): 1287-1291; doi: 10.1126/science.ade4970

Source: www.sci.news

Researchers Utilize ‘Mobile Observers’ to Uncover Previously Uncharted Air Pollutants

A groundbreaking study conducted by the University of Utah and EDF used Google Street View vehicles to closely monitor air quality in the Salt Lake Valley. This study revealed highly localized pollution hotspots, highlighted issues of environmental justice, and represents a major advance in understanding and addressing the uneven impacts of urban air pollution.

In the Salt Lake Valley, vehicles equipped with advanced air quality measurement tools similar to Google Street View vehicles drove through neighborhoods and collected highly detailed air quality data. This comprehensive sampling revealed clear variations in pollution levels within different regions. Additionally, new atmospheric modeling techniques have been developed to accurately identify these sources of pollution emissions.

In 2019, a team of atmospheric scientists at the University of Utah, in collaboration with the Environmental Defense Fund and other partners, introduced an innovative approach to air quality monitoring in the Salt Lake Valley. They equipped two Google Street View cars with air quality measurement tools, creating mobile air pollution detectors capable of identifying hyper-local pollution hotspots.

Over the next few months, John Lin, a professor of atmospheric science at the university, developed a breakthrough modeling technique. The method combined wind pattern modeling and statistical analysis to trace pollutants to their exact source. This technology provided a level of detail in pollution tracking that exceeded the more extensive and less accurate methods of traditional air quality monitoring, which typically assessed air quality across urban areas.

The results of the study, led by the University of Utah and the Environmental Defense Fund (EDF), were recently published in the journal Atmospheric Environment. “With mobile vehicles, you can literally send them anywhere you can drive and map out pollution, including off-road sources that traditional monitoring has missed,” said Lin, who is also deputy director of the Wilkes Center for Climate Science and Policy. “I think the idea of these patrolling vehicles is pretty viable in many cities.”

Researchers equipped vehicles with air quality instruments and asked drivers to explore their neighborhoods street by street, taking air samples once every second, from May 2019 to March 2020. This created a huge dataset of air pollutant concentrations in the Salt Lake Valley. It is the highest-resolution map showing pollution hotspots at a detailed scale, with data capturing fluctuations within 200 meters, or about the width of two football fields.
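As a rough illustration of how one-second mobile samples become a 200-meter-resolution map, the sketch below bins position-tagged readings into 200 m grid cells and averages them. This is a minimal sketch under our own assumptions, not the study’s actual processing pipeline, and the readings are hypothetical.

```python
# Toy aggregation of 1 Hz mobile air quality samples into ~200 m grid cells.
from collections import defaultdict

CELL_M = 200.0  # grid resolution in meters

def grid_average(samples):
    """samples: iterable of (x_m, y_m, concentration) in a local meter grid.
    Returns {(col, row): mean concentration} for every cell with data."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for x, y, c in samples:
        key = (int(x // CELL_M), int(y // CELL_M))
        sums[key] += c
        counts[key] += 1
    return {k: sums[k] / counts[k] for k in sums}

# Hypothetical PM2.5 readings in ug/m3; the first two fall in the same cell.
readings = [(10, 20, 8.0), (150, 60, 12.0), (420, 30, 30.0)]
print(grid_average(readings))
```

Cells visited repeatedly across many drives average down the sampling noise, which is what makes the hotspot map robust.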

Air quality patterns were largely as expected, with higher pollution around traffic corridors and industrial areas. Neighborhoods with lower average incomes and higher proportions of Black residents had more pollutants, confirming well-known environmental justice issues. This pattern traces back to the century-old practice of redlining, in which the Home Owners’ Loan Corp. drew maps outlining “hazardous” areas in red ink.

“Air quality is not a new problem. It’s been around for decades, and it was probably worse back then,” Lin said. “The Interstate 15 corridor runs along redlined districts. And sadly, there is quite a bit of research supporting the fact that the redlining of 80 years ago still matters. These areas still struggle with air quality issues. They tend to be underinvested in, so the legacy of racism remains.”

Research-grade instruments in the Google Street View vehicles measured ambient air pumped in from the surroundings and detected major pollutants. The researchers tested Lin’s new atmospheric modeling approach on two case studies of well-known pollution sources, then applied the model to analyze previously unexplained areas of elevated PM2.5.

The authors hope to use atmospheric models for projects such as Air Tracker, a web-based tool developed in partnership with EDF and Carnegie Mellon University that helps users find possible sources of air pollution in their neighborhood.

This research was funded by the Environmental Defense Fund, and the study utilized the resources of the National Center for High Performance Computing.

Source: scitechdaily.com

Proxima B’s Explosive Cryovolcano and Habitable Subsurface Ocean

Astronomers at NASA and the University of Washington estimated the total internal heating rate and the depth to potential subsurface oceans for 17 candidate cold ocean planets. These planets are low-mass exoplanets with surface temperatures and densities consistent with icy surfaces and substantial water content. Like the icy moons of our solar system, they could be astrobiologically important worlds with habitable environments beneath their icy surfaces.

This artist’s impression shows Proxima b orbiting Proxima Centauri, the closest star to our solar system at just 4.23 light years. The double star Alpha Centauri AB also appears in the image between the exoplanet and Proxima itself. Image credit: M. Kornmesser / ESO.

Ocean planets have been proposed as a class of low-density terrestrial exoplanets with significant liquid water layers.

They exist in different climatic states, including ice-free, partially ice-covered, and completely frozen surfaces.

“Our analysis suggests that the surfaces of these 17 alien worlds may be covered in ice, but internal heating from the decay of radioactive elements and from tidal forces exerted by their host stars should be enough to maintain internal oceans,” said Dr. Lynnae Quick, a researcher at NASA’s Goddard Space Flight Center.

“Due to the amount of internal heating they experience, all the planets in our study may also exhibit cryovolcanic eruptions in the form of geyser-like plumes.”

Dr. Quick and her colleagues examined 17 confirmed exoplanets. These planets are roughly Earth-sized but less dense, suggesting they may contain significant amounts of ice and water rather than dense rock.

Although the exact composition of these planets remains unknown, initial estimates of their surface temperatures from previous studies indicate that they are much colder than Earth, suggesting their surfaces may be covered with ice.

The authors improved their estimates of each exoplanet’s surface temperature by recalculating them using the known surface brightness and other properties of Europa and Enceladus as models.

They also estimated the total internal heating of these exoplanets by using the shape of each exoplanet’s orbit to calculate the heat generated by tides, and adding it to the heat expected from radioactive decay.

Because oceans cool and freeze at the surface while being heated from within, estimates of surface temperature and total heating provide information about the thickness of each exoplanet’s ice layer.

Finally, they compared these numbers to Europa’s and used Europa’s estimated level of geyser activity as a conservative baseline for estimating the exoplanet’s geyser activity.
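The heat-balance logic above can be sketched with a constant-conductivity approximation: in steady state, the heat flux F conducted through an ice shell of thickness D obeys F = k (Tm - Ts) / D, so D = k (Tm - Ts) / F. This is a simplified illustration with values of our own choosing, not the authors’ model, which treats the thermal physics in more detail.

```python
# Simplified conductive heat balance for an ice shell over a subsurface ocean.
# Thicker shells result from colder surfaces or weaker internal heating.
K_ICE = 2.2  # thermal conductivity of ice near melting, W/(m K)

def shell_thickness_m(heat_flux_w_m2, surface_temp_k, melt_temp_k=273.0):
    """Ice thickness (m) at which conduction balances the internal heat flux."""
    return K_ICE * (melt_temp_k - surface_temp_k) / heat_flux_w_m2

# Illustrative numbers only (not the paper's):
print(shell_thickness_m(0.05, 100.0))  # modest heating, cold surface -> thick shell
print(shell_thickness_m(0.50, 200.0))  # strong heating, warmer surface -> thin shell
```

The same balance explains why Proxima b, with strong tidal heating, is predicted to have a far thinner shell than a weakly heated world.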

The new estimates put surface temperatures up to 33 degrees Celsius (60 degrees Fahrenheit) colder than previous ones.

Artist’s impression of the planetary system LHS 1140. Image credit: Sci.News.

Estimated ice shell thicknesses ranged from approximately 58 m (190 feet) for Proxima b and 1.6 km (1 mile) for LHS 1140b to 38.6 km (24 miles) for MOA-2007-BLG-192Lb, compared with an estimated average of 29 km (18 miles) for Europa.

Estimated geyser activity ranged from only 8 kg/s for Kepler-441b to 290,000 kg/s for LHS 1140b and 6 million kg/s for Proxima b, compared with about 2,000 kg/s for Europa.

“Our models predict that oceans could be found relatively close to the surfaces of Proxima b and LHS 1140b, and that their geyser activity could exceed Europa’s by hundreds to thousands of times. Because of this, telescopes have the best chance of detecting geological activity on these planets,” said Dr. Quick.

“This activity could be seen when an exoplanet passes in front of its star, as certain colors of the star’s light can be dimmed or blocked by water vapor from the geysers.”

“If water vapor is detected sporadically and the amount of water vapor detected changes over time, it would suggest the presence of a cryovolcanic eruption.”

“Water may contain other elements and compounds, which could reveal whether it can support life.”

“Elements and compounds absorb light at certain characteristic colors, so analyzing the starlight would allow scientists to determine the composition of the geysers and assess the planet’s potential habitability.”

A paper on the findings was published in the Astrophysical Journal.

_____

Lynnae C. Quick et al. 2023. Prospects for Cryovolcanic Activity on Cold Ocean Planets. ApJ 956, 29; doi: 10.3847/1538-4357/ace9b6

Source: www.sci.news

The Threat of Cool Star’s Strong Winds to Exoplanets

Artist’s illustration of a stellar planetary system, clearly showing the stellar wind surrounding the star and its effect on the planet’s atmosphere. Credit: AIP / K. Riebe / J. Fohlmeister

A groundbreaking study reveals that cool stars with strong magnetic fields generate powerful stellar winds, providing important information for assessing the habitability of exoplanetary systems.

A study led by scientists at the Leibniz Institute for Astrophysics Potsdam (AIP) uses cutting-edge numerical simulations to systematically characterize, for the first time, the properties of stellar winds across a sample of cool stars. The researchers found that stars with stronger magnetic fields generate stronger winds. These winds create unfavorable conditions for the survival of planetary atmospheres and thus affect the habitability of these systems.

Cool star classification

The Sun belongs to one of the most abundant classes of stars in the universe, known as “cool stars.” These stars are divided into four categories (F-type, G-type, K-type, and M-type) that differ in size, temperature, and brightness. The Sun is a fairly average star and belongs to category G. Stars brighter and larger than the Sun belong to category F, while K stars are slightly smaller and cooler than the Sun. The smallest and faintest are the M stars, also known as “red dwarfs” because of the reddish color in which they emit most of their light.

Solar wind and its effects

Satellite observations have revealed that, apart from light, the Sun continuously emits a stream of particles known as the solar wind. These winds travel through interplanetary space and interact with the planets of our solar system, including Earth. The beautiful displays of the aurora near the North and South Poles are produced by this interaction. But these winds can also be harmful: they can erode a planet’s atmosphere, as happened on Mars.

We know a lot about the solar wind, thanks in part to missions like Solar Orbiter, but the same is not true for other cool stars. The problem is that we cannot see these stellar winds directly, so we are limited to studying their effects on the thin gas that fills the space between stars. However, this approach has limitations and can only be applied to a small number of stars. This has encouraged the use of computer simulations and models to predict properties of stellar winds that cannot yet be observed directly.

Pioneering research on the properties of stellar winds

In this study, doctoral student Judy Chebly and scientist Julián D. Alvarado-Gómez of AIP’s Stellar Physics and Exoplanets section, assisted by section head Dr. Katja Poppenhäger and in collaboration with Cecilia Garraffo of the Center for Astrophysics | Harvard & Smithsonian, conducted the first systematic study of the expected stellar wind properties of F-, G-, K-, and M-type stars.

To this end, they performed numerical simulations using one of the most sophisticated models currently available, driven by the observed large-scale magnetic field distributions of 21 well-studied stars. The simulations were run at the AIP and Leibniz-Rechenzentrum (LRZ) supercomputing facilities.

The research team investigated how stellar properties such as gravity, magnetic field strength, and rotation period affect the velocity and density of the wind. The results provide a comprehensive characterization of stellar wind properties across spectral types and, in particular, challenge previous assumptions about stellar wind speeds, indicating that mass-loss rates estimated from observations need to be reconsidered.

In addition, the simulations predict the expected size of the Alfvén surface, the boundary between the stellar corona and the stellar wind. This information is the basis for determining whether planetary systems are subject to strong magnetic star-planet interactions, which can occur when a planet’s orbit crosses into, or is completely embedded within, the Alfvén surface of its host star.
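The Alfvén-surface criterion can be sketched as follows: the Alfvén speed is v_A = B / sqrt(mu0 * rho), and wherever the wind is slower than v_A (sub-Alfvénic), magnetic star-planet interaction becomes possible. The numbers below are hypothetical local values of our own choosing, not results from the paper.

```python
# Sketch of the sub-Alfvenic criterion for magnetic star-planet interaction.
import math

MU0 = 4e-7 * math.pi  # vacuum permeability, H/m

def alfven_speed(b_tesla, rho_kg_m3):
    """Local Alfven speed v_A = B / sqrt(mu0 * rho), in m/s."""
    return b_tesla / math.sqrt(MU0 * rho_kg_m3)

def is_sub_alfvenic(wind_speed_m_s, b_tesla, rho_kg_m3):
    """True if the local wind is slower than the local Alfven speed."""
    return wind_speed_m_s < alfven_speed(b_tesla, rho_kg_m3)

# Hypothetical local field and wind density near a close-in planet:
b = 1e-7    # ~0.001 gauss
rho = 1e-18  # wind mass density, kg/m^3
print(alfven_speed(b, rho))            # local Alfven speed, m/s
print(is_sub_alfvenic(4.5e5, b, rho))  # compare with a 450 km/s wind
```

Because the Alfvén speed falls off with distance, only close-in orbits (common around M dwarfs) tend to dip inside the Alfvén surface.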

Impact on planetary systems

Their findings show that stars with magnetic fields stronger than the Sun’s have faster winds; in some cases, stellar wind speeds can reach up to five times the average solar wind speed (typically 450 km/s). The study also evaluated how strong these winds are in each star’s so-called “habitable zone,” defined as the range of orbital distances at which a rocky exoplanet with Earth-like atmospheric pressure could maintain liquid water on its surface. The researchers found mild conditions around F- and G-type stars, comparable to those Earth experiences around our G-type Sun, and increasingly harsh wind environments around K- and M-type stars. Such intense stellar winds would strongly affect any atmosphere a planet might have.

Broader implications for exoplanet research

This kind of wind-driven atmospheric erosion is well documented in heliophysics for the rocky planets of our solar system, but not for exoplanetary systems, where estimates of stellar winds are required to assess analogous processes. Because no such information had previously been available for F- to M-type main-sequence stars, this study is important from the perspective of habitability.

Although the study presented in this paper was performed on 21 stars, the results are general enough to apply to other cool main sequence stars. This study paves the way for future studies of stellar wind observations and their effects on planetary atmosphere erosion.

Reference: “Quantifying the wind properties of cool main-sequence stars” by Judy J. Chebly, Julián D. Alvarado-Gómez, Katja Poppenhäger, and Cecilia Garraffo, July 19, 2023, Monthly Notices of the Royal Astronomical Society.
DOI: 10.1093/mnras/stad2100

Source: scitechdaily.com

Bloodstain “tails” at crime scenes reveal new forensic evidence

Recent research published in Physics of Fluids by scientists at Boston University and the University of Utah introduces a new dimension of bloodstain analysis. The study focused on the “tail” of a bloodstain, which can provide additional information about the blood droplet’s size, velocity, and impact angle. These findings represent a major advance in forensic science, with implications for crime scene reconstruction and the verification of eyewitness testimony.

New research in forensic science has revealed that the “tail” of a bloodstain provides important information about the origin of the blood droplet, enhancing crime scene analysis and evidence interpretation.

Forensic science has taken the public imagination by storm, as evidenced by the abundance of “true crime” media over the past decade or so. Nearly everyone now knows that evidence such as blood left at a crime scene can reveal key information for investigating and understanding the circumstances of a crime, and that scientific methods can help interpret that information.

In Physics of Fluids, published by AIP Publishing, a group of scientists from Boston University and the University of Utah demonstrated how bloodstains can yield even more valuable details than those typically collected by detectives, forensic scientists, and crime scene investigators. The researchers examined the protrusions that extend beyond the boundary of an oval bloodstain and studied how these “tails” form.

“These protrusions are typically only used to figure out the direction in which the droplet was moving, and are otherwise ignored,” said author James Bird.

Within a few milliseconds, a tiny droplet of blood impacts a solid surface, forming the shape of a stain. Of particular interest is the protrusion on the right side that deviates from the boundary of the oval stain. Credit: James C. Bird

Previous studies have mainly focused on large blood droplets falling vertically onto flat or inclined surfaces, where gravity can distort the shape of the tail and obscure it. In contrast, the new study involved a series of high-speed experiments in which droplets of human blood less than a millimeter in diameter impacted horizontal surfaces at various angles.

“We showed that the precise flow that determines the length of the tail is different from the flow that is responsible for the size and shape of the oval part of the stain,” Bird said. “In other words, the tail length contains additional, independent information that helps analysts reconstruct where the blood drop actually came from.”

Indeed, the tail length may reflect the size, impact velocity, and impact angle of the blood droplet that formed the stain. Measuring multiple bloodstains within a stain pattern allows the trajectories of the droplets to be traced back to their presumed origin.
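The textbook step of that reconstruction, which predates this study, relates a stain’s ellipse to its impact angle via sin(alpha) = width / length; the new work adds tail length as an independent constraint on top of it. A minimal sketch of the classic ellipse step:

```python
# Classic bloodstain-pattern relation: a droplet striking a surface at angle
# alpha leaves an elliptical stain whose axes satisfy sin(alpha) = width/length.
import math

def impact_angle_deg(width_mm, length_mm):
    """Impact angle (degrees) from the ellipse axes of a single stain."""
    return math.degrees(math.asin(width_mm / length_mm))

# A stain twice as long as it is wide implies a 30-degree impact angle.
print(impact_angle_deg(4.0, 8.0))
```

Combining such angles from several stains lets analysts triangulate an area of origin; tail length would add an independent handle on droplet speed and size.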

Although their analysis used only horizontal surfaces to examine impact dynamics, Bird and colleagues hope it will spark further research focused on the tail length of bloodstain patterns. They believe that incorporating tail length into standard bloodstain analysis will provide more robust evidence.

“Knowing the origin of bloodstains at a crime scene can help detectives determine whether the victim was standing or sitting, and can corroborate or challenge eyewitness testimony,” Bird said.

Reference: “Bloodstain tails: Asymmetry aids reconstruction of oblique impacts” by Garam Lee, Daniel Attinger, Kenneth F. Martin, Samira Shiri, and James C. Bird, November 21, 2023, Physics of Fluids.
DOI: 10.1063/5.0170124

Source: scitechdaily.com

Effectiveness of coronavirus vaccines wanes over time, study finds

A study by the UK Health Security Agency that analyzed more than 10 million coronavirus cases found that vaccination significantly reduced the risk of death, with the greatest benefit seen within six months of vaccination. The results, published in the Journal of the Royal Society of Medicine, support the success of vaccination programs and the need for booster doses. Credit: SciTechDaily.com

According to the UK Health Security Agency study, COVID-19 vaccination significantly reduces the risk of death, especially within six months of vaccination, highlighting the importance of booster doses.

The risk of dying from COVID-19 is significantly reduced after vaccination, but this protection wanes after six months, providing evidence for continuing to offer booster shots, the new study found.

Researchers from the UK Health Security Agency (UKHSA) analyzed more than 10 million coronavirus infections in adults between May 2020 and February 2022. The results appear in the Journal of the Royal Society of Medicine (JRSM).

Vaccination and mortality reduction

Cross-referencing vaccination status with the case fatality risk (CFR), the proportion of cases that result in death, revealed a clear association between vaccination and lower mortality. Notably, the study highlights a critical window, within six months of the last vaccination, in which the CFR was consistently lowest across all age groups. After this period, the protective effect began to wane and the CFR increased.

Noteworthy findings in the elderly

The study highlights that the COVID-19 vaccination program has been successful in reducing mortality rates.

Among adults aged 50 years and older, CFR was 10 times higher among those who had not been vaccinated (6.3%) compared with those who had received the vaccine within 6 months of testing positive (0.6%). The study also found a sharp decline in CFR in early 2021, coinciding with the initial vaccine rollout.
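The case fatality risk arithmetic behind that comparison can be sketched as follows. The counts here are hypothetical round numbers; only the percentages come from the study.

```python
# Case fatality risk (CFR): the proportion of cases that result in death.
def cfr(deaths, cases):
    return deaths / cases

unvaccinated = cfr(63, 1_000)        # 6.3%
recently_vaccinated = cfr(6, 1_000)  # 0.6%, vaccinated within 6 months
print(unvaccinated / recently_vaccinated)  # roughly a 10-fold difference
```

Note that comparing raw CFRs across groups does not adjust for age or health differences between them, which is why the full study stratifies by age group.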

Florence Halford, of UKHSA’s COVID-19 Vaccines and Epidemiology Division, said: “The risk of dying from COVID-19 is reduced after vaccination, and was lowest across all age groups among those vaccinated within six months of the sample collection date. This provides evidence for the continuation of booster doses in older age groups.”

Reference: “Temporal changes in the risk of adult mortality from COVID-19 after vaccination in the UK from May 2020 to February 2022: a national surveillance study” by Florence Halford, Kathryn Yates, Tom Clare, Jamie Lopez Bernal, Megan Karl, and Hester Allen, December 13, 2023, Journal of the Royal Society of Medicine.
DOI: 10.1177/01410768231216332

Source: scitechdaily.com

Archaeologists in Mongolia uncover ancient wooden saddle dating back 1,600 years

New archaeological discoveries in Mongolia show that, despite a fragmentary archaeological record, the horse cultures of the eastern Eurasian steppes adopted framed saddles and stirrups early, at least by the turn of the 5th century AD. The 1,600-year-old saddle discovered at Urd Ulaan Uneet is one of the earliest known examples of a wooden-framed saddle, and it shows evidence both of local production and of links to earlier saddle traditions. A recent discovery at Khukh Nuur suggests that stirrups were in use on the Mongolian steppe at the same time they first appeared elsewhere in East Asia.

Birch composite frame saddle (top left) from Urd Ulaan Uneet, Mongolia, and an artist’s reconstruction. Image credit: P. Lopez Calle.

Horseback riding appears to have seen little use as a regular mode of transportation until the late second or early first millennium BC, although some archaeological data suggest that horses may have been ridden on the Eastern European steppe by the early second millennium BC.

Early iconography, written sources, and archaeological finds indicate that in western Eurasia these first horsemen rode essentially bareback, using simple blankets or soft pads that separated rider and horse, with their legs hanging free.

The Greek writer and soldier Xenophon, writing in the 4th century BC, described the Greek tradition and best practices for cavalry riding: riding bareback, gripping the horse only with the upper thighs, letting the lower legs hang free, and holding the mane for extra security.

Despite their near-ubiquitous use among modern horsemen, neither stirrups nor true saddles appear to have been used by early equestrians.

The earliest direct evidence of equid riding comes from riders of donkeys and donkey hybrids in Mesopotamia and the Levant during the third millennium BC.

By the middle of the first millennium BC, as cavalry emerged across Eurasia, soft padded saddles made of leather and stuffed with fur, textiles, and other materials, secured to the horse with a girth strap, were being adopted in the Eurasian interior.

These early saddles were sometimes reinforced with wooden or horn supports, and sometimes secured to the horse’s chest or hindquarters with breast straps or cruppers.

Throughout Eurasia, by the beginning of the first century AD, simple saddles were adopted for greater safety.

In western Eurasia, Roman military saddles incorporated four large “horns” as grips to increase stability for mounted soldiers. They may also have contained hard internal components, though this is debated.

Early semi-structured saddles probably provided greater comfort and safety for rider and horse, and allowed mounted, armored soldiers to wield weapons and swords more effectively.

These innovations in saddle stability allowed riders to withstand collisions and carry heavier arms, helping heavy cavalry replace chariots on the battlefield throughout Eurasia by the end of the first millennium BC.

In East Asia, parallel developments were underway towards structured saddles.

Excavations of tombs of the Xiongnu (c. 200 BC to c. 100 AD), the first steppe empire of Mongolia, have revealed that padded saddles were usually fitted with a crupper and/or breast strap to hold the saddle in place, and that hard pommel and cantle components were also commonly used.

By the 6th century, in East and Central Asia, these early saddles had been replaced by sophisticated composite frame saddles used together with paired metal stirrups.

“Ultimately, the technologies that emerged from Mongolia had a domino effect that shaped today’s American horse culture, particularly its saddle and stirrup traditions,” said William Taylor, an archaeologist at the University of Colorado Boulder.

“But these insights come at a time when Mongolia’s horse culture is beginning to disappear,” added Dr. Jamsranjav Bayarsaikhan, an archaeologist at the Max Planck Institute for the Science of Human History.

“Horses not only influenced the history of the region, but also left a deep mark on the art and worldview of the nomadic Mongolians.”

“However, the age of technology is slowly erasing the culture and use of horses. On the Mongolian steppe, horse-riding herders are increasingly being replaced by motorcycles.”

In April 2015, Dr. Bayarsaikhan and his colleagues at the National Museum of Mongolia received a report from the police that the Urd Ulaan Uneet cave burial site in Myangad soum, Khovd province, had been disturbed by looters.

Police seized some organic material that was well preserved in the cave’s dry environment.

An intact wooden saddle was also recovered from the Urd Ulaan Uneet cave.

The saddle was made of about six birch pieces held together with wooden nails.

The saddle, trimmed in black with traces of red paint, retains two leather straps that may once have supported stirrups.

Archaeologists have not been able to definitively trace where those materials came from. However, birch trees commonly grow in the Altai Mountains of Mongolia, suggesting that local people were not trading saddles, but were making them themselves.

“Since the early days of horseback riding, people have used pads, the precursors of saddles, to ride their horses more comfortably,” Dr. Taylor said.

“The combination of a sturdier wooden frame saddle and stirrups opened up a new range of what people could do with their horses.”

“One of the things they created was heavy cavalry and fierce fighting on horseback. Think of the jousting of medieval Europe.”

“In the centuries after the Mongol saddle was made, this type of tool quickly spread throughout western Asia and into the early Islamic world.”

“There, cavalry was the key to conquering and trading with the Mediterranean region and much of North Africa.”

“But where it all started is less clear. Archaeologists have usually placed the birthplace of the first frame saddles and stirrups in what is now China, with some finds dating to around the 5th century AD or earlier.”

“But our research complicates that picture: Mongolia may have been among the first to adopt these new technologies, or may actually be where the innovations first took place. And this is not the only evidence suggesting so.”

“Mongolia’s place in this history may have long been underestimated because of the region’s geography.”

“The country’s mountainous regions have some of the lowest population densities on earth, making it difficult to encounter and analyze important archaeological finds.”

“Mongolia is one of the few countries that has preserved horse culture from ancient times to modern times,” said Dr. Bayarsaikhan.

“However, scientific understanding of the origins of this culture is still incomplete.”

The team’s findings were published this month in the journal Antiquity.

_____

Jamsranjav Bayarsaikhan et al. Origins of saddles and horse riding technology in East Asia: discoveries from the Altai, Mongolia. Antiquity, published online December 12, 2023; doi: 10.15184/aqy.2023.172

Source: www.sci.news

The surprising evolutionary advantage of aging: Why do we age?

Researchers used computer models to investigate the evolutionary role of aging. Challenging the notion that aging has no positive effects, they suggest that aging may promote evolution in a changing environment, thereby benefiting subsequent generations. Their findings indicate that aging may be an advantageous trait favored by natural selection. Credit: SciTechDaily.com

The mystery of aging has fascinated people for thousands of years. Because aging is usually associated with a gradual decline in most bodily functions, many people would do almost anything to stop or reverse the process. Aging is a natural part of life, yet biologists understand surprisingly little about how it emerged in evolution. It is not even clear that aging is inevitable: some organisms never seem to age at all, and there is also a phenomenon known as negative aging, or rejuvenation — in some turtles, vital functions improve with age.

Studying the evolutionary role of aging

Researchers at the Institute of Evolution, led by Eörs Szathmáry, set out to test a previously proposed but unproven theory of aging. This theory suggests that, under the right circumstances, evolution can favor the spread of genes that control aging.

To test their hypothesis, the researchers used a computer model they developed. This model is an algorithm that allows scientists to simulate long-term processes in populations of organisms and genes in a controlled environment. Essentially, such models allow you to run evolutionary scenarios and get results in hours instead of millions of years. Modern evolutionary research is unthinkable without computer modeling.
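The article does not give the model itself, but the kind of long-term population simulation it describes can be sketched in a few lines of Python. This is a toy illustration under stated assumptions (tournament selection, a steadily drifting optimum, death by accident and, optionally, by old age) — not the authors' algorithm:

```python
import random

def simulate(max_age, n=200, steps=300, accident=0.02, mut=0.05, seed=1):
    """Toy population whose trait tracks a drifting environmental optimum.

    max_age=None means no senescence: individuals die only by accident.
    A finite max_age imposes aging: death is certain at that age.
    """
    rng = random.Random(seed)
    pop = [{"trait": 0.0, "age": 0} for _ in range(n)]
    optimum = 0.0
    for _ in range(steps):
        optimum += 0.01  # the environment changes slowly but steadily
        survivors = []
        for ind in pop:
            ind["age"] += 1
            senescent = max_age is not None and ind["age"] >= max_age
            if not senescent and rng.random() >= accident:
                survivors.append(ind)
        # Refill vacancies: of two random survivors, the one whose trait is
        # closer to the current optimum reproduces (with mutation).
        while len(survivors) < n:
            a, b = rng.sample(survivors, 2)
            parent = min(a, b, key=lambda i: abs(i["trait"] - optimum))
            survivors.append({"trait": parent["trait"] + rng.gauss(0, mut),
                              "age": 0})
        pop = survivors
    mean_trait = sum(ind["trait"] for ind in pop) / n
    return mean_trait, optimum
```

Comparing how closely `mean_trait` tracks the final optimum for `simulate(max_age=20)` versus `simulate(max_age=None)` is the spirit of the comparison the researchers describe, compressed to its bare bones.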

Exploring the purpose of aging

The basic research question was simple: does aging have any meaning? Does it serve some evolutionary function, or is it truly just a bitter and deadly byproduct of life? “If there is selection for aging, then aging may have an evolutionary function. Our study aimed to reveal this selection,” says Eörs Szathmáry. “According to the classical explanation, aging appears in a population even without selection for it. Because individuals sooner or later die even without aging (from disease or accidents), genes whose negative effects are confined to older individuals can accumulate, and these cause aging. In this view, aging is only a side effect of evolution and has no adaptive function.”

Challenging common sense

During the last century, several evolutionary theories were formulated to explain aging as inevitable and without an active function, invoking different biological mechanisms. Although some scientists accepted this assumption as fact, the discovery of organisms that do not age led more and more researchers to question the inevitability of aging and to suggest that it might have benefits as well.

“The evolutionary biology community has accepted that classical non-adaptive theories of aging cannot explain all aging patterns in nature, which means the explanation of aging has once again become an open question,” Szathmáry said. “Alternative adaptive theories offer a solution by proposing positive effects of aging. For example, aging and death may be advantageous in a changing environment, because they reduce the competition that would otherwise prevent the survival and reproduction of fitter offspring with a better genetic makeup.”

However, this scenario holds only if an individual is surrounded primarily by relatives. Otherwise, through sexual reproduction, non-senescent individuals would “steal” the better genes (those better adapted to environmental change) from aging members of the population, erasing the advantage of senescence.

Aging as a catalyst for evolution

The Hungarian biologists ran the model and found that aging can indeed accelerate evolution — an advantage in a changing world. Faster adaptation lets a population find suitable traits sooner, supporting the survival and spread of its genes in subsequent generations. Aging can therefore become a highly advantageous trait favored by natural selection.

Reference: András Szilágyi, Tamás Czárán, Mauro Santos and Eörs Szathmáry, “Directional selection combined with kin selection favors the establishment of senescence,” October 23, 2023, BMC Biology.
DOI: 10.1186/s12915-023-01716-w

Funding: National Agency for Research, Development and Innovation (Hungary), Bolyai János Research Fellowship of the Hungarian Academy of Sciences, New National Excellence Program of the Ministry of Culture and Innovation, Ministry of Science and Innovation, Autonomous Region of Catalonia 2021 Special Guest Scientist Volkswagen Foundation, Hungary Fellowship Program of the Academy of Sciences (Initiative “Leben?

Source: scitechdaily.com

The Emergence of the Anthropocene Era

New research supports the concept of the Anthropocene, a proposed geological epoch defined by the significant impact humans have had on Earth. The study used fossil pollen data to analyze changes in North American vegetation since the end of the Pleistocene. The findings show that recent vegetation changes are comparable to those observed during the last epoch transition, indicating changes in ecosystem function significant enough to justify the designation of a new epoch.

The researchers determined that human activity shaped the environment as much as the retreat of glaciers at the end of the Ice Age.

Scientists have long debated the Anthropocene, a proposed unit of geological time that corresponds to the most recent epoch in history. It is characterized by the enormous impact humans have on the earth.

Are we living in the Anthropocene? If so, when did it start?

In a research paper published this month in Proceedings of the National Academy of Sciences, Dr. Trisha Spanbauer of the University of Toledo and Dr. M. Alison Stegner of Stanford University lend credence to the argument for its existence. The pair analyzed open-source data to track changes in vegetation across North America since the end of the Pleistocene and concluded that humans have altered the landscape as much as the retreat of glaciers at the end of the Ice Age.

research method

“As a paleolimnologist, I’m very interested in what the past can tell us about the future,” said Spanbauer, an assistant professor in the Department of Environmental Sciences. “Biological changes have been used to delimit epochs in the past, so this analysis provides valuable context for understanding whether what we see today is essentially the same in scale as what happened during the transition between the Pleistocene and the Holocene.”

Spanbauer and Stegner used the Neotoma Paleoecology Database, a community-curated repository of multi-species paleoecological data. They specifically looked at fossil pollen data from 386 sediment core records taken from lakes across North America.

Sediment cores are samples taken from the bottom of a lake that preserve the order in which sediments were deposited. Spanbauer and Stegner examined samples reaching back to the late Pleistocene, about 14,000 years ago.

Analysis of ecological change

They analyzed the data according to seven indicators: taxonomic richness, meaning the diversity of pollen types (a proxy for the species present in each sample); first-occurrence and last-occurrence events, which measure how often taxa appear in and disappear from the fossil record; short-term gains and losses of taxa; and abrupt community change. They organized data points into 250-year bins at both continental and regional scales, incorporated age-model uncertainties, and produced conservative estimates to account for differences in sample size.
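As a concrete picture of two of these indicators, here is a minimal sketch of counting first- and last-occurrence events from presence/absence data in time bins. The taxa and data are hypothetical illustrations, not the Neotoma database format:

```python
def occurrence_events(record):
    """record maps taxon -> list of presence booleans, oldest bin first.

    Returns per-bin counts of first occurrences (a taxon's earliest
    appearance in the record) and last occurrences (its final appearance).
    """
    n_bins = len(next(iter(record.values())))
    firsts, lasts = [0] * n_bins, [0] * n_bins
    for presence in record.values():
        present_bins = [i for i, p in enumerate(presence) if p]
        if present_bins:
            firsts[present_bins[0]] += 1
            lasts[present_bins[-1]] += 1
    return firsts, lasts

# Hypothetical pollen presence across four 250-year bins
pollen = {
    "spruce":  [True, True, True, False],   # drops out of the record
    "oak":     [False, True, True, True],
    "ragweed": [False, False, True, True],  # a late arrival
}
firsts, lasts = occurrence_events(pollen)
# firsts == [1, 1, 1, 0], lasts == [0, 0, 1, 2]
```

A spike in either count within a bin, relative to background rates, is the kind of signal the authors compare between the Pleistocene–Holocene transition and recent centuries.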

Their results show that vegetation changes over the past several hundred years are comparable to those associated with the last epoch transition, including increases in first and last occurrences and abrupt community shifts.

“The power of a database like this is that you can ask questions about macroscale ecological change,” says Spanbauer. “While scientists have documented the effects of human activities on single species and on biodiversity generally, our study places these observations in a broader context and shows corroborating changes in ecosystem function.”

Reference: “North American pollen record provides evidence of large-scale ecological change in the Anthropocene,” by M. Alison Stegner and Trisha L. Spanbauer, October 16, 2023. Proceedings of the National Academy of Sciences.
DOI: 10.1073/pnas.2306815120

Source: scitechdaily.com

Astronomers make breakthrough discovery in planet formation, conflicting with theoretical predictions

Recent observations of the young star DG Tauri reveal a smooth protoplanetary disk in which no planets have yet formed, suggesting that the system is on the brink of this process. The findings show unexpected dust grain growth patterns and provide new insights into the early stages of planet formation. Credit: SciTechDaily.com

Astronomers have become very good at finding signs of planet formation around stars. However, to fully understand planet formation, it is important to examine cases where this process has not yet begun.

Looking for something and not finding it can sometimes be even harder than finding it, but new detailed observations of the young star DG Tauri show a smooth protoplanetary disk with no signs of planet formation. This absence of detected planet formation may indicate that DG Tauri is on the eve of planet formation.

Image of the radio emission intensity from the disk around DG Tauri observed with ALMA. Rings have not yet formed within the disk, suggesting that planets are about to form. Credit: ALMA (ESO/NAOJ/NRAO), S. Ohashi et al.

Protoplanetary disk and planet growth

Planets form around protostars, which are young stars that are still forming, in disks of gas and dust known as protoplanetary disks. Planets grow so slowly that it is impossible to observe their evolution in situ. Therefore, astronomers observe many protostars at slightly different stages of planet formation to build theoretical understanding.

This time, an international research team led by Satoshi Ohashi of the National Astronomical Observatory of Japan (NAOJ) used the Atacama Large Millimeter/submillimeter Array (ALMA) to make high-resolution observations of the protoplanetary disk around the relatively young protostar DG Tauri, located 410 light-years away in the direction of the constellation Taurus. The researchers found that DG Tauri has a smooth protoplanetary disk without the rings that would indicate planet formation. This led the team to conclude that the DG Tauri system could begin forming planets in the future.

Unexpected discoveries and future research

The researchers found that, at this pre-planet-formation stage, dust grains within 40 astronomical units (about twice the size of the orbit of Uranus) of the central protostar are still small, but beyond this radius the dust grains begin to grow — the first step in planet formation. This runs counter to the theoretical expectation that planet formation begins in the inner regions of the disk.

These results provide surprising new information about dust distribution and other conditions at the beginning of planet formation. Studying more examples in the future will further deepen our understanding of planet formation.

Reference: “Dust Enrichment and Grain Growth in a Smooth Disk around the DG Tau Protostar Revealed by ALMA Triple-band Frequency Observations,” Satoshi Ohashi, Munetake Momose, Akimasa Kataoka, Aya E. Higuchi, Takashi Tsukagoshi, Takahiro Ueda, Claudio Codella, Linda Podio, Tomoyuki Hanawa, Nami Sakai, Hiroshi Kobayashi, Satoshi Okuzumi and Hidekazu Tanaka, August 28, 2023, The Astrophysical Journal.
DOI: 10.3847/1538-4357/ace9b9

This research was funded by the Japan Society for the Promotion of Science, the German Research Foundation, and the European Union.

Source: scitechdaily.com

The ultrasound patch developed by MIT accurately detects bladder fullness

MIT researchers have developed a wearable ultrasound patch that can non-invasively image internal organs, primarily focusing on bladder health. The device eliminates the need for an ultrasound operator or gel and could transform the monitoring of various organ functions and disease detection.

The wearable device is specifically designed to monitor the health of the bladder and kidneys and could be instrumental for early diagnosis of cancers deep within the body.

Designed in the form of a patch, the ultrasound monitor can capture images of organs inside the body without requiring an ultrasound operator or gel application. The patch can accurately image the bladder and determine its fullness, allowing patients with bladder or kidney problems to efficiently monitor the functionality of these organs.

Additionally, the wearable patch has the potential for use in monitoring other organs in the body by adjusting the ultrasound array’s position and signal frequency. This capability could enable the early detection of deep-seated cancers like ovarian cancer.

The researchers behind this groundbreaking technology are based at the Massachusetts Institute of Technology (MIT), and the study has been published in Nature Electronics. Their aim is to develop a series of devices that improve information sharing between clinicians and patients and ultimately shape the future of medical device design.

In an initial study, the wearable ultrasound patch was able to obtain bladder images comparable to traditional ultrasound probes. To advance the clinical application of this technology, the research team is working on a portable device that can be used to view the images.

The MIT team also has aspirations to develop an ultrasound device capable of imaging other deep-seated organs in the body, such as the pancreas, liver, and ovaries. This will involve designing new piezoelectric materials and conducting further research and clinical trials.

Funding for this research was provided by various organizations, including the National Science Foundation, 3M Non-Tenured Faculty Award, Texas Instruments Corporation, and the MIT Media Lab Consortium, among others.

Source: scitechdaily.com

Utilizing DNA from Polar Bear Snow Tracks to Support Conservation Efforts

Researchers have developed a breakthrough method to protect polar bears by analyzing DNA from footprints in the snow. This non-invasive technique can also be applied to other snow-dwelling animals such as lynx and snow leopards, providing a safer and more efficient way to collect data essential to wildlife conservation.

Scientists have discovered a way to capture DNA from snow tracks – a promising non-invasive way to monitor elusive animals like polar bears.

The polar bear is a symbol of the Arctic, an elusive and vulnerable animal. Close monitoring of polar bear populations is critical to their conservation, but polar bears are so difficult to find that critical data on population sizes and connectivity between populations are lacking. Scientists have now developed a helpful new tool: DNA analysis of skin cells shed into bear tracks in the snow.

“Finding polar bears in the Arctic, let alone counting them and understanding how they are coping with climate change, is particularly difficult, expensive, and time-consuming,” said Dr Melanie Lancaster of the World Wildlife Fund’s Global Arctic Program, senior author of the study in Frontiers in Conservation Science.

Innovative forensic techniques in preservation

The scientists were inspired by forensic techniques that can be applied to trace amounts of degraded DNA samples. These techniques eliminate the need to physically capture bears, which can be stressful and dangerous for both bears and humans, and is a concern for some local indigenous communities. Instead, scientists can look at the source of accidentally released DNA: environmental DNA.

A polar bear in Utqiaġvik, Alaska. Credit: Elisabeth Kruger, World Wildlife Fund

“Many Inuit have expressed concerns about invasive research methods,” said article author Elisabeth Kruger of the World Wildlife Fund. “People are concerned about the welfare of individual polar bears and about the health and safety of those who may later harvest the bears. This is one reason we are so excited about new methods like this one: the person collecting the sample never needs to see or even be seen by the polar bear.”

Environmental DNA: a non-invasive tool

A common form of environmental DNA is deposited when animals defecate. However, the quality of this DNA is not always sufficient for the individual-level analysis required for conservation. Furthermore, for territorial animals like the other two species the scientists tested, lynx and snow leopards, collecting faeces can affect the animals’ behavior. So the researchers focused on the skin cells left behind in snowy footprints.

“Tracks usually contain fresh cells, and the DNA in them is intact because of the cold ‘storage’ temperature. DNA from faeces, which has passed through the gut, is far more degraded and therefore harder to work with,” said lead author Dr Micaela Hellström of MIX Research Sweden AB.

Real-world tracking and sampling

The researchers collected snow from individual footprints made by wild polar bears in Alaska and by wild and captive Eurasian lynx in Sweden. They also collected snow from footprints made by captive snow leopards. Materials such as hair, saliva, and mucus were also sampled to verify that the tracks yielded accurate genotypes.

Twenty-four wild polar bear tracks and 44 wild lynx tracks were sampled. The researchers melted and filtered the snow to collect environmental DNA and analyzed its microsatellites. Although the concentration of DNA recovered from footprints collected in the wild was very low, the team recovered nuclear DNA from 87.5% of wild polar bear footprints and 59.1% of wild lynx footprints. They were able to genotype 13 of the wild polar bear samples, identifying 12 different individuals.

They were able to genotype 11% of the lynx footprints, but this percentage increased significantly when scientists examined only footprints sampled by trained personnel. They were able to recover nuclear DNA from 76% of the samples collected by trained personnel and genotype 24% of the samples.

A step-by-step approach

This technology has great potential to inform conservation of these animals, better understand animal populations and behavior, and manage conflicts with humans through accurate animal identification. Although non-invasive sampling has a low success rate, it is easy to collect and can greatly expand sample size.

“We hope this method will be adopted by the polar bear research community, with the participation of hunters, volunteers, and indigenous communities, as a new way to collect information about polar bears,” Lancaster said. “We also hope the method can be extended to other animals that live in snowy environments; we have started by showing that it works for lynx and snow leopards as well.”

Reference: “Capturing environmental DNA from snow tracks of polar bears, lynx, and snow leopards for individual identification,” Micaela Hellström, Elisabeth Kruger, Johan Näslund, Mia Bisther, Anna Edlund, Patrick Hernvall, Viktor Birgersson, Rafael Augusto and Melanie L. Lancaster, October 11, 2023, Frontiers in Conservation Science.
DOI: 10.3389/fcosc.2023.1250996

Source: scitechdaily.com

Uncovering the Hidden Physics of Temperature and Radiation


A groundbreaking study investigated the complex relationship between Earth’s surface temperature and emitted longwave radiation, revealing deviations from the expected fourth-power (quartic) pattern. The research improves our understanding of climate sensitivity and the factors that influence it, such as greenhouse gases and atmospheric dynamics. Credit: SciTechDaily.com

Climate science research has revealed new insights into the relationship between surface temperature and emitted longwave radiation, challenging traditional models and improving our understanding of Earth’s climate sensitivity.

Want to know what determines Earth’s climate sensitivity? Recent research published in Advances in Atmospheric Sciences investigates the complex mechanisms that transform the relationship between surface temperature and outgoing longwave radiation (OLR) from fourth-power to sublinear. Led by Dr. Jie Sun of Florida State University, the study elucidates the hidden mechanisms that shape Earth’s climate and offers new insight into why the relationship between temperature and OLR deviates from the fourth-power pattern described by the Stefan-Boltzmann law.

Stefan-Boltzmann law and climate dynamics

What is the Stefan-Boltzmann law? It states that the thermal radiation emitted by a body grows with the fourth power of its temperature. Atmospheric greenhouse gases create a contrast between the heat released by the surface, which follows this fourth-power relationship, and the OLR that actually escapes to space.
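The fourth-power relationship is standard physics and easy to make concrete. The sketch below (not code from the study) evaluates the Stefan-Boltzmann law at Earth’s effective emission temperature (~255 K) and mean surface temperature (~288 K):

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def blackbody_flux(temperature_k):
    """Thermal radiation emitted per unit area: sigma * T**4."""
    return SIGMA * temperature_k ** 4

olr = blackbody_flux(255.0)      # ~240 W/m^2, close to observed OLR
surface = blackbody_flux(288.0)  # ~390 W/m^2 emitted by the surface
trapped = surface - olr          # ~150 W/m^2 retained by greenhouse gases
```

The ~150 W/m² contrast between surface emission and OLR is the greenhouse contrast described above; the study asks how this T⁴ dependence is flattened to a sublinear relationship across the planet.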

Professor Xiaoming Hu of Sun Yat-sen University, corresponding author of the study, explained that greenhouse gases shift the effective radiation-emitting layer, allowing the relationship between surface temperature and OLR to depart from the quartic pattern.

Diagram of the two main processes behind the sublinear relationship between surface temperature and outgoing longwave radiation (OLR). Left: the meridional surface temperature gradient is increased by the greenhouse effect of water vapor. Right: poleward energy transport reroutes part of the OLR from warmer to colder regions. Credit: Ming Cai and Xiaoming Hu

Factors affecting surface temperature and OLR

This study reveals how various factors influence surface temperature and OLR. The water vapor greenhouse effect acts as a magnifying glass, amplifying temperature differences across the Earth’s surface without changing the latitudinal variation of the OLR. This suppresses the nonlinearity between OLR and surface temperature.

Poleward energy transport, on the other hand, acts as an equalizer, harmonizing temperature differences across different regions of the Earth. One by-product of this global heat redistribution is the rerouting of OLR from warmer to colder regions, which reduces the differences in OLR between regions and further suppresses the nonlinearity.

“Understanding these complex climate interactions is like piecing together a puzzle. Each piece brings us closer to deciphering the complexity of Earth’s climate,” said Ming Cai, a professor at Florida State University.

By uncovering these relationships, scientists are making significant progress in understanding Earth’s climate and how its complex components regulate overall climate sensitivity: not just how much energy is emitted, but also where that emission occurs.

Reference: “Sublinear relationship between planetary outgoing longwave radiation and surface temperature in a gray-atmosphere radiative-convective transport climate model,” Jie Sun, Michael Secor, Ming Cai and Xiaoming Hu, November 25, 2023, Advances in Atmospheric Sciences.
DOI: 10.1007/s00376-023-2386-1

Source: scitechdaily.com

Making this simple dietary change may impact your blood pressure

New research shows that cutting back on salt can significantly lower your blood pressure, whether you have hypertension or are on medication. The study, which included 213 participants from diverse backgrounds, found that a low-salt diet lowered systolic blood pressure by an average of 7 mmHg. These results apply to a wide range of individuals and suggest that salt restriction is as effective as common hypertension medications in controlling blood pressure.

Research has shown that a low-salt diet significantly lowers blood pressure and is beneficial for people with and without high blood pressure, and even for people taking blood pressure medications.

  • Reducing sodium intake significantly lowered blood pressure in most people, even those who were already taking blood pressure medications.
  • The findings suggest that reducing sodium intake may have health benefits for a wide range of people.

Half of Americans have high blood pressure. Blood pressure is considered high if the systolic reading (the upper number: the pressure as blood is pumped out of the heart) is consistently 130 mmHg or higher, or if the diastolic reading (the lower number: the pressure between heartbeats, while the heart refills with blood) is 80 mmHg or higher.
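Those thresholds can be written down directly. A minimal sketch (illustrative only, not medical software; a single reading cannot establish the “consistently elevated” requirement):

```python
def is_high(systolic_mmhg, diastolic_mmhg):
    """High blood pressure per the thresholds above: systolic >= 130 mmHg
    or diastolic >= 80 mmHg (readings must be consistently elevated)."""
    return systolic_mmhg >= 130 or diastolic_mmhg >= 80

is_high(118, 76)  # -> False: both numbers below threshold
is_high(135, 75)  # -> True: high on the systolic number alone
```

Against these cutoffs, the average 7 mmHg systolic drop this study reports is large enough to move someone from, say, 135/75 (high) to 128/75 (below the systolic threshold).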

Role of sodium in hypertension

Sodium is essential for the human body, but too much sodium can cause high blood pressure. However, blood pressure sensitivity to sodium varies from person to person. This makes it difficult to determine what counts as a healthy amount of sodium in someone’s diet. Also, most studies on low-salt diets exclude people who take blood pressure-lowering medications. Therefore, it is unclear how much salt reduction affects people taking these drugs.

Research on dietary sodium and blood pressure

An NIH-funded research team led by Dr. Deepak Gupta of Vanderbilt University Medical Center studied the effects of dietary sodium on blood pressure in 213 people (65% female, 64% black) between the ages of 50 and 75. Both normotensive and hypertensive participants were enrolled from April 2021 to February 2023 in Chicago, Illinois, and Birmingham, Alabama. Some were taking medication to control high blood pressure.

Participants were randomly assigned to either a high-sodium diet or a low-sodium diet for one week. Those on a high-sodium diet added 2,200 mg of sodium per day to their regular diet. Those on a low-salt diet were provided with a week’s worth of low-sodium meals, snacks, and drinks. This diet provided an average of 500 mg of sodium per day.

The researchers measured the participants’ blood pressure after a week. Each participant was then switched to the other diet for one week, and blood pressure was measured again. Blood pressure values were averages measured over a 24-hour period during normal daily activities. The results appeared in JAMA on November 11, 2023.

Important discoveries and implications

Almost 75% of participants had lower systolic blood pressure on the low-sodium diet than on the high-sodium diet, with an average decrease of 7 mmHg. Compared to a regular diet, the low-sodium diet lowered systolic blood pressure in 72% of participants, with an average drop of 6 mmHg. The effect of dietary sodium did not depend on whether a person had high blood pressure in the first place. It was also unaffected by whether people were taking medication for high blood pressure.

This reduction in blood pressure can have significant health benefits. This finding supports reducing sodium in the diet to lower blood pressure. The effects of a low-salt diet were similar to those of common first-line drugs for hypertension. The results also suggest that reducing salt intake may help a wide range of people, including those already taking blood pressure-lowering drugs.

“Just as any amount of physical activity is better than none for most people, any reduction in salt from your current usual diet is likely better than none,” says Gupta.

For more information about this study, see New study reveals universal blood pressure-lowering strategy.

Reference: “Effect of Dietary Sodium on Blood Pressure: A Crossover Trial,” Deepak K. Gupta, Cora E. Lewis, Krista A. Varady, Yan Ru Su, Meena S. Madhur, Daniel T. Lackland, Jared P. Reis, Thomas J. Wang, Donald M. Lloyd-Jones and Norrina B. Allen, November 11, 2023, JAMA.
DOI: 10.1001/jama.2023.23651

Funding: NIH’s National Heart, Lung, and Blood Institute (NHLBI), National Institute of Diabetes and Digestive and Kidney Diseases (NIDDK), National Cancer Institute (NCI), and National Center for Advancing Translational Sciences (NCATS); also the American Heart Association.

Source: scitechdaily.com

Chemists at MIT create vibrant organic molecules through synthesis

Researchers at MIT have made a breakthrough in stabilizing acenes, molecules with potential for use in semiconductors and light-emitting diodes. The advance allows acenes to emit light in a range of colors, pointing toward uses in solar cells and energy-efficient screens. Promising for organic light-emitting diodes and solar cells, acenes consist of chains of fused carbon-containing rings with unique optoelectronic properties.

However, the stability of acene has been challenging, as the length of the molecule determines the color of light it emits, and longer acenes tend to be less stable and therefore not widely used in light-emitting applications. Researchers at MIT have devised a new approach to address this issue, making the molecules more stable in order to synthesize acenes of various lengths and build molecules that emit red, orange, yellow, green, or blue light. This innovative approach allowed them to create acenes with positive charges that possess increased stability and unique electronic properties, making them suitable for a wide range of applications.

The new, stable acenes, doped with boron and nitrogen, can now emit light in different colors depending on their length and the type of chemical group attached to the carbodicarbene. This is a significant development, as traditional acene molecules tend to emit only blue light, while the ability to emit red light is vital for many applications, including biological processes such as imaging. The new acenes also exhibit stability in both air and water, a noteworthy feature that opens up possibilities for medical applications.

Furthermore, researchers are exploring the potential of acenes in various derivative forms and incorporating them into technologies such as solar cells and light-emitting diodes for use in screens. By combining creative research with non-traditional paradigms, the research holds promising implications for the development of air- and photostable luminescent materials and compact energy harvesting devices. This innovative work was supported by the Arnold and Mabel Beckman Foundation and the National Science Foundation’s Major Research Instrumentation Program.

Source: scitechdaily.com

Transforming Cardboard Waste into Sustainable Foam: The Packaging Revolutionized

This cardboard-based foam reinforced with gelatin has the potential to make packaging materials more sustainable. Credit: Jinsheng Gou

Eco-friendly cushioning foam made from recycled cardboard offers a stronger, more insulating alternative to traditional packaging materials, providing a sustainable solution for the shipping industry.

The holiday season is in full swing, and gifts of all shapes and sizes are being shipped all over the world. All that packaging generates large amounts of waste, including cardboard boxes and plastic-based foam cushioning such as Styrofoam™. Rather than throw those boxes away, researchers reporting in ACS Sustainable Chemistry & Engineering developed cushioning foam from cardboard waste. Their upcycled material was stronger and more insulating than traditional plastic-based foam cushioning.

Turn common household waste into eco-friendly materials

Of all the types of trash that accumulate in a home, paper waste is one of the most common. Especially as internet shopping has exploded in popularity, everything from newspapers and junk mail to cardboard envelopes and boxes can end up piling up. Researchers are interested in turning these containers and paper scraps into something else useful: durable, lightweight packaging.

Today, molded cushioning materials such as Styrofoam are typically used to securely fit electronics and toys inside boxes. Lightweight cellulose aerogels are a possible sustainable alternative, but current methods of producing aerogels from waste paper require several chemical pretreatment steps. So Jinsheng Gou and colleagues wanted to find an easier way to create a waste paper-based foam material that could withstand even the toughest deliveries.

Innovative cardboard-based foam for added protection

To create the foam, the team crushed cardboard scraps in a blender to create a pulp, which they mixed with either gelatin or polyvinyl acetate (PVA) adhesive. The mixture was poured into molds, refrigerated, and then freeze-dried to form cushioning foam. Both paper-based foams acted as excellent insulators and strong energy absorbers, even better than some plastic foams.

The team then created a more durable version of the wastepaper foam by combining the pulp with gelatin, PVA adhesive, and a silica-based liquid that hardens when force is applied. This version of the cardboard-based foam withstood hammer blows without shattering. The results suggest the foam could be used for high-impact deliveries, such as airdrops without a parachute.

The researchers say their work provides a simple and efficient way to upcycle cardboard to create more environmentally friendly packaging materials.

Reference: “Biodegradable waste paper-based foam with ultra-high energy absorption, good insulation and good cushioning properties” by Bin Zhang, Wenxuan Tao, Ziming Ren, Shiqi Yue, and Jinsheng Gou, November 28, 2023, ACS Sustainable Chemistry & Engineering.
DOI: 10.1021/acssuschemeng.3c06230

The authors acknowledge funding from the Beijing Key Research Institute of Wood Science and Engineering.

Source: scitechdaily.com

The unexpected connection between diet, diabetes, and mental well-being

New research reveals important links between nutrition, diabetes, and mental health. Poor dietary choices can put you at risk for developing type 2 diabetes and mental health problems such as depression and anxiety. Conversely, a diet rich in essential nutrients and low in processed foods can reduce these risks. The findings highlight the importance of informed dietary choices in the management and prevention of diabetes, anxiety, and depression and have implications for public health policy and medical practice.

A new literature review by researchers at the College of Public Health provides new insights into the relationship between nutrition and mental health.

According to the Centers for Disease Control and Prevention, people with diabetes (diabetes mellitus) are two to three times more likely to experience depression than people without diabetes. Current treatments include therapy, medication, or both.

However, understanding of the multifaceted relationship between nutrition, mental health, and DM is relatively new in scientific discussion. Mason researchers sought to learn about the relationship between nutrition, diabetes, and mental health.

The impact of nutrition on diabetes and mental health

Two literature reviews by Associate Professor Raedeh Basiri show that poor nutrition plays a dual role, contributing both to the risk of developing type 2 diabetes and to mental health effects such as anxiety and depression. According to the studies, mental illnesses such as depression and anxiety increase the risk of developing type 2 diabetes, and diabetes is likewise associated with an increased risk of depression and anxiety. Nutritional interventions can help with both of these health issues.

“Our findings highlight that dietary choices play a vital role in reducing the risks associated with both diabetes and mental health. The implications of these findings extend beyond the scientific community, as they are expected to inform public health policies, medical practices, and dietary recommendations that can positively impact people,” said Basiri, lead author of the papers.

Strengthen dietary choices for health and prevention

“This research ultimately aims to enable individuals to make informed, health-promoting dietary choices, serving as a proactive strategy for preventing and managing diabetes, anxiety, and depression,” Basiri said.

More specifically, the research team’s findings provide a comprehensive view of the relationships among dietary patterns, health outcomes, and the important role of eating behavior in the context of type 2 diabetes and mental health.

The research team found that diets rich in fresh fruits and vegetables, whole grains, lean proteins, and low-fat dairy products were associated with a lower risk of both type 2 diabetes and mental health disorders such as depression and anxiety. Conversely, diets high in processed foods were found to have negative effects, increasing the likelihood of developing type 2 diabetes, depression, and anxiety.

The importance of a nutritious diet

Additionally, the researchers found that energy-dense diets lacking essential nutrients such as omega-3 fatty acids, vitamin D, vitamin E, vitamin B6, vitamin B12, folic acid, selenium, chromium, and magnesium are associated with worsening symptoms of mental health conditions and with the development of type 2 diabetes. This relationship highlights the importance of nutrient-dense food choices for overall health and well-being.

“Current scientific evidence highlights the potential benefits of adopting a balanced diet in reducing symptoms of anxiety and depression while enhancing glycemic control in people with diabetes,” said Basiri.

Reference: “Exploring the interrelationships of diabetes, nutrition, anxiety, and depression: Implications for treatment and prevention strategies” by Raedeh Basiri, Blessing Seidu, and Mark Rudich, September 29, 2023, Nutrients.
DOI: 10.3390/nu15194226

Reference: “Key Nutrients for Optimal Glycemic Control and Mental Health in People with Diabetes: A Review of the Evidence” by Raedeh Basiri, Blessing Seidu, and Lawrence J. Cheskin, September 9, 2023, Nutrients.
DOI: 10.3390/nu15183929

Source: scitechdaily.com

The ISS Crew Stay Busy While Waiting for SpaceX’s Dragon to Navigate through Weather Conditions


This night view of southern Europe looks from Milan, Italy, northwest to southeast (bottom right), across the Adriatic Sea to Split, Croatia. At the time this photo was taken, the International Space Station was orbiting 423 miles above eastern France.
Credit: NASA

The Expedition 70 crew continues packing a U.S. cargo spacecraft for departure early next week. The seven residents of the International Space Station (ISS) have also explored virtual reality while servicing various scientific and life support hardware.
NASA and SpaceX have postponed the undocking of the SpaceX Dragon cargo resupply spacecraft from the International Space Station past Sunday, Dec. 17, due to inclement weather as a cold front moves through the splashdown zones off the coast of Florida. The joint team will continue to assess weather conditions to determine the best opportunity for Dragon to autonomously depart the station; the next available opportunity is 5:05 p.m. EST on Monday, Dec. 18.

The vibrant city lights of Tokyo were captured from the International Space Station as it orbited 421 miles above.
Credit: NASA

Weather permitting for Monday’s undocking, coverage of Dragon’s departure will begin at 4:45 p.m. on NASA’s streaming service, on the web, and in the NASA app. Coverage will also be broadcast live on NASA Television, YouTube, and the agency’s website. After re-entering the atmosphere, the spacecraft will splash down off the coast of Florida; that event will not be broadcast on NASA TV.
NASA astronaut Jasmin Moghbeli and JAXA (Japan Aerospace Exploration Agency) astronaut Satoshi Furukawa resumed transferring science-packed cargo freezers from the station’s EXPRESS racks to Dragon. The two activated and configured a scientific freezer inside Dragon, securing stored biological samples for return and analysis on Earth.
Earlier, Moghbeli replaced hardware in the Solution Crystallization Observation Facility, a research device for studying crystal morphology and growth. She also shook mixing tubes containing seed samples for astrobotany research. Furukawa reconnected power and communications units in the combustion research hardware in the Kibo laboratory module.
Palm Jumeirah, an artificial island shaped like a palm tree, highlights the city of Dubai in the United Arab Emirates in this nighttime photo taken from the International Space Station as it orbited 454 miles above the Persian Gulf.
Credit: NASA

ESA (European Space Agency) Commander Andreas Mogensen began his day with an experiment aimed at strengthening computer programming skills and promoting STEM careers for students across the globe. Mogensen then donned virtual reality goggles and watched a 360-degree film for a VR mental care experiment exploring calming effects on the nervous system.
NASA flight engineer Loral O’Hara spent the day on laboratory maintenance throughout the orbiting outpost. She replaced orbital plumbing components, deployed a portable fan inside the Tranquility module, and replaced a broken wireless antenna inside the Unity module.
The space station’s three cosmonauts remained focused on scientific activities and maintenance of orbital systems. After breakfast, flight engineers Oleg Kononenko and Nikolai Chub again scanned their stomachs with an ultrasound device for a Roscosmos study of digestion in space. Kononenko relocated eggs into incubators for a biology experiment, and Chub transferred dismantled life support equipment from the Zarya module to Unity. Flight engineer Konstantin Borisov spent the morning working on orbital plumbing and ended the day photographing and inspecting the windows of the Zvezda service module.

Source: scitechdaily.com

The real cause of the decline of Earth’s most magnificent creatures

New study shows that humans, not climate, caused decline of megafauna 50,000 years ago

New research from Aarhus University confirms that it was humans, not climate, that caused the dramatic decline in large mammal populations over the past 50,000 years. Scientists have long debated whether humans or climate were to blame, but new DNA analysis of 139 extant large mammal species shows that climate cannot explain the decline.

About 100,000 years ago, the first modern humans migrated from Africa, settling in every type of terrain and hunting large animals using clever techniques and weapons. Unfortunately, this led to the extinction of many large mammals during the era of human colonization, and new research reveals that the surviving large mammals also experienced a dramatic decline.

According to Jens Christian Svenning, professor and director of the Danish National Research Foundation’s Center for New Biosphere Ecodynamics at Aarhus University, the populations of nearly all 139 species of large mammals declined about 50,000 years ago. DNA analysis shows that the decline is related to human dispersal rather than climate change.

This study used DNA analysis to map the long-term history of 139 large mammal species that have survived without extinction for the past 50,000 years, and scientists were able to estimate the population size of each species over time. The results are conclusive that human dispersal is the most likely cause of the decline in large mammal populations.

The study also showed that woolly mammoths are a poor example for climate-based models of extinction, as the vast majority of megafauna species that went extinct lived in temperate and tropical regions, not mammoth grasslands. Despite ongoing debate, the evidence strongly points to human activity rather than climate change as the main cause of the dramatic decline in large mammal populations.

Source: scitechdaily.com

NASA revives scientific endeavors in light of gyro challenge

Hubble drifts over Earth after its release by the crew of the Space Shuttle Atlantis on May 19, 2009. Servicing Mission 4 (SM4), the fifth astronaut visit to the Hubble Space Telescope, was an unqualified success, with the crew completing all planned tasks over five spacewalks. Credit: NASA

Following a gyroscope issue, NASA has successfully resumed science operations on the Hubble Space Telescope, and the system is working optimally.

NASA returned the agency’s Hubble Space Telescope to science operations on December 8. The telescope had temporarily suspended science observations on November 23 due to a problem with one of its gyros. The spacecraft is in good health and once again operating using all three of its gyros.

NASA decided to return the agency’s Hubble Space Telescope to science operations after a series of tests to characterize the performance of the gyro that had caused the suspension of science operations.

After analyzing the data, the team determined that science operations could resume under three-gyro control. Based on the performance observed during the tests, the team decided to operate the gyro in a higher-precision mode during science observations. Hubble’s instruments and the observatory itself remain stable and healthy.

Hubble’s two primary cameras, the Wide Field Camera 3 and the Advanced Camera for Surveys, resumed science observations on December 8. The team plans to restore operations of the Cosmic Origins Spectrograph and the Space Telescope Imaging Spectrograph later this month.

Hubble orbits more than 300 miles above Earth as seen from the Space Shuttle. Credit: NASA

About the Hubble Space Telescope

Launched in 1990, the Hubble Space Telescope is a wonder of modern astronomy, orbiting Earth and capturing unprecedented views of the universe. Unlike ground-based telescopes, Hubble operates above the distortions of Earth’s atmosphere, providing clear images of distant galaxies, nebulae, and other celestial phenomena.

Its discoveries have revolutionized our understanding of the universe, from understanding the universe’s accelerating expansion to capturing the most detailed view of the solar system’s planets. Hubble’s longevity and adaptability have made it one of the most important instruments in the history of astronomy, and it continues to push the frontiers of our cosmic knowledge.

Source: scitechdaily.com

Caltech Researchers Introduce Novel Error-Correction Technique for Quantum Computers

Researchers at the California Institute of Technology have developed a quantum erasure device to correct “erasure” errors in quantum computing systems. The technique allows fluorescent error detection and correction by manipulating alkaline earth neutral atoms with laser light “tweezers.” This innovation leads to a 10-fold increase in the entanglement rate of Rydberg neutral atomic systems, and is an important step forward in making quantum computers more reliable and scalable.

For the first time, researchers have successfully demonstrated the identification and removal of “erasure” errors.

Future quantum computers are expected to revolutionize problem-solving in a variety of fields, including creating sustainable materials, developing new drugs, and solving complex problems in fundamental physics. However, these pioneering quantum systems are more error-prone than the classical computers we use today. Wouldn’t it be great if researchers could whip out a special quantum eraser and remove mistakes?

Reporting in the journal Nature, a group of researchers led by the California Institute of Technology has demonstrated a type of quantum eraser for the first time: the physicists showed that they can pinpoint and correct mistakes in quantum computing systems known as “erasure” errors.

“Typically, it’s very difficult to detect errors in quantum computers, because just the act of looking for errors creates more errors,” says Adam Shaw, co-lead author of the new study and a graduate student in the laboratory of Manuel Endres, a professor of physics at Caltech. “However, we found that with careful control, certain errors can be precisely identified and erased without significant impact. This is where the name erasure comes from.”

How quantum computing works

Quantum computers are based on the physical laws that govern the subatomic realm, including entanglement, a phenomenon in which particles remain strongly correlated without being in direct contact. In the new study, the researchers focused on a quantum computing platform that uses arrays of neutral atoms, atoms that carry no electric charge. Specifically, they manipulated individual alkaline-earth neutral atoms confined inside “tweezers” made of laser light. When the atoms are excited to a high-energy “Rydberg” state, neighboring atoms begin to interact.

Errors are typically difficult to spot in quantum devices, but the researchers showed that, with careful control, certain errors cause atoms to glow. The researchers used this capability to perform quantum simulations with atomic arrays and laser beams, as shown in this artist’s concept. The experiments show that quantum simulations can run more efficiently when the glowing, erroneous atoms are discarded. Credit: Caltech/Lance Hayashida

“The atoms in our quantum system interact with each other and generate entanglement,” explains Pascal Scholl, the study’s other co-lead author, a former postdoctoral scholar at Caltech who now works at the French quantum computing company PASQAL.

Entanglement is what allows quantum computers to outperform classical ones. “But nature doesn’t like to stay in these entangled states,” Scholl explains. “Eventually an error occurs and the entire quantum state is destroyed. You can think of these entangled states like a basket full of apples, where the atoms are the apples. Over time, some apples will start to rot, and if you don’t remove them from the basket and replace them with fresh ones, all the apples will quickly rot. It isn’t clear how to completely prevent these errors from occurring, so the only viable option today is to detect and correct them.”

Innovation in error detection and correction

The new error-trapping system is designed so that atoms with errors fluoresce, or glow, when hit by a laser. “We have images of glowing atoms that show us where the errors are, so we can either exclude them from the final statistics or actively correct them by applying additional laser pulses.” says Scholl.

The theory for implementing erasure detection in neutral-atom systems was first developed by Jeff Thompson, a professor of electrical and computer engineering at Princeton University, and his colleagues. That team recently reported a demonstration of the technique in the journal Nature.

The Caltech team says that by identifying and removing errors in their Rydberg atom system, they can improve the overall rate of entanglement, and therefore the fidelity. In the new study, the researchers report that only one in every 1,000 pairs of atoms failed to entangle. This is a 10-fold improvement over what was previously achieved and the highest entanglement rate ever observed in this type of system.

Ultimately, these results bode well for quantum computing platforms that use Rydberg neutral atom arrays. “Neutral atoms are the most scalable type of quantum computer, but until now, they lacked high entanglement fidelity,” Shaw says.

Reference: “Erasure conversion in a high-fidelity Rydberg quantum simulator” by Pascal Scholl, Adam L. Shaw, Richard Bing-Shiun Tsai, Ran Finkelstein, Joonhee Choi, and Manuel Endres, October 11, 2023, Nature.
DOI: 10.1038/s41586-023-06516-4

The research was funded by the National Science Foundation (NSF) via the Institute for Quantum Information and Matter (IQIM), based at Caltech; the Defense Advanced Research Projects Agency; an NSF CAREER award; the Air Force Office of Scientific Research; an NSF Quantum Leap Challenge Institute; the Department of Energy’s Quantum Systems Accelerator; Taiwan and Caltech fellowships; and a Troesch postdoctoral fellowship. Other Caltech authors include graduate student Richard Bing-Shiun Tsai; Ran Finkelstein, Troesch Postdoctoral Scholar in Physics; and former postdoc Joonhee Choi, now a professor at Stanford University.

Source: scitechdaily.com

An Intriguing Puzzle: Deja Vu

Déjà vu, the feeling of reliving an experience, is a subject that intrigues many people. Recent scientific research suggests that this phenomenon may be caused by spatial similarities between the new scene and the unrecalled memory. Various studies, including those using virtual reality, aim to learn more about the causes of déjà vu. Credit: SciTechDaily.com

What is déjà vu? Psychologists are investigating this eerie feeling that you may have already experienced before.

Have you ever had that strange feeling that you have lived through the exact same situation before, even though that is impossible? Sometimes it can even seem as if you are reliving something that has already happened. This phenomenon, known as déjà vu, has baffled philosophers, neurologists, and writers for a very long time.

Starting in the late 1800s, many theories began to emerge about the cause of déjà vu, which means “already seen” in French. People thought it might stem from mental dysfunction or perhaps a kind of brain problem. Or maybe it was a temporary hiccup in the otherwise normal operation of human memory. However, the topic did not reach the realm of science until quite recently.

Transition from paranormal to science

At the beginning of this century, a scientist named Alan Brown conducted a review of everything researchers had written about déjà vu until that point. Much of what he found had a paranormal flavor, relating to past lives, psychic abilities, and other supernatural matters. But he also found studies that surveyed ordinary people about their déjà vu experiences. From all these papers, Brown was able to glean some basic findings about the déjà vu phenomenon.

For example, Brown determined that approximately two-thirds of people experience déjà vu at some point in their lives. He determined that the most common trigger of déjà vu is a scene or place, and the second most common trigger is a conversation. He also reported hints throughout a century or so of medical literature of a possible association between déjà vu and certain types of seizure activity in the brain.

Brown’s review brought the topic of déjà vu into the realm of mainstream science, because it appeared both in a scientific journal that scientists studying cognition tend to read and in a book aimed at scientists. His work inspired scientists to design experiments to investigate déjà vu.

The layout of your new place may be very similar to places you’ve visited before, but you may not consciously remember it. Credit: SciTechDaily.com

Testing déjà vu in a psychology lab

Inspired by Brown’s work, my own research team began conducting experiments aimed at testing hypotheses about possible mechanisms of déjà vu. We investigated a nearly century-old hypothesis suggesting that déjà vu can occur when there is a spatial resemblance between a current scene and an unrecalled scene in memory. Psychologists called this the Gestalt familiarity hypothesis.

For example, suppose you pass the nursing station in a hospital unit on your way to visit a sick friend. You have never been to this hospital before, yet you are struck by the feeling that you have. The underlying cause of this feeling of déjà vu may be that the layout of the scene, including the placement of the furniture and particular objects in the space, matches the layout of a different scene you experienced in the past.

Perhaps the way the nursing station is situated, the way it connects to the furniture, the items on the counter, and the corners of the hallway, is the same as the way a set of welcome tables was arranged relative to signs and furniture in a hallway at the entrance to a school event you attended a year earlier. According to the Gestalt familiarity hypothesis, if that previous situation with a layout similar to the current one fails to come to mind, you may be left with only a strong feeling of familiarity with the current situation.

To investigate this idea in the lab, my team used virtual reality to place people within scenes. That way we could manipulate the environments people found themselves in: some scenes shared the same spatial layout while otherwise being distinct. As predicted, déjà vu was more likely to occur when people were in a scene that contained the same spatial arrangement of elements as an earlier scene they had viewed but did not recall.

This research suggests that one factor contributing to déjà vu can be the spatial resemblance of a new scene to one in memory that fails to be consciously recalled in the moment. However, spatial similarity is not the only cause of déjà vu; many factors can make a scene or situation feel familiar. Further research is underway to investigate additional factors that may be involved in this mysterious phenomenon.

Written by Ann Cleary, Professor of Cognitive Psychology, Colorado State University.

This article was first published in The Conversation.

Source: scitechdaily.com

Harvard team makes significant strides in error correction technology

Quantum computing has advanced significantly with a new platform from Harvard University that can be dynamically reconfigured and that demonstrates low error rates in two-qubit entangling gates. This breakthrough, described in a recent Nature paper, represents a major advance in overcoming the challenges of quantum error correction and places Harvard’s technology alongside other leading quantum computing approaches. The research, conducted in collaboration with MIT and others, is an important step toward scalable, error-corrected quantum computing. Credit: SciTechDaily.com

A method developed by a team at Harvard University to reduce errors addresses a critical hurdle in scaling up technology.

Quantum computing technology has the potential to achieve unprecedented speed and efficiency, vastly exceeding the capabilities of even the most advanced supercomputers currently available. However, this innovative technology has not been widely scaled or commercialized, primarily due to inherent limitations in error correction. Quantum computers, unlike classical computers, cannot correct errors by copying encoded data over and over again. Scientists had to find another way.

Now, a new paper in Nature describes a Harvard University quantum computing platform with the potential to solve the long-standing problem known as quantum error correction.

The Harvard team is led by quantum optics expert Mikhail Lukin, Joshua and Beth Friedman Professor of Physics and co-director of the Harvard Quantum Initiative. The research reported in Nature was a collaboration among Harvard, the Massachusetts Institute of Technology, and Boston-based QuEra Computing. The group of Markus Greiner, George Vasmer Leverett Professor of Physics, also participated.

Unique Harvard Platform

The Harvard platform, an effort built up over the past several years, is based on an array of very cold rubidium atoms trapped by lasers. Each atom acts as a bit (a “qubit” in the quantum world) that can perform extremely fast calculations.

The team’s main innovation is configuring their “neutral atom array” so that its layout can be dynamically changed during computation by moving and connecting atoms, or “entangling” them, in physics parlance. Operations that entangle pairs of atoms, called two-qubit logic gates, are units of computing power.

Running complex algorithms on a quantum computer requires many gates. However, these gating operations are known to be error-prone, and the accumulation of errors renders the algorithm useless.

In the new paper, the team reports near-perfect performance of its two-qubit entangling gates with extremely low error rates. For the first time, the researchers demonstrated the ability to entangle atoms with an error rate below 0.5 percent. In terms of operational quality, this puts their technology’s performance on par with other leading types of quantum computing platforms, such as superconducting qubits and trapped-ion qubits.
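To see why per-gate error rates matter so much at scale, here is a rough back-of-the-envelope sketch (a hypothetical illustration, not a calculation from the paper): if each gate fails independently with probability p, a circuit of n gates succeeds with probability roughly (1 - p)^n, so even small per-gate errors destroy long computations.

```python
# Hypothetical illustration: how per-gate errors compound in a quantum circuit.
# p = per-gate error rate; an n-gate circuit succeeds only if every gate does,
# so (assuming independent errors) its fidelity decays roughly as (1 - p)^n.

def circuit_fidelity(p: float, n: int) -> float:
    """Approximate success probability of an n-gate circuit with per-gate error p."""
    return (1.0 - p) ** n

# At a 0.5 percent error rate, a 100-gate circuit still succeeds about 60% of
# the time, but a 10,000-gate circuit essentially never does, which is why
# error correction is essential for running large algorithms.
print(circuit_fidelity(0.005, 100))     # ~0.61
print(circuit_fidelity(0.005, 10_000))  # ~2e-22
```

Under this simple independent-error model, cutting the per-gate error rate by a factor of ten extends the feasible circuit depth by roughly the same factor, which is why pushing below the 0.5 percent threshold is significant.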

Benefits and future prospects

However, Harvard’s approach has significant advantages over these competitors due to its large system size, efficient qubit control, and the ability to dynamically reconfigure the atomic layout.

“We demonstrate that the physical errors of this platform are low enough that we can actually imagine large-scale error-corrected devices based on neutral atoms,” said lead author Simon Evered, a student in Harvard’s Griffin Graduate School of Arts and Sciences working in Lukin’s group. “Currently, our error rates are low enough that if we group atoms into logical qubits, in which information is stored non-locally among the constituent atoms, the errors can be even lower than those of the individual atoms.”

The Harvard team’s progress parallels related advances by researchers formerly at Harvard and now at Princeton University, and by former Harvard postdoctoral fellow Manuel Endres, now at the California Institute of Technology. Taken together, these advances lay the groundwork for quantum error-corrected algorithms and large-scale quantum computing. All of this means that quantum computing on neutral atom arrays is approaching its full potential.

“These contributions open the door to very special opportunities in scalable quantum computing, and truly exciting times ahead for the field as a whole,” Lukin said.

Reference: “High-fidelity parallel entangling gates on a neutral atom quantum computer” by Simon J. Evered, Dolev Bluvstein, Marcin Kalinowski, Sepehr Ebadi, Tom Manovitz, Hengyun Zhou, Sophie H. Li, Alexandra A. Geim, Tout T. Wang, Nishad Maskara, Harry Levine, Giulia Semeghini, Markus Greiner, Vladan Vuletić, and Mikhail D. Lukin, October 11, 2023, Nature.
DOI: 10.1038/s41586-023-06481-y

This research was supported by the U.S. Department of Energy’s Quantum Systems Accelerator Center; the Center for Ultracold Atoms; the National Science Foundation; the Army Research Office Multidisciplinary University Research Initiative; and the DARPA Optimization with Noisy Intermediate-Scale Quantum devices (ONISQ) program.

Source: scitechdaily.com

The Impact of Plasma Instability on Our Understanding of the Universe

Scientists have discovered a new instability in plasma, revolutionizing our understanding of cosmic rays. This groundbreaking discovery reveals that cosmic rays generate electromagnetic waves within plasma and influence their paths. This collective behavior of cosmic rays, similar to waves formed by water molecules, challenges previous theories and holds promise for insights into intragalactic cosmic ray transport and its role in galaxy evolution. Credit: SciTechDaily.com

Scientists at the Leibniz Institute for Astrophysics Potsdam (AIP) have discovered a new plasma instability that is expected to revolutionize our understanding of the origin of cosmic rays and their dynamic impact on galaxies.

At the beginning of the last century, Victor Hess discovered a new phenomenon called cosmic rays, for which he was later awarded the Nobel Prize. In high-altitude balloon flights, he found that the ionization of Earth’s atmosphere was not caused by radiation from the ground; instead, he confirmed that its origin was extraterrestrial. Later, it was discovered that cosmic “rays” are in fact charged particles arriving from space at speeds close to the speed of light, not electromagnetic radiation. However, the name “cosmic rays” outlasted these discoveries.

Recent advances in cosmic ray research

In the new study, AIP scientist and lead author Dr. Mohammad Shalaby and his collaborators performed numerical simulations that trace the trajectories of many cosmic-ray particles, studying how these particles interact with the surrounding plasma, which is made up of electrons and protons.

Simulation of cosmic rays streaming against the background plasma and driving a plasma instability. The distribution of background particles responding to the streaming cosmic rays is shown in phase space, spanned by the particles’ position (horizontal axis) and velocity (vertical axis). Color visualizes number density, and holes in phase space reflect the highly dynamic nature of the instability, which breaks ordered motion into random motion. Credit: Shalaby/AIP

When the researchers studied cosmic rays flying from one side of the simulation to the other, they discovered a new phenomenon that excites electromagnetic waves in the background plasma. These waves, in turn, exert forces on the cosmic rays and alter their meandering paths.
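For a sense of the scales involved, here is a rough, textbook-level sketch (not the paper's actual simulation setup): in the classic cold beam-plasma instability, a dilute stream of charged particles excites waves near the background plasma frequency. The densities below are assumed illustrative values for interstellar-like conditions.

```python
import math

# Illustrative parameters (assumed, not taken from the paper):
n_e = 1.0e6         # background electron density, m^-3 (~1 cm^-3, interstellar-like)
n_b = 1.0e-3 * n_e  # dilute streaming (beam) population

e = 1.602e-19       # elementary charge, C
m_e = 9.109e-31     # electron mass, kg
eps0 = 8.854e-12    # vacuum permittivity, F/m

# Electron plasma frequency of the background plasma
omega_pe = math.sqrt(n_e * e**2 / (eps0 * m_e))

# Maximum growth rate of the cold (reactive) beam-plasma instability:
#   gamma_max ≈ (sqrt(3)/2) * (n_b / (2 n_e))**(1/3) * omega_pe
gamma_max = (math.sqrt(3) / 2) * (n_b / (2 * n_e)) ** (1 / 3) * omega_pe

print(f"omega_pe  = {omega_pe:.3e} rad/s")
print(f"gamma_max = {gamma_max:.3e} rad/s")
```

Even for a beam a thousand times more dilute than the background, the wave amplitude e-folds on a timescale not far below the plasma period, which is why such streaming-driven waves can grow fast enough to redirect the particles that drive them.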

Understanding cosmic rays as a collective phenomenon

Most importantly, this new phenomenon is best understood by thinking of the cosmic rays not as individual particles but as carriers of collective electromagnetic waves. When these waves interact with the fundamental wave modes of the background plasma, they are strongly amplified and energy is transferred.

“This insight allows us to regard cosmic rays in this context as behaving more like radiation than like individual particles, just as Victor Hess originally believed,” says Professor Christoph Pfrommer, head of the Cosmology and High-Energy Astrophysics section at AIP.

Momentum distribution of protons (dashed lines) and electrons (solid lines), showing the emergence of a high-energy electron tail at a slowly moving shock. This tail results from interactions with electromagnetic waves driven by the newly discovered plasma instability (red) and is absent at faster shocks (black). Because only high-energy electrons produce observable radio emission, this underscores the importance of understanding the physics of the acceleration process. Credit: Shalaby/AIP

A good analogy for this behavior is the way individual water molecules come together to form waves that break on the shore. “This progress was only possible by resolving smaller scales that had been overlooked until now, which calls into question the use of effective fluid theories when studying plasma processes,” explains Dr. Mohammad Shalaby.

Meaning and application

This newly discovered plasma instability has many applications, including the first explanation of how electrons from the thermal interstellar plasma are accelerated to high energies in supernova remnants.

“This newly discovered plasma instability represents a major advance in our understanding of acceleration processes and finally explains why supernova remnants glow in radio waves and gamma rays,” reports Mohammad Shalaby.

Moreover, this breakthrough opens the door to a deeper understanding of the fundamental processes of cosmic-ray transport in galaxies, one of the biggest mysteries in understanding how galaxies form during the evolution of the universe.

References:

“Deciphering the physical basis of the intermediate-scale instability” by Mohammad Shalaby, Timon Thomas, Christoph Pfrommer, Rouven Lemmerz, and Virginia Bresci, 12 December 2023, Journal of Plasma Physics.
DOI: 10.1017/S0022377823001289

“The mechanism of efficient electron acceleration at parallel non-relativistic shocks” by Mohammad Shalaby, Rouven Lemmerz, Timon Thomas, and Christoph Pfrommer, 4 May 2022, arXiv: Astrophysics > High Energy Astrophysical Phenomena.
arXiv:2202.05288

“A New Cosmic-Ray-Driven Instability” by Mohammad Shalaby, Timon Thomas, and Christoph Pfrommer, 24 February 2021, The Astrophysical Journal.
DOI: 10.3847/1538-4357/abd02d

Source: scitechdaily.com