Physicists Start Construction of Groundbreaking Graviton Detector

Igor Pikovski, a physicist at Stevens Institute of Technology, and his team are pioneering an innovative experiment aimed at capturing individual gravitons, particles previously believed to be nearly undetectable. This groundbreaking work signals a new era in quantum gravity research.



Expected detection of single graviton signatures from gravitational waves in future experiments. Image credit: I. Pikovski.

Modern physics faces a significant challenge. The two foundational pillars—quantum theory and Einstein’s general theory of relativity—appear contradictory at a glance.

While quantum theory depicts nature through discrete quantum particles and interactions, general relativity interprets gravity as the smooth curvature of space and time.

A true unification demands that gravity be quantum in nature, mediated by particles called gravitons.

For a long time, detecting even a single graviton was deemed nearly impossible.

Consequently, the problem of quantum gravity has mostly remained a theoretical concept, with no experimental framework for a unified theory in view.

In 2024, Dr. Pikovski and his collaborators from Stevens Institute of Technology, Stockholm University, Okinawa Institute of Science and Technology, and Nordita demonstrated that *detecting gravitons* is indeed feasible.

“For ages, the idea of detecting gravitons seemed hopeless, which is why it wasn’t considered an experimental question,” Pikovski stated.

“Our findings indicate that this conclusion is outdated, especially with today’s advanced quantum technologies.”

The breakthrough stems from a fresh perspective that combines two pivotal experimental innovations.

The first is the detection of gravitational waves—ripples in spacetime generated by collisions between black holes and neutron stars.

The second innovation is the advancement in quantum engineering. Over the last decade, physicists have mastered the cooling, control, and measurement of larger systems in true quantum states, leading to extraordinary quantum phenomena beyond the atomic scale.

In a landmark experiment in 2022, a team led by Yale University professor Jack Harris demonstrated the control and measurement of individual vibrational quanta in a superfluid helium sample weighing more than a nanogram.

Dr. Pikovski and his co-authors realized that by merging these two advancements, it becomes possible to absorb and detect a single graviton. A passing gravitational wave could, theoretically, transfer exactly one quantum of energy (one graviton) into a sufficiently large quantum system.
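To get a feel for the scale involved, here is a minimal back-of-envelope sketch (not the authors' calculation) of the energy carried by a single vibrational quantum, assuming an acoustic resonance frequency in the kilohertz range; the frequency is an illustrative assumption, not a figure from the study.

```python
# Illustrative only: energy of one quantum (E = h * f) at an assumed acoustic
# resonance frequency, to show how small the energy step from absorbing a
# single graviton would be. The frequency is an assumption, not from the study.
h = 6.62607015e-34            # Planck constant, J*s
assumed_frequency_hz = 1.0e3  # assumption: a kHz-scale resonator

energy_per_quantum_joules = h * assumed_frequency_hz
print(f"One quantum at {assumed_frequency_hz:.0e} Hz carries ~{energy_per_quantum_joules:.1e} J")
# ~6.6e-31 J, the tiny energy jump the quantum readout must resolve
```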

The resulting energy shift may be minimal but manageable. The primary hurdle lies in the fact that gravitons seldom interact with matter.

Nevertheless, in quantum systems scaled to the kilogram level, it is feasible to absorb a single graviton in the presence of strong gravitational waves generated by black hole or neutron star mergers.

Thanks to this recent revelation, Dr. Pikovski and Professor Harris are collaborating to construct the world’s first experiment specifically designed to detect individual gravitons.

With backing from the W.M. Keck Foundation, they are engineering centimeter-scale superfluid helium resonators, moving closer to the conditions needed to absorb single gravitons from astrophysical gravitational waves.

“We already possess essential tools; we can detect single quanta in macroscopic quantum systems; it’s merely a matter of scaling up,” Professor Harris elaborated.

The objective of this experiment is to immerse a gram-scale cylindrical resonator within a superfluid helium container, cool the setup to the quantum ground state, and utilize laser-based measurements to detect individual phonons (the vibrational quanta transformed from gravitons).

This detector builds upon an existing laboratory system while advancing into uncharted territory—scaling masses to the gram level while maintaining exceptional quantum sensitivity.

Successfully demonstrating this platform sets the stage for the next iteration, which will be optimized for the sensitivity required to achieve direct detection of gravitons, thus opening new experimental avenues in quantum gravity.

“Quantum physics began with controlled experiments involving light and matter,” Pikovski noted.

“Our current aim is to bring gravity into this experimental domain and investigate gravitons much like physicists studied photons over a century ago.”

Source: www.sci.news

Physicists Question Long-Standing Beliefs on Dark Matter’s True Nature

New insights challenge the long-held belief that dark matter was “cold” in the immediate aftermath of the Big Bang. A groundbreaking study from the University of Minnesota Twin Cities and the University of Paris-Saclay reveals that dark matter particles might have been extraordinarily hot and traveling at near-light speeds in the primordial universe, before cooling down during the formative epochs of galaxies and large-scale structures.



Hypothetical dark matter particles. Image credit: University of Adelaide.

For decades, physicists have categorized dark matter based on the velocity of its constituent particles. Cold dark matter is slow enough to clump under gravitational forces, contributing to the formation of galaxies and galaxy clusters.

This categorization is a cornerstone of the standard cosmological model, explaining the universe’s intricate web-like structure.

However, the recent findings indicate that dark matter may have emerged from the hot plasma of the early universe in an ultrarelativistic state—essentially moving at ultra-high speeds—before cooling adequately during the formation of cosmic structures.

This refined perspective broadens the potential behaviors of dark matter particles and expands the pool of candidate particles physicists can investigate through experiments and astronomical observations.

The study concentrates on a critical phase in the early universe known as reheating, which followed an explosive inflationary expansion.

During the reheating phase, the energy fueling the universe’s expansion transformed into a dense hot mixture of particles and radiation.

This discovery suggests that under certain conditions, dark matter produced during this period could exist at speeds approaching that of light while still aligning with the vast universe we observe today.

If validated, these findings could significantly impact ongoing dark matter detection initiatives, including particle colliders, underground detectors, and astrophysical studies.

Moreover, they pose new theoretical challenges regarding the fundamental nature of dark matter and its role in the universe’s evolution.

“Dark matter remains one of the biggest mysteries in physics,” explains Stephen Henrik, a graduate student at the University of Minnesota.

“Historically, one consistent assumption has been that dark matter must be cold at its inception in the primordial universe.”

“Our findings reveal a different narrative. In fact, dark matter may start off as red-hot, but has ample time to cool before galaxies commence formation.”

“The simplest dark matter candidate, low-mass neutrinos, was ruled out decades ago because such fast-moving particles would wash out galaxy-sized structures instead of helping them form,” states Keith Olive, a professor at the University of Minnesota.

“Neutrinos serve as a prime example of hot dark matter; structure formation instead relies on cold dark matter.”

“If a similar candidate arose during the hot Big Bang, it’s remarkable that it could cool sufficiently to behave as cold dark matter.”

“This new discovery allows us to explore a period in the universe’s history that is very close to the Big Bang,” adds Professor Yann Mambrini, a physicist at the University of Paris-Saclay.

The team’s research has been published in the journal Physical Review Letters.

_____

Stephen E. Henrik et al. 2025. Ultra-relativistic Freezeout: Bridge from WIMP to FIMP. Physical Review Letters 135, 221002; doi: 10.1103/zk9k-nbpj

Source: www.sci.news

2025 Breakthrough: Physicists Discover Dark Photons, Transforming Our Understanding of Physics

Dark photons in quantum physics

Dark Photons: A New Explanation for the Double-Slit Experiment

Russell Kightley/Science Photo Library

This year, a fundamental aspect of quantum theory faced scrutiny when researchers introduced a groundbreaking interpretation of an experiment exploring the nature of light.

Central to this research was the historic double-slit experiment, first conducted by physicist Thomas Young in 1801, which confirmed the wave-like behavior of light. Conventionally, particles and waves are considered distinct; however, in the quantum realm, they coexist, showcasing wave-particle duality.

For years, light stood as the quintessential example of this duality. Experiments demonstrated that light can exhibit particle-like behavior as photons as well as wave-like characteristics, producing interference patterns reminiscent of Young’s findings. However, earlier in 2023, Celso Villas-Boas and his team at Brazil’s Federal University of São Carlos proposed a novel interpretation of the double-slit experiment that relies exclusively on photons, dispensing with the wave aspect of optical duality.

After New Scientist covered their study, the team received significant interest from peers, with citations soaring. Villas-Boas shared, “I’ve received numerous invitations to present, including events in Japan, Spain, and Brazil,” emphasizing the widespread intrigue.

In the traditional double-slit experiment, an opaque barrier containing two narrow slits is positioned between a light source and a screen. Light traveling through the slits creates a pattern of alternating bright and dark vertical stripes, known as classical interference and usually attributed to overlapping light waves.

The researchers shifted away from this conventional explanation, examining the so-called dark state of photons, a unique quantum state in which photons do not interact with other particles and therefore do not illuminate the screen. This perspective removes the need for light waves to explain the observed dark stripes.

This reevaluation challenges a deeply ingrained view of light within quantum physics. Many educators expressed concern, with some telling the team, “Your findings challenge the foundational concepts I’ve taught for years.” Following New Scientist‘s initial report, some colleagues embraced the new perspective while others remained skeptical but intrigued.

Villas-Boas has been actively exploring implications surrounding the dark state of photons. His investigations revealed that thermal radiation, such as sunlight, can reside in a dark state, concealing a substantial portion of its energy due to a lack of interaction with other objects. Experimental validation could involve placing atoms in cavities where their interactions with light are meticulously examined, according to Villas-Boas.

His team’s reinterpretation of interference phenomena facilitates comprehension of previously perplexing occurrences, such as non-overlapping wave interactions. Moving beyond the wave model to incorporate distinct bright and dark photon states opens avenues for innovative applications. Villas-Boas envisions potential developments such as light-controlled switches and devices that selectively permit specific light types to pass.

In his view, all these explorations connect back to the essential principles of quantum physics, highlighting that engaging with quantum objects necessitates understanding their interactions with measurement devices—encompassing darkness itself. “This concept is intrinsic to quantum mechanics,” Villas-Boas asserts.


Source: www.newscientist.com

Physicists Reject the Existence of Sterile Neutrinos

Researchers within the MicroBooNE (Micro Booster Neutrino Experiment) collaboration have ruled out, at the 95% confidence level, the simplest scenario in which a single sterile neutrino exists.

Utilizing data from the MicroBooNE detector, physicists report one of the first searches for sterile neutrinos using two accelerator neutrino beams. Image credit: Gemini AI.

Neutrinos are tiny subatomic particles that seldom interact with matter, allowing them to traverse the Earth without being impeded.

Current particle physics theory recognizes three types of neutrinos: electron, muon, and tau neutrinos.

These neutrinos can transform from one type to another, a phenomenon known as oscillation.

Previous experiments had revealed neutrinos that seemed to oscillate in ways not consistent with the standard model.

To clarify this anomaly, scientists suggested a fourth type: sterile neutrinos, which interact only through gravity, complicating their detection.

“The existence of three distinct flavors of neutrinos is a fundamental aspect of the Standard Model of particle physics,” explained Dr. Andrew Mastbaum, a physicist from Rutgers University and a member of the MicroBooNE leadership team.

“Because of quantum mechanical interference, neutrinos of one flavor can eventually be detected as a different flavor, a phenomenon known as neutrino oscillation.”

“Numerous unusual findings that challenge the three-flavor model have led us to postulate the existence of an additional neutrino state, referred to as a ‘sterile’ neutrino, which does not directly interact with matter.”
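For readers unfamiliar with the underlying formula, the sketch below shows the standard two-flavor approximation for the oscillation probability that such searches build on; the mixing angle, mass splitting, baseline, and energy are illustrative placeholders, not values measured or fitted by MicroBooNE.

```python
import math

# A minimal sketch of the standard two-flavor oscillation probability,
#   P(nu_mu -> nu_e) = sin^2(2*theta) * sin^2(1.27 * dm2 * L / E),
# with dm2 in eV^2, L in km, and E in GeV. The numbers below are
# illustrative placeholders, not MicroBooNE results.
def oscillation_probability(sin2_2theta, dm2_ev2, baseline_km, energy_gev):
    return sin2_2theta * math.sin(1.27 * dm2_ev2 * baseline_km / energy_gev) ** 2

p = oscillation_probability(sin2_2theta=0.01, dm2_ev2=1.0,
                            baseline_km=0.5, energy_gev=0.8)
print(f"example appearance probability: {p:.4f}")
```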

In the experiment conducted by MicroBooNE, physicists investigated neutrinos from two distinct beams and analyzed their oscillations.

After a decade of data gathering and scrutiny, they uncovered no evidence of sterile neutrinos, effectively rejecting one of the leading theories for the peculiarities observed in neutrino behavior.

“This result signifies a pivotal moment,” remarked Dr. Mastbaum.

“It will ignite innovative ideas in neutrino research, helping us to better comprehend the underlying phenomena.”

“While we can rule out major possibilities, this alone does not unravel the entire mystery.”

“The Standard Model does not encompass everything, such as dark matter, dark energy, and gravity, prompting scientists to seek clues that extend beyond the model,” he observed.

“By dismissing one potential explanation, we can concentrate on alternative hypotheses that may yield significant advancements in our understanding of the universe.”

The findings will also provide valuable insights for forthcoming experiments, like the Deep Underground Neutrino Experiment (DUNE).

“Through meticulous modeling and a strategic analytical approach, the MicroBooNE team has extracted an extraordinary amount of information from this detector,” stated Dr. Mastbaum.

“In next-generation projects like DUNE, we are already utilizing these techniques to explore even more fundamental questions about the essence of matter and the nature of the universe.”

The team’s results were published in the journal Nature.

_____

The MicroBooNE Collaboration. 2025. Search for sterile neutrinos using two neutrino beams with MicroBooNE. Nature 648, 64-69; doi: 10.1038/s41586-025-09757-7

Source: www.sci.news

Physicists Discover Universal Law Governing How Objects Fracture


How many pieces can a dropped vase break into?

Imaginechina Limited / Alamy

The physics behind a dropped plate, a crumbled sugar cube, and a shattered glass shows striking similarities regarding how many pieces result from each object breaking.

For decades, researchers have recognized a universal behavior related to fragmentation, where objects break apart upon falling or colliding. If one counts the fragments of varying sizes and plots their distribution, a consistent shape emerges regardless of the object that is broken. Emmanuel Villermaux from Aix-Marseille University in France has formulated equations to describe these shapes, thereby establishing universal laws of fragmentation.

Instead of concentrating on the appearance of cracks leading to an object’s breakup, Villermaux employed a broader approach. He considered all potential fragment configurations that could result in shattering. Some configurations produce precise outcomes, such as a vase breaking into four equal parts; however, he focused on capturing the most probable set that represents chaotic breakage, namely the one with the highest entropy. This mirrors methods used to derive laws concerning large aggregates of particles in the 19th century, he notes. Villermaux also applied the principles of physics that govern changes in fragment density during shattering, knowledge previously uncovered by him and his colleagues.

By integrating these two elements, they succeeded in deriving a straightforward equation that predicts the size distribution of fragments in a broken object. To verify its accuracy, Villermaux compared it against a number of earlier experiments involving glass rods, dry spaghetti, plates, ceramic tubes, and even fragments of plastic submerged in water and waves crashing during stormy weather. Overall, the fragmentation patterns observed in each of these experiments conformed to his novel law and reflected the universal distribution shapes previously noted by researchers.
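As a rough illustration of the statistical picture described above (a toy model, not Villermaux's actual derivation), the sketch below breaks a unit-length rod at random points many times and tallies the resulting fragment sizes; the number of break points and trials are arbitrary assumptions.

```python
import random
from collections import Counter

# Toy Monte Carlo: break a unit-length rod at random positions and histogram
# the fragment sizes. Small fragments dominate, giving the kind of broad size
# distribution the article describes. Break count and trial count are assumptions.
def fragment_rod(n_breaks=20):
    cuts = sorted(random.random() for _ in range(n_breaks))
    edges = [0.0] + cuts + [1.0]
    return [right - left for left, right in zip(edges, edges[1:])]

sizes = [s for _ in range(5_000) for s in fragment_rod()]
histogram = Counter(round(s, 2) for s in sizes)  # coarse bins of width 0.01
for size in sorted(histogram)[:8]:
    print(f"fragment size ~{size:.2f}: {histogram[size]} pieces")
```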

He also experimented by dropping objects from varying heights to crush sugar cubes. “This was a summer endeavor with my daughters. I had done it a long time ago when they were young, and later revisited the data to further illustrate my concept,” Villermaux explains. He observes that this equation fails to hold when randomness is absent, or the fragmentation process is overly uniform, as occurs when a liquid stream divides into uniform droplets based on the deterministic rules of fluid dynamics, or in instances when fragments engage with each other during fragmentation.

Ferenc Kun and his colleagues at the University of Debrecen in Hungary argue that the pattern highlighted in Villermaux’s analysis is so fundamentally universal that it may derive from an even broader principle. At the same time, they express surprise at how widely applicable it is, as well as its adaptability to specific variations, such as in plastics where cracking can be “healed.”

Fragmentation is not merely a captivating challenge in physics; a deeper understanding could significantly impact energy expenditures in mining operations or guide preparations for increasing rockfalls in mountainous areas as global temperatures rise, Kun remarks.

Looking ahead, it may prove beneficial to explore not only the sizes of the fragments but also their shape distributions, Kun suggests. Additionally, identifying the smallest conceivable size of a fragment remains an unresolved issue, according to Villermaux.


Source: www.newscientist.com

Physicists Suggest a Cosmic ‘Knot’ Could Have Influenced the Early Universe Briefly

Knots are prevalent in various fields of mathematics and physics today. A collaborative team of Japanese and German physicists proposes the existence of a “knot-dominated epoch” in the universe’s early days, suggesting that knots were essential building blocks during this time. This intriguing hypothesis can be investigated through gravitational wave observations. They further theorize that this epoch ended with the collapse of the knots through quantum tunneling, leading to an asymmetry between matter and antimatter in the universe.



Model proposed by Eto et al.. It suggests a brief, knot-dominated epoch when these intertwined energy fields outweighed everything else, a scenario that can be investigated through gravitational wave signals. Image credit: Muneto Nitta / Hiroshima University.

Mathematically, knots are defined as closed curves embedded in three-dimensional space, and they appear not just in tied neckties but across numerous scientific disciplines today.

Although Lord Kelvin’s theory that atoms are knots of etheric vortices was ultimately refuted, it sparked advancements in knot theory and its application in multiple areas of physics.

“Our study tackles one of the core mysteries of physics: why the universe is predominantly composed of matter rather than antimatter,” remarked Professor Muneto Nitta, a physicist at Hiroshima University and Keio University.

“This question is crucial as it relates directly to the existence of stars, galaxies, and ourselves.”

“The Big Bang was expected to produce equal amounts of matter and antimatter, with each particle annihilating its counterpart, leaving only radiation.”

“Yet, the universe is overwhelmingly composed of matter, with only trace amounts of antimatter.”

“Calculations indicate that to achieve the matter we see today, only one extra particle of matter is needed for every billion matter-antimatter pairs.”

“Despite its remarkable achievements, the Standard Model of particle physics cannot produce such an excess.”

“Its prediction falls far short of what is observed.”

“Unraveling the origin of the slight excess of matter, a phenomenon known as baryogenesis, remains one of the greatest unresolved enigmas in physics.”

By combining a gauged baryon-number-minus-lepton-number (B-L) symmetry with the Peccei-Quinn (PQ) symmetry, Professor Nitta and his associates demonstrated that such knots could have spontaneously formed in the early universe, giving rise to the observed surplus of matter.

These two well-studied extensions to the standard model address some of its most confounding gaps.

PQ symmetry offers a solution to the strong CP problem, the puzzle of why the neutron’s electric dipole moment is far smaller than naive theory predicts, while simultaneously introducing axions, a leading candidate for dark matter.

B-L symmetry, conversely, helps explain why neutrinos, elusive particles that can pass straight through entire planets, possess mass.

Keeping the PQ symmetry global, rather than gauging it, safeguards the delicate axion physics that addresses the strong CP problem.

In physics, “gauging” a symmetry means allowing it to act independently at every point in space and time.

However, this local freedom requires nature to introduce new force-carrying fields to keep the equations consistent.

By gauging the B-L symmetry, the researchers not only obtained heavy right-handed neutrinos (needed to avert anomalies in the theory and central to leading models of baryogenesis) but also introduced superconducting behavior, likely providing the magnetic foundation for some of the universe’s earliest knots.

As the universe cooled following the Big Bang, its symmetry may have fractured through a series of phase transitions, leaving behind string-like defects called cosmic strings, which some cosmologists theorize may still persist.

Even though thinner than a proton, a cosmic string can stretch across a mountain.

As the universe expanded, these writhing filaments would twist and intertwine, preserving traces of the primal conditions that once existed.

The breaking of the B-L symmetry produced a string carrying magnetic flux (a flux tube), while the PQ symmetry produced a flux-free superfluid vortex.

This contrast renders them compatible.

The B-L flux tube gives the Chern-Simons coupling of the PQ superfluid vortex a point of attachment.

This coupling then draws the PQ superfluid vortex onto the B-L flux tube, counteracting the tension that might otherwise unravel the loop.

The outcome is a metastable, topologically locked structure known as a knot soliton.

“No prior studies had simultaneously considered these two symmetries,” notes Professor Nitta.

“In a way, our good fortune lay in this. By integrating them, we uncovered a stable knot.”

While radiation diminishes energy as waves traverse through space and time, knots exhibit properties akin to matter and dissipate energy far more gradually.

The knots eventually surpassed all other components, heralding an era of knot domination in which their energy density eclipsed that of radiation in the universe.

However, this dominance was short-lived. Ultimately, the knot succumbed to quantum tunneling, an elusive process where particles slip through energy barriers as though they were nonexistent.

This decay yielded heavy right-handed neutrinos, a consequence of the B-L symmetry built into the framework.

These colossal, elusive particles eventually transformed into lighter and more stable variations that favored matter over antimatter, shaping the universe we recognize today.

“Essentially, this decay releases a cascade of particles, including right-handed neutrinos, scalar particles, and gauge particles,” explained Dr. Masaru Hamada, a physicist at the German Electron Synchrotron (DESY) and Keio University.

“Among them, right-handed neutrinos are particularly noteworthy since their decay can inherently generate a discrepancy between matter and antimatter.”

“These massive neutrinos decompose into lighter particles, such as electrons and photons, sparking a secondary cascade that reheats the universe.”

“In this manner, they can be regarded as the ancestors of all matter in the universe today, including our own bodies, while knots might be considered our forebears.”

Once the researchers delved into the mathematics underlying the model—analyzing how efficiently the knot produced right-handed neutrinos, the mass of those neutrinos, and the degree of heat generated post-collapse—the observed matter-antimatter imbalance naturally emerged from their equations.

Rearranging the equations, with an estimated mass of 10¹² gigaelectronvolts (GeV) for the heavy right-handed neutrinos, and assuming that most of the energy stored in the knots went into producing these particles, the model yielded a natural reheating temperature of about 100 GeV.

This temperature fortuitously coincides with the final opportunity for the universe to produce matter.

Should the universe cool below this point, the electroweak processes that convert a neutrino asymmetry into ordinary matter would cease permanently.

Reheating to 100 GeV may have also reshaped the cosmic gravitational wave spectrum, shifting it toward higher frequencies.

Forthcoming observatories such as Europe’s Laser Interferometer Space Antenna (LISA), the United States’ Cosmic Explorer, and Japan’s Decihertz Interferometer Gravitational-Wave Observatory (DECIGO) may someday detect these subtle tonal variations.

Dr. Minoru Eto, a physicist at Yamagata University, Keio University, and Hiroshima University, remarked, “The cosmic string is a variant of topological soliton, an entity defined by a quantity that remains unchanged regardless of how much it is twisted or stretched.”

“This characteristic not only guarantees stability but also indicates that our results are not confined to the specifics of the model.”

“While this work is still theoretical, we believe it represents a significant advancement towards future development, as the foundational topology remains constant.”

Although Lord Kelvin initially proposed that knots were fundamental components of matter, the researchers assert that their findings present the first realistic particle physics model in which knots could significantly contribute to the origin of matter.

“The next step involves refining our theoretical models and simulations to more accurately forecast the formation and collapse of these knots, connecting their signatures with observable signals,” said Professor Nitta.

“In particular, upcoming gravitational wave experiments like LISA, Cosmic Explorer, and DECIGO will enable the testing of whether the universe indeed experienced a knot-dominated era.”

The team’s work appears in the journal Physical Review Letters.

_____

Minoru Eto et al. 2025. Tying the Knot in Particle Physics. Physical Review Letters 135, 091603; doi: 10.1103/s3vd-brsn

Source: www.sci.news

Physicists Explore the Moments When Nature’s Strongest Forces Diminish

STAR detector of the relativistic heavy ion collider

Brookhaven National Laboratory

We are making strides toward comprehending when the powerful nuclear force weakens its influence on the most basic components of matter, causing quarks and gluons within particles to suddenly morph into a hot soup of particles.

There exist unique combinations of temperature and pressure where all three phases of water (liquid, ice, and vapor) coexist simultaneously. For years, scientists have sought similar “critical points” in matter impacted by the potent nuclear force that binds quarks and gluons into protons and neutrons.

In a particle collider, when ions collide, the strong force’s grip is broken, resulting in a state where quarks and gluons form a soup-like “quark-gluon plasma.” However, it remains uncertain whether there is a tipping point preceding this transition. Researchers at the Lawrence Berkeley National Laboratory in California are getting closer to unraveling this mystery.

They assessed the number and distribution of particles produced after the collision of two high-energy gold ions at the Relativistic Heavy Ion Collider at Brookhaven National Laboratory in New York. Dong mentioned they were essentially attempting to formulate a phase diagram for quarks and gluons, depicting what types of matter are generated by strong forces under varied conditions. Although the new experiment did not definitively locate the critical point on this diagram, it significantly narrowed the possible area for its existence.

The phase diagram indicates a region where the material gradually “melts” into plasma, akin to butter softening on a countertop, but a critical point would correspond to a more sudden transition, similar to a chunk of ice unexpectedly forming in liquid water. Agnieszka Sorensen from the Facility for Rare Isotope Beams in Michigan, who was not involved in the study, stated that this new experiment not only guides researchers toward pinpointing this critical point but also reveals which particle properties might best indicate its presence.

Claudia Ratti from the University of Houston in Texas emphasized that many researchers eagerly anticipated the new analysis due to its precision, which surpasses that of previous measurements, particularly in parts of phase diagrams difficult to theoretically compute. She noted that several predictions regarding the critical point’s location have recently converged, and the challenge for experimenters will now be to analyze data at even lower collision energies that align with these predictions.

Dong remarked that the clear detection of the tipping point would mark a generational milestone. This is significant as the only fundamental force suspected of possessing a critical point is the strong force, which has played a crucial role in the universe’s formation. It governs the characteristics of the hot, dense matter created shortly after the Big Bang and continues to influence the structure of neutron stars. Dong concluded that collider experiments like this one could deepen our understanding of these exotic celestial objects once the strong force phase diagram is finalized.


Source: www.newscientist.com

Physicists Unveil the Concept of Neutrino Lasers

Researchers from MIT and the University of Texas at Arlington suggest that supercooling radioactive atoms may enable the creation of laser-like neutrino beams. They illustrate this by calculating the potential of a neutrino laser made from one million rubidium-83 atoms. Ordinarily, the half-life of this radioactive atom is approximately 82 days, meaning that half of the atoms decay, each emitting a neutrino, within that timeframe. Their findings indicate that cooling rubidium-83 into a coherent quantum state could allow that decay to play out in only a few minutes.



B.J.P. Jones and J.A. Formaggio propose the concept of a laser that emits neutrinos. Image credit: Gemini AI.

“In this neutrino laser scenario, neutrinos would be released at a significantly accelerated rate, similar to how lasers emit photons rapidly.”

“This offers a groundbreaking method to enhance radioactive decay and neutrino output. To my knowledge, this has never been attempted before,” remarked MIT Professor Joseph Formaggio.

A few years ago, Professor Formaggio and Dr. Jones were each exploring ideas in this area. They wondered: could we amplify the natural process of neutrino generation through quantum coherence?

Their preliminary research highlighted several fundamental challenges to achieving this goal.

Years later, during discussions about the properties of ultra-cold tritium, they asked: could placing radioactive atoms such as tritium in a collective quantum state lead to enhanced neutrino production?

The duo speculated that transitioning radioactive atoms into Bose-Einstein condensates might promote neutrino generation. However, during quantum mechanical calculations, they initially concluded that such effects might not be feasible.

“It was a misleading assumption; merely creating a Bose-Einstein condensate does not speed up radioactive decay or neutrino production,” explained Professor Formaggio.

Years later, Dr. Jones revisited the concept, incorporating the phenomenon of superradiance. This effect from quantum optics occurs when a group of light-emitting atoms is stimulated in sync.

In this coherent state, the atoms are expected to emit a superradiant burst of photons, far more rapidly than they would if they were not synchronized.

Physicists suggest that analogous superradiant effects may be achievable with radioactive Bose-Einstein condensates, potentially leading to similar bursts of neutrinos.

They turned to the equations governing quantum mechanics to analyze how light-emitting atoms transition from a coherent state to a superradiant state.

Using the same equations, they explored the behavior of radioactive atoms in a coherent Bose-Einstein condensed state.

“Our findings indicate that by producing photons more rapidly and applying that principle to neutrinos, we can significantly increase their emission rate,” noted Professor Formaggio.

“When all the components align, the superradiance of the radioactive condensate facilitates this accelerated, laser-like neutrino emission.”

To theoretically validate their idea, the researchers calculated the neutrino generation from a cloud of 1 million supercooled rubidium-83 atoms.

The results showed that in the coherent Bose-Einstein condensate state, the atoms undergo radioactive decay at an accelerated rate, releasing a laser-like stream of neutrinos within minutes.
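For context, here is a rough back-of-envelope check (not the authors' calculation) of how few ordinary, unenhanced decays the same one million rubidium-83 atoms would give over a few minutes, using the 82-day half-life quoted above; the five-minute window is an assumption.

```python
import math

# Ordinary (non-superradiant) decays expected from 1e6 rubidium-83 atoms in a
# five-minute window, using the ~82-day half-life quoted in the article.
half_life_s = 82 * 24 * 3600           # ~82 days in seconds
decay_constant_per_s = math.log(2) / half_life_s

n_atoms = 1_000_000
window_s = 5 * 60                      # assumption: a five-minute observation

expected_decays = n_atoms * decay_constant_per_s * window_s
print(f"~{expected_decays:.0f} ordinary decays in {window_s / 60:.0f} minutes")
# roughly 30 decays, versus essentially all million atoms decaying within
# minutes in the superradiant scenario described above
```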

Having demonstrated that neutrino lasers are theoretically feasible, they plan to experiment with a compact tabletop setup.

“This would involve obtaining the radioactive material, evaporating it, laser-trapping and cooling it, and converting it into a Bose-Einstein condensate,” said Dr. Jones.

“Subsequently, we must instigate this superradiance.”

The pair recognizes that such experiments will require extensive precautions and precise manipulation.

“If we can demonstrate this in the lab, it opens up possibilities for future applications. Could this serve as a neutrino detector? Or perhaps as a new form of communication?”

Their paper has been published today in the journal Physical Review Letters.

____

B.J.P. Jones & J.A. Formaggio. 2025. Superradiant Neutrino Lasers from Radioactive Condensates. Physical Review Letters 135, 111801; doi: 10.1103/l3c1-yg2l

Source: www.sci.news

Physicists Uncover Unusual Quantum Echoes in Niobium Superconductors

Researchers from Ames National Laboratory and Iowa State University have unveiled the emergence of Higgs echoes in niobium superconductors. These findings shed light on quantum behavior that could influence the development of next-generation quantum sensing and computing technologies.

Using Higgs echo spectroscopy, Huang et al. reveal unconventional echo formation due to non-uniform expansion and soft quasiparticle bands, dynamically evolving under THz drive. Image credit: Ames National Laboratory.

Superconductors are materials known for conducting electricity without resistance.

These superconducting materials exhibit collective oscillations referred to as the Higgs mode.

The Higgs mode is a quantum phenomenon that arises when the amplitude of the superconducting order parameter oscillates, in analogy with the Higgs boson.

Such modes manifest when the material experiences a superconducting phase transition.

Monitoring these vibrations has posed challenges for scientists for many years.

Additionally, they interact complexly with quasiparticles, which are electron-like excitations arising from superconducting dynamics.

By utilizing advanced terahertz (THz) spectroscopy, the researchers identified a new type of quantum echo, termed a Higgs echo, in superconducting niobium materials used in quantum computing circuits.

“Unlike traditional echoes seen in atoms and semiconductors, Higgs echoes result from intricate interactions between Higgs modes and quasiparticles, generating anomalous signals with unique properties.”

“Higgs echoes can uncover and reveal hidden quantum pathways within a material.”

By employing precisely timed THz radiation pulses, the authors were able to detect these echoes.

These THz radiation pulses can also facilitate the encoding, storage, and retrieval of quantum information embedded in the superconducting material via echoes.

This study illustrates the ability to manipulate and observe the quantum coherence of superconductors, paving the way for innovative methods of storing and processing quantum information.

“Grasping and controlling these distinctive quantum echoes brings us closer to practical quantum computing and advanced quantum sensing technologies,” stated Dr. Wang.

A paper detailing these findings was published on June 25 in the journal Science Advances.

____

Chuankun Huang et al. 2025. Discovery of unconventional quantum echoes due to Higgs coherence interference. Science Advances 11 (26); doi: 10.1126/sciadv.ads8740

Source: www.sci.news

Why Physicists Believe Geometry Holds the Key to All Theories

Can you envision the impression a 4D hexagon might create as it travels through a 3D kitchen table? It might seem implausible, yet some individuals can perceive it.

One such individual was mathematician Alicia Boole Stott, daughter of logician George Boole. In the early 20th century, she devised models of the three-dimensional shapes that four-dimensional objects would create as they passed through our space. Years later, when mathematicians could verify her work with computer programs, they found that Boole Stott had an uncanny ability to depict these shapes accurately.


For many of us, geometry recalls images of pencils, rulers, triangles, and circles. It evokes the complex questions posed in school involving parallel lines and angles. However, as Boole Stott’s experience illustrates, scholars have been expanding the scope of geometry for some time.

Geometry can transcend the conventional realm of 2D and 3D shapes. A prime example is Albert Einstein’s theory of gravity, known as general relativity, in which space intertwines with time to form a four-dimensional stage on which the universe unfolds.

Moreover, geometry can also explore dimensions that defy physical reality. Take meteorology, for example. Atmospheric data encompasses multiple “dimensions” such as latitude, longitude, temperature, pressure, wind speed, and more.

Researchers visualize these dimensions as shapes extending into higher dimensions, aiding in understanding atmospheric behavior. “From this, we can implement mathematical models to explain what happens [to those properties] in numerous dimensions,” says mathematician Snezana Lawrence of Middlesex University in London.
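As a simple illustration of the idea of treating multi-variable data as points in a higher-dimensional space (a toy example with made-up numbers, not the researchers' actual models), the sketch below represents two weather observations as points in a six-dimensional space and measures the distance between them.

```python
import math

# Two made-up "weather observations" treated as points in a 6-dimensional space:
# (latitude, longitude, temperature, pressure, wind speed, humidity).
obs_a = (48.9, 2.4, 15.0, 1013.0, 5.2, 0.70)
obs_b = (49.1, 2.6, 14.2, 1009.5, 7.8, 0.62)

# Euclidean distance between the two points in this abstract 6D space
# (purely illustrative; real models would first put the axes on comparable scales).
distance = math.sqrt(sum((a - b) ** 2 for a, b in zip(obs_a, obs_b)))
print(f"separation in 6D 'weather space': {distance:.2f}")
```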

For theoretical physicists, extra dimensions appear to be essential for a complete understanding of the universe, with some suggesting that our reality might be a “projection” from a higher dimension. While this idea might sound peculiar, under certain simplified assumptions, physicists can perform calculations related to fundamental particles and black holes.

Some physicists have even proposed candidates for a “theory of everything,” curious geometric ideas that may lead to a unified explanation of the universe and everything within it. One of these concepts is the “amplituhedron,” introduced by Jaroslav Trnka of the University of California, Davis, and Nima Arkani-Hamed of the Institute for Advanced Study in New Jersey. Imagine it as an abstract, multidimensional crystal that offers an alternative perspective on the fundamentals of particle physics.

Another concept is “causal dynamical triangulation,” developed by Renate Loll at Radboud University in the Netherlands. This approach stitches together simple geometric building blocks to craft a description of space-time that embodies characteristics of both quantum mechanics and general relativity, two frameworks traditionally seen as incompatible. She asserts that it yields testable predictions connecting the abstract geometry to real properties of the universe, such as those observed in the cosmic microwave background radiation.

Neither of these ideas has yet been accepted as the definitive theory of everything. However, some believe that a fresh perspective on physics is essential for progress, and there is a growing sense that this perspective may be expressed in the language of geometry. Whether that proves true remains to be seen, but it is evident that geometry encompasses far more than just hexagons.



Source: www.newscientist.com

Physicists Unveil Heaviest Known Proton-Emitting Isotope: Astatine-188

At the Accelerator Laboratory of the University of Jyväskylä in Finland, physicists used the focal-plane spectrometer of a gas-filled recoil separator to observe two decay events of the newly discovered isotope astatine-188 (188At), which is composed of 85 protons and 103 neutrons.

Kokkonen et al. report the discovery of the new nucleus 188At, the heaviest proton-emitting isotope known to date.

“Proton emission is a rare type of radioactive decay in which the nucleus emits a proton to move toward stability,” explained Henna Kokkonen, a doctoral researcher at the University of Jyväskylä.

“This new nucleus, 188At, is currently the lightest known isotope of astatine, containing 85 protons and 103 neutrons.”

“Studying this type of exotic nucleus is exceedingly challenging due to its brief lifespan and low production cross-section. Therefore, precise techniques are essential.”

“The nuclei were produced through fusion-evaporation reactions by irradiating a natural silver target with a 84Sr ion beam,” added Dr. Kalle Auranen of the University of Jyväskylä.

“The detection of the new isotope was made possible by the detector setup at the focal plane of the RITU recoil separator.”

In addition to the experimental findings, the physicists expanded theoretical models to interpret the collected data.

According to the team, 188At is strongly deformed, resembling “the shape of a watermelon.”

“The nuclear properties suggest a shift in the behavior of the binding energy of valence protons,” Kokkonen stated.

“This is attributed to unprecedented interactions with heavy nuclei.”

“New isotope discoveries are rare worldwide, and this is the second time I’ve had the chance to be part of one.”

“All experiments pose challenges, and it is rewarding to conduct research that enhances our understanding of the fundamental limits of matter and nuclear structure.”

The authors intend to refine the current uncertainties in the decay energy and half-life through further theoretical exploration of charged-particle-emitting heavy nuclei, to follow the evolution of their shapes, and to examine additional decay events of 188At.

“Equally intriguing would be to study the decay of the currently unknown isotope 187At, which could be a proton-emitting nucleus, something we hope to explore in future experiments,” they concluded.

Their paper was published in the journal Nature Communications.

____

H. Kokkonen et al. 2025. New Proton Emitter 188At signifies unprecedented interactions in heavy nuclei. Nat Commun 16, 4985; doi:10.1038/s41467-025-60259-6

Source: www.sci.news

Physicists Investigate Light’s Interaction with Quantum Vacuums

Researchers have carried out the first real-time 3D simulations of how powerful laser beams alter the quantum vacuum. Remarkably, the simulations reproduce an unusual phenomenon predicted by quantum physics known as vacuum four-wave mixing. In this process, the combined electromagnetic fields of three laser pulses polarize the virtual electron-positron pairs of the vacuum, causing photons to scatter off one another as if they were billiard balls.



Illustration of photon-photon scattering in the laboratory: two green petawatt laser beams collide at focus with a third red beam to polarize the quantum vacuum. This allows the generation of a fourth blue laser beam in a distinct direction and color, conserving momentum and energy. Image credit: Zixin (Lily) Zhang.
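As a simple consistency check of the energy bookkeeping in the caption above (with assumed wavelengths, since the release does not specify them), the sketch below combines two photons from the green beams and subtracts one red photon, following ω₄ = ω₁ + ω₂ − ω₃, and shows that the generated beam ends up bluer than its inputs.

```python
# Energy conservation in four-wave mixing: omega_4 = omega_1 + omega_2 - omega_3.
# The wavelengths below are assumptions for illustration, not the study's values.
c = 2.998e8            # speed of light, m/s

lambda_green = 527e-9  # assumption: frequency-doubled "green" petawatt beams
lambda_red = 800e-9    # assumption: a Ti:sapphire-like "red" third beam

f_generated = c / lambda_green + c / lambda_green - c / lambda_red
lambda_generated = c / f_generated
print(f"generated beam wavelength ~ {lambda_generated * 1e9:.0f} nm")
# ~393 nm, i.e. bluer than either input beam, consistent with the caption
```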

“This is not merely a matter of academic interest. It represents a significant advance toward experimental validation of quantum effects that have largely remained theoretical,” remarks Professor Peter Norreys of the University of Oxford.

The simulation was executed using an enhanced version of OSIRIS, simulation software that models interactions between laser beams and various materials or plasmas.

Zixin (Lily) Zhang, a doctoral student at the University of Oxford, explained:

“By applying the model to a three-beam scattering experiment, we were able to capture a comprehensive spectrum of quantum signatures, along with detailed insights into the interaction region and the principal time scale.”

“We’ve rigorously benchmarked the simulation, enabling our focus to shift to more intricate, exploratory scenarios, like exotic laser beam structures and dynamic focus pulses.”

Crucially, these models furnish the specifics that experimentalists depend on to design accurate real-world tests, encompassing realistic laser configurations and pulse timing.

The simulations also uncover new insights into how these interactions develop in real-time and how subtle asymmetries in beam geometry can influence the outcomes.

According to the team, this tool not only aids in planning future high-energy laser experiments but also assists in the search for signs of hypothetical particles, such as axions and millicharged particles, which are potential dark matter candidates.

“The broader planned experiments at state-of-the-art laser facilities will greatly benefit from the new computational methods implemented in OSIRIS,” noted Professor Luís Silva, a physicist at Instituto Superior Técnico in Lisbon and the University of Oxford.

“The integration of ultra-intense lasers, advanced detection techniques, cutting-edge analysis, and numerical modeling lays the groundwork for a new era of laser-material interactions, opening new avenues for fundamental physics.”

The team’s paper was published today in the journal Communications Physics.

____

Z. Zhang et al. 2025. Computational modeling of semi-real-world quantum vacuums in 3D. Communications Physics 8, 224; doi: 10.1038/s42005-025-02128-8

Source: www.sci.news

Physicists Achieve Unmatched Precision in Measuring the Muon Magnetic Anomaly

Researchers from the Muon G-2 experiment have unveiled their third and final measurement of the muon magnetic anomaly. The result aligns with findings published in 2021 and 2023 but boasts significantly improved precision of 127 parts per billion, surpassing the experiment’s design goal of 140 parts per billion.

Muon particles traveling through lead in the cloud chamber. Image credit: Jino John 1996 / cc by-sa 4.0.

The Muon G-2 experiment investigates the wobble of a fundamental particle known as the Muon.

Muons resemble electrons but are roughly 200 times more massive. Like electrons, they possess a quantum mechanical property called spin, which can be pictured as a tiny internal magnet.

When subjected to an external magnetic field, these internal magnets wobble akin to the axis of a spinning top.

The speed of this precession in a magnetic field is determined by the muon’s properties, captured numerically in the G-factor.

Theoretical physicists derive G-factors based on our current understanding of the universe’s fundamental mechanics, as outlined in the standard model of particle physics.

Nearly a century ago, G was predicted to be exactly 2; however, experimental measurements revealed small deviations from this value, quantified as the muon magnetic anomaly, aμ = (G-2)/2, which gives the Muon G-2 experiment its name.

Muon magnetic anomalies encapsulate the effects of all standard model particles, enabling theoretical physicists to compute these contributions with remarkable precision.

Earlier measurements conducted at the Brookhaven National Laboratory during the 1990s and 2000s indicated potential discrepancies with the theoretical calculations of that era.

Disparities between experimental results and theoretical predictions could signal the existence of new physics.

In particular, physicists contemplated whether these discrepancies could stem from an undetected particle influencing the muon’s precession.

Consequently, physicists opted to enhance the Muon G-2 experiments to obtain more accurate measurements.

In 2013, Brookhaven’s magnetic storage ring was relocated from Long Island, New York, to Fermilab in Batavia, Illinois.

Following extensive upgrades and enhancements, the Fermilab Muon G-2 experiment launched on May 31, 2017.

Simultaneously, an international collaboration among theorists established the Muon G-2 theory initiative aimed at refining theoretical calculations.

In 2020, the Theoretical Initiative released updated and more precise standard model values informed by data from other experiments.

The tension between experiment and theory widened in 2021, when Fermilab announced its first experimental results, corroborating Brookhaven’s findings with improved accuracy.

Simultaneously, new theoretical predictions emerged, relying significantly on computational capabilities.

This information closely aligned with experimental measurements and narrowed the existing discrepancies.

Recently, the Theoretical Initiative published a new set of predictions integrating results from various groups using novel calculation techniques.

This result remains in close agreement with experimental findings and diminishes the likelihood of new physics.

Nevertheless, theoretical endeavors will persist in addressing the disparities between data-driven and computational approaches.

The final experimental value for the muon magnetic anomaly from Fermilab’s experiment is:

aμ = (G-2)/2 = 0.001 165 920 705 (Muon G-2 experiment)
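As a quick arithmetic illustration of the relation quoted above (not part of the collaboration's analysis), the snippet below recovers the muon g-factor from the measured anomaly.

```python
# Recover g from the anomaly a_mu = (g - 2) / 2 using the value quoted above.
a_mu = 0.001165920705
g = 2 * (1 + a_mu)
print(f"muon g-factor: {g:.12f}")  # slightly larger than 2, as described
```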

This final measurement is based on an analysis of data collected over the past three years, spanning 2021 to 2023, and is integrated with previously published datasets.

This has more than tripled the dataset size utilized in the second results from 2023, achieving the precision target set in 2012.

Moreover, it signifies the analysis of the highest quality data from the experiment.

As the second data collection run concluded, the Muon G-2 collaboration finalized adjustments and enhancements to the experiment, boosting muon beam quality and minimizing uncertainties.

“The anomalous magnetic moment of the muon (G-2) is pivotal as it provides a sensitive test of the standard model of particle physics,” remarked Regina Rameika, associate director for high energy physics at the U.S. Department of Energy.

“This is an exhilarating result, and it’s fantastic to witness the experiment reach a definitive conclusion with precise measurements.”

“This highly anticipated outcome represents a remarkable achievement in accuracy and will hold the title of the most precise measurement of muon magnetic anomalies for the foreseeable future.”

“Despite recent theoretical challenges that have lessened the evidence for new physics in Muon G-2, this finding presents a robust benchmark for proposed extensions to the standard model of particle physics.”

“This is an incredibly exciting moment; not only did we meet our objectives, but we surpassed them, despite how challenging such precision measurements are.”

“Thanks to Fermilab, the funding agencies, and the host lab, we accomplished our goals successfully.”

“For over a century, the G-2 has imparted crucial insights into the nature of reality,” stated Lawrence Gibbons, a professor at Cornell University.

“It’s thrilling to contribute accurate measurements that are likely to endure for a long time.”

“For decades, muon magnetic moments have served as a significant benchmark for the standard models,” noted Dr. Simon Kolody, a physicist at Argonne National Laboratory.

“The new experimental results illuminate this fundamental theory and establish a benchmark to guide new theoretical calculations.”

These new results will be featured in the journal Physical Review Letters.

Source: www.sci.news

CERN Physicists Witness the Transformation of Lead into Gold

Collisions involving high-energy lead nuclei at CERN’s Large Hadron Collider generate a powerful electromagnetic field capable of displacing protons and converting lead into ephemeral gold nuclei.



The lead ions (208Pb) in the LHC pass by one another without colliding directly. During electromagnetic dissociation, a photon interacts with the nucleus, exciting internal vibrations that result in the ejection of a small number of neutrons (here two) and protons (three), leaving behind a gold nucleus (203Au). Image credit: CERN.

The transformation of base metal lead into the precious metal gold was a long-held aspiration of medieval alchemists.

This enduring pursuit, known as chrysopoeia, may have been spurred by the recognition that the relatively common lead, with its dull gray color, bears some resemblance to gold.

It has since been established that lead and gold are fundamentally different chemical elements, and that chemical means cannot facilitate their conversion.

The advent of nuclear physics in the 20th century uncovered the possibility of transforming heavy elements into others through processes such as radioactive decay or in laboratory settings involving bombardment by neutrons or protons.

Gold has been artificially produced by such means before, but physicists from the ALICE Collaboration at CERN’s Large Hadron Collider (LHC) have now measured lead’s conversion into gold via a different mechanism: near-miss interactions between lead nuclei at the LHC.

High-energy collisions between lead nuclei can create quark-gluon plasma, a hot, dense state of matter believed to have filled the universe shortly after the Big Bang and from which the matter we know today emerged.

Meanwhile, in the far more frequent cases where the nuclei pass close to each other without touching, the intense electromagnetic fields they generate can induce photon-nucleus interactions, opening further avenues of study.

The electromagnetic field produced by a lead nucleus is particularly strong because it contains 82 protons, each carrying one elementary charge.

Additionally, because the lead nuclei in the LHC travel at nearly the speed of light, their electromagnetic field lines are compressed into a thin pancake transverse to the direction of motion, producing short-lived pulses of photons.

This phenomenon often triggers electromagnetic dissociation, where photons interact with the nucleus, causing vibrations in its internal structure and leading to the release of a limited number of neutrons and protons.

To fabricate gold (with 79 protons), three protons must be removed from the lead nuclei in the LHC beam.

“It is remarkable to witness our detectors managing direct collisions that produce thousands of particles, while being sensitive to scenarios where merely a few particles are generated,” said a researcher.

The ALICE team employed its zero degree calorimeters (ZDC) to count photon-nucleus interactions that eject zero, one, two, or three protons, corresponding to the production of lead, thallium, mercury, and gold, respectively.

While thallium and mercury are produced more frequently, the results indicate that the LHC currently generates gold at a rate of approximately 89,000 nuclei per second from lead-lead collisions at the ALICE collision point.

These gold nuclei emerge from the collisions with extremely high energies and strike the LHC beam pipe or collimators at various points downstream, where they immediately fragment into individual protons, neutrons, and other particles; the gold exists for only a tiny fraction of a second.

The analysis from Alice shows that roughly 86 billion gold nuclei were produced during four significant experiments across two runs of the LHC, equating to only 29 picograms (2.9*10-11 g) in mass.
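That figure can be checked with a quick back-of-the-envelope calculation, assuming roughly 86 billion gold-203 nuclei and taking one atomic mass unit as about 1.66 × 10⁻²⁴ g (the numbers below are rounded, not the collaboration’s exact values).

    # Rough mass estimate for ~86 billion gold-203 nuclei.
    AMU_G = 1.66e-24                # grams per atomic mass unit
    n_nuclei = 86e9                 # ~86 billion nuclei
    mass_per_nucleus = 203 * AMU_G  # gold-203
    total_g = n_nuclei * mass_per_nucleus
    print(f"{total_g:.1e} g, i.e. ~{total_g * 1e12:.0f} picograms")  # ~2.9e-11 g, ~29 pg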

With ongoing upgrades steadily increasing the LHC’s luminosity, Run 3 has already yielded almost double the amount of gold observed in Run 2, although the total remains trillions of times less than would be needed to make a piece of jewelry.

Though the technological aspirations of medieval alchemists have been partially fulfilled, their dreams of acquiring wealth have yet again been dashed.

“Thanks to the unique capabilities of the ALICE ZDCs, our analysis is the first to systematically detect and study the signature of gold production at the LHC,” said Dr. Uliana Dmitrieva, a member of the ALICE Collaboration.

“These results extend beyond fundamental physics interests and serve to test and refine theoretical models of electromagnetic dissociation, improving our understanding of beam loss— a significant factor influencing the performance limitations of the LHC and future colliders,” adds Dr. John Jowett, also of the Alice Collaboration.

The study was published in the journal Physical Review C.

____

S. Acharya et al. (ALICE Collaboration). 2025. Proton emission in ultraperipheral Pb-Pb collisions at √sNN = 5.02 TeV. Phys. Rev. C 111, 054906; doi: 10.1103/PhysRevC.111.054906

Source: www.sci.news

Physicists Unveil a Novel Quantum Theory of Gravity


A novel theory formulated by physicists at Aalto University provides a new perspective on gravity that aligns with established particle physics models, paving the way to understanding the universe’s origins.

The standard model of particle physics describes the electromagnetic, weak, and strong interactions, three of the four fundamental forces of nature. Unifying them with gravity has remained elusive because of the incompatibility between the general theory of relativity and quantum field theory. While quantum field theory employs compact, finite-dimensional symmetries linked to the internal degrees of freedom of the quantum fields, general relativity is grounded in the noncompact symmetry of external space-time. Mikko Partanen and Jukka Tulkki aim to construct a gauge theory of gravity with a compact symmetry, analogous to the way the fundamental interactions are formulated in the standard model. Image credit: DESY / Science Communication Lab.

“If this research leads to a comprehensive quantum field theory of gravity, it will ultimately address the challenging question of understanding the singularities in black holes and the Big Bang,” stated Dr. Mikko Partanen from Aalto University.

“Theories that effectively unify all fundamental natural forces are often referred to as ‘theory of everything.’

“Several fundamental questions in physics remain unresolved. Current theories do not elucidate why the observable universe exhibits a greater abundance of matter than antimatter.”

The breakthrough lay in formulating gravity through the appropriate gauge theory, which describes how particles interact via fields.

“The most recognized gauge field is the electromagnetic field,” remarked Dr. Jukka Tulkki from Aalto University.

“When charged particles interact, they do so through electromagnetic fields. This represents the proper gauge field.”

“Therefore, if particles possess energy, their interactions will occur through the gravitational field simply because energy exists.”

One of the significant challenges physicists have encountered is discovering a theory of gravity that aligns with the gauge theories governing the three fundamental forces: electromagnetic force, weak nuclear force, and strong nuclear force.

The standard model of particle physics serves as a gauge theory that describes these three forces, characterized by specific symmetries.

“The core concept is to avoid basing your theory on the fundamentally distinct space-time symmetries of general relativity, but rather to establish a gravity gauge theory with symmetry that resembles the standard model’s symmetry,” Dr. Partanen explained.

Without such a theoretical framework, physicists cannot reconcile the two most potent theories at our disposal: quantum field theory and general relativity.

Quantum theory provides insights into the behavior of small particles in a stochastic manner, while general relativity describes the gravitational interactions of massive, familiar objects.

Both theories offer unique perspectives on our universe and have been validated with remarkable accuracy, yet they remain incompatible with each other.

Moreover, because gravity is so weak, extremely high precision is required to probe genuine quantum gravity effects beyond the classical theory of general relativity.

“Understanding the quantum theory of gravity is crucial for deciphering phenomena occurring in high-energy gravitational fields,” noted Dr. Partanen.

“These phenomena are particularly relevant in the vicinity of black holes, during the moments following the Big Bang, and in the early universe, areas where existing physical theories fail to apply.”

“I’ve always been captivated by such a grand problem in physics, which inspired me to explore a new symmetry-based approach to gravity theory and begin developing ideas,” he added.

“The resulting work promises to usher in a new era of scientific comprehension, akin to how understanding gravity enabled the creation of GPS technology.”

The theory holds great promise, but the researchers caution that their evidence collection is still ongoing.

The theory relies on renormalization, a mathematical technique for handling the infinities that arise in the calculations.

Currently, Dr. Partanen and Dr. Tulkki have demonstrated its effectiveness to a certain degree for the so-called “first-order” term, but they need to ensure that these infinities can be navigated throughout the calculations.

“If renormalization fails at higher orders, the results of the calculations diverge and become meaningless,” Dr. Tulkki explained.

“Hence, demonstrating the continuation of this process is critical.”

“While we still need to gather comprehensive evidence, we are optimistic about our chances for success,” he remarked.

“Challenges remain, but with time and perseverance, I hope they will be surmountable,” Dr. Partanen reflected.

“I cannot predict when, but I expect to gain more insights in the coming years.”

The team’s paper has been published in the journal Reports on Progress in Physics.

____

Mikko Partanen & Jukka Tulkki. 2025. Gravity generated by four one-dimensional unitary gauge symmetries and the Standard Model. Rep. Prog. Phys. 88, 057802; doi: 10.1088/1361-6633/adc82e


Source: www.sci.news

Physicists Claim Gravity Arises from Our Universe’s Computational Processes

Melvin Vopson, a physicist from the University of Portsmouth, introduces a novel perspective on gravity.

This artist’s impression illustrates the evolution of the universe, starting with the Big Bang on the left. Then, the microwave background is depicted, followed by the formation of the first stars, which ends the dark ages of the universe, and continues with the emergence of galaxies. Image credit: M. Weiss/Harvard – Smithsonian Center for Astrophysics.

There is a long-standing idea that the universe is fundamentally informational and operates like a computational process, a perspective shared by a number of notable thinkers.

This line of thinking emerges from the domain of information physics, suggesting that physical reality is fundamentally composed of structured information.

In his latest paper, Dr. Vopson presents findings indicating that gravity stems from a computational process inherent in the universe.

He posits that gravity may be influenced by the organization of information related to matter throughout the universe.

Employing the second law of information dynamics, he argues that the behavior of matter and objects in the universe can be understood as the universe endeavoring to organize and compress information.

“My findings support the notion that the universe might operate like a vast computer, or that our reality represents a simulated configuration,” Dr. Vopson remarked.

“In the same way that computers strive to save space and enhance efficiency, the universe may do the same.”

“This presents a new outlook on gravity—it’s about the universe’s effort to stay organized, rather than simply pulling.”

Dr. Vopson has previously posited that information is fundamental and that all elementary particles harbor self-information, similar to how cells in biological entities carry DNA.

The current paper describes how the pixelation of space into elementary cells can serve as a medium for data storage, and how the information registered in these cells encodes the physical properties and coordinates of matter in space-time.

Each cell is capable of registering information in binary format, meaning an empty cell records a digital 0, while a cell containing matter records a digital 1.

“This process mirrors the design of a digital computer game, a virtual reality application, or other advanced simulations,” Dr. Vopson explained.

“As a single cell can accommodate multiple particles, the system evolves by relocating particles in space, merging them into a singular large particle within a single cell.”

“Under the rules of this computational system, the merging produces an attraction, because it minimizes the information content and thereby reduces the computational demand.”

“In simple terms, tracking and calculating the position and momentum of a single object is much more computationally efficient than managing multiple objects.”

“Therefore, gravitational attraction appears as yet another optimization mechanism within the computational process aimed at compressing information.”

“This study offers a fresh insight into gravity, suggesting that gravitational attraction arises from a fundamental drive to decrease the information entropy of the universe.”

“The findings reveal significant conceptual and methodological distinctions, suggesting that gravity functions as a computational optimization process where matter self-organizes to lessen the complexity of encoding within space-time.”

“The broader implications of this work encompass fundamental physics topics, including black hole thermodynamics, dark matter, dark energy considerations, and potential links between gravity and quantum information theory.”

“The question of whether the universe is fundamentally a computational structure remains unresolved.”
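A toy way to see the claimed economy, which is not Dr. Vopson’s actual formalism, is to count the Shannon information needed to record which cells of a pixelated space are occupied: the fewer occupied cells, the fewer bits required. The Python sketch below uses an assumed grid size purely for illustration.

    # Bits needed to specify which k cells of an N-cell grid are occupied:
    # roughly log2(C(N, k)). Merging matter into fewer cells lowers the count.
    from math import comb, log2

    N = 1_000_000                      # hypothetical number of elementary cells
    for k in (10, 5, 2, 1):            # number of occupied cells
        bits = log2(comb(N, k))
        print(f"{k} occupied cell(s): ~{bits:.1f} bits")
    # Fewer occupied cells -> less information to track, mirroring the idea that
    # merging particles reduces the information content of the configuration.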

The paper was published on April 25, 2025, in the journal AIP Advances.

____

Melvin M. Vopson. 2025. Is gravity evidence of a computational universe? AIP Advances 15, 045035; doi: 10.1063/5.0264945

Source: www.sci.news

Physicists develop innovative form of structured light: Optical rotatum

According to a team of Harvard physicists, the structure of the optical rotatum evolves along a logarithmic spiral.

The evolution of light beams carrying optical rotatum as a function of propagation distance. Image credit: Dorrah et al., doi: 10.1126/sciadv.adr9092.

“This is a new behavior of light consisting of optical vortices that propagate through space and change in an unusual way,” said Professor Federico Capasso, senior author of the study.

“It could potentially be used to manipulate small particles.”

With a unique twist, the researchers discovered that their light beams, which carry orbital angular momentum, evolve in a mathematically recognizable pattern found throughout nature.

Echoing the Fibonacci sequence, their optical rotatum evolves along a logarithmic spiral, the pattern seen in nautilus shells, sunflower seed heads, and tree branches.
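A logarithmic spiral has the form r = a·exp(b·θ); choosing the growth rate so that the radius increases by the golden ratio every quarter turn gives the golden spiral loosely associated with Fibonacci growth. The short Python sketch below illustrates that geometry with arbitrary values; it does not reproduce the beam parameters of the study.

    # Sample points on a golden logarithmic spiral r = a * exp(b * theta).
    import math

    phi = (1 + math.sqrt(5)) / 2        # golden ratio ~1.618
    a = 1.0                             # starting radius (arbitrary units)
    b = math.log(phi) / (math.pi / 2)   # radius grows by phi per quarter turn

    for i in range(9):                  # two full turns in quarter-turn steps
        theta = i * math.pi / 2
        print(f"theta = {theta:5.2f} rad, r = {a * math.exp(b * theta):7.3f}")
    # The radius grows by a factor of ~1.618 each quarter turn: 1, 1.62, 2.62, ...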

“It was one of the unexpected highlights of this study,” said Dr. Ahmed Dorrah, the first author of the study.

“Hopefully we can help others, who are experts in applied mathematics, to further study these light patterns and gain unique insight into their universal signature.”

The study builds on the team’s previous research using flat lenses etched with nanostructures to create light beams whose polarization and orbital angular momentum are controlled along the propagation path, converting the input light into structures that change as the beam travels.

Now the researchers have introduced another degree of freedom: the beam’s optical torque changes as it propagates.

“We show even more versatility in control and we can do it on a continuous basis,” said Alfonso Palmieri, co-author of the study.

Potential use cases for such exotic beams include manipulating very small particles, such as colloids in suspension, by exerting new kinds of forces through the light’s unusual torque.

It could also enable more precise optical tweezers for delicate manipulations.

Others have demonstrated light with changing torque using high-intensity lasers and bulky setups, but the researchers created theirs with a single liquid-crystal device and a low-intensity beam.

By showing that the effect can be produced with industry-compatible, integrated devices, they have lowered the barrier to turning the technology into reality compared with previous demonstrations.

“Our work expands the existing literature on structured light, offering new modalities for light-matter interaction and sensing, and suggesting analogous effects in condensed-matter physics and Bose-Einstein condensates,” they concluded.

The study was published in the journal Science Advances.

____

Ahmed H. Dorrah et al. 2025. Rotatum of light. Science Advances 11 (15); doi: 10.1126/sciadv.adr9092

Source: www.sci.news

Physicists demonstrate innovative method for producing livermorium (element 116)

Using the 88-Inch Cyclotron at Lawrence Berkeley National Laboratory, an international team of physicists has created two atoms of livermorium (atomic symbol Lv) using a titanium beam for the first time, a breakthrough on the path to the lab’s attempt to create the new element 120.



Livermorium was made by Gates et al. by fusing isotopes of titanium and plutonium. Image credit: Jenny Nuss, Lawrence Berkeley National Laboratory.

Currently there are 118 known elements, 90 of which occur naturally on Earth.

Elements heavier than fermium (which has 100 protons) must be created by fusing the nuclei of two lighter elements, but not all combinations work.

The heaviest currently known elements were created by fusing a particular isotope of calcium, calcium-48 (containing 20 protons and 28 neutrons), with heavier target elements, but this method only works up to element 118 (oganesson).

Calcium-48’s special (so-called magic) numbers of protons and neutrons make fusion more likely and help the resulting compound nucleus survive.

But to go further, scientists need new techniques.

In the new experiment, Dr. Jacklyn Gates of Lawrence Berkeley National Laboratory and colleagues made a major breakthrough by accelerating a beam of titanium-50 (containing 22 protons and 28 neutrons) with the 88-Inch Cyclotron and fusing it with nuclei of plutonium-244 (containing 94 protons and 150 neutrons).

Over 22 days, the physicists successfully produced two atoms of livermorium, the chemical element with symbol Lv and atomic number 116.
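The proton bookkeeping of the reaction is simple to verify: titanium contributes 22 protons and plutonium 94, giving the 116 protons of livermorium (neutron evaporation from the compound nucleus is ignored in this check).

    # Proton count for the fusion of titanium-50 with plutonium-244.
    Z_TITANIUM = 22
    Z_PLUTONIUM = 94
    z_compound = Z_TITANIUM + Z_PLUTONIUM
    assert z_compound == 116           # livermorium (Lv)
    print("Z of compound nucleus:", z_compound)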

The experiment shows that elements beyond oganesson could potentially be created at Berkeley Lab using titanium beams.

However, creating element 120 is expected to be 10-20 times more difficult than Livermorium.

If successful, element 120 would be the heaviest element known, offering a new opportunity to explore the outermost limits of atomic structure and to further test theories of nuclear physics.

“This reaction had never been demonstrated before, and it was essential to prove that it was possible before embarking on an attempt to make element 120,” Dr. Gates said.

“Creating a new element is a very rare feat. It’s exciting to be part of the process and to have a promising path forward.”

“This was an important first step: trying to make something a little easier than a new element, to see how moving from a calcium beam to a titanium beam changes the rate at which these elements are produced,” said Dr. Jennifer Pore of Lawrence Berkeley National Laboratory.

“When we are trying to create these incredibly rare elements, we are at the absolute edge of human knowledge and understanding. There is no guarantee that physics will work as expected.”

“Using titanium to create element 116, we now have the ability to verify that this production method works and plan the hunt for element 120.”

The team’s paper was published in the journal Physical Review Letters.

____

J.M. Gates et al. 2024. Toward the Discovery of New Elements: Production of Livermorium (Z = 116) with 50Ti. Phys. Rev. Lett. 133, 172502; doi: 10.1103/PhysRevLett.133.172502

Source: www.sci.news

Physicists at KATRIN determine upper limit on the mass of neutrinos

Physicists of the Karlsruhe Tritium Neutrino (KATRIN) experiment have reported the most precise measurement yet of the upper limit on the neutrino mass, establishing it as 0.45 electronvolts (eV), less than a millionth of the electron mass.



Interior view of the main spectrometer of KATRIN. Image credit: M. Zacher / KATRIN Collaboration.

Neutrinos are the most abundant particles in the universe and exist as three different types or flavors: Electron Neutrino, Muon Neutrino, and Tau Neutrino.

These flavors oscillate: a single neutrino can convert from one type to another as it travels, providing compelling evidence that neutrinos have mass and contradicting the original assumption of massless neutrinos in the Standard Model.

But their exact mass remains one of the great mysteries of particle physics.

In a new paper in the journal Science, physicists from the KATRIN collaboration present the results of the first five measurement campaigns of the KATRIN experiment.

“The KATRIN experiment determines the mass of neutrinos by analyzing the beta decay of tritium,” they explained.

“During this decay, a neutron is converted into a proton, releasing an electron and an electron antineutrino, the latter being the antiparticle of the neutrino.”

“We can infer the neutrino mass by analyzing how the total decay energy is shared between the emitted electron and the electron antineutrino.”
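In its simplest textbook form, neglecting final-state and instrumental effects, the electron energy spectrum near the tritium endpoint E₀ depends on the neutrino mass m_ν as

    dΓ/dE ∝ (E₀ − E) · √((E₀ − E)² − m_ν²c⁴),  for E ≤ E₀ − m_νc²

so a nonzero neutrino mass pulls the endpoint down and distorts the last few electronvolts of the spectrum, which is the signature KATRIN looks for.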

Over 259 days of data-taking between 2019 and 2021, KATRIN physicists measured the energies of approximately 36 million electrons, a dataset six times larger than that of the previous analysis.

The findings establish the most stringent laboratory-based upper limit on the effective electron neutrino mass, placing it below 0.45 eV at a 90% confidence level.

This is the experiment’s third neutrino-mass result and improves on the previous limit by almost a factor of two.

“For this result, we analyzed five measurement campaigns; the data collected from 2019 to 2021 represent about a quarter of the total data expected from KATRIN,” said Dr. Kathrin Valerius, one of the two co-spokespersons of the KATRIN experiment and a physicist at the Karlsruhe Institute of Technology.

“In each campaign, we gained new insights and further optimized the experimental conditions,” said Dr. Susanne Mertens, a physicist at the Max Planck Institute for Physics and the Technical University of Munich.

____

M. Aker et al. (KATRIN Collaboration). 2025. Direct neutrino-mass measurement based on 259 days of KATRIN data. Science 388 (6743): 180-185; doi: 10.1126/science.adq9592

Source: www.sci.news

Physicists Develop Shape-Recovering Liquids

According to a team of physicists at the University of Massachusetts Amherst, the newly discovered shape-recovering liquids defy long-held expectations derived from the laws of thermodynamics.

This image shows emulsion droplets stabilized by silica nanoparticles with nickel nanoparticles remaining on the drop surface. Image credit: Raykh et al. , doi: 10.1038/s41567-025-02865-1.

“Imagine your favorite Italian salad dressing,” said Professor Thomas Russell of the University of Massachusetts Amherst.

“It consists of oil, water and spices, and you shake it to mix the ingredients before pouring it over your salad.”

“It is those spices that allow the oil and water, which normally refuse to mix, to blend in a process called emulsification: small bits of spice sit at the boundary between the two liquids, a behavior described by the laws of thermodynamics.”

“Emulsification underlies a vast amount of technology and applications that go far beyond seasonings,” said Anthony Raykh, a graduate student at the University of Massachusetts Amherst.

“One day I was in the lab mixing a batch of this science salad dressing to see what I could create. Instead of spices, I used magnetized nickel particles, because materials containing magnetic particles can be designed to have all sorts of interesting and useful properties.”

“I made the mixture and shook it, and to my total surprise, it formed this beautiful, pristine urn shape.”

“No matter how many times or how violently I shook it, the shape always returned.”

Using additional lab experiments and simulations, the researchers determined that the mysterious phenomenon is explained by the particles’ strong magnetization.

“A very close look at the individual magnetized nickel nanoparticles that form the water-oil boundary gives you very detailed information on how the different morphologies are assembled.”

“In this case, the particles are magnetized so strongly that the assembly interferes with the emulsification process described by the laws of thermodynamics.”

Particles added to oil-and-water mixtures usually reduce the tension at the interface between the two liquids, allowing them to mix.

In a twist, however, the strongly magnetized particles actually increase the interfacial tension, bending the oil-water boundary into an elegant curve.

“When you see something impossible, you have to investigate,” Professor Russell said.

“We don’t have any applications yet in our discoveries, but we look forward to seeing how unprecedented states will affect the field of soft matter physics,” added Raykh.

The team’s work appears in the journal Nature Physics.

____

A. Raykh et al. 2025. Shape-recovering liquids. Nat. Phys., published online April 4, 2025; doi: 10.1038/s41567-025-02865-1

Source: www.sci.news

Physicists created a force-feeding-free version of Foie Gras

Thomas Vilgis, a food physicist at the Max Planck Institute for Polymer Research in Germany, has been in love with foie gras for a quarter century. The luxurious delicacy is a paste or mousse made from the rich, fatty liver of ducks and geese.

“It’s truly extraordinary,” Dr. Vilgis said, recalling his early encounters with high-quality foie gras when he lived and worked in Strasbourg, France. It was soft and buttery, and as the fat started to melt in his mouth, the flavor evolved and exploded. “It’s like fireworks. Suddenly there’s a feeling of the whole liver,” he said.

But such transcendence is at a price.

To fatten the livers used to create foie gras, farmers force-feed the birds more grain than their bodies need. The excess food is stored as fat in the animal’s liver, which balloons in size.

Dr. Vilgis sometimes eats foie gras produced by local farmers, but he cannot stand the practice on an industrial scale. “It’s terrible to watch,” he says.

Dr. Vilgis wondered whether “we could make a similar product, but without this torture.”

In a paper published on Tuesday in the journal Physics of Fluids, he and his colleagues report a technique they believe allows ducks and geese to eat and grow normally. But to be clear, this is not a foie gras replacement that spares the lives of the birds.

His lab’s approach uses enzymes to break down duck fat; a mixture of regular duck liver and the treated fat is then finished in the same way as traditional foie gras. “Of course, it’s not a 100 percent match, but we’re very close,” Dr. Vilgis said.

“It’s far better than many other products that try to simulate foie gras,” he said, referring to attempts that use plant fats (“not the same flavor, no melting, nothing,” he said) or collagen (“this turns out to be gummy,” he said).

Devising this approach was full of failure. When the team tried simply to combine regular duck liver with untreated fat, regardless of the ratio, the result was not foie gras.

“The mechanical properties are different,” he said. “The fat distribution is different. Everything wasn’t working.”

Researchers tried to add emulsifiers and later gelatin from bird skin and bones, but consistency was off.

Dr. Vilgis then thought about what happens inside the bird’s body during force-feeding. Ducks and geese digest all that excess food using, among other things, an enzyme called lipase, which acts like a pair of molecular scissors: it cuts fat molecules into smaller pieces that can “rearrange and crystallize in different shapes,” he said. The crystallized fats form irregular clusters surrounded by a matrix of liver proteins, giving foie gras its luxurious flavor and texture.

That was the key insight. “We just did in the lab what happens in the small intestine,” Dr. Vilgis said. When the team treated duck fat with lipase, mixed it with regular liver, and studied the result using X-ray scattering and other techniques, it was markedly similar to foie gras.

“The mechanical properties match the properties of foie gras very well,” he said. “This really made me happy because foie gras contains so much basic physics.”

But most importantly, it tasted right. Dr. Vilgis was surprised and pleased when he first sampled the faux foie gras. The team had gotten the melting point and fat clustering exactly right. “This trick gives you fat that melts in your mouth, which is essential,” he said. Dr. Vilgis has secured a patent for the process.

Roseanna Zia, a mechanical and chemical engineer at the University of Missouri who was not involved in the research, praised the work for overcoming key challenges. “One of the difficult things about engineering is to translate what people like and want,” she said.

She explained that foie gras belongs to a class of soft solids that includes butter, chocolate, mayonnaise and ice cream: it looks like a solid, but when spread with a knife, it moves like a liquid. She praised researchers like Dr. Vilgis who can manipulate the behavior of this type of complex material.

He acknowledges that his formulation is “not vegetarian, not vegan.” But as long as foie gras is produced and consumed, Dr. Vilgis hopes his method will allow at least some farmers to “reduce the suffering of the animals a little.”

Source: www.nytimes.com

Physicists generate quantum tornadoes in momentum space

Physicists have long known that electrons can form vortices in quantum materials. What's new is evidence that these tiny particles create tornado-like structures in momentum space.

In the quantum material tantalum arsenide (TaAs), electrons form vortices in momentum space. Image credit: Think-Design / Jochen Thamm.

Momentum space is a fundamental physics concept that explains electron motion in terms of energy and orientation rather than precise physical location.

Its counterpart, position space, is where familiar phenomena such as water vortices and hurricanes occur.

Until now, quantum vortices in materials had been observed only in position space.

Eight years ago, Dr. Roderich Moessner of the Max Planck Institute for the Physics of Complex Systems and the Würzburg-Dresden Cluster of Excellence ct.qmat theorized that quantum tornadoes could also form in momentum space.

At the time, he described the phenomenon as a smoke ring, because, like a ring of smoke, it is made up of vortices.

But up until now, no one knew how to measure them.

To detect quantum tornadoes in momentum space, Dr. Moessner and colleagues enhanced a well-established technique called ARPES (angle-resolved photoemission spectroscopy).

"ARPES is a fundamental tool in experimental solid-state physics," explained Dr. Maximilian Ünzelmann, a researcher at the University of Würzburg and the Würzburg-Dresden Cluster of Excellence ct.qmat.

"It involves shining light on a material sample, extracting electrons, and measuring their energy and exit angle."

“This allows us to see the electronic structure of the material directly in the momentum space.”

“By skillfully adapting this method, we were able to measure orbital angular momentum.”

The team's work appears in the journal Physical Review X.

____

T. Figgemeier et al. 2025. Imaging Orbital Vortex Lines in Three-Dimensional Momentum Space. Phys. Rev. X 15, 011032; doi: 10.1103/PhysRevX.15.011032

Source: www.sci.news

Physicists suggest that ultra-high energy cosmic rays originate from neutron star mergers

Ultra-high energy cosmic rays are the highest energy particles in the universe, and their energy is more than one million times greater than what humans can achieve.

Professor Farrar proposes that the merger of binary neutron stars is the source of all or most ultra-high energy cosmic rays. This scenario can explain the unprecedented, mysterious range of ultra-high energy cosmic rays, as the jets of binary neutron star mergers are generated by gravity-driven dynamos and therefore are roughly the same due to the narrow range of binary neutron star masses. Image credit: Osaka Metropolitan University / L-Insight, Kyoto University / Riunosuke Takeshige.

The existence of ultra-high energy cosmic rays has been known for nearly 60 years, but astrophysicists have not been able to formulate a satisfactory explanation of the origins that explain all observations to date.

A new theory introduced by Professor Glennys Farrar of New York University provides a viable and testable explanation of how ultra-high energy cosmic rays are created.

“After 60 years of effort, it is possible that the origins of the mysterious highest energy particles in the universe have finally been identified,” Professor Farrar said.

“This insight provides a new tool for understanding the most intense events in the universe: the merger of two neutron stars to form a black hole, the process responsible for creating many precious or exotic elements, including gold, platinum, uranium, iodine, and xenon.”

Professor Farrar proposes that ultra-high energy cosmic rays are accelerated in the turbulent, magnetized outflow ejected from the remnant of a binary neutron star merger, before the final black hole forms.

This process simultaneously generates powerful gravitational waves, some of which have already been detected by scientists of the LIGO-Virgo collaboration.

“For the first time, this work explains two of the most mystifying features of ultra-high energy cosmic rays: the tight correlation between energy and charge, and the extraordinary energies of the handful of most extreme events,” Professor Farrar said.

“This study makes two predictions that can be tested experimentally in future work:

(i) ultra-high energy cosmic rays should include rare ‘r-process’ elements such as xenon and tellurium, motivating a search for such components in ultra-high energy cosmic ray data; and

(ii) the very-high-energy neutrinos produced in collisions of ultra-high-energy cosmic rays are necessarily accompanied by the gravitational waves generated by the binary neutron star merger.”

The study appears in the journal Physical Review Letters.

____

Glennys R. Farrar. 2025. Binary Neutron Star Mergers as the Source of the Highest Energy Cosmic Rays. Phys. Rev. Lett. 134, 081003; doi: 10.1103/PhysRevLett.134.081003

Source: www.sci.news

Physicists at CERN witness the creation of weak boson triplet

Physicists from the ATLAS collaboration have presented the first observation of VVZ production at CERN's Large Hadron Collider, a rare process in which three massive vector bosons are produced together.

A three-vector-boson event recorded by ATLAS, in which one W boson decays into an electron and a neutrino, another decays into a muon and a neutrino, and a Z boson decays into two muons. Muons are shown as red lines, the electron as a green line, and the "missing energy" carried away by the neutrinos as white lines. Image credit: ATLAS / CERN.

As carriers of weak forces, W and Z bosons are central to standard models of particle physics.

Accurate measurements of multiboson production processes provide excellent testing of standard models and shed light on new physical phenomena.

“The production of three vector (V) bosons is a very rare process at the LHC,” said Dr. Fabio Cerutti, ATLAS physics coordinator.

“The measurement provides information about the interactions between multiple bosons linked to the symmetry underlying the standard model.”

“It is a powerful tool to uncover new physics phenomena, such as new particles that are too heavy to be produced directly in LHC.”

The ATLAS team observed VVZ production with a statistical significance of 6.4 standard deviations, exceeding the five-standard-deviation threshold required to claim an observation.
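For context, a significance quoted in standard deviations maps onto a one-sided Gaussian p-value, the convention used in particle physics; the snippet below (a generic illustration using SciPy, not ATLAS software) shows why five standard deviations is treated as the observation threshold.

    # Convert a significance in standard deviations to a one-sided p-value.
    from scipy.stats import norm

    for sigma in (3.0, 5.0, 6.4):
        p = norm.sf(sigma)             # survival function: P(Z > sigma)
        print(f"{sigma:.1f} sigma -> p ~ {p:.1e}")
    # 5.0 sigma -> ~2.9e-07 (observation threshold); 6.4 sigma -> ~8e-11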

This observation extends previous results from the ATLAS and CMS collaborations, including the observation of VVV production by CMS and of WWW production by ATLAS.

As some of the heaviest known particles, W and Z bosons can decay in many different ways.

In the new study, ATLAS physicists focused on seven decay channels with the highest discovery potential.

These channels were further refined using a machine-learning technique called boosted decision trees, with the algorithm for each channel trained to identify the desired signal.
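As a rough illustration of the idea (not the ATLAS analysis, which uses its own software and real event features), a boosted-decision-tree classifier can be trained on labeled signal and background examples and its output score used to select signal-like events; the hypothetical scikit-learn sketch below does this with synthetic data.

    # Generic boosted-decision-tree sketch with synthetic data (illustrative only).
    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n = 5000
    background = rng.normal(0.0, 1.0, size=(n, 2))   # toy "kinematic" features
    signal = rng.normal(0.8, 1.0, size=(n, 2))       # slightly shifted for signal
    X = np.vstack([background, signal])
    y = np.array([0] * n + [1] * n)

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    bdt = GradientBoostingClassifier(n_estimators=200, max_depth=3)
    bdt.fit(X_train, y_train)

    scores = bdt.predict_proba(X_test)[:, 1]          # signal-likeness score
    print("test accuracy:", round(bdt.score(X_test, y_test), 2))
    print("fraction passing a 0.8 score cut:", round(float((scores > 0.8).mean()), 2))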

By combining the decay channels, the researchers were able to observe VVZ production and set limits on possible contributions of new physics phenomena to the signal.

“The resulting limits confirm the validity of the Standard Model and are consistent with previous results on the production of three vector bosons,” they said.

“Analysis of the LHC's third run and of the larger datasets from the future High-Luminosity LHC will further improve measurements of three-vector-boson production, deepening our understanding of these fundamental particles and their role in the universe.”

The team's results will be published in the journal Physics Letters B.

____

ATLAS Collaboration. 2025. Observation of VVZ production at √s = 13 TeV with the ATLAS detector. Phys. Lett. B, in press; arXiv: 2412.15123

Source: www.sci.news

Physicists Chart the Forces Inside Protons

Dr. Ross Young of the University of Adelaide and colleagues in the QCDSF collaboration are investigating the structure of subatomic matter, seeking further insight into the forces that underpin the natural world. Their result is perhaps the smallest map of a force field ever produced.

Distribution of the color Lorentz force acting on unpolarized quarks in the transverse plane (indicated by the vector field), superimposed on the up-quark density distribution in impact-parameter space for an unpolarized proton. Image credit: Crawford et al., doi: 10.1103/PhysRevLett.134.071901.

“We used a powerful computational technique called lattice quantum chromodynamics to map the forces acting within protons,” Dr. Young said.

“This approach allows us to break space and time into a fine grid and simulate how the strong force, the fundamental interaction that binds quarks into protons and neutrons, varies across different regions within the proton.”

“Our findings show that even on these tiny scales the forces involved are immense, reaching up to 500,000 newtons, roughly the weight of 10 elephants, compressed into a space far smaller than an atomic nucleus,” said University of Adelaide Ph.D. student Joshua Crawford.
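The elephant comparison is easy to sanity-check: assuming an elephant mass of roughly 5,000 kg (an assumed figure, not one from the study), ten elephants weigh about half a million newtons.

    # Order-of-magnitude check of the "10 elephants" comparison.
    g = 9.81                          # m/s^2
    elephant_mass_kg = 5_000          # assumed typical mass
    weight_ten_elephants = 10 * elephant_mass_kg * g
    print(f"~{weight_ten_elephants:.0f} N")   # ~490,000 N, close to 500,000 N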

These force maps provide a new way to understand the complex internal dynamics of the proton and help explain how it behaves in experiments that probe the fundamental structure of matter in high-energy collisions, such as those at CERN's Large Hadron Collider.

“Edison didn't invent the light bulb by studying brighter candles; he built on generations of scientists who studied how light interacts with matter,” Dr. Young said.

“In much the same way, modern research such as ours examines how the basic building blocks of matter respond when struck by light, deepening our understanding of nature at its most fundamental level.”

“As researchers continue to unravel the inner structure of protons, greater insights could help improve the way protons are used in cutting-edge technologies.

“One of the most notable examples is proton therapy, which uses high-energy protons to accurately target tumors while minimizing damage to surrounding tissue.”

“Just as early breakthroughs in understanding light paved the way for modern lasers and imaging, advances in knowledge of proton structures can shape the next generation of applications in science and medicine.”

“By making the invisible forces within the proton visible for the first time, this study bridges the gap between theory and experiment, much as earlier discoveries about light revealed secrets that changed the modern world.”

A paper describing the team's results was published in the journal Physical Review Letters.

____

J.A. Crawford et al. 2025. Transverse force distributions in the proton from lattice QCD. Phys. Rev. Lett. 134, 071901; doi: 10.1103/PhysRevLett.134.071901

Source: www.sci.news

Physicists at CERN investigate potential Lorentz symmetry violations in top quark pair production

Physicists working at CERN’s Large Hadron Collider have tested whether top quarks obey Albert Einstein’s special theory of relativity.

Installation of CMS beam pipe. Image credit: CERN / CMS collaboration.

Along with quantum mechanics, Albert Einstein’s special relativity forms the basis of the standard model of particle physics.

At its heart lies a concept called Lorentz symmetry: experimental results do not depend on the orientation or speed of the experiment in which they were obtained.

Special relativity has stood the test of time. However, some theories, including certain models of string theory, predict that at very high energies special relativity breaks down and experimental observations depend on the orientation of the experiment in space-time.

Remnants of this Lorentz symmetry breaking could be observable at lower energies, such as those of the Large Hadron Collider (LHC), but none have been found at the LHC or other colliders despite previous efforts.

In a new study, CMS physicists searched for Lorentz symmetry violation at the LHC using pairs of top quarks, the heaviest known elementary particles.

“In this case, a dependence on the orientation of the experiment would mean that the rate at which top quark pairs are produced in LHC collisions varies over time,” they said.

“More precisely, because the Earth rotates about its axis, the average direction of the produced top quarks, relative to the LHC proton beams and the CMS detector, changes over time.”

“As a result, if space-time has a preferred direction, the production rate of top quark pairs would vary with sidereal time.”

“Therefore, finding a deviation from a constant rate would reveal a preferred direction in space-time.”

The team’s new results, based on data from the LHC’s second run, are consistent with a constant production rate. In other words, Lorentz symmetry is not broken, and Einstein’s special relativity remains valid.

The researchers used the results to constrain the size of parameters that are predicted to be zero if the symmetry holds.

The resulting constraints improve by up to a factor of 100 on previous searches for Lorentz symmetry breaking performed at the earlier Tevatron accelerator.

“The results open the way to future searches for Lorentz symmetry breaking based on top quark data from the LHC’s third run,” the scientists said.

“They also open the door to scrutinizing processes involving other heavy particles that can only be studied at the LHC, such as the Higgs boson and the W and Z bosons.”

The study was published in the October 2024 issue of the journal Physics Letters B.

______

CMS Collaboration. 2024. Searches for violation of Lorentz invariance in top quark pair production using dilepton events in 13 TeV proton-proton collisions. Phys. Lett. B 857: 138979; doi: 10.1016/j.physletb.2024.138979

Source: www.sci.news

Physicists find evidence of asymmetry between matter and antimatter in decays of baryons and beauty hadrons

The standard model of particle physics predicts an asymmetry between matter and antimatter known as charge-parity (CP) violation. However, the size of this asymmetry in the Standard Model is not large enough to explain the matter-antimatter imbalance observed in the universe, and so far the asymmetry has only been observed in certain decays of particles called mesons. In two new studies, the LHCb collaboration at CERN’s Large Hadron Collider (LHC) has found evidence of CP violation in baryon decays and in decays of beauty hadrons into charmonium particles, shedding light on these two pieces of the matter-antimatter puzzle.

Exterior view of the LHCb detector. Image credit: CERN.

Experiments involving LHCb have previously searched for baryon CP violation by looking for differences in the way matter and antimatter baryons decay into other particles.

However, these investigations have so far been essentially empty-handed.

One earlier LHCb study provided evidence of CP violation in a specific decay of the bottom lambda baryon, but subsequent studies analyzing larger samples of such decays did not strengthen that evidence.

In the first new study, LHCb physicists scrutinized proton-proton collision data obtained during the first and second runs of the LHC, searching for various decay modes of the bottom lambda baryon, including its decay into a lambda baryon and two kaons.

They then looked for CP violation in each decay mode, essentially by counting the decays of the bottom lambda baryon and of its antimatter partner and taking the difference between the two.

For the decay into a lambda baryon and two kaons, this difference showed evidence of CP violation with a significance of 3.2 standard deviations.

In the second study, the LHCb team focused on the decay of a charged beauty meson into a J/psi particle and a charged pion.

The J/psi is a charmonium particle, a meson consisting of a charm quark and a charm antiquark.

The team performed an analysis similar to the bottom lambda baryon study, also using data from the first and second runs of the LHC, and found evidence of CP violation in this decay mode of the charged meson, again with a significance of 3.2 standard deviations.

This finding represents evidence of CP violation in the decay of beauty hadrons to charmonium particles.

“Our study represents an important step toward establishing whether CP violation is present in these types of decays,” the authors state.

“Data from the LHC’s third run and from planned collider upgrades, including the High-Luminosity LHC, will shed further light on these and other pieces of the matter-antimatter puzzle.”

_____

LHCb Collaboration. 2024. Study of Λ0b and Ξ0b decays to Λh+h′− and evidence of CP violation in Λ0b→ΛK+K− decays. arXiv: 2411.15441

LHCb Collaboration. 2024. First evidence for direct CP violation in beauty to charmonium decays. arXiv: 2411.12178

Source: www.sci.news

Physicists at CERN make groundbreaking discovery: Evidence of antihyperhelium-4 detected for the first time

Physicists from the ALICE Collaboration have seen evidence of antihyperhelium-4 for the first time at CERN’s Large Hadron Collider (LHC). Antihyperhelium-4 consists of two antiprotons, an antineutron, and an antilambda. The new result is also the first evidence of the heaviest antimatter hypernucleus yet seen at the LHC.

Illustration of the production of antihyperhelium-4 in a lead-lead collision. Image credit: AI-assisted J. Ditzel.

Collisions between heavy ions at the LHC create quark-gluon plasma, a hot, dense state of matter that is thought to have filled the universe about a millionth of a second after the Big Bang.

Heavy ion collisions also create conditions suitable for the production of atomic nuclei, exotic hypernuclei, and their antimatter counterparts, antinuclei and antihypernuclei.

Measuring these forms of matter is important for a variety of purposes, including helping to understand the formation of hadrons from quarks and gluons, the building blocks of plasma, and the matter-antimatter asymmetry seen in the modern universe.

Hypernuclei are exotic atomic nuclei formed by a mixture of protons, neutrons, and hyperons, the latter of which are unstable particles containing one or more strange types of quarks.

More than 70 years after their discovery in cosmic rays, hypernuclei continue to be a source of fascination for physicists. This is because hypernuclei are rarely found in nature and are difficult to create and study in the laboratory.

Heavy-ion collisions produce hypernuclei in significant numbers, but until recently only the lightest hypernucleus, the hypertriton (composed of a proton, a neutron, and a lambda), and its antimatter partner, the antihypertriton, had been observed.

Following recent observations of antihyperhydrogen-4, ALICE physicists have detected antihyperhelium-4.

This result has a significance of 3.5 standard deviations and is also the first evidence of the heaviest antimatter hypernucleus ever at the LHC.

The ALICE measurements are based on lead-lead collision data taken in 2018 at an energy of 5.02 teraelectronvolts (TeV) for each colliding pair of nucleons (protons and neutrons).

The researchers examined data for the signals of hyperhydrogen-4, hyperhelium-4, and their antimatter partners using machine learning techniques that go beyond traditional hypernuclear search techniques.

Candidates for (anti)hyperhydrogen-4 were identified by looking for the (anti)helium-4 nucleus and the charged pion into which it decays, while candidates for (anti)hyperhelium-4 were identified via their decay into an (anti)helium-3 nucleus, an (anti)proton, and a charged pion.

In addition to finding evidence for antihyperhelium-4 with a significance of 3.5 standard deviations and evidence for antihyperhydrogen-4 with a significance of 4.5 standard deviations, the ALICE team measured the production yields and masses of both hypernuclei.

“For both hypernuclei, the measured masses are consistent with current global average values,” the scientists said.

“The measured production yields were compared with predictions from a statistical hadronization model that adequately accounts for the formation of hadrons and nuclei in heavy ion collisions.”

“This comparison shows that the model's predictions closely match the data when both the excited hypernuclear state and the ground state are included in the prediction.”

“This result confirms that the statistical hadronization model can also adequately explain the production of hypernuclei, which are compact objects about 2 femtometers in size.”

The authors also determined the antiparticle-to-particle yield ratios for both hypernuclei and found that they agreed within experimental uncertainties.

“This agreement is consistent with ALICE's observation that matter and antimatter are produced in equal amounts at LHC energies, and it further strengthens ongoing research into the matter-antimatter imbalance in the universe,” the scientists concluded.

Source: www.sci.news

Physicists at CERN witness a top quark pair in lead-lead collision

The production of top quark pairs has been observed for the first time in collisions between atomic nuclei, in lead-lead collisions recorded by the ATLAS detector at CERN's Large Hadron Collider (LHC).

A lead-lead collision at 5.02 TeV per nucleon pair producing a candidate top quark pair that decays into other particles. The event contains four particle jets (yellow cones), one electron (green line), and one muon (red line). The inset shows an axial view of the event. Image credit: ATLAS/CERN.

In quark-gluon plasma, quarks (matter particles) and gluons (carriers of the strong force), the basic constituents of protons and neutrons, are not bound inside particles; they exist in a deconfined state of matter that behaves as an almost perfect dense fluid.

Physicists believe that quark-gluon plasma filled the universe shortly after the Big Bang, and their study provides a glimpse into conditions at earlier times in the universe's history.

However, the lifetime of the quark-gluon plasma produced in heavy-ion collisions is extremely short, roughly 10⁻²³ seconds, which means it cannot be observed directly.

Instead, physicists study the particles produced in these collisions that pass through the quark-gluon plasma and use them as probes of the plasma's properties.

In particular, the top quark is a very promising probe of the evolution of quark-gluon plasmas over time.

The top quark, the heaviest elementary particle known, decays into other particles an order of magnitude faster than the time required to form a quark-gluon plasma.

The delay between the collision and the decay products of the top quark interacting with the quark-gluon plasma may serve as a “time marker” and provide a unique opportunity to study the temporal dynamics of the plasma.

In addition, physicists could potentially extract new information about the nuclear parton distribution function, which describes how the momentum of a nucleon (proton or neutron) is distributed among its constituent quarks and gluons.

In the new study, physicists from the ATLAS collaboration studied lead-ion collisions recorded during Run 2 of the LHC at a collision energy of 5.02 teraelectronvolts (TeV) per nucleon pair.

They observed top quark pair production in the dilepton channel, in which each top quark decays into a bottom quark and a W boson, and each W boson then decays into an electron or a muon and its associated neutrino.

The result has a statistical significance of 5.0 standard deviations and is the first observation of top quark pair production in nucleus-nucleus collisions.

“We measured the production rate, or cross section, of the top quark pair with a relative uncertainty of 35%,” the physicists said.

“The overall uncertainty is primarily driven by the size of the dataset, which means new heavy-ion data from the ongoing Run 3 will improve the accuracy of the measurement.”

“The new results open the door to the study of quark-gluon plasmas,” the researchers added.

“Future studies will also consider semi-leptonic decay channels for top quark pairs in heavy-ion collisions, which may provide the first glimpse of the evolution of quark-gluon plasma over time.”

Source: www.sci.news

Physicists conduct measurements on fermium’s nuclear properties

Physicists at the GSI/FAIR accelerator facility have gained insight into the structure of fermium nuclei. Fermium is a synthetic chemical element of the actinide series with atomic number 100. Using laser spectroscopy techniques, they tracked changes in the nuclear charge radius and found that it increases steadily as neutrons are added to the nucleus.

The fermium isotopes studied by Warbinek et al. are highlighted in this graph. Image credit: S. Raeder.

“The heaviest atomic nuclei known to date owe their existence to quantum mechanical nuclear shell effects,” said Dr. Sebastian Raeder of the Helmholtz Institute Mainz and the GSI Helmholtzzentrum für Schwerionenforschung, and colleagues.

“These increase the stability of the nucleus against spontaneous fission, allowing the formation of superheavy nuclei.”

“For a certain number of protons (Z) or neutrons (N), the so-called magic numbers, the nuclear shell exhibits a large energy gap, resulting in increased stability of the nucleus.”

“This is similar to the closed electron shell of noble gases, which provides chemical inertness.”

“The heaviest known atomic nucleus with a magic number for both protons (Z = 82) and neutrons (N = 126) is lead-208, a spherical nucleus.”

“The location of the next spherical shell gap beyond lead-208 is still unknown. Nuclear models most frequently predict it at Z = 114, Z = 120 or Z = 126, and at N = 172 or N = 184.”

“This variation in predictions is primarily due, among other factors, to the high density of single-particle levels in the heaviest nuclei.”

The authors used a laser-based method to investigate fermium nuclei with 100 protons (Z = 100) and 145 to 157 neutrons (N = 145-157).

Specifically, they studied the influence of quantum mechanical shell effects on the size of atomic nuclei.

“This allows us to elucidate the structure of these nuclei in the region around the known shell effect at neutron number 152 from a new perspective,” said Dr. Raeder.

“At this neutron number, signs of neutron shell closure were previously observed in trends in nuclear binding energies.”

“The strength of the shell effect was measured by high-precision mass measurements at GSI/FAIR in 2012.”

“According to Einstein, mass equals energy, so these mass measurements gave us a hint about the additional binding energy that shell effects provide.”

“The nucleus around neutron number 152 is shaped more like a rugby ball than a sphere, making it an ideal guinea pig for deeper research.”

“This deformation allows many protons within the nucleus to be separated further apart than in a spherical nucleus.”

In the measurements, the researchers investigated fermium isotopes with lifetimes ranging from a few seconds to 100 days, using different methods for producing the isotopes together with methodological advances in the applied laser spectroscopy techniques.

Short-lived isotopes are produced at the GSI/FAIR accelerator facility, where in some cases only a few atoms per minute are available for experiments.

The nuclei produced were stopped in argon gas, where they picked up electrons to form neutral atoms, which were then examined using laser light.

The neutron-rich, long-lived fermium isotopes (fermium-255 and fermium-257) were produced in picogram quantities at Oak Ridge National Laboratory in the USA and at the Institut Laue-Langevin in France.

Their results provided insight into the variation of the nuclear charge radius of the fermium isotope over neutron number 152 and showed a stable and uniform increase.

“Our experimental results, together with their interpretation using modern theoretical methods, show that in fermium nuclei nuclear shell effects have only a small influence on the nuclear charge radius, in contrast to their strong influence on the binding energy of these nuclei,” said Dr. Jessica Warbinek, a researcher at CERN.

“This result supports the theoretical prediction that local shell effects due to a small number of neutrons and protons lose influence as the nuclear mass increases.”

“Instead, the effects attributed to the complete assembly of all nucleons dominate, with the nuclei being seen rather as charged liquid droplets.”

The results were published in the journal Nature.

_____

J. Warbinek et al. 2024. Smooth trends in fermium charge radii and the impact of shell effects. Nature 634, 1075-1079; doi: 10.1038/s41586-024-08062-z

Source: www.sci.news

Physicists Find Indications of Superfluidity in Low-Density Neutronic Matter

An accurate description of low-density nuclear matter is critical to explaining the physics of the neutron star crust, according to a team of theoretical physicists led by Dr. Alessandro Lovato of Argonne National Laboratory.

Fore et al. studied the crust of neutron stars by simulating neutron matter with “hidden” neutrons that mediate interactions between “real” neutrons; a neural network then constructs quantum wave functions for the normal and superfluid phases of neutron matter. Image credit: Jane Kim, Ohio University.

The inner crust of a neutron star is characterized by the presence of a neutron superfluid.

A superfluid is a fluid that has no viscosity. In a neutron star, this means that the superfluid allows neutrons to flow without resistance.

To accurately predict the properties of neutronic matter at the lowest energy levels in this low-density form, researchers typically perform theoretical calculations that assume that neutrons combine to form Cooper pairs.

“The low-density nuclear material found in the crust of neutron stars exhibits complex and interesting structures that vary greatly with density,” Lovato and colleagues said.

“In the outer shell, the nucleons are bound to fully ionized nuclei. As the density increases within this region, these nuclei become increasingly neutron-rich, so in ground-based experiments they are present at lower densities. It is only possible to directly determine the main nuclides that

The physicists used artificial neural networks to make accurate predictions without relying on this assumption.

They modified the standard “single particle” approach by introducing “hidden” neutrons that facilitate interactions between “real” neutrons and encode quantum many-body correlations.

This allows Cooper pairs to appear naturally during calculations.
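
To make the “hidden neutron” idea concrete, here is a minimal sketch, assuming a toy feed-forward network and illustrative particle numbers (this is not the collaboration’s code, and it omits essentials such as antisymmetrization): auxiliary hidden coordinates are fed into the network alongside the real neutron coordinates, so correlations such as pairing can emerge from the learned wave function rather than being imposed by hand.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_mlp(sizes, rng):
    """Random weights for a small fully connected network."""
    return [(rng.normal(0, 0.1, (m, n)), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def mlp(params, x):
    """Feed-forward pass with tanh activations; returns a scalar."""
    for W, b in params[:-1]:
        x = np.tanh(x @ W + b)
    W, b = params[-1]
    return (x @ W + b).item()

# Illustrative sizes: 6 "real" neutrons and 2 "hidden" neutrons in 3D.
N_REAL, N_HIDDEN, DIM = 6, 2, 3
params = init_mlp([(N_REAL + N_HIDDEN) * DIM, 32, 32, 1], rng)

def log_psi(real_coords, hidden_coords):
    """Log-amplitude of a toy trial wave function.

    The hidden coordinates are extra network inputs that let the ansatz
    encode many-body correlations (for example, pairing) among the real
    neutrons; in a full calculation they would be optimized and summed
    over together with the network weights.
    """
    x = np.concatenate([real_coords.ravel(), hidden_coords.ravel()])
    return mlp(params, x)

real = rng.normal(size=(N_REAL, DIM))      # positions of real neutrons
hidden = rng.normal(size=(N_HIDDEN, DIM))  # positions of hidden neutrons
print("log|psi| =", log_psi(real, hidden))
```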

“Understanding neutron superfluidity provides important insights into neutron stars,” the researchers said.

“It sheds light on phenomena such as their cooling mechanisms, rotation, and sudden changes in rotational speed.”

“Although we cannot directly access neutron star material experimentally, the fundamental interactions that govern the behavior of this material are the same as those that govern the nuclei of atoms on Earth.”

“Researchers are working to create nuclear interactions that are simple yet predictive.”

“Solving the quantum many-body problem accurately is an important part of assessing the quality of these interactions.”

“Our study uses simple interactions and obtains results in good agreement with previous calculations that assumed more complex interactions.”

Low-density neutron matter is characterized by fascinating emergent quantum phenomena, such as the formation of Cooper pairs and the onset of superfluidity.

“We used a combination of artificial neural networks and advanced optimization techniques to study this dense region,” the scientists said.

“Using a simplified model of the interaction between neutrons, we calculated the energy per particle and compared the results with those obtained from very realistic interactions.”

“This approach is competitive compared to other computational techniques at a fraction of the cost.”

_____

Bryce Fore et al. 2024. Investigating the crust of neutron stars with neural-network quantum states. arXiv: 2407.21207

Bryce Fore et al. 2023. Dilute neutron star matter from neural-network quantum states. Phys. Rev. Research 5(3): 033062; doi: 10.1103/PhysRevResearch.5.033062

Source: www.sci.news

Physicists develop one-dimensional photon gas

In an experiment, physicists from the University of Bonn and the University of Kaiserslautern-Landau observed and studied the properties of a one- to two-dimensional crossover in a gas of harmonically confined photons (light particles). The photons were confined in dye microcavities, while polymer nanostructures provided the trapping potential for the photon gas. By varying the aspect ratio of the trap, the researchers tuned it from an isotropic two-dimensional confinement to a highly elongated one-dimensional trapping potential. The team’s paper was published in the journal Nature Physics.

A polymer applied to the reflective surface confines the photon gas within a parabolic potential. The narrower this parabola, the more one-dimensionally the gas behaves. Image courtesy of University of Bonn.

“To create a gas from photons, you need to concentrate a lot of photons in a limited space and cool them at the same time,” said Dr. Frank Vewinger of the University of Bonn.

In their experiments, Dr. Vewinger and his colleagues filled a small container with a dye solution and used a laser to excite it.

The resulting photons bounced back and forth between the reflective walls of the container.

Each time they collided with a dye molecule they cooled, eventually condensing the photon gas.

By modifying the reflective surface, the researchers could control the dimensionality of the gas.

“We were able to coat the reflective surface with a transparent polymer and create tiny microscopic protrusions,” said Dr Julian Schulz, a physicist at the University of Kaiserslautern-Landau.

“These protrusions allow us to confine and condense photons into one or two dimensions.”

“These polymers act as a kind of channel for the light,” said Dr Kirankumar Kalkihari Umesh, a physicist at the University of Bonn.

“The narrower this gap becomes, the more one-dimensional the gas behaves.”

In two dimensions, there is a precise temperature limit at which condensation occurs, just as water freezes at exactly 0 degrees Celsius – physicists call this a phase transition.
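
For a sense of what that sharp two-dimensional threshold looks like, the textbook result for an ideal photon gas in a two-dimensional harmonic trap of frequency Ω (a standard estimate, not a number from this paper, and assuming the usual factor of two for the light’s polarization states) is a critical photon number

\[
N_c \;=\; \frac{\pi^{2}}{3}\left(\frac{k_B T}{\hbar\,\Omega}\right)^{2},
\]

above which condensation occurs at temperature T; it is this sharply defined threshold that becomes smeared out as the trap is squeezed toward one dimension.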

“But if you create a one-dimensional gas instead of a two-dimensional one, things are a bit different,” Dr. Vewinger said.

“So-called thermal fluctuations do occur in the photon gas, but in two dimensions they are so small that they have no practical effect.”

“But in one dimension, these fluctuations can, figuratively speaking, make waves.”

These fluctuations destroy the order in a one-dimensional system, causing different regions in the gas to no longer behave in the same way.

As a result, phase transitions that are still precisely defined in two dimensions become increasingly blurred as the system becomes one-dimensional.

However, their properties are still governed by quantum physics, just like for two-dimensional gases, and these types of gases are called degenerate quantum gases.

It is as if water did not freeze completely at one precise temperature, but instead turned to ice gradually over a range of low temperatures.

“We were able to investigate this behavior for the first time in the transition from a two-dimensional to a one-dimensional photon gas,” Dr. Vewinger said.

The authors were able to demonstrate that a one-dimensional photon gas indeed does not have a precise condensation point.

By making small changes to the polymer structure, it becomes possible to study in detail what happens during the transition between different dimensions.

Although this is still considered fundamental research at this point, it has the potential to open up new applications of quantum optical effects.

_____

K. Kalkihari Umesh et al. Dimensional crossover in a quantum gas of light. Nature Physics, published online September 6, 2024; doi: 10.1038/s41567-024-02641-7

Source: www.sci.news

Physicists Witness the First Observation of Antihyperhydrogen 4

Physicists from the STAR Collaboration have observed an antimatter hypernucleus, antihyperhydrogen-4, consisting of an antilambda hyperon, an antiproton, and two antineutrons, in nuclear collisions at the Relativistic Heavy Ion Collider (RHIC) at the U.S. Department of Energy's Brookhaven National Laboratory.

Artistic representation of antihyperhydrogen-4 produced in the collision of two gold nuclei. Image courtesy of the Institute of Modern Physics.

“What we know in physics about matter and antimatter is that, apart from the opposite charge, antimatter has the same properties as matter – the same mass, the same lifetime before decaying, and the same interactions,” said Junlin Wu, a graduate student at Lanzhou University and the Institute of Modern Physics of the Chinese Academy of Sciences.

“But in reality, our universe is made up of matter rather than antimatter, even though equal amounts of matter and antimatter are thought to have been created during the Big Bang about 14 billion years ago.”

“Why our universe is populated with matter remains a question, and we don't yet have a complete answer.”

“The first step in studying the asymmetry between matter and antimatter is to discover new antimatter particles. This is the basic idea of this research,” added Dr. Hao Qiu, a researcher at the Institute of Modern Physics.

STAR physicists had previously observed atomic nuclei made of antimatter produced in RHIC collisions.

In 2010, they detected an antihypertriton, the first example of an antimatter nucleus containing a hyperon, a particle that contains at least one strange quark rather than just the light up and down quarks that make up ordinary protons and neutrons.

Just a year later, STAR physicists broke that antimatter mass record by detecting antihelium-4, the antimatter equivalent of a helium nucleus.

Recent analyses suggested that producing and detecting antihyperhydrogen-4 might also be feasible.

But detecting this unstable antihypernucleus is a rare event: all four components (one antiproton, two antineutrons and one antilambda) need to be ejected from the quark-gluon soup produced in the RHIC collision in just the right place, in the same direction and at just the right time, briefly becoming bound together.

“It’s just a coincidence that these four component particles appear close enough together in the RHIC collisions that they can combine to form an antihypernucleus,” said Brookhaven National Laboratory physicist Lijuan Ruan, one of the STAR Collaboration’s co-spokespersons.

To find antihyperhydrogen-4, STAR physicists studied the trajectories of particles produced when this unstable antihypernucleus decays.

One of these decay products is the previously detected antihelium-4 nucleus, and the other is a simple positively charged particle called a pion (pi+).

“Antihelium-4 had already been discovered with STAR, so we used the same methods as before to pick up those events and reconstruct them with the π+ track to find these particles,” Wu said.

RHIC's collisions produce huge amounts of pions, and physicists have been sifting through billions of collision events to find the rare antihypernuclei.

The antihelium-4 produced by the collision can pair up with hundreds or even a thousand pi+ particles.

“The key was to find an intersection point of the trajectories of the two particles with a particular characteristic – a decay vertex,” Dr. Ruan said.

“That is, the decay vertex must be far enough away from the collision point that the two particles could have originated from the decay of an antihypernucleus formed from particles originally produced in the fireball.”

The STAR researchers worked hard to eliminate the background from all other potential decay-pair partners.

Ultimately, their analysis found 22 candidate events with an estimated background count of 6.4.

“That means that about six of what appear to be antihyperhydrogen-4 decays could just be random noise,” said Emily Duckworth, a doctoral student at Kent State University.

Subtracting that background count from the 22, physicists can be confident that they have detected about 16 actual antihyperhydrogen-4 nuclei.
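
The arithmetic behind that statement is plain counting; the sketch below simply subtracts the quoted background from the observed candidates and attaches a rough Poisson uncertainty (an illustration only, not the collaboration’s statistical analysis).

```python
import math

n_observed = 22      # antihyperhydrogen-4 candidate events quoted above
n_background = 6.4   # estimated background count quoted above

n_signal = n_observed - n_background   # ~15.6 nuclei attributed to real decays
stat_err = math.sqrt(n_observed)       # rough Poisson uncertainty on the raw count

print(f"estimated signal: {n_signal:.1f} +/- {stat_err:.1f} nuclei")

# A deliberately naive significance estimate (the real analysis is more involved):
print(f"naive significance: {n_signal / math.sqrt(n_background):.1f} sigma")
```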

The results were significant enough to allow scientists to make a direct comparison between matter and antimatter.

They compared the lifetime of antihyperhydrogen-4 to that of hyperhydrogen-4, which is made from the normal-matter counterparts of the same building blocks.

They also compared the lifetimes of another matter-antimatter pair, antihypertritons and hypertritons.

Neither difference was significant, but the authors were not surprised.

“This experiment tested a particularly strong form of symmetry,” the researchers said.

“Physicists generally agree that this symmetry breaking is extremely rare and is not an answer to the imbalance of matter and antimatter in the universe.”

“If we saw this particular breaking of symmetry, we would basically have to throw a lot of what we know about physics out the window,” Duckworth said.

“So in a way it was reassuring that symmetry still worked in this case.”

“We agree that this result provides further confirmation that our model is correct and marks a major step forward in the experimental study of antimatter.”

The team’s work was published in the journal Nature.

_____

STAR Collaboration. Observation of the antimatter hypernucleus antihyperhydrogen-4. Nature, published online August 21, 2024; doi: 10.1038/s41586-024-07823-0

This article is based on an original release from Brookhaven National Laboratory.

Source: www.sci.news

Physicists may have discovered a method to create element 120, which would be the heaviest element to date.

Jacqueline Gates of Lawrence Berkeley National Laboratory isolating livermorium atoms.

Marilyn Sargent/Berkeley Lab 2024 Regents of the University of California

The third-heaviest known element has been created in a way that points toward synthesizing the elusive element 120, which would be the heaviest element in the periodic table.

“We were very shocked, very surprised and very relieved that we had not made the wrong choice in setting up the equipment,” said Jacqueline Gates of the Lawrence Berkeley National Laboratory (LBNL) in California.

She and her colleagues created the element, livermorium, by bombarding pieces of plutonium with beams of charged titanium atoms. Titanium had never been used in such experiments before, because it is hard to turn into a well-controlled beam and it takes an enormous number of collisions to create just a few new atoms. But physicists think that a titanium beam is essential to making the hypothetical element 120, also known as unbinilium, which has 120 protons in its nucleus.

The researchers first evaporated a rare isotope of titanium in a special oven at 1,650°C (about 3,000°F). They then used microwaves to turn the hot titanium vapor into a charged beam, which they sent into a particle accelerator. When the beam, travelling at about 10% of the speed of light, smashed into a plutonium target, the reaction products flew into a detector, which recorded the signatures of two livermorium atoms.

As expected, each atom rapidly decayed into other elements; the stability of atomic nuclei generally decreases as their mass increases. But the measurements were so precise that there is only about a one-in-a-trillion chance that the detection was a statistical fluke, Gates says. The researchers announced their findings on July 23 at the Nuclear Structure 2024 meeting at Argonne National Laboratory in Illinois.
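
For readers used to “sigma” language, a one-in-a-trillion fluke probability can be translated into an equivalent Gaussian significance; the two-line check below is purely illustrative and not part of the announcement.

```python
from scipy.stats import norm

p_value = 1e-12                           # "about one in a trillion", as quoted above
print(f"~{norm.isf(p_value):.1f} sigma")  # roughly 7 sigma (one-sided)
```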

Michael Thoennessen, a researcher at Michigan State University, says the experiment supports the feasibility of creating element 120. “We have to do the basic research and we have to go in the dark, so this is a really important and necessary experiment in that sense,” he says.

Thoennessen says the creation of unbinilium would have profound implications for our understanding of the strong force, which determines whether heavy elements are stable. Studying unbinilium may also help us understand how exotic elements formed in the early universe.

The heaviest artificial element to date, element 118 (also known as oganesson), has two more protons than livermorium and was first synthesized in 2002. Since then, researchers have struggled to make even heavier atoms, because doing so requires colliding already-heavy elements, which themselves tend to be unstable. “It’s really, really difficult work,” Thoennessen says.

But the new experiment has LBNL researchers feeling optimistic: They plan to launch experiments aimed at creating element 120 in 2025 after replacing the plutonium target with the heavier element californium.

“I think we’re pretty close to knowing what to do,” Gates says, “and having an opportunity to add new elements to the periodic table [is exciting]. Very few people get that opportunity.”

Source: www.newscientist.com

Physicists at CERN study the characteristics of enigmatic particles

Physicists have been intrigued by χc1(3872), also known as X(3872), since its discovery two decades ago. They have been exploring whether it is a conventional charmonium state composed of two quarks or an exotic particle made up of four quarks. The LHCb collaboration at CERN’s Large Hadron Collider (LHC) set out to find the answer.

Artist's impression of a tetraquark, made up of two charm quarks and an up and down antiquark. Image courtesy of CERN.

In the quark model of particle physics, there are baryons (composed of three quarks), mesons (consisting of quark-antiquark pairs), and exotic particles (comprising an unusual number of quarks).

To determine the composition of χc1(3872), physicists must measure properties like mass and quantum numbers.

According to theory, χc1(3872) could be a standard charmonium state made of a charm quark and an anticharm quark, or it could be an exotic particle consisting of four quarks.

These exotic particles could be tightly bound tetraquarks, molecular states, cc-gluon hybrid states, vector glueballs, or a combination of various possibilities.

Recent measurements by LHCb physicists revealed that its quantum number is 1++, and in 2020 they obtained precise data on the particle’s width (lifetime) and mass.

They also examined low-energy scattering parameters.

Their findings indicated that the mass of χc1(3872) is slightly less than the combined masses of the D0 and D*0 mesons.

These results have sparked debate within the theoretical community, with some proposing that χc1(3872) is a molecular state made up of spatially separated D0 and D*0 mesons.

However, this hypothesis faces challenges, as physicists anticipate molecular matter to be suppressed in hadron-hadron collisions, yet significant amounts of χc1(3872) are produced.

Other theorists suggest that the particle contains “compact” components, indicating a smaller size and potentially consisting of tightly bound charmonium or tetraquarks.

One method to uncover the composition of χc1(3872) is to calculate the branching ratio, which involves the probabilities of decay into different lighter particles.

By measuring the ratio of decays into a photon plus an excited charmonium state, the ψ(2S), relative to decays into a photon plus a J/ψ, physicists can gain insights into the nature of the particle.

A key theoretical indicator is a non-zero ratio, suggesting the presence of compact components and countering a purely molecular model.

Using data from LHC Run 1 and Run 2, LHCb scientists found the ratio to be non-zero with a significance beyond six standard deviations, ruling out a pure D0D*0 molecular hypothesis for χc1(3872).

Instead, the results support various predictions based on alternative hypotheses for the structure of χc1(3872), such as conventional (compact) charmonium, a compact tetraquark, or a molecule with a substantial compact core component.

Thus, the findings provide compelling evidence in favor of a χc1(3872) structure including a compact component.

_____

R. Aaij et al. (LHCb Collaboration). 2024. Probing the properties of the χc1(3872) state using radiative decay. arXiv: 2406.17006

This article is based on the original release from CERN.

Source: www.sci.news

CERN physicists witness exceptionally rare hyperon decay

A hyperon is a particle that, like a proton or a neutron, contains three quarks, but with one or more of them being strange quarks. Physicists from the LHCb collaboration at the Large Hadron Collider (LHC) at CERN say they have observed the rare hyperon decay Σ+ → pμ+μ− in proton-proton collisions.

A view of the LHCb detector. Image courtesy of CERN.

“Rare decays of known particles are a promising tool for exploring physics beyond the Standard Model of particle physics,” the LHCb physicists said.

“In the Standard Model, the Σ+ → pμ+μ- process is only possible through a loop diagram, meaning that the decay does not occur directly, but intermediate states have to be exchanged within the loop.”

“In quantum field theory, the probability of such a process occurring is obtained by summing over all particles, both known and unknown, that could possibly be exchanged in this loop.”

“This is what makes such processes sensitive to new phenomena.”

“If a discrepancy is observed between experimental measurements and theoretical calculations, it may be caused by the contribution of some unknown particle.”

“These particles can either be exchanged within the loop or directly mediate this decay, interacting with the quarks and decaying into pairs of muons.”

“In the latter case, the new particle would leave a signature on the properties of the two muons.”

The study of the Σ+ → pμ+μ- decay has been particularly exciting thanks to hints of structure observed in the properties of muon pairs by the HyperCP collaboration in 2005.

With only three occurrences the structure was far from conclusive, and it was hoped that new research would shed light on the situation.

Finally, the LHCb data did not show any significant peak structure in the two-muon mass region highlighted by HyperCP, thus refuting the hint.

However, the new study observes the decay with high significance, and the precise measurements of the decay probability and other parameters that follow will allow further comparison with the Standard Model predictions.

“In data collected during Run 2 of pp collisions at the LHCb experiment, the Σ+ → pμ+μ− decay is observed with very high significance, with a yield of NΣ+→pμ+μ− = 279 ± 19,” the authors wrote in their paper.

“We see no structure in the two-muon invariant-mass distribution, which is consistent with the Standard Model predictions.”

“The collected signal yield allows for measurements of integrated and differential branching fractions, as well as other measurements such as CP violation and the forward-backward asymmetry.”

_____

LHCb Collaboration. 2024. Observation of rare Σ+→pμ+μ− decays at LHCb. CERN-LHCb-CONF-2024-002

Source: www.sci.news

Quantum entanglement used by physicists to measure Earth’s rotation

Physicists at the University of Vienna have used a maximally entangled quantum state of light paths in a large interferometer to experimentally measure the speed of the Earth’s rotation.

Silvestri et al. demonstrated the largest and most precise quantum-optical Sagnac interferometer to date, sensitive enough to measure the Earth’s rotation rate. Image courtesy of Marco Di Vita.

For over a century, interferometers have been key instruments for experimentally testing fundamental physical questions.

They disproved the ether as a light-transmitting medium, helped establish the theory of special relativity, and made it possible to measure tiny ripples in space-time itself known as gravitational waves.

Recent technological advances allow interferometers to work with a variety of quantum systems, including electrons, neutrons, atoms, superfluids, and Bose-Einstein condensates.

“When two or more particles are entangled, only the overall state is known; the states of the individual particles remain uncertain until they are measured,” said Dr. Philip Walther of the University of Vienna and his colleagues.

“Using this allows us to get more information per measurement than we would without it.”

“But the extremely delicate nature of quantum entanglement has prevented the expected leap in sensitivity.”

For their study, the authors built a large fiber-optic Sagnac interferometer that was stable with low noise for several hours.

This allowed the detection of entangled photon pairs of sufficiently high quality to exceed the rotational precision of previous quantum-optical Sagnac interferometers by a factor of 1,000.

“In a Sagnac interferometer, two particles moving in opposite directions on a rotating closed path reach a starting point at different times,” the researchers explained.

“When you have two entangled particles, you get a spooky situation: they behave like a single particle testing both directions simultaneously, accumulating twice the time delay compared to a scenario where no entanglement exists.”

“This unique property is known as super-resolution.”

In the experiment, two entangled photons propagated through a 2-km-long optical fiber wound around a giant coil, creating an interferometer with an effective area of more than 700 m².
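
To get a feel for the size of the signal, a back-of-the-envelope estimate using the standard Sagnac phase formula is sketched below; the wavelength, latitude, and coil orientation are assumptions chosen for illustration, and this is not the authors’ analysis.

```python
import math

# Illustrative inputs (assumptions, not the paper's exact values):
A = 700.0                 # effective enclosed area, m^2 (quoted as ">700 m^2" above)
omega_earth = 7.292e-5    # Earth's rotation rate, rad/s
lat = math.radians(48.2)  # latitude of Vienna; coil normal assumed vertical
lam = 1550e-9             # photon wavelength, m (telecom band, assumed)
c = 2.998e8               # speed of light, m/s

# Standard Sagnac phase for light traversing a rotating loop:
phase_single = 8 * math.pi * A * omega_earth * math.sin(lat) / (lam * c)

# A maximally entangled photon pair accumulates twice the phase
# (the "super-resolution" described above):
phase_entangled = 2 * phase_single

print(f"single-photon Sagnac phase : {phase_single * 1e3:.2f} mrad")
print(f"entangled-pair phase       : {phase_entangled * 1e3:.2f} mrad")
```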

The biggest hurdle the team faced was isolating and extracting the Earth’s stable rotation signal.

“The crux of the problem lies in establishing a measurement reference point where light is not affected by the Earth’s rotation,” said Dr Raffaele Silvestri, lead author of the study.

“Since we can’t stop the Earth’s rotation, we devised a workaround: split the optical fiber into two equal-length coils and connect them through an optical switch.”

“By switching it on and off, we were able to effectively cancel the rotation signal, which also increased the stability of larger equipment.”

“We’re basically tricking light into thinking it’s in a non-rotating universe.”

The research team succeeded in observing the effect of the Earth’s rotation on a maximally entangled two-photon state.

This confirms the interplay between rotating reference systems and quantum entanglement, as described in Einstein’s special theory of relativity and quantum mechanics, and represents a thousand-fold improvement in precision compared to previous experiments.

“A century after the first observations of the Earth’s rotation using light, this is an important milestone in that the entanglement of individual quanta of light is finally in the same region of sensitivity,” said co-first author Dr Haokun Yu.

“We believe that our findings and methods lay the foundation for further improving the rotational sensitivity of entanglement-based sensors.”

“This could pave the way for future experiments to test the behaviour of quantum entanglement through curves in space-time,” Dr Walther said.

The team’s work was published in the journal Science Advances.

_____

Raffaele Silvestri et al. 2024. Experimental observation of Earth’s rotation with quantum entanglement. Science Advances 10(24); doi: 10.1126/sciadv.ado0215

Source: www.sci.news

Physicists Investigate True Tauonium: The Heaviest and Smallest QED Atom

Quantum electrodynamic (QED) atoms are composed of structureless, point-like lepton pairs held together by the electromagnetic force.



An artist’s impression of true tauonium. Image credit: Fu et al., doi: 10.1016/j.scib.2024.04.003.

“Like hydrogen, which is formed from a proton and an electron, a QED atom is formed from a lepton pair through the electromagnetic interaction,” said Peking University physicist Jinghan Fu and colleagues.

“Their properties have been studied for things like testing QED theory, fundamental symmetries, gravity, and exploring physics beyond the Standard Model.”

“The first QED atom, discovered in 1951, was the bound state of an electron and a positron, and was named positronium.”

“The second, discovered in 1960, was the bound state of an antimuon and an electron, and was named muonium.”

“No other QED atoms have been discovered in the past 64 years.”

“New collider experiments have been proposed to discover true muonium, which decays into final states with electrons and photons,” they said.

“The heaviest and smallest QED atom is known as tauonium, ditauonium, or true tauonium.”

In a new paper in the journal Science Bulletin, the physicists introduce a new method to identify true tauonium.

“Tauonium, which consists of a tau lepton and its antiparticle, has a Bohr radius of only 30.4 femtometers, about 1/1,741 of the Bohr radius of a hydrogen atom,” the researchers said.

“This means that tauonium can test the fundamental principles of quantum mechanics and QED on a smaller scale, providing a powerful tool for exploring the mysteries of the microscopic world of matter.”
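
That size estimate can be checked with the ordinary hydrogen-like Bohr formula, since the Bohr radius scales inversely with the reduced mass of the bound pair; the masses below are standard particle-physics values, not numbers taken from the paper:

\[
a \;=\; \frac{\hbar}{\mu\,c\,\alpha},
\qquad
\frac{a_{\tau\tau}}{a_{\mathrm{H}}} \;\approx\; \frac{m_e}{m_\tau/2}
\;\approx\; \frac{0.511\ \mathrm{MeV}}{888.4\ \mathrm{MeV}}
\;\approx\; \frac{1}{1740},
\]

so a_ττ ≈ 52,918 fm / 1,740 ≈ 30.4 fm, consistent with the 30.4 fm and the roughly 1/1,741 ratio quoted above (the residual difference reflects rounding and the hydrogen reduced mass).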

“We have demonstrated that by collecting 1.5 ab−1 of data near the tau-pair production threshold at an electron-positron collider, and selecting signal events containing charged particles accompanied by undetected neutrinos carrying away energy, tauonium can be observed with a significance exceeding 5σ.”

“This provides strong experimental evidence for the presence of tauonium.”

“We also found that, using the same data, the accuracy of the tau lepton mass measurement can be improved to an unprecedented level of 1 keV, two orders of magnitude better than the best accuracy achieved in current experiments.”

“This result not only contributes to the accurate verification of the electroweak theory in the Standard Model, but also has profound implications for fundamental physics questions such as the universality of leptonic flavors.”

_____

Jinghan Fu et al. A new method for identifying the heaviest QED atom. Science Bulletin, published online April 4, 2024; doi: 10.1016/j.scib.2024.04.003

Source: www.sci.news

Physicists at CERN release data on the discovery of the Higgs particle

Physicists from the CMS Collaboration at CERN have just published the combination of CMS measurements that helped establish the discovery of the Higgs boson in 2012.

CMS event display showing a Higgs boson candidate decaying into two photons. It is one of two decay channels that were key to the particle’s discovery. Image credit: CERN.

“Physical measurements based on data from CERN’s Large Hadron Collider (LHC) are typically reported as central values and corresponding uncertainties,” the CMS physicists said.

“For example, shortly after observing the Higgs boson in the LHC’s proton-proton collision data, CMS determined its mass to be 125.3 plus or minus 0.6 GeV (the mass of a proton is about 1 GeV).”

“But this figure is just a quick summary of the measurements, and is like the title of a book.”

In a measurement, the complete information extracted from the data is encoded in a mathematical function known as a likelihood function. This function describes the measured quantities and their dependence on external factors.

“For CMS measurements, these factors include the calibration of the CMS detector, the accuracy of the CMS detector simulation used to facilitate the measurements, and other systematic effects,” the researchers said.

“To fully understand the messy collisions that occur at the LHC, many such aspects need to be determined, so the likelihood function for measurements based on LHC data can be complex.”

“For example, the likelihood function for the combined CMS Higgs boson discovery measurement that CMS just released in electronic form has nearly 700 parameters for a fixed value of the Higgs boson mass.”

“Only one of these, the number of Higgs bosons found in the data, is an important physical parameter, and the rest model systematic uncertainties.”

“Each of these parameters corresponds to a dimension of a multidimensional abstract space in which the likelihood function can be drawn.”

“It is difficult for humans to visualize spaces that contain multiple dimensions, much less spaces that contain many dimensions.”

The new release of the CMS Higgs boson discovery measurement likelihood function, the first publicly available likelihood function from this collaboration, allows researchers to avoid this problem.

Using a publicly accessible likelihood function, physicists outside the CMS Collaboration can now accurately incorporate CMS Higgs boson discovery measurements into their studies.
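
As a toy illustration of what such a likelihood function contains (this is not CMS’s actual model, and all numbers and names below are invented for the sketch), the code builds a single-bin Poisson counting likelihood with one physics parameter, the signal strength mu, plus one nuisance parameter for the background normalization, and then profiles the nuisance away; the published CMS function does the same kind of thing with nearly 700 nuisance parameters.

```python
import math
import numpy as np
from scipy.optimize import minimize_scalar

# Toy single-bin counting experiment (all numbers are made up):
n_obs = 25          # observed events
s_expected = 10.0   # expected signal yield for mu = 1
b_nominal = 15.0    # nominal background estimate
b_sigma = 0.10      # 10% log-normal uncertainty on the background (nuisance)

def nll(mu, theta):
    """Negative log-likelihood: Poisson term plus Gaussian constraint on theta."""
    b = b_nominal * math.exp(b_sigma * theta)   # log-normal background response
    lam = mu * s_expected + b
    return -(n_obs * math.log(lam) - lam) + 0.5 * theta**2

def profiled_nll(mu):
    """Minimize over the nuisance parameter theta for a given signal strength mu."""
    res = minimize_scalar(lambda t: nll(mu, t), bounds=(-5, 5), method="bounded")
    return res.fun

mus = np.linspace(0.0, 2.5, 101)
curve = [profiled_nll(m) for m in mus]
print(f"best-fit signal strength mu ~ {mus[int(np.argmin(curve))]:.2f}")
```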

“The release of this likelihood function, together with the Combine software used to build the likelihood and fit the data, marks another milestone in CMS’s 10-year commitment to fully open science,” the researchers said.

“This joins hundreds of open access publications, the release of nearly 5 petabytes of CMS data on the CERN Open Data Portal, and the publication of the entire software framework on GitHub.”

Source: www.sci.news

Physicists at CERN successfully measure a key parameter of the Standard Model

Physicists from the CMS Collaboration at CERN’s Large Hadron Collider (LHC) have successfully measured the effective leptonic electroweak mixing angle. The result, presented at the annual Rencontres de Moriond conference, is the most accurate measurement of this quantity ever made at a hadron collider and is in good agreement with predictions from the Standard Model of particle physics.

Installation of CMS beam pipe. Image credit: CERN/CMS Collaboration.

The Standard Model is the most accurate description of particles and their interactions to date.

Precise measurements of parameters, combined with precise theoretical calculations, provide incredible predictive power that allows us to identify phenomena even before we directly observe them.

In this way, the model has succeeded in constraining the masses of the W and Z particles, the top quark, and recently the Higgs boson.

Once these particles are discovered, these predictions serve as a consistency check on the model, allowing physicists to explore the limits of the theory’s validity.

At the same time, precise measurements of the properties of these particles provide a powerful tool for exploring new phenomena beyond the standard model, so-called “new physics.” This is because new phenomena appear as mismatches between different measured and calculated quantities.

The electroweak mixing angle is a key element of these consistency checks. This is a fundamental parameter of the Standard Model and determines how unified electroweak interactions give rise to electromagnetic and weak interactions through a process known as electroweak symmetry breaking.

It also mathematically connects the masses of the W and Z bosons, which transmit the weak interaction.

Therefore, measurements of the W mass, the Z mass, or the mixing angle provide a good experimental cross-check of the model.

The two most accurate measurements of the weak mixing angle were made by experiments at CERN’s LEP collider and by the SLD experiment at the Stanford Linear Accelerator Center (SLAC).

These values have puzzled physicists for more than a decade because they do not agree with each other.

The new result is in good agreement with the Standard Model prediction and is a step towards resolving the long-standing discrepancy between the LEP and SLD measurements.

“This result shows that precision physics can be performed at a hadron collider,” said Dr. Patricia McBride, spokesperson for the CMS Collaboration.

“The analysis had to deal with the challenging environment of LHC Run 2, with an average of 35 simultaneous proton-proton collisions.”

“This paves the way for even more precise physics at the High-Luminosity LHC, where more than five times as many proton-proton collisions will occur simultaneously.”

Precise testing of Standard Model parameters is a legacy of electron-positron colliders such as CERN’s LEP, which operated until 2000 in the tunnel that now houses the LHC.

Electron-positron collisions provide a clean environment ideal for such high-precision measurements.

Proton-proton collisions at the LHC are more challenging for this type of research, even though the ATLAS, CMS, and LHCb experiments have already yielded numerous new ultra-high-precision measurements.

This challenge stems primarily from the vast background of physical processes other than the one under study, and from the fact that protons, unlike electrons, are not elementary particles but composite objects.

Reaching an accuracy comparable to that of the electron-positron colliders had seemed impossible, but the new result achieves it.

The measurement presented by CMS physicists uses a sample of proton-proton collisions collected from 2016 to 2018 at a center-of-mass energy of 13 TeV, with a total integrated luminosity of 137 fb−1, or about 11 billion collisions.

“The mixing angle is obtained through analysis of the angular distribution in collisions in which pairs of electrons or muons are produced,” the researchers said.

“This is the most accurate measurement ever made at a hadron collider and improves on previous measurements by ATLAS, CMS, and LHCb.”

Source: www.sci.news