China’s renewable energy boom at risk of disruption from extreme weather

The Three Gorges Dam in China is one of the country’s main sources of hydroelectric power

Costfoto/NurPhoto/Shutterstock

China’s vast electric grids are adding more renewable energy than those of any other country, but the system is also becoming vulnerable to electricity shortages caused by adverse weather. The need to ensure a reliable power supply could encourage China’s government to lean more heavily on coal-fired power plants.

China’s energy system is rapidly becoming cleaner, setting new records for wind and solar power generation almost every month. The country’s overall greenhouse gas emissions – the highest in the world – are expected to peak soon and begin to decline. Wind, solar and hydroelectric power currently account for about half of China’s generation capacity, a share expected to rise to almost 90% by 2060, when the country has promised to reach “carbon neutrality”.

This increasing reliance on renewables means the country’s electricity system is becoming ever more vulnerable to changes in the weather. Intermittent wind and solar generation can be backed up by more stable hydropower from the huge dams concentrated in southern China. But what happens when a slump in wind and sun coincides with a drought?

Jinjiang Shen at the Dalian University of Technology in China and his colleagues modelled how power generation on increasingly renewable grids responds in such “extreme weather” years. They estimated how future mixes of wind, solar and hydropower would behave under the least favourable weather conditions seen in the past.

They found that future grids are much more sensitive to weather variations than today’s grid. In a very unfavourable weather year, the 2060 grid could generate 12% less power than today’s grid would, leading to shortages. For the 2030 grid, in the most extreme cases, they found this could mean more than 400 hours of blackouts, a power shortage equal to nearly 4% of total energy demand. “That’s not a number that anyone can ignore,” says Li Shuo at the Asia Society Policy Institute in Washington DC.

In addition to the overall shortfall in power, drought could specifically limit the amount of hydroelectricity available to smooth out irregular wind and solar generation, which could also lead to electricity shortages. “It is essential to equip a suitable proportion of stable power sources that are less susceptible to weather factors to avoid large-scale power shortages,” the researchers wrote in their paper.

One way to help is to move surplus electricity more efficiently across provinces. The researchers found that expanding transmission infrastructure could eliminate the risk of power shortages on today’s grid and halve the risk for the 2060 grid. Adding tens of millions of kilowatts of new energy storage, whether batteries or other methods, would also buffer against hydropower droughts.

According to Li Shuo, the additional storage China needs to add to achieve carbon neutrality “becomes an astronomical number”.

Such changes are difficult, but that much storage is viable given the enormous volume of batteries already produced in China, says Lauri Myllyvirta at the Centre for Research on Energy and Clean Air in Finland. He says the country is also building 190 gigawatts of pumped hydropower storage, which can provide long-duration energy storage by using surplus electricity to pump water up behind a dam and releasing it when more electricity is needed.

But so far, electricity shortages have primarily spurred the Chinese government to build more coal-fired power plants. In 2021 and 2022, for example, hydropower droughts and heatwaves pushed up electricity demand enough to cause serious power outages, prompting a continued expansion of coal. A record fall in hydropower generation in 2023 contributed to record-high emissions.

Chinese president Xi Jinping has said coal use will peak this year, but coal retains entrenched political support as a power source. “If China goes through another round of these episodes, more coal-fired power plants shouldn’t be the answer,” says Li Shuo. “It’s difficult to get rid of coal. China loves coal.”


Source: www.newscientist.com

Reconsidering Dark Energy: A Potential Universe-Altering Discovery

Star trails over the Mayall Telescope in Arizona, which houses the Dark Energy Spectroscopic Instrument

Luke Tyas/Berkeley Lab

Dark energy is one of the most mysterious features of our universe. We don’t know what it is, but it controls how the universe is expanding and determines its ultimate fate. Now, a study of millions of celestial objects suggests we may have been thinking about it all wrong – a possibility with potentially dramatic consequences for the universe.

“This is the biggest hint we have about the nature of dark energy in the roughly 25 years since we discovered it,” says Adam Riess at Johns Hopkins University in Maryland.

The results come from three years of data collected by the Dark Energy Spectroscopic Instrument (DESI) in Arizona. By combining this data with other measurements, such as the cosmic microwave background and maps of supernovae, the DESI team concluded that dark energy may have changed over time.

“This is the cutting edge of human knowledge,” says DESI team member Will Percival at the University of Waterloo, Canada. “We are seeing amazing things across the universe.”

DESI is mounted on a telescope and works by measuring the “redshift” of light emitted by distant galaxies – how much the light’s wavelength is stretched as it travels through expanding space. From this, researchers can determine how much the universe expanded during the light’s journey and calculate how that expansion is changing. So far, the team has analysed light from nearly 15 million galaxies and other bright objects in the sky.
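The redshift-to-expansion logic described above can be sketched in a few lines; the wavelengths below are illustrative numbers, not DESI measurements.

```python
# A minimal sketch (not DESI's actual pipeline) of the idea above: a measured
# redshift gives the factor by which the universe expanded while the light
# was in transit.

def redshift(lambda_observed_nm, lambda_emitted_nm):
    """Redshift z = (observed wavelength - emitted wavelength) / emitted wavelength."""
    return (lambda_observed_nm - lambda_emitted_nm) / lambda_emitted_nm

def expansion_factor(z):
    """The universe's scale factor grew by a factor (1 + z) during the light's journey."""
    return 1.0 + z

# Illustrative numbers: hydrogen-alpha light emitted at 656.3 nm, observed at 984.5 nm.
z = redshift(984.5, 656.3)   # z is roughly 0.5
print(expansion_factor(z))   # universe expanded by a factor of roughly 1.5 in transit
```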

For decades, physicists had agreed that the universe is expanding at an accelerating rate set by a cosmological constant, known as lambda and interpreted as the signature of dark energy. However, in April 2024, DESI’s measurements provided the first hint that the strength of dark energy may actually be decreasing over time – that the cosmological constant is not so constant after all.

Riess, who is not part of the DESI team, says that at the time it was not clear whether the finding would hold up with more data. In fact, it has only grown stronger. “It’s very exciting for me to see that [the team], after another year and after adding more data, found no problems in the analysis. If anything, the result is more significant,” he says.

That said, the finding still does not meet the “five sigma” statistical level traditionally used by physicists to declare a discovery genuine rather than a statistical fluke. The current analysis reaches at most 4.2 sigma, but team member Mustafa Ishak-Boushaki at the University of Texas at Dallas says the team believes the result will reach five sigma within two years as DESI continues to acquire data. “This outcome with dark energy is something we never thought would happen in our lifetime,” he says.
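For readers unfamiliar with sigma levels, these significance figures map to tail probabilities of a normal distribution. A short sketch, using the one-sided convention common in particle physics, shows why five sigma is such a strict bar:

```python
import math

def p_value(sigma):
    """One-sided Gaussian tail probability of an n-sigma upward fluctuation."""
    return 0.5 * math.erfc(sigma / math.sqrt(2.0))

print(f"4.2 sigma -> p = {p_value(4.2):.1e}")  # ~1.3e-5, about 1 in 75,000
print(f"5.0 sigma -> p = {p_value(5.0):.1e}")  # ~2.9e-7, about 1 in 3.5 million
```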

One reassurance, according to Ishak-Boushaki, is that the finding rests not only on DESI’s data but also on several other surveys of the universe.

Source: www.newscientist.com

Dark energy may evolve in unexpected ways as time progresses

New results from the DESI (Dark Energy Spectroscopic Instrument) collaboration reveal signs of time-varying dark energy.

Two “fans” corresponding to the two main regions of sky observed by DESI, above and below the plane of the Milky Way galaxy. Image credit: DESI Collaboration / DOE / KPNO / NOIRLab / NSF / AURA / R. Proctor.

“The universe never ceases to amaze and surprise us,” said Dr Arjun Dey, DESI project scientist at NSF NOIRLab and associate director for strategic initiatives of its Mid-Scale Observatories.

“By revealing the evolving texture of the fabric of our universe in unprecedented detail, DESI and the Mayall Telescope are changing our understanding of the future of our universe and of nature itself.”

Used alone, the DESI data are consistent with the standard model of the universe, Lambda-CDM, where CDM is cold dark matter and Lambda represents the simplest form of dark energy, acting as a cosmological constant.

However, when combined with other measurements, there are growing indications that the effect of dark energy may be weakening over time, and that other models may fit the data better.

Those other measurements include the leftover light from the dawn of the universe (the cosmic microwave background, or CMB), distance measurements from supernovae, and observations of how light from distant galaxies is distorted by the gravity of dark matter (weak lensing).

So far, the preference for evolving dark energy has not risen to 5 sigma, the gold standard in physics that represents the commonly accepted threshold for a discovery.

However, different combinations of DESI data with the CMB, weak lensing and supernova datasets yield significances ranging from 2.8 to 4.2 sigma.

The analysis used a technique that hid the results from the scientists until the very end, to reduce any unconscious bias in handling the data.

This approach sets a new standard for how data from large spectroscopic surveys are analysed.

DESI is a state-of-the-art instrument mounted on the Nicholas U. Mayall 4-meter Telescope at Kitt Peak National Observatory, a program of NSF NOIRLab.

It can capture light from 5,000 galaxies simultaneously, enabling one of the most extensive surveys ever conducted.

The experiment is now in its fourth of five years of surveying the sky, with plans to measure around 50 million galaxies and quasars (extremely distant but bright objects with black holes at their cores) and more than 10 million stars by the time the project finishes.

The new analysis uses data from the first three years of observations and includes nearly 15 million of the best-measured galaxies and quasars.

This is a major leap over DESI’s initial analysis, which also suggested evolving dark energy: the new dataset is more than twice as large, improving the experiment’s precision.

DESI tracks the effects of dark energy by studying how matter is spread throughout the universe.

Events in the very early universe left subtle patterns in the way matter is distributed, a feature called baryon acoustic oscillations (BAO).

The BAO pattern acts as a standard ruler, and its apparent size at different times is directly affected by how the universe was expanding.

Measuring the ruler at different distances lets researchers chart the strength of dark energy throughout cosmic history.
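The standard-ruler idea can be illustrated with a toy calculation; the ruler size below is the approximate BAO scale, and the angles are made-up inputs rather than survey values.

```python
import math

BAO_SCALE_MLY = 500.0  # approximate BAO ruler size, in millions of light-years

def distance_from_angle(angle_deg):
    """Small-angle estimate: distance = physical ruler size / angle in radians."""
    return BAO_SCALE_MLY / math.radians(angle_deg)

# The same fixed-size ruler subtends a smaller angle the farther away it is:
near = distance_from_angle(5.0)   # larger angle -> closer epoch
far = distance_from_angle(2.5)    # half the angle -> roughly twice the distance
print(near, far)
```

Real analyses work with redshifts and a full expansion model rather than this naive small-angle picture, but the core inference is the same: a known physical scale plus its apparent size yields a distance.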

The DESI collaboration has begun additional analyses to extract more information from the current dataset, and DESI continues to collect data.

Other experiments coming online over the next few years will also provide complementary datasets for future analyses.

“Our results are a fertile foundation for our theory colleagues as they look at new and existing models, and we look forward to seeing what they come up with,” says Dr. Michael Levi, DESI director.

“Whatever the nature of dark energy is, it will shape the future of our universe. It is remarkable that we can look up at the sky with our telescopes and try to answer one of the biggest questions humanity has ever asked.”

“These are remarkable results from a very successful project,” said Dr. Chris Davis, NSF program director for NSF NOIRLab.

“The powerful combination of the NSF Mayall Telescope and DOE’s Dark Energy Spectroscopic Instrument shows the benefits of federal agencies collaborating on fundamental science to improve our understanding of the universe.”

The physicists shared their findings in a series of papers posted on arXiv.org.

Source: www.sci.news

Research on Dark Energy Supports the Evolving Theory

The Lambda-CDM (ΛCDM) model has long been the basis of modern cosmology, successfully explaining the large-scale structure of the universe. It proposes that 95% of the cosmos consists of dark matter (25%) and dark energy (70%). Dark energy, represented by the cosmological constant (Λ), is thought to drive the accelerating expansion of the universe while maintaining a constant energy density over time. However, new results from the Dark Energy Survey suggest a departure from this assumption, hinting that dark energy may evolve over time.

This artist's impression shows the evolution of the universe, beginning with the Big Bang on the left, followed by the appearance of the cosmic microwave background. The formation of the first stars ends the cosmic dark ages, followed by the formation of galaxies. Image credit: M. Weiss / Harvard-Smithsonian Center for Astrophysics.

The Dark Energy Survey (DES) was carried out using the 570-megapixel Dark Energy Camera (DECam) mounted on the NSF Víctor M. Blanco 4-meter Telescope at Cerro Tololo Inter-American Observatory, a program of NSF NOIRLab.

Using data from 758 nights of observation over six years, DES scientists mapped almost one-eighth of the sky.

The project employed multiple observational techniques, including supernova measurements, galaxy clustering analysis and weak gravitational lensing, to study dark energy.

Two key DES measurements, baryon acoustic oscillations (BAO) and distance measurements of exploding stars (Type Ia supernovae), track the expansion history of the universe.

BAO refers to a standard cosmic ruler formed by sound waves in the early universe, with peaks spanning approximately 500 million light-years.

Astronomers can measure these peaks across different epochs of cosmic history to see how dark energy has stretched that scale over time.

“By analyzing 16 million galaxies, DES found that the measured BAO scale is actually 4% smaller than predicted by ΛCDM,” says Dr. Santiago Avila, an astronomer at the Centre for Energy, Environmental and Technological Research (CIEMAT).

Type Ia supernovae act as standard candles; in other words, their intrinsic brightness is known.

Their apparent brightness, combined with information about their host galaxies, therefore allows scientists to calculate distances accurately.
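The standard-candle arithmetic is the classic distance modulus. The magnitudes below are illustrative values, not DES measurements, and real analyses add corrections (dust, light-curve shape, redshift effects) omitted here:

```python
# Commonly quoted peak absolute magnitude for a Type Ia supernova (assumed here)
M_TYPE_IA = -19.3

def distance_parsecs(apparent_mag, absolute_mag=M_TYPE_IA):
    """Invert the distance modulus m - M = 5*log10(d_pc) - 5 to get distance in parsecs."""
    return 10.0 ** ((apparent_mag - absolute_mag + 5.0) / 5.0)

# A supernova observed at apparent magnitude 24 would sit at ~4.6 billion parsecs:
d_pc = distance_parsecs(24.0)
print(f"{d_pc / 1e9:.2f} billion parsecs")
```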

In 2024, the DES team released the most extensive and detailed supernova dataset to date, providing highly accurate measurements of cosmic distances.

The new findings, which combine the supernova and BAO data, independently confirm the anomaly seen in the 2024 supernova dataset.

By integrating the DES measurements with cosmic microwave background data, the researchers inferred the properties of dark energy, and the results suggest that it evolves with time.

If verified, this would imply that dark energy is not ultimately constant but a dynamic phenomenon, requiring a new theoretical framework.

“The results are interesting as they suggest physics beyond the standard model of cosmology,” says Dr. Juan Mena-Fernández, a researcher at the Laboratory of Subatomic Physics and Cosmology.

“If more data supports these findings, we may be on the brink of a scientific revolution.”

Although the current results remain inconclusive, future analyses incorporating additional DES probes, such as galaxy clustering and weak lensing, could strengthen the evidence.

Similar trends have emerged from other major cosmological projects, such as the Dark Energy Spectroscopic Instrument (DESI).

“We’ve seen similar hints in our research,” said Jesse Muir, a researcher at the University of Cincinnati.

“There’s still a lot to learn, and it’s exciting to see how our understanding evolves as new measurements become available.”

The team’s paper will be published in the journal Physical Review D.

____

T.M.C. Abbott et al. (DES Collaboration). 2025. Dark Energy Survey: Implications for cosmological expansion models from the final Baryon Acoustic Oscillation and Supernova data. Physical Review D, in press; arXiv: 2503.06712

Source: www.sci.news

Physicists suggest that ultra-high energy cosmic rays originate from neutron star mergers

Ultra-high-energy cosmic rays are the highest-energy particles in the universe, with energies more than a million times greater than anything humans can achieve.

Professor Farrar proposes that the merger of binary neutron stars is the source of all or most ultra-high-energy cosmic rays. This scenario can explain the mysterious, narrow energy range of ultra-high-energy cosmic rays, because the jets of binary neutron star mergers are generated by gravitationally driven dynamos and are therefore roughly identical, owing to the narrow range of binary neutron star masses. Image credit: Osaka Metropolitan University / L-INSIGHT, Kyoto University / Ryunosuke Takeshige.

The existence of ultra-high-energy cosmic rays has been known for nearly 60 years, but astrophysicists had not been able to formulate a satisfactory explanation of their origin that accounts for all observations to date.

A new theory introduced by Glennys Farrar at New York University provides a viable and testable explanation of how ultra-high-energy cosmic rays are created.

“After 60 years of effort, it is possible that the origin of the mysterious highest-energy particles in the universe has finally been identified,” Professor Farrar said.

“This insight provides a new tool for understanding the most energetic events in the universe: the merger of two neutron stars to form a black hole. This is the process responsible for creating many precious or exotic elements, including gold, platinum, uranium, iodine and xenon.”

Professor Farrar proposes that ultra-high-energy cosmic rays are accelerated in the turbulent, magnetized outflow ejected from the remnant of a binary neutron star merger, before the final black hole forms.

This process simultaneously generates powerful gravitational waves, some of which have already been detected by scientists of the LIGO-Virgo collaboration.

“For the first time, this work explains two of the most mysterious features of ultra-high-energy cosmic rays: the tight correlation between energy and charge, and the extraordinary energies of just a handful of the very highest-energy events,” Professor Farrar said.

“This study makes two predictions that can provide experimental validation in future work:

(i) the very highest-energy cosmic rays should be rare ‘r-process’ elements such as xenon and tellurium, motivating the search for such components in ultra-high-energy cosmic ray data;

(ii) bursts of very-high-energy neutrinos produced in ultra-high-energy cosmic ray collisions should necessarily be accompanied by the gravitational waves generated by the binary neutron star merger.”

The study appears in the journal Physical Review Letters.

____

Glennys R. Farrar. 2025. Binary Neutron Star Mergers as the Source of the Highest-Energy Cosmic Rays. Phys. Rev. Lett. 134, 081003; doi:10.1103/PhysRevLett.134.081003

Source: www.sci.news

Potential Massive Energy Sources Await Discovery in Earth’s Mountainous Regions

In the quest for clean energy and a shift away from fossil fuels, scientists may have uncovered a new source of power hidden in our mountains. Using advanced simulations, a team of researchers from Germany has identified vast reservoirs of hydrogen gas generated by rocks formed millions of years ago.

This discovery is significant because hydrogen (H2) emits no greenhouse gases when used as a power source, making it a more sustainable alternative to the fossil fuels that drive climate change. Additionally, using hydrogen as a fuel produces water rather than harmful emissions. The challenge, however, is that naturally produced hydrogen is rare, and current synthetic production relies on fossil fuels.

The main hurdle in hydrogen production is sourcing it naturally. While geological processes can generate natural hydrogen without the need for fossil fuels, the availability of large accessible reserves remains uncertain. The recent study conducted by German researchers could potentially address this issue.

“We may be on the brink of a new era in natural hydrogen exploration,” said Dr. Frank Zwaan, the lead author of the study published in the journal Science Advances. “This could pave the way for a new natural hydrogen industry.”

https://c02.purpledshub.com/uploads/sites/41/2025/02/Earths-mantle.mp4
The rocks that produce hydrogen gas originate in the Earth’s mantle, the thick layer beneath the Earth’s crust. Video credit: Getty Images

Researchers at the GFZ Helmholtz Center for Geosciences in Germany utilized simulations of plate tectonic processes to identify a substantial reserve of natural hydrogen.

Natural hydrogen can be generated through various methods, such as bacterial transformation of organic matter or the splitting of water molecules due to radioactivity in the Earth’s crust. However, one of the most promising natural methods involves a geological process known as “serpentinization,” where rocks from the Earth’s mantle react with water to release H2 gas.

According to researchers, when these hydrogen-rich rocks are situated near the Earth’s surface, they can create potential zones for large-scale hydrogen production via excavation. These rocks are brought closer to the surface through processes such as continental rifting and mountain formation over millions of years.

As crustal plates collide and create mountains, deep mantle rocks are pushed up toward the Earth’s surface. ‘Hot spots’ of hydrogen gas were identified where these rocks surfaced. – Image credit: CC BY-NC-SA 3.0 USGS/ESEU, edited by Frank Zwaan, GFZ

By analyzing two processes, researchers determined that mountain formation offers ideal conditions for hydrogen generation. The combination of cold environments in mountains and increased water circulation could enhance hydrogen levels significantly. Simulations showed that rocks emerging through mountain formations have 20 times the hydrogen capacity compared to those brought to the surface via continental rifting.

Signs of natural hydrogen production have already been observed in mountainous regions such as the Pyrenees, European Alps, and Balkans. The research team anticipates that their findings will inspire further exploration of natural hydrogen in these areas and other mountainous regions.

Professor Sascha Brune, head of the geodynamic modelling section at GFZ, emphasized the economic prospects tied to natural hydrogen. He stated, “It is now crucial to investigate hydrogen migration pathways and the microbial ecosystems that consume hydrogen, both shallow and deep, and to gain a better understanding of where potential hydrogen reservoirs can form.”


Source: www.sciencefocus.com

KM3NET continues to observe the highest energy cosmic neutrinos

The newly detected neutrino, called KM3-230213A, has an astonishing energy of 220 petaelectronvolts (PeV), making it one of the most energetic fundamental particles ever detected. Its energy is about 100 million times that of visible photons, and about 30 times the highest neutrino energy previously detected.
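To put 220 PeV in everyday terms, a quick unit conversion shows it is a macroscopic amount of energy for a single particle; the ping-pong-ball comparison below is a rough analogy, not a figure from the paper.

```python
# Back-of-envelope sketch: convert the quoted 220 PeV into joules.
EV_TO_JOULES = 1.602176634e-19   # 1 eV in joules (exact CODATA value)

energy_ev = 220e15               # 220 petaelectronvolts
energy_joules = energy_ev * EV_TO_JOULES
print(f"{energy_joules:.3f} J")  # ~0.035 J, roughly a ping-pong ball dropped from about a metre
```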



Artist’s impression of the ultra-high-energy neutrino event observed by KM3NeT/ARCA. Image credit: KM3NeT.

Cosmic neutrinos are generated near cosmic-ray sources or along cosmic-ray propagation paths, where cosmic rays produce secondary unstable particles that then decay into neutrinos.

Cosmic rays interacting in the Earth’s atmosphere generate atmospheric neutrinos, which form the experimental background to cosmic neutrinos.

To detect cosmic neutrinos, neutrino observatories monitor a huge volume of water or ice for Cherenkov light, which is induced by the passage of charged particles produced by neutrino interactions within or near the detector.

“Such high-energy neutrinos are extremely rare, and this makes it a monumental discovery,” says Professor Miroslav Filipović of Western Sydney University.

“This finding represents the most energetic neutrino ever observed, providing evidence that such high-energy neutrinos are being produced in the universe.”

“Detecting such extraordinary particles brings us closer to understanding the most powerful forces that shape our universe.”

KM3-230213A was detected by the KM3NeT telescope, which uses photomultiplier tubes to capture the light from charged particles generated when neutrinos interact within or near the detector.

“KM3NeT’s research infrastructure consists of two detector arrays of optical sensors deep in the Mediterranean Sea,” the physicists said.

“The ARCA detector is located approximately 3,450 m deep off the coast of Portopalo di Capo Passero in Sicily, Italy, and is connected to the shore station of INFN’s Laboratori Nazionali del Sud by an electro-optical cable.”

“ARCA's geometry is optimized for research into high-energy cosmic neutrinos.”

“The ORCA detector is located at a depth of approximately 2,450 m offshore from Toulon, France, and is optimized for studying neutrino oscillations.”

“Both detectors are still under construction, but they are already taking data.”

The KM3-230213A event registered more than 28,000 photons, providing a clear trajectory and compelling evidence of the particle’s cosmic origin.

“KM3NeT can reconstruct neutrino trajectories and energies,” says Dr. Luke Barnes of Western Sydney University.

“Creating neutrinos like these requires extreme cosmic conditions, such as exploding stars and supermassive black holes.”

“Follow-up work with radio telescopes, such as the Australian Square Kilometre Array Pathfinder, will help unlock their secrets.”

The researchers concluded that it is difficult to pinpoint the particle’s origin on the basis of a single neutrino.

Future observations will focus on detecting more such events in order to build a clearer picture of their origins.

“The energy of the KM3-230213A event is much greater than that of any neutrino detected so far,” the scientists said.

“This suggests that the neutrino may come from a different cosmic accelerator than lower-energy neutrinos, or that this could be the first detection of a cosmogenic neutrino, produced when ultra-high-energy cosmic rays interact with background radiation in the universe.”

The team’s paper was published in the February 12th issue of the journal Nature.

____

KM3NeT Collaboration. 2025. Observation of an ultra-high-energy cosmic neutrino with KM3NeT. Nature 638, 376-382; doi:10.1038/s41586-024-08543-1

Source: www.sci.news

Key Points from the Paris AI Summit: Global Inequalities, Energy Issues, and Elon Musk’s Influence on Artificial Intelligence


    1. America First

    A speech by US vice-president JD Vance disrupted the consensus on how to approach AI. He attended the summit alongside other global leaders including India’s prime minister Narendra Modi, Canadian prime minister Justin Trudeau and European Commission president Ursula von der Leyen.

    In his speech at the Grand Palais, Vance made clear that the US will not be hampered by an over-focus on global regulation and safety.

    “We need international regulatory regimes that foster the creation of AI technology rather than strangle it. In particular, our friends in Europe should look to this new frontier with optimism rather than trepidation,” he said.

    China also came in for criticism. In a clear reference to Beijing, Vance warned his peers against working with “authoritarian” regimes, in front of the country’s vice-premier Zhang Guoqing.

    “Some of us in this room have learned from experience that partnering with them means chaining your nation to an authoritarian master that seeks to infiltrate, dig in and seize your information infrastructure,” he said.

    Coming a few weeks after China’s DeepSeek rattled US investors with a powerful new model, Vance’s speech made clear that America is determined to remain the global leader in AI.


    2. Going it alone

    Perhaps unsurprisingly, in light of Vance’s exceptionalism, the US declined to sign the diplomatic declaration on “inclusive and sustainable” AI released at the end of the summit. However, the UK, a major player in AI development, also rejected it, saying the document did not go far enough in addressing AI’s global governance and national security implications.

    Meaningful global governance of AI now looks an even more distant prospect, given the failure to reach consensus over a seemingly uncontroversial document. The first summit, held at Bletchley Park in the UK in 2023, at least produced a voluntary agreement between major countries and tech companies on AI testing.

    A year later, the gathering in Seoul built carefully on the agreements made at Bletchley, but it was already clear by the opening night that this would not happen at the third gathering. In his welcoming speech, Macron threw shade at Donald Trump’s focus on fossil fuels, urging investors and tech companies to view France and Europe as AI hubs.

    Given the enormous energy consumption required by AI, Macron said France stands out because of its reliance on nuclear power.

    “I have a good friend on the other side of the ocean who says ‘drill, baby, drill’. There is no need to drill here. It’s ‘plug, baby, plug’. Electricity is available,” he said. The summit highlighted diverging national outlooks and a competitive mood.

    Nevertheless, Henry de Zoete, a former AI adviser to Rishi Sunak in Downing Street, said the UK had “played a blinder”. “By not signing the statement it has built significant goodwill with the Trump administration at almost no cost,” he wrote on X.


    3. Playing it safe?

    Safety, which topped the agenda at the UK summit, was not at the forefront in Paris despite continued concerns.

    Yoshua Bengio, the world-renowned computer scientist who chaired a major safety report released before the summit, told the Guardian in Paris that the world has not come to grips with the implications of highly intelligent AI.

    “We have a mental block to the idea that there are machines that are smarter than us,” he said.

    Demis Hassabis, head of Google’s AI unit, called for unity in dealing with AI after agreement over the declaration failed to materialise.

    “It’s very important that the international community continues to come together and discuss the future of AI. We all need to be on the same page about the future we are trying to create.”

    Pointing to potentially worrying scenarios, such as powerful AI systems behaving in unintended ways, he added that these are global concerns that require concerted international cooperation.

    Safety aside, some key topics did get a prominent hearing at the summit. Macron’s AI envoy, Anne Bouverot, said that AI’s current environmental trajectory is “unsustainable”, while Christy Hoffman, general secretary of the UNI Global Union, said that AI driving productivity improvements at the expense of workers’ welfare could make it an “engine of inequality”.


    4. Progress is accelerating

    There were many mentions of the pace of change. Hassabis said in Paris that artificial general intelligence – the term for AI systems that match or exceed humans at any intellectual task – is “probably five years or so away”.

    Dario Amodei, CEO of the US AI company Anthropic, said that by 2026 or 2027 AI systems will resemble “a whole new nation inhabited by highly intelligent people” appearing on the global stage.

    Encouraging governments to do more to measure the economic impact of AI, Amodei warned that advanced AI could represent “the greatest change to the global labour market in human history”.

    Sam Altman, CEO of ChatGPT developer OpenAI, flagged deep research, the startup’s latest release, which came out at the beginning of the month. It is an AI agent – the term for a system that performs tasks on a user’s behalf – and features a version of OpenAI’s latest cutting-edge model, o3.

    Speaking at a fringe event, he said deep research could already do “a low percentage of all tasks in the world’s economy at the moment… which is a crazy statement”.


    5. China offers help

    DeepSeek founder Liang Wenfeng did not attend the Paris summit, but there was no shortage of discussion about the startup’s achievements. Hassabis said DeepSeek was “probably the best work” he had seen come out of China. However, he added, there were “no actual new scientific advances”.

    Guoqing said China is willing to work with other countries to safeguard security, share its AI achievements and build a “community with a shared future for humanity”. Zhipu, a Chinese AI company present in Paris, predicted AI systems will achieve “consciousness” by 2030, adding to claims at the conference that large-scale AI is turning a corner.


    6. Musk’s shadow

    Despite not attending, the world’s wealthiest person was still able to influence events in Paris. A consortium led by Elon Musk launched a bid of nearly $100 billion for the nonprofit that controls OpenAI, triggering a flood of questions for Altman, who is seeking to convert the startup into a for-profit company.

    Altman told reporters that “the company is not for sale” and repeated his tongue-in-cheek counteroffer: “I’m happy to buy Twitter.”

    Asked about the future of OpenAI’s nonprofit, which is to be spun off as part of the overhaul while retaining a stake in the profit-making unit, Altman said: “…we’re completely focused on ensuring we save it.”

    In an interview with Bloomberg, Altman said the Musk bid was probably an attempt to “slow us down”. He added: “Probably his whole life is from a position of insecurity. I feel for the guy.”

Source: www.theguardian.com

Energy storage potential of batteries made from industrial waste

A redox flow battery at a power plant in Japan. New process could replace rare metals in these batteries with industrial byproducts

Photo by Alessandro Gandolfi/Panos

Industrial waste has been reborn as a battery component that can stably store a large amount of electrical charge. Such batteries could serve an important function for the power grid by smoothing out the peaks and valleys of renewable energy.

A redox flow battery (RFB) stores energy as two liquids, called an anolyte and a catholyte, held in a pair of tanks. When these fluids are pumped into a central chamber separated by a thin membrane, they react chemically, releasing electrons and generating electricity. The process can be reversed to recharge the battery by passing an electric current across the membrane.

Although such batteries are cheap, they have drawbacks. They are bulky, often as large as shipping containers, and require regular maintenance because of the moving parts needed to pump the liquids. They also rely on metals such as lithium and cobalt, which are in short supply.

Now, Emily Mahoney and colleagues at Northwestern University in Evanston, Illinois, have discovered a simple process that can turn previously useless industrial waste into a useful anolyte, potentially replacing these scarce metals.

Their process converts triphenylphosphine oxide, a byproduct of manufacturing products such as vitamin tablets, into cyclic triphenylphosphine oxide, which is better at accumulating negative charge. When used as an anolyte, it showed no loss of effectiveness after 350 charge-discharge cycles.

“Using an anolyte with a very negative potential increases the potential across the cell and therefore increases the efficiency of the battery,” Mahoney says. “But often the increased potential comes with stability issues, so it's exciting to have a stable yet highly negative compound.”

Mahoney says RFBs are designed to be safe and high-capacity, so they could be used to store energy from wind and solar power, but their bulk makes it unlikely they will replace the lithium-ion batteries in cars and smartphones.


Source: www.newscientist.com

Researchers find that fluctuations in the kinetic energy of the expanding universe are often mistaken for dark energy

Dark energy, the unknown energy source accelerating the expansion of the universe, doesn't actually exist, according to a new study.

This artist's impression shows the evolution of the universe, starting with the Big Bang on the left and continuing with the emergence of the Cosmic Microwave Background. The formation of the first stars ends the Dark Ages of the universe, followed by the formation of galaxies. Image credit: M. Weiss / Harvard-Smithsonian Center for Astrophysics.

Dark energy is generally thought to be a weak antigravity that acts independently of matter and accounts for about two-thirds of the mass-energy density of the universe.

The lambda cold dark matter (ΛCDM) model, which has served as the standard cosmological model for a quarter of a century, requires dark energy to explain the observed acceleration in the expansion rate of the universe.

Astrophysicists base this conclusion on measurements of distances to supernova explosions in distant galaxies, which appear to be farther away than they should be if the expansion of the universe is not accelerating.

However, the current expansion rate of the universe is increasingly being questioned by new observations.

First, evidence from the Big Bang's afterglow (cosmic microwave background radiation) shows that the expansion of the early Universe is inconsistent with the current expansion, an anomaly known as the Hubble tension.

Furthermore, in an analysis of new high-precision data from the Dark Energy Spectroscopic Instrument (DESI), scientists found that the data are better fit by a model in which dark energy does not remain constant but evolves over time, at odds with the ΛCDM model.

Both the Hubble tension and the surprises revealed by DESI are difficult to resolve with models that rely on the simplified, century-old expansion law of the universe known as the Friedmann equation.

This law assumes that the universe expands uniformly on average, as if you could put all the cosmic structures in a blender and make a featureless soup with no complex structure.

But the current universe actually contains a complex cosmic web: sheets and filaments of galaxy clusters that surround and thread vast voids.

“Our findings show that dark energy is not needed to explain why the universe appears to be expanding at an accelerating rate,” said Professor David Wiltshire of the University of Canterbury in New Zealand.

“Dark energy is a misidentification of fluctuations in the kinetic energy of expansion, which is not uniform in the lumpy universe we actually live in.”

“This study provides compelling evidence that may answer some of the key questions about the quirks of our expanding universe.”

“With new data, the universe's greatest mysteries could be solved by the end of the decade.”

The new evidence supports the timescape model of cosmic expansion, which holds that dark energy is not needed: the differences in the stretching of light result not from an accelerating universe but from how we calibrate time and distance.

An ideal clock in empty space would tick faster than in a galaxy, since gravity slows time down.

This model suggests that a clock in the Milky Way ticks about 35% slower than the same clock at an average location in a large cosmic void. That means billions more years have passed in the voids.

All that extra time allows for more expansion, so as these vast voids grow to dominate the universe, the expansion appears to accelerate.
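The arithmetic behind this claim can be sketched in a few lines. This is an illustrative back-of-envelope check, not a calculation from the paper: it assumes "35% slower" means a Milky Way clock ticks at 0.65 times the rate of an ideal void clock, and takes 13.8 billion years as the age measured from a galaxy.

```python
# Back-of-envelope check of the timescape clock-rate claim.
# Assumption (not from the paper): a Milky Way clock ticks at
# 0.65x the rate of an ideal clock in a large cosmic void.

GALAXY_AGE_GYR = 13.8     # age of the universe as measured in a galaxy
CLOCK_RATE_RATIO = 0.65   # galaxy clock rate / void clock rate (assumed)

void_age_gyr = GALAXY_AGE_GYR / CLOCK_RATE_RATIO
extra_gyr = void_age_gyr - GALAXY_AGE_GYR

print(f"Time elapsed in the void: {void_age_gyr:.1f} billion years")
print(f"Extra time in the void:   {extra_gyr:.1f} billion years")
```

Under these assumptions, roughly 7 billion more years have elapsed in the void, consistent with the "billions more years" quoted above.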

“We now have so much data that, in the 21st century, we can finally begin to answer the question of how and why a simple average expansion law emerges from complexity,” said Professor Wiltshire.

“A simple expansion law consistent with Einstein's theory of general relativity does not need to obey the Friedmann equation.”

“ESA's Euclid satellite, launched in July 2023, has the ability to test and differentiate the Friedman equation from timescape alternatives.”

“However, this will require at least 1,000 independent high-quality supernova observations.”

The study was published in Monthly Notices of the Royal Astronomical Society: Letters.

_____

Antonia Seifert et al. 2025. Supernovae evidence for foundational change to cosmological models. MNRASL 537 (1): L55-L60; doi: 10.1093/mnrasl/slae112

Source: www.sci.news

Newly Uncovered Massive Energy Reserve Found Beneath Earth’s Crust

The issue of energy consumption and its sources has always been a significant concern in the context of the climate crisis. In response, efforts are being made to utilize cleaner and newer fuels. Recently, a groundbreaking discovery of vast reservoirs of hydrogen energy hiding beneath the Earth’s surface has emerged, prompting questions about its potential impact.

Naturally occurring geological hydrogen is formed through Earth’s geochemical processes and has so far been identified in only a few locations, such as Albania and Mali. Research published in the journal Science Advances suggests that these reserves are in fact widespread globally.

The study posits that if just 2 percent of the underground hydrogen could be extracted, it could yield 1.4 × 10^16 Joules of energy, equivalent to the world population’s energy consumption in 35 minutes. This amount of energy exceeds that of all natural gas reserves on Earth and could aid in achieving net-zero carbon goals.
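For sanity-checking energy figures like these, a mass of hydrogen can be converted into its chemical energy content using hydrogen’s lower heating value of about 120 MJ/kg, a standard physical constant. The helper below is an illustrative aid, not part of the study:

```python
# Convert a mass of hydrogen into its chemical energy content.
# Uses hydrogen's lower heating value (~120 MJ/kg), a standard
# physical constant; this helper is illustrative, not from the study.

H2_LHV_J_PER_KG = 120e6  # joules released per kilogram of hydrogen burned

def hydrogen_energy_joules(mass_kg: float) -> float:
    """Chemical energy content of `mass_kg` of hydrogen, in joules."""
    return mass_kg * H2_LHV_J_PER_KG

# Example: one megatonne (1e9 kg) of hydrogen
print(f"{hydrogen_energy_joules(1e9):.1e} J")  # 1.2e+17 J
```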

While current methods for obtaining hydrogen involve fossil fuels or water-intensive electrolysis processes with a carbon footprint, extracting geological hydrogen is a comparatively low-carbon process, albeit currently practiced only in Mali.

Researchers at the U.S. Geological Survey have developed a model combining knowledge of hydrogen occurrence and geological data to explore these reservoirs on a global scale, estimating a substantial amount of hidden hydrogen beneath the Earth’s surface.

However, experts are hesitant about committing resources to extraction due to the scale and infrastructure required, as highlighted by geoscientist Professor Bill McGuire from University College London (UCL). He emphasizes the abundance of renewable energy sources like wind and solar and questions the necessity of tapping into another finite resource.

About our experts

Professor Bill McGuire is a volcanologist, climatologist, and author currently serving as Professor of Geophysics and Climate Hazards at UCL. His works include books on natural disasters, environmental change, and climate solutions.


Source: www.sciencefocus.com

New images of Messier 83 captured by the Dark Energy Camera reveal unexpected discoveries

The spiral arm of Messier 83, one of the most prominent spiral galaxies in the night sky, exhibits a high rate of star formation, with six supernovae observed, according to astronomers at NSF’s NOIRLab.



This DECam image shows the spiral galaxy Messier 83. Image credits: CTIO / NOIRLab / DOE / NSF / AURA / TA Chancellor, University of Alaska Anchorage & NSF NOIRLab / D. de Martin, NSF NOIRLab / M. Zamani, NSF NOIRLab.

Messier 83 is located approximately 15 million light-years away in the southern constellation Hydra.

The galaxy, also known as the Southern Pinwheel Galaxy, M83, NGC 5236, LEDA 48082, and UGCA 366, has a diameter of about 50,000 light-years, making it roughly half the size of the Milky Way.

With an apparent magnitude of 7.5, it is one of the brightest spiral galaxies in the night sky, and May is the best month to observe it with binoculars.

Messier 83 is oriented almost completely face-on from Earth, meaning astronomers can observe its spiral structure in great detail.

This galaxy is a prominent member of a group of galaxies known as the Centaurus A/M83 group, which also counts dusty NGC 5128 and irregular galaxy NGC 5253 as members.

It was discovered on February 23, 1752 by French astronomer Nicolas Louis de Lacaille.

“Between 1750 and 1754, the French astronomer Nicolas-Louis de Lacaille studied the night sky with the purpose of determining distances to planets,” NOIRLab astronomers said.

“During this period, he observed and cataloged 10,000 stars and identified 42 nebular objects, including Messier 83, which he discovered during an expedition to the Cape of Good Hope in 1752.”

“In 1781, Charles Messier added it to his famous catalog and described it as a ‘starless nebula’, reflecting the limited knowledge of galaxies at the time.”

“It wasn’t until the 20th century, thanks to the work of Edwin Hubble, that astronomers realized that objects like Messier 83 were actually in another galaxy far outside the Milky Way.”

The new images of Messier 83 were captured by the Dark Energy Camera (DECam), mounted on NSF’s Víctor M. Blanco 4-meter Telescope at the Cerro Tololo Inter-American Observatory, a program of NSF NOIRLab.

“This image shows Messier 83’s distinct spiral arms filled with clouds of pink hydrogen gas where new stars are forming,” the astronomers said.

“Interspersed between these pink regions are bright blue clusters of hot young stars whose ultraviolet radiation has blown away the surrounding gas.”

“At the center of the galaxy, a yellow central bulge is made up of old stars, and a faint bar connects the spiral arms through the center, funneling gas from the outer regions toward the center.”

“DECam’s high sensitivity captures Messier 83’s extended halo and the countless more distant galaxies in the background.”

“Just as Messier 83 is filled with millions of newly formed stars, this galaxy is also home to many dying stars,” they added.

“Over the past century, astronomers have witnessed a total of six stellar explosions called supernovae in Messier 83. Only two other galaxies can match this number.”

In 2006, astronomers discovered a mysterious feature at the center of Messier 83.

“At the center of this galaxy, we discovered a never-before-seen concentration of mass resembling a secondary nucleus, likely the remains of another galaxy being consumed by Messier 83 in an ongoing collision, probably the same collision that caused the starburst activity,” the researchers said.

“The two nuclei, each of which likely contains a black hole, are expected to coalesce into a single nucleus in another 60 million years.”

Source: www.sci.news

Chips linked with light could speed up AI training while reducing energy consumption.


IBM optical module prototype for connecting chips with optical fibers

IBM’s Ryan Rabin

Fiber optic technology helps chips communicate with each other at the speed of light, allowing them to transmit 80 times more information than using traditional electrical connections. This could significantly reduce the training time required for large-scale artificial intelligence models from months to weeks, while also reducing data center energy and emissions costs.

Most cutting-edge computer chips still communicate using electrical signals sent over copper wires. But as the tech industry rushes to train AI models at scale, a process that requires networks of AI superchips to shuttle large amounts of data, companies are increasingly keen to link chips together with fiber-optic, speed-of-light communication.

This technology is not new. The Internet already relies on undersea fiber-optic cables that stretch thousands of kilometers between continents. But to transmit data between fingernail-sized chips, companies need to connect as many hair-thin optical fibers as possible to the end of each chip.

“As everyone knows, the best communication technology is fiber optics. That’s why fiber optics is used everywhere for long-distance communications,” says Mukesh Khare at IBM Research, who gave a preview of the technology at a press conference. “This co-packaged optics innovation essentially brings the power of fiber optics to the chip itself.”

Khare and his colleagues have developed an optical module that allows chipmakers to add six times more optical fibers to the edge of a chip than with current technology. This module uses a structure called an optical waveguide to connect 51 optical fibers per millimeter. It also prevents optical signals from one fiber from interfering with adjacent fibers.

“What IBM has really done here is take advantage of all of its materials and packaging technology, its history of leadership in that field, to break through in using waveguides to achieve high-density optical fiber,” says Dan Hutcheson at TechInsights, a semiconductor technology research company headquartered in Canada. “For me, when I saw this, it was a big step forward.”

The result is enhanced chip-to-chip communication, potentially allowing AI developers to train large language models in less than three weeks instead of three months. Switching from wires to fiber optics for chip communications could also mean cutting energy costs for training such AI models by a factor of five.

IBM has already put its optical modules through stress tests that include high humidity and temperatures ranging from -40°C (-40°F) to 125°C (257°F). Hutcheson expects large semiconductor manufacturing companies may be interested in licensing the technology.
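The Fahrenheit equivalents quoted for the stress-test range follow from the standard Celsius-to-Fahrenheit conversion, which can be checked directly:

```python
# Standard Celsius-to-Fahrenheit conversion, used to verify the
# temperature figures quoted for IBM's stress tests.

def c_to_f(celsius: float) -> float:
    return celsius * 9 / 5 + 32

print(c_to_f(-40))  # -40.0 (the point where the two scales coincide)
print(c_to_f(125))  # 257.0
```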

“We are in the early days of all of this, but semiconductor technology is the hottest area right now in terms of high-performance computing and AI technology,” he says.


Source: www.newscientist.com

DESI seeks proof of dark energy originating from black holes

    According to the popular theory of cosmic inflation, a mysterious energy drove an exponential expansion of the universe at the beginning of the Big Bang, ultimately giving rise to all known matter. That ancient energy shared important characteristics with the dark energy of the present-day universe. Where in the later universe is gravity as strong as it was at the beginning of the universe? The answer is at the center of a black hole. What happened during inflation could also run in reverse, with the matter of a massive star becoming dark energy again during gravitational collapse, like a mini Big Bang played backwards. A new study strengthens the evidence for this scenario using recent data from the Dark Energy Spectroscopic Instrument (DESI).

A view of the accretion disk surrounding a supermassive black hole and the jet-like structures flowing out of the disk. The black hole's extreme mass bends space-time so that the backside of the accretion disk can be seen as an image above and below the black hole. Image credit: Science Communication Lab, DESY.

“If black holes contain dark energy, they can couple to the expanding universe and grow in mass faster,” said Dr. Kevin Croker, an astronomer at Arizona State University.

“We can't know the details of how this is happening, but we can see evidence that it's happening.”

Data from the first year of DESI's planned five-year study shows intriguing evidence that the density of dark energy has increased over time.

This provides a compelling clue in support of this idea of what dark energy is, because that increase over time matches how the number and mass of black holes have increased over time.

“When I first got involved in this project, I was very skeptical,” said Boston University professor Steve Ahlen.

“But I remained open-minded throughout the process, and when I started doing the cosmological calculations, I said, 'This is a really cool mechanism for creating dark energy.'”

To look for evidence of dark energy from black holes, astronomers used tens of millions of distant galaxies measured by DESI.

The instrument looks billions of years into the past, collecting data that can be used to determine with great precision how fast the universe is expanding.

Furthermore, these data can be used to infer how the amount of dark energy changes over time.

The researchers compared these data to how many black holes have been created by large star explosions throughout the history of the universe.

“The two phenomena were consistent with each other: as new black holes were created by the deaths of massive stars, the amount of dark energy in the universe increased in the right way,” said Dr. Duncan Farrah, an astronomer at the University of Hawaii.

“This makes the theory that black holes are the source of dark energy more plausible.”

This study complements a growing literature investigating the possibility of cosmological coupling in black holes.

A 2023 study reported cosmological coupling in a supermassive black hole at the center of a galaxy.

This study encouraged other teams to investigate the effects of black holes in different parts of the universe.

“These papers explore the relationship between dark energy and black holes in terms of their growth rate,” said Dr. Brian Cartwright, an astrophysicist at Healthpeak Properties and former general counsel of the U.S. Securities and Exchange Commission.

“Our new paper links dark energy to when black holes are born.”

The main difference in the new paper is that most of the black holes involved are younger than those studied previously.

These black holes were born at a time when star formation, which tracks black hole formation, was well underway, not just beginning.

Professor Roger Windhorst of Arizona State University said: “This happened fairly late in the universe, and is informed by recent measurements of black hole formation and growth observed by the Hubble and Webb space telescopes.”

“The next question is: where are these black holes, and how have they been moving around for the past eight billion years? Scientists are now working to pin this down,” Dr. Croker said.

Science needs more research and observation tools, and now that DESI is online, this exploration of dark energy is just beginning.

“Whether or not we continue to support the black hole hypothesis, this only brings further depth and clarity to our understanding of dark energy,” Professor Ahlen said.

“I think it's great as an experimental endeavor. You can have preconceptions or not, but we are driven by data and observation.”

Regardless of what future observations yield, the research being conducted now represents a major shift in dark energy research.

“Essentially, whether black holes, coupled to the universe in which they live, are dark energy is no longer just a theoretical question; it is now an experimental question,” said Professor Gregory Tarlé of the University of Michigan.

The study was published in the Journal of Cosmology and Astroparticle Physics.

_____

Kevin S. Croker et al. 2024. DESI dark energy time evolution is recovered by cosmologically coupled black holes. JCAP 10: 094; doi: 10.1088/1475-7516/2024/10/094

This article is adapted from the original release by the University of Michigan.

Source: www.sci.news

Clean energy rollout suggests China’s emissions may have already peaked

Solar panels installed on a barren mountainside in Zhangjiakou City, part of China's rollout of solar power generation

Cost Photo/NurPhoto/Getty Images

With large-scale deployment of wind and solar power across China, the country's emissions may have peaked in 2023, potentially marking a historic turning point in the fight against climate change.

China's CO2 emissions hit a record high in 2023 as the economy recovered from the effects of the coronavirus pandemic. But since then, large amounts of wind and solar power have been added to the country's grid, while emissions from the construction industry have declined.

China's carbon dioxide emissions remained flat from July to September 2024, after falling by 1% in the second quarter of this year, according to a new analysis. This means that overall emissions in 2024 could be flat or slightly down at 2023 levels.

This would be critical for tackling global climate change, says Lauri Myllyvirta at the Centre for Research on Energy and Clean Air, a Finnish think tank. “For the past eight years, since the signing of the Paris climate agreement, China's emissions growth has been the main driver of global emissions,” he says.

In its climate change plan submitted to the United Nations, China pledged to peak greenhouse gas emissions by 2030 and achieve net-zero emissions by 2060. But experts warn that this plan is not very ambitious, given the outsized impact that China, the world's largest emitter, has on global climate change.

It's important for China to bring emissions to a peak as soon as possible, says Myllyvirta. “This would pave the way for the country to start reducing emissions much sooner than current commitments require,” he says. “This will have huge implications for global efforts to avoid catastrophic climate change.”

China is rushing to ramp up power supplies across the country to meet rapidly growing electricity demand. Demand increased by 7.2% year-on-year from July to September, driven by rising living standards and increased use of air conditioning during the strong heatwave from August to September.

New renewable energy sources are being introduced at breakneck speed across China to fill the gap. From July to September, compared with the same period in 2023, solar power generation increased by 44 per cent and wind power generation by 24 per cent. On its current trajectory, China's solar power growth this year will rival Australia's total electricity generation in 2023.

However, coal-fired power use still increased by 2% and gas-fired generation by 13% from July to September in response to rising demand, resulting in an overall 3% increase in CO2 emissions from China's power sector during this period. These were offset, however, by a slowdown in construction across China as real estate investment declined.

Oil demand also fell by 2% in the third quarter of this year, as electric vehicles continue to make up a larger share of China's car fleet. By 2030, almost one in three cars on China's roads is expected to be electric.

Myllyvirta carried out the analysis for the website Carbon Brief, using official figures and commercial data. “If the rapid growth of clean energy is sustained, it will pave the way for sustainable emissions reductions,” he says.

However, he warns that flat or declining emissions in 2024 are not guaranteed, as government stimulus measures to boost the economy could push emissions up in the final three months of the year. For 2024 emissions to come in below 2023 levels, he calculates, carbon emissions must fall by at least 2% over those three months.

Still, the Chinese government has signaled that it expects the country's emissions to keep rising until the end of the decade, a path that would use up much of the remaining global carbon budget for limiting warming to 1.5°C.


Source: www.newscientist.com

The Free Energy Principle: Is One Idea Enough to Explain the Existence of Everything?

Neuroscience seems like an unlikely place to find fundamental truths that might apply to everything in the universe. Brains are special objects that do things few other objects in the universe can do. They perceive. They act. They read magazine articles. Usually, they are the exception, not the rule.

Perhaps this is why the free energy principle (FEP) has attracted so much attention. In the early 2000s, what began as a tool to explain cognitive processes such as perception and behavior came to be presented as a “unified brain theory”. The FEP was then put forward as a definition of life beyond the brain and, inevitably, as the basis for a new kind of artificial intelligence capable of reasoning. Today, some proponents argue that the FEP even encapsulates what it means for something to exist in the universe. “The free energy principle can be read as the physics of self-organization,” says its originator, Karl Friston, at University College London. “It's a description of what lasts.”

But some researchers, frustrated by the shifting scope, are skeptical that the FEP can deliver on many of its loftiest promises. “It has been a moving target,” says Matteo Colombo, a philosopher and cognitive scientist at Tilburg University in the Netherlands.

All of this makes FEP a source of both fascination and frustration. While notoriously difficult to understand, its dizzying breadth is key to its enduring appeal. Therefore, given the claim that it can be used to explain…

Source: www.newscientist.com

Amazon.com partners with nuclear energy industry to address data center needs

Amazon.com has recently signed three agreements to support the development of small modular reactor (SMR) nuclear power technology. The deals aim to address rapidly growing demand for power, particularly from the data centers through which Amazon has solidified its position as a major player in the high-tech industry.

One of the agreements involves Amazon funding a feasibility study for an SMR project at Energy Northwest’s site in Washington state. X-Energy will be responsible for developing the SMR, with financial specifics remaining undisclosed.

As per the agreement, Amazon will have the option to procure power from the first four modules. Energy Northwest, a group of state utilities, may expand the project with additional 80 MW modules, bringing total capacity to as much as 960 MW, enough to supply more than 770,000 US homes. Excess energy beyond Amazon’s share would go to utilities for residential and commercial use.
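The module arithmetic implied by these figures can be checked in a few lines. This is a sketch: the 12-module count is derived from 960 MW divided by 80 MW and is not stated explicitly in the article.

```python
# Check the arithmetic behind the Energy Northwest capacity figures.
# The 12-module count is derived (960 / 80), not quoted directly.

MODULE_MW = 80          # capacity of one SMR module
TOTAL_MW = 960          # maximum project capacity
HOMES_POWERED = 770_000  # homes the article says could be supplied

modules = TOTAL_MW // MODULE_MW
avg_kw_per_home = TOTAL_MW * 1000 / HOMES_POWERED

print(f"{modules} modules x {MODULE_MW} MW = {modules * MODULE_MW} MW")
print(f"~{avg_kw_per_home:.2f} kW of capacity per home served")
```

The roughly 1.25 kW of capacity per home is in line with typical average US household electricity demand, which makes the 770,000-home figure plausible.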

Matt Garman, CEO of Amazon Web Services, expressed, “Our agreement will expedite the advancement of new nuclear technologies that will provide energy for years to come.”

SMR leverages factory assembly of components to reduce construction expenses, a departure from the conventional on-site assembly of large nuclear reactors. While some critics argue that achieving economies of scale with SMR technology may be costly, it remains a promising development.

Nuclear power, known for its near-zero greenhouse gas emissions and creation of high-wage union jobs, garners bipartisan support in the US. Despite this, the country does not yet have a working SMR; NuScale was recently the lone US entity to secure an SMR design license from the US Nuclear Regulatory Commission.

Furthermore, SMRs produce long-lived radioactive waste, and the US lacks a definitive disposal site for such byproducts. Scott Burnell, a spokesperson for the US Nuclear Regulatory Commission, said that regulators still await detailed information about planned SMR deployments.

Source: www.theguardian.com

Lightning can generate energy waves that travel vast distances into space.

Lightning can create special energy waves

Room the Agency/Alamy

This overlooked mechanism could allow lightning energy to reach the top of the atmosphere, threatening the safety of satellites and astronauts.

When lightning strikes, the energy it carries can create special electromagnetic waves called whistlers, so named because they can be converted into sound signals. For decades, researchers thought that the whistlers produced by lightning remained confined to altitudes relatively close to the Earth's surface, below about 1,000 kilometers.

Now, Vikas Sonwalkar and Amani Reddy, researchers at the University of Alaska Fairbanks, have discovered that some whistlers bounce off the ionosphere, a layer of the atmosphere filled with charged particles, allowing the waves and the energy they carry to travel up to 20,000 kilometers above Earth's surface, all the way into the magnetosphere, the region of space governed by Earth's magnetic field.

Researchers found evidence of these reflective whistlers in data from the Van Allen Probes, twin robotic spacecraft that measured the magnetosphere between 2012 and 2019. They also found hints of the phenomenon in studies published in the 1960s. Both the old and new data indicate that the phenomenon is very frequent and happens all the time, Reddy said.

In fact, the lightning may be depositing twice as much energy into this region as previously estimated, the researchers say, and this energy charges and accelerates nearby particles, creating electromagnetic radiation that can damage satellites and endanger the health of astronauts.

“Lightning has always been considered a bit of a smaller player. Until 10 years ago, this data wasn't available and we'd never looked at it at this level of detail,” says Jacob Bortnik, a researcher at the University of California, Los Angeles. He says the new study is a call for others to develop a more accurate picture of the magnetosphere.

Establishing the connection between lightning and the magnetosphere is also important because changes in Earth's climate could make lightning storms more frequent, Sonwalkar said.

The research team now hopes to analyze data from more satellites to learn more about how lightning-based whistlers are distributed in the magnetosphere and how they are affected by space weather.


Source: www.newscientist.com

Is the future of nuclear fusion at risk? Examining the challenges facing the International Experimental Reactor | Energy

It was a project that promised the sun: researchers would use some of the most cutting-edge technology in the world to design machines capable of generating atomic fusion, the process that powers stars, to create a cheap, non-polluting source of electricity.

This was originally the purpose of the International Thermonuclear Experimental Reactor (Iter). Thirty-five countries, including European countries, China, Russia and the United States, agreed to build the reactor in Saint-Paul-lès-Durance in the south of France at an initial cost of $6 billion. Work began in 2010, with the promise of producing an energy-producing reaction by 2020.

Then reality set in: Cost overruns, the coronavirus, corrosion of key components, last-minute redesigns, and disputes with nuclear safety regulators have caused delays, and it was just announced that ITER won’t be ready for another decade. To make matters worse, the energy-producing fusion reaction won’t occur until 2039, adding another $5 billion to ITER’s already ballooning $20 billion budget.

Other estimates put the final cost much higher: Scientific American suggested ITER could become “the most delayed and costly scientific project in history”, while the journal Science said only that ITER was currently facing “major problems” and Nature noted that the project “has been plagued by a series of delays, cost overruns and management problems”.

Scientists warn that dozens of private companies, including Oxford-based Tokamak Energy and the US company Commonwealth Fusion Systems, are now threatening to develop fusion reactors on much shorter timelines.

“The problem is that ITER has been going for so long and suffered so many delays that the rest of the world has moved on,” said Robbie Scott, a nuclear fusion expert at the UK Science and Technology Facilities Council. “A lot of new technology has come along since ITER was planned, and that has left the project with serious problems.”

The Iter plant, under construction in Saint-Paul-lès-Durance in the south of France, opened in June. Photo: EJF Riche/Iter Organization

Question marks now hang over the world’s most ambitious technological project, which seeks to understand the process that powers stars, in which two light atomic nuclei combine to form one heavy one, releasing a huge amount of energy – nuclear fusion, which only occurs at very high temperatures.

To generate this heat, doughnut-shaped reactors called tokamaks use magnetic fields to confine a plasma of hydrogen nuclei, then bombard it with particle beams and microwaves. When temperatures reach millions of degrees Celsius, a mixture of two hydrogen isotopes (deuterium and tritium) fuses to form helium, neutrons, and a huge amount of excess energy.
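The energy released by this reaction can be checked from the mass difference between reactants and products. A minimal sketch, using rounded atomic mass values from standard reference tables (illustrative figures, not ITER data):

```python
# Mass-defect check for D + T -> He-4 + n, using approximate atomic
# masses (in atomic mass units, u) from standard reference tables.
U_TO_MEV = 931.494  # energy equivalent of 1 u, in MeV

m_deuterium = 2.014102
m_tritium   = 3.016049
m_helium4   = 4.002602
m_neutron   = 1.008665

# Mass "lost" in the reaction reappears as kinetic energy (E = mc^2).
mass_defect = (m_deuterium + m_tritium) - (m_helium4 + m_neutron)
energy_mev = mass_defect * U_TO_MEV
print(f"Energy released per fusion: {energy_mev:.1f} MeV")  # ~17.6 MeV
```

Each fusion event therefore yields about 17.6 MeV, most of it carried away by the neutron, which is what a power plant would capture as heat.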

Containing plasma at such high temperatures is extremely difficult. “The original plan was to line the tokamak reactor with beryllium as a protective covering, but this proved extremely difficult and because beryllium is toxic, they ultimately decided to replace it with tungsten,” says David Armstrong, professor of materials science and engineering at the University of Oxford. “This was a major late design change.”

Then, after it was discovered that huge parts of the South Korean-made tokamak had not been fitted together properly, threatening to leak radioactive material, French nuclear regulators ordered construction of the plant halted. Further delays were announced as problems mounted.

Then came COVID-19. “The pandemic caused factories supplying components to close, resulting in related workforce cuts, backlogs in shipments and difficulties in carrying out quality-control inspections,” ITER Secretary General Pietro Barabaschi acknowledged.

So ITER has once again pushed back completion, this time by another decade. At the same time, researchers using other approaches to nuclear fusion are making breakthroughs. In 2022, the US National Ignition Facility in California announced that it had used a laser to superheat deuterium and tritium and fuse them to produce helium and surplus energy, which is ITER’s goal.


Other fusion projects also claim they too could soon achieve breakthroughs. “The past decade has seen a proliferation of private fusion companies promising to do things differently from ITER – faster, cheaper – and, to be fair, some of them have likely overpromised,” said Brian Aperbe, a research physicist at Imperial College London.

It remains to be seen whether ITER will weather these crises and whether its backers will continue to fund it. ITER, for its part, argued to the Observer that there was still promising work left to be done.

One example is research into how to produce tritium, a rare hydrogen isotope essential for fusion reactors. It can be made by bombarding lithium samples with neutrons produced in a fusion reactor, producing helium and tritium in the process. “That’s a worthwhile experiment in itself,” Aperbe said.

But a spokesman rejected claims that ITER was “hugely problematic” and dismissed the notion that it was a record-breaking science project in terms of cost overruns and delays: just look at the International Space Station or Britain’s HS2 rail link, he said.

Some have pointed out that fusion power’s limited carbon emissions could help the fight against climate change. “But fusion will be too slow to reduce carbon emissions in the short term,” says Aneeka Khan, a fusion researcher at the University of Manchester. “Only once fusion power plants are producing significant amounts of electricity later in the century will they help curb carbon emissions, which will be crucial in the fight against climate change.”

Source: www.theguardian.com

Small drones powered by solar energy could fly indefinitely

CoulombFly, a prototype of a small solar-powered drone

Wei Shen, Jingze Peng, and Mingjin Qi

Weighing just 4 grams, the drone is the smallest solar-powered aircraft ever to fly, thanks to special electrostatic motors that generate extremely high voltages and tiny solar panels. Though the hummingbird-sized prototype only lasted an hour, developers say the approach could lead to insect-sized drones that can remain airborne indefinitely.

Small drones are an attractive solution to a variety of problems in communications, espionage and search and rescue, but they suffer from short battery life, while solar-powered drones struggle to generate enough power to be self-sustaining.

When solar-powered drones are made smaller, their solar panels shrink and the available energy decreases, says Mingjin Qi, a researcher at China's Beihang University. The efficiency of electric motors also declines, as more energy is lost as heat.

To escape this vicious cycle, Qi and his colleagues developed a simple circuit that boosts the voltage generated by the solar panels to between 6,000 and 9,000 volts. They powered the 10-centimeter rotors with an electrostatic propulsion system, rather than the electromagnetic motors used in electric cars, quadcopters and a variety of robots.

The motor works by alternately attracting and repelling charged parts arranged in a ring, generating torque to spin a single rotor blade like a helicopter. The lightweight parts are made from ultra-thin carbon fiber covered with very delicate aluminum foil. The high voltage requirement is actually an advantage, as the current is reduced, resulting in very little heat loss.
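The current-versus-heat trade-off follows from basic circuit relations: for a fixed power P delivered at voltage V, the current is I = P / V, and resistive (Joule) heating scales as I²R. A toy comparison with made-up numbers, not CoulombFly's actual specifications:

```python
# Illustrative only: resistive heating for a fixed power draw.
# Voltages, power and resistance are hypothetical, not the drone's specs.

def joule_loss(power_w, voltage_v, resistance_ohm):
    """Heat dissipated in the conductors: P_loss = I^2 * R, with I = P / V."""
    current = power_w / voltage_v
    return current ** 2 * resistance_ohm

# Same 1 W of drive power through the same 100-ohm resistance:
low_v  = joule_loss(1.0, 50.0, 100.0)    # conventional low-voltage motor
high_v = joule_loss(1.0, 9000.0, 100.0)  # electrostatic drive at 9 kV

print(f"Loss at 50 V: {low_v:.4f} W")
print(f"Loss at 9 kV: {high_v:.2e} W")
print(f"Reduction factor: {low_v / high_v:.0f}x")
```

Since loss falls with the square of the voltage ratio, running at kilovolt levels cuts resistive heating by orders of magnitude for the same delivered power.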

“The motor generates very little heat because the operating current is very low for the same power output. The motor's high efficiency and low power consumption allow the vehicle to be powered by very small solar panels,” Qi said. “For the first time, we have successfully flown a micro air vehicle using natural light; previously, this was only achievable with very large ultralight aircraft.”

The machine, which the researchers call the “CoulombFly,” weighs just 4.21 grams and flew for an hour before a component failed. Qi says such failures can be designed out, and future versions could fly essentially indefinitely, using solar panels during the day and harvesting power from radio signals such as 4G or Wi-Fi at night.

CoulombFly has a payload capacity of 1.59 grams, allowing it to carry small sensors, computers, and cameras, but with improved designs, the researchers believe this can be increased to 4 grams, and the fixed-wing version could carry up to 30 grams. An even smaller version of CoulombFly, with rotors less than 1 centimeter in diameter, is also in development.


Source: www.newscientist.com

The Importance of Thermal Storage in the Expansion of Renewable Energy

Artur Widak/NurPhoto/Shutterstock

It's now well established that to mitigate the worst effects of climate change, we need to get to net zero carbon emissions as quickly as possible, which means getting more of our energy from renewable sources and finding ways to store energy for long periods of time to overcome the intermittency of wind and solar.

Giant battery farms and green hydrogen (using surplus renewable energy to split water) are often touted as the most promising storage solutions, and clever new ways to store excess electricity are emerging all the time (see “Giant CO2-filled domes could store surplus renewable power”), but the potential to store renewable energy as heat is often overlooked.

When we think of renewable energy, we tend to think of electricity. But heat is also a valuable commodity in its own right. About half of the world's total energy demand is for heat, whether it's to heat our homes or to power industrial production of food, medicines and materials. What's more, stored heat can be used to generate electricity when the sun stops shining and the wind dies down.

The good news is that, as we outlined in our feature “How Incredibly Simple Technologies Can Accelerate the Race to Net Zero”, a range of thermal storage technologies are emerging. Collectively known as thermal energy storage (TES), many of these innovations are incredibly simple, from baked bricks to molten salt. Crucially, they're affordable: early estimates suggest that these technologies could be as little as one-fifth the cost per kilowatt-hour of energy storage using green hydrogen. In a recent report, the International Renewable Energy Agency said TES offers “unique advantages”.

The problem is that awareness of TES is relatively low, and investment lower still. Private backers are starting to pour big dollars into pilot projects in the US and Europe. But for TES to live up to its promise as a relatively easy way to make a big impact on the problem of renewable intermittency, governments will need to step up. And if the price is as reasonable as it appears, there's no reason not to.


Source: www.newscientist.com

Auramaxxing: Enhancing Sexual Appeal or Draining Energy? | Psychological Perspectives

Name: Auramaxxing.

Age: The word “aura” comes from Latin and Ancient Greek and originally meant a gentle breeze. Today it’s more commonly used to describe the subtle, pervasive quality that emanates from someone, which is exactly what we’re talking about here.

And auramaxxing? It’s new. It’s similar to looksmaxxing, but…

Hold on, what is looksmaxxing? Practices such as working out or “mewing” that are supposed to maximise your physical attractiveness…

Mewing? In practice, it means pushing the tongue against the roof of the mouth, supposedly to improve the jawline and facial structure.

What if I get lockjaw? A shame, but worth it. Anyway, auramaxxing is the same kind of idea, but aimed at improving your energy and overall presence.

And where is this auramaxxing happening? Mainly TikTok, though other platforms are available.

What should I do? You could learn from 18-year-old Canadian content creator Frankie Mekhi, who shares his aura upgrades with 250,000 followers.

Make that 250,001. Frankie’s number one rule: “Don’t try to emulate someone else’s aura. It has to come from within; it has to be authentic.”

[Takes notes: writes “within” and “authentic”] Rule two: no yapping.

No yapping? Don’t talk too much!

[Zips mouth closed] Next, you need to find your purpose.

[Trying to talk with mouth closed] Is that Purpose with a capital P? That’s correct. Also, people with auras have achieved great things in some way.

Hmm, that might be difficult. Then your aura score may have just dropped. Aura scores are a real thing on TikTok, where users give and take away aura points from one another.

How does scoring work? Well, doing something impressive, like having friends with strong auras, might earn you points; other things might cost you points…

Yapping? Possibly, although Susanna Merrick says there is no such thing as a premium aura level.

Who is Susanna Merrick? A New York-based aura stylist. “People don’t need to know who they are,” she told The Cut. “They need to discover who they are.”

I would like to know: is auramaxxing mainly for men? Mekhi says his audience is primarily young men, but The Cut reports that young women are also joining the auramaxxing conversation, in a different way.

What’s the difference? Instead of trying to exude presence, they ask how many aura points they might have lost over how they acted during a difficult experience, such as bullying or heartbreak.

Do say: “Or you can just be yourself and not worry about how much of a person you are or how other people perceive you.”

Don’t say: “You either get it or you don’t. And if you try too hard to get it, you definitely won’t get it, brother.”

Source: www.theguardian.com

Testing Millions of UK homes for Energy Leaks in Effort to Achieve Net Zero Goal

Vehicles equipped with technology to collect data on building conditions

Madeleine Cuff

British city dwellers may have spotted a strange-looking vehicle driving around their neighborhood earlier this year. It looked just like a Google Street View vehicle, with a camera setup sticking out of the back to scan its surroundings. And like the Google car, it scanned city streets and took photos.

But these modified Teslas do more than just take pictures: they’re equipped with cutting-edge sensors and scanners that can report back the exact dimensions, heat loss, materials, age and state of disrepair of every building they drive past.

The car, equipped with what’s called the Built Environment Scanning System (BESS), has been on a mission to find out just how leaky and dilapidated Britain’s buildings really are. Between March and May, the car scanned thousands of roads and millions of buildings across London, Liverpool, Cardiff, Glasgow, Manchester, Leeds and South Yorkshire.

Data from BESS vehicles will be combined with thermal images taken by drones and planes in a £4 million government-funded project to build a huge digital database detailing the condition of buildings across the U.K. The aim is to help housing associations, local authorities and other property owners quickly plan renovation projects for hundreds of properties at once, says Ahsan Khan of xRI, the British nonprofit behind the project.

Decarbonising UK buildings is one of the toughest challenges on the journey to net-zero emissions. The UK’s 30 million buildings account for around a third of the country’s total greenhouse gas emissions, with most of the pollution coming from the use of gas for heating and hot water.

Another problem is that many of the UK’s homes are old and drafty. Retrofitting these homes to make them more energy efficient is crucial, but knowing where to start is a huge challenge, as the age and condition of the buildings varies greatly. “We’re held back as a nation because we don’t really know what we have, where it is in terms of the built environment, and what we can do about it,” says Khan.

Currently, the only means of judging a building’s sustainability is the Energy Performance Certificate (EPC), a mandatory document that rates every building on a scale of A to G and gives owners advice on how to improve the rating. But EPCs, which rely on the judgement of in-person assessors, are “expensive, time-consuming and inaccurate”, says Mike Pitts of Innovate UK, the government body part-funding the project; other funding comes from the UK Space Agency and the Welsh Government.

For organisations such as housing associations and local authorities who want to renovate hundreds of properties at once, EPCs are of little use – instead they often have to send their own assessors to the properties and plan the works schedule, which is a costly and time-consuming undertaking.

Speeding up renovations

The new database is expected to digitise much of this process. If it works as planned, it will use machine learning to tell councils, for example, how many properties already have double glazing installed, or which homes need top-up cavity-wall insulation. In an instant, it will be able to pinpoint exactly which homes have the space and sunlight to install rooftop solar panels. Crucially, it will calculate projected savings on energy bills and provide return-on-investment information, helping organisations access green finance.

“The xRI project represents a major advance in our understanding of our existing stock,” says Mat Colmer of Innovate UK. “The validated data set will improve and automate the refurbishment process, speeding up the entire refurbishment process.”

About 7.5% of homes in England, Scotland, and Wales have already been scanned, and Khan says the framework is in place to build a beta version of the database, due to be released later this year. For now, xRI is focused on decarbonizing buildings, but the BESS vehicles are collecting data on everything they see, from tree cover to potholes, that could be put to use in the future. “The amount of data is just staggering,” Pitts says.

David Grew, a researcher at Leeds Beckett University in the UK, calls the project “exciting,” but warns that an in-home inspection is essential before any renovation work begins. “Homes have been tampered with many times, so outwardly similar homes could be completely different,” he says. “This quick and agile method is great for accelerating progress and momentum, but it can’t and shouldn’t replace a really high-quality inspection before construction begins.”

Kate Simpson, a researcher at Nottingham Trent University in the UK, says neighbourhood data collected by BESS vehicles could help plan local power grid upgrades and climate resilience projects. But the data needs to be collected carefully, she says. “What’s the minimum amount of data we need to make the right decisions?” she says. “That way we can minimise the environmental impact of storing that data.”


Source: www.newscientist.com

California is facing an unexpected energy challenge due to excessive solar power use

Solar panels have become a common sight in suburban neighborhoods in California. However, the state’s ambitious clean energy vision has led to a unique challenge – sometimes producing more solar energy than it can use effectively, resulting in wastage of clean energy.

This excess of solar energy produces a phenomenon known as the “duck curve”: a plot of net demand (total demand minus solar generation) that sags during the sunny midday hours and then ramps steeply in the evening. The issue is most pronounced on sunny spring days, when demand for electricity is low.

The surplus energy is often exported to other parts of the western US via California’s grid interconnections, but in some cases it must be curtailed. Data from the California Independent System Operator show that the state has curtailed a significant amount of renewable energy this year, primarily solar power.
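The mechanics of oversupply and curtailment can be sketched with a toy net-load calculation; the hourly figures below are invented for illustration, not CAISO data:

```python
# Toy "duck curve" sketch with made-up hourly figures (GW): whenever
# solar output exceeds demand, the surplus must be exported or curtailed.
demand = [22, 21, 20, 24, 26, 28]  # hypothetical load, morning to afternoon
solar  = [ 2,  8, 15, 26, 30, 18]  # hypothetical solar output, same hours

net_load = [d - s for d, s in zip(demand, solar)]      # negative = oversupply
surplus  = [max(s - d, 0) for d, s in zip(demand, solar)]

print("Net load (GW):", net_load)
print("Surplus to export or curtail (GW):", surplus)
```

The midday hours where net load goes negative are exactly the belly of the duck: generation that has nowhere to go unless it is exported, stored, or thrown away.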

To address this challenge, proposals have been made to increase electricity supply through additional transmission lines and more battery installations to store excess power. However, recent changes in financial incentives for homeowners installing solar power have negatively impacted the rooftop solar industry in California.

Despite the setbacks, Governor Gavin Newsom remains optimistic about California’s clean energy progress, pointing out the state’s significant solar power generation and increasing battery installations. Critics of the incentive changes argue that it could lead to higher energy costs for non-solar customers and hinder the state’s transition to renewable energy.

As California navigates these challenges on its path to achieving 100% clean energy by 2045, the state’s decisions are closely watched by other states considering similar transitions. The rooftop solar industry plays a crucial role in this transition, as highlighted by industry experts.

Source: www.nbcnews.com

Is the climate resilient enough to handle the escalating energy needs of the AI arms race?

The rise of artificial intelligence has propelled the stock prices of major tech companies to new heights, but this growth has come at the expense of the industry’s environmental efforts.

Google recently admitted that AI technology poses a challenge to its sustainability objectives. The company disclosed that its data centers, crucial for its AI infrastructure, have caused a 48% increase in greenhouse gas emissions since 2019. Google cited “significant uncertainties” in achieving its goal of net-zero emissions by 2030, particularly due to the complex and unpredictable environmental impacts of AI.

As the tech industry races ahead with AI advancements, the question arises: can technology mitigate the environmental impact of AI, or will the pursuit of cutting-edge innovation overshadow these concerns?


Why is AI a threat to tech companies’ environmental goals?

Data centers play a critical role in developing and operating AI models like Google’s Gemini and OpenAI’s GPT-4. These centers house complex computing equipment that require substantial electricity, leading to CO2 emissions both from energy sources and the manufacturing processes involved. According to the International Energy Agency, data centers are projected to double their electricity consumption by 2026, equivalent to Japan’s energy demand. Additionally, studies suggest that AI’s water consumption could reach significant levels by 2027, potentially straining resources equivalent to England’s annual consumption.


What do experts say about the environmental impact?

Government-sponsored reports in the UK have highlighted the importance of energy sources in determining the environmental cost of technology. Some experts caution that the reliance on fossil-fuel-powered energy sources for training AI models remains a significant challenge. While tech companies are increasing their use of renewable energy to meet sustainability goals, concerns persist that the lack of clean energy may push other users towards fossil fuels.

Alex de Vries, founder of Digiconomist, notes the dual challenge of rising energy consumption in AI and the struggle to secure sustainable energy sources.


Will there be enough renewable energy?

Global efforts to triple renewable energy resources by the end of the decade face challenges due to surging energy demands from AI data centers. The International Energy Agency warns that current plans may only double renewable energy capacity by 2030, potentially impacting climate goals.

Technology companies may need to invest heavily in new renewable energy projects to meet the escalating electricity needs driven by AI.


How quickly can new renewable energy projects be built?

While renewable energy projects like wind and solar farms can be developed relatively quickly, bureaucratic hurdles and grid connectivity issues can delay the process for years. The pace of building offshore wind and hydroelectric schemes faces similar challenges, posing concerns about whether renewable energy can keep up with the expansion of AI.

The reliance on existing low-carbon sources by tech companies may divert clean energy away from other users, potentially increasing fossil fuel consumption to meet growing demands.


Will AI’s power demands keep growing?

The escalating energy needs of AI systems could lead to higher energy costs, prompting cost-saving measures in the industry. However, the competitive landscape and the push for cutting-edge AI technologies may result in excessive electricity consumption despite rising costs.

The pursuit of state-of-the-art AI systems has fueled a “winner takes all” mentality among tech giants, compelling heavy investments in the development of advanced AI. The pressure to remain at the forefront of AI innovation, including the race towards achieving AGI, threatens to escalate energy consumption and costs.

Despite advancements in AI efficiency, the industry’s drive for innovation may offset potential energy savings, akin to the economic concept known as “Jevons’ Paradox.”


Won’t AI companies learn to use less electricity?

While AI breakthroughs continue to enhance efficiency, the industry’s relentless pursuit of cutting-edge models may counteract potential energy savings. The growth in AI capabilities does not necessarily translate to reduced energy consumption, leading to a paradox similar to historical instances of technological advancements increasing use rather than conserving resources.

Source: www.theguardian.com

Researchers Nearing Discovery of Elusive ‘Chameleon’ Particle Associated with Dark Energy

A team of physicists at the University of California, Berkeley has developed the most sophisticated instrument ever designed to search for dark energy, the mysterious force that is accelerating the expansion of the universe.

The experiment, whose results were published today in the journal Nature, targets a hypothetical particle known as the chameleon, which could hold the key to unlocking this mysterious cosmic force.

First identified in 1998, dark energy makes up about 70 percent of all matter and energy in the universe, and despite many theories, its true nature remains a mystery.


One leading hypothesis is that there is a fifth force that is distinct from the four fundamental forces known in nature (gravity, electromagnetism, and the strong and weak nuclear forces).

This force is thought to be mediated by particles known as chameleons, so called for their ability to hide in plain sight.

In an experiment at the University of California, Berkeley, professor Holger Müller and his colleagues used an advanced atom interferometer combined with an optical lattice.

If that sounds technical, it is. Essentially, this setup allows for precise gravity measurements by holding free-falling atoms in place for a set period of time.

Physicists at UC Berkeley trapped a small cluster of cesium atoms (the pink blob) in a vertical vacuum chamber, then split each atom into a quantum state in which half of the atom is close to a tungsten weight (the shiny cylinder) and the other half (the split sphere below the tungsten) is farther away. – Image credit: Cristian Panda/UC Berkeley

The longer we can keep the atoms there, the greater our chances of finding (or not finding) a trace of the chameleon.

“Atom interferometry is the technology and science that exploits the quantum properties of particles – their properties as both particles and waves. We split the waves so that the particles take two paths at the same time, and then we interfere with them at the end,” Muller said.

“The waves are either in phase and add, or out of phase and cancel each other out. The key is that whether they are in phase or out of phase depends very sensitively on the quantities you want to measure, such as acceleration, gravity, rotation, or fundamental constants.”

Whereas previous experiments could track free-falling atoms for only a few milliseconds at a time, the new device can hold them for much longer periods – from seconds to tens of seconds – a major improvement that boosts the precision of the measurements by a factor of five.
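The payoff from longer interrogation times can be seen in the textbook phase shift of a free-fall light-pulse atom interferometer (a standard relation, not a formula quoted from the paper itself):

```latex
% Gravity-induced phase difference between the two interferometer arms:
%   k_eff : effective wavevector of the beam-splitting light
%   g     : local gravitational acceleration
%   T     : interrogation time between pulses
\Delta\phi = k_{\mathrm{eff}}\, g\, T^{2}
% The quadratic dependence on T is why extending the hold time from
% milliseconds to tens of seconds dramatically improves sensitivity.
```

Because the phase grows as T squared, a thousand-fold longer hold time in principle yields a million-fold larger signal for the same apparatus.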

In a recent paper published in the journal Nature Physics, Müller and his colleagues extended the hold time to a whopping 70 seconds.

To reveal whether chameleon particles are indeed behind dark energy, scientists would need to find holes in the outcomes predicted by the accepted theory of gravity – something no one has managed to do since Isaac Newton formulated it more than 300 years ago.

Muller and his team found no deviations from Newtonian gravity in their recent tests, suggesting that if chameleons exist, their effects are quite subtle.

Still, the researchers are optimistic: The improved precision of their instruments means future experiments may provide the evidence needed to confirm or disprove the existence of chameleons and other hypothesized particles that contribute to dark energy.

About the Experts

Holger Müller filed his first patent at the age of 14. He wrote his undergraduate thesis under the supervision of Jürgen Mlynek at the University of Konstanz in Germany, then earned his doctorate at Humboldt University of Berlin under Achim Peters. After receiving a fellowship from the Alexander von Humboldt Foundation, he joined Steven Chu’s group at Stanford University as a postdoctoral researcher. In July 2008, he joined the physics department at the University of California, Berkeley, where he is now a professor of physics and principal investigator of his research group, the Müller Group.



Source: www.sciencefocus.com

Dark energy could be even more mysterious than previously believed

The choice to name a new project the Dark Energy Spectroscopic Instrument (DESI) may come across as presumptuous. Dark energy, you see, is completely unseen; it does not emit any detectable light for a spectrometer to analyze. In fact, dark energy has never been directly observed and has managed to evade capture despite efforts made using the most advanced telescopes and detectors available.

As far as we understand it, dark energy is invisible, uniformly spread throughout space, does not interact with matter or light, and serves the sole purpose of accelerating the universe’s expansion through a mechanism that remains a mystery to us.

So, with the recent announcement of DESI’s initial data release, are we witnessing a shift in our comprehension of dark energy, as promised?

In the search for elusive dark energy, our observations offer limited insights: dark energy merely stretches space-time. To investigate different theories about dark energy, we must examine how this stretching occurred over cosmic time.

One method is to observe the universe’s expansion history, while another involves examining how matter accumulated within galaxies and clusters at various junctures in the universe’s past.

Efforts to measure the expansion rate often involve constructing a precise 3D map of the universe’s matter. By studying the spectra of light, we can determine how much it has stretched due to the universe’s expansion. By combining this information with accurate physical distances, we gain valuable insights into the universe’s evolution.

DESI’s new model has stirred speculation by proposing that dark energy may have a more intricate history than previously believed. If these indications prove to be accurate, they could revolutionize our understanding of not just the universe’s past, but also its eventual fate.

The concordance model of cosmology outlines the prevailing picture of the universe and its components. In this model, dark energy is viewed as a cosmological constant: a fixed energy density inherent to every part of space.

DESI and other surveys commonly report their dark energy findings in terms of an “equation of state” parameter denoted as w. A value of w = -1 is expected if dark energy behaves as a cosmological constant. Any deviation from this value implies a different characteristic for dark energy.
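To see why w = -1 is special, here is a minimal sketch based on the standard textbook scaling of a component's density with the cosmic scale factor; this is generic cosmology, not DESI's analysis code:

```python
# Sketch (standard cosmology): a component with equation-of-state
# parameter w has density
#   rho(a) = rho_0 * a**(-3 * (1 + w))
# where a is the cosmic scale factor (a = 1 today).

def density(a: float, w: float, rho0: float = 1.0) -> float:
    """Density of a cosmological component at scale factor a."""
    return rho0 * a ** (-3 * (1 + w))

# A cosmological constant (w = -1) keeps the same density at all times...
print(density(0.5, -1.0))  # 1.0
print(density(2.0, -1.0))  # 1.0

# ...while, say, w = -0.9 means dark energy was denser in the past (a < 1)
# and dilutes as the universe expands (a > 1).
print(density(0.5, -0.9))  # > 1
print(density(2.0, -0.9))  # < 1
```

Any measured drift of w away from -1 therefore implies a dark energy whose influence changes over cosmic time.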

The recent DESI findings present a puzzling scenario: while a constant w of -1 aligns well with the results, a scenario where w is variable suggests a different interpretation. When combined with data from other sources, these results hint at a changing w, implying a varying impact of dark energy on the universe over time.

While the implications of these findings remain uncertain, they raise intriguing possibilities about the future course of the universe and the role of dark energy within it. Though still preliminary, these results suggest that dark energy may continue to surprise us in unforeseen ways in the future.

Source: www.sciencefocus.com


Ceti AI acquires Big Energy Investments Inc. to enhance its high-performance computing capabilities in North America

Vancouver, Canada, April 18, 2024, Chainwire

ceτi AI, a leader in decentralized artificial intelligence infrastructure, is pleased to announce the acquisition of Big Energy Investments Inc., a Canadian company specializing in strategic investments in high-performance computing infrastructure. The acquisition is an important step in ceτi AI's strategy to advance the development and accessibility of AI technology.

Strategic acquisitions and enhancements

Through the acquisition, ceτi AI has reached an agreement in principle to take over Big Energy Investments' advanced high-performance computing (HPC) infrastructure, which includes five HPC servers equipped with eight NVIDIA H100 Tensor Core GPUs and two NVIDIA Quantum-2 InfiniBand switches. The agreements are expected to be signed within the next week and underline the company's commitment to rapidly expanding its technological capabilities.

This strategic enhancement is critical to the initial deployment of the ceτi AI Infrastructure Network in North America. The network leverages the ceτi AI Intelligent Computing Fabric to support decentralized AI networks and decentralized physical infrastructure networks (DePIN), and to manage and provide computing resources for a variety of other applications.

Strategic development and pilot implementation

The new HPC infrastructure will support the first North American deployment of the ceτi AI Intelligent Computing Fabric, which manages the ceτi AI Infrastructure Network. The network is designed to provide essential computing resources to a variety of decentralized client networks and is a key component of ceτi AI's broader mission to democratize AI technology through decentralization. The pilot implementation will not only demonstrate the capabilities of the ceτi AI solution, but will also begin revenue generation and accumulation for the CETI token ecosystem.

Roadmap and future plans

Successful integration and demonstration of this infrastructure will set the stage for immediate expansion to data center-scale implementations, significantly scaling up ceτi AI's operational capabilities. The development of the CETI token ecosystem continues and its introduction is the next major milestone in the ceτi AI roadmap.

Executive insights

“This acquisition is an important milestone in ceτi AI’s growth trajectory and is consistent with our strategic objectives to strengthen our infrastructure and accelerate the development of decentralized AI technology. By combining Big Energy Investments’ resources and capabilities with our own, we will be able to innovate and expand our reach across North America,” said Dennis Jarvis, CEO of ceτi AI.

Forward-looking statements

This press release contains forward-looking statements regarding expected future events and anticipated results that are subject to significant risks and uncertainties. These include, but are not limited to, final procurement and integration of the HPC infrastructure, deployment and performance of the ceτi AI Intelligent Computing Fabric, and broader adoption and impact of the CETI token ecosystem. Actual results may differ materially from those expressed or anticipated in such forward-looking statements due to a variety of factors.

About ceτi AI

ceτi AI is at the forefront of revolutionizing artificial intelligence through decentralization. It is committed to innovation and accessibility, developing a globally distributed, high-performance, scalable AI infrastructure designed to empower developers and networks around the world. ceτi AI aims to accelerate the advancement of AI technology by democratizing access to cutting-edge resources, making it more diverse and accessible to everyone. Our mission is not limited to infrastructure development: we are building the foundation for the future of AI, allowing it to grow in ways that benefit all of humanity without sacrificing freedom of choice and expression.

Users can learn more about our mission, technology, and the future we're building, along with the latest updates and community discussions, by visiting:

Light paper | Website | X | Telegram | Discord

Contact

ceτi AI
press@taoceti.ai

Source: www.the-blockchain.com

Biden announces $7 billion in federal funding for solar energy projects in celebration of Earth Day

WASHINGTON — President Joe Biden marked Earth Day by announcing $7 billion in federal grants for residential solar power projects serving more than 900,000 households in low- and moderate-income areas. He also plans to expand the American Climate Corps, a New Deal-style green jobs training program.

The grants were awarded by the Environmental Protection Agency, with 60 recipients announced on Monday. Government officials expect the projects to reduce emissions by the equivalent of 30 million tons of carbon dioxide and save households $350 million a year.

Biden’s climate announcement is aimed at energizing young voters in his re-election bid. Young people played a key role in defeating then-President Donald Trump in 2020. They have shown interest in Biden’s climate policy and are eager to contribute through programs like the Climate Change Corps.

Solar energy is gaining popularity as a renewable energy source that can reduce dependence on fossil fuels and improve the power grid’s reliability. However, the initial installation cost of solar energy remains a barrier for many Americans.

The grants include 49 state-level grants, six grants for Native American tribes, and five multi-state grants. They can be used for investments in rooftop solar power generation and community solar gardens.

Biden made the announcement at Prince William Forest Park in northern Virginia, about 30 miles southwest of Washington. The park was established in 1936 by President Franklin D. Roosevelt as part of his Civilian Conservation Corps during the Great Depression.

Biden’s American Climate Corps, modeled after President Roosevelt’s New Deal, offers about 2,000 positions in 36 states, including partnerships with the Building Trades Union of North America.

The grants are part of the Solar for All program, funded by a $27 billion “green bank” established under the broader climate law. The program aims to combat climate change and air pollution, and to support the disadvantaged communities most affected by them.

EPA Deputy Administrator Janet McCabe expressed excitement about the funds benefiting communities, providing skills, creating jobs, and helping households save on utility bills.

Among the businesses receiving grants are nonprofit projects in West Virginia, solar leasing programs in Mississippi, and solar worker training programs in South Carolina.

Concerns remain about Republican opposition to taxpayer-funded green banks and accountability for how the funds are used. The EPA previously allocated the remaining $20 billion in bank funds to support clean energy projects in various organizations and communities.

Source: www.nbcnews.com

More Energy Recovery Potential in Wind Turbines Modeled After Condor Wings

The curved tip of a wind turbine blade, or winglet, based on the shape of a condor wing

Kashayar Ranamai Vahanbali

A design change inspired by the wings of the Andean condor could increase the energy produced by wind turbines.

Different types of birds have upturned tips at the ends of their wings, which help maximize lift. Similar features, known as winglets, are commonly used on aircraft wings, but have not been tested on the giant turbine blades used to generate electricity.

Kashayar Ranamai Vahanbali at the University of Alberta in Canada says collecting experimental data on wind turbines with winglets is extremely difficult due to their size.

His team designed a winglet based on the Andean condor, the heaviest flying bird in the world. The Andean condor can travel vast distances, despite weighing up to 15 kg.

Computer simulations of airflow over the turbine showed that these winglets reduced drag and increased efficiency by an average of 10%.

“Another perspective is that the winglets allow the turbine to capture more wind energy with minimal [resistance] losses,” says Ranamai Vahanbali.

Winglets can be retrofitted after a turbine is manufactured, he said, by slipping “sock-like” pieces onto the ends of the blades. Researchers are developing an experimental setup to test models of wind turbine winglets.

Peter Majewski, who recently retired from the University of South Australia, says the results make sense from an engineering and aerodynamics perspective, but that retrofitting existing wind turbines might not be realistic given the downtime and cost involved.

But for new turbine blades, adding winglets during the manufacturing process can lead to significant performance improvements, he says.

topic:

  • Aerodynamics
  • Renewable energy

Source: www.newscientist.com


Signs of Potentially Weakening Dark Energy

Slice of the universe's largest 3D map showing the fundamental structure of matter

Claire Lamman/DESI collaboration; custom colormap package cmastro

The largest 3D map of the universe ever created offers hints about the evolution of the cosmos and suggests we may be wrong about the behaviour of dark energy, which makes up most of the universe. This mysterious force may weaken over time.

“If it can be maintained, this is a very big deal,” says Adam Riess at Johns Hopkins University in Maryland, who discovered the first evidence of dark energy 25 years ago. That's because the standard model of cosmology, called lambda-CDM, suggests that the strength of dark energy should not change over time.

Dark energy is thought to cause the accelerated expansion of the universe. If it is not static, that could also have major implications for our ideas about the universe's beginning, its size and its ultimate fate. Riess, who was not involved in the new work, says the result would change our understanding of gravity and fields.

This strange finding comes from the Dark Energy Spectroscopic Instrument (DESI) in Arizona, and even DESI's collaborators aren't sure what to make of data suggesting that dark energy may have weakened in recent times. “Whether or not this is interesting, this is all we have been talking about in this collaboration for months,” says DESI spokesperson Kyle Dawson at the University of Utah.

DESI researchers investigated the strength of dark energy by measuring the large-scale structure and distribution of galaxies in the universe, revealing how the universe has expanded over time. The researchers then combined this information with three sets of data about supernovae. Supernovae act as so-called “standard candles” that determine the distance to cosmic objects thanks to their predictable brightness.

Surprisingly, each of the three supernova samples gave a different answer about how the universe's rate of expansion has changed over time. All three suggest that the influence of dark energy may have declined in recent times, but the strength of these suggestions varies, so the researchers are unsure how to interpret the data.

“Two of the supernova samples don't match each other, but they are very similar,” Dawson said. “We don't know which one is correct. The truth may lie somewhere in between, but the real difference seems to be in the way [the supernova researchers] evaluated the data.”

Model discrepancy is indicated by a figure called sigma, which measures how likely it is that a similar mismatch would occur by chance. “About 3 sigma is the level at which we typically sit up and pay attention and call it a ‘sign’ of something,” Riess says. Values lower than that are usually not of particular interest to researchers, as they are too likely to be a simple coincidence.
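For readers who want to translate sigma levels into odds, here is a small sketch using the standard two-sided Gaussian tail probability (an illustration of the statistics, not the collaboration's own calculation):

```python
# Sketch: convert a sigma level into the two-sided probability that a
# Gaussian fluctuation at least that large occurs purely by chance.
from math import erfc, sqrt

def chance_probability(sigma: float) -> float:
    """Two-sided Gaussian tail probability for a given sigma level."""
    return erfc(sigma / sqrt(2))

print(f"2.5 sigma: {chance_probability(2.5):.4f}")   # roughly 1 in 80
print(f"3.0 sigma: {chance_probability(3.0):.4f}")   # roughly 1 in 370
print(f"3.9 sigma: {chance_probability(3.9):.6f}")   # roughly 1 in 10,000
```

This is why 3 sigma attracts attention while lower values are often dismissed as plausible coincidences.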

The discrepancies between lambda-CDM and the combined supernova and DESI measurements ranged from 2.5 sigma to 3.9 sigma. “Both statements are true: there's enough tension that it's interesting, and there's not enough tension to say that something is definitely there,” says Dawson.

Dark energy makes up nearly 70 per cent of the universe, so errors in our understanding of its properties could have far-reaching implications for physics. However, more precise measurements will be needed in the coming years to establish whether the discrepancy is real.

“If [this is] right, this is the first real clue we've had about the nature of dark energy in 25 years,” says Riess.


Source: www.newscientist.com

Gigapixel Image of Vela Supernova Remnant Captured by Dark Energy Camera

Astronomers using the powerful Dark Energy Camera (DECam) on the Victor M. Blanco 4-meter Telescope at Cerro Tololo Inter-American Observatory, a program of NSF's NOIRLab, have captured a huge 1.3-gigapixel image of the Vela supernova remnant, the remains of a giant star that exploded in the constellation Vela about 11,000 years ago.

This DECam image shows the Vela supernova remnant, the remnant of a supernova explosion 800 light-years away in the southern constellation of Vela. Image credit: CTIO / NOIRLab / DOE / NSF / AURA / T.A. Rector (University of Alaska Anchorage / NSF's NOIRLab) / M. Zamani and D. de Martin (NSF's NOIRLab).

The Vela supernova remnant, Vela SNR for short, is one of the most well-studied supernova remnants in the sky and one of the closest to Earth.

Its progenitor star exploded 11,000 to 12,300 years ago in the southern constellation of Vela.

The association of this supernova remnant with the Vela pulsar, made by Australian astronomers in 1968, provided direct observational evidence that supernovae form neutron stars.

“When this star exploded 11,000 years ago, its outer layer was violently stripped away and splattered around, creating a shock wave that can still be seen today,” the astronomers said in a statement.

“As the shock wave spreads into the surrounding region, hot, energetic gas flies away from the point of explosion, becoming compressed and interacting with the interstellar medium to produce the blue and yellow thread-like filaments seen in the image.”

“The Vela SNR is a gigantic structure, spanning almost 100 light-years and appearing 20 times the diameter of the full moon in the night sky.”

“Although the star's final moments were dramatic, it did not completely disappear.”

“After the outer layers were shed, the star's core collapsed into a neutron star, an ultra-dense ball in which protons and electrons combined to form neutrons.”

“The neutron star, named the Vela pulsar, is now a supercondensed object containing the mass of a Sun-like star within a sphere just a few kilometers in diameter.”

“The Vela pulsar, located in the lower-left region of this image, appears as a relatively faint star, indistinguishable from the thousands of stars around it.”

The new image of the Vela SNR is the largest DECam image ever published, containing an astonishing 1.3 gigapixels.

“The striking reds, yellows, and blues in this image were achieved by using three DECam filters, each collecting a specific color of light,” the researchers said.

“Separate images were taken with each filter and stacked on top of each other to produce this high-resolution color image showing the intricate web-like filaments snaking throughout the expanding gas cloud.”

Source: www.sci.news

AI’s insatiable appetite for data is only rivaled by its relentless demand for water and energy.

One of the most harmful myths about digital technology is that it is somehow weightless or immaterial. Remember the early talk of “paperless” offices and “frictionless” transactions? And, of course, the electricity our personal electronic devices consume is insignificant compared with a washing machine or dishwasher.

But even if you believe this comforting story, it might not survive an encounter with Kate Crawford's seminal book Atlas of AI, or the impressive Anatomy of an AI System graphic she created with Vladan Joler. And it definitely won't survive a visit to a data centre: one giant metal shed can house tens or even hundreds of thousands of servers, consuming large amounts of electricity and requiring large amounts of water for cooling.

On the energy side, consider Ireland, a small country with a huge number of data centres. According to a report by its Central Statistics Office, in 2022 these sheds consumed more electricity (18% of the total) than all the rural homes in the country, and as much as all of Ireland's urban dwellings. As for water consumption, a 2021 Imperial College London study estimated that one medium-sized data centre used the same amount of water as three average-sized hospitals. All of which serves as a useful reminder that while these industrial warehouses underpin the metaphor of “cloud computing”, there is nothing foggy or fluffy about them. And if you're tempted to see one for yourself, forget it: getting into Fort Knox would be easier.

There are currently between 9,000 and 11,000 such data centres around the world. Many of them are old-style server farms with thousands of cheap PCs that store all the data our smartphone-driven world generates in such casual abundance, including photos, documents, videos and recordings. They are starting to look a little outdated.

What I was reading

Shabby philanthropist
Read Deborah Doane's sharp review for Alliance magazine of Tim Schwab's critical book, The Bill Gates Problem.

Final write
Veteran commentator Jeff Jarvis thinks about giving up “on old journalism and its legacy industry” in a BuzzMachine blog post.

Slim pickings
In his blog No Mercy/No Malice, Scott Galloway suggests that AI and weight-loss drugs have a lot in common.

Source: www.theguardian.com

US judge stops government from monitoring energy use of cryptocurrency mining

The U.S. government has halted an investigation into a cryptocurrency mining operation over its rising energy use following a lawsuit from an industry accused by environmental groups of fueling the climate crisis.

A federal judge in Texas granted an interim order blocking new requirements to verify cryptocurrency miners’ energy use, stating that the industry would suffer “irreparable harm” if forced to comply.

The U.S. Department of Energy launched an “emergency” initiative last month to examine the energy usage of mining operations, which use computational power to mine currencies like Bitcoin.

The growth of cryptocurrencies and mining activities has led to a surge in electricity usage, with data centers popping up and even reviving coal-fired power plants for mining operations.

The federal government requires more information on big miners’ electricity use, as mining facilities provided a significant portion of total U.S. electricity demand last year. Globally, cryptocurrency mining is responsible for a notable portion of energy consumption.

Campaigners warn that the increased electricity consumption from cryptocurrency mining exacerbates the climate crisis, with mining operations releasing significant amounts of carbon dioxide each year.

Cryptocurrency mining is straining power grids, with instances of Bitcoin companies receiving energy credits to reduce power usage during peak demand periods.

The industry has managed to avoid an investigation it deems burdensome, citing political motives from the government. The debate continues on the regulation of cryptocurrency mining in the U.S.

The Blockchain Council of Texas and other groups argue that the government’s actions are aimed at limiting or eliminating Bitcoin mining in the U.S., causing concerns for the industry and its employees.

Source: www.theguardian.com

The Surprising Reason Why Mental Exertion Can Drain Our Energy

The myth that we only use 10 per cent of our brains has been thoroughly debunked. Perhaps the idea persists because it is so tempting to believe you could become a genius simply by learning to tap into the dormant 90 per cent. In reality, no part of your brain lies idle: the whole organ is always switched on, even when you're asleep or not thinking about anything at all.

But that doesn't necessarily mean that your brain uses the same amount of energy while daydreaming as it does when you're concentrating. We've all experienced the feeling of being mentally exhausted after concentrating on a difficult problem. It certainly feels like a lot of work to think about it in detail, but is it really? The answer is more nuanced than you might think.

It is true that the brain is a hungry organ. “It's the most energy-intensive part of your body,” says Nilli Lavie at University College London. It makes up about 2% of your body weight, but consumes about 20% of your energy at rest.
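As a back-of-the-envelope check on those figures (assuming, purely for illustration, a resting metabolic rate of about 2,000 kilocalories per day, a number not taken from the article):

```python
# Back-of-the-envelope: the brain's share of resting energy use, in watts.
# Assumes a resting metabolic rate of ~2,000 kcal/day (illustrative only).
KCAL_TO_JOULES = 4184
SECONDS_PER_DAY = 24 * 60 * 60

resting_kcal_per_day = 2000
brain_share = 0.20  # the brain's ~20% share of resting energy use

body_watts = resting_kcal_per_day * KCAL_TO_JOULES / SECONDS_PER_DAY
brain_watts = body_watts * brain_share
print(f"whole body: ~{body_watts:.0f} W, brain: ~{brain_watts:.0f} W")
```

On those assumptions the brain draws roughly 20 watts, about as much as a dim light bulb.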

Most of this energy is used to maintain an imbalance of electrical charge across neurons' membranes, a state that must be restored after a neuron fires a signal. “That requires a lot of fuel,” says Ewan McNay at the University at Albany in New York.

Interestingly, when it comes to energy use, the brain doesn't differentiate between tasks we traditionally think of as “difficult” and tasks that come more naturally. This was the first…

Source: www.newscientist.com

JET fusion reactor in the UK achieves record-breaking energy output

Inside the JET fusion reactor

eurofusion

A 40-year-old nuclear fusion reactor in the UK has set a world record for energy output in its final run before permanent shutdown, scientists have announced.

The Joint European Torus (JET) in Oxfordshire began operations in 1983. During its operation, it briefly became the hottest point in the solar system, reaching 150 million degrees Celsius.

The reactor's previous record was in 2021 for a reaction that lasted five seconds and produced 59 megajoules of thermal energy. However, it surpassed this in its final test in late 2023, using just 0.2 milligrams of fuel to sustain the reaction for 5.2 seconds, reaching an output of 69 megajoules.

This corresponds to an output of 12.5 megawatts, enough to power 12,000 homes, Mikhail Maslov of the UK Atomic Energy Authority said at a press conference on February 8.

Today's nuclear power plants rely on nuclear fission reactions, in which atoms are shattered to release energy and small particles. Fusion works in reverse, pushing smaller particles together into larger atoms.

Nuclear fusion can produce more energy without any of the radioactive waste produced by nuclear fission, but there is still no practical way to use the process in power plants.

JET forces atoms of two stable isotopes of hydrogen, deuterium and tritium, together in a plasma to create helium, releasing a huge amount of energy in the process. This is the same reaction that powers our sun. JET is a type of fusion reactor known as a tokamak, which uses rings of electromagnets to confine the plasma in a doughnut shape.

Scientists conducted the final experiment using deuterium and tritium fuel on JET in October last year, and other experiments continued until December. However, the machine is now permanently closed and will be decommissioned over the next 16 years.

Juan Matthews at the University of Manchester in the UK says many secrets will be revealed during JET's dismantling: for example, how the reactor lining deteriorated from contact with the plasma, and where in the machine the precious tritium, worth around £30,000 a gram, is embedded so that it can be recovered. This will be important information for future research and commercial reactors.

“It's great to go out with a bit of a bang,” Matthews said. “It has a noble history. Now that it has served its purpose, we plan to squeeze out more information during the decommissioning period as well. So it's not sad. It's something to be celebrated.”

France's larger, more modern replacement for JET, the International Thermonuclear Experimental Reactor (ITER), is nearing completion, with first experiments scheduled to begin in 2025.

ITER construction project deputy director Tim Luce told a news conference that ITER plans to expand its energy output to 500 megawatts and possibly 700 megawatts.

“These are what I normally call power plant sizes,” he said. “They are at the lowest level of cost required for a power generation facility. Moreover, to obtain high fusion power and gain, the timescale needs to be extended to at least 300 seconds, but from an energy production point of view it is probably less than an hour. So what JET has done is exactly a scale model of what we need to do with the ITER project.”

Another reactor using the same design, the Korea Superconducting Tokamak Advanced Research (KSTAR) device, recently succeeded in sustaining a reaction for 30 seconds at temperatures above 100 million degrees Celsius.

Other approaches to creating practical fusion reactors are also being pursued around the world, such as the National Ignition Facility at Lawrence Livermore National Laboratory in California. It fired a very powerful laser into the fuel capsule, a process called inertial confinement fusion, and was able to release almost twice the energy that was put into it.

topic:

Source: www.newscientist.com

UK’s JET fusion reactor achieves highest energy output in the world

A 40-year-old nuclear fusion reactor in the UK has set a world record for energy output in its final run before permanent shutdown, scientists have announced.

The Joint European Torus (JET) in Oxfordshire began operations in 1983. During its lifetime, it briefly became the hottest point in the solar system, reaching 150 million degrees Celsius.

The reactor's previous record was in 2021 for a reaction that lasted five seconds and produced 59 megajoules of thermal energy. However, it surpassed this in its final test in late 2023, using just 0.2 milligrams of fuel to sustain the reaction for 5.2 seconds, reaching an output of 69 megajoules.
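For a sense of scale, those record figures imply an extraordinary energy density for the fuel. A back-of-envelope sketch, where the comparison value of roughly 29 megajoules per kilogram for coal is an assumption and not from the article:

```python
# Energy density implied by JET's record shot: 69 MJ from 0.2 mg of fuel.
fuel_kg = 0.2e-6          # 0.2 milligrams
energy_j = 69e6           # 69 megajoules
j_per_kg = energy_j / fuel_kg          # ~3.5e14 J/kg

coal_j_per_kg = 29e6                   # assumed value: ~29 MJ/kg for coal
print(f"fuel energy density: {j_per_kg:.2e} J/kg")
print(f"roughly {j_per_kg / coal_j_per_kg:,.0f} times coal, per kilogram")
```

Even allowing for the assumed coal figure, the fusion fuel releases on the order of ten million times more energy per kilogram.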

Inside the JET fusion reactor

EUROfusion

This corresponds to an output of 12.5 megawatts, enough to power 12,000 homes, Mikhail Maslov of the UK Atomic Energy Authority said at a press conference on February 8.
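The quoted power follows directly from the record numbers; a quick check, where the homes estimate assumes an average household draw on the order of 1 kilowatt, which the article does not state:

```python
# Average power of the record shot: 69 MJ released over 5.2 seconds.
energy_j = 69e6
duration_s = 5.2
avg_power_w = energy_j / duration_s    # ~13 MW, close to the quoted 12.5 MW
print(f"average power: {avg_power_w / 1e6:.1f} MW")

# Homes powered, assuming ~1 kW average draw per home (an assumption).
print(f"homes: {avg_power_w / 1e3:,.0f}")
```

The small gap between ~13 megawatts and the quoted 12.5 is presumably rounding in the published duration.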

Today's nuclear power plants rely on nuclear fission reactions, in which atoms are split apart, releasing energy and smaller particles. Fusion works in reverse, squeezing smaller atoms together into larger ones.

Nuclear fusion can produce more energy without any of the radioactive waste produced by nuclear fission, but there is still no practical way to use the process in power plants.

JET fuses atoms of two isotopes of hydrogen, deuterium and tritium, together in a plasma to create helium, releasing a huge amount of energy in the process. This is the same reaction that powers our sun. JET is a type of fusion reactor known as a tokamak, which uses rings of electromagnets to confine the plasma in a doughnut shape.
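The energy released per deuterium-tritium fusion event can be recovered from the mass defect via E = mc². A sketch using standard tabulated atomic masses (values from published tables, not from the article):

```python
# Energy released per D-T fusion event, from the mass defect (E = mc^2).
# Atomic masses in unified atomic mass units (standard tabulated values).
m_D  = 2.014102   # deuterium
m_T  = 3.016049   # tritium
m_He = 4.002602   # helium-4
m_n  = 1.008665   # neutron
U_TO_MEV = 931.494  # energy equivalent of 1 atomic mass unit, in MeV

delta_m = (m_D + m_T) - (m_He + m_n)   # mass lost in the reaction
energy_mev = delta_m * U_TO_MEV
print(f"energy per reaction: {energy_mev:.1f} MeV")
```

About 0.4 per cent of the fuel's mass is converted to energy, which is why such tiny quantities of fuel yield megajoules.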

Scientists conducted the final experiment using deuterium and tritium fuel on JET in October last year, and other experiments continued until December. However, the machine is now permanently closed and will be decommissioned over the next 16 years.

Juan Matthews, a researcher at the University of Manchester, UK, says many secrets will be revealed during JET's dismantling: how the reactor lining deteriorated from contact with the plasma, for example, and where in the machine the precious tritium, worth around £30,000 a gram, is embedded and can be recovered. This will be important information for future research and commercial reactors.

“It's great to go out with a bit of a bang,” Matthews said. “It has a noble history. Now that it has served its purpose, we plan to squeeze out more information during the decommissioning period as well. So it's not sad. It's something to be celebrated.”

France's larger, more modern replacement for JET, the International Thermonuclear Experimental Reactor (ITER), is nearing completion, with first experiments scheduled to begin in 2025.

Tim Luce, deputy director of the ITER construction project, told a news conference that ITER plans to scale its energy output up to 500 megawatts, and possibly 700 megawatts.

“These are what I would normally call power plant sizes,” he said. “They are at the lower end of what is required for a power generation facility. Moreover, to obtain high fusion power and gain, the timescale needs to be extended to at least 300 seconds, though from an energy production point of view it is probably less than an hour. So what JET has done is exactly a scale model of what we need to do with the ITER project.”
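For context, ITER's 500-megawatt goal is usually quoted alongside its planned 50 megawatts of external plasma heating, giving a target gain of Q = 10. The 50-megawatt figure comes from ITER's design specifications, not from this article:

```python
# ITER's headline target expressed as a gain figure.
heating_mw = 50.0     # planned external plasma heating (design spec, assumed here)
fusion_mw = 500.0     # target fusion power output
q = fusion_mw / heating_mw
print(f"ITER target gain Q = {q:.0f}")  # prints Q = 10
```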

Another reactor using the same design, the Korea Superconducting Tokamak Advanced Research (KSTAR) device, recently succeeded in sustaining a reaction for 30 seconds at temperatures above 100 million degrees Celsius.

Other approaches to creating practical fusion reactors are also being pursued around the world, such as at the National Ignition Facility at Lawrence Livermore National Laboratory in California. It fires extremely powerful lasers at a fuel capsule, a process called inertial confinement fusion, and has released almost twice the energy that was put into it.

Source: www.newscientist.com

Nuclear fusion reactions produce nearly double the energy they consume

Nuclear fusion experiments at the US National Ignition Facility reach a significant milestone

Philip Saltonstall

Scientists have confirmed that a 2022 fusion reaction reached a historic milestone by releasing more energy than was put in, and say subsequent tests have yielded even better results. The findings, now published in a series of papers, offer encouragement that fusion reactors could one day produce clean, abundant energy.

Today's nuclear power plants rely on nuclear fission reactions, in which atoms are split apart, releasing energy and smaller particles. Fusion works in reverse, squeezing smaller atoms together into larger ones. The same process powers our sun.

Nuclear fusion can produce more energy without any of the radioactive waste that comes with nuclear fission, but despite decades of effort, researchers and engineers have yet to find a practical way to contain and control the process, let alone extract useful energy from it.

Experiments to do this using laser-irradiated capsules of deuterium and tritium fuel, a process called inertial confinement fusion (ICF), began in 2011 at California's Lawrence Livermore National Laboratory (LLNL). Initially, the energy released was only a fraction of the laser energy put in, but yields gradually increased until, on December 5, 2022, an experiment finally crossed the crucial break-even milestone: that reaction generated 1.5 times the laser energy needed to kickstart it.
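The widely reported numbers behind that milestone were about 2.05 megajoules of laser energy delivered to the target and 3.15 megajoules of fusion output, which gives the quoted ratio:

```python
# Target gain for the December 5, 2022 NIF shot (widely reported figures).
laser_in_mj = 2.05     # laser energy on target
fusion_out_mj = 3.15   # fusion energy released
q = fusion_out_mj / laser_in_mj
print(f"target gain: {q:.2f}")  # ~1.5, the reported break-even ratio
```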

One paper reports that the institute's National Ignition Facility (NIF) has seen even higher ratios in subsequent runs, peaking at 1.9 times the energy input on September 4, 2023.

Richard Town at LLNL says he believes the team's checks and double-checks since the 2022 result have proven it was “not a flash in the pan”, and that there is still room for improvement.

Town said yields are likely to improve with the hardware currently in place at NIF, but things could move further if the lasers can be upgraded, which would take years. “A sledgehammer always comes in handy,” he says. “If I could get a bigger hammer, I think I could aim for a gain of about 10.”

But Town points out that NIF was never built as a prototype reactor and is not optimized for high yields. Its main job is to provide critical research for the US nuclear weapons program.

Part of this research involves exposing bomb electronics and payloads to the neutron irradiation produced during an ICF reaction, to see whether they would function in the event of an all-out nuclear war. The risk of electronics failing was highlighted during a 2021 test, when a NIF shot knocked out all the lights throughout the site, plunging researchers into darkness. “These lights were not hardened, but you can imagine military components having to withstand much higher doses,” Town says.

This mission means that some of the project's research remains classified. Until the 1990s, even the concept of ICF was secret, Town says.

The announcement that ICF had reached break-even in 2022 raised hopes that fusion power is on the horizon, and news of further progress will strengthen them. However, there are some caveats.

First, the energy output is far below what is needed for a commercial reactor, producing barely enough to heat a bath. What's worse, the gain ratio is calculated using the energy delivered by the laser: to deliver its 2.1 megajoules, the laser briefly reaches a power of around 500 trillion watts, more than the output of the entire US national power grid. So these experiments achieve break-even only in a very narrow sense.
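To make the “narrow sense” concrete: the gain counts only laser energy delivered to the target, not the electricity the laser system draws from the grid, which is commonly cited as roughly 300 megajoules per shot. That 300-megajoule figure is an assumption here, not from this article:

```python
# "Scientific" target gain vs wall-plug gain for a NIF shot.
fusion_out_mj = 3.15   # December 2022 shot output
laser_in_mj = 2.05     # laser energy on target
grid_in_mj = 300.0     # assumed: electricity drawn per shot, a commonly cited figure

print(f"target gain: {fusion_out_mj / laser_in_mj:.2f}")    # above 1: break-even
print(f"wall-plug gain: {fusion_out_mj / grid_in_mj:.3f}")  # ~0.01: far below 1
```

On this accounting, the shot returned only around one per cent of the electricity consumed, which is why commercial fusion remains distant.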

Martin Freer at the University of Birmingham, UK, says these results certainly do not indicate that a practical fusion reactor can now be built. “The science still has work to do,” he says. “It's not that we know all the answers and no longer need researchers.”

Freer says that as scientific experiments advance, they pose engineering challenges to create better materials and processes, which in turn enable better experiments and further progress. “Nuclear fusion could happen,” he says. “But the challenges we face are quite steep from a scientific perspective.”

Aneeqa Khan at the University of Manchester, UK, agrees that recent advances in fusion research are positive, but stresses that it will be decades before commercial power plants are operational, and that getting there depends on global cooperation and a concerted effort to train more people in the field. She cautions against interpreting advances in fusion research as a ready solution to our dependence on energy from fossil fuels.

“Fusion is already too slow to address the climate crisis. We are already facing the devastation of climate change on a global scale,” says Khan. “In the short term, we need to leverage existing low-carbon technologies such as nuclear fission and renewables, and in the long term we must commit to investing in fusion so that it can become part of a diverse low-carbon energy mix to tackle the climate crisis.”


Source: www.newscientist.com