Quantum Computers Prove More Valuable Than Anticipated by 2025

Quantum Computers Could Shed Light on Quantum Behavior


Over the past year, I consistently shared the same narrative with my editor: Quantum computers are increasingly pivotal for scientific breakthroughs.

This was the intent from the start. The ambition to use quantum computers to gain deeper insight into our universe has been part of the field’s conception since Richard Feynman’s 1981 address. Discussing how to simulate nature effectively, he suggested: “Let’s construct the computer itself using quantum mechanical components that adhere to quantum laws.”

Currently, this vision is being brought to life by Google, IBM, and a multitude of academic teams. Their devices are now employed to simulate reality on a quantum scale. Below are some key highlights.

This year’s advancements in quantum technology began for me with two studies in high-energy particle physics that crossed my desk in June. Separate research teams utilized two distinct quantum computers to mimic the behavior of particle pairs within quantum fields. One used Google’s Sycamore chip, crafted from tiny superconducting circuits, while the other, developed by QuEra, employed a chip based on ultracold atoms controlled by lasers and electromagnetic forces.

Quantum fields encapsulate how forces like electromagnetism influence particles across the universe. Additionally, there’s a local structure that defines the behaviors observable when zooming in on a particle. Simulating these fields, especially regarding particle dynamics—where particles exhibit time-dependent behavior—poses challenges akin to producing a motion picture of such interactions. These two quantum computers addressed this issue for simplified versions of quantum fields found in the Standard Model of particle physics.

Jad Halimeh, a researcher at the University of Munich who was not a part of either study, remarked that enhanced versions of these experiments—simulating intricate fields using larger quantum computers—could ultimately clarify particle behaviors within colliders.

In September, teams from Harvard University and the Technical University of Munich applied quantum computers to simulate two theoretical exotic states of matter that had previously eluded traditional experiments. Quantum computers adeptly predicted the properties of these unusual materials, something not achievable simply by growing and analyzing crystals in the lab.

Google’s new superconducting quantum computer, “Willow,” entered the picture in October. Researchers from the company and their partners used Willow to run algorithms aimed at interpreting data obtained from nuclear magnetic resonance (NMR) spectroscopy, a technique frequently applied in molecular and biochemical studies.

While the team’s demonstration using actual NMR data did not achieve results beyond what conventional computers can handle, the mathematics underlying the algorithm holds the promise of one day exceeding classical machines’ capabilities, providing unprecedented insights into molecular structures. The speed of this development hinges on advancements in quantum hardware technology.

Later, a third category of quantum computer made headlines. Quantinuum’s Helios-1, designed with trapped ions, successfully executed simulations of mathematical models relating to perfect electrical conductivity, or superconductivity. Superconductors facilitate electricity transfer without loss, promising highly efficient electronics and potentially enhancing sustainable energy grids. However, currently known superconductors operate solely under extreme conditions, rendering them impractical. Mathematical models elucidating the reasons behind certain materials’ superconducting properties are crucial for developing functional superconductors.

What did Helios-1 successfully simulate? Henrik Dreyer from Quantinuum provided insights, stating that it is likely the most pivotal model in this domain, capturing physicists’ interests since the 1960s. Although this simulation didn’t unveil new insights into superconductivity, it established quantum computers as essential players in physicists’ ongoing quest for understanding.

A week later, I was on another call, this time with Sabrina Maniscalco of the quantum algorithm firm Algorithmiq, to discuss metamaterials. These materials can be finely tuned to possess unique attributes absent in naturally occurring substances. They hold potential for various applications, ranging from basic invisibility cloaks to catalysts accelerating chemical reactions.

Maniscalco’s team worked on metamaterials, a topic I delved into during my graduate studies. Their simulation utilized an IBM quantum computer built with superconducting circuits, enabling the tracking of how metamaterials manipulate information—even under conditions that challenge classical computing capabilities. Although this may seem abstract, Maniscalco mentioned that it could propel advancements in chemical catalysts, solid-state batteries, and devices converting light to electricity.

As if particle physics, new states of matter, molecular analysis, superconductors, and metamaterials weren’t enough, a recent tip led me to a study from the University of Maryland and the University of Waterloo in Canada. They utilized a trapped ion quantum computer to explore how particles bound by strong nuclear forces behave under varying temperatures and densities. Some of these behaviors are believed to occur within neutron stars—poorly understood cosmic entities—and are thought to have characterized the early universe.

While the researchers’ quantum computations involved approximations that diverged from the most sophisticated models of strong forces, the study offers evidence of yet another domain where quantum computers are emerging as powerful discovery tools.

Nevertheless, this wealth of examples comes with important caveats. Most mathematical models simulated on quantum systems require simplifications compared to the most complex models; many quantum computers are still prone to errors, necessitating post-processing of computational outputs to mitigate those inaccuracies; and benchmarking quantum results against top-performing classical computers remains an intricate challenge.

In simpler terms, conventional computing and simulation techniques continue to advance rapidly, with classical and quantum computing researchers engaging in a dynamic exchange where yesterday’s cutting-edge calculations may soon become routine. Last month, IBM joined forces with several other companies to launch a publicly accessible quantum advantage tracker. This initiative ultimately aims to provide a leaderboard showcasing where quantum computers excel or lag in comparison to classical ones.

Even if quantum systems don’t ascend to the forefront of that list anytime soon, the revelations from this past year have transformed my prior knowledge into palpable excitement and eagerness for the future. These experiments have effectively transitioned quantum computers from mere subjects of scientific exploration to invaluable instruments for scientific inquiry, fulfilling tasks previously deemed impossible just a few years prior.

At the start of this year, I anticipated primarily focusing on benchmark experiments. In benchmark experiments, quantum computers execute protocols showcasing their unique properties rather than solving practical problems. Such endeavors can illuminate the distinctions between quantum and classical computers while underscoring their revolutionary potential. However, transitioning from this stage to producing computations useful for active physicists appeared lengthy and undefined. Now, I sense this path may be shorter than previously envisioned, albeit with reasonable caution. I remain optimistic about uncovering more quantum surprises in 2026.


Source: www.newscientist.com

Quantum Computers Are Now Practical and Valuable

3D illustration of a quantum computer


Amidst the excitement surrounding quantum computing, the technology may appear as a catch-all solution for various challenges. While the science is impressive, real-world applications are still developing. However, the quest for viable uses is starting to yield fruitful results. Particularly, the search for exotic quantum materials is gaining traction, which could revolutionize electronics and enhance computational power.

The discovery and exploration of new phases—especially more exotic forms analogous to ice or liquid water—remain foundational to condensed matter physics. Insights gained here can enhance our understanding of semiconductor functionality and lead to practical superconductors.

Yet, traditional experimental methods are increasingly inadequate for studying certain complex phases that theory suggests exist. For instance, the Kitaev honeycomb model predicts materials with a unique type of magnetism, but it took “decades of exploration to actually design this with real materials,” according to Simon Evered of Harvard University.

Evered and colleagues simulated this phenomenon using a quantum computer with 104 qubits made from ultra-cold atoms. They’re not alone in this endeavor; Frank Pollmann from the Technical University of Munich and his team utilized Google’s Sycamore and Willow quantum computers, which house 72 and 105 superconducting qubits respectively, to model conditions based on variations of the Kitaev honeycomb framework. Both teams have documented their findings.

“These two projects harness quantum computers to investigate new phases of problems that had been theoretically predicted but not observed experimentally,” notes Petr Zapletal from the University of Erlangen-Nuremberg, who was not involved in the studies. “The advancement of quantum simulations for complex condensed matter systems is particularly thrilling.”

Both research teams confirmed the presence of anyons in their simulations, significant progress that illustrates the growing capability and potential utility of quantum computers. Anyons are exotic quasiparticles that differ fundamentally from qubits and are challenging to emulate.

Known particles fall into two categories: fermions and bosons. While chemists and materials scientists often care most about fermions, qubits generally behave like bosons. The differences, in spin and collective behavior, complicate the simulation of fermions using bosons. The Kitaev model gave the cold-atom experiment a way to bridge that gap. Marcin Kalinowski of Harvard, who participated in the research, described the model as a “canvas” for exploring new physics. By adjusting the interactions among the qubits, the team could tune the quasiparticles in their simulations, and some of these new particles, according to Kalinowski, might be employed to emulate novel materials.

Another critical aspect of the research was the use of Google’s quantum computer to examine materials outside equilibrium. Despite the significant exploration of equilibrium states in laboratories, the non-equilibrium realm remains largely uncharted. Pollmann notes that this aligns with laboratory trials where materials are repeatedly subjected to laser pulses. His team’s work reflects how condensed matter physicists study materials by exposing them to extreme temperatures or magnetic fields and then diagnosing changes in their phases. Such diagnostics are crucial for determining the conditions under which materials can be effectively utilized.

It’s important to clarify that these experiments don’t yield immediate real-world applications. To translate these findings into usable technologies, researchers will need to conduct further analysis on larger, less error-prone quantum computers. However, these preliminary studies carve out a niche for quantum computers in exploring physical phenomena, akin to the way traditional experimental tools have been employed for decades.

That materials science might be the first field to showcase the value of quantum computing is not surprising. It aligns with how pioneers like Richard Feynman discussed quantum technology in the 1980s, envisioning its potential beyond mere devices. It also diverges from the usual portrayal of quantum computing as a technology primarily focused on outperforming classical computers at impractical tasks.

“Viewing the advancement of quantum computing as a scientific approach, rather than simply through the lens of individual device performance, is undeniably supported by these experimental findings,” concludes Kalinowski.


Source: www.newscientist.com

Unexpectedly Valuable Mathematical Patterns in Real-World Data

“When you search for stock market prices, you may see patterns…”


Flipping through the front page of a newspaper, one is greeted by a myriad of numbers—metrics about populations, lengths, areas, and more. If you were to extract these figures and compile them into a list, it might seem like a random assortment.

However, these figures are not as arbitrary as they may appear. The leading digit of many real-world numbers, such as company revenues or building heights, is disproportionately often a 1. True randomness would give each digit an equal chance of leading, yet the actual data shows that roughly 30% of the time, close to a third, the first digit is a 1. The number 9, by contrast, appears as the leading digit in only about 5% of cases, with the frequencies of the other digits falling in between.

This phenomenon is referred to as Benford’s Law, which illustrates the expected distribution of first digits within a dataset of a certain type—especially those spanning a wide, unspecified range. Although values like human height (where numbers are confined within a limited spectrum) or dates (which also have defined limits) don’t follow this law, others do.

Consider your bank balance, the house numbers on a street, or stock prices (as pictured). Such numbers commonly span a range of digit lengths: on a street with just a handful of houses, the leading digits are spread evenly, whereas in a larger town hundreds of houses can share the same leading digit.

Picture a street with nine houses. Each of the nine possible leading digits appears exactly once, an even split. On a street with 19 houses, by contrast, a larger fraction, more than half, will begin with 1: the houses numbered 1 and 10 through 19. As the number of houses grows, the pattern repeats. With 100 houses you would observe a fairly uniform distribution across leading digits again, yet on a street of 200 houses, once more, more than half will start with 1.
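The house-number arithmetic above is easy to check for yourself. Here is a minimal Python sketch (the function name is illustrative; the street sizes are the examples from the text):

```python
def leading_one_fraction(n_houses):
    """Fraction of house numbers 1..n_houses whose first digit is 1."""
    count = sum(1 for house in range(1, n_houses + 1) if str(house)[0] == "1")
    return count / n_houses

# Nine houses: each leading digit 1-9 appears exactly once.
print(leading_one_fraction(9))    # 1/9, about 11%
# Nineteen houses: numbers 1 and 10-19 all start with 1.
print(leading_one_fraction(19))   # 11/19, about 58%
# Two hundred houses: 1, 10-19 and 100-199 start with 1.
print(leading_one_fraction(200))  # 111/200, or 55.5%
```

Running it confirms the swing described above: an even 11% per digit at nine houses, then over half leading with 1 at 19 and again at 200.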

Due to the diverse origins of data in real-world collections, the average likelihood of seeing numbers that start with 1 fluctuates between these two extremes. Similar calculations can be made for other digits, resulting in an overall frequency distribution observable in extensive datasets.
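These frequencies have a closed form: under Benford’s law, the probability that the leading digit is d equals log10(1 + 1/d), which gives roughly 30% for 1 and about 5% for 9, matching the figures quoted above. A quick check using only Python’s standard library:

```python
import math

def benford_probability(d):
    """Benford's-law probability that a number's leading digit is d."""
    return math.log10(1 + 1 / d)

for d in range(1, 10):
    print(f"leading digit {d}: {benford_probability(d):.1%}")

# The nine terms telescope to log10(10), so the probabilities sum to 1.
total = sum(benford_probability(d) for d in range(1, 10))
print(round(total, 9))  # 1.0
```

The telescoping product log10(2/1) + log10(3/2) + … + log10(10/9) = log10(10) is why the nine probabilities form a complete distribution.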

This characteristic is particularly useful for identifying potential data fabrication. When analyzing a company’s financial records, a Benford-like distribution is expected in the sales figures. When someone invents numbers, however, the leading digits tend to be spread far more evenly and fail to follow the Benford curve. This principle is one of many tools forensic accountants employ to root out dubious activity.

The next time you examine your bank statement or compare river lengths, take note of how often those numbers start with 1.

Katie Steckles is a mathematician, lecturer, YouTuber, and author based in Manchester, UK. She also sets BrainTwister, a puzzle column for New Scientist. Follow her @stecks

For additional projects, please visit newscientist.com/maker


Source: www.newscientist.com

Extracting valuable resources in the Arctic is an unwise endeavor

The Arctic is a rich land, abundant not only in beauty, wildlife, and cultural heritage, but also in the commodities we covet most: oil, gas, lithium, cobalt, gold.

But those treasures will do us no good. As our special report on polar science reveals (see “Why the poles’ disappearing sea ice is a planet-wide crisis”), it is difficult to extract the Arctic’s rich resources at a commercial profit.

Extracting oil and gas from this region is an expensive business, even with the dubious tailwind of retreating sea ice, which is opening new patches of ocean to drilling. As industry and transportation gradually shift to electricity and hydrogen, demand for oil will decline, making the costs ever harder to justify.

The same is true for minerals. Greenland is a hotspot for in-demand materials, which is perhaps one of the reasons US President Donald Trump is actively pursuing its acquisition. But even setting aside Greenland’s lack of infrastructure (few roads cross this icy island), it is a dangerous place to invest. The landscape is changing rapidly as glaciers melt, revealing new, unstable coastlines that threaten landslides and tsunamis.

Across the Arctic, thawing permafrost is destabilizing existing roads, buildings, and industrial sites. For hard-nosed business executives, there are easier and less dangerous places to mine.

To see the Arctic as a ticket to prosperous economic growth is a fool’s errand. Instead of viewing it as a region ripe for exploitation, we should treat it as a scientific wonder while respecting the people who live there. After all, as the fastest-changing region on the planet, it is a harbinger of our climate future. And there is still a lot to learn: how quickly will the ice disappear? How fast will sea levels rise? And what happens when the ice runs out?

On a more positive note, researchers are pioneering ever more inventive ways to unlock these mysteries, from new “drifting” labs to ultra-deep ice drilling and cutting-edge submarines. The Arctic is filled with opportunities for exploration and discovery. We need to let go of the idea of monetizing them.


Source: www.newscientist.com

Digital ID cards in Poland: A valuable tool or a barrier to progress in e-Government?

There has been much talk about the potential for Poland’s economy to surpass that of the UK by 2030, but in some aspects, Poland is already ahead.

One such area is digital identification: Poles can create digital ID cards and driving licenses, giving them access to various public services through the mobile app mObywatel. On first access, users must verify their identity through an e-banking login, a digitally enabled physical ID card, or a special “trusted profile” online.

With 8 million users, the mObywatel app enables Poles to create a digital ID, check demerit points on their driver’s license, review their car history, monitor local air quality, and find their polling place.

Rafał Sionkowski, a senior government official overseeing the app, emphasized the importance of keeping the core developer team within the public institution to ensure immediate public access to the digitized database.

As more EU countries develop similar apps in anticipation of the EU’s eIDAS 2.0 regulation on electronic identification, authentication, and trust services, significant progress is expected.

The regulation, set to be fully implemented by 2026 or 2027, establishes the legal framework for electronic identification systems that can be used across EU borders. Sionkowski noted that digital driving licenses can be presented in Germany and digital IDs in Spain for verification.

A digital version of your Polish driving license can be stored on your smartphone via an app. Photo: SOPA Images/LightRocket/Getty Images

Sionkowski mentioned plans to enhance the app with new features like notifying insurance companies of accidents and exploring its potential in verifying age online and assisting vulnerable groups in accessing public services.

He stressed the importance of focusing on services that people use, highlighting the value added through features like air quality monitors for local readings.

Privacy lawyer Wojciech Klicki advocated for adhering to strict privacy principles while adding service features to the app. He cautioned against intrusive features like unauthorized location tracking.

Citizens could have more control over their data either through open-source app development for independent oversight or by checking data accessed by other government departments.

Janusz Cieszyński, a former digital minister, noted the smoother rollout of the app in Poland compared with the UK, thanks to the pre-existing presence of physical ID cards, which quelled concerns about privacy infringements.

Cieszyński expressed enthusiasm for incorporating more public services into a single app, envisioning benefits for disaster-affected areas with quick access to funds through virtual payment cards.

Source: www.theguardian.com

“9 valuable lessons I learned from TikTok, including how to avoid stale potato chips” | TikTok

The average TikTok user spends approximately an hour a day on the app. But with the app set to be banned in the US, American users’ screen time is about to decline sharply.

On Friday, the US Supreme Court rejected an appeal against a law banning the social media platform, citing national security concerns. TikTok’s China-based parent company, ByteDance, had challenged the law, arguing that it violated free speech protections for the app’s more than 170 million users in the United States. But the court upheld the law unanimously, requiring the app to find an approved buyer for its US operation by Sunday or be blocked. Outgoing US President Joe Biden has said he has no intention of enforcing the ban, instead deferring the decision to President-elect Donald Trump’s incoming administration. While there are rumors that the ban may yet be thwarted, a major US exodus has already begun as users flock to the alternative Chinese video-sharing app RedNote.

Many users have been posting “farewell TikTok” videos. The app was first released in the US in 2016. Some of the videos satirize or mock the national security concerns, with users bidding farewell to “China’s personal spies.” Others are more heartfelt, such as a video montage of a teenager turning 18 or a newborn puppy growing into a white-haired dog. Many commenters said, “I grew up with this app.”

But it is the “what I learned on TikTok that changed my life” videos that are the most compelling. While social media is regularly criticized for spreading fake news and harming mental health, these videos show there is a more positive side, at least for some people.

When life gives you lemons… Photo: TikTok

One video, by Brigitte Muller, has piled up over 1 million views. Her tips include putting a yoga mat underneath sofa cushions to stop them slipping off, and spraying vodka on vintage clothing to get rid of musty odors. Thousands of users flocked to the comments section to share everything from recognizing the characteristics of neurodivergence to making jammy eggs. Some people describe TikTok as the parent or grandmother they never had.

So, in honor of TikTok's final hours (at least in the US), I'm sharing nine of my own favorite lessons.

1. Always roll limes and lemons before squeezing them to maximize the amount of juice. I also ditched the glass lemon squeezer and replaced it with a fork.

2. Ever envy those houses in Architectural Digest whose sofas always seem to have plump cushions instead of squashed ones? The secret is to purchase a cushion insert that is 2 inches larger than the cover.

3. To prevent a cutting board from slipping, put a wet tea towel underneath it. And, somehow, placing a damp kitchen towel next to an onion while chopping stops your eyes streaming.

4. You are tying your dressing gown incorrectly. Remove the tie and rethread it through the loops so that the ends hang in front of you. Then, instead of tying it at the back, pull the ends together and knot them at the front, and voila! It will not fall open unexpectedly again.

Let's tie…the correct way to tie a dressing gown. Photo: TikTok

5. A former American POW demonstrates how to keep a bag of potato chips fresh without using rubber bands or clips. Flatten the top of the bag and fold in the two corners so that the top forms a triangle. Next, roll the top down and tuck the triangle’s corners into the two “pockets” this creates on either side. Turned over and held in place without clips, the chips never go stale again.

6. Do you leave your cosmetics scattered near the sink, or dump your bag by the front door when you come in? I now follow this mantra: “Don’t put it down, put it away.” If you put something back in its original place as soon as you have used it, there is no need to tidy up later.

7. Using a turntable (aka a lazy Susan) to store spices means no more rooting around in the back of the cupboard for that one seasoning you can never seem to find.

Say cheese…turn the grater sideways. Photo: TikTok

8. Turn a box grater on its side to make grating cheese easier. The same goes for can openers: place one flat on top of the can rather than using it from the side, which stops sharp, jagged edges from forming.

9. A hair dryer helps remove stubborn adhesive labels from glass and plastic. You can also remove address labels from cardboard boxes, which is very useful for reusing packaging after buying on eBay or Vinted.

Source: www.theguardian.com

Bill Gates advocates for AI as a valuable tool in achieving climate goals

Bill Gates argues that artificial intelligence will assist, not hinder, in achieving climate goals, despite concerns about new data centers depleting green energy supplies.

The philanthropist and Microsoft co-founder stated that AI could enhance technology and power grids’ efficiency, enabling countries to reduce energy consumption even with the need for more data centers.

Gates reassured that AI’s impact on the climate is manageable, contrary to fears that AI advancements might lead to increased energy demand and reliance on fossil fuels.

“Let’s not exaggerate this,” Gates emphasized. “Data centers add 6% to energy demand at most, but probably only 2% to 2.5%. The question is whether AI will accelerate a reduction of more than 6%. The answer is: certainly.”

Goldman Sachs estimates that AI chatbot tool ChatGPT’s electricity consumption for processing queries is nearly ten times more than a Google search, potentially causing carbon dioxide emissions from data centers to double between 2022 and 2030.

Experts project that developed countries, which have seen energy consumption decline due to efficiency, could experience up to a 10% rise in electricity demand from the growth of AI data centers.

In a conference hosted by his venture fund Breakthrough Energy, Gates told reporters in London that the additional energy demand from AI data centers is likely to be offset by investments in green electricity, as tech companies are willing to pay more for clean energy sources.

Breakthrough Energy has supported over 100 companies involved in the energy transition. Gates is heavily investing in AI through the Gates Foundation Trust, which has allocated about a third of its $77 billion assets into Microsoft.

Gates’ optimism about AI’s potential to reduce carbon emissions is echoed by peer-reviewed papers suggesting that generative AI could significantly lower CO2 emissions by taking over tasks such as writing and illustration.

AI is already influencing emissions directly, as demonstrated by Google using deep learning techniques to reduce data center cooling costs by 40% and decrease overall electricity usage by 15% for non-IT tasks.

Despite these advancements, concerns remain about the carbon impact of AI, with Microsoft acknowledging that its indirect emissions are increasing due to building new data centers around the world.

Gates cautioned that the world could miss its 2050 climate goals by up to 15 years if the transition to green energy is delayed, hindering efforts to decarbonize polluting sectors and achieve net-zero emissions by the target year.

He expressed concerns that the required amount of green electricity may not be delivered in time for the transition, making it challenging to meet the zero emissions goal by 2050.

Gates’ warning follows a global report indicating a rise in renewable energy alongside fossil fuel consumption, suggesting that meeting climate goals requires accelerated green energy adoption.

This article was corrected on Friday, June 28. The Gates Foundation does not invest in Microsoft. The Gates Foundation Trust, which is separate from the foundation, holds Microsoft shares.

Source: www.theguardian.com

Nvidia emerges as the most valuable company in the world amidst AI boom

Nvidia surpassed Microsoft on Tuesday to become the world’s most valuable company, driven by its essential role in the competition for artificial intelligence dominance.

With a 3.5% increase in its shares to $135.58, Nvidia now has a market capitalization of $3.34 trillion, following its recent surpassing of Apple to become the second-most valuable company.

Originally known for making video-game chips, Nvidia has evolved into a global powerhouse, benefiting from the industry’s shift towards artificial intelligence and becoming a go-to supplier for tech giants.

Nvidia’s growth has outpaced that of industry giants like Google and Apple, spurring investment and market interest.

The company’s success has contributed to record highs on Wall Street, with the S&P 500 closing at 5,487.03 on Tuesday.

Nvidia’s shares have soared by approximately 180% this year, significantly outperforming Microsoft’s 19% increase, driven by high demand for its cutting-edge processors.

Tech leaders like Microsoft, Meta Platforms Inc., and Alphabet Inc. are in a race to bolster AI computing capabilities and dominate emerging technologies.

The surge in Nvidia’s stock price has pushed its market capitalization to new heights, adding over $103 billion on Tuesday alone.

By splitting its stock 10-for-1 on June 7, Nvidia aimed to make its highly-valued stock more accessible to retail investors.

Nvidia’s chips, utilized in crucial AI tools such as OpenAI’s ChatGPT chatbot, have driven its revenue and stock price up, arousing increased investor interest in Silicon Valley.


As Nvidia solidifies its presence in the tech sector, CEO Jensen Huang, aged 61, has ascended to the ranks of the world’s wealthiest individuals, with a net worth exceeding $100 billion.

In less than two years, Nvidia’s market capitalization has jumped from $1 trillion to $3 trillion, marking a remarkable growth trajectory.

Reuters contributed reporting.

Source: www.theguardian.com

Traditional pessimism could be a valuable tool in combating climate change

Pessimism is a dirty word in climate policy circles, and for good reason: while optimism can encourage positive change, assuming the worst can paralyze us and prevent us from taking action. But when it comes to climate modeling, a certain amount of negativity can be a good thing.

The Intergovernmental Panel on Climate Change already hedges its bets with a range of models and pathways, assessing everything from how warming might be limited to 1.5°C to what happens if carbon emissions continue unabated, with many possibilities in between. These pathways are backed by thousands of scientific papers, mountains of data, and the brains of the world’s climate scientists, but like all models, they are built on assumptions.

One of the key assumptions in the scenario of keeping temperature rise below 1.5°C is that the technology to remove carbon dioxide from the atmosphere will be rapidly perfected in the near future. This is not an unreasonable prediction, given human ingenuity and strong incentives to do so. But incorporating carbon capture technology into these models is like declaring that winning the lottery will balance the household budget. If you can’t reduce your spending to an affordable level, you better hope that a big prize is on the way.

As two articles in this issue demonstrate, this is a dangerous approach. A detailed analysis of geological carbon storage plans shows that it is at least very unlikely, if not impossible, to meet the levels envisioned in many 1.5°C pathways (see “Our plans to tackle climate change with carbon storage don’t add up”). The chances of winning the lottery, it seems, aren’t that high. At the same time, we have also received an unexpected carbon bill: melting Arctic permafrost is releasing more greenhouse gases than previously accounted for (see “Frozen soil is now a major net source of greenhouse gases”).

While such revisions in our understanding of climate change are entirely expected and to be welcomed, they signal that the challenges we face over the next decade will only get more difficult. Rather than tuning climate models until the numbers roughly match the 1.5°C goal, perhaps it would be better to take a more pessimistic outlook and accelerate efforts to limit the damage.

Source: www.newscientist.com

Newly Found Fossil Remains in France Offer Valuable Information on Ordovician Polar Ecosystems

In a new paper in the journal Nature Ecology & Evolution, paleontologists described the diversity of the Cabrières Biota, a new Early Ordovician site in the Montagne Noire in southern France. During the Early Ordovician, this region was an open marine environment located in the Southern Hemisphere at high polar latitudes, on the margin of the supercontinent Gondwana.



Artistic reconstruction of the Cabrières Biota: in the foreground, trilobites of the genus Ampyx and various shelly organisms, including brachiopods and hyoliths (bottom left corner). Behind the trilobites are lobopodians, chelicerates, cnidarians (blue), sponges (green), thin branched algae (red and green), hemichordates (purple), and some soft-bodied animals. Bivalved arthropods swim in the water column along with graptolites. Image credit: Christian McCall, Prehistorica Art.

“Early Paleozoic sites with preserved soft tissues provide a wealth of information about the evolution of past life and improve our understanding of earlier ecosystems, but they are unevenly distributed in time and space,” said paleontologist Farid Saleh of the University of Lausanne and his colleagues.

“About 100 soft-tissue preserved assemblages have been recorded from the Cambrian, while about 30 are known from the Ordovician, and only a few have been discovered in early Ordovician rocks.”

“The distribution of early Paleozoic remains is also paleogeographically biased, as approximately 97% of the biota discovered represents tropical and temperate ecosystems within 65 degrees north and south of the paleoequator.”

“This pattern is especially true for the Ordovician, where very few sites are known to have polar environments.”

“Among the most famous Ordovician sites, the Soom Shale in South Africa and Big Hill and Winneshiek in the United States exhibit tropical ecosystems.”

“Given the rarity of Ordovician sites and their lopsided paleogeographical distribution, discovering new biotas with preserved soft tissues in these underrepresented paleogeographic zones and environments is crucial for deepening our understanding of this period and gaining better insight into the factors driving increases in animal diversity on Earth.”



Biomineralized species of the Cabrières biota: (a) trilobites of the genus Ampyx; (b) gastropod; (c) tubular structures, probably the conulariid Sphenothallus, a biomineralized cnidarian; (d) brachiopod attached to a sponge, probably of the leptomid family; (e) assemblage formed by an articulated brachiopod (center), the flattened carapace of a probable bivalved arthropod (left and right of center), and the cephalon of a calymenine trilobite (left); (f) possible cystoid. Scale bars: (a) and (e) 4 mm; (b) and (d) 1 cm; (c) 5 mm; (f) 2 mm. Image credit: Saleh et al., doi: 10.1038/s41559-024-02331-w.

In the new paper, the paleontologists describe this group of 470-million-year-old (Early Ordovician) fossils, named the Cabrières Biota, discovered in the Montagne Noire in southern France.

The fossil site was discovered by two French amateur paleontologists, Eric Monceret and Sylvie Monceret-Goujon.

Saleh and his co-authors examined about 400 extremely well-preserved soft tissue fossils taken from the site.

The fossils, which typically exhibit shades of brown, red, or orange, are embedded within a siliciclastic matrix of mudstones and siltstones whose colors range from blue to green to yellow.

The Cabriere biota is characterized by a prevalence of sponges and branched algae, which constitute 26% of all identified fossils.

Also included are molluscs (14%), trilobites (12%), brachiopods (9%), hyoliths (7%), and cnidarians (6%).

A striking feature of this biota is the rarity of echinoderms, which are represented by only three specimens.

The Cabrières biota also includes the shells of various bivalved arthropods, which constitute 16% of the identified fossils.

Some wormlike organisms are also present in the biota (approximately 1% of identified fossils).

“The Cabrière biota was once located in close proximity to Antarctica and reveals the composition of the southernmost Ordovician ecosystem,” Dr Saleh said.

“The high biodiversity of this site suggests that the area served as a refuge for species fleeing the high temperatures that were prevalent further north at the time.”

“During this period of global warming, animals were certainly living in high-latitude refuges, escaping the extreme temperatures at the equator.”

Dr Jonathan Antcliffe, a paleontologist at the University of Lausanne, said: “The distant past gives us a glimpse of the near future that could happen to us.”

_____

F. Saleh et al. The Cabrières Biota (France) provides insights into Ordovician polar ecosystems. Nat Ecol Evol, published online February 9, 2024; doi: 10.1038/s41559-024-02331-w

Source: www.sci.news

The Post Office Horizon Scandal: Valuable Lessons for Big Tech Companies to Learn

The Post Office Horizon scandal has long been a frustrating one to follow as a technology reporter, because even though it stems from the failure of a large-scale government IT project, it’s not really about technology at all.

In such stories there is a desire to uncover the specific fault lines that caused the disaster. Take Grenfell Tower: the entire system was flawed, and the inquiry into the fire revealed gory details, but it is also clear that the fatal error was cladding the building in combustible panels. Identifying that fulcrum leads to further questions in both directions (how were the panels deemed safe, and could the building have been safely evacuated despite them?), but it is clear where the catastrophic failure lies.

I feel like there should be a comparable focal point in the Horizon system. “What went wrong in Horizon that led to so many false accounts?” is a question I’ve asked many times over the decade since I first learned of the scandal, thanks to Computer Weekly’s coverage. I’ve dug into the system’s architecture in the hope of finding some crucial crux, a terrible decision around which all subsequent problems swirled, that could be sensibly explained to provide a technical foundation for a very human story of malice and greed.

Still, the conclusion I’m forced to draw is that Horizon was really, really broken. From top to bottom, the system was terrible. The sheer variety of technical errors, worst-practice decisions, and lazy cutbacks, with fundamentally different flaws surfacing in each postmaster’s case, is probably part of why the Post Office kept fighting for so long.

One terminal kept accepting input even when its screen froze, invisibly writing transactions to the database; other systems had edge-case bugs in the underlying code that altered transactions because records weren’t locked when they should have been. There were also problems in the network link to the central database, which silently dropped transactions whenever the data connection failed.
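None of Horizon’s internals are documented here, but the general class of failure in which an unreliable connection corrupts an account is easy to illustrate. Below is a minimal Python sketch, with every name invented for illustration (this is not Horizon’s actual design), of a terminal that naively retries a transaction after a lost acknowledgement against a ledger that doesn’t deduplicate: the retry silently doubles the amount, producing a discrepancy no postmaster caused.

```python
# Illustrative sketch only: a hypothetical ledger with no idempotency,
# paired with a terminal that resends when the network "loses" the ACK.

class Ledger:
    def __init__(self):
        self.entries = []

    def record(self, txn_id, amount):
        # Bug: no check whether txn_id was already recorded.
        self.entries.append((txn_id, amount))
        return "ACK"

    def balance(self):
        return sum(amount for _, amount in self.entries)


def send_with_retry(ledger, txn_id, amount, ack_lost_once=True):
    """Send a transaction; if the ACK never arrives, naively resend."""
    ledger.record(txn_id, amount)      # first attempt succeeds server-side...
    if ack_lost_once:                  # ...but the terminal never sees the ACK,
        ledger.record(txn_id, amount)  # so the retry double-records it

ledger = Ledger()
send_with_retry(ledger, txn_id="T001", amount=50)
print(ledger.balance())  # 100, not 50: a shortfall no one at the counter caused
```

A deduplicating `record()` keyed on `txn_id` (an idempotent write) would make the retry harmless; the point of the sketch is only that a single missing safeguard can manufacture a discrepancy that looks, from the accounts alone, like theft.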

Still, if you want to trace the point at which bad IT became a crisis, you need to look past the technology entirely. The Post Office declared, in effect, that Horizon worked. Everything that happened after that was a logical conclusion: if Horizon worked, the cause of any error must lie in the subpostmaster’s operation of it; if they say they made no mistake, they must have committed fraud; and if they committed fraud, a conviction is morally right.

But Horizon didn’t work.

Today’s big technology companies aren’t so cocky as to claim that their software is perfect. In fact, the opposite is accepted as reality. The phrase “all software has bugs” is repeated so often and so casually that it implies users are demanding too much in expecting the technology they rely on to work reliably.

But they often still act as if they believe the opposite. My inbox is constantly filled with messages from desperate people who have been falsely flagged as spammers, scammers, or robots by the automated systems of Facebook, Google, Amazon, and Apple. These people have lost years of purchases, lost access to friends and family, and lost the pages and profiles on which they built their careers. I can’t help them all and still do my day job, but strangely enough, the cases I do decide to raise with a large company almost always turn out to be easily resolved.

No one would argue that even the worst software Google has put out is as broken as Horizon. (The Post Office says the current version of the software, created in 2017, has been found to be “robust compared to comparable systems.”) But acting as though flawed software has no flaws is what turns broken software into something serious. The tech industry may have more lessons to learn from this scandal than it’s willing to admit.

Source: www.theguardian.com

Microsoft Surpasses Apple to Reclaim Title of Most Valuable Company after Two Years

Microsoft's stock closed above Apple's for the first time since 2021 on Friday, making it the world's most valuable company, as demand concerns hit the iPhone maker's stock price.

On Friday, Apple rose 0.2% and Microsoft rose 1%. This brings Microsoft's market capitalization to $2.887 trillion, an all-time high, according to LSEG data. Apple's market capitalization, calculated based on Thursday's filing data, was $2.875 trillion.

Concerns about smartphone demand have pushed Apple stock down 3% so far in 2024 after rising 48% last year. Microsoft is up about 3% since the beginning of the year after soaring 57% in 2023 on a bull run driven in part by its lead in generative artificial intelligence through its investment in ChatGPT maker OpenAI.

According to LSEG, Apple's market capitalization peaked at $3.081 trillion on December 14th.

Microsoft is incorporating OpenAI’s technology into its suite of productivity software, which helped fuel a recovery in its cloud computing business in the July-September quarter. The company’s AI leadership has also created an opportunity to challenge Google’s dominance in web search.

Meanwhile, Apple is grappling with sluggish demand, including for its cash cow iPhone. Demand in China, a major market, is sluggish as the Chinese economy has been slow to recover from the coronavirus pandemic and a revived Huawei is eating away at market share.

Sales of Apple's Vision Pro mixed reality headset will begin in the US on February 2nd, marking Apple's biggest product launch since the iPhone in 2007. However, UBS estimated in a report this week that Vision Pro sales will be “relatively insignificant” to Apple's 2024 earnings per share.

Microsoft has briefly overtaken Apple as the most valuable company several times since 2018, most recently in 2021, when concerns about pandemic-related supply chain shortages weighed on the iPhone maker's stock price.

In its latest quarterly report in November, Apple gave a holiday quarter sales forecast that was lower than Wall Street's expectations due to weak demand for iPads and wearables.

Analysts on average expect Apple's December-quarter sales to rise 0.7% to $117.9 billion, according to LSEG, which would mark the first year-on-year sales increase in four quarters. Apple reports its financial results on February 1st.

Analysts expect Microsoft to report a 16% increase in revenue to $61.1 billion in the coming weeks due to continued growth in its cloud business.

Source: www.theguardian.com