Achieving the 1.5°C Climate Goal: The Century’s Best Vision for a Sustainable Future


During the first decade of the 21st century, scientists and policymakers emphasized a 2°C cap as the highest “safe” limit for global warming above pre-industrial levels. Recent research suggests that this threshold might still be too high. Rising sea levels pose a significant risk to low-lying islands, prompting scientists to explore the advantages of capping temperature rise at approximately 1.5°C for safeguarding vulnerable regions.

In light of this evidence, the United Nations negotiating bloc, the Alliance of Small Island States (AOSIS), advocated for a global commitment to restrict warming to 1.5°C, emphasizing that allowing a 2°C increase would have devastating effects on many small island developing nations.

James Fletcher, the former UN negotiator for the AOSIS bloc at the 2015 UN COP climate change summit in Paris, remarked on the challenges faced in convincing other nations to adopt this stricter global objective. At one summit, he recounted a low-income country’s representative confronting him, expressing their vehement opposition to the idea of even a 1.5°C increase.

After intense discussions, bolstered by support from the European Union and the tacit backing of the United States, as well as intervention from Pope Francis, the 1.5°C target was included in the landmark 2015 Paris Agreement. However, the target was adopted before climate scientists had formally assessed what that level of warming would mean.

In 2018, the Intergovernmental Panel on Climate Change report confirmed that limiting warming to 1.5°C would provide substantial benefits. The report also advocated for achieving net-zero emissions by 2050 along a 1.5°C pathway.

These dual objectives quickly became rallying points for nations and businesses worldwide, persuading countries like the UK to strengthen their national climate commitments to meet these more stringent goals.

Researchers such as Piers Forster at the University of Leeds credit the 1.5°C target with driving nations to adopt significantly tougher climate goals than previously envisioned. “It fostered a sense of urgency,” he remarks.

Despite this momentum, global temperatures continue to rise, and current efforts to curb emissions are insufficient to fulfill the 1.5°C commitment. Scientific assessments predict the world may exceed this warming threshold within just a few years.

Nevertheless, 1.5°C remains a crucial benchmark for tracking progress in global emissions reductions. Public and policymakers are more alert than ever to the implications of rising temperatures. An overshoot beyond 1.5°C is widely regarded as a perilous scenario, rendering the prior notion of 2°C as a “safe” threshold increasingly outdated.


Source: www.newscientist.com

Achieving Net Zero: Why America Needs a Balanced Approach of Incentives and Regulations


Subsidies Promote Adoption of Low-Emission Technologies like Electric Vehicles

Kent Nishimura/Los Angeles Times via Getty Images

Achieving net-zero greenhouse gas emissions in the United States by 2050 will require green subsidies, eventually complemented by a carbon tax, according to new modelling. Both policies face opposition under President Donald Trump.

Introducing a price or tax on carbon emissions stands out as the most effective strategy to curb carbon output. However, the U.S. government has continually struggled to enact cap-and-trade laws that would limit emissions and require companies surpassing these limits to buy allowances.

Subsidies are straightforward to deploy and could lower the cost of adopting low-emission technologies, including electric vehicles, thus alleviating the financial impact of carbon pricing.

Wei Peng at Princeton University analyzed the implications of subsidies and carbon taxes to find the most effective policy sequence for emissions reduction in the U.S.

The results indicate that subsidies could lead to a 32% reduction in energy system emissions by 2030; however, this impact may decrease over time as fossil fuels like natural gas remain economically viable.

Conversely, implementing a carbon tax in 2035 could result in the phase-out of most fossil fuels, reducing overall emissions by more than 80% by 2050.

“Subsidies will help cultivate green industries, but we will still require regulatory enforcement to meet decarbonization objectives,” states Peng. “The key question is how to navigate that transition.”

Following President Joe Biden’s 2050 net-zero aim, recent legislation has introduced tax incentives for investments in green infrastructure, ranging from electric vehicle charging stations to carbon sequestration technologies. In contrast, President Trump dismissed these subsidies as “the new green scam” and rescinded many of them.

This unpredictable policy landscape is “the worst-case scenario,” according to Peng. “This inconsistency will either slow down decarbonization or inflate costs.”

If subsidies are reinstated after Trump’s presidency ends in 2029 and a carbon tax is delayed until 2045, the researchers conclude that the tax would need to be 67% higher than it otherwise would need to be to achieve net-zero emissions. This is primarily because costly technology would be needed to extract vast amounts of carbon dioxide from the atmosphere.

Yet, researchers suggest that “accelerated innovation” through unforeseen technological breakthroughs could lessen the need for stringent regulations.

The findings advocate strongly for a carbon pricing model, yet extending this analysis globally would yield richer insights into effective carrot-and-stick combinations, notes Gregory Nemet at the University of Wisconsin-Madison. Countries like China and those in the European Union have adopted extensive subsidies and carbon pricing initiatives, leading to advancements such as affordable solar panels, which empower other nations to cut emissions.

“Progress is ongoing in these regions, along with robust policy frameworks,” remarks Nemet. “This fosters accelerated innovation, and the U.S. stands to benefit significantly from this evolution.”


Source: www.newscientist.com

AI-Driven Electricity Usage Forecasting Shows Industry is Far from Achieving Net-Zero Goals


Data Center in Ashburn, Virginia

Jim Lo Scalzo/EPA/Shutterstock

As the artificial intelligence sector grows swiftly, concerns about the ecological effects of data centers are increasingly being discussed. New projections indicate that the industry may fall short of achieving net-zero emissions by 2030.

Fengqi You and researchers from Cornell University in New York have evaluated the potential energy, water, and carbon footprint of today’s leading AI servers through 2030, under various growth scenarios and specific U.S. data center locations. Their analysis integrates anticipated chip production, server energy demands, and cooling efficiency with state power grid data. While not all AI enterprises have declared net-zero objectives, major tech firms involved in AI, like Google, Microsoft, and Meta, have set targets for 2030.

“The rapid expansion of AI computing is fundamentally altering everything,” says You. “We’re striving to understand the implications of this growth.”

The researchers estimate that establishing AI servers in the U.S. may require between 731 million to 1.125 billion cubic meters of additional water by 2030, along with greenhouse gas emissions ranging from 24 million to 44 million tons of carbon dioxide each year. These estimates hinge on the pace of AI demand growth, the actual number of advanced servers that can be produced, and the sites of new U.S. data centers.

To address these issues, the researchers modeled five scenarios based on varying growth rates and outlined potential measures to minimize the impact. “The top priority is location,” You explains. By situating data centers in Midwestern states with abundant water resources and a significant share of renewable energy in the power grid, the environmental fallout can be mitigated. The team also emphasizes that transitioning to decarbonized energy sources and enhancing efficiency in computing and cooling processes are essential strategies for minimizing environmental impact. Collectively, these three measures could potentially lower industry emissions by 73% and reduce water usage by 86%.
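
As a rough check on those headline numbers, the quoted reduction percentages can be applied directly to the projected 2030 ranges. The sketch below is a toy calculation using only figures from this article; it is not the Cornell team’s model.

    # Toy arithmetic based on the figures quoted above -- not the Cornell model.
    water_m3 = (731e6, 1.125e9)   # projected extra water use by 2030, cubic metres
    co2_tonnes = (24e6, 44e6)     # projected emissions, tonnes of CO2 per year

    EMISSIONS_CUT = 0.73          # combined effect of the three measures on emissions
    WATER_CUT = 0.86              # combined effect on water use

    for label, (lo, hi), cut in [("CO2 (million t/yr)", co2_tonnes, EMISSIONS_CUT),
                                 ("water (million m3)", water_m3, WATER_CUT)]:
        print(f"{label}: {lo * (1 - cut) / 1e6:.0f} to {hi * (1 - cut) / 1e6:.0f} after mitigation")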

However, public resistance may upset these predictions, particularly over the environmental ramifications of new data centers. In Virginia, home to an eighth of the world’s data centers, residents have opposed upcoming construction plans, citing concerns over water resources and broader environmental impacts. Similar petitions against data centers have arisen in Pennsylvania, Texas, Arizona, California, and Oregon. According to Data Center Watch, a firm that monitors data center developments, local opposition is stalling approximately $64 billion worth of projects. And even where communities succeed in blocking individual projects, questions remain over the power and water consumption of those that proceed.

This new research is viewed cautiously by those analyzing and quantifying AI’s environmental effects. “The AI field evolves so quickly that making accurate future predictions is incredibly challenging,” says Sasha Luccioni from the AI company Hugging Face. “As mentioned by the authors, breakthroughs in the industry can radically alter computing and energy needs, reminiscent of DeepSeek’s innovative techniques that reduced reliance on brute-force calculations.”

Chris Preist from the University of Bristol in the UK concurs, highlighting the necessity for increased investment in renewable energy infrastructure and the importance of data center placement. “I believe their projections for water usage in direct cooling of AI data centers are rather pessimistic,” he remarks, suggesting that the model’s “best case” scenario aligns more closely with “business as usual” for contemporary data centers.

Luccioni believes the paper underscores a vital missing element in the AI ecosystem: “greater transparency.” She notes that this issue can be addressed by “mandating model developers to track and disclose their computing and energy consumption, share this information with users and policymakers, and commit to reducing overall environmental impacts, including emissions.”



Source: www.newscientist.com

Can Underground Natural Hydrogen Assist the UK in Achieving Net Zero?

The Lizard Peninsula in Cornwall has rocks capable of producing hydrogen gas

PIO3/SHUTTERSTOCK

Recent discoveries of small amounts of underground hydrogen gas have sparked a global search for a potential zero carbon fuel source, yet the UK has largely been overlooked by prospectors.

According to a briefing from the Royal Society on natural hydrogen production, the lack of exploration is not due to geological factors. “There are rocks that could produce hydrogen, but no research has been conducted,” says Barbara Sherwood Lollar at the University of Toronto, who contributed to the report.

Nor does the UK lack interest in the gas. The government’s latest Hydrogen Strategy highlights hydrogen’s crucial role in the ambition of becoming a clean energy superpower, with low-carbon production earmarked for heavy industry and transport, yet natural hydrogen is not mentioned as a potential source.

Novelty plays a role in this oversight, according to Philip Ball at Keele University, who contributed to the report and invests in natural hydrogen firms. “Essentially no one is paying attention. There’s no regulation for this emerging sector, and there’s a lack of understanding.”

However, the situation may be changing. Ball notes that several companies have obtained rights to explore for hydrogen in parts of the UK, including Devon in the southwest, while multiple universities conduct related research. The British Geological Survey is also delving into the country’s potential for natural hydrogen, drawing on a wealth of existing geological data.

There is reason to believe that natural hydrogen exists beneath the surface. The Royal Society report notes that certain types of rock, particularly iron-rich ultramafic formations, can generate hydrogen when they react with water. Such formations are found in locations like the Lizard Peninsula in Cornwall and Scotland’s Shetland Islands. Granites in areas like the North Pennines could also yield hydrogen through the breakdown of water molecules by natural radioactivity.

“It will definitely be found in the UK,” Ball asserts. “The question remains whether it will be economically viable.”

If hydrogen is discovered in the UK, expectations should be tempered; Sherwood Lollar emphasizes that one of the report’s goals was to correct some exaggerated claims about natural hydrogen, such as the concept of massive quantities of gas continually rising from the Earth’s mantle and core.

Nonetheless, even conservative estimates of hydrogen production within Earth’s crust are substantial. The report indicates that around 1 million tonnes of hydrogen permeates the crust annually. “Even capturing a fraction of this could significantly contribute to the hydrogen economy,” Sherwood Lollar states.


Source: www.newscientist.com

True meritocracy requires removing barriers, not turning a blind eye to them.

Kent Nishimura/Los Angeles Times via Getty Images

In the latter half of the 18th century, the mathematician and physicist Joseph-Louis Lagrange made a shocking discovery: his star student, Monsieur Le Blanc, was actually a woman.

Lagrange taught at the École Polytechnique in France, where lecture notes were made available to students, who could submit written work without attending in person. This was especially useful for Sophie Germain, who studied mathematics despite her parents’ objections. She took over the identity of a former student, Monsieur Le Blanc, and might have escaped notice, but Lagrange spotted the vast and sudden improvement in “Le Blanc’s” work and demanded to meet her in person.

Germain is not the only person whose work has been judged by the name attached to it. As the psychologist Keon West explains, experiments that send out otherwise identical job applications show that those bearing Black-sounding names are less successful than those bearing white-sounding names.

In recent years, many organizations have adopted measures to counter these biases, such as deleting names from job applications. These measures fall under the umbrella of diversity, equity, and inclusion (DEI). But now, US President Donald Trump has ordered government agencies to dismantle DEI programs, promising that society will become “merit-based.”

Trump’s approach to diversity, equity, and inclusion is unlikely to create meritocracy

Some DEI initiatives are based on stronger evidence than others. As the resumé experiments show, merit alone is not enough to overcome people’s prejudices, and many studies find that anonymizing applications tends to improve outcomes for disadvantaged groups. Unconscious-bias training, on the other hand, typically a one-off session aimed at making employees aware of the snap judgments they make about people based on race and gender, has been found to make almost no difference to people’s behavior.

Trump’s harsh approach to DEI, driven by ideology rather than evidence, is unlikely to produce the meritocracy he promises. Instead of building organizations that encourage the best people to flourish, the current efforts seem to be fostering a culture of fear, with government workers warned of “adverse consequences” if they fail to end DEI work.

Thankfully, Germain did not face such consequences. Lagrange accepted her for who she was and championed her mathematical development. Nevertheless, she still used the Le Blanc pseudonym in some correspondence, most famously with Carl Friedrich Gauss, who, on discovering her true identity, praised her “courage, extraordinary talent, and excellent genius.” If we want more Germains to prosper, we must recognize and deal with the barriers they face.


Source: www.newscientist.com

The hidden radioactive waste problem lies at the core of achieving net zero emissions

A dog chased a ball past me at full speed across the open fields of Seascale Beach, Cumbria. The beach is surrounded by a small park, rows of shops, and houses, with tall chimneys and large rectangular buildings visible on a vast industrial site as you walk north.

About 5 kilometres up the coast from Seascale Beach lies the Sellafield complex, a 2-square-mile nuclear facility. Sellafield is home to most of the UK’s radioactive nuclear waste and the world’s largest store of plutonium.

I visited Sellafield earlier this year to learn about the management of Britain’s nuclear waste. It was an eye-opening and expensive lesson in dealing with hazardous material with no clear plan.

Sellafield played a crucial role in producing plutonium during the Cold War. The current cleanup operation involves processing and storing spent nuclear fuel, cooling and stabilizing it, then storing it in silos covered with steel and concrete.

Initially, safe long-term storage was not a priority, so waste from decades ago was dumped into stores that were never designed to last. The process of moving it from those dilapidated silos to more modern stores is ongoing.


A recent report by the National Audit Office highlighted that Sellafield is still in the early stages of its cleanup mission, which is expected to last until 2125 at an estimated cost of £136bn, though there is considerable uncertainty about the exact tasks and timeline.

The plan for the most dangerous nuclear waste is to bury it deep underground in a geological disposal facility (GDF). Finding a suitable location involves not just solid rock but also a willing community.

Three communities are currently in discussions about hosting a GDF, which experts believe to be the best option. Several other countries are working on similar facilities.

The complexity of site selection may delay the facility’s opening until the 2040s or 2050s, amidst a push for new nuclear power to reduce emissions and reach net zero.

As we navigate through the challenges of nuclear waste management, experts like Professor Claire Corkhill from the University of Bristol play a crucial role in advancing our understanding of radioactive waste.


About our expert Professor Claire Corkhill

Claire is Professor of Mineralogy and Radioactive Waste Management in the School of Earth Sciences at the University of Bristol.

Her work has been published in journals including Nature, as well as materials science and ceramics journals.


Source: www.sciencefocus.com

Nations are falsely achieving net zero by excessively depending on forests

Russia’s plan to reach net zero by 2060 relies on existing forests to absorb continued carbon emissions

Varnakov R/Shutterstock

Countries are taking shortcuts to net-zero emissions by including forests and other “passive” carbon sinks in their climate plans, a tactic that thwarts global efforts to halt climate change, leading researchers have warned.

Relying on natural carbon sinks to absorb continued carbon emissions from human activities will leave the world warmer. That is the warning from the researchers who first developed the science behind net zero, who today launched a highly unusual intervention accusing nations and companies of abusing the concept.

“This document calls on people to be clear about what net zero really means,” said Myles Allen, a professor at the University of Oxford, at a press conference on 14 November.

Natural sinks such as forests and peat bogs play an important role in the Earth’s natural carbon cycle by absorbing some of the carbon from the atmosphere. However, we cannot rely on existing sinks to offset ongoing greenhouse gas emissions.

If sinks are used in this way, global atmospheric carbon dioxide concentrations would remain stable even once we reach “net zero,” and warming would continue for centuries due to the way the oceans absorb heat, Allen warned. “Even if we think we’re on the path to 1.5°C, we could end up with temperatures rising well above 2°C,” he says. “This ambiguity could effectively destroy the goals of the Paris Agreement.”

To halt global temperature rise, we need to reduce emissions to net zero, without relying on passive absorption by land and oceans. This allows existing natural sinks to continue absorbing excess CO2, reducing the concentration of the gas in the atmosphere and offsetting ongoing warming from the deep ocean.
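
The accounting distinction can be made concrete with a toy one-box carbon model. The sketch below is purely illustrative, not the researchers’ model: the sink size, time horizon, and GtC-to-ppm conversion are assumptions, and real sinks saturate. It shows why counting a pre-existing passive sink as an offset freezes CO2 concentrations instead of lowering them.

    # Toy one-box illustration of the passive-sink accounting issue -- not the
    # researchers' model. Numbers are arbitrary; only the behaviour matters.
    PASSIVE_SINK = 2.0   # GtC/yr absorbed by pre-existing forests and oceans
    PPM_PER_GTC = 0.47   # approximate conversion from GtC to atmospheric ppm

    def co2_after(residual_emissions_gtc, years=50, start_ppm=420.0):
        """Atmospheric CO2 (ppm) after sustained residual emissions (GtC/yr)."""
        ppm = start_ppm
        for _ in range(years):
            ppm += (residual_emissions_gtc - PASSIVE_SINK) * PPM_PER_GTC
        return ppm

    # "Net zero" claimed by counting the passive sink against ongoing emissions:
    print(co2_after(2.0))   # 420.0 -- concentrations stay flat, warming continues
    # True net zero, with the passive sink left out of the accounts:
    print(co2_after(0.0))   # ~373 -- the sink draws CO2 down, offsetting warming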

However, many countries already count passive land sinks such as forests as greenhouse gas removals in their national carbon accounts. Some, such as Bhutan, Gabon, and Suriname, have already declared net zero thanks to their vast existing forests.

Some countries are setting long-term net-zero targets on this basis. Russia, for example, has pledged to achieve net-zero emissions by 2060, but its plan relies heavily on using existing forests to absorb ongoing carbon emissions.

“Maybe some countries will use this in a deliberately naughty way,” said Glen Peters of the CICERO Center for International Climate Research in Oslo, Norway, at the press conference. “This problem will be even more pronounced in countries where forest makes up a large proportion of total land area.”

The researchers fear this problem will become more serious as carbon markets develop and pressure on countries to decarbonize increases. “As the value of carbon increases, there will be more pressure to define anything that can be removed as a negative emission, potentially to be able to sell it in the carbon offset market,” Allen said.

Countries and companies with net-zero targets will need to modify their approach to exclude passive carbon sequestration from their accounts, the researchers say.

Natural sinks count as carbon removal when they are added to existing ones, for example when new forests are planted or peat bogs are rewetted. However, this type of natural carbon sink is vulnerable to climate impacts such as wildfires, drought, and the spread of invasive species, and is unreliable for long-term sequestration.

This has not stopped countries from relying heavily on these natural sinks in their net-zero strategies. One 2022 survey found that a number of countries, including the United States, France, Cambodia, and Costa Rica, plan to rely on forest carbon and other naturally occurring removals to offset ongoing emissions. “Many national strategies ‘bet’ on increasing carbon sinks in forests and soils as a means of achieving long-term goals,” the study authors wrote.

Allen stressed that natural carbon sinks must be conserved but not relied on to balance ongoing emissions. Instead, he urges countries to aim for “geological net zero,” where all ongoing carbon emissions are balanced by long-term carbon sequestration in underground storage.

“Countries need to recognize the need for geological net zero,” he said. “That means if we are producing carbon dioxide from burning fossil fuels by mid-century, we need to have a plan to put that carbon dioxide back into the ground.”

“Geological net zero seems like a sensible global goal for countries to aspire to,” says Harry Smith at the University of East Anglia, UK. “This will help clarify many of the ambiguities that plague the way countries currently account for land use.”

But he warns that it could have a knock-on effect on climate ambition: “What does the new politics of geological net zero look like? If geological net zero drives the goals of governments’ climate strategies, what does this mean for their wider climate ambitions?”


Source: www.newscientist.com

Review: Team Asobi’s 3D Platformer Astro Bot Showcases Brilliant Ideas, Achieving Masterpiece Status

To say that Astro Bot brings back memories of Super Mario Galaxy is a high compliment, not because it is a copy, but because its abundance of new ideas puts it in the company of Nintendo’s best 3D platformers. You travel around a small galaxy filled with asteroid-style levels, from bathhouses to diorama-sized jungle temples to rainy islands. Each level is brimming with innovative one-shot concepts, like frog boxing gloves, backpack monkeys, and a time-stopping clock that freezes giant speeding darts for you to navigate around. The creativity of this development team truly shines in this game.

Team Asobi, known for producing Rescue Mission for PSVR and the short game Astro’s Playroom packaged with the PS5 at launch, now presents a full-length game with bonus difficulty levels that serve as a stimulating challenge for fans of 3D platforming. The game is incredibly enjoyable and distinct thanks to the lovable blue-and-white robot and its quirky friends, many of whom are dressed as characters from obscure PlayStation worlds. The meticulous attention to detail in these robots, from their movements, expressions, dance sequences, to their tiny pleas for help when in distress, exudes personality.

In Astro’s Playroom, you explore levels inspired by the speed of the SSD and the graphic processing unit’s visual flair, housed within the PlayStation 5 itself. The visual design of the environments is tech-themed, featuring trees made of tangled wires and computer-chip-like patterns decorating every surface. Astro Bot maintains a similar aesthetic while extending beyond it.

In this adventure, your PS5 acts as a robot mothership that crash-lands on a desert planet, dispersing numerous robots across the galaxy. As the lone surviving robot, you journey into each level aboard a rescue ship shaped like your PS5 controller to reunite your allies and reconstruct your robotic crew back home.




Astro Bot riding a PS5 controller-shaped ship. Photo: Sony/Team Asobi

At the conclusion of each planetary cluster, a boss reminiscent of a slapstick cartoon is encountered, guarding a section of your spaceship. You then engage in cleaning and reassembling that section using a massive robotic arm, strategically pulling triggers and tilting the controller to clear away debris, cut ice chunks, and align pieces. This interactive process is incredibly fun and tactile, emphasizing the unique and sometimes eccentric aspects of the PS5 controller. Various features of the controller, from the small microphone to the touchpad, are ingeniously utilized in Astro Bot’s gameplay. The protagonist searches for weak spots along walls, clinging to his ship as you navigate through space by tilting the controller like a steering wheel.

The developers’ profound understanding of the PlayStation 5 is evident. Whether constructing a bridge with 100 robots on-screen, witnessing landscapes shattering into tiny fragments, or careening down a waterslide accompanied by inflatable balls, the gameplay is seamless and responsive. Whether testing if a log floats by slicing it with Astro’s jetpack or feeling the impact of each action through vibrations in the controller, every detail is finely tuned. Astro’s movements, jumps, and maneuvers are flawless, showcasing the level of precision in the game. This attention to detail sets this game apart, offering players a luxurious experience akin to five-star service.




Astro Bot puts the frog boxing gloves to good use. Photo: Sony/Team Asobi

Another aspect I appreciate about Astro Bot is its suitability for playing with children. While lacking two-player co-op, it functions well as a game to pass the controller among players. My 7-year-old enjoyed watching me play, while my 5-year-old explored safe areas of levels and handed me the controller when faced with challenges.


Some planets in Astro Bot feature hub areas resembling enclosed playgrounds where players can engage in activities like kicking a ball, battling harmless enemies, jumping into pools, and taking on acrobatic challenges. My kids found the setting charming and dynamic, with references to classic PlayStation games like Uncharted, God of War, and Ape Escape scattered throughout.

Astro Bot, like Astro’s Playroom, pays homage to PlayStation’s history and design while expanding beyond a mere tech demo to establish itself as one of the best platform games in recent memory, and, as a 90s kid, I’ve played my fair share. The PlayStation hasn’t had a captivating family game since LittleBigPlanet, and Astro Bot carries on that tradition of playful humor.

Astro Bot is set to release on September 6th, priced at £54.99.

Source: www.theguardian.com

Bill Gates advocates for AI as a valuable tool in achieving climate goals

Bill Gates argues that artificial intelligence will assist, not hinder, in achieving climate goals, despite concerns about new data centers depleting green energy supplies.

The philanthropist and Microsoft co-founder stated that AI could enhance technology and power grids’ efficiency, enabling countries to reduce energy consumption even with the need for more data centers.

Gates reassured that AI’s impact on the climate is manageable, contrary to fears that AI advancements might lead to increased energy demand and reliance on fossil fuels.

“Let’s not exaggerate this,” Gates emphasized. “Data centers add 6% to energy demand at most, and more likely around 2% to 2.5%. The key question is whether AI can accelerate a reduction of more than 6%. The answer is: certainly.”

Goldman Sachs estimates that AI chatbot tool ChatGPT’s electricity consumption for processing queries is nearly ten times more than a Google search, potentially causing carbon dioxide emissions from data centers to double between 2022 and 2030.
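
For a sense of scale, a commonly cited estimate, assumed here rather than taken from this article, puts a Google search at roughly 0.3 watt-hours; the quoted tenfold factor would then put a ChatGPT query near 3 Wh. The sketch below scales that gap to a hypothetical query volume.

    # Back-of-envelope scaling of the "ten times a Google search" estimate.
    # 0.3 Wh per search is a commonly cited figure, assumed here for illustration.
    WH_PER_SEARCH = 0.3
    WH_PER_QUERY = 10 * WH_PER_SEARCH        # ~3 Wh per ChatGPT query

    queries_per_day = 1e9                    # hypothetical query volume
    extra_wh = (WH_PER_QUERY - WH_PER_SEARCH) * queries_per_day * 365
    print(f"extra demand: ~{extra_wh / 1e9:.0f} GWh per year")   # ~986 GWh/yr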

Experts project that developed countries, which have seen energy consumption decline due to efficiency, could experience up to a 10% rise in electricity demand from the growth of AI data centers.

In a conference hosted by his venture fund Breakthrough Energy, Gates told reporters in London that the additional energy demand from AI data centers is likely to be offset by investments in green electricity, as tech companies are willing to pay more for clean energy sources.

Breakthrough Energy has supported over 100 companies involved in the energy transition. Gates is heavily investing in AI through the Gates Foundation Trust, which has allocated about a third of its $77 billion assets into Microsoft.

Gates’ optimism about AI’s potential to reduce carbon emissions aligns with peer-reviewed research suggesting that generative AI could significantly lower CO2 emissions by taking over tasks like writing and creating illustrations.

AI is already influencing emissions directly, as demonstrated by Google using deep learning techniques to reduce data center cooling costs by 40% and decrease overall electricity usage by 15% for non-IT tasks.

Despite these advancements, concerns remain about the carbon impact of AI, with Microsoft acknowledging that its indirect emissions are increasing due to building new data centers around the world.

Gates cautioned that the world could miss its 2050 climate goals by up to 15 years if the transition to green energy is delayed, hindering efforts to decarbonize polluting sectors and achieve net-zero emissions by the target year.

He expressed concerns that the required amount of green electricity may not be delivered in time for the transition, making it challenging to meet the zero emissions goal by 2050.

Gates’ warning follows a global report indicating a rise in renewable energy alongside fossil fuel consumption, suggesting that meeting climate goals requires accelerated green energy adoption.

This article was corrected on Friday, June 28. The Gates Foundation does not invest in Microsoft. The Gates Foundation Trust, which is separate from the foundation, holds Microsoft shares.

Source: www.theguardian.com

Achieving the Perfect Ratio of Omega-3 and Omega-6 Fatty Acids in Your Diet

Linda Steward/Getty Images

The advice is the same no matter where you look: if you want to reduce your risk of heart disease, obesity, cancer, and all sorts of other health problems, you should cut back on the “bad” saturated fats found in butter, red meat, and processed meat. Instead, you should consume “good” polyunsaturated fats, which means cooking with vegetable oil and focusing on leafy vegetables, fatty fish, and nuts and seeds. Simple.

Except that nothing is ever simple in nutrition. In this case, the complication arises from the growing recognition that not all “good” fats are created equal. Specifically, while omega-3 fatty acids are certainly good for us, omega-6 fatty acids may actually be damaging to our health.

The idea that the balance of omegas in the foods we eat can affect our health is well established. The typical Western diet has become increasingly high in omega-6s and low in omega-3s over the past 50 years, and over the same period the incidence of diseases associated with excessive inflammation, including heart disease and type 2 diabetes, has skyrocketed.

All this has led to the argument that, in addition to increasing the amount of omega-3 in our diets, we also need to reduce our intake of omega-6. But correlation is not causation. So can consuming too much omega-6, long thought to be beneficial, really be bad for you? If so, what foods should we eat more or less of to optimize…

Source: www.newscientist.com

Bisexual women anticipate achieving orgasms more frequently with women than with men

Women, especially straight women, report that they are less likely to reach orgasm during sex than straight men, a phenomenon known as the “orgasm gap.”

Zoonar GmbH / Alamy

Bisexual women expect to be more likely to orgasm when they have sex with another woman than when they have sex with a man, a study found.

Orgasm is usually a strong indicator of sexual satisfaction and often reflects satisfaction within a relationship. In a study of more than 52,000 adults in the United States, David Frederick at Chapman University in California and his colleagues found that 95 per cent of straight men say they usually or always reach orgasm during sex, compared with 65 per cent of straight women.

This difference is often referred to as the “orgasm gap,” and research suggests it almost completely disappears during masturbation or acts such as clitoral stimulation. In Frederick and his team’s study, 86 per cent of lesbian women and 66 per cent of bisexual women said they usually or always orgasm during sex.

To learn more about the orgasm gap, Grace Wetzel and her colleagues at Rutgers University in New Jersey asked 481 non-transgender bisexual women to imagine themselves in hypothetical sexual scenarios. About half were asked to imagine themselves with a man, and the other half with a woman.

Participants were asked to rate their expectations for orgasm on a scale of 1 to 7, with 1 indicating that they thought orgasm was very unlikely and 7 indicating that they thought it was very likely. The average score when imagining sex with a man was 4.88, compared to 5.86 when imagining sex with a woman. Although this may seem like a relatively small difference, statistical analysis suggests that the results are not due to chance.
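
For readers wondering what rules out chance here: comparing two groups’ mean ratings is typically done with an independent-samples t-test. The sketch below runs one on made-up ratings centred on the reported means; it is not the study’s data or analysis code.

    # Illustrative Welch's t-test on made-up 1-to-7 ratings -- not the study's data.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    # Hypothetical samples centred on the reported means (4.88 vs 5.86).
    with_man = np.clip(rng.normal(4.88, 1.5, size=240), 1, 7)
    with_woman = np.clip(rng.normal(5.86, 1.5, size=241), 1, 7)

    t, p = stats.ttest_ind(with_man, with_woman, equal_var=False)
    print(f"t = {t:.2f}, p = {p:.2g}")   # a small p-value means the gap is
                                         # unlikely to be down to chance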

In another part of the study, the researchers asked an additional 476 women, none of them transgender, to complete an online survey about their sexual experiences with recent or current partners. Just under 60 per cent of them were heterosexual; the rest were lesbian.

Lesbian women had an orgasm 78 percent of the time, compared to 65 percent of straight women. They also reported having higher orgasm expectations before sexual encounters, more actively trying to reach climax during sex, and receiving more clitoral stimulation.

“Research shows that clitoral stimulation is the key to female orgasm,” says Wetzel. “So women have more orgasms when they're with other women because it involves more frequent clitoral stimulation.”

The study found that women who have sex with women expect more clitoral stimulation. “The dominant heterosexual script focuses on penetration rather than clitoral stimulation, which leaves fewer opportunities for female orgasm,” says Wetzel.

But “sexual scripts can be malleable,” she says. “Heterosexual couples can reduce the orgasm gap in their relationships by prioritizing the sexual activities that the woman needs to reach orgasm.”


Source: www.newscientist.com

The Launch of Streamr Network 1.0 Mainnet: Achieving Decentralized Data Broadcasting as outlined in the 2017 Roadmap.

Zug, Switzerland, March 19, 2024, Chainwire

Streamr announced the launch of the Streamr Network 1.0 mainnet, a milestone that marks the completion of the network’s original 2017 roadmap. 1.0 introduces the full deployment of the $DATA token incentive layer, transforming the network into a fully featured, fully decentralized protocol, run and operated by its users.

The culmination of more than six years of research and development, three incentivized testnets, and overcoming technical hurdles that caused a last-minute launch cancellation, Streamr 1.0 marks the arrival of decentralized data broadcasting.

Main features of Streamr 1.0:

  • Fully expanded tokenomics: activation of the $DATA token incentive layer means that the Streamr network can operate autonomously of the core team as a neutral, fully decentralized messaging protocol.
  • New network roles: a peer-to-peer marketplace between sponsors, operators, and delegators.
  • Stream sponsorships: sponsors create and fund sponsorships, and operators earn income from them. These smart contracts manage reward distribution among the operators who run nodes and help relay data within the network.
  • Trackerless network architecture: the network moves to a trackerless architecture that leverages a globally distributed hash table (DHT) for greater efficiency and scalability.
  • New benefits for node operators: 1.0 gives node operators the opportunity to earn more revenue by allowing them to accept delegations and receive a portion of the resulting income. It also brings other enhancements, including the removal of per-IP node limits, instant reward claims from active sponsorships, and other quality-of-life improvements.

New use cases are unlocked:

1.0 sets the stage for exploring new use cases in areas such as decentralized physical infrastructure networks (DePIN), decentralized AI, and decentralized video streaming.

Decentralized video streaming: Streamr is exploring decentralized live video streaming, testing its ability to deliver scalable and stable video feeds to viewers at scale. By leveraging the network’s peer-to-peer protocols, Streamr eliminates dependence on centralized distribution points and enables viewers to contribute directly to the broadcast network while consuming content, optimizing efficiency and scalability.

DePIN: Streamr 1.0 enhances DePIN’s ability to move from a centralized data pipeline to a fully decentralized array of contributors. The network’s serverless, secure, and scalable framework is ideal for broadcasting data between connected devices and moving DePIN to a truly decentralized architecture.

Decentralized AI: the 1.0 milestone positions Streamr as a neutral data layer for artificial intelligence, providing secure data streams for AI development, trust, and transparency. Streamr allows AI models to connect with each other, share insights, tap into real-time tuning data and live content delivery, and collectively power intelligence. By integrating with decentralized frameworks, AI operations can become more open and verifiable.

These changes, along with exploring new use cases, highlight Streamr's commitment to pushing the boundaries of what is possible in the realm of decentralized technology.

About Streamr

Streamr is building a real-time data protocol for the decentralized web. This includes a scalable, low-latency, secure P2P network for data broadcast, distribution, and exchange. As part of this vision, Streamr built The Hub, a dApp that champions open data, allowing DePIN, AI, and Web3 builders to decentralize their tech stacks with real-time data flows. The Streamr project was started by real-time data veterans with experience in algorithmic trading and financial markets.

Contact

CMO
Mark Little
Streamr Network
media@streamr.net

Source: the-blockchain.com

Achieving 99.99923% Reflectance: The Ultimate Mid-Infrared Mirror

As reported in Nature Communications, a collaborative international effort has produced the first mid-infrared supermirror with extraordinary reflectivity. This innovation promises to significantly enhance environmental gas sensing and industrial processes, and represents a major leap forward in mirror technology.

Advanced infrared mirrors enhance climate and biofuels research through precise trace gas detection. An international research team from the United States, Austria, and Switzerland has demonstrated the first true supermirror in the mid-infrared spectral region. These mirrors are key to many applications, including optical spectroscopy for environmental sensing and laser cutting and welding for manufacturing.

Achieving near-perfect reflectance. In the field of high-performance mirrors, everyone is searching for the impossible: a perfectly reflective coating. In the visible wavelength range (between 380 nm and 700 nm), advanced metal mirrors achieve reflectivity as high as 99%, meaning 1 photon in 100 is lost with each reflection.

Impressive as that is, in the near-infrared region (between approximately 780 nm and 2.5 μm), mirror coatings reach a reflectance of 99.9997%, meaning that out of 1 million reflected photons, only three are lost.

There has been a long-standing desire to extend this level of supermirror performance into the mid-infrared (wavelengths from 2.5 μm to 10 μm and beyond), where it could enable advances in trace gas sensing related to climate change and biofuels, as well as industrial applications such as laser processing and nanofabrication. Until now, the best mid-infrared mirrors lost roughly one photon in every 10,000, about 33 times worse than near-infrared mirrors.
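
Those reflectance figures convert directly into photon loss per bounce, since loss = 1 − R. A few lines of arithmetic, using only numbers quoted in this article, reproduce the comparison.

    # Photon loss per reflection implied by the reflectance figures quoted above.
    mirrors = {
        "visible metal mirror (99%)":      0.99,
        "near-IR supermirror (99.9997%)":  0.999997,
        "best prior mid-IR (1 in 10,000)": 0.9999,
        "new mid-IR supermirror":          0.9999923,
    }
    for name, r in mirrors.items():
        print(f"{name}: {(1 - r) * 1e6:,.1f} ppm lost per reflection")

    # Prior mid-IR loss vs near-IR loss reproduces the "33 times worse" figure:
    print(f"ratio: {(1 - 0.9999) / (1 - 0.999997):.0f}x")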

International cooperation leads to a breakthrough. As explained in the article published in Nature Communications, Thorlabs’ Crystalline Solutions (Santa Barbara, California), the Christian Doppler Laboratory for Mid-Infrared Spectroscopy at the University of Vienna (Austria), the US National Institute of Standards and Technology (NIST), and the University of Neuchâtel (Switzerland) have together demonstrated the first true mid-infrared supermirror.

These mirrors lose only 8 out of 1 million photons and achieve a reflectance of 99.99923%. Achieving such extreme reflectance required a combination of mastery of materials, mirror design, and manufacturing processes.

Patterned 4-inch GaAs wafer with single-crystal GaAs/AlGaAs die; the final product is fused onto a coated silicon substrate. Credit: Georg Winkler

A new paradigm for mirror coatings. To realize this first-generation mid-infrared (MIR) supermirror, researchers devised and demonstrated a new paradigm in coatings, combining traditional thin-film coating techniques with new semiconductor materials and methods to overcome material limitations in the challenging mid-infrared region.

Garrett Cole, technical manager of the Crystalline Solutions team, said: “By extending this platform to longer wavelengths, our international collaboration has demonstrated for the first time a MIR coating method with less than 5 ppm of undesired absorption and scattering losses.”

These mirrors exploit the extremely high purity and superior structural quality of molecular beam epitaxy, an advanced process used to fabricate many different semiconductor devices, to produce single-crystal GaAs/AlGaAs multilayer films with negligible absorption and scattering. This starting material is then processed into high-performance mirrors using advanced microfabrication techniques, including direct “fusion” bonding onto high-quality conventional amorphous thin-film interference coatings deposited at the University of Neuchâtel.

Manufacturing these revolutionary mirrors was only half the challenge; the scientists also needed to measure them systematically to prove their superior performance. Gar-Wing Truong, principal scientist at Thorlabs Crystalline Solutions, said: “It was a huge team effort to bring together our equipment and expertise to clearly demonstrate total losses as low as 7.7 ppm, several times better than any traditional MIR coating technique had previously achieved.”

Co-lead author Lukas Perner, a scientist at the University of Vienna, added: “Our combined efforts in innovative mirror technology and advanced characterization methods have enabled us to demonstrate superior performance and break new ground in the MIR.”

Impact on environmental sensing and spectroscopy. The immediate application of these new MIR supermirrors is to significantly improve the sensitivity of optical devices used for measuring trace gases. These devices, called cavity ring-down spectrometers (CRDS), can detect and quantify trace amounts of important environmental markers, such as carbon monoxide.

The research team turned to NIST research chemists Adam Fleischer and Michelle Bailey, who have been working on this technology for years. In a proof-of-concept experiment that put these mirrors through their paces, Fleischer and Bailey showed that the mirrors were already superior to the state-of-the-art.

“Low-loss mirrors make it possible to achieve very long optical path lengths in small devices, in this case like compressing the distance from Philadelphia to New York City into a one-meter range,” says Bailey. “This is an important advantage for ultrasensitive spectroscopy in the mid-infrared spectral range, such as the measurement of radioisotopes important for nuclear forensics and carbon dating.”
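
Bailey’s comparison follows from standard cavity optics: a photon survives on average 1/loss reflections, so a one-metre cavity with parts-per-million losses folds an enormous path into a small box. The sketch below uses the high-finesse approximation finesse ≈ π/loss and the 7.7 ppm figure quoted above.

    import math

    # Figures quoted in this article: 7.7 ppm total loss per reflection, and a
    # one-metre cavity (the Philadelphia-to-New-York comparison).
    loss = 7.7e-6
    cavity_m = 1.0

    finesse = math.pi / loss            # high-finesse approximation, F = pi / loss
    bounces = 1 / loss                  # mean reflections before a photon is lost
    path_km = bounces * cavity_m / 1e3  # effective optical path length

    print(f"finesse ~ {finesse:,.0f}")           # ~408,000, matching ">400,000"
    print(f"effective path ~ {path_km:.0f} km")  # ~130 km, about Philadelphia-NYC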

Reference: “Mid-infrared supermirror with finesse greater than 400,000” by Gar-Wing Truong, Lukas W. Perner, D. Michelle Bailey, Georg Winkler, Seth B. Cataño-Lopez, Valentin J. Wittwer, Thomas Südmeyer, Catherine Nguyen, David Follman, Adam J. Fleischer, Oliver H. Heckl, Garrett D. Cole, 6 December 2023, Nature Communications. DOI: 10.1038/s41467-023-43367-z


Source: scitechdaily.com