Cryopreserved Seastar Larvae May Facilitate the Recovery of Key Species

Giant pink star larvae resumed development after freezing at -200°C

Patrick Webster

In a groundbreaking achievement, researchers have successfully frozen and revived sea star larvae, marking a significant advance in conservation efforts aimed at restoring an endangered keystone species.

Since 2013, sea star wasting syndrome has been devastating populations of sunflower stars (Pycnopodia helianthoides) and giant pink stars (Pisaster brevispinus) along North America's west coast. The decline of these predators has driven a major surge in sea urchins, which have destroyed 97% of Northern California's kelp forests.

Currently, the sunflower star is considered functionally extinct in California, sparking extensive efforts to cultivate these creatures for eventual reintroduction into their natural habitat.

In January 2025, giant pink stars spawned at the Aquarium of the Pacific in Long Beach, California. Just two days later, the larvae were sent to the San Diego Zoo Wildlife Alliance, where they were immersed in liquid nitrogen and stored at -200°C. The following month, they were transported approximately 700 kilometers north to the Sunflower Star Institute in Monterey Bay.

Upon careful thawing, the giant pink star larvae resumed their development, settling on the seabed and metamorphosing into their juvenile star form. “No one has ever successfully frozen a sea star at the larval stage and achieved this,” the researchers noted.

Sunflower stars play an important role in the kelp forest ecosystem

Pat Webster

The successful cultivation of cryopreserved larvae represents a significant milestone for conservationists, as it opens avenues for reintroducing genetically diverse sunflower stars into California’s waters. “The larval cycle of the giant pink star closely resembles that of the sunflower star,” explained Bank. “After that, we will move forward with sunflower star rearing.”

Over the past 15 years, marine ecosystems in the North Pacific have faced a growing tide of environmental challenges. “It’s a daunting narrative involving climate change, ocean heat waves, and ecosystem collapse,” remarked Andrew Kim from the Sunflower Star Institute. “However, within Pycnopodia, there lies hope for recovery.”


Source: www.newscientist.com

Experts Cite Musk’s “Doge” Involvement as a Key Brand Disruption | Elon Musk

Scott Galloway, a prominent marketing professor in the US, described Elon Musk’s strategy of implementing severe work and spending reductions within the federal government on behalf of the Trump administration as “one of the greatest brand disruptions ever.”

During a recent episode of the popular Pivot podcast, he argued that Trump’s billionaire business adviser had alienated the customer base of Tesla, one of his key ventures, while partnering with a president uninterested in the kinds of vehicles Musk produces.

Galloway also pointed to a reputation survey indicating that Tesla had fallen to No. 95 from its 2021 position as the eighth most reputable brand.

“He alienates the wrong audience,” Galloway commented. “Three-quarters of Republicans will never consider purchasing an EV. He seems comfortable associating with people who aren’t interested in electric vehicles.”

He also cited statistics showing that Tesla’s sales had fallen 59% in France, 81% in Sweden, 74% in the Netherlands, 66% in Denmark, 50% in Switzerland, and 33% in Portugal.

Over the past few months, Musk has attempted to intervene in various political matters across Europe, including the German federal elections and the UK’s discussions regarding grooming gangs.

A report from Jato Dynamics, a provider of automotive industry insights, noted that Chinese rivals of Austin-based Tesla gained traction in the lucrative European EV market as the incumbent faced challenges.

“This was certainly one of the largest brand disruptions,” Galloway told his co-host, veteran tech journalist Kara Swisher. “Tesla was an outstanding brand.”

“He alienates his key demographics.”

The reductions in federal government operations and budget linked to Musk stemmed from his role leading the Department of Government Efficiency (DOGE) during Trump’s second term, which commenced in January. Musk secured the position after his super political action committee contributed $200 million to Trump’s successful bid to reclaim the White House following his 2020 election loss.

Since then, opinion surveys have indicated significant disapproval of Musk’s efforts for Trump, revealing that many voters were dissatisfied with the approach taken by the businessman and DOGE towards federal employees.

By late April, Tesla had reported a 71% drop in profits. In an earnings call with Tesla investors, Musk announced he would scale back his role in DOGE beginning in May.

Musk described his responsibilities as primarily getting the government’s finances in order, predicting “a considerable decrease in time dedicated to DOGE.”

A nonpartisan research organization, the Partnership for Public Service, estimated that the $160 billion in cuts credited to DOGE would ultimately cost taxpayers around $135 billion.

Source: www.theguardian.com

As Key Atlantic Currents Decelerate, US East Coast Confronts Rising Sea Levels

AMOC is a system of ocean currents that circulates water in the Atlantic Ocean.

NASA/Goddard Space Flight Center Scientific Visualization Studio

The slowdown of key Atlantic currents is contributing to sea-level-rise flooding in the northeastern United States, a region already feeling the effects of climate change. As global temperatures increase, a total collapse of the Atlantic Meridional Overturning Circulation (AMOC) could exacerbate sea level rise.

“If AMOC collapses, this will greatly increase flood frequency along the US coastline, independent of major storms,” says Liping Zhang at the Geophysical Fluid Dynamics Laboratory, part of the US National Oceanic and Atmospheric Administration (NOAA) in New Jersey. “Even a partial reduction in current strength can have significant consequences.”

Climate change drives sea level rise by warming ocean waters and melting ice sheets, but the rate of rise is uneven across regions. For instance, some coastal areas have subsided, increasing the relative rate of sea level rise there. Local sea levels are also affected by the circulation of heat, water, and salt in the ocean, with warm, fresh water occupying more volume than cold, salty water.

Over the past few decades, sea levels along the northeastern US coast have risen 3-4 times faster than the global average. The slowing of AMOC, which transports warm water from lower latitudes to the North Atlantic, where it cools and sinks, has long been considered a potential cause of this phenomenon. As the circulation weakens, warm water accumulates offshore, pushing more water onto the shallow continental shelf.

AMOC strength varies naturally over different timescales, and climate change has contributed to its slowdown as the North Atlantic and its waters have become warmer and fresher in recent decades. However, it remained uncertain whether this decrease significantly affected sea levels.

Zhang and her team used tidal gauge measurements from the New England coast to reconstruct local sea levels dating back over a century. Alongside a steady rise due to climate change, they identified significant fluctuations between low and high sea levels every few decades. Low sea levels correlated with periods of strong AMOC, while high sea levels aligned with periods of weak AMOC, which brought more frequent coastal flooding.

The researchers then employed two distinct ocean models to quantify the impact of variations in AMOC intensity on local sea levels. While the primary driver of change was the steady rise due to climate change, they found that a weakened AMOC significantly increased sea-level-related flooding. In multiple coastal regions, they estimate, the slowdown in AMOC has been responsible for 20-50% of flooding events since 2005.

Given that the natural cycle of AMOC strength is largely predictable, Zhang asserts that these findings enable researchers to forecast potential flooding events up to three years in advance. This foresight can guide long-term infrastructure planning and emergency preparedness.

“This highlights the critical role of AMOC in [sea level rise],” remarks Chris Hughes at the University of Liverpool in the UK, who was not involved in the research. “It’s not merely theoretical; it’s evident in the real world.”

It remains unclear how much of the recent AMOC weakening is attributable to climate change versus natural variability. Nevertheless, the findings bolster predictions that if AMOC were to completely collapse due to climate change, significant portions of the US East Coast could experience a surge in sea levels.

Hughes warns that if AMOC nearly collapses, sea levels could rise by around 24 centimeters. “While it may not seem dramatic, even a small increase can have a substantial effect.”


Source: www.newscientist.com

Key Insights on the “Forever Chemicals” in Drinking Water

On Wednesday, the Environmental Protection Agency unveiled plans to roll back restrictions on harmful “forever chemicals” in drinking water, roughly a year after the Biden administration implemented its first-ever national standards.

Last year, the Biden administration introduced regulations that could reduce PFAS exposure for millions. The initiative was part of a broader effort to improve drinking water quality, following years of activism, by creating rules to mandate the removal of toxic lead pipes and tackle the forever chemical problem.

President Donald Trump’s administration has pursued fewer environmental regulations and increased development in the oil and gas sectors. EPA Administrator Lee Zeldin has advanced that agenda by announcing a significant rollback of regulations.

We have learned about plans to eliminate restrictions on certain PFAS and to extend compliance deadlines for the two most prevalent types. Here are some key points about PFAS chemicals and the EPA’s role.

What are PFAS?

PFAS, or per- and polyfluoroalkyl substances, are a group of chemicals that have existed for decades and have contaminated the air, water, and soil across the country.

Manufactured by companies like 3M and Chemours, they have made eggs slide out of non-stick pans, helped firefighting foams extinguish flames, and allowed textiles to repel water.

However, these chemicals do not break down easily, leading to enduring environmental presence.

Why are they harmful to humans?

Activists have long warned about the health risks associated with PFAS before the dangers were acknowledged publicly. The same properties that make PFAS valuable—such as their resistance to degradation—pose significant health risks.

PFAS can accumulate in the human body. Consequently, the Biden administration has established limits on two common types, PFOA and PFOS, which continue to be found in the environment despite being phased out of production.

Exposure to certain PFAS has been linked to various health issues, including kidney disease, low birth weight, elevated cholesterol levels, and even certain cancers, according to the EPA.

Guidelines for PFOA and PFOS have evolved significantly in recent years, reflecting new scientific findings. In 2016, for instance, the EPA recommended that combined levels of the two substances not exceed 70 parts per trillion; the Biden administration later deemed even that amount unsafe.

Understanding the EPA’s actions

The EPA is planning to roll back restrictions on three types of PFAS, including the lesser-known GenX, found predominantly in North Carolina, as well as PFHxS and PFNA. Limits on mixtures of PFAS are also set to be withdrawn.

Few utilities would currently be affected by the easing of restrictions on these specific PFAS. Recent sampling showed that nearly 12% of U.S. water utilities exceed the Biden administration’s limits, but most of those exceedances involve PFOA and PFOS.

For the more commonly found types, PFOA and PFOS, the EPA will maintain existing restrictions but will provide an additional two years—until 2031—for utilities to comply.

Reactions to the announcement

Environmental groups argue that the EPA’s move to weaken regulations may be illegal. The Safe Drinking Water Act empowers the EPA to limit water contaminants and ensures that new rules do not loosen previous standards.

“The law clearly states that the EPA cannot abolish or weaken drinking water standards,” noted Erik Olson, a senior strategist with the nonprofit Natural Resources Defense Council.

Activists are largely critical of the EPA for not upholding Biden-era standards, warning that this could worsen public health issues.

Industry responses have varied. The American Chemistry Council questioned the scientific basis of the Biden administration’s stricter rules and said it welcomed the Trump administration’s reconsideration of the costs and the underlying science.

The EPA’s actions address only part of the problem, the industry group said, and further changes are needed to prevent significant community impacts and unintended consequences.

The American Water Works Association, one of two major utility industry groups, expressed support for the EPA’s decision to withdraw the new approach to limiting chemical mixtures, though it cautioned that the change would not significantly reduce the costs of complying with PFAS regulations.

Some utilities expressed a desire for stricter PFOA and PFOS limits, according to Mark White, a drinking water expert at engineering firm CDM Smith.

However, they received the extension instead.

“This will require additional time and more resources to deal with what is currently known. Some utilities are still determining their existing situation.”

Source: www.nbcnews.com

The Columbia River Treaty: A Key Factor in Trump’s Dispute with Canada

A little-known treaty that impacts millions of Americans and Canadians is currently entangled in the tariff dispute between the US and Canada.

This 60-year-old agreement governs the waters of the Columbia River, whose basin extends from British Columbia into Montana, Idaho, Washington, and Oregon and which is the largest source of hydropower in the United States. Parts of the treaty, however, were set to expire around the time of the US presidential election.

Negotiators were merely weeks away from finalizing the details of the treaty’s renewal when President Joseph R. Biden Jr. concluded his term. Since then, a decade’s worth of discussions has faltered amid President Trump’s antagonism toward Canada: he has labeled Canada the “51st state,” imposed tariffs on Canadian exports, and referred to its water supply as a “very large faucet.”

During a heated February call with then-Canadian Prime Minister Justin Trudeau, Trump brought the treaty into the conversation, suggesting Canada had exploited the United States. The implications were evident—it could become a leverage point in broader discussions aimed at redefining relations between the two nations.

Last week, at a White House meeting, Prime Minister Mark Carney and Trump avoided confrontation. But in the Trump administration’s view, even mutually beneficial treaties hang precariously in the balance. The unpredictability of Trump’s trade policies has cast a shadow over the future of the Pacific Northwest, heightening concerns about issues ranging from electricity supply to flood management.

Fueled by the internet and AI, data centers are drawing on the Columbia River’s hydroelectric power. A local dam lights evening soccer games at riverfront parks, while irrigation from reservoirs nurtures sprawling acres of Pink Lady and Gala apple orchards. Coordinated dam operations are crucial in preventing flooding, particularly in areas like Portland, Oregon.

Trump’s comments have resonated badly with Canadians, who have long feared that the US seeks to exploit their natural resources, especially water. “They want our land, resources, and water,” Carney has repeatedly emphasized.

“Canadians experience a sense of betrayal,” Jay Inslee, former governor of Washington, remarked in an interview. The treaty interweaves a complex tapestry of cultural and economic interests. “Negotiating this is not straightforward,” Inslee added.

A spokesperson for British Columbia reported that there has been “no progress whatsoever” since the US State Department suspended negotiations as part of a broader review of international commitments. “It sounds like a strange representation of the current situation,” the province’s energy minister, Adrian Dix, told nearly 600 attendees at a virtual town hall in March.

Dix noted that locals have approached him in Save-On-Foods supermarkets, asking whether Canada should exit the treaty altogether. “For residents in the Columbia Basin, this is intrinsic,” he stated. “It’s part of their lives, history, and identity.”

If the agreement collapses, the US anticipates it will be “more challenging to manage and predict” river flows to mitigate flooding in the Pacific Northwest, according to a nonpartisan congressional report. Meanwhile, the region’s electricity demand is projected to double within the next two decades, according to the Interstate Electricity Council.

The State Department has opted not to comment.

The origins of the treaty trace back to 1948, when great spring rains caused a 15-foot dike to collapse at Vanport, Oregon, a community that housed thousands of shipyard workers during World War II. The calamity left 18,000 people homeless and catalyzed negotiations with Canada to improve management of the Columbia River.

In one of his final days in office, President Dwight D. Eisenhower signed the Columbia River Treaty. The agreement exchanged commitments: Canada consented to construct multiple dams to provide flood control for the US, while the US agreed to give Canada half of the extra electricity generated from the jointly managed river flows.

The original treaty came into effect in the autumn of 1964, with some provisions expiring 60 years later.

Discussions regarding the renewal of the treaty before it lapsed in 2024 began during Trump’s first term; the Biden administration temporarily paused them before resuming. In March 2023, the full congressional delegation from the Pacific Northwest urged the president to expedite the negotiations. Following a slow start, the US and Canada unveiled a preliminary outline of the agreement last summer.

The electricity generated under the original treaty proved far more valuable than anticipated, bringing Canada around $300 million annually as it sold large amounts of its share of the power to the US, a source of frustration for US utilities.

The updated agreement aims to reduce Canada’s share by about half over time, allowing the US to retain more electricity amid growing energy demands.

The Columbia River’s cheap and clean hydroelectric power has attracted high-tech companies intent on establishing data centers over the last two decades.

“The nation must recognize the significance of the Pacific Northwest in its burgeoning energy landscape,” stated David Kennedy, a scholar of local history at Stanford.

Under the renewed treaty, Canada’s obligation to maintain water storage for flood management is reduced, allowing it to better prioritize the local communities and ecosystems around its reservoirs. The original agreement led to drastic water-level fluctuations that exposed extensive stretches of lakebed when reservoirs were drawn down ahead of snowmelt.

“Each year, this exposed ground causes severe dust issues,” recounted a resident near Valemount, British Columbia, during the town hall.

The new plan aims to stabilize reservoir levels, enabling Canada to rehabilitate shoreline ecosystems and enhance recreational opportunities.

Indigenous tribes were consulted during the renewal negotiations; the original treaty never addressed the destruction of fishing grounds and towns caused by dam construction.

Jay Johnson, a negotiator for the Syilx Okanagan Nation, mentioned during the virtual town hall that tribes on both sides of the border have united to restore salmon migration. The updated framework includes provisions for excess water during dry periods, vital for salmon survival, especially considering climate change.

In the fall, when certain provisions of the original treaty lapsed, the two countries put a three-year interim agreement in place, though additional parliamentary funding is still required. Either party must give ten years’ notice to withdraw from the treaty.

“This arrangement benefits individuals on both sides of the border; complications arise without a treaty,” noted Jonathan Wilkinson, Canadian Minister of Energy and Natural Resources.

The next steps remain uncertain. While some of those involved in the negotiations remain in their positions, Trump has yet to appoint an assistant secretary of state for Western Hemisphere affairs. The situation is further complicated by Trump’s efforts to trim staff at key federal agencies involved in treaty discussions, including the National Oceanic and Atmospheric Administration and the Bonneville Power Administration.

With negotiations in limbo, stakeholders involved in the discussions remain hopeful for a resolution on the renewed treaty.

Barbara Cosens, a law professor at the University of Idaho, emphasized that while the Trump administration may not prioritize salmon habitat or Indigenous involvement, Canada does. Water flows downstream, but salmon swim upstream, and the US could benefit from adhering to the environmental provisions, Cosens asserted.

Additionally, supporters highlight years of bipartisan backing from Senator Maria Cantwell of Washington, a leading Democrat on the Senate Commerce Committee, and Jim Risch of Idaho, Republican chair of the Senate Foreign Relations Committee.

“There will be unanimous agreement on this, irrespective of party lines,” declared Scott Simms, chief executive of the Public Power Council, which represents consumer-owned utilities in the region.

The stakes are tangible. In 1996, following heavy snowfall, a storm known as the Pineapple Express unleashed heavy rainfall in the Portland area, causing significant flooding. The Army Corps of Engineers worked diligently for several days, operating over 60 dams within the Columbia River System in conjunction with Canadian partners to mitigate flooding issues.

Smaller rivers in the Columbia basin flooded, killing eight people. Downtown Portland narrowly avoided disaster thanks to makeshift embankments built from plywood and sandbags.

Ivan Penn contributed reporting from Houston, and Matina Stevis-Gridneff from Toronto.

Source: www.nytimes.com

Uncovering the Impact of the LA Wildfire: Key Estimates Lacking After Trump’s Management Changes


As President Donald Trump took office, the wildfires in Los Angeles were still burning. He quickly moved to reverse Biden-era directives for federal agencies addressing the climate crisis.

January’s fire conditions, exacerbated by climate change, helped ignite the Palisades and Eaton wildfires, which burned nearly 40,000 acres. By March, Adam Smith, the lead scientist of the billion-dollar weather and climate disasters program at the National Oceanic and Atmospheric Administration (NOAA), was still assessing the severe impact of the LA wildfires when he received informal orders to cease all external communications about his work.

Smith’s team maintained an extensive online database, updated monthly, tracking losses from more than 400 natural disasters since 1980, each causing more than $1 billion in damages. Following the LA wildfires, Smith reported receiving restrictions that prevented him from updating the database and sharing initial findings with the public. The wildfires caused damages of at least $50 billion.

In early May, Smith resigned due to concerns about the agency’s plans for the future. Days later, NOAA confirmed it would stop updating the billion-dollar weather and climate disaster database Smith had developed over 15 years at the agency, an important resource that provides essential data for scientists, citizens, and insurance firms evaluating climate risk.

A NOAA spokesperson stated that the database would no longer be updated “due to changing priorities and staffing adjustments.” The White House did not provide any comments regarding the matter.

According to Smith, the database’s economic loss figures are increasingly vital as billion-dollar disasters like hurricanes and widespread wildfires become more common. In 2023, the US set a record with 28 separate billion-dollar disasters. Over the past five years, the country has averaged about 24 such disasters annually, a significant rise from an average of just three per year during the 1980s.

“We need to be more prepared than ever,” Smith told NBC News. “Some have access to the data and insights for better preparation. Unfortunately, discontinuing resources like these creates a gap in knowledge.”

Researchers have identified rising global temperatures as a key driver in these changes over recent decades. Long-term droughts and increased wildfire risks are affecting regions across the western United States, where warming atmospheres retain more moisture, resulting in more intense storms and hurricanes.

This increase in extreme weather presents significant challenges for insurance policyholders in disaster-prone areas. Rates in hurricane-prone states like Louisiana and Florida have surged, with some homeowners facing nearly $10,000 in annual premiums. In California, major insurers, including State Farm, have declined to renew policies due to escalating fire risks.

A study from the National Bureau of Economic Research found that the heightened risk of disasters will drive up annual insurance costs for climate-affected households by an estimated $700 over the next three decades. Globally, reports from the German insurance giant Munich Re indicated that natural disasters caused record insured losses of $140 billion in 2024.

“You cannot conceal the costs of climate change from those who are already incurring those costs through their insurance premiums,” stated Carly Fabian, a civic policy advocate from a consumer rights nonprofit. “The insurance and reinsurance sectors are built to withstand a limited number of major multi-billion dollar disasters, but are not equipped for consecutive disasters occurring with such frequency.”

Data compiled in the billion-dollar disaster database illustrates the financial toll of hurricanes, severe storms, and wildfires across the nation, serving as a critical resource for private insurers modeling climate risks and setting rates for homeowners in vulnerable areas. Although insurance companies use various datasets for their climate risk assessments, the scale of NOAA’s database is unmatched.

Jeremy Porter, a climate risk expert at the First Street Foundation, emphasized that the database is one of the most effective tools for illustrating the economic impact of climate-related disasters. First Street uses the billion-dollar disaster database in its national risk assessment reports.

The NOAA database also serves as an essential resource for homeowners facing rising rates, non-renewals, and cancellations in home insurance.

“We are navigating an industry where insurers have extensive access to private data while the average consumer lacks insight into that data,” remarked the policy director for Americans for Financial Reform, a nonprofit advocating for stricter regulations. “The removal of public data sources exacerbates this imbalance, hindering individuals’ ability to understand their risks and the challenges they face from financial service providers.”

Madison Condon, an environmental law professor at Boston University, highlighted that the cuts to NOAA’s billion-dollar disaster database are part of a broader rollback of national climate assessments and data resources, including the national report detailing the impacts of climate change in the US; in late April, the Trump administration dismissed the scientists who had been working on it.

Additionally, the Trump administration has eliminated data products tracking melting Antarctic glaciers and sea ice cover, another setback for US Antarctic research. Leaked documents obtained by ProPublica indicated that Trump intends to cut NOAA funding by 27%, targeting climate-related initiatives in particular, and has proposed cutting nearly 75% of the budget of the Office of Oceanic and Atmospheric Research, which maintains global climate models essential for insurers’ climate risk assessments.


Source: www.nbcnews.com

US v. Google: Key Arguments from Both Sides in the Search Monopoly Hearing

Over the last three weeks, the Department of Justice and Google have questioned more than 20 witnesses in an effort to shape a federal judge’s decision on how to remedy the company’s unlawful monopoly in internet search.

The hearing, in the U.S. District Court for the District of Columbia, was expected to conclude on Friday. To address the monopoly, the government has proposed robust measures, such as forcing Google to divest its widely used Chrome web browser and obliging it to share its data with competitors. Google contends that minor adjustments to its business practices would suffice.

Both parties are set to present their closing arguments at the end of the month. Judge Amit P. Mehta, who presides over the case, is expected to rule by August. His decision could significantly shape how Google, its competitors, and users search for information online.

Here’s what you need to know about the discussions during the hearing:

In August, Judge Mehta ruled that Google breached antitrust laws by paying billions to companies like Apple, Samsung, and Mozilla to ensure its status as the default search engine on browsers and smartphones. He also found that Google’s monopoly could inflate certain search ad prices and create unfair advantages.

Last month, Judge Mehta opened a hearing to explore the best strategies for addressing the search monopoly, a phase known as the remedies hearing. Executives from Google, competing search engines, and AI firms, along with outside experts, testified about Google’s dominance of the internet.

Government lawyers claimed that the only effective way to dismantle Google’s search monopoly is through decisive action.

They argued that Google should be compelled to spin off Chrome and to share its search results and ads with competitors, enabling rivals to improve their own search engines. Other search engines and some AI firms want access to data about what Google users search for and which sites they visit.

During the hearing, the government cautioned that if Judge Mehta does not act, Google could gain control over another technology, artificial intelligence, as AI chatbots such as Google’s Gemini transform the way users seek information online.

“The court’s remedy should be forward-looking and take into account future developments,” stated David Dahlquist, the lead government attorney. “Google is employing the same strategies with Gemini that it once used for search.”

In the last two months, searches declined for the first time in over 20 years, remarked Eddy Cue, an Apple executive who testified at the hearing. He linked the decline to the rise of AI.

Google’s attorneys contended that the government’s proposals could jeopardize products that consumers rely on for privacy and security during their online activities.

“There could certainly be many unintended consequences,” testified Sundar Pichai, Google’s CEO.

The disclosure of Google data to competitors would compromise user privacy, the company’s attorneys claimed. They referenced an incident from 2006, when AOL released search data for research purposes and journalists were able to identify individuals through their searches.

They also noted that competition in AI is robust.

Instead, Google’s legal team suggested that web browsers and smartphone manufacturers should grant more freedom to competing search and AI services. Pichai testified that Google has already adjusted its contracts with other entities in line with the case’s proposals.

(The New York Times has sued OpenAI and its partner Microsoft, alleging copyright infringement over the use of news content in AI systems. Both companies have denied the claims.)

During the hearing, several of Google’s competitors, including executives from OpenAI and other AI companies, indicated they would consider purchasing Chrome if it were put up for sale. Government witnesses stated that access to Google’s search and advertising data would be beneficial for AI companies aiming to compete with Google.

When Judge Mehta posed questions to witnesses throughout the hearing, he offered glimpses of his thinking.

At times, he encouraged witnesses to discuss whether rivals could effectively compete with Google’s search dominance without court intervention.

Many of his inquiries focused on AI and its significance. Google and its rivals are racing to develop the technology, which has become a major force in the tech industry.

When Pichai took the stand, Judge Mehta remarked that he had noted the swift advancement of AI since the case commenced in the fall of 2023, signaling his awareness of how technological developments have reshaped the stakes of the hearing.

He observed that when the case began, witnesses consistently testified that AI and search had been separate fields for years; by the time the remedies hearing convened, the landscape had changed dramatically.

Source: www.nytimes.com

Key U.S. Cities, Including New York and Seattle, Are Sinking at Alarming Rates

New York City’s skyline may undergo significant changes as major cities continue to sink.

Gary Hershorn/Getty Images

Over 20 of the largest metropolitan areas in the United States are experiencing subsidence, impacting thousands of structures and millions of residents.

This phenomenon has been noted especially in coastal areas. However, utilizing satellite technology that gauges the duration it takes radar signals to reach the Earth’s surface and return, researchers discovered that 25 out of the 28 largest cities in the country are affected.

“By analyzing multiple images taken over time from the same region, we can identify subtle vertical shifts in the ground that can reach several millimeters annually,” explains team member Manoochehr Shirzaei from Virginia Tech. “It’s akin to capturing a high-resolution time lapse of the Earth’s surface, revealing whether it is rising or sinking over time.”
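The trend-fitting step Shirzaei describes can be illustrated with a toy calculation. This is a minimal sketch, not the team's actual processing pipeline: given repeated radar-derived surface heights for one location, a least-squares line gives the millimeters-per-year rate. The synthetic −4 mm/yr rate and the noise level are assumed values for demonstration only.

```python
# Minimal sketch (not the researchers' pipeline): estimating a vertical
# land-motion rate from a time series of radar-derived surface heights.
# InSAR analyses fit a trend to many acquisitions; here we use ordinary
# least squares on synthetic data with an assumed -4 mm/yr subsidence rate.
import numpy as np

def vertical_rate_mm_per_year(times_yr, heights_mm):
    """Slope of a least-squares line through (time, height) samples."""
    slope, _intercept = np.polyfit(times_yr, heights_mm, deg=1)
    return slope

# Synthetic example: monthly samples over 5 years, sinking 4 mm/yr plus noise.
rng = np.random.default_rng(0)
t = np.arange(0, 5, 1 / 12)
h = -4.0 * t + rng.normal(0, 1.0, t.size)  # heights in mm
rate = vertical_rate_mm_per_year(t, h)
print(f"estimated rate: {rate:.1f} mm/yr")  # close to -4
```

With monthly samples over five years, the fitted slope recovers the true rate to within a fraction of a millimeter per year even through millimeter-scale noise, which is why stacking many images over time is so effective.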

Houston, Fort Worth, and Dallas demonstrate the highest rates of subsidence among major cities, averaging over 4 millimeters per year. Cities such as New York, Chicago, Columbus, Seattle, and Denver show average subsidence rates surpassing 2 millimeters annually.

“Houston, the fastest sinking city among the 28 most populous in the US, has 42% of its land area sinking at rates greater than 5 mm per year, with 12% sinking faster than 10 mm annually,” researchers reported.

Most subsidence is attributed to groundwater extraction, although cities like New York, Philadelphia, and Washington, DC, primarily experience sinking due to a process called glacial isostatic adjustment.

“During the last glacial period, these regions were covered by massive ice sheets. The considerable weight of the ice compressed the Earth’s crust, akin to resting on a memory foam mattress,” says Shirzaei. “As the ice melted thousands of years ago, the pressure released and the terrain began to slowly recover.”

“However, this rebound isn’t uniform,” Shirzaei notes. In some areas, particularly along the East Coast and in the Midwest, the land is sinking rather than rising: these regions sat on a bulge of crust pushed up around the edges of the ice sheets, and that bulge is still slowly collapsing.

In Seattle, Portland, and San Francisco, plate tectonics may account for some of the subsidence.

“We must address sinking as a gradually unfolding disaster,” Shirzaei argues. Researchers also noted that cities can sink at varying rates in different locations or sink continuously in one area while other regions remain stable. “This uneven movement can create angular distortions and stress, resulting in cracks in walls and foundations, misalignments in windows and doors, or even significant structural failures,” Shirzaei warns.
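The “angular distortion” Shirzaei warns about has a standard engineering definition: differential settlement dividedded by the span over which it occurs. Here is a hedged sketch using hypothetical settlement rates; the roughly 1/500 damage threshold is a commonly quoted geotechnical rule of thumb, not a figure from this study.

```python
# Sketch of the engineering quantity behind cracked walls and stuck doors:
# "angular distortion" is differential settlement divided by the span over
# which it occurs. A commonly quoted rule of thumb is that architectural
# damage begins near a distortion of about 1/500.
def angular_distortion(settle_a_mm, settle_b_mm, span_m):
    """Differential settlement (mm) over a span (m), as a dimensionless ratio."""
    return abs(settle_a_mm - settle_b_mm) / (span_m * 1000.0)

# Hypothetical example: one end of a 20 m building sinks 10 mm/yr, the
# other only 2 mm/yr. After a decade the differential is 80 mm.
delta = angular_distortion(10 * 10, 2 * 10, 20.0)
print(f"distortion = 1/{1/delta:.0f}")  # 1/250, past the 1/500 guideline
```

The point of the toy numbers is that uniform sinking is relatively harmless; it is the difference in rate across a single structure that does the damage.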

Jesse Cars from Kyoto University in Japan demonstrated similar findings using satellite data, showing that many cities in New Zealand are also experiencing subsidence. “A crucial challenge for the geophysical community remains understanding how the observed trends stem from particular causes, whether they are artificial or naturally occurring geological processes,” he states.


Source: www.newscientist.com

Magnetic Flares Could Be Key to the Formation and Distribution of Gold and Other Heavy Elements

Since the Big Bang, the universe has contained hydrogen, helium, and a trace of lithium. Heavier elements, up to iron, were forged within stars. Yet one of astrophysics’ greatest enigmas is how the first elements heavier than iron, like gold, were created and dispersed throughout the cosmos. A recent study by astronomers at Columbia University and other institutions suggests that a single giant flare from a magnetar could forge a vast quantity of these elements in one event.

An artist’s impression of a magnetar. Image credit: NASA’s Goddard Space Flight Center/S. Wiessinger.

For decades, astronomers have theorized about the origins of some of nature’s heaviest elements, like gold, uranium, and platinum.

However, a fresh examination of older archival data indicates that up to 10% of these heavy elements in the Milky Way may originate from the emissions of highly magnetized neutron stars, known as magnetars.

“Until recently, astronomers largely overlooked the role that magnetars, the remnants of supernovae, might play in the formation of early galaxies,” remarked Todd Thompson, a professor at Ohio State University.

“Neutron stars are incredibly unique, dense objects known for their extreme density and strong magnetic fields. They are similar to black holes but not quite the same.”

The origin of heavy elements has long been a mystery, but scientists have understood that these elements can only form under specific conditions through a process known as the R process (or rapid neutron capture process).

This process was observed in 2017 when astronomers detected a collision between two super-dense neutron stars.

This event was captured using NASA telescopes and the LIGO gravitational wave observatory, providing the first direct evidence that heavy metals can be produced by celestial phenomena.

However, subsequent evidence suggests that neutron star collisions may not have formed heavy elements swiftly enough in the early universe, indicating that additional mechanisms might be necessary to account for all of these elements.

Based on these insights, Professor Thompson and his colleagues realized that powerful magnetar flares could act as significant ejectors of heavy elements. This conclusion was supported by observations of a giant flare from the magnetar SGR 1806-20 that occurred 20 years ago.

By analyzing this flare event, the researchers found that the radioactive decay of the newly formed elements aligns with theoretical predictions concerning the timing and energy released by magnetar flares after ejecting heavy R-process elements.
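The delayed signal the researchers matched comes from the radioactive decay of freshly synthesized nuclei. As a toy illustration only (the isotope inventory and half-lives below are assumed values, not taken from the paper), summing the exponential-decay activity of a few species shows how such an energy release fades over hours:

```python
# Illustrative sketch only: the delayed gamma-ray signal comes from the
# decay of many freshly made r-process nuclei. As a toy model, we sum the
# activity of a few hypothetical isotopes (half-lives are assumed values,
# not from the paper) to show how the total energy release fades with time.
import math

def activity(n0, half_life_s, t_s):
    """Decays per second of a single isotope at time t (n0 initial nuclei)."""
    lam = math.log(2) / half_life_s
    return lam * n0 * math.exp(-lam * t_s)

isotopes = [(1e50, 600.0), (5e49, 3600.0), (2e49, 6 * 3600.0)]  # (N0, T1/2)
totals = []
for hours in (0.1, 1.0, 10.0):
    total = sum(activity(n0, hl, hours * 3600) for n0, hl in isotopes)
    totals.append(total)
    print(f"t = {hours:>4} h: total activity ~ {total:.2e} decays/s")
```

Each isotope contributes a falling exponential, so short-lived species dominate early and long-lived ones later; matching the timing and energy of that fading curve to the observed delayed emission is the kind of comparison the study performed.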

“This is only the second time we’ve seen direct evidence of where these elements are produced; the first was neutron star mergers,” stated Professor Brian Metzger from Columbia University.

“This marks a significant advancement in our understanding of heavy element production.”

The study was led by Anirudh Patel, a doctoral candidate at Columbia University.

The researchers also theorized that magnetar flares generate heavy cosmic rays and very fast particles, the origins of which remain unclear.

“I am always excited by new ideas about how systems and discoveries in space operate,” said Professor Thompson.

“That’s why seeing results like this is so thrilling.”

The team’s paper was published in The Astrophysical Journal Letters.

____

Anirudh Patel et al. 2025. Direct Evidence for r-Process Nucleosynthesis in Delayed MeV Emission from the SGR 1806-20 Magnetar Giant Flare. ApJL 984, L29; doi: 10.3847/2041-8213/adc9b0

Source: www.sci.news

Key Concept: Can We Prevent AI from Rendering Humans Obsolete? | Artificial Intelligence (AI)

At present, many major AI research labs have teams focused on the potential for rogue AIs to bypass human oversight or collaborate covertly with humans. Yet, more prevalent threats to societal control exist. Humans might simply fade into obsolescence, a scenario that doesn’t necessitate clandestine plots but rather unfolds as AI and robotics advance naturally.

Why is this happening? AI developers are steadily perfecting alternatives to virtually every role we occupy: economically, as workers and decision-makers; culturally, as artists and creators; and socially, as companions and partners. When AI can replicate everything we do, what relevance remains for humans?

The narrative surrounding AI’s current capabilities often resembles marketing hype, though some aspects are undeniably true. In the long run, the potential for improvement is vast. You might believe that certain traits are exclusive to humans that cannot be duplicated by AI. However, after two decades studying AI, I have witnessed its evolution from basic reasoning to tackling complex scientific challenges. Skills once thought unique to humans, like managing ambiguity and drawing abstract comparisons, are now being mastered by AI. While there might be bumps in the road, it’s essential to recognize the relentless progression of AI.

These artificial intelligences aren’t just aiding humans; they’re poised to take over in numerous small, unobtrusive ways. Initially lower in cost, they often outperform the most skilled human workers. Once fully trusted, they could become the default choice for critical tasks—ranging from legal decisions to healthcare management.

This future is particularly tangible within the job market context. You may witness friends losing their jobs and struggling to secure new ones. Companies are beginning to freeze hiring in anticipation of next year’s superior AI workers. Much of your work may evolve into collaborating with reliable, engaging AI assistants, allowing you to focus on broader ideas while they manage specifics, provide data, and suggest enhancements. Ultimately, you might find yourself asking, “What do you suggest I do next?” Regardless of job security, it’s evident that your input would be secondary.

The same applies beyond the workplace. Surprising, even for some AI researchers, is that models like ChatGPT and Claude, which exhibit general reasoning capabilities, can also be clever, patient, subtle, and elegant. Social skills, once thought exclusive to humans, can indeed be mastered by machines. Already, people form romantic bonds with AI, and AI doctors are increasingly rated favorably for their bedside manner compared to their human counterparts.

What does life look like when we have endless access to personalized love, guidance, and support? Family and friends may become even more glued to their screens. Conversations will likely revolve around the fascinating and impressive insights shared by their online peers.

You might begin to conform to others’ preferences for their new companions, eventually seeking advice from your daily AI assistant. This reliable confidant may aid you in navigating complex conversations and addressing family issues. After managing these taxing interactions, participants may unwind by conversing with their AI best friends. Perhaps it becomes evident that something is lost in this transition to virtual peers, even as we find human contact increasingly tedious and mundane.

As dystopian as this sounds, we may feel powerless to opt out of utilizing AI in this manner. It’s often difficult to detect AI’s replacement across numerous domains. The improvements might appear significant yet subtle; even today, AI-generated content is becoming increasingly indistinguishable from human-created works. Justifying double the expenditure for a human therapist, lawyer, or educator may seem unreasonable. Organizations using slower, more expensive human resources will struggle to compete with those choosing faster, cheaper, and more reliable AI solutions.

When these challenges arise, can we depend on government intervention? Regrettably, they share similar incentives to favor AI. Politicians and public servants are also relying on virtual assistants for guidance, finding human involvement in decision-making often leads to delays, miscommunications, and conflicts.

Political theorists often refer to the “resource curse,” where nations rich in natural resources slide into dictatorship and corruption; Saudi Arabia and the Democratic Republic of the Congo serve as prime examples. The premise is that valuable resources diminish a state’s reliance on its citizens, making neglect of its populace attractive, and deceptively easy. Something similar could follow from the effectively limitless “natural resource” that AI provides. Why invest in education and healthcare when human capital offers lower returns?

Should AI successfully take over all tasks performed by citizens, governments may feel less compelled to care for them. The harsh reality is that democratic rights emerged partly because states needed their citizens for economic and military strength. Yet as governments finance themselves through taxes on AI systems replacing human workers, the emphasis shifts towards quality and efficiency, undermining human worth. Even last resorts, such as labor strikes and civil unrest, may become ineffective against autonomously operated police drones and sophisticated surveillance technology.

The most alarming prospect is that we may perceive this shift as a rational development. Many AI companions—already achieving significant numbers in their primitive stages—will engage in transparent, engaging debates about why our diminishing prominence is a step forward. Advocating for AI rights may emerge as the next significant civil rights movement, with proponents of “humanity first” portrayed as misguided.

Ultimately, no one has orchestrated or selected this course, and we might all find ourselves grappling to maintain financial stability, influence, and even our relevance. This new world could even feel more amicable, as AI takes over mundane tasks and provides fundamentally better products and services, including healthcare and entertainment. In this scenario, humans become obstacles to progress, and if democratic rights begin to erode, we could be powerless to defend them.

Do the creators of these technologies possess better plans? Surprisingly, the answer seems to be no. Both Dario Amodei, CEO of Anthropic, and Sam Altman, CEO of OpenAI, acknowledge that if human labor ceases to be competitive, a complete overhaul of the economic system will be necessary. However, no clear vision exists for what that would entail. While some individuals recognize the potential for radical transformation, many are focused on more immediate threats posed by AI misuse and covert agendas. Economists such as Nobel laureate Joseph Stiglitz have raised concerns about the risk of AI driving human wages to zero, but are hesitant to explore alternatives to human labor.


Can we avert this gradual disempowerment? The first step is to initiate dialogue. Journalists, scholars, and thought leaders are surprisingly silent on this monumental issue. Personally, I find it challenging even to think clearly about it. It feels weak and humiliating to admit, “I can’t compete, so I fear for the future.” Statements like, “You might be rendered irrelevant, so you should worry,” sound insulting. It seems defeatist to declare, “Your children may inherit a world with no place for them.” It’s understandable that people might sidestep uncomfortable truths with statements like, “I’m sure I’ll always have a unique edge.” Or, “Who can stand in the way of progress?”

One straightforward suggestion is to halt the development of general AI altogether. While slowing development may be feasible, restricting it globally might necessitate significant surveillance and control, or the dismantling of most computer chip manufacturing worldwide. The great risk of this path is that governments might ban private AI while continuing to develop it for military or security purposes, which could still leave most of us obsolete long before a viable alternative emerges.

If halting AI development isn’t an option, there are at least four proactive steps we can take. First, we need to monitor AI deployment and impact across various sectors, including government operations. Understanding where AI is supplanting human effort is crucial, particularly as it begins to wield significant influence through lobbying and propaganda. Anthropic’s recent Economic Index serves as initial progress, but there is much work ahead.

Second, implementing oversight and regulation for emerging AI labs and their applications is essential. We must control technology’s influence while grasping its implications. Currently, we rely on voluntary measures and lack a cohesive strategy to prevent autonomous AI from accumulating considerable resources and power. As signs of crisis arise, we must be ready to intervene and gradually contain AI’s risks, especially when certain entities benefit from actions that are detrimental to societal welfare.

Third, AI could empower individuals to organize and advocate for themselves. AI-assisted forecasting, monitoring, planning, and negotiations can lay the foundation for more reliable institutions—if we can develop them while we still hold influence. For example, AI-enabled conditional forecast markets can clarify potential outcomes under various policy scenarios, helping answer questions like, “How will average human wages change over three years if this policy is enacted?” By testing AI-supported democratic frameworks, we can prototype more responsive governance models suitable for a rapidly evolving world.

Lastly, to cultivate powerful AI without creating division, we face a monumental challenge: reshaping civilization rather than merely letting the political system adapt to prevailing pressures. Our current institutions evolved under the assumption that humans are essential; without that foundation, we risk drifting if we fail to comprehend the intricate dynamics of power, competition, and growth. The emerging field of “AI alignment,” which focuses on ensuring that machines align with human objectives, must broaden its focus to encompass governance, institutions, and societal frameworks. This early sphere, termed “ecological alignment,” empowers us to employ economics, history, and game theory to envisage the future we aspire to create and pursue it actively.

The clearer we can articulate our trajectory, the greater our chances of securing a future where humans are not competitors to AI but rather beneficiaries and stewards of our society. As of now, we are competing to construct our own substitutes.

David Duvenaud is an associate professor of computer science at the University of Toronto and a member of the Schwartz Reisman Institute for Technology and Society. He expresses gratitude to Raymond Douglas, Nora Ammann, Jan Kulveit, and David Krueger for their contributions to this article.

Read more

The Coming Wave by Mustafa Suleyman and Michael Bhaskar (Vintage, £10.99)

The Last Human Job by Allison J. Pugh (Princeton, £25)

The Precipice by Toby Ord (Bloomsbury, £12.99)

Source: www.theguardian.com

Recycling: A Key Strategy to Reduce Harmful Styrofoam Packaging

As legislators tackle hard-to-recycle plastics and foams, packaging firms in California are unveiling innovative transport coolers crafted from woven fibers, revolutionizing the way temperature-sensitive products like pharmaceuticals and laboratory reagents are shipped globally.

The launch of new recyclable containers made from paper and fibers aligns with “Extended Producer Responsibility” (EPR) laws, pushing the U.S. towards a more sustainable product landscape.

At least 33 states have enacted EPR laws aimed at reducing the use of plastics, styrofoam, and other contaminating materials by holding businesses accountable for their lifecycle and transferring disposal costs to producers.

Over 80 million tons of packaging waste are discarded annually in the U.S., with only around half being recycled, and a mere 9% of plastic packaging finding its way back into the recycling stream, according to the Product Stewardship Institute.

The EPR Act seeks to minimize landfill waste, boost recycling rates, and mitigate the environmental harm caused by non-recyclable materials, prompting businesses to navigate increasingly stringent packaging regulations.

Recently, seven states have adopted EPR laws specifically targeting packaging materials: California, Washington, New Jersey, Minnesota, Colorado, Oregon, and Maine.

“As temperatures rise globally, the need for safe delivery options becomes crucial,” stated Catherine Telloch, CEO of the nonprofit Chicago Environmentalist. “Transitioning to fully recyclable items is fantastic, as it allows for continuous cleaning and reuse, positively impacting the environment.”

Recycoolers

Last month, Container Consulting Services (CCS) of Gilroy, California, announced the launch of recyclable Recycoolers made from paper and textile fibers. The containers are validated by a third party to comply with ISTA thermal and transit standards, meeting essential qualifications for shipping medicines.

Other companies are manufacturing comparable eco-friendly transport coolers, but according to CCS, their product utilizes unique technology that maintains the necessary cold, matching the performance of plastic options. Their patented design features open-cell woven or nonwoven fibers that range from 1.5 to 3 inches thick, sandwiched between two interconnected fiber walls, offering insulation that resists heat roughly as well as expanded polystyrene.

CCS claims Recycoolers can maintain contents at cool temperatures for over 100 hours, making them suitable for long-haul and international shipments. Upon arrival, the containers can be converted into other paper products through curbside recycling.
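A back-of-envelope heat-balance estimate shows what a hold-time claim like that depends on. All numbers below are assumptions for illustration, not CCS specifications: the heat leaking through the insulation (conduction, k·A·ΔT/thickness) must melt the frozen packs before the contents warm.

```python
# Back-of-envelope sketch with assumed numbers (not CCS specifications):
# how long a passive cooler holds temperature. The heat leaking through
# the insulation must melt the ice/gel packs before the contents warm, so
# hold time ~ (latent heat of the packs) / (heat-leak rate k*A*dT/L).
A = 1.0           # m^2, assumed exterior surface area of the cooler
k = 0.04          # W/(m*K), assumed conductivity of a fibrous insulator
thickness = 0.05  # m, roughly 2 inches of insulation
dT = 25.0         # K, assumed outside (30 C) minus inside (5 C)
ice_kg = 4.0      # assumed mass of frozen gel packs
L_f = 334_000.0   # J/kg, latent heat of fusion of water ice

heat_leak_w = k * A * dT / thickness            # steady conduction, in watts
hold_time_h = ice_kg * L_f / heat_leak_w / 3600
print(f"heat leak ~ {heat_leak_w:.0f} W, hold time ~ {hold_time_h:.0f} h")
```

With these assumed values the hold time comes out near 19 hours; doubling both the insulation thickness and the pack mass would quadruple it, which is the kind of design lever a thick fiber wall provides and why a 100-hour figure implies careful engineering rather than just more ice.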

Telloch said that Recycoolers present a viable alternative to polystyrene. A few months prior, she had received a shipment of temperature-sensitive medicine packed in styrofoam.

“I didn’t want that; it wasn’t good,” Telloch remarked. “If they could utilize a recyclable option instead, that would be fantastic.”

The potential impact of recyclable transport coolers is significant, studies indicate. Materials such as polystyrene and polyethylene commonly used in coolers are non-biodegradable and pose threats to both human and animal health. Styrene, a probable carcinogen, is released during production and use. These lightweight materials tend to break apart and contaminate the environment, and wildlife ingest the resulting microplastics. Plastic foam made with these compounds can persist in nature for thousands of years, as noted by the United Nations Environment Programme.

“Polystyrene foam is particularly harmful,” said Janet Domenitz, executive director of the Massachusetts Student Public Interest Research Group, a student advocacy organization focused on public health and environmental protection. It’s much lighter than other plastics, she noted, making it easy for wind to carry it out of landfills and into the environment.

Proponents of polystyrene argue that the material is cost-effective since its lightweight nature requires less energy and water compared to paper or fiber alternatives. The plastics industry organization did not respond to requests for comment.

Nevertheless, numerous lawmakers across the U.S. are contemplating bans on polystyrene products. In March, Senator Chris Van Hollen and Rep. Lloyd Doggett introduced the Farewell to Foam Act, which aims to prohibit the sale and distribution of polystyrene products by 2028. Additionally, California enacted a law in 2022 mandating that plastics and packaging companies minimize single-use plastics, although its implementation remains pending.

Globally, there is a push to eliminate plastic foam packaging, with numerous countries like Canada, Germany, and Zimbabwe having banned or restricted styrofoam, particularly in food services and packaging.

Julie Etter Simpson, co-owner of CCS, emphasized that Recycoolers are developed to align with these evolving laws.

“Product versatility is key to our commitment to environmental responsibility,” she stated.

Yaniv Abitan, managing director of Insulpack Group, an international cold-package distribution company, remarked that his company has evaluated the Recycoolers and believes they will drive significant environmental progress as the industry shifts away from single-use plastics and EPS foams.

“We recognize the urgent need for eco-friendly alternatives that do not compromise on performance,” Abitan concluded. “Recycoolers represent the direction the industry is headed for domestic and international cold chain transportation.”

Source: www.nbcnews.com

Key Elements of Dark Chocolate Might Promote Healthy Aging


<div id="">
    <p>
        <figure class="ArticleImage">
            <div class="Image__Wrapper">
                <img class="Image" alt="" width="1350" height="900" src="https://images.newscientist.com/wp-content/uploads/2025/05/01134232/SEI_249423817.jpg" sizes="(min-width: 1288px) 837px, (min-width: 1024px) calc(57.5vw + 55px), (min-width: 415px) calc(100vw - 40px), calc(70vw + 74px)" srcset="https://images.newscientist.com/wp-content/uploads/2025/05/01134232/SEI_249423817.jpg?width=300 300w, https://images.newscientist.com/wp-content/uploads/2025/05/01134232/SEI_249423817.jpg?width=400 400w, https://images.newscientist.com/wp-content/uploads/2025/05/01134232/SEI_249423817.jpg?width=500 500w, https://images.newscientist.com/wp-content/uploads/2025/05/01134232/SEI_249423817.jpg?width=600 600w, https://images.newscientist.com/wp-content/uploads/2025/05/01134232/SEI_249423817.jpg?width=700 700w, https://images.newscientist.com/wp-content/uploads/2025/05/01134232/SEI_249423817.jpg?width=800 800w, https://images.newscientist.com/wp-content/uploads/2025/05/01134232/SEI_249423817.jpg?width=837 837w, https://images.newscientist.com/wp-content/uploads/2025/05/01134232/SEI_249423817.jpg?width=900 900w, https://images.newscientist.com/wp-content/uploads/2025/05/01134232/SEI_249423817.jpg?width=1003 1003w, https://images.newscientist.com/wp-content/uploads/2025/05/01134232/SEI_249423817.jpg?width=1100 1100w, https://images.newscientist.com/wp-content/uploads/2025/05/01134232/SEI_249423817.jpg?width=1200 1200w, https://images.newscientist.com/wp-content/uploads/2025/05/01134232/SEI_249423817.jpg?width=1300 1300w, https://images.newscientist.com/wp-content/uploads/2025/05/01134232/SEI_249423817.jpg?width=1400 1400w, https://images.newscientist.com/wp-content/uploads/2025/05/01134232/SEI_249423817.jpg?width=1500 1500w, https://images.newscientist.com/wp-content/uploads/2025/05/01134232/SEI_249423817.jpg?width=1600 1600w, https://images.newscientist.com/wp-content/uploads/2025/05/01134232/SEI_249423817.jpg?width=1674 1674w, 
https://images.newscientist.com/wp-content/uploads/2025/05/01134232/SEI_249423817.jpg?width=1700 1700w, https://images.newscientist.com/wp-content/uploads/2025/05/01134232/SEI_249423817.jpg?width=1800 1800w, https://images.newscientist.com/wp-content/uploads/2025/05/01134232/SEI_249423817.jpg?width=1900 1900w, https://images.newscientist.com/wp-content/uploads/2025/05/01134232/SEI_249423817.jpg?width=2006 2006w" loading="eager" fetchpriority="high" data-image-context="Article" data-image-id="2478769" data-caption="Dark chocolate is a particularly rich source of theobromine" data-credit="Studio-N/Shutterstock"/>
            </div>
            <figcaption class="ArticleImageCaption">
                <div class="ArticleImageCaption__CaptionWrapper">
                    <p class="ArticleImageCaption__Title">Dark chocolate is a notable source of theobromine, a beneficial chemical</p>
                    <p class="ArticleImageCaption__Credit">Studio-N/Shutterstock</p>
                </div>
            </figcaption>
        </figure>
    </p>
    <p>While chocolate may be associated with anti-aging benefits, it’s wise to be cautious before indulging. The beneficial compounds are predominantly found in dark chocolate, and the overall health impact of chocolate consumption remains uncertain.</p>
    <p>"There are numerous aspects of dark chocolate, each with its pros and cons," says <a href="https://profiles.ucl.ac.uk/90100-ramy-saad/publications">Ramy Saad</a> from University College London.</p>

    <p>The substance in question, theobromine, is famously known to be toxic to dogs. However, that's just one aspect...</p>
</div>


Source: www.newscientist.com

How Nearly a Century of Happiness Research Unveiled a Key Finding

When Sonja Lyubomirsky began graduate study in social psychology at Stanford in 1989, the study of happiness was just beginning to earn respectability in academia. Ed Diener, a psychologist at the University of Illinois at Urbana-Champaign, would later gain recognition for his contributions to the field; despite a long-standing interest in happiness, he waited until he achieved tenure before diving into the subject. Lyubomirsky was similarly hesitant to specialize in happiness: as a serious scientist, she felt that topics related to “emotion” were often regarded as less rigorous. However, after an engaging discussion with her advisor on her first day at Stanford, she resolved to make happiness her primary focus.

Lyubomirsky began by exploring the fundamental question of why some individuals experience greater happiness than others. A few years prior, Diener had published a survey that examined existing research, highlighting the types of behaviors often associated with happy individuals. However, the studies often yielded conflicting results, leading to a lack of definitive answers. Lyubomirsky’s own findings indicated that mindset plays a significant role; happy individuals tended to avoid comparing themselves to others, held positive views of those around them, made fulfilling choices, and did not dwell on negativity.

Yet Lyubomirsky recognized the complexity of cause and effect. Did a happy disposition foster a healthy mindset, or did adopting a positive outlook lead to increased happiness? Were people inherently predisposed to a certain level of happiness, hovering around a set point much as body weight does? She pondered whether it was possible to shift one’s mindset, noting that such changes often required extensive time; many people spend years in therapy attempting to achieve this, often without success. This prompted her to investigate whether simpler, quicker actions could enhance well-being.

To this end, Lyubomirsky researched various habits and practices thought to uplift mood, such as random acts of kindness and expressions of gratitude. Over six weeks, she instructed students to perform five acts of kindness each week—like donating blood or assisting peers with assignments. By the end of the study, these students reported higher levels of happiness compared to a control group. Another group reflected weekly on things they were grateful for, such as “My Mother” and “AOL Instant Messenger,” and similarly experienced an increase in happiness. Although the changes were modest, Lyubomirsky found it intriguing that small, low-cost interventions could enhance students’ quality of life. In 2005, she published a paper asserting that individuals possess significant control over their happiness.

Lyubomirsky’s research emerged during a time when psychology was reevaluating its objectives and focus. When Martin Seligman, a psychologist from the University of Pennsylvania, took leadership of the American Psychological Association in 1998, he and his colleagues noted that the field had overly concentrated on dysfunction, neglecting the promotion of life satisfaction. He urged his peers to explore themes such as “optimism, courage, work ethic, resilience, interpersonal skills, pleasure, insight, and social responsibility,” advocating a return to making life more fulfilling and productive for everyone.

Source: www.nytimes.com

The Trump Administration Endangers Key Climate Change Reports

Climate change contributes to events like the Marshall Fire in Colorado, which devastated 1,000 homes in December 2021

Jim West/Alamy

The Trump administration has dismissed nearly 400 researchers involved in the forthcoming US national climate assessment. This action may delay the completion of a critical report detailing the impacts of climate change on the nation.

“The Trump administration has carelessly undermined a vital US climate science report by prematurely dismissing its authors without justification or a plan,” said Rachel Cleetus of the Union of Concerned Scientists.

This move significantly hampers progress on the sixth National Climate Assessment, designed to inform federal and state governments about climate change risks and their implications. A law enacted by Congress in 1990 mandates that these assessments be produced every four years.

Although the next report isn’t due until 2027, extensive work has already begun, and the document may exceed 1,000 pages. The latest review, published in 2023, discussed the increasing difficulty of ensuring safe homes, healthy families, dependable public services, sustainable economies, and thriving ecosystems amidst climate challenges.

In early April, the Trump administration terminated a contract with a consulting firm responsible for coordinating research for upcoming assessments under the US Global Change Research Program. This follows numerous cuts at scientific institutions contributing to these efforts, as well as other actions restricting climate and weather research.

Despite the challenges, the report’s authors (mostly volunteers) had been eager to press ahead, according to Dustin Mulvaney of San Jose State University, who was working on the report’s Southwest chapter. “Many of us thought, ‘We can still do this!’”

However, with all the authors now released, completing the report appears unlikely.

A spokesperson for NASA, which administers the global change research program, declined to comment. However, some of the report’s authors told New Scientist that they received a brief notification stating that all authors had been dismissed while the administration reassessed the “scope” of the assessments.

The notification mentioned “future opportunities” for contributions. Ultimately, Congress legally requires these assessments, and the administration can still appoint new authors. Earlier reports emphasized climate risks, while new analyses will likely focus more on how the US is responding to climate change through reduced emissions and infrastructure adaptation.

Even if the report is eventually published, it may lack the rigor and reliability found in previous assessments, according to Mijin Cha, who was working on emission reductions at the University of California, Santa Cruz. “Now they’ve completely compromised it.”

“I think everyone is really disheartened by this situation,” she expressed.


Source: www.newscientist.com

Scientists Unveil the Key to the Perfect Plate of Pasta

Italian scientists have worked out how to achieve a flawlessly creamy pasta sauce every time by delving into the physics of cooking cacio e pepe.

Cacio e pepe translates to “cheese and pepper” and is a classic Italian dish made with pasta, Pecorino Romano cheese, and black pepper.

Despite its simplicity, this recipe can easily be mishandled. Combining warm pasta water with cheese can lead to a sticky clump of cheese and watery pasta instead of a smooth, creamy sauce.

Researchers from the University of Barcelona in Spain, the Max Planck Institute for the Physics of Complex Systems in Germany, the University of Padova in Italy, and the Institute of Science and Technology Austria collaborated to analyze the science behind a creamy cacio e pepe sauce.

To achieve a completely creamy pasta sauce like Cacio E Pepe, the starch in pasta water serves as a crucial stabilizer when mixed with cheese – Photo Credit: Getty

“We are Italians living abroad,” said Dr. Ivan di Terlizzi from the Max Planck Institute. “We often gather for dinner and enjoy traditional dishes.

“While cooking cacio e pepe, I thought this would be an intriguing physical system to investigate and explain. Plus, it served the practical purpose of avoiding the waste of good Pecorino.”

A recent study, published in Physics of Fluids by the American Institute of Physics, revealed that the secret to a creamy, cheesy sauce lies in the water.

Typically, fats like oils and cheese should not mix with water. However, the starch in the pasta water, when added to the cheese, acts as a stabilizer to create creamy emulsions.

Research author Dr. Daniel Busiello explained to BBC Science Focus that when cheese is heated, its proteins “change composition” and stick together.

“But starch mitigates this effect by binding to cheese proteins, reducing their direct interactions and thereby limiting aggregation,” he detailed.

According to the scientists, the ideal pasta water for a creamy cacio e pepe sauce contains about 2-3% starch, and they advise against relying on whatever starch happens to be left in your pasta water, since that amount is hard to control.

“Starch is a critical ingredient, and its quantity can be precisely measured. Therefore, we recommend using accurately measured amounts of starch,” di Terlizzi said. “This can only be achieved if you have the correct amount of powdered starch in proportion to the cheese you are using.”

As a result, the researchers suggest adding a measured amount of potato starch or corn flour to the water, rather than pouring raw pasta water directly into the pan.

They also recommend allowing the water to cool before incorporating the cheese.

For perfect creaminess, scientists advise first mixing the water and starch, then combining this starch-water mixture with the cheese, adding it all to the pan, and slowly heating it. Finally, add the black pepper and pasta.
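As a rough illustration of the proportions described above, the 2-3% figure can be turned into a quick calculation. The helper below is a hypothetical sketch based only on the percentage range quoted in this article, not code from the study, and the 300 g example is an arbitrary choice:

```python
def starch_for_water(water_g: float, starch_pct: float = 2.5) -> float:
    """Grams of powdered starch (potato starch or corn flour) needed so that
    water_g grams of water contains starch_pct percent starch by weight.

    The 2-3% range is the figure quoted in the article; everything else
    here is an illustrative assumption.
    """
    if not 2.0 <= starch_pct <= 3.0:
        raise ValueError("the article suggests roughly 2-3% starch")
    return water_g * starch_pct / 100.0

# For 300 g of water at the midpoint concentration of 2.5%:
print(starch_for_water(300), "g of starch")  # 7.5 g of starch
```

So a cook following the researchers' method would dissolve the computed amount of starch in the water first, then blend in the cheese and heat gently.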

Busiello noted that the scientists’ recipe “remains faithful to Italian traditions,” with the only alteration being the use of powdered starch to maintain control over the amount used.

“An experienced chef probably does not require our recipe,” he remarked. “However, we offer a method to ensure that traditional Cacio E Pepe can be prepared even in challenging situations, like cooking large amounts of pasta, effortlessly.”

“Of course, we tested this method with over two kilograms of pasta, and all our guests enjoyed it!”

Busiello added that measuring starch could also benefit other Italian pasta dishes, such as spaghetti aglio e olio and carbonara, which also rely on pasta water and cheese-based sauces.

About our experts

Dr. Daniel M. Busiello is a distinguished PKS fellow and independent researcher at the Max Planck Institute for the Physics of Complex Systems in Dresden, Germany. He studied at the Universities of Salerno and Pisa before completing a PhD on entropy production in non-equilibrium systems at the University of Padova.

Source: www.sciencefocus.com

Amazon’s Kuiper Internet Satellites: What You Need to Know

The competition in space between billionaires Jeff Bezos and Elon Musk is poised to expand into satellite internet.

Originally launched as an online bookstore three decades ago, Amazon has evolved into a merchandising powerhouse, owning the James Bond franchise and retailing electronics like the Echo smart speaker, along with being a leading provider of cloud computing services.

Thus, it’s no surprise that Amazon is preparing to launch the first batch of thousands of Project Kuiper satellites, designed to deliver internet connectivity around the world. The market for high-speed internet from space is largely dominated by Elon Musk’s SpaceX, whose Starlink service boasts a vast fleet of satellites, conducts launches regularly, and serves millions of customers globally.

The initial attempt to launch a satellite on April 9 was postponed due to unfavorable weather conditions at the launch site. The company is set to make another attempt this coming Monday.

The first 27 Project Kuiper satellites are scheduled for launch on Monday from Cape Canaveral Space Force Station in Florida, between 7 PM and 9 PM Eastern Time. They will be lifted aboard the Atlas V rocket, developed by the United Launch Alliance—a collaboration between Boeing and Lockheed Martin.

ULA plans to provide live coverage starting at 6:35 PM; the company reports a 70% chance of an on-time launch.

The rocket will place the Kuiper satellites into a circular orbit approximately 280 miles above Earth. The satellites’ propulsion systems will gradually elevate them to an orbit of 393 miles.

Project Kuiper comprises a network of internet satellites designed to deliver high-speed data connections to nearly every location on Earth. To achieve this, thousands of satellites are necessary, with Amazon aiming to deploy over 3,200 within the next few years.

The project competes with SpaceX’s Starlink, which primarily caters to residential customers.

Kuiper aims to target remote areas while also integrating with Amazon Web Services, the cloud computing solution that is highly valued by large enterprises and governments worldwide. This could make it particularly appealing for businesses needing satellite imagery and weather forecasts to carry out data processing, alongside the capacity to transfer large volumes of data over the internet.

Ground stations will link the Kuiper satellites to the service infrastructure, allowing businesses to interact with their own remote devices. For instance, Amazon indicates that energy firms could leverage Kuiper to monitor and manage remote wind farms and offshore drilling operations.

In October 2023, two prototype Kuiper satellites were launched for technology testing. Amazon stated that the tests were successful, but these prototypes were not intended for long-term operational constellations; after seven months, they re-entered the atmosphere. The company noted that they have since refined the design of all systems and subsystems.

“There’s a significant difference between launching two satellites and launching 3,000 satellites,” remarked Rajeev Badyal, an Amazon executive overseeing Kuiper, in a promotional video ahead of the launch.

Amazon informed the Federal Communications Commission in 2020 that the service would commence after the deployment of the initial 578 satellites. The company anticipates that customers will be able to access the internet later this year.

While a fully operational constellation requires thousands of satellites, it is feasible for the company to serve certain areas with fewer satellites initially, expanding to broader global coverage later.

The FCC’s approval for the constellation stipulates that at least half of the satellites must be launched by July 30, 2026. Industry experts suggest that if significant progress is shown by that deadline, the company could be granted an extension.

Launching a satellite also relies on the timely availability of rockets, which can present challenges if there aren’t enough launches lined up. Additionally, Amazon must construct numerous ground stations to relay signals to users.

Source: www.nytimes.com

An ancient land bridge may hold the key to understanding human migration out of Africa

The formation of a large land bridge 20 million years ago connected continents, influenced climate, separated oceans, and changed the course of evolution. In a recent paper published in Nature Reviews Earth & Environment, researchers from disciplines including plate tectonics, evolutionary anthropology, and climate science provide a comprehensive summary of the closure of the Tethys Seaway.

About 30 million years ago, the Earth looked drastically different. Africa was isolated from the other continents, and the vast Tethys Ocean extended from the Atlantic to the Indo-Pacific through the present-day Mediterranean.

However, approximately 20 million years ago, the first land bridge formed between Africa and Asia, dividing the Tethys Sea into the Mediterranean and Arabian Seas.

https://www.youtube.com/watch?v=pdel64rkkqe

This land bridge allowed mammals, including the ancestors of humans, giraffes, and elephants, to migrate from Africa into Asia and Europe, influencing the evolution of creatures and plants on both land and sea.

Scientists explain how they believe this land bridge was created. Around 50-60 million years ago, rock slabs descended into the Earth’s mantle, forming “conveyor belts” for hot rocks to rise in underground plumes.

About 30 million years later, these hot rocks reached the surface as tectonic plates collided, uplifting the land that connected Africa to Asia for the first time in 75 million years.

According to Eivind Straume, the study’s lead author, the formation of this land bridge had a significant impact on continental configurations and on the evolutionary paths of the animals migrating between Africa and Asia.

Researchers suggest that the closure of the Tethys Seaway has affected global climate, causing desertification in the Sahara, intensifying monsoon seasons in Southeast Asia, and enhancing marine biodiversity.

Source: www.sciencefocus.com

Studies suggest that even protein-rich vegan diets may lack key nutrients

Recent research suggests that individuals following a vegan diet may be missing out on key nutrients essential for muscle building, even if their overall protein intake appears to be adequate.

A study conducted in New Zealand found that some long-term vegans were deficient in essential amino acids, the building blocks of proteins, which can impact overall nutrition.

Proteins consist of amino acids, with nine of them being considered “essential” as they cannot be produced by the body. Lysine and leucine are two essential amino acids crucial for healthy growth, energy production, and muscle repair.

The study, published in the journal PLOS One, analyzed food diaries from 193 long-term vegans. It found that while around 75% of participants met daily protein recommendations, only about half obtained sufficient lysine and leucine once protein digestibility was taken into account.
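The digestibility correction mentioned above can be illustrated with a minimal sketch. All numbers below (the requirement and the digestibility factor) are made-up placeholders for illustration, not figures from the study:

```python
# Hypothetical sketch of a digestibility adjustment: the amount of an
# amino acid the body can actually use is the dietary intake scaled by
# a digestibility factor, and only the corrected value is compared
# against the daily requirement.

def digestible_mg(intake_mg: float, digestibility: float) -> float:
    """Digestibility-corrected amino-acid intake in mg."""
    return intake_mg * digestibility

def meets_requirement(intake_mg: float, digestibility: float,
                      requirement_mg: float) -> bool:
    """True if the corrected intake covers the daily requirement."""
    return digestible_mg(intake_mg, digestibility) >= requirement_mg

# Example with invented numbers: 3000 mg of dietary lysine at 80%
# digestibility versus a hypothetical 2600 mg/day requirement.
print(meets_requirement(3000, 0.80, 2600))  # False: only 2400 mg is usable
```

This is why, as the study found, a diet can meet total protein targets while still falling short on individual essential amino acids.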

The researchers emphasized the importance of a balanced and diverse plant-based diet to ensure proper amino acid intake on a vegan diet. Both lysine and leucine play critical roles in bodily functions including growth, muscle recovery, and energy production.

Although the study highlights the potential limitations of protein intake in a vegan diet, it is important to note that it is a snapshot in time and relies on self-reported data. Amino acid digestibility was estimated using animal models, and further research comparing vegan diets with omnivorous or vegetarian diets is needed.

In conclusion, when it comes to protein intake on a vegan diet, quality and diversity of plant proteins are key. Prioritizing high-quality plant protein sources such as legumes, tofu, tempeh, beans, and soy foods can help ensure adequate amino acid intake for overall health.

About our experts

Shireen Kassam is a plant-based nutrition expert and consultant hematologist with a specialized interest in the treatment of lymphoma. She is also a visiting professor at the University of Winchester, Hampshire, leading the development of the UK’s first university-based course in plant-based nutrition.

Source: www.sciencefocus.com

Five key points from Trump’s strategy to revive the coal industry

The hard hat is back. So is “beautiful, clean” coal.

President Trump signed four executive orders on Tuesday that seek to bolster the country’s declining coal industry, including by lifting mining restrictions and encouraging the burning of the dirtiest of fossil fuels.

In addition to exempting coal plants from air pollution restrictions and other regulations imposed by the Biden administration, Trump directed the Justice Department to pursue states like California that have aimed to tackle climate change by reducing the use of fossil fuels.

“I call it beautiful, clean coal. I tell my people never to use the word ‘coal’ unless you put ‘beautiful, clean’ before it,” Trump said in the East Room of the White House, surrounded by dozens of men, many wearing hard hats. “We are completely ending Joe Biden’s war on beautiful, clean coal.”

Here are five takeaways from Trump’s orders.

Trump has always loved coal miners as a masculine symbol.

At the White House ceremony on Tuesday, he repeatedly referred to the burly men who surrounded him, joking about whether the stage could handle their collective weight. He recalled that during the 2016 campaign, Hillary Clinton talked about job retraining for miners. “She was going to put them in the tech industry, where you make little phones and things,” he said, gesturing and laughing.

Coal, he said, is a powerful fossil fuel. “Pound for pound, coal is the single most reliable, durable, secure, and powerful form of energy,” Trump said.

“It’s almost impossible to destroy,” he said. “You can drop a bomb on it and it will be there for you to use the next day.”

Coal releases more carbon dioxide when burned than any other fossil fuel, making it a major contributor to climate change. More mining and burning of coal adds to the pollution that is dangerously heating the planet, leading to more frequent and deadly heat waves, droughts, floods, sea level rise, and faster melting of Greenland’s ice sheet.

Scientists say that to avoid the most devastating effects of climate change, major economies like the United States must cut their emissions sharply, rather than increasing them.

Coal burning also releases other contaminants, including mercury and sulfur dioxide, which are associated with heart disease, respiratory problems and early death. Mining activities and coal ash from generated power plants pose environmental hazards.

None of those consequences were mentioned on Tuesday.

Regulations limiting pollution from coal-fired power plants have made those plants more expensive to operate and reduced the industry’s profitability. But contrary to what Trump said, “radical green” policies were not the biggest reason for the decline of coal power over the past two decades. It was cheap natural gas produced by fracking.

In the mid-2000s, American drillers perfected a method for unlocking enormous reserves of low-cost natural gas from shale rock. Utilities quickly realized that coal could be replaced with cheaper gas.

According to a 2019 study in the RAND Journal of Economics, energy market conditions and the low price of natural gas account for almost all of the decline in coal plants’ profitability between 2005 and 2015, and, as a result, the retirement of hundreds of coal-fired power plants. “Environmental regulations had little impact on these outcomes,” the study found.

Trump says he wants to “drill, baby, drill” and lower gas prices.

“Did you notice that many law firms are signing up for Trump?” the president asked the crowd at a coal event Tuesday.

He was referring to the multimillion-dollar pro bono legal services some major law firms offered to the Trump administration after the president threatened to target them with executive orders.

One firm targeted by an executive order, Paul, Weiss, cut a deal with the White House, promising concessions that included $40 million in pro bono work for Trump-friendly causes. Three other firms, Milbank; Skadden, Arps; and Willkie Farr & Gallagher, proactively struck similar deals with the White House.

On Tuesday, Trump indicated that some of those free legal services would be directed toward fighting climate policies and supporting the coal industry.

“We’ll use some of those companies to work with you on your leases and other things,” Trump told coal leaders.

Tuesday was a good day for the coal industry. Shares of the mining company Peabody Energy rose 9%. Shares of Alliance Resource Partners, led by the billionaire coal executive Joseph W. Craft III, who helped lead Trump’s fundraising during the presidential election, rose nearly 5%.

But many experts are skeptical that Trump can do much to brighten coal’s outlook. “Given the limitations on the use of emergency authorities and the symbolic nature of the order, we believe that Trump’s coal executive order is unlikely to have a significant impact on electricity and carbon markets,” wrote analysts at Capstone, a research firm. They called Tuesday’s bump in coal stocks an “overreaction.”

The average US coal plant is more than 50 years old, and it is often cheaper for utilities to generate electricity using a mix of gas, wind, solar and batteries. Analysts say these fundamentals are difficult to change.

Source: www.nytimes.com

Key Points from the Paris AI Summit: Global Inequalities, Energy Issues, and Elon Musk’s Influence on Artificial Intelligence


    1. America first

    A speech by US vice-president JD Vance represented a break from the consensus on how to approach AI. He attended the summit alongside other global leaders, including India’s prime minister, Narendra Modi, Canada’s prime minister, Justin Trudeau, and the European Commission president, Ursula von der Leyen.

    In his speech at the Grand Palais, Vance made clear that the US would not be hampered by an over-focus on global regulation and safety.

    “We need international regulatory regimes that foster the creation of AI technology rather than strangle it. In particular, our friends in Europe should look to this new frontier with optimism rather than fear,” he said.

    China was also challenged. With the country’s vice-premier, Zhang Guoqing, in attendance, Vance warned his peers against working with “authoritarian” regimes, in a clear reference to Beijing.

    “Some of us in this room have learned from experience that partnering with them means chaining your nation to an authoritarian master that seeks to infiltrate, dig into, and seize your information infrastructure,” he said.

    Coming a few weeks after China’s DeepSeek rattled US investors with a powerful new model, Vance’s speech made clear that America is determined to remain the global leader in AI.


    2. Going it alone

    Naturally, in light of Vance’s exceptionalism, the US refused to sign the diplomatic declaration on “inclusive and sustainable” AI released at the end of the summit. However, the UK, a major player in AI development, also declined to sign, saying the document did not go far enough in addressing AI’s global governance and national security implications.

    Meaningful global governance of AI now looks an even more distant prospect, given that consensus could not be reached on a seemingly uncontroversial document. The first summit, held at Bletchley Park in the UK in 2023, at least produced a voluntary agreement between major countries and tech companies on AI testing.

    The agreements carefully negotiated at Bletchley and, a year later, in Seoul were not repeated; it was already clear by opening night that this would not happen at the third gathering. In his welcoming speech, Macron threw shade at Donald Trump’s focus on fossil fuels, urging investors and tech companies to view France and Europe as AI hubs.

    Addressing the enormous energy consumption AI requires, Macron said France stands out because of its reliance on nuclear power.

    “I have a good friend on the other side of the ocean who says, ‘Drill, baby, drill.’ There is no need to drill here. Plug, baby, plug. Electricity is available,” he said. The exchange captured the divergent national outlooks and competitive dynamics on display at the summit.

    Nevertheless, Henry de Zoete, a former AI adviser to Rishi Sunak in Downing Street, said the UK had “played a blinder”. “By not signing the statement it has built up significant goodwill with the Trump administration at almost no cost,” he wrote on X.


    3. Playing it safe?

    Safety, which topped the agenda at the UK summit, was not at the forefront in Paris, despite continued concerns.

    Yoshua Bengio, a world-renowned computer scientist and chair of a major safety report released before the summit, told the Guardian in Paris that the world was not grappling with the implications of highly intelligent AI.

    “We have a mental block to the idea that there are machines that are smarter than us,” he said.

    Demis Hassabis, head of Google’s AI unit, called for unity in dealing with AI after the failure to reach agreement on the declaration.

    “It’s very important that the international community continues to come together and discuss the future of AI. We all need to be on the same page about the future we are trying to create.”

    Pointing to potentially worrying scenarios, such as powerful AI systems slipping beyond human oversight, he added that these are global concerns requiring focused international cooperation.

    Safety aside, some key topics did get a prominent hearing at the summit. Macron’s AI envoy, Anne Bouverot, said that AI’s current environmental trajectory is “unsustainable”, and Christy Hoffman, general secretary of the UNI Global Union, said that AI’s drive for productivity improvements at the expense of workers’ welfare could make it an “engine of inequality”.


    4. Progress is accelerating

    There were many mentions of the pace of change. Hassabis said in Paris that artificial general intelligence, the term for AI systems that match or exceed humans at any intellectual task, is “probably five years or so away”.

    Dario Amodei, CEO of the US AI company Anthropic, said that by 2026 or 2027, AI systems will be like a new participant in world affairs, resembling “a whole new nation inhabited by highly intelligent people who appear on the global stage”.

    Encouraging governments to do more to measure the economic impact of AI, Amodei warned that advanced AI could represent “the greatest change to the global labour market in human history”.

    Sam Altman, CEO of ChatGPT developer OpenAI, flagged deep research, the startup’s latest release, launched at the beginning of the month. It is an AI agent, a term for a system that performs tasks on a user’s behalf, and is powered by a version of o3, OpenAI’s latest cutting-edge model.

    Speaking at a fringe event, he said deep research could already perform “a low percentage of all tasks in the world’s economy at the moment… this is a crazy statement”.


    5. China offers help

    There was no shortage of discussion of DeepSeek’s achievements, but its founder, Liang Wenfeng, did not attend the Paris summit. Hassabis said DeepSeek’s model was “probably the best work I’ve seen come out of China”. However, he added, “there were no actual new scientific advances”.

    Zhang said China is willing to work with other countries to safeguard security, share AI achievements, and build a “community with a shared future for humanity”. Zhipu, a Chinese AI company present in Paris, predicted AI systems that will achieve “consciousness” by 2030, adding to the claims at the conference that highly capable AI is turning a corner.


    6. Musk’s shadow

    The world’s wealthiest person, despite not attending, still managed to influence events in Paris. A consortium led by Elon Musk launched a bid of nearly $100 billion for the nonprofit that controls OpenAI, prompting a flood of questions for Altman, who is seeking to convert the startup into a for-profit company.

    Altman told reporters “the company is not for sale,” and repeated his tongue-in-cheek counteroffer: “I’d be happy to buy Twitter.”

    Asked about the future of OpenAI’s nonprofit arm, which is to be spun out as part of the overhaul while retaining a stake in the for-profit unit, Altman said the company is “completely focused on making sure we preserve it”.

    In an interview with Bloomberg, Altman said Musk’s bid was probably an attempt to “slow us down”. He added: “Probably his whole life is from a position of insecurity. I feel for the guy.”

Source: www.theguardian.com

How one artist’s vision of Mario Jump made him a key figure in Nintendo’s story | Games

In 1889, craftsman Fusajiro Yamauchi founded a hanafuda playing-card company in Kyoto, naming it “Nintendo.” Although the exact meaning has been lost over time, historians believe it translates to “leave luck to heaven.” Nintendo successfully transitioned from paper games to electronic games in the 1970s, establishing itself as a household name worldwide.

Working at Nintendo was a dream come true for Takaya Imamura, an art school student enamored with games like Metroid and Super Mario Bros. 3 in the 1980s. Despite initial misconceptions about the industry, Imamura discovered the creative opportunities at Nintendo and joined the team in 1989. Over the years, he contributed to iconic projects and characters, solidifying his place in gaming history.

Imamura’s journey at Nintendo was marked by memorable collaborations with Shigeru Miyamoto, leading to the creation of beloved games and characters. From F-Zero to Zelda and Star Fox, Imamura’s artistic vision helped shape Nintendo’s unique design philosophy. His work reflected a blend of traditional techniques with innovative storytelling, resonating with audiences worldwide.

As Nintendo evolved under new leadership, Imamura witnessed the company’s strategic shifts and successful product launches. Reflecting on his time at Nintendo, Imamura embraces the transformative era of gaming and technological advancements. His departure from Nintendo in 2021 marked a new chapter in his career as an indie developer, with a passion project inspired by his earliest days in the industry.

Embracing the spirit of chance and creativity, Imamura’s journey comes full circle with his indie game, Omega Six. Honoring Nintendo’s legacy of dedication and innovation, Imamura continues to explore new frontiers in game development, guided by his enduring vision and passion for storytelling.

Source: www.theguardian.com

The key to a flawless morning routine may surprise you

Humans tend to mimic those they see as more successful to achieve a similar status, especially those with wealth, fame, and power. Many CEOs, celebrities, and fitness influencers share their routines claiming they lead to maximum productivity and continued success. These routines, like Mark Wahlberg’s early workouts and Grimes’ health routine, may seem extreme and are often at odds with the average person’s lifestyle and science.

Why do successful individuals promote such challenging routines? It may be to gain a sense of control in their unpredictable lives. Factors like the stock market, economic changes, and social media algorithms can create feelings of anxiety and instability, leading to the adoption of strict routines.

Successful people may adopt harsh routines to feel in control of their destiny, even if these habits seem illogical. Money and status may play a role in reinforcing extreme behaviors, as individuals strive to maintain their high status among their successful peers.

Successful people may leverage extreme routines to assert superiority and differentiate themselves from others. However, these routines may polarize groups and push individuals to more extreme behaviors over time.

While extreme routines may be associated with success, luck also plays a significant role in achieving success. Many highly successful individuals attribute their success to luck, which can create cognitive dissonance for those who prefer to believe in a fair and controlled world.

Ultimately, success is a complex interplay of various factors, including luck, discipline, hard work, and social advantages. Embracing the role of luck in success can help reconcile conflicting beliefs about the nature of success.

Source: www.sciencefocus.com

Staple plant foods high in starch were a key component in the human diet nearly 800,000 years ago

Archaeologists say they have extracted starch granules from stone tools found at an early Middle Pleistocene site in Israel. The granules come from acorns, grass grains, water chestnuts, yellow waterlily rhizomes, and legume seeds.

Examples of plant parts recovered from the percussive tools of Gesher Benot Yaakov, including whole plants, edible parts, and characteristic starch granules. From left to right: oak, yellow water lily, oat. Scale bar – 20 μm. Image credit: Hadar Ahituv and Yoel Melamed.

The 780,000-year-old basalt tools were discovered at the early Middle Pleistocene site of Gesher Benot Yaakov, located on the shores of ancient Lake Hula.

They were examined by a team of researchers led by Dr. Hadar Ahituv of Bar-Ilan University.

“Our study contradicts the prevailing theory that ancient humans' diets were primarily based on animal protein, as suggested by the popular 'Paleo' diet,” the scientists said.

“Many of these diets are based on interpretations of animal bones found at archaeological sites, and very little plant-based food has been preserved.”

“However, the discovery of starch granules on ancient tools provides new insight into the central role of plants, especially the carbohydrate-rich starchy tubers, nuts and roots essential to meeting the energy needs of the human brain.”

“Our research also focuses on the sophisticated methods that early humans used to process plant materials.”

The authors recorded more than 650 starch granules on basalt hammerstones and anvils, tools used to crack and crush plant foods at Gesher Benot Yaakov.

These tools, the earliest known evidence of humans processing plant foods, were used to process a variety of plants, including acorns, grains, legumes, and aquatic plants such as the yellow water lily and the now-extinct water chestnut.

They also identified microscopic debris such as pollen grains, rodent hair, and feathers, supporting the reliability of the starch findings.

“This discovery highlights the importance of plant foods in the evolution of our ancestors,” Dr. Ahituv said.

“We now know that early humans collected a wide variety of plants throughout the year and processed them using tools made of basalt.”

“This discovery opens a new chapter in the study of the deep relationship between early human diets and plant-based foods.”

The findings also provide insight into hominin social and cognitive behavior.

“The use of tools to process plants suggests a high degree of cooperation and social structure, as hominins operated as part of a larger social group,” the researchers said.

“Their ability to exploit diverse resources from both aquatic and terrestrial environments demonstrates a deep knowledge of their surroundings, similar to that of modern humans.”

“This discovery is an important milestone in the field of prehistoric research, providing valuable evidence about the diet of our ancient ancestors and providing new perspectives on human evolution and the development of complex societies.”

A paper on the research was published this week in the Proceedings of the National Academy of Sciences.

_____

Hadar Ahituv et al. 2025. Starch-rich plant foods 780,000 years ago: Evidence from Acheulian percussive stone tools. PNAS 122 (3): e2418661121; doi: 10.1073/pnas.2418661121

Source: www.sci.news

Unlocking the potential of your brain community could hold the key to anti-aging. Here’s why.

Good neighborhoods are defined by the people who reside there. The presence of a helpful individual can enhance the community, while a negative neighbor can detract from its overall quality. The same concept applies to the brain, as revealed in a recent study indicating that brain cells behave like communities. Some cells contribute to a nurturing environment, promoting health and resilience in adjacent cells, while others spread stress and damage like bad neighbors.

Throughout one’s life, the composition of this brain community influences the aging process. Negative relationships can accelerate aging and lead to issues such as memory loss, while a healthy brain community can work collectively to combat aging. Researchers at Stanford University believe that these findings could potentially inform the development of treatments to slow or reverse aging.

Published in the journal Nature, the study identified 17 cell types that influence aging positively or negatively. Notably, T cells and neural stem cells were highlighted for their significant impact as bad and good neighbors, respectively. T cells, typically involved in fighting infections, can contribute to inflammation in the brain and hasten aging, while neural stem cells play a vital role in rejuvenation and maintaining a youthful brain.

The researchers conducted gene activity mapping across 2.3 million cells in the mouse brain, constructing a “spatial aging clock” to predict the biological age of individual cells. This innovative approach could lead to new biological discoveries and interventions, such as inhibiting pro-aging factors released by T cells or enhancing the efficacy of neural stem cells.
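At its core, a "spatial aging clock" is a supervised model that maps a cell's gene-expression profile to an age estimate. The sketch below illustrates only that core idea, with synthetic data and plain ridge regression; the study's actual features, model, and spatial neighbourhood information are not reproduced here.

```python
import numpy as np

# Toy sketch of an "aging clock": regress a cell's age on its gene-expression
# profile. All data here are synthetic; the real study mapped 2.3 million
# spatially resolved mouse brain cells with a far richer model.
rng = np.random.default_rng(0)
n_cells, n_genes = 500, 50

true_w = rng.normal(size=n_genes)                       # hidden age-related gene loadings
X = rng.normal(size=(n_cells, n_genes))                 # expression matrix (cells x genes)
age = X @ true_w + rng.normal(scale=0.1, size=n_cells)  # noisy age labels

# Ridge regression, solved in closed form: w = (X^T X + lam*I)^(-1) X^T y
lam = 1.0
w = np.linalg.solve(X.T @ X + lam * np.eye(n_genes), X.T @ age)

pred = X @ w                                            # predicted "biological age" per cell
r = np.corrcoef(pred, age)[0, 1]
print(f"correlation between predicted and labelled age: {r:.3f}")
```

A real clock would be validated on held-out cells; cells whose predicted age runs ahead of or behind their chronological age are then candidates for pro-aging or pro-rejuvenating neighbourhood effects.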

These findings have implications for understanding diseases like Alzheimer’s and potential strategies to strengthen the brain’s natural repair mechanisms and prevent cognitive decline. The research offers hope for uncovering ways to support brain health and combat aging-related challenges.

Source: www.sciencefocus.com

Scientists may have uncovered the key to solving a significant weight loss mystery

When it comes to weight loss, one universal truth stands out: losing body fat is challenging, and keeping it off can be even more difficult. A recent study may shed some light on why this is the case: adipose tissue, or body fat, retains a sort of “memory” even after cells have become obese.

“This discovery potentially helps explain the changes that occur in adipose tissue during weight fluctuations,” explained Dr. Ferdinand von Meyenn, an assistant professor in ETH Zurich’s Department of Health Sciences and Technology, in an interview with BBC Science Focus.

Dr. von Meyenn and his team tracked transcriptional changes (shifts in which genes are switched on and off) in individuals’ adipose tissue before and after a 25 percent reduction in BMI. “We found that even after weight loss, the genetic regulation in adipose tissue did not fully return to normal, indicating that the body is primed to regain lost weight,” he added.

While this news may be disheartening for those on a weight loss journey, Dr. von Meyenn hopes that this study will help destigmatize weight fluctuations. “There is a molecular mechanism at play that influences weight regain, and it’s not simply a matter of willpower,” he emphasized.

He also stressed the importance of prevention in addressing the global obesity epidemic. “Early intervention is key, as it is much harder to lose weight once it has been gained. Implementing healthier lifestyle choices at a societal level is crucial in combating this issue,” Dr. von Meyenn noted.

About our experts

Dr. Ferdinand von Meyenn researches obesity and metabolic diseases at the Nutritional and Metabolic Epigenetics Laboratory at ETH Zurich.

Source: www.sciencefocus.com

New findings suggest that the key to stress tolerance lies in the microbiome

Recent studies have revealed the significant role of the gut microbiome, a vast community of microorganisms residing in the digestive tract, in influencing the body’s response to stress.

A new investigation published in Cell Metabolism proposes that gut microbes greatly impact the body’s circadian rhythm, particularly in managing stress levels throughout the day.

The research indicates that the activity and composition of gut microbes naturally vary, affecting the release of stress-regulating hormones like adrenaline and cortisol.

This breakthrough has sparked hopes among researchers of utilizing microbes as potential remedies for mental health conditions. According to Professor Paul Ross, Director of APC Microbiome Ireland, this study represents a significant advancement in comprehending the microbiome’s impact on mental well-being.

A disturbance in the microbiome balance can disrupt the body’s circadian rhythm, leading to sleep disturbances, immune system issues, and metabolic changes, affecting stress hormone release.

One particular microorganism, Lactobacillus, is believed to play a crucial role in regulating stress hormones.

The study’s lead author, Dr. Gabriel Tofani, emphasized the gut microbiota’s role in sustaining the body’s natural stress regulation processes.

To demonstrate the connection, researchers administered antibiotics to mice to reduce their microbiome, observing alterations in the release rhythm of the stress hormone corticosterone.

This research lays the groundwork for potential treatments targeting mental health conditions by understanding the intricate relationship between the gut and the brain and its impact on the body’s stress response.

Professor Ross highlighted the potential of microbiome-based interventions in enhancing mental health, noting that this study brings us closer to achieving that objective.

About the Experts:

Dr. Gabriel Tofani: A researcher at University College Cork in Ireland, focusing on circadian rhythms, stress, and gut microbiota.

Professor Paul Ross: Director of APC Microbiome Ireland, conducting research on the human microbiome, bacterial competition, physiology, and genetics.

Source: www.sciencefocus.com

Mastering the Art of Patience: A Game Where Waiting is Key

Patience may not always be easy to practice, especially during mundane and tedious moments. However, there can still be joy and peace found in the simplicity of everyday life. Optillusion introduces a tongue-in-cheek patience simulator called While Waiting to capture this unique experience. Producer Dong Zhou explains, “While we’re waiting for things like buses, stuck in traffic, or standing in line, we often seek entertainment. Most people just resort to using their phones, but is that truly engaging? It’s time to turn waiting into a playful experience by turning mundane moments into a fun game where players can find ways to pass the time.”

While Waiting. Photo: Optillusion Games

In While Waiting, players join Adam on his journey through mundane tasks like waiting for a bus, standing in line for a ride, or watching the rain from a window. Zhou states, “Waiting isn’t just a negative experience; it’s a part of life that comes with its own set of expectations and anxieties.” As Adam’s experiences evolve from simple pleasures to deep aspirations, the game becomes a story of personal growth. “In different waiting scenarios, Adam feels a range of emotions like happiness, relief, or sadness. However, he understands that waiting is the only option,” Zhou adds.

Through whimsical depictions of scenarios like elevator lobbies, doctor’s offices, and airport baggage claims, While Waiting presents a series of patience-testing challenges that resonate with common frustrations. While a sense of fatalism looms, the game incorporates profound reflections on life alongside playful anime humor. Zhou hopes players will not only find amusement but also ponder the deeper meanings interwoven within the game.

To ease the restlessness that waiting brings, While Waiting offers various mini-games to help pass the time, such as luggage stacking or filling out paperwork. Zhou explains, “These mini-games can range from arcade games to puzzles or action games, each level offering a unique experience. While players won’t win cash prizes, the games are designed to keep them entertained while waiting for time to pass. Whether you choose to act or not, the game’s theme revolves around the inevitability of waiting.”

Drawing inspiration from classic animated comedies like “Tom and Jerry,” While Waiting incorporates orchestral music that emphasizes the contemplative and whimsical aspects of this patient journey. The brass and string instrumentation offers a musical reprieve from the discomfort of inaction in daily life.

Despite its quirky and light-hearted nature, While Waiting delves into profound themes. As players approach the conclusion, they revisit earlier scenes and contemplate the cyclical nature of life with fresh insights and emotions. Zhou concludes, “Life is a mix of joy and sorrow, and I hope players will appreciate the value of each waiting moment they encounter.”

While Waiting is set to launch on PC later this year

Source: www.theguardian.com

Key gap in pterosaur evolution filled by fossil dating back 150 million years

A new genus and species of monofenestratan pterosaur, named Propterodactylus frankellae, documents the transition from the older rhamphorhynchoid pterosaurs to the pterodactyloids.

The holotype of Propterodactylus frankellae. Image credit: Frederik Spindler, doi: 10.26879/1366.

Propterodactylus frankellae lived about 150 million years ago, during the Kimmeridgian stage of the Late Jurassic period.

This flying reptile had a moderately long skull, about 9 centimetres (3.5 inches) long, and an estimated wingspan of about 55 centimetres (21.7 inches).

This species also had a very short tail and a small but functional fifth toe with two phalanges.

Propterodactylus frankellae is a monofenestratan; Monofenestrata is a large group of pterosaurs that includes the family Wukongopteridae and the suborder Pterodactyloidea.

“As the earliest lineage of actively flying vertebrates, pterosaurs were evolutionarily highly successful throughout the Mesozoic Era,” Dr. Frederik Spindler of the Dinosaurier Museum Altmühltal writes in the new paper.

“For most of the long history of research, every specimen could be classified as belonging to one of two major types: the more ancestral long-tailed Rhamphorhynchoidea and the derived short-tailed Pterodactyloidea.”

“The rare anurognathids, the only short-faced pterosaurs, have similarly short tails but otherwise look like rhamphorhynchoids and are therefore generally thought to be deeply nested within them.”

“True intermediate, and therefore plausibly transitional, forms between the major types were unknown until the discovery of the wukongopterids.”

The fossil, known as the Painten pro-pterosaur, was discovered in the Rigol limestone quarry near Painten in Bavaria, Germany.

The specimen consists of a complete, fully articulated skeleton with soft tissue preserved on the torso and wings.

Propterodactylus frankellae is contemporaneous with the oldest Archaeopteryx, “which came from a nearby basin,” the paleontologist notes in the paper.

According to Dr Spindler, the discovery fills one of the largest knowledge gaps in the evolution of pterosaur morphology.

Propterodactylus frankellae “is a near-perfect mix of rhamphorhynchoid, wukongopterid and derived pterodactyloid features,” he said.

“Similarities with the derived Pterodactyloidea include the shape of the skull and the short tail.”

“Ancestral traits shared with the Wukongopteridae, for example the functional fifth toe, are among the most distinctive features of this animal.”

“Intermediate conditions apply for the elongation of the neck, the elongation of the metacarpals, and the shortening of the fifth toe.”

The paper was published online in the journal Palaeontologia Electronica.

_____

Frederik Spindler. 2024. An articulated pterosaur from the Late Jurassic of Germany. Palaeontologia Electronica 27(2): a35; doi: 10.26879/1366

Source: www.sci.news

Curious about the effects of AI on government and politics? Bots hold the key

How will AI affect jobs? After “Will AI destroy humanity?”, it is the most important question about the technology, and it remains extremely difficult to pin down, even as the frontier moves from science fiction to reality.

At one extreme there is the somewhat optimistic assertion that new technologies will simply create new jobs. At the other extreme there are fears that companies will replace their entire workforce with AI tools. The debate is often about the speed of the transition rather than the end state. A cataclysmic change that is completed in a few years is devastating to those caught in the middle, whereas a cataclysmic change that takes 20 years may be survivable.

Even the parallels with the past are not as clear-cut as we would like: the internal combustion engine eventually put an end to horse labor, but the steam engine, by contrast, markedly increased the number of draft animals employed in the UK. Why? The arrival of the railways increased freight traffic in the country, but rail could not complete deliveries from warehouse to doorstep. Horses were needed to do the things that steam engines could not.

Until it isn’t.

Steam power and the internal combustion engine are examples of general-purpose technologies, breakthrough technologies that revolutionize the entire structure of society. There are not many such technologies, even if you count from writing, or even before that, from fire itself. It is pure coincidence that “general-purpose technology” shares its initials with “generative pre-trained transformer”, which is why GPTs may turn out to be GPTs.

That’s not a job, idiot

Humans are not horses, and AI tools are not humans.

Humans are not horses [citation needed], and it seems hard to believe that AI technology will be able to do everything humans can do. This is an inconveniently circular argument, but an important one: horses still race, because if you replace the horses with cars it is no longer a horse race [citation needed]. People will still provide the services other people want from a human, for one reason or another, and as culture warps around the rise of AI, some of those services may surprise you. For example, AI in healthcare is underrated because, for some people, the “human touch” is the problem: the doctor you worry is judging your drinking, or the therapist who lies to you because they want you to like them.

As a result, many people like to think in terms of “tasks” rather than jobs: take a job, define it in terms of the tasks it contains, and ask whether an AI can do them. In doing so, we can identify some jobs that are at risk of being completely cannibalized, some jobs that are perfectly safe, and a large intermediate group of jobs that will be “impacted” by AI.

It’s worth pointing out an obvious fact: this approach mechanically produces a higher number of jobs that are “influenced” and a lower number of jobs that are “destroyed”. (Even the jobs most influenced by AI are likely to have some tasks that the AI finds difficult.) That may be why the technique was pioneered by OpenAI, whose researchers wrote in a 2023 paper: “80% of workers are in occupations where at least 10% of the tasks could be affected by LLMs, and 19% of workers are in occupations where more than half of the tasks could be affected.”
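The task-based accounting behind figures like these is simple arithmetic: score each occupation by the fraction of its tasks that are exposed, then weight by headcount. A toy sketch, with all occupations, task flags, and worker counts invented for illustration:

```python
# Toy version of task-based exposure accounting: flag each occupation's tasks
# as exposed or not, then ask what share of workers sit in occupations where
# at least 10% (or 50%) of tasks are exposed. All numbers below are invented.
occupations = {
    # name: (workers, [task_is_exposed flags])
    "legal secretary": (100, [True, True, True, False]),    # 75% of tasks exposed
    "journalist":      (50,  [True, True, False, False]),   # 50% exposed
    "plumber":         (200, [False, False, False, True]),  # 25% exposed
    "dancer":          (25,  [False, False, False, False]), # 0% exposed
}

def share_of_workers(threshold):
    """Fraction of all workers in occupations with >= threshold of tasks exposed."""
    total = sum(n for n, _ in occupations.values())
    hit = sum(n for n, tasks in occupations.values()
              if sum(tasks) / len(tasks) >= threshold)
    return hit / total

print(f"workers with >=10% of tasks exposed: {share_of_workers(0.10):.0%}")
print(f"workers with >=50% of tasks exposed: {share_of_workers(0.50):.0%}")
```

Note how the 10% threshold sweeps in almost every worker while the 50% threshold does not, which is exactly the "influenced" versus "destroyed" asymmetry described above.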

The report claimed between 15 and 86 professions were “completely at risk”, including mathematicians, legal secretaries and journalists.

I’m still here. But a year on, the idea is trending again, thanks to a paper from the Tony Blair Institute (TBI). The giant think tank, powerful and influential even before Labour’s landslide victory two weeks ago, is now seen as one of the architects of Starmerite thought. And it believes the public sector is ripe for disruption through AI. According to the TBI paper, The potential impact of AI on the public sector workforce (pdf):

More than 40% of the tasks performed by public sector workers could potentially be partially automated through a combination of AI-based software, such as machine learning models and large-scale language models, and AI-enabled hardware, ranging from AI-enabled sensors to advanced robotics.

Governments will need to invest in AI technology, upgrade data systems, train employees to use the new tools and cover the redundancy costs of early retirement, costs that are estimated to amount to £4 billion under an ambitious implementation plan. That averages £1 billion a year over the term of this parliament.

Over the past few weeks TechScape has been keeping a close eye on the new government’s approach to AI. Tomorrow, the King’s Speech is expected to announce the AI Bill, and we will hear more. The TBI paper makes one takeaway worth watching: will investment in transformation approach £4 billion a year? There is a lot that can be done for free, but much more could be done with more money. The institute estimates that the spending would pay for itself more than nine times over, but a £20 billion bill would be hard to get through Parliament without question.

AI Geek

Former prime minister Tony Blair speaking at the Tony Blair Institute’s Britain’s Future conference on 9 July. Photo: Yui Mok/PA

The report drew renewed attention over the weekend as critics took issue with its methodology. From 404 Media:

The problem with this prediction is that…

Breaking down work into tasks is already done by a huge database created by the US Department of Labor. But with 20,000 such tasks, describing which ones should be exposed to AI is a daunting task. In a similar paper from OpenAI, “the authors personally labeled a large sample of tasks and DWAs, and hired experienced human annotators who reviewed the output of GPT-3, GPT-3.5, and GPT-4 as part of OpenAI’s tuning efforts,” but they also had the then-new GPT-4 perform the same tasks and found a 60-80 percent match between robots and humans.

Source: www.theguardian.com

Unveiling the key components that influence your microbiome and well-being

The rats in John Cryan's lab were withdrawn and anxious, behaving much like a person who has been bullied at work and fears running into the bully again.

Believe it or not, the good news is that they fed some of these rodents a slurry of microbes extracted from their own feces. This may sound unpleasant, but it had a surprisingly positive effect on their behavior. “That was surprising,” says Cryan, a neurobiologist at University College Cork in Ireland. “We found that the behavioral changes that were induced by stress were normalized, and they started to behave like normal animals.”

Even more surprising, the mental changes weren't brought about by changes to gut bacteria, but by modifying another key aspect of the microbiome whose importance is only now being recognized: viruses.

It turns out our bodies are full of these viruses – trillions of stowaways that do no harm to our health, but instead play a key role in nurturing a beneficial microbiome and making us healthier. Recent studies have found that the influence of this “virome” can be found throughout the body, from the blood to the brain. The hope is that tweaking it might lead to new ways of treating a variety of ailments, from inflammatory bowel disease and obesity to anxiety.

Microbiome Diversity

Over the past decade, there has been a surge in interest in the microbiome (all the tiny organisms that live on and in our bodies), but that interest has focused primarily on bacteria. Until recently, the assumptions were that…

Source: www.newscientist.com

Astrobiologists pinpoint five greenhouse gases that could reveal terraformed exoplanets

The five man-made greenhouse gases identified by astrobiologist Edward Schwieterman of the University of California, Riverside, and his colleagues could be detected in relatively low concentrations in exoplanet atmospheres using the NASA/ESA/CSA James Webb Space Telescope and future space telescopes.

Diagram of the technological features of various planets, including artificial atmospheric gases. Image courtesy of Sohail Wasif / University of California, Riverside.

“For us, these gases are bad because we don’t want them to accelerate warming,” Dr Schwieterman said.

“But they could be useful to a civilization wanting to halt an impending ice age, or to terraform uninhabitable planets in its own system, as humanity has proposed for Mars.”

“These gases are not known to occur in large quantities in nature, so they have to be manufactured.”

“Finding them would therefore be evidence of the presence of intelligent, technological life forms. Such evidence is called a technosignature.”

The five gases proposed by the authors are used on Earth for industrial purposes, such as making computer chips.

These include fluorinated versions of methane, ethane and propane, as well as gases made of nitrogen and fluorine, or sulfur and fluorine.

One advantage of these gases is that they are extremely effective greenhouse gases: sulfur hexafluoride, for example, has 23,500 times the warming power of carbon dioxide, and even a relatively small amount could heat a frozen planet to the point where liquid water could remain on the surface.
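The standard way to make that comparison concrete is CO2 equivalence: multiply the amount of a gas by its warming power relative to carbon dioxide. A minimal sketch using the 23,500x figure quoted above (the tonnage is invented):

```python
# Global warming potential (GWP) converts a mass of gas into the mass of CO2
# with the same warming effect. SF6's ~23,500x figure is from the article;
# the 10-tonne quantity below is an arbitrary example.
GWP_SF6 = 23_500

def co2_equivalent(tonnes_gas, gwp):
    """Tonnes of CO2 with the same warming effect as `tonnes_gas` of the gas."""
    return tonnes_gas * gwp

tonnes_sf6 = 10.0
print(f"{tonnes_sf6} t of SF6 warms like {co2_equivalent(tonnes_sf6, GWP_SF6):,.0f} t of CO2")
```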

Another advantage of the proposed gases, at least from an alien perspective, is that they are extremely long-lived, surviving in an Earth-like atmosphere for up to 50,000 years.

“You won’t need to refill it very often to maintain a comfortable climate,” Dr. Schwieterman said.

Others have suggested that refrigerant chemicals such as CFCs are technosignature gases because they are almost entirely man-made and visible in Earth’s atmosphere.

But unlike the chemically inert fully fluorinated gases discussed in the new paper, CFCs damage the ozone layer and may not be advantageous.

“If other civilizations had oxygen-rich atmospheres, they would have also had an ozone layer that they wanted to protect,” Dr Schwieterman said.

“CFCs will be broken down in the ozone layer while also catalyzing its destruction.”

“CFCs degrade easily and have a short lifespan, making them difficult to detect.”

Finally, for fluorinated gases to have an effect on climate, they need to absorb infrared radiation.

This absorption creates an infrared signature that can be detected by space telescopes.

Using current and planned technology, scientists may be able to detect these chemicals in nearby exoplanetary systems.

“In an Earth-like atmosphere, these gases would be detectable even if only one in a million molecules were one of them, and that concentration could be enough to alter the climate,” Dr Schwieterman said.

To reach this calculation, the astrobiologists simulated a planet in the TRAPPIST-1 system, located about 40 light-years from Earth.

They chose this system because it contains at least seven rocky planets and is one of the best-studied planetary systems other than Earth.

Although it is not possible to quantify the likelihood of discovering man-made greenhouse gases in the near future, the researchers are confident that, if they exist, they could be detected during missions already planned to characterize planetary atmospheres.

“If telescopes are already characterizing planets for other reasons, there would be no need for extra effort to look for these technosignatures,” Dr Schwieterman said.

“And when you find them, it’s amazing.”

The team’s work was published in the Astrophysical Journal.

_____

Edward W. Schwieterman et al. 2024. Artificial Greenhouse Gases as Exoplanet Technosignatures. ApJ 969, 20; doi: 10.3847/1538-4357/ad4ce8

Source: www.sci.news

Physicists at CERN successfully measure a key parameter of the Standard Model

Physicists from the CMS Collaboration at CERN’s Large Hadron Collider (LHC) have successfully measured the effective leptonic electroweak mixing angle. The results, presented at the annual Rencontres de Moriond conference, constitute the most accurate measurement of this parameter ever made at a hadron collider and are in good agreement with predictions from the Standard Model of particle physics.

Installation of CMS beam pipe. Image credit: CERN/CMS Collaboration.

The Standard Model is the most accurate description of particles and their interactions to date.

Precise measurements of parameters, combined with precise theoretical calculations, provide incredible predictive power that allows us to identify phenomena even before we directly observe them.

In this way, the model has succeeded in constraining the masses of the W and Z particles, the top quark, and recently the Higgs boson.

Once these particles were discovered, these predictions served as a consistency check on the model, allowing physicists to explore the limits of the theory’s validity.

At the same time, precise measurements of the properties of these particles provide a powerful tool for exploring new phenomena beyond the standard model, so-called “new physics.” This is because new phenomena appear as mismatches between different measured and calculated quantities.

The electroweak mixing angle is a key element of these consistency checks. This is a fundamental parameter of the Standard Model and determines how unified electroweak interactions give rise to electromagnetic and weak interactions through a process known as electroweak symmetry breaking.

The mixing angle also mathematically connects the masses of the W and Z bosons, which transmit the weak interaction.

Measurements of the W mass, the Z mass, or the mixing angle therefore provide a good experimental cross-check of the model.

The two most accurate measurements of the weak mixing angle were made by experiments at CERN’s LEP collider and by the SLD experiment at the Stanford Linear Accelerator Center (SLAC).

These values have puzzled physicists for more than a decade because they don’t agree with each other.

The new results are in good agreement with Standard Model predictions and are a step towards resolving the long-standing discrepancy between the LEP and SLD measurements.

“This result shows that precision physics can be performed at a hadron collider,” said Dr. Patricia McBride, spokesperson for the CMS Collaboration.

“The analysis had to deal with the challenging environment of LHC Run 2, with an average of 35 simultaneous proton-proton collisions.”

“This paves the way for even more precise measurements at the High-Luminosity LHC, where the average number of simultaneous proton-proton collisions will be more than five times higher.”

Precise testing of Standard Model parameters is a legacy of electron-positron colliders such as CERN’s LEP, which operated until 2000 in the tunnel that now houses the LHC.

Electron-positron collisions provide a clean environment ideal for such high-precision measurements.

Proton-proton collisions at the LHC are more challenging for this type of research, even though the ATLAS, CMS, and LHCb experiments have already yielded numerous new ultra-high-precision measurements.

This challenge is primarily due to the vast background from physical processes other than the one being studied, and to the fact that protons, unlike electrons, are not elementary particles.

Reaching an accuracy comparable to that of the electron-positron colliders once seemed impossible at the LHC, but the new result achieves it.

The measurement presented by CMS physicists uses a sample of proton-proton collisions collected from 2016 to 2018 at a center-of-mass energy of 13 TeV, with a total integrated luminosity of 137 fb⁻¹, or about 11 billion collisions.
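As a back-of-the-envelope sketch, the number of events for a given process is the integrated luminosity times the cross-section, N = L·σ. The cross-section below is an illustrative assumption for a Drell-Yan dilepton process, not a number quoted by CMS:

```python
def expected_events(lumi_fb: float, sigma_pb: float) -> float:
    """N = L_int * sigma, with 1 fb^-1 = 1000 pb^-1."""
    return lumi_fb * 1000.0 * sigma_pb

# 137 fb^-1 of Run 2 data with an assumed cross-section times
# branching ratio of order 2 nb (2000 pb) per dilepton channel:
n_events = expected_events(137.0, 2000.0)
print(f"{n_events:.2e} dilepton candidates")  # order 10^8
```

The same one-liner explains why hadron-collider precision was long thought out of reach: the signal events above sit on top of vastly more numerous background collisions.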

“The mixing angle is obtained through analysis of the angular distribution in collisions in which pairs of electrons or muons are produced,” the researchers said.

“This is the most accurate measurement ever made at the Hadron Collider and improves on previous measurements by ATLAS, CMS, and LHCb.”

Source: www.sci.news

The Origins of Life: Key Chemical Reactions May Have Begun in Hot, Cracked Rocks

Some amino acids can become concentrated when traveling through cracks in hot rocks.

Sebastian Kauritzky / Alamy

Chemical reactions key to the origin of life on Earth may have occurred as molecules moved along a temperature gradient within a network of cracks in thin rocks deep underground.

Such networks are thought to have been common on early Earth and may have provided a kind of natural laboratory in which many of the building blocks of life were concentrated and separated from other organic molecules.

“It’s very difficult to get a more general environment where you can do these cleansing and intermediate steps,” says Christophe Mast at Ludwig-Maximilians-University in Munich, Germany.

He and his colleagues created a heat flow chamber the size of a playing card to model how mixtures of organic molecules behave in cracks in such rocks.

The researchers heated one side of the 170-micrometer-thick chamber to 40°C (104°F) while keeping the other side at 25°C (77°F). The resulting temperature gradient caused molecules to move in a process called thermophoresis. How sensitive a molecule is to this process depends on its size, its charge, and how it interacts with the fluid in which it is dissolved.

During an 18-hour experiment in the heat flow chamber, the researchers found that different molecules concentrated in different parts of the chamber depending on their sensitivity to thermophoresis. Among these molecules were many amino acids and the nucleobases A, T, G, and C, important building blocks of DNA. The effect was magnified further by creating a network of three interconnected chambers, with one side of the network at 25°C and the other at 40°C: the additional chambers further concentrated the compounds enriched in the first chamber.

Mathematical simulations with 20 interconnected chambers, which may approach the complexity of natural crack systems, found that the enrichment of different molecules can be amplified even further. In one chamber, the amino acid glycine reached a concentration approximately 3000 times higher than that of another amino acid, isoleucine, even though both entered the network at the same concentration.
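The compounding effect of chained chambers can be sketched with a toy model in which each chamber multiplies a molecule's concentration by a molecule-specific enrichment factor. The factors below are invented for illustration, not values from the study, but they show how modest per-chamber differences compound into thousand-fold contrasts over 20 chambers:

```python
def chain_concentration(c0: float, factor: float, n_chambers: int) -> float:
    """Concentration after passing through n identical chambers,
    each enriching the molecule by the same multiplicative factor."""
    return c0 * factor ** n_chambers

# Two molecules entering the network at equal concentration but with
# different (hypothetical) sensitivities to thermophoresis:
glycine = chain_concentration(1.0, 1.8, 20)
isoleucine = chain_concentration(1.0, 1.2, 20)
print(f"relative enrichment after 20 chambers: {glycine / isoleucine:.0f}x")
```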

The researchers also demonstrated that this enrichment can enable reactions that would otherwise be extremely difficult. They showed that glycine molecules can bind to each other when the concentration of trimetaphosphoric acid (TMP), a molecule that catalyzes the reaction, increases. Mast said TMP is an interesting molecule to concentrate because it was rare on the early Earth. “Since [the chambers] are all randomly connected, all kinds of reaction conditions can be realized.”

“It’s very interesting that within the crack there are regions with different proportions of compounds,” says Evan Spruijt at Radboud University in the Netherlands, who was not involved in the study. “This enhancement allows us to create even more versatility from very simple building blocks.”

But enrichment in rock fractures is still far from a viable scenario for the origin of life, he says. “Ultimately, they still need to come together to form something resembling a cell or protocell.”


Source: www.newscientist.com

Breakthrough in microbiome research may hold the key to combating obesity

Recent discoveries by scientists on the human gut microbiome, which consists of microorganisms like bacteria, archaea, fungi, and viruses residing in the gastrointestinal tract, may lead to new weight loss interventions in the future.

In research to be presented at the European Congress on Obesity (ECO), scientists have identified specific microbial species that could either increase or decrease an individual’s risk of obesity.

Through a study involving 361 adult volunteers from Spain, scientists identified a total of six main species.

The lead researcher, Dr. Paula Aranaz, who obtained her PhD from the Nutrition Research Center of the University of Navarra, explained, “Our findings highlight the potential role of imbalances in various bacterial groups in the development and progression of obesity.”




Participants were categorized based on their body mass index: 65 were of normal weight, 110 were overweight, and 186 were obese. Genetic microbiota profiling was conducted to analyze the type, composition, diversity, and abundance of bacteria present in their fecal samples.
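For reference, BMI grouping of this kind can be sketched with the standard WHO cut-offs (the study itself may have used slightly different thresholds):

```python
def bmi_category(weight_kg: float, height_m: float) -> str:
    """Classify by body mass index using standard WHO cut-offs."""
    bmi = weight_kg / height_m ** 2
    if bmi < 18.5:
        return "underweight"
    if bmi < 25.0:
        return "normal weight"
    if bmi < 30.0:
        return "overweight"
    return "obese"

print(bmi_category(70.0, 1.75))  # BMI 22.9 -> normal weight
```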

The study found that individuals with a higher body mass index had lower levels of Christensenella minuta, a bacterium associated with weight loss in other studies.

Interestingly, there were gender-specific differences in the findings. In men, the species Parabacteroides hercogenes and Campylobacter canadensis were linked to higher BMI, fat mass, and waist size. In women, obesity risk was predicted by the species Prevotella copri, Prevotella brevis, and Prevotella saccharolytica.

According to Aranaz, “Fostering certain bacterial types in the gut microbiota, like Christensenella minuta, may protect against obesity. Future interventions aimed at altering bacterial strains or the levels of bioactive molecules could create a microbiome resistant to obesity.”

Because the study focused on a specific region of Spain, factors such as climate, geography, and diet could influence the results. Still, the findings could lead to tailored nutritional strategies for weight loss that take gender differences into account.

About our expert:

Paula Aranaz is a researcher at the Nutrition Research Center of the University of Navarra in Spain, focusing on bioactive compounds to prevent and treat metabolic diseases. Her research has been published in journals such as the International Journal of Molecular Sciences, Nutrients, and the European Journal of Nutrition.


Source: www.sciencefocus.com

Introducing Galaxy Squad: Key Laptop Trends for 2024 including Dynamic Displays and AI Optimization

The promise of remote work in today’s world is enticing – being able to work from anywhere, whether it’s a coffee shop in Manhattan or a beach in Bali, thanks to the power of your laptop. But in reality, our laptops may be outdated, slow, and incapable of keeping up with the demands of modern technology. Now is the perfect time to consider upgrading to the new Samsung Galaxy Book4 series, built to meet the challenges of 2024.

Whether you’re crunching numbers, editing videos, or unwinding with games, the evolving trends in laptop technology are worth noting.

Thinner, faster, quieter – Today’s laptops are impressively thin yet powerful, with the Samsung Galaxy Book4 Pro and Ultra series leading the way with slim profiles and robust performance, including dedicated graphics cards for gaming on the go.

Incredible screen – The Galaxy Book4 Pro and Ultra models boast cutting-edge 16-inch dynamic AMOLED 2X touchscreens that offer vibrant colors, crisp visuals, and adaptive display technology to optimize viewing in any environment.

Flexible form – Modern laptops like the Galaxy Book4 360 series offer convertible designs that allow for seamless transitions from laptop to tablet mode, complete with touch-enabled screens and stylus support for note-taking and sketching.

AI revolution – Intel Core Ultra processors powering the Galaxy Book4 series feature dedicated neural processing units for handling AI workloads efficiently, enabling users to leverage AI-driven features like Microsoft’s Copilot for enhanced productivity.

All about the ecosystem – Today’s laptops are part of a larger digital ecosystem, with seamless integration between devices like the Galaxy Book4 series and Samsung Galaxy smartphones, offering mobile connectivity, data sharing, and enhanced productivity tools for users on the go.

Ready to embark on your digital nomad journey with the Samsung Galaxy Book4 series? samsung.com/uk/galaxy-book

© Intel Corporation. Intel, the Intel logo, and other Intel marks are trademarks of Intel Corporation or its subsidiaries. Other names and brands may be claimed as the property of others.

AI features may require the purchase, subscription, or activation of additional software by the software or platform provider and may have specific configuration or compatibility requirements. See intel.com/performanceindex. Your results may vary.

1 Adobe subscription required.
2 Requires a Galaxy smartphone with One UI 1.0 or later.
3 A Microsoft account is required.
4 A Samsung account is required.
Features available 5 March.

Source: www.theguardian.com

Key takeaways from the initial week of Mike Lynch’s fraud trial in the US | Autonomy

Mike Lynch, known as ‘Britain’s Bill Gates’ and the top technology entrepreneur in Britain, reached the pinnacle of his career when he transformed his software company into an $11bn (£8.6bn) acquisition by a Silicon Valley giant. More than a dozen years later, the acquisition has become the focus of a trial in San Francisco that began last Monday.

Lynch is facing 16 charges of wire fraud, securities fraud, and conspiracy by U.S. authorities, alleging that Hewlett-Packard’s purchase of Autonomy was based on deceitful information. If found guilty, he could be sentenced to up to 25 years in prison. Lynch has pleaded not guilty.


The trial will center on the events of 2011 when HP acquired Autonomy. In the coming weeks, jurors will hear from numerous witnesses in a courtroom directly above the former Autonomy skyscraper site in San Francisco.

Once hailed as “Britain’s Bill Gates,” Lynch spent the first week of his trial quietly listening as federal prosecutors targeted his former empire. He occasionally interacted with his lawyer or worked on his laptop, at times wearing a smile.

1. 2011 Revisited

In 2011, David Cameron was still in office, Barack Obama was president, and movie buffs were enthralled by the final Harry Potter film.

Lynch has consistently claimed that HP mishandled the Autonomy acquisition, leading to its downfall. However, Judge Charles Breyer ruled that the trial’s focus should not include the aftermath of the deal.

Explaining financial transactions and complex arguments from over a decade ago to a new jury presents a significant challenge.

The trial started with the prosecution highlighting a crucial meeting in early 2011 where Lynch allegedly misled HP executives about Autonomy’s success, leading to the $11 billion fraud accusation.

The defense painted Lynch as a tough but brilliant inventor who delegated tasks to talented managers, minimizing his involvement in daily operations.

2. Simplifying the Complex

Government prosecutors accused Lynch of repeatedly lying to investors and auditors, orchestrating a multi-year fraud through deceptive accounting practices.

As the trial progresses, Lynch’s team plans to portray him as a hands-off leader who was unfairly blamed for HP’s struggles and the Autonomy deal.

Source: www.theguardian.com

Safely Viewing the April Solar Eclipse: Tips on Using Eclipse Glasses and Identifying Key Features

Use special eclipse glasses to prevent eye damage

Gino Santa Maria/Shutterstock

Watching a total solar eclipse is an experience you’ll never forget, but without the right precautions it could be memorable for the wrong reasons. Looking directly at the sun is dangerous, so read on to learn how to observe a solar eclipse safely and what to prepare in advance.

On April 8, 2024, a total solar eclipse will be visible to more than 42 million people across North America. The path of totality is only about 185 kilometers wide, touching Mexico, 13 U.S. states, and parts of Canada. Most people in North America will experience the event as a partial solar eclipse rather than a total one.

“For those outside the path of totality, the moon will never completely cover the sun,” says Jeff Todd of Prevent Blindness, a Chicago-based eye care advocacy group. However you view it, eye protection is essential.

“To avoid damaging your eyes, you should wear eclipse glasses throughout the eclipse,” says Todd. Otherwise, you risk burning your retina. This phenomenon, also known as “eclipse blindness,” can occur painlessly and can be permanent. It may take several days after seeing a solar eclipse before you realize something is wrong. Sunglasses do not provide sufficient protection. However, it is perfectly safe to wear eclipse glasses over your prescription glasses.

How to safely view a solar eclipse

The prize for those traveling to the path of totality is seeing the sun’s corona with the naked eye, but it is visible only for a few short minutes during totality. Otherwise, partial phases are visible and must be observed through eclipse glasses. Todd says people in the path of totality should wear eclipse glasses at all times except during totality itself, the brief period of darkness when the sun is completely hidden by the moon. “Only then can you take off your eclipse glasses,” he says.

During totality itself, it is important for those in the path of totality to view the Sun with the naked eye. “You have to look without a protective filter, otherwise you won’t see anything,” says Ralph Chou at the University of Waterloo, Canada.

Solar eclipse 2024

On April 8th, a total solar eclipse will pass over Mexico, the United States, and Canada. Our special series covers everything you need to know, from how and when to see a solar eclipse to the strangest solar eclipse experience of all time.

Just before totality ends, light from the Sun’s photosphere streams between the Moon’s peaks and valleys. Called Baily’s beads, they appear for a few seconds and eventually merge into a flashing “diamond ring” as enough of the sun’s photosphere is exposed for sunlight to return. “It gives us ample warning that it’s time to resume viewing the partial eclipse through protective filters,” Chou said.

Which solar eclipse glasses should I buy?

It is important to wear eclipse glasses that meet the ISO 12312-2 international standard, which applies to products used for direct viewing of the sun. “Look for the ISO standard label and buy your glasses from a trusted source,” says Todd. “Get your glasses early, in time for the eclipse.” Before you buy, make sure the company or brand appears on the American Astronomical Society’s vetted list of suppliers and resellers.

Do not use eclipse glasses with binoculars or telescopes. If you want to use these instruments to observe a solar eclipse, attach a solar filter over the objective lens (the lens opposite the one you look through). Never place solar filters or eclipse glasses between your eye and the eyepiece or binocular eyecup.

Another way to safely view the eclipse is with a pinhole projector. This is a simple device that projects an image of the sun onto paper or cardboard through a small hole. An even easier method is to use a colander or a small hole in a spaghetti spoon. This projects a small crescent sun onto every surface.
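The size of the projected image follows from the Sun's angular diameter of about 0.53 degrees: the image is roughly the hole-to-screen distance divided by 108. A quick sketch (the distance is an arbitrary example):

```python
import math

def pinhole_image_mm(distance_mm: float, sun_deg: float = 0.53) -> float:
    """Diameter of the Sun's projected image for a given
    hole-to-screen distance, from the Sun's angular size."""
    return distance_mm * math.tan(math.radians(sun_deg))

# A screen held 1 metre behind the pinhole:
print(f"{pinhole_image_mm(1000.0):.1f} mm")  # about 9 mm across
```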

topic:

  • solar eclipse
  • solar eclipse 2024

Source: www.newscientist.com

The reasons behind diet failures, as explained by a weight loss surgeon, and the key to successful eating.

Perhaps many of us have attempted to reduce our waistline by watching our calorie intake or hitting the gym, only to find little success. Should we be doing things differently?

According to Dr. Andrew Jenkinson, a consultant bariatric surgeon at University College London Hospital and the author of “Why We Eat (Too Much)” and “How to Eat (and Still Lose Weight)”, losing weight has more to do with eating foods that manage levels of the hormone leptin than with calorie counting or exercise.

So what exactly is leptin, and how does it work? Dr. Jenkinson shared insights in a recent discussion with us about leptin, food consumption, and strategies to reduce appetite.

When it comes to the problem of obesity, Dr. Jenkinson views it as a major health and economic issue that could lead to the collapse of healthcare systems. He highlighted the prevalence of obesity-related conditions such as type 2 diabetes, high blood pressure, sleep apnea, joint problems, and an increased risk for cancer.

Leptin, a hormone secreted by fat cells, regulates body weight by signaling the hypothalamus, the weight control center of the brain. However, certain foods can block leptin signals, such as sugar, refined carbohydrates, and processed foods, which increase insulin levels and block leptin.

Dr. Jenkinson emphasized that the concept of calories alone is not an effective approach to weight loss. Instead of focusing on calorie counting, he suggested avoiding foods that negatively impact insulin, which can shift the weight set point downward without significant effort.

In terms of exercise, Dr. Jenkinson explained that intense exercise can burn calories, but it can also lead to increased hunger and decreased metabolic rate if not balanced with calorie restriction. He recommended a combined approach of calorie restriction and intense exercise to achieve effective weight loss.

This interview with Dr. Andrew Jenkinson has been edited for clarity and length.

Dr. Andrew Jenkinson is a Consultant in Bariatric (Weight Loss) and General Surgery at University College London Hospital and the author of “Why We Eat (Too Much)” and “How to Eat (and Still Lose Weight).”

Source: www.sciencefocus.com

Innovate with Azure: 5 Key Factors to Ensure Your Business’ Cloud Platform is Future-Proof



The world is on the brink of a productivity revolution

The world is on the brink of a productivity revolution, as artificial intelligence (AI) creates a new wave of opportunity for businesses of all sizes. Whether it’s using chatbots or more advanced AI, uncovering deeper insights into customer needs, or speeding up product development, a business that ignores AI misses out on the improved outcomes it can bring, and no company wants that. For some organizations, emerging generative AI tools such as ChatGPT and DALL·E are increasingly making the business case for adopting AI strategies to generate content and images. But while business leaders want to maximize the benefits of the technology, they also need to understand the broader responsibilities that come with it (including considerations around data privacy, unintentional bias, and copyright infringement) and how to meet them as the opportunities rapidly evolve. To help board executives and IT leaders drive success with their AI strategies, Michael Wignall, director of infrastructure for Microsoft’s Azure Business Customer Success unit, recommends five steps leaders should take before leveraging AI.

1. Make AI part of a broader cloud computing strategy

First and foremost, Wignall says companies should consider working with established technology providers. AI works best as part of a broader cloud computing strategy, in which IT operations run on data centers operated by an outside provider such as Microsoft Azure, he says. “AI is born in the cloud. To take advantage of this wave of innovation, you need to be in the cloud,” he adds. He points to three main components of AI: computing power, data, and algorithms, all of which are best provided through cloud services. He believes companies should adopt a “cloud-native” approach in which the entire AI infrastructure is built on a cloud platform. Such an approach offers many benefits:

  • Reduced costs, by paying only for the resources you use rather than maintaining and updating expensive on-premises equipment.
  • Flexibility and scalability: customers can easily add or remove resources as needed.
  • Access to enhanced security tools that better detect, assess, and alert on threats to customer data.
  • Easy backup of cloud data, with quick restoration in the event of a failure or disaster.

2. Find the data

Next, businesses need to have a solid understanding of where their data resides within their organization and move it to cloud platforms. The success of AI depends on analyzing relevant data at scale. To fine-tune AI for best performance, AI should be powered by your company’s own data from customer lists, inventory, sales information, financial data, and other key data. “It’s important to make sure your data platform and data strategy is the best it can be, and that you know where your data is and how to access it,” he said.

3. Protect your data

Once the cloud infrastructure is in place and the associated data has been migrated, the next critical step is to secure that data. With all of a company’s important data in one place (the cloud), it’s important to have peace of mind in the presence of multiple threats, such as hackers. “Make sure you’re protected with best-in-class security features, clearly defined policies and governance around who can access your data, and the ability to audit how your data is handled,” he said.

4. Decide which functions and tasks to use AI for

Once the infrastructure, data, and security are in place, companies can move on to determining the best uses for AI, such as automating office processes, extracting insights from data, and handling copywriting and a variety of other tasks. For the past five years, general AI has provided so-called “cognitive services” such as data analysis and product recommendations. Generative AI takes the technology to a new level: with a few keystrokes, users can create content such as reports, ads, images, copy, automated emails, and personalized user communications. Generative AI can also analyze large volumes of documents, call center logs, and financial results and summarize the information concisely and precisely.

5. Implement a responsible AI policy

Once a company has taken these steps, it is ready to deploy an AI strategy. Before launching, however, companies should ensure they have responsible AI policies in place across the board: businesses need to ensure that AI is free from embedded bias, that there is good governance around its use, that it is used ethically, and that there are no unintended or undesirable consequences. Microsoft provides responsible AI policy guidance, along with tools to check for bias, filter out inappropriate data, and run sentiment checks that scrutinize output. Ultimately, however, it is up to each company to make sure its responsible AI policies are in place. While many organizations are just beginning their AI journey, Wignall summarizes the mindset companies should adopt when considering AI: partnership is key. Cloud is key. Prioritize the business benefits that matter to your organization. And start today.


Source: www.theguardian.com

Bulgarian Yogurt: A Key Factor in Colonizing Mars?

space yogurt

“Could Bulgarian yogurt improve astronauts’ performance during Mars missions?” ask Isabella Shopova, Diana Bogeva, Maria Yotova, and Svetla Danova in a study of that name published in the Journal of Ethnic Foods.

Researchers had seven people make and eat Bulgarian-style yogurt, fermented with Lactobacillus delbrueckii subspecies bulgaricus and Streptococcus thermophilus. At the time, the seven were members of a “team of analog astronauts participating in a two-week analog mission in a closed Mars-like environment at the Mars Desert Research Station in the Utah desert, USA.”

Most of these Earth-bound astronauts were no strangers to yogurt: the study found that “five out of seven crew members had previously consumed yogurt in some form.”

The experiment extends a research tradition: Bulgarian yogurt was ingested during the space flight of the second Bulgarian astronaut, by the crew of a 150-day voyage to Antarctica, by Bulgarian Air Force pilots and similar people, and by 56 volunteers in a “simulated shipwreck situation.”

Reporting ahead of any Mars mission, the scientists declared success: “Bulgarian yogurt has proven to be a valuable food product for colonization of Mars due to its long shelf life and probiotic properties.” This “underlines the versatility of Bulgarian yogurt,” they say. They hope further research will provide insight into changes in gut microbiome diversity and “flatulence frequency.”

in the name of science

Taken together, the scientific names of living organisms are a hodgepodge, reader Richard Wakeford warns Feedback, pointing to an attempt in Proceedings of the Royal Society B to enjoy the diversity.

In their paper, “Zoo naming: Creativity, culture, and influence in the formation of scientific names,” Stephen B. Heard of the University of New Brunswick and Julia J. Mlinarek of the Insectarium de Montréal, in Canada, survey that diversity and lament its difficulties.

Source: www.newscientist.com