Historic High Seas Treaty Now in Effect: First-Ever Protection for Ocean Waters


A groundbreaking treaty aimed at protecting the high seas has officially entered into force, marking a significant moment in marine conservation.

The vast expanses of the high seas, beyond countries’ 370-kilometre exclusive economic zones, are often referred to as the “Wild West” of the oceans. Fishing there is only lightly regulated and the region remains largely unexplored, yet this deep-sea realm holds an estimated 95% of the planet’s habitable space and is home to a huge diversity of marine life.

In September 2025, more than 60 countries had ratified the UN agreement on the conservation and sustainable use of marine biodiversity in the open ocean, which covers around half of our planet’s surface. Reaching that threshold started a 120-day countdown to the treaty’s entry into force.

“This is one of the most important environmental agreements ever,” states Matt Frost from the Plymouth Marine Laboratory in the UK. “There was no established mechanism for creating protected marine areas on the high seas prior to this treaty.”

World-renowned marine biologist Sylvia Earle calls this treaty a ‘turning point’ in safeguarding ‘Earth’s blue heart’, which plays a crucial role in regulating climate and sustaining life.

A year remains before nations can establish protected areas under the treaty, as regulations and monitoring systems need to be finalized at the inaugural meeting of the parties in late 2026.

“This moment demonstrates that global cooperation is feasible,” says Earle. “Now we must act decisively.”

In the Atlantic, conservationists aim to safeguard unique ecosystems such as the floating seaweed mats of the Sargasso Sea, a crucial breeding ground for American and European eels, and the remarkable hydrothermal vent communities of the “Lost City” field on the Mid-Atlantic Ridge. Meanwhile, Pacific conservation efforts target the Salas y Gómez and Nazca ridges, underwater mountain chains that provide habitat for whales, sharks, turtles and many other species.

The treaty also envisions a shared repository for genetic resources discovered in the high seas, which could facilitate breakthroughs in medicinal research.

As maritime technology advances, fleets of factory ships are exploiting the high seas, leading to the overfishing of species and habitat destruction. This escalation threatens crucial biodiversity zones. Bottom trawling, in particular, causes severe damage to the ocean floor. Emerging techniques are being developed to fish in the “twilight zone” of mid-depth waters, between 200 and 1,000 meters, further complicating conservation efforts.

Regional management organisations have noted that for two decades there have been calls for a treaty to curb overfishing in international waters, where recent studies indicate that 56% of targeted fish stocks are overfished.

Support for protective measures stems from evidence that the roughly 90% of marine protected areas in national waters that are actively maintained benefit nearby fish populations by providing safe environments for spawning and growth.

Additionally, the “30 by 30” commitment aims to protect 30% of the planet’s land and ocean by 2030, and meeting that goal will require protecting parts of the high seas.

Oceans currently absorb approximately 90% of the excess heat resulting from climate change. By shielding these critical areas from fishing and associated pollution, marine ecosystems can better adapt to rising temperatures.

“If you’re battling multiple afflictions, alleviating two can empower you to confront the remaining issues,” Frost asserts.

Moreover, marine ecosystems are responsible for absorbing a quarter of the CO2 emissions that contribute to climate change. Coastal environments like seagrass meadows and kelp forests are crucial carbon sinks, and activities such as the nocturnal feeding patterns of mesopelagic fish and plankton play a role in the carbon cycle.

“These species transport carbon from surface waters to deeper ocean layers, significantly influencing the carbon dynamics,” explains Callum Roberts from the Convex Seascape Survey, a global research initiative focusing on the ocean’s impact on climate change.

The treaty’s initial challenge involves identifying appropriate areas for protection, especially as species migrate in response to shifting ocean temperatures. Only 27% of the ocean floor has been thoroughly mapped.

Enforcement will also be a formidable challenge. Current marine protected areas in national waters include a significant number of “paper parks” that offer little actual protection for species.

Advancements in satellite imagery and AI technology have made it feasible to monitor vessels and detect unlawful activities. Nonetheless, enforcement will rely on member states to act against flagrant violations, including barring offending ships from their ports.

While 145 countries have signed the treaty, it is binding only on those that ratify it. So far, 83 nations have ratified it, with the UK, US, Canada, and Australia yet to follow suit.

“The more nations that ratify this treaty, the stronger it becomes,” says Sarah Bedorf from Oceana. “We all share the responsibility of protecting the high seas, which ultimately benefits everyone.”


Source: www.newscientist.com

Do Data Centers’ High Energy Demands Threaten Australia’s Net Zero Goals?

The demand for electricity by data centers in Australia could triple over the next five years, with projections indicating it may surpass the energy consumed by electric vehicles by 2030.

Data centers currently account for about 2% of demand on the national electricity grid, or around 4 terawatt-hours (TWh) a year. The Australian Energy Market Operator (Aemo) expects that share to rise significantly, projecting roughly 25% annual growth to reach 12TWh, or 6% of grid demand, by 2030, and 12% by 2050.
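As a rough, illustrative check of the compound-growth arithmetic behind that projection (assuming simple 25% annual growth from roughly 4TWh today, which is not Aemo's actual model):

```python
# Illustrative only: simple compound growth using the approximate figures quoted above.
current_twh = 4.0      # approximate data-center demand today, in TWh per year
growth_rate = 0.25     # roughly 25% growth per year
years = 5              # roughly 2025 to 2030

projected_twh = current_twh * (1 + growth_rate) ** years
print(f"Projected demand after {years} years: {projected_twh:.1f} TWh")
# Prints about 12.2 TWh, consistent with the ~12TWh (6% of grid demand) figure above.
```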

Aemo anticipates that the rapid expansion of this industry will drive “substantial increases in electricity usage, especially in Sydney and Melbourne.”


In New South Wales and Victoria, where the majority of data centers are situated, they are projected to account for 11% and 8% of electricity demand respectively by 2030, with demand in each state growing accordingly.

Tech companies like OpenAI and SunCable are pushing Australia towards becoming a central hub for data processing and storage. Recently, the Victorian Government announced a $5.5 million investment aimed at establishing the region as Australia’s data center capital.

However, with 260 data centers currently operating across the nation and numerous others in the pipeline, experts express concerns about the implications of unchecked industry growth on energy transition and climate objectives.

Energy Usage Equivalent to 100,000 Households

The continual operation of numerous servers generates substantial heat and requires extensive electricity for both operation and cooling.


Globally, electricity demand from data centers is growing four times faster than overall electricity consumption, according to the International Energy Agency. Both the number and the size of centers are increasing, with very large facilities becoming more common.

As highlighted by the IEA, “AI-centric hyperscale data centers possess a capacity exceeding 100MW and consume energy equivalent to what 100,000 homes use annually.”

Michael Blair, a professor of mechanical engineering at the University of Melbourne and director of the Net Zero Australia project, said electricity and water use are closely linked because servers convert the electrical energy they draw into heat, which must then be removed by cooling.

“In confined spaces with many computers, air conditioning is required to maintain an optimal operating temperature,” he explains.

Typically, digital infrastructure is cooled through air conditioning or water systems.

Ketan Joshi, a climate analyst with the Australia Institute who is based in Oslo, says many tech companies are reporting electricity consumption well above last year’s levels. Their energy intensity has also been rising on several measures, including energy per active user and energy per unit of revenue, compared with five years ago.

“They aren’t consuming more energy simply to serve additional users or earn more revenue,” he says. “The pertinent question is: why is their energy consumption escalating?”

In the absence of concrete data, Joshi suggests that the undeniable growth in demand is likely attributed to the rise of energy-intensive generative AI systems.

“Running Harder to Stay in the Same Place”

Joshi is watching this issue closely because data centers around the world have been shown to place substantial and inflexible demands on power grids, with two significant repercussions: increased dependence on coal and gas generation, and resources diverted away from the energy transition.

While data center companies often assert they run on clean energy thanks to investments in solar and wind, Joshi notes there is often a mismatch between their round-the-clock reliance on the grid and the variable output of the renewable projects they back.

“What’s the ultimate impact on the power grid?” he questions. “Sometimes, we have surplus energy, and other times, there isn’t enough.”


“So, even if everything appears favorable on paper, your data center might be inadvertently propping up fossil fuel generation.”

Moreover, instead of renewable energy sources displacing coal and gas, these sources are accommodating the growing demands of data centers, Joshi notes. “It’s like sprinting on a treadmill—no matter how hard you run, it feels like the speed is continually increasing.”


Electricity demand has surged to the point that once-mothballed nuclear power plants in the US are being restarted and demand for gas turbines is climbing. Some Australian developers are even proposing to install new gas generators to meet their data centers’ energy needs.

Aemo predicts that by 2035 data centers could consume 21.4TWh a year, comparable to adding the demand of four aluminum smelters to the grid.

Blair pointed out that AI adoption is in its infancy and the outlook remains uncertain: Aemo’s 2035 consumption scenarios range from 12TWh to 24TWh, so growth may turn out smaller than the central forecast.

In the National AI Plan released on Tuesday, the federal government recognized the need for advances in new energy and cooling technologies for AI systems. Industry Minister Tim Ayres said principles for data center investment will be established in early 2026, including requirements for supplementary investment in renewable energy generation and water sustainability.

“Undeniable Impact” on Electricity Prices

Dr. Dylan McConnell, an energy systems researcher at the University of New South Wales, noted that while renewable energy is on the rise in Australia, it is not yet progressing rapidly enough to meet required renewable energy and emissions targets. The expansion of data centers will complicate these challenges.

“If demand escalates beyond projections and renewables can’t keep pace, we’ll end up meeting that new demand instead of displacing coal,” he explains.

Unlike electric vehicles, which add demand to the grid while cutting gasoline and diesel use, data centers do not reduce fossil fuel consumption elsewhere in the economy, according to McConnell.

“If this demand materializes, it will severely hamper our emissions targets and complicate our ability to phase out coal in alignment with those targets,” he advises.

In its advice on climate targets, the Climate Change Authority stated: “Data centers will continue to scale up, exerting deeper pressure on local power sources and further hampering renewable energy expansions.”

McConnell said the extra demand will have a significant effect on overall system costs, and therefore on electricity prices.

“To support this load, we will need a larger system that utilizes more costly resources.”

Source: www.theguardian.com

Many Individuals Carrying the High Cholesterol Gene Are Unaware, Study Reveals

Experts caution that you might be unknowingly living with a hereditary condition that causes elevated cholesterol levels, according to new findings. Familial hypercholesterolemia can remain undetected for generations, heightening the risk of heart attack and stroke for those affected.

This condition impacts approximately 1 in 200-250 individuals globally and leads to elevated levels of low-density lipoprotein (LDL) cholesterol from birth. LDL is often referred to as “bad” cholesterol because it contributes to arterial plaque buildup. However, researchers indicate it frequently goes unnoticed by standard testing methods.

To assess how many cases of familial hypercholesterolemia remain undiagnosed, Mayo Clinic researchers conducted an analysis involving 84,000 individuals. They specifically examined exome sequencing data, a genetic test that evaluates the segments of DNA that code for proteins.

Among these participants, 419 were identified as being at risk for familial hypercholesterolemia, with 90% unaware of their condition.

Adding to the concern, one in five of these individuals had already developed coronary artery disease.

The findings suggested that these patients would likely not be identified through standard genetic testing methods.

At present, genetic testing in the United States is only conducted on those exhibiting sufficiently high cholesterol levels or possessing a recorded family history of such levels—an issue identified by Mayo Clinic researchers as a “blind spot” in national guidelines. Seventy-five percent of those diagnosed in this study would not have qualified under these criteria.

The study suggests that broader genetic screening could reveal hidden cases and potentially save lives, though other researchers note that this is not straightforward.

“The challenge is that screening everyone who would benefit from a genetic test can be prohibitively expensive, necessitating certain thresholds,” remarked cardiometabolic medicine researcher Professor Naveed Sattar in an interview with BBC Science Focus.

“Broadening screening efforts for familial hypercholesterolemia will only be feasible if testing costs decrease significantly. Nonetheless, we still need more individuals to undergo blood tests and seek genetic evaluations.”

Most individuals with familial hypercholesterolemia exhibit no symptoms. However, Sattar points out that yellowish deposits beneath the skin or, if under 45, a grayish-white ring around the eye’s cornea can indicate the condition.

“Yet, many people have no visible signs. If there is a strong family history of early heart attacks—especially if a first-degree relative experienced one before age 50—you should consider getting a lipid test earlier than the typical midlife screening.”

The findings were published in the journal Circulation: Genomic and Precision Medicine.


Source: www.sciencefocus.com

Revolutionary Single Treatment May Permanently Eliminate High Cholesterol

Cholesterol management may be achievable by altering just one switch in an individual’s genetic code—potentially for a lifetime.

A pilot study featured in the New England Journal of Medicine demonstrated a novel gene therapy that decreased patients’ low-density lipoprotein (LDL) cholesterol, commonly known as “bad” cholesterol, by nearly 50%, while also reducing triglycerides by an average of 55%.

If forthcoming trials yield similar results, this one-time therapy could serve as an alternative to the combination of medications that millions currently rely on to manage their cholesterol.

LDL cholesterol and triglycerides are lipids produced by the liver; however, excessive accumulation in the bloodstream can lead to fat deposits that may result in cardiovascular diseases, which account for about one-third of deaths in the United States.

“Both LDL cholesterol and triglycerides are linked to severe cardiovascular risks, such as heart attacks, strokes, and death,” Steven Nissen, a professor of medicine at the Cleveland Clinic Lerner College of Medicine, told BBC Science Focus.

Nissen was part of a research team focusing on lowering cholesterol levels by targeting the ANGPTL3 gene, associated with LDL cholesterol and triglycerides.

About 1 in 250 individuals possess a mutation that deactivates this gene, leading to lower lipid levels in their blood. Nissen noted, “Importantly, the occurrence of cardiovascular diseases in these individuals is also minimal.”

Thanks to CRISPR gene-editing technology, benefiting from this mutation no longer has to be left to the luck of inheritance.

CRISPR selectively modifies DNA by targeting specific genes. – Credit: Getty

Utilizing CRISPR, Nissen and his team developed a treatment to deactivate the ANGPTL3 gene in the liver, which was then infused into 15 patients during an initial safety study.

The treatment significantly reduced participants’ LDL and triglyceride levels within two weeks, and these reductions remained stable after 60 days. Nissen stated, “These changes are anticipated to be permanent.”

Healthcare professionals recommend maintaining LDL cholesterol levels below 100mg/dL to promote heart health. While lifestyle changes can assist, many individuals, particularly those with genetic tendencies to high cholesterol, find it challenging to reach this target.

While existing medications are effective, no drugs simultaneously lower both LDL cholesterol and triglycerides, often requiring patients to take multiple medications daily for life to manage their cholesterol.

“The next phase of the trial is set to commence in the coming months, involving more patients with elevated LDL cholesterol or triglycerides,” Nissen stated.

If the trials continue to succeed, this therapy could serve as a lasting solution against some of the most significant health threats globally.


Source: www.sciencefocus.com

AI Firm Secures High Court Victory in Copyright Dispute with Photo Agency

An artificial intelligence company based in London has achieved a significant victory in a High Court case that scrutinized the legality of an AI model using extensive copyrighted data without authorization.

Stability AI, whose board includes the Oscar-winning Avatar director James Cameron, successfully defended itself against allegations from Getty Images that it had infringed the international photography agency’s copyright.

This ruling is seen as a setback for copyright holders’ exclusive rights to benefit from their creations. Rebecca Newman, a legal director at Addleshaw Goddard, cautioned that it suggests “the UK derivative copyright system is inadequate to protect creators”.

There was evidence that Getty Images’ photographs were used to train Stability’s model, which lets users generate images from text prompts, and in certain instances Stability was found to have infringed Getty’s trademarks.

Judge Joanna Smith remarked that determining the balance between the interests of the creative industries and AI sectors holds “real social significance.” However, she could only address relatively limited claims as Getty had to withdraw parts of its case during the trial this summer.

Getty Images initiated legal action against Stability AI for violations of its intellectual property rights, claiming the AI company scraped and replicated millions of images with “complete indifference to the content of the training data.”


This ruling comes amid ongoing debates about how the Labour government should legislate on copyright and AI matters, with artists and authors like Elton John, Kate Bush, Dua Lipa, and Kazuo Ishiguro advocating for protections. In contrast, tech firms are seeking broader access to copyrighted material to develop more powerful generative AI systems.

The government is conducting a consultation regarding copyright and AI, stating: “The uncertainty surrounding the copyright framework is hindering the growth of both the AI and creative sectors. This situation must not persist.”

Lawyers at Mishcon de Reya, who are following the matter, note that the government is contemplating a “text and data mining exception” to UK copyright law, which would enable copyrighted works to be used for training AI models unless rights holders opt out.

Due to a lack of evidence that the training took place in the UK, Getty was compelled to withdraw its original copyright claim. Nevertheless, the company pressed on with its lawsuit, asserting that Stability had used copies of its visual assets, which Getty describes as the “lifeblood” of its business. The suit also alleged trademark infringement and “spoofing”, as some generated images bore Getty’s watermark.

Highlighting the complexities of AI copyright litigation, the group essentially argued that Stability’s image generation model, known as Stable Diffusion, constitutes an infringing copy, as its creation would represent copyright infringement if produced in the UK.

The judge determined that “AI models like Stable Diffusion that do not (and never have) stored or reproduced copyrighted works are not ‘infringing copies.'” She declined to adjudicate on the misrepresentation claims but ruled in favor of some of Getty’s trademark infringement claims regarding the watermark.

In a statement, Getty Images said: “We are profoundly worried that even well-resourced organizations like Getty Images face considerable challenges in safeguarding creative works due to the absence of transparency requirements. We have invested millions pursuing just one provider, and we must continue our pursuit elsewhere.”

“We urge governments, including the UK, to establish more robust transparency regulations. This is crucial to avoid expensive legal disputes and ensure creators can uphold their rights.”

Stability AI’s General Counsel, Christian Dowell, stated, “We are pleased with the court’s ruling on the remaining claims in this case. Although Getty’s decision to voluntarily withdraw most of the copyright claims at the trial’s conclusion left the court with only a fraction of the claims, this final decision addresses the core copyright issues. We appreciate the time and effort the court has dedicated to resolving the significant matters in this case.”

Source: www.theguardian.com

Rhinos Roamed the High Arctic 23 Million Years Ago

Paleontologists have identified a new early Miocene species of rhinoceros in the genus Epiaceratherium, based on fossilized remains uncovered in the Canadian High Arctic.

This new rhinoceros existed in present-day Canada around 23 million years ago during the early Miocene epoch.

Named Epiaceratherium ijirik, it is most closely related to other rhinoceros species that thrived in Europe millions of years ago.

“Currently, there are only five species of rhinos found in Africa and Asia, but they were once widespread in Europe and North America, with over 50 species documented in the fossil record,” stated Dr. Daniel Fraser, a researcher from the Canadian Museum of Nature, Carleton University, and the Smithsonian’s National Museum of Natural History.

“The inclusion of this Arctic species enriches our understanding of the evolutionary history of rhinoceroses.”

Epiaceratherium ijirik was relatively small and slender, comparable in size to a modern Indian rhinoceros, but notably lacked a horn.

The fossilized remains were excavated from the sediments of a fossil-rich ancient lake in Haughton Crater on Devon Island, Nunavut.

“What’s impressive about this Arctic rhinoceros is the excellent condition of the fossilized bones,” remarked Dr. Marisa Gilbert, also from the Canadian Museum of Nature.

“They are three-dimensionally preserved and only partially mineralized.”

“Approximately 75 percent of the skeleton has been recovered, which is remarkably complete for a fossil.”

By analyzing the occurrences of 57 other now-extinct rhino species, researchers traced the family tree of Epiaceratherium ijirik.

The findings were derived from visits to museum collections, reviews of scientific literature, and database analyses.

The researchers were also able to geographically categorize each rhino species across five continental regions.

This exhaustive process used mathematical modeling to estimate dispersal rates between continents within the family Rhinocerotidae, scoring each species according to where it occurred.

The analysis sheds light on how rhinoceroses utilized the North Atlantic land bridge for migration between North America and Europe (via Greenland) over millions of years.

Previous studies indicated that the land bridge served as a viable migration route only until about 56 million years ago.

However, the new analysis of Epiaceratherium ijirik and its relatives suggests that migrations from Europe to North America continued much later, potentially into the early Miocene.

“Discovering and describing new species is always thrilling and enlightening,” noted Dr. Fraser.

“But there is more to be gleaned from this identification: Epiaceratherium ijirik reveals that the North Atlantic played a more significant role in rhinoceros evolution than previously acknowledged.”

“Overall, this study reaffirms that the Arctic continues to unveil new insights and discoveries, enhancing our understanding of mammalian diversification across epochs.”

Results of this research are published in the journal Nature Ecology and Evolution.

_____

D. Fraser et al. Dispersal of rhinos through the North Atlantic during the mid-Cenozoic Era. Nat Ecol Evol published online October 28, 2025. doi: 10.1038/s41559-025-02872-8

Source: www.sci.news

Just Kidding, Meatbag! Channel 4’s AI Presenter Makes for a Demanding Watch on Multiple Levels

Tonight’s Dispatches was called “Will AI take my job?” The presence of a question mark usually suggests a negative answer, but this time it feels different. The lurking threat of AI taking over our roles is sobering to ponder.

The film claims that 8 million jobs in the UK are potentially at risk from AI. Occupations including call center agents, translators, and graphic designers, essentially everyone except masseuses and scaffolders, may soon be made redundant by rapidly advancing technology, despite its dire environmental consequences. My days are numbered too: it’s clear I’ll eventually be replaced by a ChatGPT prompt instructing, “Be histrionic and outraged about what’s on TV,” while Grok generates a signature image of a comically smug egg to accompany it. No one would know the difference.

But why would anyone tune in to Dispatches knowing that AI could render us all obsolete? There wasn’t much to be enthusiastic about unless they were disturbingly motivated to bolster their darkest fears about humanity’s future. However, the film anticipated this sentiment and provided its own clever twist. The segment was introduced by a journalist named Aisha Gavan, who, shockingly, was AI-generated from the start.

Indeed, Channel 4 has fully embraced Tilly Norwood. Gavan was devoid of humanity; she was entirely a computer-generated entity. Yet, despite being mere pixels and code, her hosting was surprisingly convincing. Sure, she had an unfeeling gaze and struggled to articulate sibilant sounds correctly, but she appeared largely human and maintained the stilted pacing typical of TV documentaries. Quite the amusing situation! Who did you consider an authority? Not even a real person! Just kidding, flesh vessel.

The episode itself was fairly well-crafted, featuring four experts—a doctor, a lawyer, a musician, and a photographer—pitted against the capabilities of AI. The overall conclusion seemed to imply that while humans might be superior, AI excels in speed and cost-efficiency. Since greed drives every industry, we’re all in deep trouble.

Some of the technology discussed made sense, like a diagnostic tool capable of assessing a patient in half the time of a typical GP. This could potentially aid healthcare professionals already overloaded by systemic failures. But who really needs an AI photographer? AI was supposed to take mundane tasks off our plates, not automate creativity. The emergence of AI photographers hints at a daunting future where we are consigned to consuming low-quality art endlessly spewed out by machines.

Ultimately, the spotlight was on Aisha Gavan, touted as Britain’s first AI TV presenter. Honestly, it felt like Channel 4 was trying to have its cake and eat it too: showing off its shiny new toy while also lampooning the very technology that generated her. What a clever stunt!

It’s tough to view the film as anything but a stern warning to Channel 4’s other presenters. Hey, Krishnan Guru-Murthy, you’d better stop grumbling about office snacks or face replacement by an animated mannequin programmed to deliver scripts! Kevin McCloud, no more contract disputes, since a virtual avatar can perform your role without ever needing a break.

And let’s not forget the environmental implications. It would have been refreshing if Dispatches had wrapped up with Gavan detailing the water consumed by the data center that produced her, especially given Channel 4’s long-standing pledge to reach net zero.

In summary, it was a profoundly challenging watch on various levels. The situation will likely worsen as AI technology continues to advance at a staggering pace. Three years from now, while you’re foraging for bugs to feed your family, ChatGPT could be providing you precise, bullet-point critiques of shows presented by AI-generated hosts. Yet it was enjoyable while it lasted, wasn’t it?

Source: www.theguardian.com

High Dosage of Wegovy: Impacts on Weight Loss and Side Effects

Weekly Wegovy injections facilitate weight loss (James Manning/PA Images/Alamy)

Testing reveals that a higher weekly dosage of Wegovy produces greater weight loss, but is also linked to an increased risk of side effects.

Prior research indicates that people receiving the standard weekly dose of Wegovy, which contains the active ingredient semaglutide, can lose up to 15% of their body weight over a year when it is combined with a healthy diet and exercise. The medication mimics the action of GLP-1 (glucagon-like peptide-1), a hormone that works in numerous ways, such as delaying stomach emptying and signaling the brain to suppress appetite.

Produced by the pharmaceutical company Novo Nordisk, Wegovy is approved for people with obesity or those who are overweight and have at least one weight-related condition, such as type 2 diabetes. “However, some patients may not experience the desired level of weight loss or may seek more than the average 10-15% reduction,” says Lora Heisler at the University of Aberdeen in the UK, who was not part of the research team.

To explore the potential benefits of increased dosage, Shawn Wharton and colleagues, including scientists from the University of Toronto and Novo Nordisk, studied over 1,000 obese adults across 11 countries, including the United States, Canada, and parts of Europe.

Participants, all without diabetes, were randomly assigned to receive a high dose of 7.2 milligrams of semaglutide, the standard 2.4 milligrams, or a placebo injection mimicking Wegovy. Semaglutide doses were gradually increased over several weeks, and all participants were encouraged to maintain a caloric deficit of 500 calories per day and to exercise for 150 minutes a week.

After one year, those receiving the standard dosage had lost an average of 16% of their body weight, while the high-dose group lost about 19%. The placebo group lost approximately 4% of their body weight.

One-third of participants in the standard dosage category experienced over 20% weight loss, compared to almost half in the high-dose group. Only 3% in the placebo group reached this threshold, indicating that higher dosages can significantly enhance weight loss outcomes, according to Heisler.

At the start of the study, more than one-third of participants in each group had prediabetes, marked by elevated blood sugar levels that do not yet qualify as type 2 diabetes. By the study’s conclusion, prediabetes cases in the high-dose group had fallen by 83%, while cases in the standard-dose group fell by 74%. “This is highly encouraging, as the main objective of weight loss is to enhance overall health,” Heisler remarks.

Nevertheless, there are noteworthy drawbacks. Gastrointestinal side effects, such as nausea, vomiting, and diarrhea, were reported by 61% of the standard-dose group and 71% of the high-dose group. In comparison, 40% of those taking the placebo also reported these symptoms, suggesting that not all of them were directly related to the treatment, according to Heisler.

Moreover, over 20% of the high-dose group reported unpleasant skin sensations known as dysesthesia. As a result, four participants discontinued their treatment. By contrast, only 6% of the standard-dose recipients and just one in the placebo group reported this side effect, with no one ceasing treatment.

These findings indicate that the advantages of higher dosages may outweigh the associated risks for certain individuals, according to Heisler. “For those requiring substantial weight loss who don’t experience many side effects, the higher dose may facilitate their goals,” she states. However, it may not be appropriate for individuals achieving sufficient weight loss on standard doses or those enduring severe side effects. She emphasizes the need for additional trials to validate these results before clinical application.

In a separate trial, Wharton and his team report that higher doses might yield greater weight loss and improved blood glucose control in people with both obesity and type 2 diabetes. However, the differences were not statistically significant, indicating a need for further investigation, says Simon Cork at Anglia Ruskin University in the UK.


Source: www.newscientist.com

Exploring ‘Silly, Fat, and Ugly’: A Personal Journey Through High School in Gaming

If you visited the V&A’s Videogames: Design/Play/Disrupt exhibition back in 2018, you may have encountered an intriguing set of mini-games. There, you could arrange food on a Tetris-like board to balance calories perfectly, and battle quirky physics to feed a character named Jenny and twist her into a Pilates position.

Nearly seven years later, the complete version of Consume Me, which recently snagged the Grand Prize at the Independent Game Festival, is set to release this September. Developer Jenny Jiao Hsia explains that the game evolved into a semi-autobiographical narrative reflecting her high school feelings of being “silly, fat, and ugly.” What initially started as a series of mini-games focusing on Hsia’s struggles with restrictive dieting has transformed into an exploration of various aspects of her teenage life.




Many aspects of life as a teenager… Consume Me.

Hsia and co-designer Alec “AP” Thomson have been collaborating on games since their time at NYU Game Center. The idea for Consume Me emerged when Hsia shared her old diaries with Thomson, which contained her calorie charts and dieting notes. “I thought, ‘Hey, doesn’t this look like a game?'” she recalls. Thomson concurred, stating, “We started with a small prototype, and once we secured funding, the game really took off.”

The duo continued to refine their ideas, aiming to create a substantial game. “The last major project we worked on was essentially a student project,” Thomson mentions. This game was a match-3 puzzle released in 2016, and compared to that, “the entire process of Consume Me is completely different.” Hsia humorously reflects on the experience, saying, “I sat next to AP every day and was eager to guide him. With Consume Me, I had to take on more responsibility. I don’t consider myself very experienced, so it took me quite some time.”

Hsia clarifies that Consume Me was not meant as a means to address her challenges with dieting, as she left behind a phase of her life before the game’s development. Instead, she finds that crafting the narrative from her own experiences yields a richer story. “If you’re creating something from scratch without solid experiences to draw from—unless you have a strong imagination—I think it’s uninteresting,” she notes. “The character Jenny in the game isn’t solely based on me. She’s a blend of AP and me, depicting the enthusiasm for achieving goals beyond her to-do list.”

Hsia expresses her surprise that many players find Consume Me relatable and approachable. Individuals who struggle with focus might see themselves in the reading mini-games, where Jenny’s attention keeps spinning in circles with her books. Moreover, she often finds herself short on time. Additionally, Jenny’s financial woes are highlighted in a mini-game where she discovers a lucky $20 bill on the street while walking her dog, humorously dealing with its “tremendous” bowel movements. Jenny’s habit of discovering money on the streets of New York is a reflection of Hsia’s own life experiences.

“I’m curious about what people actually take away from the game,” she admits. “It’s fascinating to showcase a part of your life and let a stranger interpret it, then watch their reactions.”

Consume Me will be available for PC on September 25th.

Source: www.theguardian.com

In 2024, We Experienced a Record High of Dangerous Hot and Humid Days.

Shanghai endured extreme heat and humidity for days in 2024 (Reuters/Nicoco Chan)

The Earth recorded an unprecedented number of hazardous hot and humid days in 2024, as climate change escalates global humidity to alarming levels.

The worldwide average number of humid heat days over land reached 35.6 last year, above the 1991-2020 average and more than 9.5 days above the previous record, set in 2023, according to the State of the Climate in 2024 report published by the American Meteorological Society.

Under hot and humid conditions, it is challenging for individuals to cool down, as moist air diminishes the evaporative cooling impact of sweating. Such weather poses serious risks to human health. Kate Willett from the UK Met Office, who contributed to the report, states, “Your body starts to struggle to shed heat, making it very dangerous.”

Meteorologists measure combined heat and humidity using the “wet bulb temperature”. Traditionally this is obtained by wrapping a wet cloth around a thermometer bulb, so that evaporating water cools it. The reading is lower than the ordinary dry bulb temperature, but the higher the humidity, the less evaporation can occur, so the wet bulb temperature climbs closer to the air temperature.
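For readers who want a sense of the numbers without a wet cloth and thermometer, one widely used empirical approximation is Stull’s 2011 formula, which estimates wet bulb temperature from air temperature and relative humidity. A minimal sketch (the example inputs are illustrative, not taken from the report):

```python
import math

def wet_bulb_stull(temp_c: float, rh_percent: float) -> float:
    """Approximate wet bulb temperature (deg C) from air temperature (deg C)
    and relative humidity (%), using Stull's 2011 empirical formula.
    Reasonable for roughly 5-99% humidity and -20 to 50 deg C."""
    t, rh = temp_c, rh_percent
    return (t * math.atan(0.151977 * math.sqrt(rh + 8.313659))
            + math.atan(t + rh)
            - math.atan(rh - 1.676331)
            + 0.00391838 * rh ** 1.5 * math.atan(0.023101 * rh)
            - 4.686035)

# Example: 38 deg C air at 60% relative humidity
print(round(wet_bulb_stull(38, 60), 1))  # about 31 deg C, near the danger threshold discussed below
```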

As the globe heats up, the atmosphere can hold more moisture, leading to heavier rainfall, more storms, and higher humidity. Willett describes 2024 as “exceptionally” humid, with moisture levels second only to those of 2023.

Some regions, including the Middle East, Southeast Asia, and east China, saw wet bulb temperatures exceed 29°C (84°F) multiple times, at points reaching 31°C (88°F), according to the report. At these levels, extended exposure outdoors is extremely perilous and potentially lethal.

Traditionally, scientists consider a wet bulb temperature of 35°C the threshold for human survival, as people cannot withstand outdoor conditions for more than a few hours before succumbing. However, research published in 2022 suggests that the actual limit may be lower, around 31°C. “Over 30°C, your body really struggles,” Willett explains.


Source: www.newscientist.com

Living at High Altitudes Could Help Combat Obesity

The study examined obesity rates among children in Colombia’s high-altitude capital, Bogotá (Guillermo Legaria/Getty Images)

A study involving over 4 million children in Colombia suggests that living at high altitudes may help in preventing obesity.

This outcome is consistent with existing research. Higher altitudes are thought to reduce obesity, potentially due to increased energy expenditure at lower oxygen levels. Most studies, however, have focused primarily on adults.

To explore the effect in children, Fernando Lizcano Rosada at the University of La Sabana in Chía, Colombia, and his team analyzed data on 4.16 million children aged up to 5, drawn from municipalities across the country and sourced from the Colombian Institute of Family Welfare.

The participants were categorized into four groups based on the altitudes where they resided.

In the two lowest-altitude groups, about 80 out of every 10,000 children were classified as obese. At altitudes of 2,001 to 3,000 meters, the rate dropped to about 40 per 10,000.

However, at elevations above 3,000 meters, the prevalence rose again, reaching 86 out of 10,000. The researchers caution that this might be a statistical anomaly since it is based on data from only seven municipalities and 11,498 individuals, substantially fewer than the data for the other altitude groups.

“That’s a valid observation,” states David Stencel from Loughborough University, UK. He notes that a dose-response relationship would have strengthened the findings.

Stencel also underscores that the study is observational, meaning it does not definitively prove that high altitudes reduce obesity risk. “The research takes into account various confounding factors,” he explains, including measures of poverty. Yet, he adds, “we cannot account for everything.”

Nevertheless, he sees the research as a promising start. “It establishes a relationship that calls for more targeted studies to verify the hypothesis independently.”

Lizcano Rosada posits that metabolism may be heightened at higher altitudes, leading to increased energy expenditure.

This claim is plausible, Stencel agrees. “Some studies indicate that resting metabolic rates may be elevated at high altitudes,” he notes. For instance, a 1984 study found that climbers tended to lose more weight at high altitudes partly because fat from food was burned or expelled before being stored as tissue.

More recent studies suggest that lower oxygen levels may lead to accelerated metabolism and increased levels of leptin, the hormone related to satiety, while levels of ghrelin, often associated with hunger, are reduced.

If it is indeed true that high altitude diminishes obesity risk, Stencel notes that the practical application of this knowledge in combating obesity remains ambiguous. Nonetheless, Lizcano Rosada asserts that personalized advice could be beneficial, acknowledging that diverse environmental factors contribute to obesity across various locales.


Source: www.newscientist.com

Temperatures Above 90 Degrees Forecast for the Northeast, Including Parts of New England

LAS VEGAS – The severe heatwave plaguing the Southwest is anticipated to move eastward in the coming days, with temperatures surpassing 90 degrees Fahrenheit expected in the northeast, particularly in parts of New England later this week.

Around 15 million individuals are currently facing extreme heat advisories in eastern California, Nevada, Arizona, and western Texas. Triple-digit temperatures are projected to become widespread throughout the area by Wednesday.

In southern Nevada, temperatures this week may rise up to 12°F above the seasonal average, as reported by the National Weather Service. Nearby Arizona is predicted to see highs reaching 115°F across its southern and central regions.

As the week progresses, this heat will extend into the Midwest and Great Lakes, leading to the region’s first significant heat wave this weekend.

The Weather Service has indicated that record high temperatures are possible from Northern and Eastern Colorado to Nebraska and South Dakota on Thursday, affecting parts of the Great Basin and western South Dakota.

Research indicates that climate change may contribute to more frequent, intense, and prolonged heat waves.

Warm temperatures combined with high humidity will affect the East Coast this weekend and into next week. In major cities like New York City, Boston, and Washington, DC, the heat index (the combined effect of temperature and humidity on how hot it feels) is expected to reach the mid-90s or higher.

“Starting early next week, we will see a stretch of prolonged hot and humid weather beginning Sunday and lasting through at least mid-week,” the National Weather Service’s New York office said in a post on X.

Meanwhile, extreme heat is forecast to persist across southwestern desert regions, including Death Valley, for the remainder of the week, according to the weather service.

Source: www.nbcnews.com

High Court Calls on UK Lawyers to Halt AI Misuse After Noting Fabricated Case Law

The High Court has told senior figures in the legal profession to take urgent action to curb the misuse of artificial intelligence, after dozens of fake case-law citations, either wholly fictitious or containing fabricated passages, were put before the courts.

Lawyers are increasingly using AI systems to help formulate legal arguments, and two cases this year were marred by citations of fictitious legal precedents believed to have been generated by AI.

In an £89 million damages lawsuit against Qatar National Bank, the claimant’s filings cited 45 authorities. The claimant acknowledged using publicly accessible AI tools, and his legal team admitted citing authorities that do not exist.

When Haringey Law Centre challenged the London Borough of Haringey over its alleged failure to provide temporary accommodation for a client, its lawyer cited fictitious case law multiple times. Suspicions arose when the counsel representing the council had to repeatedly explain why they could not find any trace of the supposed authorities.

The episode led to proceedings over wasted legal costs, with the court ruling that the Law Centre and its lawyers, including the pupil barrister involved, had been negligent. The barrister denied deliberately using AI in that case, but said she may have done so inadvertently while preparing for another case in which she cited the same fictitious authority, possibly by relying on AI-generated summaries without realizing what they were.

In the ruling, Dame Victoria Sharp, president of the King’s Bench Division, warned: “If artificial intelligence is misused, it could severely undermine public trust in the judicial system. Lawyers who misuse AI could face disciplinary actions, including contempt-of-court sanctions and referral to the police.”

She urged the Bar Council and the Law Society to treat the issue as a matter of urgency, and instructed the heads of barristers’ chambers and managing partners of solicitors’ firms to ensure all lawyers understand their professional and ethical responsibilities when using AI.

“While such tools can produce apparently coherent and plausible responses, those responses may be entirely incorrect,” she said. “They might assert confidently false information, cite sources that do not exist, or misquote genuine documents.”

Ian Jeffrey, chief executive of the Law Society of England and Wales, said the ruling “highlights the dangers of employing AI in legal matters”.

“AI tools are increasingly utilized to assist in delivering legal services,” he continued. “However, the significant risk of inaccurate outputs produced by generative AI necessitates that lawyers diligently verify and ensure the accuracy of their work.”


These are not the first cases to be undermined by AI-generated inaccuracies. At a UK tax tribunal in 2023, an appellant who said she had been helped by “an acquaintance at a law office” provided nine fictitious historical rulings as precedents. She acknowledged it was possible she had used ChatGPT, but claimed other cases also supported her position.

Earlier this year, in a Danish case worth 5.8 million euros (£4.9 million), the appellants narrowly avoided contempt proceedings after relying on a fabricated ruling that the judge spotted. And a 2023 case in the US District Court for the Southern District of New York descended into turmoil when lawyers were asked to produce seven apparently fictitious cases they had cited. When asked to summarize them, ChatGPT simply elaborated on the cases it had previously invented, prompting the judge to voice concerns and fine two lawyers and their firm $5,000.

Source: www.theguardian.com

Massive Clifftop Boulder in Tonga Was Carried by a 50-Meter-High Wave

Martin Köhler in front of the Maka Lahi boulder in Tonga (Martin Köhler/University of Queensland)

A massive 1,200-tonne boulder in Tonga was carried inland when a towering wave around 50 meters high crashed over the island’s roughly 30-meter-high cliffs.

“This is not just an ordinary boulder. It is the largest cliff-top boulder ever found and ranks as the third-largest boulder globally, meaning an immense force was needed to carry it to such a height,” said Martin Köhler from the University of Queensland in Brisbane, Australia.

Locally known as Maka Lahi, which translates as “big rock”, the boulder had not previously been studied by scientists.

During fieldwork in Tonga in July 2024, the villagers pointed the researchers towards some intriguing rocks they might want to examine.

“We never anticipated discovering such a substantial rock at the end of our field studies. It dawned on us quickly that we had stumbled upon a significant find,” Köhler explained.

Measuring 14 meters long, 12 meters wide, and approximately 7 meters high, the “remarkable” boulder is composed of coral reef limestone. It had been missed in earlier satellite searches for potential tsunami boulders because vegetation had grown over it and it sits within the surrounding forest.

When they examined the site, the researchers identified a massive gouge at the top of the cliff, believed to have been created as the boulder was carried roughly 200 meters inland from the ocean.

The team utilized computer models to ascertain how this colossal boulder ended up above sea level.

Shifting it would have required a wave at least 50 meters high and lasting around 90 seconds, meaning the water was moving faster than 22 meters per second over that minute and a half, Köhler said. Such a colossal tsunami would have been relatively localized and is thought to stem from a nearby underwater landslide.

Dating indicated the boulder was deposited roughly 6,891 years ago, well before humans settled the island.

“It’s hard for me to fathom a 50-meter wave, since I’ve never witnessed or heard of such massive waves before,” Köhler remarked. “However, the logic follows easily when one considers this enormous boulder positioned 200 meters inland on a 39-meter-high cliff.”

Only two larger tsunami-deposited boulders have been found on land anywhere in the world: one weighing 3,400 tonnes and another weighing 1,500 tonnes.


Source: www.newscientist.com

Are Slate Auto Electric Trucks the Solution to High Car Prices?

Social media buzzed when the startup Slate Auto unveiled its electric pickup truck priced at approximately $25,000 last month. The vehicle’s stripped-down design features plain, unpainted body panels and nostalgic hand-crank windows.

Why the excitement? According to Cox Automotive, the average monthly payment for a new vehicle surged to $739 in March, up from $537 in January 2019. The average new car now costs $47,400, while the average electric model runs around $59,200. High interest rates, currently about 9.4% on a 72-month loan, have further strained buyers’ budgets.
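As a rough illustration of how those numbers relate, here is the standard loan-amortization arithmetic, assuming the full average price is financed at the quoted rate (real deals include down payments, trade-ins, taxes, and fees):

```python
def monthly_payment(principal: float, annual_rate: float, months: int) -> float:
    """Standard amortized-loan payment: principal * r / (1 - (1 + r)**-n)."""
    r = annual_rate / 12  # monthly interest rate
    return principal * r / (1 - (1 + r) ** -months)

# Financing the $47,400 average new-car price at 9.4% APR over 72 months
print(round(monthly_payment(47_400, 0.094, 72)))  # about $864 per month
# The $739 average payment reflects smaller amounts actually financed after
# down payments and trade-ins.
```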

“Prices and interest rates are exceptionally high,” stated Mark Schirmer, director of industry insights at Cox Automotive. “For consumers who haven’t been in the market since 2018, the cost of a vehicle might seem shocking.”

President Trump’s 25% tariffs on imported automobiles and parts have prompted consumers to buy now, fearing further price increases. Cars priced below $30,000 are particularly vulnerable, with nearly 80% of them built abroad and therefore facing the tariffs, including popular models like the Honda Civic and Toyota Corolla. The supply of budget-friendly models is expected to dwindle as automakers may stop importing certain vehicles entirely.

Enter Slate, a suburban Detroit startup backed by venture capital and Amazon founder Jeff Bezos.

Former Fiat executive Chris Berman, now CEO of Slate, mentioned that their trucks won’t be available until late 2026 but are intentionally designed to alleviate sticker shock.

True to its name, the truck serves as a blank canvas, enabling buyers to customize with over 100 accessories, such as power windows and heated seats, as their budgets allow or needs evolve. While it lacks built-in stereo or touchscreen display, it features a dock for phones and tablets, which saves costs and helps avoid the digital obsolescence often seen in car entertainment systems.

“I believe hardworking Americans are seeking value for their money,” Berman expressed in a recent interview.

This message resonated with 41-year-old Liv Leigh, who paid $50 to reserve a Slate truck during its public debut at Long Beach Airport in California last April.

She watched Slate employees convert the two-seat pickup into a five-seat SUV in about an hour. Leigh values the truck’s compact size, smaller than a Civic, along with its moderate 150-mile range.

“I love the concept of a utilitarian truck, a basic model that can handle dogs, muddy bikes, and plywood easily,” Leigh remarked. “We don’t need a massive vehicle for our needs.”

Berman emphasized that efficient design and manufacturing are critical to maintaining the low price of their trucks. The grey plastic composite body panels eliminate the necessity for costly steel body stamping facilities or paint shops.

Much as the Ford Model T was available only in black, the Slate comes in a single color, but buyers can choose from 13 colors of vinyl body wrap for an additional $500. Customers can also opt for a larger factory-installed battery that extends the range to 240 miles.

“This approach keeps costs down while offering customers the freedom of choice,” said Berman. “They can customize their vehicles as per their preferences rather than adhering to manufacturer standards.”

Slate anticipates that its US-based supply chain, including batteries produced by South Korea’s SK On, will qualify for a $7,500 federal tax credit. However, some Republican lawmakers recently introduced a budget bill that removes this incentive and dismantles other Biden-era climate and energy policies.

Success hinges on Slate’s ability to navigate the treacherous landscape of electric vehicle startups, as several young manufacturers like Fisker, Nikola, and Canoo have sought bankruptcy protection.

Regardless of subsidies, Berman remains optimistic about Slate’s business strategy.

The company aims to price the truck around $20,000 before any government incentives, hoping to become a contender against the Nissan Leaf, which is the most affordable electric vehicle at $29,300 but no longer qualifies for tax credits. Chevrolet is set to release a redesigned Bolt SUV for roughly $30,000 by year-end, which will qualify for a tax credit, reducing its effective price to about $22,500.

Erin Keating, executive analyst at Cox Automotive, praised the Slate truck’s originality. However, she noted that the two-seat pickup’s short range and minimalist interior might not attract American buyers accustomed to high-tech features and comforts.

“There’s no harm in attempting to resolve the affordability crisis, but I question whether this will become a high-volume seller,” Keating commented. “Ultimately, this is a compact EV that offers very little. It doesn’t improve the array of affordable options with longer ranges.”

The Ford Maverick poses a potential challenge to the Slate, as its compact pickup is two feet longer, seating five passengers and featuring even more amenities. The hybrid version achieves 40 miles per gallon, with over 500 miles of range on a full tank.

Ford sold 131,000 Mavericks last year, indicating substantial demand for small, fuel-efficient trucks. The company has raised the starting price for hybrid versions to $28,150, partly because of tariffs on trucks assembled in Mexico, but has said it will not pass the entire tariff burden on to consumers, offering employee pricing to all buyers until early July.

As with all vehicle types, American pickups have morphed dramatically over the years, with some extravagant models costing as much as luxury European sedans. Electric trucks from Tesla, Rivian, and Ford range from $70,000 to $100,000 or even higher.

Berman is keeping an eye on market opportunities for personas such as entry-level truck enthusiasts, families seeking a second vehicle, empty-nesters, landscapers, contractors, and delivery personnel. The company anticipates selling more trucks to customers who would typically opt for used cars, with an average price point estimated at $26,000.

A significant hurdle for Slate and other firms aiming to sell more affordable vehicles is that many Americans don’t appear to be purchasing such offerings, despite their stated preferences.

Keating highlighted that around 20 models currently available start below $25,000, predominantly small cars or SUVs, including the $18,300 Nissan Versa, the lowest-priced car on the market.

Almost all mid-sized family sedans start below $30,000, including popular models like the Honda Accord, Toyota Camry, and Hyundai Sonata. Yet, many Americans favor larger vehicles; SUVs, pickups, and minivans now comprise over 80% of the market.

Trump’s trade policies remain unpredictable. Analysts expect tariffs to add thousands of dollars to new car prices, which would in turn push up demand for, and prices of, used vehicles.

In April alone, Americans purchased 1.5 million new cars, 400,000 more than in April 2024. However, analysts have noted that buyers are acting now to avoid being caught in a crunch later. Jonathan Smoke, chief economist at Cox Automotive, mentioned that new car inventories have reached their lowest point in two years, indicating potential price increases as dealerships sell out ahead of impending tariffs. Meanwhile, S&P Global Mobility has reduced its forecasts for new car sales, anticipating a 4% decline this year.

For those seeking refuge amidst financial uncertainty, electric vehicles present a sound investment, according to Keating. New electric vehicles received an average discount of 13.3% in March, translating to savings of nearly $8,000.

Leigh recently leased a Chevrolet Equinox for two years, paying $5,500 upfront, resulting in a monthly payment of $230. The electric SUV boasts a 319-mile range. “Many people aren’t aware of the extensive incentives available,” she noted.

Source: www.nytimes.com

Early-Season Heat Waves Bring Record High Temperatures to the Southwest and Texas

This week, the country is bracing for early seasonal heat waves, with record or near-record high temperatures anticipated across the Northern and Southern Plains, Southwest, and vast regions of central and southern Texas.

On Monday, temperatures climbed into the 90s in North Dakota, South Dakota, and Minnesota, with some areas likely reaching triple-digit highs.

Beginning Tuesday, Texas will experience its hottest conditions, with temperatures exceeding 100 degrees Fahrenheit becoming commonplace throughout the state, according to the National Weather Service.

“We are expecting record-breaking heat by mid-week across much of central and southern Texas,” the Weather Service said in a short-range forecast on Monday.

In a series of posts on X, the San Antonio Weather Service office cautioned that many people may struggle to adapt to such extreme temperatures, heightening the risk of heat-related illnesses and fatalities.

“Temperatures are slated to soar above 100 on Tuesday, with some locations potentially hitting 110 mid-week. Ensure you have access to cooling and ample hydration before the heat arrives,” the office advised in a post on X.

As the week continues, the heat will intensify in the central and southern plains, eventually spreading to the southeastern U.S. and Florida.

Cities likely to set new daily temperature records this week include Austin, Dallas, San Antonio, and Houston in Texas, as well as Oklahoma City; Shreveport, Louisiana; Charleston, South Carolina; and Tallahassee, Jacksonville, and Orlando in Florida.

The unseasonably high temperatures are attributed to strong high-pressure ridges situated over much of the country, particularly in Texas. These “thermal domes” effectively trap heat in the region, leading to elevated temperatures for several days.

Southern California recorded historic highs over the weekend, with downtown Los Angeles peaking at 103 degrees, surpassing the previous record of 99 set in 1988.

During the Los Angeles heat wave, individuals took a moment to hydrate on Sunday.
Carlin Steele/Los Angeles Times Getty Images

Research indicates that climate change is intensifying the frequency, duration, and intensity of heat waves globally. Scientists predict yet another hot summer following two consecutive years of record-breaking global temperatures (2023 and 2024).

These ongoing record temperatures are part of the warming trends long anticipated by climate models. The hottest years on record since 1850 have all occurred within the last decade, according to the National Oceanic and Atmospheric Administration.

Source: www.nbcnews.com

Reducing high blood pressure may decrease the likelihood of developing dementia

Low blood pressure is associated with a lower risk of dementia

Shutterstock/Greeny

According to a large study of people in rural China, lowering high blood pressure reduces the risk of dementia and cognitive impairment.

Many studies have linked hypertension, also known as high blood pressure, with a higher risk of developing dementia. Some have also suggested that people taking blood-pressure-lowering treatment may be at lower risk of dementia.

Now, Jiang He at the University of Texas Southwestern Medical Center in Dallas and his colleagues have directly tested whether drugs that lower blood pressure reduce the risk of dementia and cognitive impairment.

They studied 33,995 people in rural China, all over 40 years old and with high blood pressure, who were randomly assigned to one of two groups. The average age in each group was approximately 63 years.

The first group received, on average, three antihypertensive drugs, such as ACE inhibitors, diuretics or calcium channel blockers, to bring their blood pressure down. They were also coached on home blood pressure monitoring and on lifestyle changes that help reduce blood pressure, such as losing weight and cutting alcohol and salt intake.

The second group, which served as a control, received the usual level of local care, which included on average one blood pressure drug.

At follow-up appointments 48 months later, participants’ blood pressure was measured and they were assessed for signs of cognitive impairment using a standard questionnaire.

A person is generally considered to have high blood pressure when their systolic pressure exceeds 130 millimetres of mercury (mmHg) or their diastolic pressure exceeds 80 mmHg, in other words when their blood pressure exceeds 130/80.

On average, people in the intensively treated group lowered their blood pressure from 157.0/87.9 to 127.6/72.6 mmHg, while the control group went from 155.4/87.2 to just 147.7/81.0 mmHg.

The researchers also found that people who received the multi-drug therapy were 15 per cent less likely to be diagnosed with dementia during the study than those in the control group, and 16 per cent less likely to develop cognitive impairment.
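
Those figures are relative reductions, not absolute ones. As a minimal sketch of what a 15 per cent relative risk reduction means, using made-up incidence numbers rather than the trial’s actual counts:

```python
# Illustration of a relative risk reduction with hypothetical numbers;
# these are NOT the incidence figures from the Chinese trial.
control_incidence = 0.040   # assume 4.0% of the control group developed dementia
treated_incidence = 0.034   # assume 3.4% of the intensively treated group did

relative_risk = treated_incidence / control_incidence   # 0.85
relative_risk_reduction = 1.0 - relative_risk            # 0.15, i.e. "15% fewer"

print(f"Relative risk: {relative_risk:.2f}")
print(f"Relative risk reduction: {relative_risk_reduction:.0%}")
```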

“The results of this study demonstrated that lowering blood pressure is effective in reducing the risk of dementia in patients with uncontrolled hypertensive conditions,” he says. “This proven and effective intervention should be widely adopted and expanded to alleviate the global burden of dementia.”

“Over the years, it has become well known that high blood pressure is likely a risk factor for dementia,” says Zachary Marcum at the University of Washington in Seattle.

Raj Shah at Rush University in Chicago says the added evidence that treating high blood pressure can help stave off dementia is useful, but it is just one piece of the dementia puzzle, given the many things that affect our brain abilities as we age.

“You need to treat hypertension for multiple reasons,” says Shah, “for people’s longevity and wellbeing, so that they can age healthily over time.”

Marcum also says people should think more broadly than just blood pressure when it comes to avoiding dementia. Other known risk factors are associated with an increased chance of developing the condition, including smoking, inactivity, obesity, social isolation and hearing loss, he says.

Many of these factors matter more at different stages of life. To reduce the risk of dementia, “a holistic approach is needed throughout your life,” says Shah.


Source: www.newscientist.com

The Truth About Endorphins and Runner’s High: How to Increase Your Natural High

Even for those who are not runners and can’t see the appeal of a two-hour run at 6am, it is well established that running (and other forms of aerobic exercise) can create powerful chemical highs comparable to real drugs.

The body naturally produces two pleasurable substances associated with the runner’s high. Endorphins are well-known neurotransmitters that can be likened to morphine for their pain-relieving properties.

One theory suggests that our ancestors evolved to produce endorphins to help them chase prey or escape predators by numbing foot pain and blisters.

Research indicates that for runners, a long-term, moderately intense run is the ideal scenario for endorphin production. If you’re aiming to experience the runner’s high, try a “tempo” run.

After a good warm-up, aim to run for at least 20 minutes at a pace of about 6 or 7 out of 10 (with 10 being an all-out sprint).

Running can produce powerful chemical hits that justify comparisons with real drugs – Illustration Credit: James Clapham

While endorphins have traditionally been credited with causing the euphoria of the runner’s high, recent research suggests that another substance may be the actual source of the uplift felt towards the end of a run.

Endocannabinoids are molecules similar to those in marijuana that enhance the mood, but are naturally produced by the body.

Research shows that when cannabinoid receptors are blocked in mice, they exhibit reduced activity. In a study in 2021, researchers at the University Medical Center Hamburg Eppendorf found that even when the opioid receptor that binds to endorphins is blocked, runners still experience the high.

This suggests that cannabinoids may be more responsible for the runner’s high than previously thought.

In the study, participants ran at a moderate pace for 45 minutes. To achieve a similar high, aim for a consistently challenging pace where holding a conversation becomes difficult.


This article addresses the question (posed by Emily Marine of Colchester) “When does the runner’s high kick in?”

Please email us with your questions at Question @sciencefocus.com or message us on Facebook, Twitter, or on our Instagram page (don’t forget to include your name and location).


Source: www.sciencefocus.com

Scammers Use Stolen Money to Buy “Sim Farm”, High Heels, and Zombie Knife

On a shelf between the Alexander McQueen shoes, Louis Vuitton handbags and Versace heels in the police evidence room sit an 18-inch machete and a serrated zombie knife. According to DCI Paul Curtis, these are tools the fraudsters felt they needed, alongside the expensive fashion bought with the proceeds of serious fraud.

“These are serious criminals, and for some reason they felt the need to have these to protect themselves,” he says. Also seized were “SIM farms” bought on the dark web, which scammers use to send many text messages at once, along with stacks of laptops, mobile phones and payment card readers.

The haul of tools and luxury goods comes from raids led by the Dedicated Card and Payment Crime Unit (DCPCU), which the Observer visited this month. Responsible for uncovering payment fraud across the country and charging the perpetrators, the unit’s officers, drawn from the City of London and Metropolitan police forces, are at the forefront of the fight against fraud.

Some of its recent successes include convicting bank staff and a police officer who defrauded the public, and a Chinese national who drove around the western edge of London sending fake text messages from a machine in his car.




DCI Paul Curtis says fraud is “underreported and causes great harm.”

The National Crime Agency (NCA) estimates that 40% of crime is fraud-related, and puts the amount lost at billions of pounds each year. Uncovering the criminals behind payment fraud is an uphill struggle.

There are many problems. Criminals are often based overseas and take advantage of constant advances in technology to dupe victims. But one difficulty is simple and common: people are often so embarrassed at having fallen for a fraud that they cannot bring themselves to mention it to those around them. “It’s underreported and causes great harm,” Curtis says. “And the harm is not just financial.” Victims can lose their self-confidence and develop mental health issues, which can even lead to suicide.

“It’s really challenging to overcome people’s own embarrassment and shame,” he says. “It’s about getting comfort and support from the network around you, from your family [and] from your social network. And if people have that support, it can be very empowering, [so they can] then report and engage with law enforcement.”

The Guardian recently revealed details of the sophisticated, mercenary tactics used by scammers in an elaborate operation based in Tbilisi, Georgia. Many victims were called again and again by the criminals and progressively persuaded to hand over more money.

Curtis says criminals in fraud cases use similar methods to criminals who are sexual predators when grooming victims. “It works exactly the same as a scammer. They have to build confidence with the victim. They have to build that trust. So this may not be a quick process to become a victim of a scam,” he says.




A Louis Vuitton bag seized in the raids

Technological advances present a continuing problem for the DCPCU. Money can now travel across borders far faster than before, putting it out of reach of law enforcement, and adverts for fake investment schemes often falsely claim the backing of celebrities such as MoneySavingExpert’s Martin Lewis.

According to Ben Donaldson, managing director of economic crime at UK Finance, the banking body that funds the DCPCU, the development of generative AI, which can use patterns in existing data to create text, images or video, is giving criminals new opportunities.

“I think it gives [criminals] a variety of capabilities they didn’t have before, and … some of this technology is easy to access and easy to use.

“It’s much easier to do that [fraud] in a very convincing way [as] there is now a variety of capabilities available to criminals, changing the nature of the threat. The bar to entry [with] that type of technology is now very low,” Donaldson says.

UK Finance and consumer groups this month wrote a joint letter to the UK government asking it to require technology companies to take “robust action” against fraud and to help offset its growing costs.

Donaldson says the majority of authorised push payment (APP) scams involve tricking someone into voluntarily sending money from their bank account. Facebook’s decision, announced in January, to remove fact-checkers and scale back content moderation raises concerns that it will become even easier for criminals to exploit people, he says.

Police are seeking more effective user verification to prevent criminals from operating anonymously, and to share more information that can identify them.

Meta, the parent company of Facebook, Instagram and WhatsApp, says it has begun a reciprocal exchange of fraud intelligence that allows banks to share information about scams, which has so far led to the deletion of 20,000 accounts. TikTok says each police request for data is investigated and assessed before any data is disclosed.

So, as scams flood in through emails, texts, WhatsApp messages and a host of social media channels, what can people do to stay safe?

Donaldson advises treating your personal information the way you treat the keys to your home: “Don’t hand over any aspect of your personal information unless you would trust that person with your door key.”

Source: www.theguardian.com

Understanding the Implications of Apple’s High Court Challenge on Data Protection

The appeal will be heard by the Investigatory Powers Tribunal, the court that examines whether the UK’s intelligence services have acted unlawfully.


What is the UK government requesting from Apple?

The Home Office has issued a “technical capability notice” under the Investigatory Powers Act, which can require businesses to assist law enforcement in obtaining evidence. The focus is on Apple’s Advanced Data Protection service, which encrypts personal data stored on Apple’s cloud servers.

The UK government hopes that Apple will provide access to its services’ content through backdoors.


Why is Apple opposing this?

Apple values privacy as a core principle and has removed its Advanced Data Protection Tool from the UK. The tool offers end-to-end encryption, ensuring only the account owner can decrypt the data. Apple’s iMessage and FaceTime services maintain end-to-end encryption.

Apple is backed by human rights groups that are also challenging the technical capability notice as an overbroad demand that exposes billions of users’ personal data to potential threats.


Can Apple succeed in the challenge?

Legal lecturer Dr. Daniella Lock from King’s College London suggests Apple has a chance due to human rights considerations. The requirement for a backdoor to access encrypted data may be viewed as disproportionate, and questions arise about data security.

However, Lock acknowledges that the UK government’s secrecy surrounding the case could hinder Apple’s defense, as courts tend to support national security interests.


Does the US government support Apple?

The US government has expressed concerns about the UK’s demands on Apple, with President Trump likening it to Chinese surveillance practices.

“We told them you can’t do this,” Trump said in an interview. “We actually said [to Starmer] … you can’t believe it. That’s something, you know, you hear about China.”


Would Apple’s defeat create a precedent?

Regardless of the outcome, future conflicts with tech companies are possible as the IPA requires companies to notify the government of changes affecting data access. Services like WhatsApp, committed to privacy, may also face similar requests.

This case represents a critical battleground between law enforcement and technology, balancing users’ privacy rights and overall security concerns.

Source: www.theguardian.com

There is a high probability that Asteroid 2024 YR4 will not collide with Earth in 2032.

Astronomers raced to observe asteroid 2024 YR4

NASA/Magdalena Ridge 2.4M Telescope/New Mexico Institute of Technology/Ryan

The world’s space agencies have cut their estimates of the chance that asteroid 2024 YR4 will hit Earth to less than 1 per cent, strongly suggesting that a potentially catastrophic collision will be avoided. However, the asteroid will probably still pass extraordinarily close to our planet, giving astronomers a rare opportunity to observe it up close.

“We don’t expect the impact probability for 2032 to rise above 1 per cent again, despite the close approach to Earth,” says Richard Moissl at the European Space Agency (ESA). “The most likely further development is a further reduction in the impact probability, perhaps even dropping to zero.”

The alarm over asteroid 2024 YR4 was first raised last December, when astronomers discovered it could be on a collision course with Earth in 2032. The asteroid appears to be 40 to 90 metres wide and could produce a devastating explosion if it struck a city. Over the following weeks, telescopes and space agencies around the world closely tracked its orbit, refining estimates of its future path. On 17 February, the impact risk peaked at roughly a 1-in-32 chance, but within days this fell to 1-in-67, or about 1.5 per cent.

On 20 February, new observations led to a sharp downgrade of this risk, with NASA putting the impact probability at 0.27 per cent, or about 1-in-360, and ESA at 0.16 per cent, or 1-in-625. These ratings place 2024 YR4 at 1 on the 10-point Torino scale used to evaluate the hazard posed by such objects, down from a score of 3. It is now considered one of the many low-risk asteroids discovered each year that ultimately miss Earth.
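
The odds quoted here are simply the probabilities re-expressed as 1-in-N chances. As a minimal sketch of that arithmetic (the percentages are the agencies’ rounded figures, which is why 0.27 per cent works out to roughly 1-in-370 rather than the quoted 1-in-360):

```python
# Convert the impact probabilities quoted for 2024 YR4 into "1-in-N" odds.
# The percentages are the published (rounded) figures, so the odds computed
# here can differ slightly from the agencies' own rounded statements.
def one_in_n(probability: float) -> float:
    """Express a probability (between 0 and 1) as '1 in N' odds."""
    return 1.0 / probability

estimates = [
    ("Peak risk, 17 February", 0.031),   # reported as about 1-in-32
    ("A few days later",       0.015),   # reported as about 1-in-67
    ("NASA, 20 February",      0.0027),  # reported as about 1-in-360
    ("ESA, 20 February",       0.0016),  # reported as 1-in-625
]

for label, p in estimates:
    print(f"{label}: {p:.2%} = 1 in {one_in_n(p):,.0f}")
```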

This is good news, says Gareth Collins at Imperial College London, but the asteroid still serves as a useful dry run for planetary defence systems and for science. “This still makes for an epic close approach. For the risk of a hit to have been so high, it must come very close to us,” he says.

NASA, ESA and the space companies that were sketching possible schemes to deflect the asteroid are likely to continue their planning, says Niklas Voigt at OHB, a German space company. Voigt and his team had begun thinking about a mission to deflect 2024 YR4, and the reduced risk won’t change that, he says. “The risk has decreased, but for the time being we are still working on the topic.”

A close approach could also be a good opportunity to test our ability to deflect asteroids, says Voigt. The only previous attempt to do this was NASA’s DART mission, which altered the trajectory of a 160-metre asteroid in 2022. A spacecraft could be built and sent to 2024 YR4, he says, much like ESA’s Ramses probe, which is set to travel to the asteroid Apophis to observe it as it passes near Earth in 2029.

A final decision on what to do about 2024 YR4 is unlikely to be made until planned observations in March using the James Webb Space Telescope, which will not only collect orbital data but also help to better assess the asteroid’s size and composition. That information will go to the UN-endorsed Space Mission Planning Advisory Group, which will decide on the best course of action around the end of April. “These are very useful exercises in finding the pinch point for making a decision, so you have time to do something sensible in advance,” Collins says. “No doubt these committees are still meeting, but they are probably a lot less stressed.”

While the chance of an Earth impact has plummeted, the estimated risk of 2024 YR4 colliding with the moon has risen from 0.3 per cent to 1.2 per cent. “There’s a clear possibility that those numbers will rise even further,” says Moissl. “The exact consequences of an impact on the moon from an object of this size are still being evaluated.”

The response to this object is also a useful rehearsal for other asteroids of concern, Collins says. “We want to avoid a cry-wolf effect in the future, where the public gets used to this kind of threat and thinks, ‘Oh, that’s never going to happen.’”


Source: www.newscientist.com

Researchers find unusually high levels of cosmogenic beryllium in the Pacific Ocean

A team of scientists from Helmholtz-Zentrum Dresden-Rossendorf, TU Dresden and the Australian National University has discovered an “unexpected” accumulation of beryllium-10 at the bottom of the central and northern Pacific Ocean.

Koll et al. report the discovery of anomalies in the beryllium-10 concentration profiles of several late Miocene deep-sea ferromanganese crusts (stars) from the central and northern Pacific Ocean. The main bottom (blue line) and surface (red line) ocean currents of the thermohaline circulation are shown. Image credit: Koll et al., doi: 10.1038/s41467-024-55662-4.

Radionuclides are types of nuclei (isotopes) that decay into other elements over time.

They are used to date archaeological and geological samples, and radiocarbon dating is one of the best-known methods.

“The ocean floor hosts one of the most pristine geological archives on Earth, documenting environmental conditions and changes over millions of years: the ferromanganese crusts,” said Dr. Koll, of Helmholtz-Zentrum Dresden-Rossendorf, and his colleagues.

“Dating these marine archives can be achieved through biostratigraphy, using fossils, or through changes in isotopic or elemental composition. Alternatively, we can analyse imprinted changes in Earth’s magnetic field, an approach known as magnetostratigraphy.”

“Another commonly employed technique is dating with cosmogenic nuclides,” they added.

“The radionuclide beryllium-10 is continuously produced in the upper atmosphere, primarily through cosmic-ray spallation of nitrogen and oxygen.”

“Beryllium-10’s residence time in the atmosphere is only about one to two years, before it attaches to aerosols and precipitates out.”

“In the ocean, atmospheric beryllium-10 mixes with stable beryllium-9 from the lithosphere, which is transported to the ocean mainly by river runoff and dust following the erosion of terrestrial minerals.”
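
The dating step itself rests on ordinary radioactive decay. As a rough sketch, using the standard literature value of roughly 1.4 million years for beryllium-10’s half-life (a figure not quoted in this article):

\[
N(t) = N_0\,e^{-\lambda t},\qquad \lambda = \frac{\ln 2}{t_{1/2}},\qquad t_{1/2}\!\left(^{10}\mathrm{Be}\right)\approx 1.4\ \mathrm{Myr},
\]

so the age of a crust layer follows from how far its measured \(^{10}\mathrm{Be}/^{9}\mathrm{Be}\) ratio has fallen from its initial value:

\[
t = \frac{1}{\lambda}\,\ln\!\left(\frac{N_0}{N(t)}\right).
\]

A layer whose ratio has dropped to one eighth of its starting value, for example, is about three half-lives old, a little over 4 million years.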

Dr. Koll and co-authors discovered a long-lasting cosmogenic beryllium-10 anomaly in central and North Pacific samples.

Such anomalies can be attributed to changes in ocean currents or astrophysical events that occurred during the late Miocene era around 10 million years ago.

The anomaly has the potential to serve as a global time marker, promising advances in the dating of geological archives spanning millions of years.

“For periods of millions of years, such cosmogenic time markers do not yet exist,” Dr. Koll said.

“However, this beryllium anomaly could act as such a marker.”

The results appear in the journal Nature Communications.

____

D. Koll et al. 2025. A cosmogenic ¹⁰Be anomaly in the late Miocene as an independent time marker for marine archives. Nature Communications 16, 866; doi: 10.1038/s41467-024-55662-4

Source: www.sci.news

Unveiling the Mysteries of the DeepSeek AI Panic: The True Source of the Tech Bros’ Fear

No, it wasn’t a “Sputnik moment”. Last month’s release of DeepSeek R1, a Chinese generative AI chatbot, sparked feverish conversation in the tech world, sent stocks tumbling and prompted suggestions that the United States is losing its edge in AI technology. But the Sputnik comparison reveals less about DeepSeek than it does about American neuroses.

The original Sputnik moment came when the Soviet Union launched Sputnik 1 on October 4, 1957, shocking the world and spurring the space race that would culminate, in Neil Armstrong’s famous words, in a “small step” on the moon and a “giant leap for mankind”.

DeepSeek, backed by a Chinese hedge fund, is a notable achievement. However, it offers no fundamental technical advance over the large language models (LLMs) that already exist, and it lacks the speed and “wisdom” of OpenAI’s ChatGPT or Anthropic’s Claude. Like all LLMs, it tends to generate incorrect answers or fabricate facts to fill gaps in its data. NewsGuard’s evaluation found that the DeepSeek chatbot made false claims in 30% of cases and failed to provide answers in 53% of cases.

DeepSeek’s high non-response rate may be attributed to censorship, avoiding sensitive issues for China or limiting information on topics like Tiananmen Square and Taiwan.

The true impact of DeepSeek lies in the economics of AI rather than in its technology. It is a chatbot with strengths and weaknesses, like other major models, but it was built at far lower cost and with less advanced hardware. The Biden administration’s 2022 ban on exporting cutting-edge chips and chip-making equipment to China appears to have unintentionally spurred Chinese researchers to be more innovative.

DeepSeek is freely available and open-source, helping to democratize AI technology, especially outside the United States. While US companies create barriers to entry for competitors, it is ironic that China embraces openness.

The impact of DeepSeek goes beyond technology, unveiling the hype and geopolitical tensions surrounding AI. It challenges the notion of building larger AI models with massive computing capabilities and high costs.

The hype around DeepSeek mirrors the hyperbole around AI and reflects geopolitical tensions. If DeepSeek had originated from a US university, it might have gone unnoticed without causing global uproar. Amidst this panic, concerns about DeepSeek’s Chinese origin raise questions about privacy, censorship, and surveillance that affect AI technology as a whole.

Navigating between hype and fear around AI becomes crucial in times of trade wars and threats to democracy. It’s important to recognize the promises and challenges of technology without being swayed by political agendas.

Kenan Malik is an Observer columnist

Source: www.theguardian.com

Biologists Sequence High-Quality Genome of Iberian Ribbed Newt

Newts, a group of aquatic salamanders, have large genomes packed with repetitive elements. It has been unclear how these elements shape the genome and how they relate to newts’ remarkable regenerative abilities. In new research, scientists at the Karolinska Institute and elsewhere have generated a chromosome-scale genome sequence for the Iberian ribbed newt (Pleurodeles waltl).

Brown et al. present a chromosome-scale assembly of the 20.3-gigabase genome of the Iberian ribbed newt (Pleurodeles waltl), with unprecedented contiguity and completeness among giant genomes. Image credit: Brown et al., doi: 10.1016/j.xgen.2025.100761.

The Iberian ribbed newt, also known as the gallipato or Spanish ribbed newt, is a species of newt endemic to Spain, Portugal and Morocco.

The species is known for its wide, flat head and for its sharp ribs, which can puncture through its sides as a defence.

Males measure up to 31 cm (12.2 inches) and females up to 29 cm (11.4 inches); North African specimens are smaller than those from European populations.

“The Iberian ribbed newt boasts an impressive regenerative repertoire: it can reconstruct lost limbs and regenerate damaged tissues of complex organs such as the brain, heart and eyes,” said Karolinska Institute professor András Simon and his colleagues.

“The usefulness of this model species is greatly enhanced by a high-quality genome assembly and annotation.”

“But this has been challenging because of the large 20-gigabase genome size and the considerable concentration of repetitive element sequences.”

The authors found that repetitive elements account for 74% of the Iberian ribbed newt’s genome content.

“This was a technical challenge, but we succeeded in producing a more detailed and comprehensive mapping than has been achieved for other species with similarly sized genomes,” said Professor Simon.

“We have determined the exact position of both protein-coding and non-coding sequences on each chromosome,” said Karolinska Institute Ph.D. student Ketan Michela.

“In addition, we have identified which protein-coding genes are missing from the newt genome, or are present in more copies than in other species.”

“The result is an important resource for researchers in several fields, such as genome evolution, regeneration, developmental biology and cancer biology.”

“The next step is to focus on functional studies, manipulating molecular processes to determine how they affect regenerative ability.”

“In addition, we plan to conduct comparative research with other species to further understand these mechanisms.”

The findings appear in the journal Cell Genomics.

______

Thomas Brown et al. A chromosome-scale genome assembly reveals how repetitive elements shape the non-coding RNA landscape active during newt limb regeneration. Cell Genomics, published online January 27, 2025; doi: 10.1016/j.xgen.2025.100761

Source: www.sci.news

New Record High of Atmospheric Carbon Dioxide Levels Recorded at Monitoring Station

Hawaii’s Mauna Loa Observatory has been recording atmospheric carbon dioxide concentrations since 1958.

Fred Espenak/Science Photo Library

Atmospheric carbon dioxide levels measured by Hawaii’s Mauna Loa Observatory weather station increased by 3.58 parts per million in 2024, the largest increase since records began in 1958.

“We’re still going in the wrong direction,” says climate scientist Richard Betts at the Met Office, the UK’s weather bureau.

Part of this record increase is due to carbon dioxide emissions from human activities such as fossil fuel burning and deforestation, which reached an all-time high in 2024. Added to this were numerous wildfires, fuelled by record global heat driven by long-term climate change plus the El Niño weather pattern.

Betts predicted that atmospheric carbon dioxide concentrations measured at Mauna Loa would rise by 2.26 parts per million (ppm) this year, with a margin of error of 0.56 ppm either way. This is significantly lower than the 2024 record, but it would still exceed the rate consistent with pathways that limit the rise in global surface temperatures to 1.5°C above pre-industrial levels.

“You can think of this as another nail in the 1.5°C coffin,” Betts says. “Now that’s highly unlikely.”

Atmospheric carbon dioxide concentration is the most important indicator when it comes to climate change, as increasing atmospheric carbon dioxide concentration is the main driver of short- and long-term warming. The first continuous measurements of CO2 levels were taken at Mauna Loa.

“Because this station has the longest observation record and is located far from major anthropogenic and natural sources and sinks of CO2, it is often used to represent changes in global CO2 concentrations,” says Richard Engelen at the EU’s Copernicus Atmosphere Monitoring Service.

However, observations from satellites have made it possible to directly measure the global average atmospheric carbon dioxide concentration. According to CAMS, it rose by 2.9 ppm in 2024. Although this is not a record, it is one of the largest increases since satellite observations began.

“The reasons for this large increase require further investigation, but are likely a combination of a recovery in emissions in much of the world after the coronavirus pandemic and interannual fluctuations in natural carbon sinks,” says Engelen. Carbon sinks refer to marine and terrestrial ecosystems that absorb about half of the carbon dioxide emitted by humans.

It has long been predicted that as the Earth warms, these sinks will absorb less of this excess CO2. “The concern is whether this is the beginning of that,” Betts said. “We don’t know.”

At Mauna Loa, the carbon dioxide increase in 2024 was higher than the global average partly because of the large number of wildfires in the northern hemisphere, Betts said. CO2 plumes from sources such as wildfires take time to mix evenly into the world’s atmosphere. “Fire emissions in the northern hemisphere were particularly high last year,” he says.

Although it now looks all but certain that global warming will exceed the 1.5°C threshold, Betts believes it was still right to set that goal. “The Paris Agreement is carefully worded to seek to limit global warming to 1.5°C. We recognised from the beginning that this would be difficult,” he says. “The idea was to set this stretch goal to motivate action, and I actually think it was successful. It galvanised action.”


Source: www.newscientist.com

Research: Elderberry juice high in anthocyanins may be an effective weight management aid

Consuming 12 ounces of elderberry juice daily for one week significantly increases gut microbial communities associated with health benefits, according to a new randomized, placebo-controlled study. Compared with placebo, elderberry juice significantly increased Firmicutes and Actinobacteria and decreased Bacteroidetes. At the genus level, elderberry juice increased Faecalibacterium, Ruminococcaceae and Bifidobacterium, and decreased Bacteroides and lactic acid-producing bacteria.

Elderberry is a small dark purple fruit that grows on the elderberry tree, which is native to Europe. Image credit: TheOtherKev.

More than 70% of adults in the United States are overweight or obese. The latest estimates indicate that 42% of adults suffer from obesity, and this is expected to increase to 48-55% by 2050.

Obesity has myriad and multifaceted causes. Proactive dietary management of obesity-related cardiometabolic complications includes dietary patterns that incorporate food sources rich in bioactive food components, such as the Mediterranean-style diet.

These dietary patterns include 5 to 10 daily servings of fruits and vegetables, which are rich sources of polyphenols that promote human health and longevity.

Anthocyanins are a diverse subclass of flavonoids that have been widely studied for health-promoting properties, including metabolic changes associated with obesity, such as diabetes, dyslipidemia, and cardiovascular disease.

Furthermore, research results ranging from translational studies in rodents to large prospective cohort studies have shown that anthocyanin-rich berries have a protective effect against obesity-related morbidity and mortality.

The mechanisms of action of anthocyanin benefits include preventing the intestinal absorption of monosaccharides, promoting cellular metabolism in adipose and muscle tissue, and modulating the gut microbiome.

“We have previously shown that consuming 600 g of blackberries per day for one week increases insulin sensitivity, as evidenced by a dietary challenge test, and decreases the respiratory quotient measured by 24-hour indirect calorimetry, demonstrating that fat oxidation also increases,” said Dr. Patrick Solverson and colleagues at Washington State University.

“The aim of this human study was to determine whether the metabolic benefits observed with other anthocyanin-rich berries also apply to elderberry.”

Researchers tested elderberry's effects on metabolic health in a randomized, placebo-controlled clinical trial of 18 overweight adults.

While maintaining a standardized diet, participants consumed either elderberry juice or a placebo with a similar color and flavor specifically designed by North Carolina State University's Food Innovation Lab.

Post-intervention testing showed that participants who consumed elderberry juice had significantly increased amounts of beneficial gut bacteria, such as Firmicutes and Actinobacteria, and decreased amounts of potentially harmful bacteria, such as Bacteroidetes.

In addition to positive changes in the microbiome, elderberry intervention improved metabolism.

The results showed that elderberry juice lowered participants' blood sugar levels by an average of 24%, significantly improving their ability to process sugar after ingesting carbohydrates. Results also showed that insulin levels were reduced by 9%.

Additionally, the results suggested that elderberry juice may increase the body's ability to burn fat.

Participants who consumed elderberry juice showed a significant increase in fat oxidation, or the breakdown of fatty acids, after a high-carbohydrate meal or during exercise.

“Food is medicine, and science is catching up to that conventional wisdom,” Dr. Solverson said.

“This study contributes to a growing body of evidence that elderberry, which has been used as a folk medicine for centuries, has many benefits for metabolic as well as prebiotic health.”

“Other berries also contain anthocyanins, but usually in lower concentrations,” he added.

“To get the same amount of anthocyanins found in 6 ounces of elderberry juice, you would need to consume 4 cups of blackberries a day.”

The findings appear in the journal Nutrients.

_____

Christy Teets et al. 2024. A 1-week elderberry juice intervention enhances the fecal microbiota and suggests improved glucose tolerance and fat oxidation in a randomized controlled trial. Nutrients 16(20): 3555; doi: 10.3390/nu16203555

Source: www.sci.news

Staple plant foods high in starch were a key component in the human diet nearly 800,000 years ago

Archaeologists say they have extracted various starch granules from stone tools found at an early Middle Pleistocene site in Israel. These include acorns, grass grains, water chestnuts, yellow waterlily rhizomes, and legume seeds.

Examples of plant parts recovered from Gesher Benot Yaakov’s percussive stone tools, including the whole plant, edible parts, and characteristic starch granules. From left to right: oak, yellow water lily, oat. Scale bar – 20 μm. Image credit: Hadar Ahituv and Yoel Melamed.

The 780,000-year-old basalt tools were discovered at the early Middle Pleistocene site of Gesher Benot Yaakov, located on the shores of ancient Lake Hula.

They were examined by a team of researchers led by Bar-Ilan University’s Dr. Hadar Ahituv.

“Our study contradicts the prevailing theory that ancient humans’ diets were primarily based on animal protein, as suggested by the popular ‘Paleo’ diet,” the scientists said.

“Many of these diets are based on interpretations of animal bones found at archaeological sites, and very little plant-based food has been preserved.”

“However, the discovery of starch granules on ancient tools provides new insight into the central role of plants, especially the carbohydrate-rich starchy tubers, nuts and roots essential to the energy needs of the human brain.”

“Our research also focuses on the sophisticated methods that early humans used to process plant materials.”

The authors recorded more than 650 starch granules on basalt percussive tools, hammerstones and anvils, used to crack and crush plant foods at Gesher Benot Yaakov.

These tools are among the earliest evidence of humans processing plant foods, and were used to prepare a variety of plants, including acorns, grains, legumes and aquatic plants such as yellow water lily and the now-extinct water chestnut.

They also identified microscopic debris such as pollen grains, rodent hair, and feathers, supporting the reliability of the starch findings.

“This discovery highlights the importance of plant foods in the evolution of our ancestors,” Dr. Ahituv said.

“We now know that early humans collected a wide variety of plants throughout the year and processed them using tools made of basalt.”

“This discovery opens a new chapter in the study of the deep relationship between early human diets and plant-based foods.”

The findings also provide insight into hominin social and cognitive behavior.

“The use of tools to process plants suggests a high degree of cooperation and social structure, as hominins operated as part of a larger social group,” the researchers said.

“Their ability to exploit diverse resources from both aquatic and terrestrial environments demonstrates a deep knowledge of their surrounding environment, similar to that of modern humans today.”

“This discovery is an important milestone in the field of prehistoric research, providing valuable evidence about the diet of our ancient ancestors and providing new perspectives on human evolution and the development of complex societies.”

A paper on the research was published this week in the Proceedings of the National Academy of Sciences.

_____

Hadar Ahituv et al. 2025. Starch-rich plant foods 780,000 years ago: Evidence from Acheulian percussive stone tools. PNAS 122(3): e2418661121; doi: 10.1073/pnas.2418661121

Source: www.sci.news

Recent studies uncover the mechanisms by which Deinococcus bacteria can survive high levels of radiation

The radiation-resistant bacterium Deinococcus radiodurans can withstand radiation doses thousands of times higher than those that would kill a human. The secret behind this resistance is a collection of simple metabolites that combine with manganese to form a powerful antioxidant. Now, Northwestern University professor Brian Hoffman and his colleagues have discovered how this antioxidant works.

Deinococcus radiodurans. Image credit: USU/Michael Daly.

First discovered in 1956, Deinococcus radiodurans is one of the most radiation-resistant organisms known.

It was isolated in an experiment aimed at determining whether high doses of gamma rays could be used to sterilize canned food.

In the new study, Professor Hoffman and co-authors characterized MDP, a synthetic designer antioxidant modeled on the one behind Deinococcus radiodurans’ resilience.

They discovered that the components of MDP (manganese ions, phosphate and a small peptide) form a ternary complex that protects against radiation damage far more powerfully than manganese combined with either of the other components alone.

This discovery could ultimately lead to new synthetic antioxidants specifically tailored to human needs.

Applications include protecting astronauts from intense space radiation during deep space missions, preparing for radiation emergencies, and producing radiation-inactivated vaccines.

“This ternary complex is MDP's excellent shield against the effects of radiation,” said Professor Hoffman.

“It has long been known that manganese ions and phosphates together make a powerful antioxidant, but now we discover and understand the 'magical' potency brought about by the addition of a third ingredient. That's a breakthrough.”

“This study provided the key to understanding why this combination is such a powerful and promising radioprotector.”

In a previous study, researchers found that Deinococcus radiodurans can withstand 25,000 grays, the unit used to measure doses of X-rays and gamma rays.

But in a 2022 study, Professor Hoffman and his team found that this bacterium, when dried and frozen, can withstand 140,000 grays of radiation, 28,000 times the dose that would kill a human.

Therefore, if there are dormant frozen microbes buried on Mars, they may have survived the onslaught of galactic cosmic radiation and solar protons to this day.

In an effort to understand radioresistance in microorganisms, researchers investigated a designer decapeptide called DP1.

When combined with phosphate and manganese, DP1 forms the free radical scavenger MDP, which protects cells and proteins from radiation damage.

Professor Michael Daly, from the Uniformed Services University, said: “This new understanding of MDP could lead to the development of even more powerful manganese-based antioxidants with applications in areas such as medicine, industry, defense and space exploration.”

The results appear in the Proceedings of the National Academy of Sciences.

_____

Hao Yang et al. 2024. A ternary complex of Mn2+, synthetic decapeptide DP1 (DEHGTAVMLK), and orthophosphate is an excellent antioxidant. PNAS 121(51): e2417389121; doi: 10.1073/pnas.2417389121

Source: www.sci.news

Arctic Faces High Temperatures, Melting Ice, and Fires in 2024 According to NOAA Report

overview

  • This year was the second hottest year on record in the Arctic, according to a new report from NOAA.
  • The authors said the tundra has become a carbon source rather than a carbon sink.
  • The Arctic is heating much faster than lower-latitude regions because melting ice reflects less radiation back into space.

The Arctic just experienced its second warmest year on record. And worryingly, the region's tundra is transitioning from a carbon sink to a carbon emitter as permafrost thaws and methane is released.

This would only increase the amount of heat-trapping gas entering the atmosphere, paving the way for further global warming.

The findings, shared Tuesday in the National Oceanic and Atmospheric Administration's Arctic Report Card, show how climate change is disrupting ecosystems and altering the landscape in regions where global warming is most intense.

The Arctic, considered a bellwether for the effects of climate change, is heating two to four times faster than lower-latitude regions, depending on the baseline scientists use for comparison and which geographies they include in their assessments. The last nine years in the Arctic have been the nine hottest on record since 1900.

This dynamic is the result of a phenomenon called arctic amplification. As snow cover and sea ice are lost in the Arctic, more dark-colored water and rocks are revealed. Their dark surfaces reflect less radiation back into space, instead absorbing heat. In addition, ocean and atmospheric circulation patterns increasingly transport heat toward the Earth's poles.

Taken together, that means the Arctic is a fundamentally different place than it was just a decade ago, said Twila Moon.

“The Arctic is in a kind of new regime, not a new normal, of course, but it's definitely different than it was just a few decades ago,” she says.

Overall, the Arctic is becoming a greener landscape with more extreme precipitation, less snow and ice, the report said. As fires in the Arctic send smoke into populated areas, ice melts and sea levels rise, the effects of those changes are becoming increasingly apparent closer to American homes, scientists said.

“These problems aren’t just limited to the Arctic; they affect all of us,” says Brendan Rogers, an associate scientist at the Woodwell Climate Research Center in Woods Hole, Massachusetts.

This year's report includes a detailed explanation of how the carbon cycle in the Arctic is changing. Scientists have been closely watching what happens when permafrost thaws, releasing powerful greenhouse gases as it thaws and decomposes.

“Permafrost regions contain about twice as much carbon as is currently present in the atmosphere, and about three times as much carbon as is contained in the above-ground biomass of forests around the world. There's a lot of carbon out there,” Rogers said.

He added that permafrost areas “have been carbon sinks for thousands of years on average, primarily due to low temperatures and frozen soil.” Carbon sinks, by definition, absorb and capture more carbon dioxide than they emit. But such areas are now becoming sources of greenhouse gas emissions instead, as the thawing ground releases carbon dioxide and methane into the atmosphere, Rogers said.

Wildfires also contribute to Arctic emissions. Last year's wildfires burned more than twice as much area in the region as the year before, and produced more emissions than Canada's economic activity.

Rogers said Canada's total wildfire emissions are “roughly three times the emissions from all other sectors in Canada.” “This is more than the annual emissions of any other country except China, the United States, India and Russia.”

Last year’s wildfires forced the evacuation of Yellowknife, the capital of Canada’s Northwest Territories. About 19,000 people had to evacuate the city, which sits in an area of discontinuous permafrost.

Temperature records are organized by Arctic water year, so the most recent records are from October 2023 to September 2024. Every September, scientists measure the extent of Arctic sea ice at its seasonal minimum.

This year's sea ice was the sixth lowest in the 45 years since satellite measurements began. Sea ice extent has decreased by about 50% since the 1980s. Meanwhile, the Arctic tundra is the second greenest since records began in 2000, indicating more shrubs have taken root and spread into new terrain.

Measurements of Arctic permafrost taken from boreholes drilled beneath the earth's surface show that average temperatures were warmer than in all but one year.

“There are many indicators that consistently show extreme or near-extreme conditions,” Moon said.

Source: www.nbcnews.com

Study finds that decreased cloud cover may lead to heatwaves and high temperatures

overview

  • Global temperatures over the past two years have been even warmer than climate scientists expected.
  • A new study offers a possible reason: reduced cloud cover.
  • The study suggests that this decline may be a result of global warming, which could mean the Earth is heating up even faster than scientists thought.

Over the past two years, temperatures around the world have risen far more than scientists expected. This trend is creating the mystery of whether there are hidden climate change dynamics behind the sudden change.

Last year was the hottest year on record, and 2024 was likely to be even hotter. Even after accounting for the expected effects of greenhouse gas pollution and El Niño (a natural pattern that generally raises temperatures), researchers could not explain roughly 0.2 degrees Celsius (0.36 degrees Fahrenheit) of the extra warmth observed in 2023.

A new study offers a possible explanation. Cloud cover has decreased over the past two years, meaning more light is now reaching and heating the Earth’s surface, rather than being reflected back into space.

The research, published in Science on Thursday, suggests that this dynamic, an overall decrease in the planet’s albedo, is likely the cause of the observed temperature anomaly in 2023.
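
To get a feel for why a small drop in reflectivity matters, here is a zero-dimensional energy-balance sketch. It is standard textbook physics rather than the paper’s own model, and the albedo values are round numbers chosen purely for illustration:

```python
# Zero-dimensional energy-balance sketch: the equilibrium "effective"
# temperature of a planet as a function of its albedo (reflectivity).
# Illustrative only; this is not the analysis used in the Science study.
SOLAR_CONSTANT = 1361.0   # W/m^2, incoming sunlight at the top of the atmosphere
SIGMA = 5.67e-8           # W/m^2/K^4, Stefan-Boltzmann constant

def effective_temperature(albedo: float) -> float:
    """Temperature (K) at which emitted heat balances absorbed sunlight."""
    absorbed = (1.0 - albedo) * SOLAR_CONSTANT / 4.0
    return (absorbed / SIGMA) ** 0.25

t_before = effective_temperature(0.30)  # a typical planetary albedo
t_after = effective_temperature(0.29)   # a slightly less reflective planet
print(f"{t_before:.1f} K -> {t_after:.1f} K "
      f"(+{t_after - t_before:.1f} K before any feedbacks)")
```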

“This is broadly consistent with the observed recent further increase in solar radiation,” said study author Helge Goessling, a climate physicist at the Alfred Wegener Institute in Germany.

The expected cloud behavior in a warmer world is one of the most difficult aspects of the climate system to study and model. Answering questions about it will help scientists more accurately determine how sensitive the Earth is to greenhouse gas emissions.

If the decrease in low-level cloud cover is not a coincidence, it likely means the Earth is warming even faster than scientists thought.

“It’s not really clear yet how likely it is that some of this variation is variability that disappears again,” Goessling said. “This increases the likelihood of greater-than-expected warming.”

The new study is based on analysis of climate models and NASA satellite data on Earth’s reflectivity. It outlines three possible reasons for the decline in low-cloud formation, but draws no conclusions about how much each factor contributes.

One option is that natural processes temporarily deviate from normal, causing a decrease in cloud cover. For example, natural fluctuations may be causing sea surface temperatures to rise more than expected, thereby changing the physics of how clouds form.

The second possibility is a change in maritime transport regulations. In 2020, the International Maritime Organization imposed limits on the sulfur content allowed in marine fuels. Some scientists believe that reducing the number of sulfur particles polluting the atmosphere may have the unintended effect of suppressing ocean cloud formation.

“They act as condensation nuclei for clouds, so they can make clouds brighter and last longer,” Goessling said of the sulfur particles.

A third option is that unidentified feedback loops within the climate system are causing clouds to decrease due to global warming.

If the latter two possibilities turn out to be the main culprits, it would mean that the climate is more sensitive to anthropogenic pollution than many scientists thought, and that humanity is therefore closer to exceeding the warming targets world leaders have set than previously realized. (The term “climate sensitivity” refers to how much the Earth warms if the concentration of carbon dioxide and other greenhouse gases in the atmosphere doubles.)
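
For context, a standard approximation from the climate literature (not a result of the new study) links a rise in CO2 concentration from \(C_0\) to \(C\) to the extra energy trapped at the top of the atmosphere:

\[
\Delta F \approx 5.35\,\ln\!\left(\frac{C}{C_0}\right)\ \mathrm{W\,m^{-2}},
\]

so a doubling of CO2 corresponds to \(\Delta F \approx 5.35 \ln 2 \approx 3.7\ \mathrm{W\,m^{-2}}\). The eventual warming is then \(\Delta T = \lambda\,\Delta F\), and the sensitivity parameter \(\lambda\) is precisely what uncertain cloud feedbacks make hard to pin down.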

Still, many questions remain, said Zeke Hausfather, director of climate research at financial firm Stripe and a researcher at Berkeley Earth.

“It remains to be seen whether these changes in cloud behavior are due to short-term fluctuations that will return to more normal conditions over time, or whether they represent a new and ongoing change to the climate system,” he said by email.

According to the National Oceanic and Atmospheric Administration, the average surface temperature of land and ocean in 2023 was about 2.12 degrees Fahrenheit above the 20th century average.

Efforts by world leaders to reduce greenhouse gas emissions remain insufficient. Global temperatures are on track to rise by more than 3 degrees Celsius (5.4 degrees Fahrenheit) on average, far exceeding the 1.5 degrees Celsius (2.7 degrees Fahrenheit) goal set by the Paris Agreement.

Source: www.nbcnews.com

Record high CO2 emissions driven by drought, fires, and use of fossil fuels

Wildfires in the tropics caused a slight increase in CO2 emissions, but most of that was due to burning fossil fuels

Carl De Souza/AFP/Getty Images

Carbon dioxide emissions from burning fossil fuels in 2024 are expected to exceed last year's record levels, dashing hopes that global warming emissions will peak this year.

“Reducing emissions is more urgent than ever, and the only way to do that is by significantly reducing fossil fuel emissions,” says Pierre Friedlingstein at the University of Exeter, UK.

This is according to the latest Global Carbon Budget report, a preliminary accounting of CO2 emissions so far this year, including projections to the end of the year, prepared by Friedlingstein and his colleagues. The announcement was made at the ongoing COP29 summit in Azerbaijan, where countries aim to set new climate finance targets.

Last year, some researchers predicted that emissions would peak in 2024, but the report finds that anthropogenic CO2 emissions are instead expected to reach a record 41.6 gigatonnes in 2024, up 2% on the 2023 record. Almost 90% of that total comes from burning fossil fuels; the remainder is mostly due to land-use change, such as deforestation, and to wildfires.

Fossil fuel emissions are set to grow by 0.8 percent, around half the rate seen in 2023 but still higher than the average growth rate over the past decade. “[The slower rate] is a good sign, but we are still miles away from reaching our goal,” says Friedlingstein.

Despite a long-term downward trend, projected emissions from land-use change also increased this year, mainly due to drought-fuelled wildfires in the tropics. Part of this increase also reflects the 2023 collapse of the terrestrial carbon sink, which typically removes about a quarter of annual CO2 emissions from the atmosphere. This absorption fell by more than 40 percent last year and in early 2024 as global temperatures soared due to El Niño.

“2023 is an amazing demonstration of what can happen in a warmer world when El Niño droughts and fires combine to create record global temperatures,” says Pep Canadell, a researcher at Australia's Commonwealth Scientific and Industrial Research Organisation and a co-author of the report. “Taken all together, the world's forests removed almost a third less carbon dioxide from the atmosphere last year than they did in the previous decade.”

This also pushes up emissions in 2024, but the researchers expect the land carbon sink to largely recover as the warming effects of El Niño fade. “This is not a long-term collapse,” Friedlingstein says.

The report finds that CO2 emissions from China, which accounts for almost a third of the global total, are projected to increase by only 0.2% in 2024 compared with 2023. Canadell says this forecast has a large margin of error, so China's emissions may actually be flat or even declining. India's emissions growth has also slowed from last year, at just under 5%. In the US and EU, emissions continued to decline, albeit at a much slower pace than last year.

Increased demand for electricity to power air conditioners amid higher temperatures is a key reason why fossil fuel emissions continued to rise in 2024 despite a massive build-out of renewable energy, says Neil Grant at the German think tank Climate Analytics. Whether it's from electric vehicles, data centers or manufacturing, “most people seem to be a little surprised by the level of electricity demand this year,” he says.

If emissions continue at this level, the world will exhaust its carbon budget for limiting warming to 1.5°C above pre-industrial levels within six years, and its budget for limiting warming to 2°C within 27 years, the report points out.
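
As a rough back-of-envelope check on those timescales (assuming emissions simply hold at the report's projected rate of about 41.6 gigatonnes of CO2 per year, with no further growth or decline), the implied remaining budgets are:

\mathrm{budget}_{1.5^{\circ}\mathrm{C}} \approx 41.6\ \mathrm{Gt\,CO_2\,yr^{-1}} \times 6\ \mathrm{yr} \approx 250\ \mathrm{Gt\,CO_2}
\mathrm{budget}_{2^{\circ}\mathrm{C}} \approx 41.6\ \mathrm{Gt\,CO_2\,yr^{-1}} \times 27\ \mathrm{yr} \approx 1120\ \mathrm{Gt\,CO_2}

These are illustrative figures derived from the emissions rate and timescales stated above, not the report's own budget estimates.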

“We have to accelerate, accelerate, accelerate, accelerate the transition to renewable energy,” Canadell says. “Climate change is like a slippery slope and we can keep falling. We need to slam on the brakes as hard as we can to avoid falling.”

Source: www.newscientist.com

Bitcoin reaches new all-time high of over $82,000 thanks to “Trump Pump”

Bitcoin prices surged above $82,000 for the first time as traders speculated on Donald Trump’s potential support for the cryptocurrency upon his return to the White House.

Bitcoin hit a record high of $82,413 on Monday before dipping back by approximately 2.8%. The price has more than doubled from about $37,000 a year ago.

While Trump had previously criticized Bitcoin, he appeared to shift his stance during the US presidential campaign, engaging with the crypto community and attending industry events. This shift raised hopes of relaxed regulations for individual investors looking to enter the cryptocurrency market, although specific policies have not been announced by President Trump.

Following Trump’s election victory, the “Trump trade” has rippled through global markets, strengthening the dollar as investors anticipate significant government spending in the US.

In China, investors are bracing for increased tariffs, while Hong Kong’s Hang Seng index dropped 1.5% on Monday in response to what some view as an underwhelming set of stimulus measures from Beijing.

Despite China’s debt-swap programme worth approximately 10 trillion yuan (£1.1 trillion), Deutsche Bank economists note the absence of direct fiscal stimulus or support for the housing market, leading to market disappointment.

The values of alternative cryptocurrencies like Ethereum and Dogecoin have also risen in the wake of the election. Dogecoin, previously supported by Elon Musk, saw a significant increase in value, fueling further interest in digital assets.

Efforts to deregulate digital assets, together with Trump’s open support for his family’s cryptocurrency venture, could expose him to criticism over conflicts of interest. His son, Donald Trump Jr., has promoted cryptocurrencies as a conservative-friendly alternative to traditional banking systems, and interest in digital assets continues to grow.

In anticipation of potential policy changes, publicly traded crypto companies like Coinbase and MicroStrategy have seen significant increases in their stock prices.

Market analysts suggest that Bitcoin price fluctuations are closely linked to the prevailing market sentiment, including investors’ reactions to political developments like the election results and potential policy changes under a new administration.

As Bitcoin’s price surge continues, interest in cryptocurrencies is on the rise. Online searches for “Bitcoin” have reached their highest levels in months, indicating growing curiosity and market activity in the digital asset space.

Source: www.theguardian.com

Most fast radio bursts come from galaxies with high star formation rates

Fast radio bursts (FRBs) are millisecond-long events detected from beyond the Milky Way. The radiative properties of FRBs favor magnetars as their source, as evidenced by FRB-like outbursts from the Milky Way's magnetars and the star-forming nature of FRB host galaxies. However, the process that generates the FRB source remains unknown. FRBs are more likely to occur in massive star-forming galaxies, according to a new study. The study also suggests that magnetars, whose magnetic fields are 100 trillion times stronger than Earth's, are often formed when two stars merge and later explode in a supernova.

This photo montage shows the Deep Synoptic Array-110 antennas used to detect and pinpoint the locations of fast radio bursts (FRBs). Above the antennas are several images of FRB host galaxies as they appear in the sky. These galaxies are surprisingly massive, challenging models of FRB sources. Image credit: Annie Mejia/California Institute of Technology.

“Magnetars' immense power output makes them one of the most fascinating and extreme objects in the universe,” said lead author Kriti Sharma, a graduate student at Caltech.

“Little is known about what causes magnetars to form during the deaths of massive stars. Our work helps answer this question.”

To search for FRBs, Sharma and colleagues used the Deep Synoptic Array-110 (DSA-110) at the Owens Valley Radio Observatory near Bishop, California.

To date, this sprawling radio array has detected 70 FRBs and pinpointed their specific source galaxies (only 23 other FRBs have been localized by other telescopes).

In the current study, the researchers analyzed 30 of these localized FRBs.

“DSA-110 more than doubles the number of FRBs containing known host galaxies, which is what we built the array for,” said Dr. Vikram Ravi of the California Institute of Technology.

FRBs are known to occur in galaxies that are actively forming stars, but the authors were surprised to find that they tend to occur more frequently in massive star-forming galaxies than in low-mass ones.

This alone was interesting, because astronomers had previously thought that FRBs occur in all types of actively star-forming galaxies.

Armed with this new information, the team began pondering what the results revealed about FRBs.

Metals in our universe (elements manufactured by stars) take time to accumulate over the course of the universe's history, so large galaxies tend to be rich in metals.

The fact that FRBs are more common in these metal-rich galaxies means that the magnetars from which they originate are also more common in these types of galaxies.

Stars rich in metals (the astronomical term for elements heavier than hydrogen and helium) tend to be larger than other stars.

“Over time, as the galaxy grows, successive generations of stars evolve and die, enriching the galaxy with metals,” Dr. Ravi said.

Additionally, massive stars that can go supernova and become magnetars are more commonly found in pairs.

In fact, 84% of massive stars are binaries. So when one massive star in a binary swells with extra metal content, that extra material is pulled into its partner, which facilitates the eventual merger of the two stars.

These merging stars will have a combined magnetic field that is larger than the magnetic field of a single star.

“Stars with higher metal content swell up, promoting mass transfer and eventually mergers, forming even more massive stars with a total magnetic field greater than either individual star would have had,” Sharma said.

In summary, since FRBs are preferentially observed in massive, metal-rich star-forming galaxies, the magnetars thought to produce them are probably also formed in metal-rich environments that promote the merger of two stars.

Therefore, this result suggests that magnetars in the universe originate from the remains of stellar mergers.

In the future, the team plans to use DSA-110, and eventually DSA-2000, an even larger radio array to be built in the Nevada desert and expected to be completed in 2028, to detect more FRBs and track down where they occur.

“This result is a milestone for the entire DSA team. Many of the authors of this paper helped build DSA-110,” said Dr. Ravi.

“And the fact that DSA-110 is so good at localizing FRBs bodes well for the success of DSA-2000.”

The findings were published today in the journal Nature.

_____

K. Sharma et al. 2024. Preferential occurrence of fast radio bursts in massive star-forming galaxies. Nature 635, 61-66; doi: 10.1038/s41586-024-08074-9

Source: www.sci.news

Bitcoin reaches all-time high of $75,000 as investors speculate on Trump’s win | Bitcoin

Bitcoin has reached record highs amidst speculation on Donald Trump’s victory in the US presidential election, with many viewing him as a candidate supportive of cryptocurrencies.

The digital currency hit $75,005.08 on Wednesday morning, surpassing its previous peak of $73,797.98 achieved in March.

“Bitcoin’s price seems to be closely tied to President Trump’s standing in the polls and betting markets,” commented AJ Bell analyst Russ Mould ahead of the U.S. presidential election.


“Investors believe that a Republican win could lead to increased demand for digital currencies,” he added.

Although Trump previously derided cryptocurrencies as scams during his first term, he has since shifted his position and even launched his own crypto platform.

Nigel Green from DeVere also stated before the election that “President Trump’s victory could propel the world’s first and largest cryptocurrency to new heights.”

Green added, “If re-elected, there would likely be a focus on deregulation, tax breaks, and economic policies favoring investments like Bitcoin.”

President Trump has vowed to make the United States the “Bitcoin and cryptocurrency capital of the world” and appoint Elon Musk to oversee a comprehensive audit of government spending.

Trump’s corporate tax cuts during his previous term boosted market liquidity and encouraged investment in high-growth assets such as cryptocurrencies.

In September, Trump announced the launch of a digital currency platform named World Liberty Financial with his son and other entrepreneurs, although initial sales were sluggish.

World Liberty Financial provides a lending and borrowing service for cryptocurrencies, akin to platforms like Aave.

Since their inception, cryptocurrencies have made headlines for extreme volatility and the collapse of major industry players, notably the FTX exchange platform.

Leading up to the election, Trump made a purchase at a New York restaurant, touting it as a “historic transaction” and possibly becoming the first former president to use Bitcoin for a transaction.

“Who wants a hamburger?” Trump exclaimed to his followers in September, shortly after the platform’s launch.

Source: www.theguardian.com

Apple sees high demand for iPhone 16 despite declining sales in China

Apple’s quarterly earnings report on Thursday revealed strong demand for the iPhone 16, with a slight dip in overall sales in China compared to the previous year. The company recorded revenue of $94.9 billion, up by 6%, and earnings per share (EPS) of $1.64, slightly beating Wall Street’s expectations of $1.60 EPS on revenue of $94.4 billion.

Revenue from iPhone sales reached $46.2 billion, higher than the $43.8 billion reported in the same period last year. Additionally, fourth-quarter revenue for the Services segment, including subscriptions, rose to $24.97 billion from $22.31 billion year-over-year.

The quarter also included a one-time income tax charge of about $10.2 billion after Europe’s top court set aside the European General Court’s judgment, requiring Apple to pay back taxes in Ireland.

This earnings report marked the debut assessment of the iPhone 16’s demand, which was launched shortly before the close of the fourth quarter. The introduction of the latest iPhone was anticipated to boost Apple’s presence in China and help in reclaiming market share from competitors like Huawei and Xiaomi. According to a report by the International Data Corporation, Apple had dropped to the sixth position in smartphone retail rankings due to tough competition.

CEO Tim Cook lauded the release of the company’s “best products yet,” which now include Apple Intelligence in addition to the iPhone 16.

Apple Intelligence, a new feature providing enhanced privacy in AI, was recently launched, further strengthening the product lineup for the holiday season. The company did not specify the anticipated impact of Apple Intelligence on driving product demand during the holiday period.

Luca Maestri, Apple’s chief financial officer, expressed excitement about upcoming product launches and enhancements, emphasizing that the rollout of Apple Intelligence will evolve gradually.

Amidst a challenging year for Apple, marked by weak demand for its other devices, investors sought updates on iPhone 16 demand and the gradual rollout of Apple’s AI features, collectively known as Apple Intelligence.

Cook highlighted the positive consumer response to Apple Intelligence, noting a significant increase in iOS update downloads compared to the previous year.

The company continues to refine Apple Intelligence, with plans for further feature releases over the next months. Cook hinted at more advanced versions in the pipeline as well.

Apple has yet to launch Apple Intelligence in key markets like Europe and China, where competition remains fierce. In Asia, the Indonesian government has imposed a ban on iPhone 16 sales, alleging Apple’s failure to fulfill promises of increased local investments.

Source: www.theguardian.com

Dementia risk factors identified: Poor vision and high cholesterol

Vision loss linked to dementia

Drazen Žigic/Getty Images

A large-scale study has identified poor eyesight and high cholesterol as two new risk factors for dementia. The study claims that eliminating these factors, along with 12 other previously recognized factors, could prevent almost half of all dementia cases worldwide. However, some of these factors are difficult to eliminate, and genetics and advanced age remain the biggest risk factors for developing dementia.

“Dementia may be one of the most significant health threats facing the nation,” says Gill Livingston at University College London. “The possibility of changing this and significantly reducing the number of people suffering from [this] disease is crucial.”

A 2020 study identified 12 potentially modifiable risk factors for dementia, including hearing loss, depression, smoking, high blood pressure, heavy alcohol consumption, obesity, air pollution, traumatic brain injury, diabetes, social isolation, physical inactivity and lack of education.

Livingston and 26 other dementia experts from around the world updated the list based on the latest evidence, retaining the 12 risk factors and adding two new ones: high levels of low-density lipoprotein (LDL) “bad” cholesterol before age 65 and untreated vision loss in later life.

The researchers included high LDL cholesterol on the basis of several new findings, including an analysis of 17 studies that followed around 1.2 million British participants under the age of 65 for more than a year.

The results showed that for every 1 millimole per liter (mmol/L) increase in LDL cholesterol, the incidence of dementia rose by 8 percent. In another study of similar size, high LDL cholesterol (above 3 mmol/L) was linked to a 33 percent increased risk of dementia on average, with the risk most pronounced in people who had high LDL cholesterol in midlife. “So it really does matter how long you have it,” Livingston says.
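
To see how the per-unit figure compounds, suppose (purely for illustration; the report does not state that the relationship is exactly multiplicative) that each additional 1 mmol/L multiplies risk by 1.08:

\mathrm{RR}(\Delta) \approx 1.08^{\Delta}, \qquad \mathrm{RR}(2\ \mathrm{mmol/L}) \approx 1.08^{2} \approx 1.17

so a 2 mmol/L rise would correspond to roughly a 17 percent higher incidence of dementia.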

The researchers suggest that this association may mean that excess cholesterol in the brain increases the risk of stroke and contributes to dementia. Cholesterol has also been linked to the buildup of beta-amyloid protein plaques in the brain, which is linked to Alzheimer's disease.

The inclusion of untreated vision loss is based on an analysis of 14 studies involving more than 6.2 million older adults who were initially cognitively healthy, which found a 47% increased risk of developing dementia over 14.5 years. In another analysis, the vision loss was mainly due to cataracts and complications from diabetes. Vision loss is a risk “because you're reducing cognitive stimulation,” Livingston said, and some research suggests that such stimulation may make the brain more resilient to dementia.

The researchers then used their model to estimate what percentage of dementia cases worldwide could be prevented if each of 14 modifiable risk factors were eliminated. They found that hearing loss and high cholesterol had the greatest impact, each contributing about 7 percent of dementia cases, while obesity and excessive alcohol consumption had the least impact, each contributing 1 percent. If all factors were eliminated, the team estimated that about 45 percent of dementia cases could be prevented.
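
Those percentages are population attributable fractions. A standard way of estimating and combining them (shown here as a generic sketch; the report's own method additionally weights the factors and adjusts for overlap between them) is:

\mathrm{PAF} = \frac{P\,(\mathrm{RR}-1)}{1 + P\,(\mathrm{RR}-1)}, \qquad \mathrm{PAF}_{\mathrm{combined}} = 1 - \prod_{i} \bigl(1 - \mathrm{PAF}_i\bigr)

where P is the prevalence of a risk factor and RR is its relative risk for dementia. Multiplying the complements, rather than simply adding the individual fractions, avoids double-counting people who are exposed to several risk factors at once.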

But just because these factors are associated with dementia doesn't mean they cause it, says Dylan Williams at University College London, who was not involved in the report. “So even if we target interventions at them, they may not prevent as much disease as we would hope.”

These estimates are only population averages and don't capture individual-level risk, Williams says. So removing all factors from your life wouldn't necessarily halve your risk of dementia, which is heavily influenced by genetics and age. Eliminating many of these risk factors, like air pollution or lack of education, would also require public health interventions rather than individual changes, Williams says.

Source: www.newscientist.com

Study reveals high prevalence of SARS-CoV-2 virus among wild animals

In a new study, a team of scientists from Virginia Tech investigated how widespread exposure to severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), the causative agent of COVID-19, was in wildlife communities in Virginia and Washington, DC, between May 2022 and September 2023. They documented positive detections of SARS-CoV-2 RNA in six species: deer mice, Virginia opossums, raccoons, groundhogs, eastern cottontail rabbits, and eastern red bats. They found no evidence that the virus was transmitted from animals back to humans, and say people should not fear general contact with wildlife.

Goldberg et al. suggest that a wide variety of mammal species have been infected with SARS-CoV-2 in the wild. Image credit: Goldberg et al., doi: 10.1038/s41467-024-49891-w.

“SARS-CoV-2 could be transmitted from humans to wild animals during contact between humans and wild animals, in the same way that a hitchhiker might jump to a new, more suitable host,” said Carla Finkelstein, a professor at Virginia Tech.

“The goal of a virus is to spread in order to survive. It wants to infect as many humans as possible, but vaccination protects many of us. So the virus turns to animals, where it adapts and mutates to thrive in a new host.”

SARS-CoV-2 infections have previously been identified in wild animals, primarily white-tailed deer and wild mink.

This new research significantly expands the number of species investigated and improves our understanding of virus transmission in and between wild animals.

The data suggest that exposure to the virus is widespread among wild animals and that areas with high human activity may be contact points for interspecies transmission.

“This study was prompted by the realization that there were significant and important gaps in our knowledge about the transmission of SARS-CoV-2 in the broader wildlife community,” said Dr. Joseph Hoyt of Virginia Tech.

“Most studies to date have focused on white-tailed deer, but we still don't know what's going on with many of the wildlife species commonly found in our backyards.”

For the study, the researchers collected 798 nasal and oral swabs from animals that had been caught live and released from the wild, or that were being treated at a wildlife rehabilitation center, as well as 126 blood samples from six animal species.

The sampling sites were chosen to compare the presence of the virus in animals across different levels of human activity, from urban areas to remote wilderness.

The scientists also identified two mice with the exact same mutation on the same day and in the same location, indicating that they either both got infected from the same person, or one had transmitted it to the other.

How the virus spreads from humans to animals is unknown; wastewater is one possibility, but trash cans and discarded food are more likely sources.

“I think the biggest takeaway from this study is that this virus is everywhere. We're finding it in common backyard animals that are testing positive,” said Dr. Amanda Goldberg of Virginia Tech.

“This study highlights the potentially broad host range of SARS-CoV-2 in nature and how widely it may actually spread,” Dr Hoyt said.

“There is much work to be done to understand which wildlife species, if any, are important in the long-term maintenance of SARS-CoV-2 in humans.”

“But what we've already learned is that SARS-CoV-2 is not just a human problem, and we need multidisciplinary teams to effectively address its impacts on different species and ecosystems,” Professor Finkelstein said.

The findings were published today in the journal Nature Communications.

_____

A.R. Goldberg et al. 2024. Widespread exposure to SARS-CoV-2 in wildlife communities. Nat Commun 15, 6210; doi: 10.1038/s41467-024-49891-w

This article has been edited based on the original release from Virginia Tech.

Source: www.sci.news

Google fails to meet major climate goal due to high AI power consumption

Three years ago, Google launched an ambitious plan to address climate change, aiming to achieve “net zero” emissions by 2030. This goal entails not emitting more climate-affecting gases into the atmosphere than it removes.

However, a recent report released by Google indicates that it is far from reaching this objective. Emissions rose by 13% in 2023 compared with the previous year and have surged by 48% since the base year of 2019.

The company attributes last year’s emissions growth largely to electricity-intensive artificial intelligence and the data centers that power it. Burning coal or natural gas to produce that electricity releases greenhouse gases like carbon dioxide and methane, contributing to global warming and more extreme weather events.

Despite being a leader in climate change initiatives, Google faces challenges in meeting its net-zero target. Experts suggest that the rapid expansion of data centers, which require significant energy and resources, could hinder the transition to clean electricity and exacerbate climate change.

To address these issues, Google’s chief sustainability officer, Kate Brandt, emphasized the need for continued evolution and innovation in the company’s approach. She acknowledged the uncertainties surrounding the environmental impact of AI and the importance of partnering with cleaner companies and investing in sustainable practices.

Ultimately, achieving a net-zero goal by 2030 will require concerted efforts and collaboration across industries to accelerate the transition to clean energy sources and mitigate the effects of climate change.

Google’s commitment to using renewable energy and implementing energy-efficient practices in its data centers and offices is a step in the right direction. However, there is a need for ongoing improvement and collaboration to address the challenges posed by climate change.

Source: www.nbcnews.com

Cooling fabric reduces heat transfer from pavements and buildings in urban areas with high temperatures

A scorching hot day in Bucharest, Romania, June 2019

lcv / Alamy

In the future, city dwellers could beat the heat with clothes made from new fabrics that keep them cool.

Made from plastic material and silver nanowires, the fabric is designed to keep you cool in urban environments by using the principle of radiative cooling, a natural process in which objects radiate heat back into space.

The material selectively emits a narrow band of infrared light that allows it to escape the Earth’s atmosphere, while at the same time blocking radiation from the sun and from surrounding structures.

Po-Chun Hsu, a researcher at the University of Chicago, Illinois, and his team say the material blocks “more than half [the radiation]” coming from buildings and the ground.

Some cooling fabrics and building materials already use this radiative cooling principle, but most of their designs don’t take into account radiation from the sun or infrared radiation from structures like buildings and pavements, and they assume the materials are oriented horizontally against the sky, like roof panels, rather than vertically like clothing worn by a person.

Such designs “work well when they face something cooler, like the sky or a field,” Hsu says, “but not when they face an urban heat island.”
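
The underlying physics can be summarized with the standard radiative-cooling energy balance (a textbook sketch, not the specific model used in this study). The fabric is designed to emit strongly within the roughly 8-13 micrometre atmospheric window, making the first term large, while keeping the solar and surroundings terms small even when the material hangs vertically:

P_{\mathrm{net}}(T) = P_{\mathrm{rad}}(T) - P_{\mathrm{atm}}(T_{\mathrm{amb}}) - P_{\mathrm{sun}} - P_{\mathrm{surr}} - P_{\mathrm{cond+conv}}

where P_rad is the thermal radiation the fabric emits, P_atm is the downwelling atmospheric radiation it absorbs, P_sun is absorbed sunlight, P_surr is infrared radiation absorbed from nearby buildings and pavement, and P_cond+conv covers conductive and convective heat gain. The material cools its wearer only while P_net stays positive.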

Hsu and his colleagues designed a three-layered fabric: the inner layer is made from common clothing fabrics like wool or cotton, and the middle layer is made up of silver nanowires that reflect most incoming radiation.

The top layer is made of a plastic material called polymethylpentene, which does not absorb or reflect most wavelengths and emits a narrow band of infrared light.

In outdoor tests, the fabric remained 8.9°C (16°F) cooler than regular silk fabric and 2.3°C (4.1°F) cooler than a broad-spectrum radiation-emitting material. When tested against the skin, the fabric was 1.8°C (3.2°F) cooler than cotton fabric.

Hsu said this difference in temperature could theoretically increase the amount of time a person can comfortably be exposed to heat by up to a third, although this has yet to be tested.

“It’s always been difficult to make this kind of material practical as a fiber,” says Aswath Raman at the University of California, Los Angeles, adding that the study is a good example of applying the physical principles of radiative cooling to a practical material. Materials with similar properties could also be used on vertical surfaces in buildings, he says.

Journal reference: Science, DOI: 10.1126/science.adl0653

Source: www.newscientist.com

Webb uncovers high levels of hydrocarbons in protoplanetary disks surrounding ultra-low-mass stars

Rocky exoplanets orbit very low-mass stars more frequently than they do other types of stars. The composition of these planets is poorly understood, but it is thought to be related to the protoplanetary disk in which they form. In the new study, astronomers used the NASA/ESA/CSA James Webb Space Telescope to investigate the chemical composition of the planet-forming disk around ISO-ChaI 147, a red dwarf star with just one-tenth the mass of the Sun. They identified emission from 13 carbon-containing molecules, including ethane and benzene.

This is an artist's impression of a young star surrounded by a disk of gas and dust. Image courtesy of NASA/JPL.

ISO-ChaI 147 is a red dwarf star with a mass 0.11 times that of the Sun, located about 639 light-years away in the constellation Chamaeleon.

The star was observed as part of the MIRI Mid-Infrared Disk Survey (MINDS), which aims to bridge the gap between the chemical composition of the disk and the properties of exoplanets.

These observations provide insight into the environments and fundamental elements for the formation of such planets.

Astronomers discovered that the gas in ISO-ChaI 147's planet-forming region is rich in carbon.

This could be due to carbon being removed from the solid material from which rocky planets form, which could explain why Earth is relatively carbon-poor.

“Webb has greater sensitivity and spectral resolution than previous infrared space telescopes,” said Dr. Aditya Arabhavi, an astronomer at the University of Groningen.

“These observations are not possible from Earth because the radiation is blocked by the atmosphere.”

“So far we have only been able to identify acetylene emissions from this object.”

“But Webb's high sensitivity and spectral resolution allowed us to detect weak emission from less abundant molecules.”

“Thanks to Webb, we now know that these hydrocarbon molecules are not only diverse, but abundant as well.”

The spectrum of ISO-ChaI 147 shows the richest hydrocarbon chemical composition ever observed in a protoplanetary disk, consisting of 13 carbon-containing molecules. Image credit: NASA/ESA/CSA/Ralf Crawford, STScI.

The spectrum of ISO-ChaI 147, taken with Webb's Mid-Infrared Instrument (MIRI), displays the richest hydrocarbon chemistry ever observed in a protoplanetary disk, consisting of 13 carbon-containing molecules up to benzene.

This includes the first extrasolar detection of ethane, the largest fully saturated hydrocarbon detected outside the solar system.

Fully saturated hydrocarbons are expected to form from more basic molecules, so detecting them here can give researchers clues about their chemical environment.

Astronomers also detected ethylene, propyne, and methyl radicals in a protoplanetary disk for the first time.

“These molecules have already been detected in our solar system, for example in comets such as 67P/Churyumov-Gerasimenko and C/2014 Q2 (Lovejoy),” Dr. Arabhavi said.

“It's amazing that we can now see these molecules dancing in the cradle of the planet.”

“This is a completely different environment to how we normally think of planet formation.”

The team note that these results have significant implications for the astrochemistry within 0.1 AU and the planets that form there.

“This is very different to the composition found in disks around solar-type stars, where oxygen-containing molecules (such as carbon dioxide and water) dominate,” said Dr Inga Kamp, also from the University of Groningen.

“This object proves that these are unique classes of objects.”

“It's incredible that we can detect and quantify the amount of a molecule that's well known on Earth, such as benzene, in an object more than 600 light years away,” said Dr Agnes Perrin, an astronomer at the French National Center for Scientific Research.

The team's results were published today in the journal Science.

_____

A.M. Arabhavi et al. 2024. Abundant hydrocarbons in the disk around a very-low-mass star. Science 384 (6700): 1086-1090; doi: 10.1126/science.adi8147

Source: www.sci.news