The ozone layer is showing significant signs of recovery, with the 2024 Antarctic ozone hole smaller than in recent years, according to a new report from the World Meteorological Organization (WMO).
This map depicts the size and shape of the Antarctic ozone hole on October 5th, 2022. Image credit: Joshua Stevens / NASA Earth Observatory.
The 2024 Antarctic ozone hole (which typically forms every spring) was shallower than the average measured from 1990 to 2020, with the maximum ozone mass deficit of 46.1 million tons recorded on September 29th.
It was also smaller than the unusually large holes observed between 2020 and 2023.
Its development was relatively gradual, with ozone depletion slowing by September, followed by a quicker recovery after reaching the maximum deficit.
“This consistent progression is considered a strong indicator of early recovery in the Antarctic ozone holes,” stated WMO experts.
The alarm was initially sounded by scientists in 1975 when the WMO reported “changes in the ozone layer due to human activities and certain geophysical factors.”
If current policies remain in place, the latest assessment, from 2022, projects that the ozone layer will return to 1980 levels (before the appearance of ozone holes) by around 2066 over the Antarctic, around 2045 over the Arctic, and around 2040 for the rest of the world.
“Despite the significant success of the Montreal Protocol over the years, this effort remains ongoing, and continuous monitoring of stratospheric ozone and ozone-depleting substances is essential,” experts noted.
“WMO’s scientific research on the ozone layer spans decades,” remarked Celeste Saulo, WMO Secretary-General.
“It relies on trust, international collaboration, and a commitment to free data exchange—fundamental principles of the world’s most successful environmental agreements.”
“To date, the Montreal Protocol has resulted in over 99% reduction in the production and consumption of controlled ozone-depleting substances once used in refrigeration, air conditioning, fire suppression, foams, and even hairsprays.”
“Consequently, the ozone layer is on course to recover to 1980 levels by the middle of this century, significantly lowering the risk of skin cancer, cataracts, and ecosystem damage from UV overexposure.”
Egg cells do not dispose of waste like other cells.
Sebastian Kaulitzki / Alamy
Human eggs appear to manage waste differently than other cell types.
All women are born with a limited supply of egg cells, or oocytes, expected to last around 50 years. That is a remarkably long lifespan for a single cell. Certain human cells, including brain and retinal cells, can persist for a lifetime, but the processes that keep them functioning tend to cause gradual damage over time.
Cells require protein recycling as part of their housekeeping, but this comes at a price. The energy spent during this process can generate reactive oxygen species (ROS), which inflict random damage on the cell. “This background damage is ever-present,” notes Elvan Böke from the Centre for Genomic Regulation in Barcelona, Spain. “An increase in ROS correlates with increased damage.”
However, healthy eggs appear to sidestep this issue. To investigate, Böke and her team examined human eggs under a microscope, in a study involving Gabriele Zaffagnini of the University of Cologne, Germany. The cells were immersed in a fluid containing a fluorescent dye that binds to acidic cellular components known as lysosomes, which act as the cell’s “recycling plants.”
The dye revealed that the waste-containing lysosomes in human eggs were less active than the equivalent structures in other human cells or in the eggs of small mammals such as mice. Zaffagnini and his colleagues theorize that this may serve a self-preservation purpose.
According to Zaffagnini, dialling down waste recycling might be one of several strategies human egg cells use to maintain their extended lifespan. Böke suggests that human oocytes appear to “put the brakes on everything” in order to minimize damage: all cellular functions slow down in these eggs, lowering the production of harmful ROS.
Slowing the protein recycling machinery seems beneficial for egg cell health, and a failure to do so could help explain the prevalence of unhealthy oocytes. “This insight might help explain why human oocytes become dysfunctional after a certain age,” states Emre Seli at Yale School of Medicine. “This could lead to a broader understanding of the challenges faced by human oocytes,” he adds.
Fluorescent dyes highlight human egg cells, showcasing components
such as mitochondria (orange) and DNA (light blue).
Gabriele Zaffagnini/Centro de Regulación Genómica
Evaluating egg cell health in this manner could ultimately improve fertility treatments. “It’s well-known that protein degradation is vital for cell survival, directly affecting fertility,” explains Böke. Her team studied healthy egg cells and is now comparing them with oocytes from people experiencing fertility problems. “Elevated ROS levels correlate with poor IVF outcomes,” she states.
Research on human egg cells is still in its early stages because of inherent difficulties. “They are hard to manipulate due to sample constraints,” comments Böke. Seli says this is one of several “layers” complicating egg cell studies, alongside regulatory limitations and funding challenges.
Zaffagnini believes that overcoming these obstacles could lead to “truly astonishing” discoveries. “It’s certainly worth pursuing,” he concludes.
John Stewart Bell developed a method to measure the unique correlations permitted in the quantum world
CERN
Some people sense a poltergeist in the attic, others spot a ghost on a dark night; the figure who haunts me is John Stewart Bell. His groundbreaking work and enduring legacy have intrigued me for years.
Consider this: how much of our reality can we claim to experience objectively? I ponder this frequently, especially when discussing the intricate nature of space, time, and quantum mechanics. Bell was deeply reflective about such matters, and his contributions have forever altered our comprehension of these concepts.
Born in Belfast in 1928, Bell was, by all accounts, a curious and cheerful child. He gravitated towards physics early and took his first job, as a laboratory technician, at just 16. Trained in both theoretical and experimental physics, he built much of his career around particle accelerators. Yet it was the inconsistencies he perceived within quantum theory that occupied his thoughts late at night.
Today, the study of quantum foundations is a well-established branch of physics, one featured regularly in New Scientist. But physics has not always welcomed those who probe its edges, where it meets mathematics and philosophy. In Bell’s time, scientists were still grappling with the legacy of quantum theory’s pioneers, including the heated debates between Niels Bohr and Albert Einstein.
Bell’s work on quantum foundations began as a side pursuit, carried out in his spare hours. In 1963, he took a sabbatical with his physicist wife, using the time to write a pair of original papers. Though they were initially published to little attention, their significance could hardly be overstated.
Bell transformed philosophical inquiries into testable experiments, particularly concentrating on the notion of “hidden variables” in quantum mechanics.
Quantum mechanics inherently resists certainty and determinism, as Bohr and his contemporaries showed in the early 20th century. Definitive statements about quantum entities remain elusive until we actually measure them. Predictions come only in probabilistic terms: an electron, for instance, might have a 98% likelihood of being found at one energy level and a 2% likelihood of another, but the actual outcome is intrinsically random.
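To make that kind of statement concrete, here is a minimal illustration (not drawn from Bell’s papers) of how quantum theory encodes such probabilities: the electron’s state is written as a superposition, and the Born rule turns the amplitudes into the 98% and 2% odds quoted above.

\[
|\psi\rangle = \sqrt{0.98}\,|E_1\rangle + \sqrt{0.02}\,|E_2\rangle,
\qquad
P(E_1) = \left|\sqrt{0.98}\right|^2 = 0.98,
\qquad
P(E_2) = \left|\sqrt{0.02}\right|^2 = 0.02.
\]

Here \(|E_1\rangle\) and \(|E_2\rangle\) are hypothetical energy states used only for illustration; which one a measurement reveals is, according to standard quantum theory, irreducibly random.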
How does nature make these seemingly random decisions? One theory proposes that certain properties remain hidden from observers. If physicists could identify these hidden variables, they could inject absolute predictability into quantum theory.
Bell devised a test capable of ruling out a broad swathe of hidden variable theories that would otherwise amend or challenge quantum theory. The test typically involves two experimenters, Alice and Bob. A pair of entangled particles is produced repeatedly, with one particle sent to Alice and its partner dispatched to Bob in a separate laboratory. Upon receipt, Alice and Bob each independently measure a specific property; Alice might, for instance, measure a particle’s spin.
Bob, meanwhile, makes his measurements without any communication between the two experimenters. Once all the data is collected, it is fed into equations Bell derived in 1964. This “inequality” sets a bound on the correlations between Alice’s and Bob’s observations. Even without any quantum effects, some correlation can occur by mere chance, but Bell established a threshold beyond which something more than chance must be at work. Entangled particles exceed that threshold, displaying correlations unique to quantum physics and ruling out local hidden variables.
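For readers who want to see the shape of the argument, here is the test in its most commonly used later form, the CHSH inequality (a refinement of Bell’s 1964 result rather than his original expression). Alice chooses between two measurement settings \(a\) and \(a'\), Bob between \(b\) and \(b'\), and \(E\) denotes the measured correlation for a given pair of settings:

\[
S = E(a,b) - E(a,b') + E(a',b) + E(a',b'),
\qquad
|S| \le 2 \ \text{for any local hidden variable theory.}
\]

Quantum mechanics predicts, and experiments confirm, that entangled particles measured at suitable angles can reach \(|S| = 2\sqrt{2} \approx 2.83\), violating the bound and ruling out local hidden variables.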
Thus, Bell’s test does more than affirm quantum theory as a superior explanation of our reality; it also underscores the peculiar nature of “non-locality,” revealing strange traits of our existence. This implies that quantum objects can maintain connections, with their behaviors remaining profoundly intertwined despite vast separations. Einstein critiqued this notion vigorously, as it contradicts the principles of his special theory of relativity by insinuating a form of instantaneous communication between entities.
Bell, initially a disciple of Einstein’s theories, found himself ultimately proving his idol wrong. His tests compellingly indicated that our reality is indeed quantum. This debate continues to engage researchers, particularly regarding the persistent discrepancies between quantum theory and our best understanding of gravity, framed by Einstein himself.
Bell’s experimental designs received little acknowledgment during his lifetime, partly because of the technical challenges they presented. The first experiment of this kind was conducted in 1972, and it wasn’t until 2015 that a test with minimal loopholes ruled out local hidden variable theories conclusively. In 2022, physicists Alain Aspect, John F. Clauser, and Anton Zeilinger received the Nobel Prize in Physics for their extensive work on these experiments.
So why does John Stewart Bell’s legacy resonate so strongly with me? Am I ensnared in some quantum malaise?
The answer lies in the fact that his work and the myriad experiments testing it have spawned as many questions about quantum physics and physical reality as they aim to resolve. For instance, numerous physicists concur that our universe is fundamentally non-local, yet they strive to uncover the underlying physical mechanisms at play. Others are busy formulating new hidden variable theories that evade the constraints set by Bell’s tests. Additionally, researchers are scrupulously reevaluating the mathematical assumptions Bell made in his original work, believing that fresh perspectives on Bell’s findings may be critical for advancing interpretations of quantum theory and developing cohesive theories.
The repercussions of Bell’s findings permeate the realm of quantum physics. We have been running Bell tests for nearly five decades, continuously improving our control of entangled particles. But this is just the beginning. Recently, I collaborated with physicists to design a way to use Bell’s work to explore whether free will might be partially constrained by cosmic factors. Afterwards, I received a call from another group of researchers keen to discuss Bell again, this time in relation to gravity and the foundational nature of space and time. They drew inspiration from his methods and sought to create a test that would probe genuinely gravitational properties rather than quantum ones.
It’s no wonder I feel inextricably linked to Bell. His capacity to convert philosophical inquiries into tangible tests captures the essence of physics: unravelling the world’s most baffling mysteries through experiment. Bell’s test vividly embodies that promise.
If I must ponder a haunting presence, I couldn’t ask for a more remarkable specter.
California implemented stricter building standards in 2008, focusing on design and materials used in new construction in high-risk areas.
Discussing the issue, Crowfoot emphasized the challenge posed by older homes not meeting modern safety standards. A special committee formed by Governor Newsom aims to address this concern.
Since 2020, California has allocated at least $50 million to home hardening projects and initiated a grant program to incentivize homeowners to renovate their properties.
However, according to Kimiko Barrett of Headwaters Economics, the current program is inadequate to the scale of the problem, as is especially evident in Los Angeles County.
Even homes built to the newer standards are not guaranteed to survive: in the 2018 Camp Fire, which destroyed much of the town of Paradise, only 43% of homes built after 2008 survived.
JP Rose from the Center for Biological Diversity highlights the importance of not building extensively in high-risk fire zones without proper precautions to avoid future disasters.
Firefighters extinguish a house on fire along Pacific Coast Highway in Malibu, California, on January 8. Juan Carlo / USA Today Network File
At the national level, there has been little significant investment in making structures more resilient to wildfires. A federal wildfire commission report acknowledges the immense annual cost of addressing the wildfire crisis.
Efforts to improve fire resilience in homes, like replacing wooden roofs, could require billions of dollars. Without government intervention, insurance companies may play a crucial role in encouraging homeowners to undertake costly renovations to safeguard their properties.
Amid Southern California’s ongoing housing crisis, Newsom and Los Angeles Mayor Karen Bass have moved to expedite the rebuilding of properties damaged in the recent fires.
I've seen my future and it's full of beans, both literally and figuratively. In addition to eating more beans, I'll eat plenty of vegetables and no meat, avoid long periods of hunger, and drink almost no alcohol. In return for this dietary discipline, my future should also be longer and brighter. I'm 52 years old and can expect to live another 29 years on my current diet. But if I change now, I could gain roughly another 10 years and stay healthy until I'm 90.
This “longevity diet” is not just the latest fad, but the result of scientific research into extending the human lifespan. And it isn't designed merely to prevent disease; it is intended to slow the ageing process itself. That's the claim, anyway.
Of course, it makes sense that your diet can change your lifespan. Millions of people around the world still die prematurely each year for lack of calories and nutrients. At the other extreme, an estimated 11 million people die every year from too many calories and the wrong kinds of nutrients. Overconsumption leads to obesity and its consequences: cardiovascular disease, diabetes and cancer. The typical Western diet, high in sugar, refined starches and saturated fat and low in whole foods, can derail your metabolism and compound the damage. That includes the excessive release of insulin, a hormone that controls blood sugar levels and directly affects ageing. Suffice it to say that the Western diet is not pushing the longevity levers in the right direction. But is it really possible to dig yourself a later grave?…
Obesity has undergone a significant shift in how we perceive it in recent years. It was once seen as a personal lifestyle choice, but is now acknowledged as a complex disease affected by genetics, biology, psychosocial factors, and the environment. It is a prevalent condition affecting a large portion of the population.
The World Health Organization (WHO) estimates that around 2 billion adults are overweight or obese, and this number is rising rapidly across countries of all income levels.
Obesity poses a major health risk as it increases the likelihood of developing diseases such as diabetes, heart disease, dementia, cancer, nonalcoholic fatty liver disease, and kidney failure.
Recent advancements in understanding obesity have revealed that the regulation of energy balance and eating behavior occurs in the brain, not the stomach. Scientists have identified numerous genes that impact weight regulation and predispose individuals to obesity. Additionally, maintaining weight loss is challenging due to the body’s natural responses, which slow metabolism and increase hunger hormones.
Efforts to find effective treatments for obesity have intensified, considering the stigma and discrimination individuals with obesity often face. Traditional methods such as surgery or lifestyle changes like diet and exercise have been common strategies for weight loss. However, these approaches come with risks, limitations, and costs.
Current weight loss drugs have not been very successful in achieving significant results. New gut hormone treatments known as incretins show promise in managing obesity by regulating appetite through the gut-brain axis. Drugs like semaglutide have demonstrated notable weight loss benefits and improved blood sugar levels, offering hope for effective obesity management in the future.
Despite the potential of new weight loss drugs, challenges such as administration methods, cost, and long-term effectiveness still need to be addressed. Further weight loss medications, including oral drugs such as orforglipron, are in development, suggesting a promising future for obesity treatment.
While weight loss pills offer a valuable tool, they should be part of a comprehensive approach that includes lifestyle changes, psychological support, and addressing socio-economic factors influencing health inequalities. The future of obesity treatment looks promising with ongoing research and advancements in medical technology.
It’s crucial to recognize that a holistic approach, which combines different strategies tailored to individual needs, is essential in effectively addressing the complexities of obesity. By destigmatizing obesity and focusing on a well-rounded treatment plan, we can make significant strides in managing this chronic disease.
Astrobotic said the anomaly was likely caused by a failure in the propulsion system, adding that a further update would be issued once more data had been acquired and analyzed.
Peregrine’s mission was intended to open a new chapter in the commercial space industry, bringing private companies into the space race and delivering cargo to the moon for NASA and other customers.
The Pittsburgh-based company had hoped to become the first private business to land successfully on the moon, something only four countries have achieved. A Houston-based company also has a lander ready to fly and is expected to take a more direct route to the moon.
NASA provided both companies with significant funding to build and fly their own lunar landers. The space agency wants privately owned landers to scout the lunar surface before astronauts arrive, carrying technical and scientific experiments for NASA as well as payloads for other customers. Astrobotic’s contract for the Peregrine lander was worth $108 million.
During its first flight, the Peregrine lander carried five NASA instruments. Following the technical anomaly, NASA said it would learn from the situation.
“Each success and setback is an opportunity for us to learn and grow,” Joel Kearns, deputy assistant administrator for exploration at NASA’s Science Mission Directorate, said in a statement Monday. “We will use these lessons to advance our efforts to advance science, exploration, and commercial development of the Moon.”
The last time the United States launched a moon landing mission was in December 1972, when Apollo 17’s Gene Cernan, the mission commander and the last human to set foot on the moon, and Harrison Schmitt, the 12th astronaut to walk on the moon, closed out an era that remains the pinnacle of NASA’s achievements.
The space agency’s new Artemis program, named after Apollo’s twin sister in Greek mythology, aims to return astronauts to the moon’s surface within the next few years. First will come a flight around the moon by four astronauts, probably by the end of the year.
Monday’s moonshot also marked the long-delayed first test flight of the Vulcan rocket from Cape Canaveral Space Force Station. The 202-foot (61-meter) rocket is essentially an upgraded version of ULA’s highly successful flagship Atlas V, which is being phased out along with the company’s Delta IV. Jeff Bezos’ rocket company, Blue Origin, provided the Vulcan’s two main engines.
The United States and the then-Soviet Union racked up a string of successful moon landings in the 1960s and 1970s before putting touchdowns on pause. China joined the elite club in 2013 and India in 2023. But last year also saw landers from Russia and a private Japanese company crash into the moon, and in 2019 an Israeli nonprofit organization’s lander crashed.
Next month, SpaceX will provide the lift for Intuitive Machines’ lander.
In addition to the flight experiments it carried for NASA, Astrobotic drummed up its own cargo business, packing the 6-foot-tall (1.9-meter-tall) Peregrine lander with everything from chips of rock from Mount Everest and toy-sized cars from Mexico, designed to be catapulted onto the lunar surface and cruise around, to the ashes and DNA of deceased space enthusiasts, including “Star Trek” creator Gene Roddenberry and science fiction writer Arthur C. Clarke.
The Navajo Nation recently asked for the launch to be postponed because of the human remains aboard, calling it a “grave desecration” of a celestial body revered by Native Americans. Thornton said the objection, raised in December, came too late to change the mission, but promised to work with the Navajo Nation to find a “good path forward” for future missions.
Celestis, one of the spaceflight memorial companies that purchased space on the lander, said in a statement that no single culture or religion owns the moon and that none should be able to veto the mission. More remains were aboard the rocket’s upper stage, which, once free of the lander, will orbit the sun indefinitely, traveling out as far as Mars.
Freight rates for Peregrine ranged from a few hundred dollars to $1.2 million per kilogram (2.2 pounds), not enough for Astrobotic to break even. But Astrobotic CEO John Thornton says that’s not the point of this first flight.
“A lot of people’s dreams and hopes rest on this,” he said.
The Asian tiger mosquito, which carries dengue fever, could be endemic in Britain by mid-century, government health experts say.
In recent years, the insects have spread across much of Europe as the climate has warmed. They tend to live in urban areas and feed during the day, putting people at greater risk.
They have striped bodies and are known for their ability to spread dengue fever, Zika virus, and chikungunya fever (diseases usually associated with tropical regions).
The UK Health Security Agency (UKHSA) fears dengue fever could be transmitted in London by 2060, and that the mosquitoes themselves could be widespread across the UK by the 2040s.
UKHSA entomologist Jolyon Medlock said surveillance at borders could help slow the mosquitoes’ spread, and that people should cover or empty containers of standing water, where the insects lay their eggs.
Professor Dame Jenny Harries, chief executive of the UKHSA, said: “What was called a tropical disease when I trained many years ago will effectively become a domestic disease.”
Officials also said other food- and water-borne infections could become more common, increasing the risk of further pandemics.
The dangers of heatwaves will worsen, flooding will become more severe, and people’s mental health may be adversely affected.
Food prices could also become more volatile, as much of the food the UK imports comes from areas prone to climate impacts such as drought.
Wildfires that produce toxic smoke are also expected to occur more frequently during hotter, drier summers.
Young children, the elderly, and people with pre-existing conditions are most vulnerable to these threats.
“In the summer of 2022, temperatures in the UK exceeded 40 degrees for the first time on record,” Dame Jenny said.
“Nearly 3,000 excess deaths were recorded during this prolonged heat wave, while many other countries have experienced extreme heat and prolonged heat waves in recent months.
“Using a high emissions scenario, heat-related deaths in the UK are estimated to increase by more than 100% in the 2030s, more than 500% in the 2050s and more than 1,000% by 2070.”
The Earth is already 1.2 degrees warmer than pre-industrial levels, and given the amount of carbon in the atmosphere, further warming is already locked in, even if emissions start to fall overnight.
This means some adaptation, such as housing improvements, flood protection and extreme heat warnings, will be needed alongside emissions reductions, UKHSA said.
Dame Jenny added: “Climate change is an important threat to public health around the world, not only through increased mortality from extreme temperatures and weather impacts, but also through the increased spread of infectious diseases and worsening vulnerabilities in Earth systems.”