CRISPR: Revolutionizing Genetic Code Editing – The Most Innovative Idea of the Century

New Scientist: Your source for the latest in science news and long-form articles from expert journalists covering advancements in science, technology, health, and environmental issues.

“The pain was like being struck by lightning and being hit by a freight train at the same time,” said Victoria Gray. Today, she says, “Everything has changed for me now.”

Gray once endured debilitating symptoms of sickle cell disease, but in 2019, she found hope through CRISPR gene editing, a pioneering technology enabling precise modifications of DNA. By 2023, this groundbreaking treatment was officially recognized as the first approved CRISPR therapy.

Currently, hundreds of clinical trials are exploring CRISPR-based therapies, and they represent only the beginning of the technology's potential. CRISPR is poised to treat a wide range of conditions beyond genetic disorders: a single dose, for example, may drastically lower cholesterol levels, significantly reducing the risk of heart attack and stroke.

Although safety questions remain, there is optimism that CRISPR could eventually be routinely used to modify children's genomes, potentially reducing their risk of common diseases.

Additionally, CRISPR is set to revolutionize agriculture, facilitating the creation of crops and livestock that resist diseases, thrive in warmer climates, and are optimized for human consumption.

Given its transformative capabilities, CRISPR is arguably one of the most groundbreaking innovations of the 21st century. Its strength lies in correcting genetic “misspellings.” This involves precisely positioning the gene-editing tool within the genome, akin to placing a cursor in a lengthy document, before making modifications.

Bacteria use this editing mechanism to defend themselves against viruses. Before 2012, researchers had identified various natural gene-editing proteins, but each was limited to targeting a single location in the genome. Altering the target sequence required redesigning the protein's DNA-binding section, a time-consuming process.

However, scientists discovered that bacteria had evolved a diverse family of gene-editing proteins guided by RNA, a close chemical relative of DNA, which allows much faster retargeting: producing a new RNA takes mere days instead of years.

In 2012, Jennifer Doudna and her team at the University of California, Berkeley, along with Emmanuelle Charpentier from the Max Planck Institute for Infection Biology, revealed the mechanics of one such gene-editing protein, CRISPR-Cas9. By simply adding a “guide RNA” with a matching sequence, they could target any desired site.
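The guide-RNA targeting principle can be sketched in a few lines of code. This is a toy illustration, not a bioinformatics tool; the sequences and the function name are invented for the sketch. It searches a DNA string for a 20-letter "protospacer" matching the guide, immediately followed by the NGG "PAM" motif that Cas9 requires before it will cut:

```python
import re

def find_cas9_sites(genome: str, guide: str) -> list[int]:
    """Return start positions where `guide` is followed by an NGG PAM."""
    # Cas9 binds where the DNA matches the guide RNA's sequence AND the
    # next three bases are N-G-G (N = any base).
    pattern = re.compile(re.escape(guide) + "[ACGT]GG")
    return [m.start() for m in pattern.finditer(genome)]

genome = "TTACGATCCGATTACAGGCTTAGGTGGACGT"   # invented 31-base "genome"
guide = "GATCCGATTACAGGCTTAGG"              # invented 20-base protospacer
print(find_cas9_sites(genome, guide))       # [4]
```

Changing the target is just a matter of supplying a different `guide` string, which mirrors why swapping guide RNAs is so much faster than re-engineering a DNA-binding protein.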

Today, thousands of variants of CRISPR are in use for diverse applications, all relying on guide RNA targeting. This paradigm-shifting technology earned Doudna and Charpentier the Nobel Prize in 2020.


Source: www.newscientist.com

Why Crowdsourcing Wikipedia is the Most Revolutionary Idea of the Century


In today’s digital landscape, hostility often overshadows collaboration. Remarkably, Wikipedia, a publicly editable encyclopedia, has emerged as a leading knowledge resource worldwide. “While it may seem improbable in theory, it remarkably works in practice,” says Anusha Alikhan of the Wikimedia Foundation, the nonprofit behind Wikipedia.

Founded by Jimmy Wales in 2001, Wikipedia continues to thrive, although co-founder Larry Sanger left the project the following year and has since expressed ongoing criticism, claiming it is “overrun by ideologues.”

Nonetheless, Sanger’s opinions are not widely echoed. Wikipedia boasts over 64 million articles in 300+ languages, generating an astonishing 15 billion visits monthly. Currently, it ranks as the 9th most visited website globally. “No one could have anticipated it would become such a trusted online resource, yet here we are,” Alikhan commented.

Building trust on a massive scale is no small achievement. Although the Internet has democratized access to human knowledge, it often presents fragmented and unreliable information. Wikipedia disrupts this trend by allowing anyone to contribute, supported by approximately 260,000 volunteers worldwide, making an impressive 342 edits per minute. A sophisticated system grants broader editing rights to responsible contributors, fostering trust that encourages collaboration even among strangers.

Wikipedia also actively invites special interest groups to create and edit content. For instance, the Women in Red project tackles gender disparities, while other initiatives focus on climate change and the history of Africa. All articles uphold strict accuracy standards, despite critics like Sanger alleging bias.

As an anomaly in the technology sector, Wikipedia operates without advertising, shareholders, or profit motives. It has maintained this unique position for over two decades with great success.

However, the rise of artificial intelligence poses new challenges. AI can flood the site with misleading machine-generated content, companies scraping articles to train models strain its infrastructure, and AI-driven search summaries may cut the traffic that drives donations.


Discover the Top 21 Innovative Ideas of the 21st Century: How We Selected Them and Why They Matter

What distinguishes a groundbreaking idea from a mediocre one? This is often a challenging distinction to make. Take the example of vaccination: collecting pus from a cowpox-infected individual and injecting it into an eight-year-old boy may seem utterly reckless. Yet, 18th-century physician Edward Jenner’s daring action ultimately led to the eradication of smallpox, a disease that plagued humanity.

With the benefit of hindsight, we recognize that Jenner’s innovation was monumental. This principle of vaccination continues to save millions of lives today. As we progress through the 21st century, we feel it’s essential to reflect on and celebrate transformative ideas from the past 25 years that are reshaping our perspectives, actions, and understanding of the world around us.

Compiling our list of the 21 most impactful ideas of the 21st century involved rigorous discussions among our editorial team. One of our initial challenges was determining if the first quarter of this century would conclude at the beginning or end of 2025. For clarity, we opted for the latter. We navigated debates on various ideas, dedicating particular attention to concepts like the microbiome—establishing it as a legitimate 21st-century notion—and scrutinizing the role of social media, which after much discussion, we deemed largely negative. Ultimately, we recognize that the quality of ideas is subjective.

We developed a robust set of criteria for our selection. To qualify for this list, a concept must already demonstrate a significant impact on our self-understanding, health, or broader universe. Additionally, it should be grounded in scientific discovery, with a strong idea underpinning it. Lastly, the development must have occurred within the last 25 years.


Rather than trying to predict the future, it’s important to take the time to reflect on the past.

While the last criterion may appear straightforward, we encountered numerous proposals that remain unrealized. The discovery of gravitational waves in the 21st century opened new cosmic vistas, but their prediction dates back a century to Albert Einstein. Similarly, ideas like weight loss medications, personalized medicine, and mRNA vaccines show promise, but their full potential has yet to be achieved—perhaps these will make the list in 2050.

During our selection process, we couldn’t disregard ideas that initially seemed appealing but faltered, so we also drew up a list of the five most disappointing ideas of the century so far. The line between success and failure can blur, leading to contentious choices in our best-ideas list. For instance, while many would argue smartphones belong on the failures list, we ultimately view them as largely beneficial. Likewise, the global warming target of 1.5°C can be seen as a failure, especially as new reports indicate that average global temperatures have surpassed this benchmark for the first time. Nonetheless, we argue that tightening the goal from 2°C to 1.5°C remains one of the century’s monumental ideas, setting a standard for global climate ambition.

Advancing away from fossil fuels is undoubtedly crucial, and prominently featured in this effort is Elon Musk. In 2016, before Musk ventured into social media and politics, his company Tesla launched its first Gigafactory in Nevada, marking a pivotal moment in the transition to renewable energy by utilizing economies of scale to transform transportation and energy systems. Conversely, other approaches to fighting climate change, such as alternative fuels and carbon offsets, appear more harmful than beneficial.

One significant takeaway from our selection process is that revolutionary ideas often arise by chance. A discovery made by two physicists in 2005 went on to reshape the global decarbonization strategy. Another chance finding unveiled the foundations of our complex thought processes, showing that brain regions don’t operate in isolation but are interwoven into a robust network, an insight that has revolutionized how we diagnose and treat neurological conditions.

Looking back over the past quarter-century, it’s evident that the world has transformed considerably. We successfully dodged the Millennium Bug, the human genome’s first draft was completed, and the International Space Station welcomed its first crew. Concepts like “Denisovans” and “microbiomes” were unknown to us. In our pages, we celebrated innovations like wireless communication and marveled at miniaturized computer chips driving these technologies. “At its core is a device known as a Bluetooth chip,” we stated, positing it as the next big thing—a prediction that, in hindsight, was flawed, since truly transformative technologies extend beyond mere convenience.

This experience highlights the folly of predictions, as they can often be overlooked in the rush for the next trending innovation. Thus, rather than striving to foresee the future, we ought to invest time in contemplating the past. The advancements we’ve witnessed in health, technology, and environmental conservation suggest that this century has made the world a better place. Let’s hope, without necessarily predicting, that this momentum continues into the future.


Unlocking the Best Idea of the Century: Why Smartphones Are Here to Stay


“Every so often, a groundbreaking product emerges that reshapes our reality,” said Steve Jobs at Apple’s 2007 iPhone launch. Tech executives often hype their innovations, but this proclamation was substantiated: the iPhone not only popularized apps but also put compact, powerful computers into our daily lives.

However, this transformation comes with drawbacks. Much like a snail retreating into its shell, we can withdraw into our devices at any moment, which can feed social anxiety. Citing such concerns, numerous countries have restricted mobile phone use in schools, and Australia has banned social media outright for users under 16 as of December 2025. Carrying a constantly connected device also erodes our sense of privacy, says technology historian Mar Hicks of the University of Virginia: “This technology is acclimating users to significantly less privacy, not only in public spaces but also within the privacy of their own homes.”

Smartphones have become more than tools, notes anthropologist Daniel Miller of University College London. “They’ve expanded our personal space,” he says. These handheld digital environments give us seamless access to the virtual worlds of friends and family, so we continuously navigate between our physical and digital existence.

The global influence of smartphones is undeniable. According to GSMA, the mobile operators’ industry association, over 70% of the global population now owns a smartphone. In many low-income countries, people increasingly bypass traditional desktop computers altogether. Smartphone-driven fintech platforms facilitate transactions for 70 million users across 170 countries, removing the necessity for conventional banks. Furthermore, farmers utilize smartphone applications for crop monitoring, and doctors employ them in hospitals to reduce reliance on costly machinery.

Moreover, the ramifications of smartphones extend far beyond their immediate use. The rapid miniaturization of electrical components like cameras, transistors, and motion sensors has enhanced processing power and introduced new potentials. This technological evolution has spurred numerous 21st-century innovations, including versatile drones, smart wearables, virtual reality headsets, and miniature medical implants.


How Fear Influences Ecosystems: The Groundbreaking Insight of the Century


After the reintroduction of wolves to Yellowstone National Park in 1995, significant ecological changes were observed, particularly a substantial decrease in elk populations. This decline was largely attributed to the wolves' effect on elk behavior: where wolves were likely present, elk dedicated more time to vigilance and less to foraging. Biologist John Laundré dubbed this phenomenon a “landscape of fear” in a pivotal 2001 study.

This concept builds on earlier research that suggested predator fear could influence prey behavior. Until then, it was widely assumed that predators primarily affected prey populations through physical predation alone. Laundre’s observations challenged this notion, indicating a potentially complex relationship between fear and wildlife dynamics.

Recent studies led by Liana Zanette at Western University in Ontario, Canada, have explored this landscape of fear directly. Over the past two decades, Zanette and her colleagues conducted experiments in British Columbia, playing predator calls near wild songbirds. Egg-laying and hatching rates fell markedly, and fewer than half as many hatchlings survived as when non-predator sounds were played. This indicates that fear alone can rival, or even outweigh, the effects of direct predation on wildlife populations.

According to Zanette, prey animals often prioritize safety over foraging opportunities, avoiding prime feeding areas when they perceive threats. This fear-based behavior has profound ecological implications. On Canada’s west coast, the absence of natural predators like bears, cougars, and wolves has allowed raccoons to flourish, leading them to scavenge food resources along the coastline.

When Zanette’s team played recordings of barking dogs in coastal regions, the raccoons largely abandoned the beach, spending their time watching for potential threats instead; recordings of seals, which pose them no danger, had no such effect. Where this fear was heightened, coastal prey populations rebounded dramatically.

Understanding landscapes of fear is crucial for comprehending the profound impacts humans have on wildlife. In one study, Zanette’s team used camera traps to observe how wild animals responded to various sounds in Kruger National Park, South Africa. Surprisingly, the fear generated by human presence surpassed that generated by lions, highlighting the extensive influence of human activity on wildlife behavior and ecosystems.


End-to-End Encryption: The Ultimate Security Solution of the Century

Everyone has secrets to protect. In today’s digital age, whether safeguarding personal messages, business communications, or confidential state information, end-to-end encryption (E2EE) offers essential security and peace of mind.

E2EE ensures that your communications remain private from internet service providers and the operators of messaging or video conferencing applications. Messages are encrypted on the sender’s device and only decrypted by the recipient, making them unreadable to unauthorized parties while in transit. This prevents access by any entity, including law enforcement or corporate insiders.

Digital encryption is rooted in robust mathematics rather than mere assurances. The RSA algorithm, introduced in 1977, pioneered modern encryption by relying on the complexity of factoring large numbers into their prime components. Since then, various algorithms have emerged, utilizing intricate mathematics to enhance cryptographic security.
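The idea behind RSA can be seen in a toy round trip. The primes below are textbook-sized and hopelessly insecure; real RSA uses 2048-bit keys and padding schemes. This sketch only shows the core arithmetic, and why recovering the private key requires factoring n back into p and q:

```python
# Toy RSA (educational only, assumed parameters; do not reuse for real security).
p, q = 61, 53            # two small primes; real keys use enormous ones
n = p * q                # public modulus: 3233
phi = (p - 1) * (q - 1)  # Euler's totient: 3120
e = 17                   # public exponent, coprime with phi
d = pow(e, -1, phi)      # private exponent: modular inverse (Python 3.8+)

msg = 65                      # a message encoded as a number < n
cipher = pow(msg, e, n)       # encrypt with the public key (n, e)
plain = pow(cipher, d, n)     # decrypt with the private key (n, d)
print(plain == msg)           # True
```

Anyone who could factor 3233 into 61 × 53 could compute d and read the message; the security of real RSA rests on that factoring being infeasible at scale.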

The true strength of E2EE lies not just in its technical implementation, but in how it upholds democracy and human rights across the globe. As Matthew Feeney from the UK privacy group Big Brother Watch states, “There are individuals in perilous regions depending on encryption to preserve their lives.” Additionally, even in recognized democracies, freedom is vulnerable. Feeney warns that those who claim “I have nothing to hide” should take heed of history’s lessons.

Many governments view E2EE unfavorably because it puts communications beyond the reach of surveillance in a way that even sealed letters never were. UK governments have repeatedly sought to undermine E2EE; most recently, under Prime Minister Keir Starmer, a controversial demand for a backdoor into Apple's encrypted services was dropped following public outcry.

Feeney acknowledges the uncertainty surrounding whether E2EE has already been compromised, as intelligence agencies rarely disclose their capabilities. Quantum computing looms as a concern, since it may eventually break current encryption algorithms. But cryptography continues to evolve, with post-quantum algorithms being developed to replace vulnerable ones. “Governments may wield power, but they can’t override the laws of mathematics,” Feeney asserts.


Transformer Architecture: The Revolutionary AI Innovation Redefining the 21st Century


Explore the incredible capabilities of modern AI tools that can summarize documents, generate artwork, write poetry, and even predict protein folding. At the heart of these advancements is the groundbreaking transformer architecture, which revolutionized the field of artificial intelligence.

Unveiled in 2017 at a modest conference center in California, the transformer architecture enables machines to process information in a way that closely resembles human thinking patterns. Historically, AI models relied on recurrent neural networks, which read text sequentially from left to right while retaining only the most recent context. This method sufficed for short phrases, but when dealing with longer and more complex sentences, critical details often slipped through the cracks, leading to confusion and ambiguity.

The introduction of transformers to the AI landscape marked a significant shift, embracing the concept of self-attention. This approach mirrors the way humans naturally read and interpret text. Instead of strictly scanning word by word, we skim, revisit, and draw connections based on context. This cognitive flexibility has long been the goal in natural language processing, aiming to teach machines not just to process language, but to understand it.

Transformers emulate this mental leap effectively; their self-attention mechanism enables them to evaluate every word in a sentence in relation to every other word simultaneously, identifying patterns and constructing meaningful connections. As AI researcher Sasha Luccioni notes, “You can take all the data you get from the Internet and Wikipedia and use it for your own tasks. And it was very powerful.”
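To make "every word attends to every other word" concrete, here is a minimal single-head self-attention pass in plain Python. It assumes toy 2-dimensional embeddings and identity query/key/value projections (real transformers learn those projection matrices), leaving only the scaled dot-product-and-softmax core:

```python
import math

def softmax(xs):
    # Numerically stable softmax: weights are positive and sum to 1.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def self_attention(tokens):
    """Each output vector is a weighted mix of ALL input vectors,
    with weights from scaled dot-product similarity."""
    d = len(tokens[0])
    out = []
    for q in tokens:                               # one query per position
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in tokens]                 # compare against every key
        weights = softmax(scores)                  # attention weights
        out.append([sum(w * v[j] for w, v in zip(weights, tokens))
                    for j in range(d)])            # weighted sum of values
    return out

# Three toy "word" vectors; every output row blends all three inputs.
print(self_attention([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]))
```

Unlike a recurrent network, nothing here is sequential: every position's scores against every other position are computed in one pass, which is also why the operation parallelizes so well on modern hardware.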

Moreover, this transformative flexibility extends beyond text. Today’s transformers drive tools that can generate music, render images, and even model molecules. A prime example is AlphaFold, which treats proteins—long chains of amino acids—analogously to sentences. The function of a protein hinges on its folding pattern and the spatial relationships among its constituent parts. The attention mechanism allows this model to assess these distant associations with remarkable precision.

In retrospect, the insight behind transformers seems almost intuitive. Both human and artificial intelligence rely on discerning when and what to focus on. Transformers haven’t merely enhanced machines’ language comprehension; they have established a framework for navigating any structured data in the same manner that humans navigate the complexities of their environments.


Understanding Neurodiversity: Why ‘Normal’ Brains Don’t Exist – A Revolutionary Perspective for the Century

Historically, science operated under the notion of a “normal brain,” one that fits standard societal expectations. Those who diverged from this model were labeled with a disorder or mental health condition and treated as if they were somehow flawed. Neurodevelopmental conditions, including autism, ADHD, dyslexia, and movement disorders, were framed as deficits rather than as distinctive variations in neurocognitive make-up.

In the late 1990s, a paradigm shift occurred. What if these “disorders” were simply natural variations in brain wiring? What if human traits existed on a spectrum rather than a stark boundary between normal and abnormal? Those at either end of the spectrum may face challenges, yet their exceptional brains also offer valuable strengths. Viewed through this lens, diverse brains represent assets, contributing positively to society when properly supported.

The concept of neurodiversity gained momentum, sparking lively debates in online autism advocacy groups. By 2013, the Diagnostic and Statistical Manual of Mental Disorders recognized autism as a spectrum condition, retiring the Asperger’s syndrome diagnosis and classifying autism on a scale from Level 1 to Level 3 based on support needs. This shift cemented the understanding of neurodivergent states within medical literature.

Since the early 2000s, research has shown that individuals with autism often excel in mathematical reasoning and attention to detail. Those with ADHD frequently outperform others in creativity, while individuals with dyslexia are adept at pattern recognition and big-picture thinking. Even those with movement disorders have been noted to develop innovative coping strategies.

These discoveries have led many scientists to argue that neurodivergent states are not mere evolutionary happenstance. Instead, our ancestors likely thrived thanks to pioneers, creative thinkers, and detail-oriented individuals in their midst. A group possessing diverse cognitive strengths could more effectively explore, adapt, and survive. Some researchers now propose that the autism spectrum comprises distinct subtypes with varying clusters of abilities and challenges.

While many researchers advocate for framing neurodivergent characteristics as “superpowers,” some caution against overly positive portrayals. “Excessive optimism, especially without supporting evidence, can undermine the seriousness of these conditions,” says Dr. Jessica Eccles, a psychiatrist and neurodiversity researcher at Brighton and Sussex Medical School. Nevertheless, she emphasizes that “with this vocabulary, we can better understand both the strengths and challenges of neurodiversity, enabling individuals to navigate the world more effectively.”


The Brain’s Vast Interconnectedness: The Revolutionary Idea of the Century


You’ve likely encountered the parable of the blind men and the elephant, where each individual’s perspective is limited to one part, leading to a distorted understanding of the whole. This concept resonates deeply in neuroscience, which has historically treated the brain as a collection of specialized regions, each fulfilling unique functions.

For decades, our insights into brain functionality arose from serendipitous events, such as the case of Phineas Gage, a 19th-century railroad worker whose personality changed dramatically after a severe brain injury. More recent studies employing brain stimulation have linked the amygdala with emotion and the occipital lobe with visual processing, yet this provides only a fragmented understanding.

Brain regions demonstrate specialization, but this does not encapsulate the entire picture. The advent of imaging technologies, particularly functional MRI and PET scans in the late 1990s and early 2000s, revolutionized our comprehension of the brain’s interconnectedness. Researchers discovered that complex behaviors stem from synchronized activity across overlapping neural networks.

“Mapping brain networks is playing a crucial role in transforming our understanding in neuroscience,” states Luiz Pessoa from the University of Maryland.

This transformative journey commenced in 2001 when Marcus Raichle, now at Washington University in St. Louis, characterized the Default Mode Network (DMN). This interconnected network activates during moments of rest, reflecting intrinsic cognitive processes.

In 2003, Kristen McKiernan, then at the Medical College of Wisconsin, and her team showed that the DMN is most active during internally focused states such as daydreaming and introspection, quietening as external tasks grow more demanding. This provided a “resting state” baseline for evaluating overall brain activity, and researchers began to correlate DMN activity with advanced behaviors, including emotional intelligence and theory of mind.

As discoveries proliferated across other networks—pertaining to attention, language, emotion, memory, and planning—our understanding of mental health and neurodiversity evolved. These neural differences are now thought to be linked with various neurological conditions, including Parkinson’s disease, PTSD, depression, anxiety, and ADHD.

Network science has emerged as a pivotal field, enhancing our comprehension of disorders from autism, characterized by atypical social salience networks—those that detect and prioritize salient social cues—to Alzheimer’s disease, where novel research indicates abnormal protein spread via network pathways. We also acknowledge the inspiration it provides for developing artificial neural networks in AI systems like ChatGPT.

Neural networks have not only reshaped our understanding of brain functionalities but also the methodologies for diagnosing and treating neurological disorders. While we might not yet perceive the entirety of the elephant, our view is undeniably clarifying as science progresses.


The Ultimate One-Size-Fits-All Diet: The Best Health Concept of the Century


The Mediterranean diet is widely regarded as the ultimate in healthy eating. Rich in fiber, vegetables, legumes, fruits, nuts, and moderate fish consumption, this diet is low in meat and dairy, making it both delicious and beneficial for health and the environment. As Luigi Fontana from the University of Sydney highlights, “Not only is it healthy, but it’s also very tasty.”

Unlike transient diet fads, the Mediterranean diet is supported by extensive research and has held its reputation for more than two decades. That longevity stems from a series of randomized controlled trials that established its status as a nutritional gold standard.

In the 1940s, physiologist Ancel Keys advocated that the Mediterranean diet significantly lowers heart disease risk, primarily due to its low levels of saturated fat from meat and dairy, which are known to contribute to cholesterol buildup.

Keys, along with his wife Margaret, a nutritionist, conducted pioneering research comparing diet and heart health across seven countries. Their findings suggested that those following the Mediterranean diet enjoyed a markedly lower risk of heart disease, although external factors like income levels weren’t accounted for.

The most compelling evidence was presented in 1999. In this study, participants with prior heart attacks were assigned to either a Mediterranean diet or a low-fat diet, demonstrating that the former significantly reduced the risk of both stroke and subsequent heart attacks.

This breakthrough set the stage for a transformative shift in our dietary understanding over the next 25 years. Since 2000, multiple randomized controlled trials have confirmed the cardiovascular benefits of the Mediterranean diet. Additionally, it has been shown to reduce the risk of type 2 diabetes. Further research links this eating pattern to diminished risks of infectious diseases, breast cancer, slower cognitive decline, and enhanced IVF success rates, although further investigation remains essential. “Eating a Mediterranean diet reduces your risk of developing multiple chronic diseases,” Fontana emphasizes.

Insights into the diet’s effectiveness point to the importance of fiber and extra virgin olive oil, which are believed to foster beneficial gut bacteria and mitigate harmful inflammation. “Many chronic diseases arise from inflammation, making the Mediterranean diet particularly advantageous,” states Richard Hoffman at the University of Hertfordshire, UK.

Furthermore, adopting the Mediterranean diet benefits the environment. With meat and dairy production accounting for about 15% of global greenhouse gas emissions, transitioning to a diet rich in legumes and vegetables significantly reduces this impact. As global temperatures rise, it is imperative to move away from diet trends and embrace these time-honored culinary practices.


Unveiling Quantum Creepiness: The Top Innovative Concept of the Century

In the 1920s, renowned physicist Albert Einstein believed he had identified a fundamental flaw within quantum physics. This led to extensive investigations revealing a pivotal aspect of quantum theory, one of its most perplexing features.

This intriguing property, known as Bell nonlocality, describes how quantum objects exhibit coordinated behavior over vast distances, defying everyday intuition. Its experimental confirmation over the past two decades makes it one of the most remarkable insights of the 21st century.

To illustrate this phenomenon, consider two hypothetical experimenters, Alice and Bob, each possessing a pair of “entangled” particles. Entanglement enables particles to correlate, even when separated by distances that prevent any signal from transmitting between them. Yet, these correlations become apparent only through the interaction of each experimenter with their respective particles. Do these particles “know” about their correlation beforehand, or is some mysterious connection at play?

Einstein, alongside Nathan Rosen and Boris Podolsky, sought to refute this eerie connection. They proposed that certain “local hidden variables” could explain how particles understand their correlated state, making quantum physics more relatable to everyday experiences, where interactions happen at close range.

In the 1960s, physicist John Stewart Bell devised a way to put these ideas to an empirical test. After decades of attempts, rigorous loophole-free experiments in 2015 finally settled the matter, earning three physicists the 2022 Nobel Prize. “This was the final nail in the coffin for these ideas,” says Marek Żukowski at the University of Gdańsk. Researchers concluded that hidden variables cannot rescue locality in quantum physics. Jacob Barandes at Harvard University adds, “We cannot escape from non-locality.”
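The gap between local hidden variables and quantum predictions can be made concrete with a small numerical sketch of the CHSH form of Bell’s test. The measurement angles below are the standard textbook choices, not figures from the article; they are simply the settings that maximize the quantum correlation.

```python
import math

# CHSH Bell test: Alice measures at angles a, a2; Bob at b, b2.
# For spin-singlet pairs, quantum mechanics predicts the correlation
# E(x, y) = -cos(x - y), while any local-hidden-variable theory
# obeys the bound |S| <= 2.

def E(x, y):
    return -math.cos(x - y)

a, a2 = 0.0, math.pi / 2               # Alice's two settings
b, b2 = math.pi / 4, 3 * math.pi / 4   # Bob's two settings

# The CHSH combination of the four correlations.
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)

print(abs(S))  # ~2.828, i.e. 2*sqrt(2), exceeding the classical bound of 2
```

The quantum value 2√2 ≈ 2.83 is exactly what the 2015 experiments observed (within error bars), which is why local hidden variables were ruled out.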

Embracing nonlocality offers substantial advantages, notes Ronald Hanson at Delft University of Technology, who led one of the groundbreaking experiments. For him, the focus was never on the oddities of quantum mechanics; rather, he viewed the results as a demonstration of “quantum supremacy” beyond conventional computational capabilities. That intuition proved accurate: the technology developed for the Bell test has become a foundation for highly secure quantum cryptography.

Currently, Hanson is pioneering quantum communication networks that use entangled particles to forge a near-unhackable internet of the future. Quantum computing researchers likewise exploit entanglement to speed up calculations. Although the implications of entanglement remain only partially understood, entangling quantum objects has become a valuable technological asset, a remarkable turn for a phenomenon once confined to debates about the quantum nature of reality.

Topics:

Source: www.newscientist.com

19th Century Math Tips for Taming Bad Coffee

Can mathematics improve your coffee?

Alexander Spatari/Getty Images

Picture a coffee pot that holds two cups. If the brew is left unstirred, the coffee at the bottom of the pot ends up stronger than at the top, so when you pour, the first cup tastes much weaker than the second.

While this scenario is somewhat contrived, there are other situations where a “first is worse” (or “first is better”) approach can lead to inequity.

Consider a pick-up football game where everyone has a good idea of each player’s skill. If one team’s captain picked every player before the other captain chose any, the teams would be hugely imbalanced.

Even simple turn-taking remains unfair. Suppose the players can be ranked from 1 to 10 by skill. If Captain A chooses player 10 first, Captain B takes player 9, Captain A takes player 8, and so on, the totals come out skewed: Captain A’s team scores 30 (10 + 8 + 6 + 4 + 2), while Captain B’s scores only 25 (9 + 7 + 5 + 3 + 1).

So, how can we ensure a fair player selection? The answer lies in a mathematical idea from the 19th century. The Thue-Morse sequence, first explored by Eugène Prouhet in the 1850s and later studied in detail by Axel Thue and Marston Morse in the early 20th century, prescribes a pattern of alternating and rotating choices.

In a scenario with selectors A and B, the selection order follows an ABBA pattern: the first pair takes one order, the second pair flips it. The pattern extends by appending a copy with the As and Bs swapped, giving ABBA BAAB, and then ABBA BAAB BAAB ABBA, and so on.
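The extension rule above has a neat closed form: term n of the Thue-Morse sequence depends only on whether n has an even or odd number of 1s in binary. A short sketch (my illustration, not from the article):

```python
# Thue-Morse sequence: term n is 'A' if n has an even number of 1-bits
# in its binary expansion, 'B' otherwise. The first 8 terms spell
# ABBA BAAB, and the first 16 spell ABBA BAAB BAAB ABBA.

def thue_morse(n):
    return "".join(
        "A" if bin(i).count("1") % 2 == 0 else "B"
        for i in range(n)
    )

print(thue_morse(8))   # ABBABAAB
print(thue_morse(16))  # ABBABAABBAABABBA
```

Doubling the length always appends the letter-swapped copy of what came before, which is exactly the rotation rule described above.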

This rotation helps create equity. Using the team selection example again, the totals would be much more balanced: 10 + 7 + 5 + 4 + 1 for one team versus 9 + 8 + 6 + 3 + 2 for the other, leading to totals of 27 and 28.

A truncated version of this sequence is also used in sport. In a tennis tiebreak, one player serves the first point, after which the players take turns serving two points each, producing an ABBA pattern. This streamlined form of Thue-Morse is widely seen as fairer than simple alternation. FIFA and UEFA have trialled a similar approach for penalty shootouts, where going second in each pair puts extra pressure on the shooter.

Returning to the coffee pot, the same trick solves the problem neatly. Pour half a cup into cup A, then two half-cups into cup B, and finally the last half-cup into cup A, and both cups end up equally strong. Alternatively, you could just stir the pot with a spoon. But wouldn’t it be more gratifying to tackle the problem with mathematics?
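The pouring trick can be checked with a toy model. Assuming the strength of the brew rises linearly from the top of the pot (poured first) to the bottom (poured last), split the pot into four half-cup portions and compare the naive pour with the ABBA pour:

```python
# Toy model: the pot holds four half-cup portions whose strength rises
# linearly from top (poured first) to bottom (poured last). Pouring in
# ABBA order equalizes the two cups; naive AABB pouring does not.

portions = [1.0, 2.0, 3.0, 4.0]   # strengths, weakest poured first

def pour(order):
    cups = {"A": 0.0, "B": 0.0}
    for cup, strength in zip(order, portions):
        cups[cup] += strength
    return cups["A"], cups["B"]

print(pour("AABB"))   # (3.0, 7.0): first cup much weaker
print(pour("ABBA"))   # (5.0, 5.0): equal strength in both cups
```

The balance holds for any linear strength gradient, because the first and last portions average out to the same strength as the two middle ones.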

These articles will be published weekly at:
newscientist.com/maker

Katie Steckles is a mathematician, lecturer, YouTuber and author based in Manchester, UK. She also contributes to New Scientist’s puzzle column “BrainTwister”. Follow her @stex

Topics:

Source: www.newscientist.com

Soviet-era Spacecraft from Failed Venus Mission Poised to Re-enter Earth’s Atmosphere After Half a Century

A Soviet-era spacecraft that was meant to land on Venus more than 50 years ago is instead expected to fall back to Earth soon.

It is currently uncertain where the half-ton mass of metal will come down, or how much of it will survive re-entry. Space debris experts are tracking it closely.

Dutch scientist Marco Langbroek estimates that the spacecraft may re-enter Earth’s atmosphere around May 10th.

“There are risks involved, but there’s no need for excessive concern,” Langbroek stated in an email.

The object is relatively small, and even if it reaches the ground intact, the risk of it causing damage is similar to that of a random meteorite fall, several of which occur each year. “The chance of being struck by lightning in your lifetime is far greater,” he added.

He added that while the spacecraft could conceivably hit someone or something, that scenario is unlikely, though it cannot be entirely ruled out.

The Soviet Union launched the spacecraft, known as Cosmos 482, in 1972 as part of its series of Venus missions. A rocket malfunction left it stranded in Earth orbit, and it never departed for Venus.

Most of the craft’s debris fell back within a decade, but Langbroek and others believe the landing capsule, a spherical object about three feet (1 meter) in diameter, has spent the past 53 years in a highly elliptical orbit that has gradually decayed.

There is a substantial chance that the roughly 1,000-pound (about 500-kilogram) capsule will survive re-entry: it was built to withstand a descent through Venus’s dense carbon dioxide atmosphere, says Langbroek, of Delft University of Technology in the Netherlands.

Experts doubt that its parachute system would still function after so long, and its heat shield may also have degraded during decades in orbit.

Jonathan McDowell of the Harvard-Smithsonian Center for Astrophysics said in an email that it would be better if the heat shield fails and the capsule burns up; if it survives re-entry intact, “a half-ton metal object will be falling from the sky.”

The spacecraft is projected to re-enter somewhere between 51.7 degrees north and south latitude, a band that stretches as far north as London and Edmonton, Alberta, and as far south as Cape Horn in South America. However, given that much of the Earth is covered by water, “the chances are favorable” that it will come down harmlessly.

Source: www.nbcnews.com

How Nearly a Century of Happiness Research Unveiled a Key Finding

When Lyubomirsky joined Stanford’s graduate program in social psychology in 1989, the study of happiness was just beginning to earn respectability in academia. Ed Diener, a psychologist at the University of Illinois at Urbana-Champaign, would later gain recognition for his contributions to the field; despite his long-standing interest in happiness, he chose to wait until he achieved tenure before diving into the subject. Similarly, Lyubomirsky was hesitant to specialize in happiness; as a serious scientist, she felt that topics related to “emotion” were often regarded as less rigorous. However, after an engaging discussion with her advisor on her first day at Stanford, she resolved to make happiness her primary focus.

Lyubomirsky began by exploring the fundamental question of why some individuals experience greater happiness than others. A few years prior, Diener had published a survey that examined existing research, highlighting the types of behaviors often associated with happy individuals. However, the studies often yielded conflicting results, leading to a lack of definitive answers. Lyubomirsky’s own findings indicated that mindset plays a significant role; happy individuals tended to avoid comparing themselves to others, held positive views of those around them, made fulfilling choices, and did not dwell on negativity.

Yet Lyubomirsky recognized the complexity of cause and effect. Did a happy disposition foster a healthy mindset, or did adopting a positive outlook lead to greater happiness? Were people inherently predisposed to a certain level of happiness, a set point they drift back toward? She wondered whether it was possible to shift one’s mindset at all, noting that such changes often require extensive time; many people spend years in therapy attempting them, often without success. This prompted her to investigate whether simpler, quicker actions could enhance well-being.

To this end, Lyubomirsky researched various habits and practices thought to uplift mood, such as random acts of kindness and expressions of gratitude. Over six weeks, she instructed students to perform five acts of kindness each week—like donating blood or assisting peers with assignments. By the end of the study, these students reported higher levels of happiness compared to a control group. Another group reflected weekly on things they were grateful for, such as “My Mother” and “AOL Instant Messenger,” and similarly experienced an increase in happiness. Although the changes were modest, Lyubomirsky found it intriguing that small, low-cost interventions could enhance students’ quality of life. In 2005, she published a paper asserting that individuals possess significant control over their happiness.

Lyubomirsky’s research emerged during a time when psychology was reevaluating its objectives and focus. When Martin Seligman, a psychologist from the University of Pennsylvania, took leadership of the American Psychological Association in 1998, he and his colleagues noted that the field had overly concentrated on dysfunction, neglecting the promotion of life satisfaction. He urged his peers to explore themes such as “optimism, courage, work ethic, resilience, interpersonal skills, pleasure, insight, and social responsibility,” advocating a return to making life more fulfilling and productive for everyone.

Source: www.nytimes.com

AMOC: Crucial ocean currents are unlikely to shut down completely by the end of the century

The AMOC carries warm water north from the tropics near the surface and returns cold water south in the deep ocean

NOAA

A crucial system of ocean currents is unlikely to shut down completely by the end of this century, according to new findings that undercut fears of an imminent catastrophic collapse.

The Atlantic Meridional Overturning Circulation (AMOC) transports warm water north from the tropics and helps keep northern Europe mild. Rising temperatures and an influx of fresh water from melting Arctic ice are weakening the current, and scientists fear it could stop altogether. That would disrupt marine ecosystems and rapidly cool the European climate by several degrees.

Some researchers have said an irreversible AMOC shutdown could occur this century. “But I’d say this worst-case scenario is unlikely,” says Jonathan Baker at the Met Office in the UK.

To investigate whether a complete AMOC collapse is possible this century, Baker and his colleagues used 34 climate models to simulate the current under extreme scenarios, including greenhouse gas levels quadrupled from today’s. The team also modelled a large influx of fresh water into the North Atlantic, at many times the present rate of ice melt.

They found that although the AMOC weakens significantly in both scenarios, the current persists in a reduced state, sustained by deep water drawn to the surface by winds over the Southern Ocean. “The Southern Ocean winds continue to blow, and that brings deep water up to the surface. It works like a powerful pump,” says Baker. “That keeps the AMOC running in the models through this century.”

The finding helps explain why climate models generally simulate a more stable AMOC in a warming world than studies based on statistical methods, which tend to suggest the current is more fragile.

Niklas Boers at the Potsdam Institute for Climate Impact Research in Germany says the findings are “good news” for those worried about an imminent AMOC collapse. “I agree that none of the state-of-the-art climate models show a complete AMOC collapse within the 21st century.”

However, while the models do not predict a complete collapse of the AMOC, they show that quadrupled CO2 concentrations lead to a 20 to 81 per cent reduction in the current’s strength.

If the AMOC weakens by about 50 per cent, the climatic consequences would be significant, Baker says: disruption of marine ecosystems, sea level rise along North Atlantic coastlines, and shifts in global rainfall patterns that affect harvests around the world. But weakening of this kind would not bring rapid cooling to Europe, he says.

By comparison, Boers stresses that an AMOC 80 per cent weaker than today’s would have devastating effects. “That is, of course, a nearly collapsed AMOC,” he says. “It has all the impacts: the cooling of Europe, the shifting patterns of the tropical monsoons, all the things we are concerned about.”

Stefan Rahmstorf, also at the Potsdam Institute for Climate Impact Research in Germany, agrees that under the extreme warming scenarios of this century, a weak, shallow remnant of the AMOC could persist. Some studies even define AMOC collapse as this kind of substantial weakening, he says. “The new study investigates the remaining wind-driven [current] in more detail, which is a valuable contribution to the scientific literature,” he says. “However, it does not change our assessment of the risks and impacts of future AMOC changes in response to human-induced global warming.”

Topics:

Source: www.newscientist.com

Breaking News: The Most Monumental Breakthrough of the Century


Inside the Department of Defense UFO file

The US Congress is talking about extraterrestrial life again, but despite some intriguing evidence, the question remains frustratingly unanswered. Professor Michael Bohlander, an expert on how contact with extraterrestrial intelligence would affect human law, examines newly released documents from the Department of Defense.

Fall asleep faster

Racing thoughts are one of the most common symptoms of insomnia. So if you’re having trouble drifting off because your brain just won’t shut up, you might want to try the cognitive shuffle. This simple guide shows you how to do it (and, even better, you can do it from bed with your eyes closed).

Changes in Earth's rotation

Scientists are revealing how human activity and increased demand for water have a bigger impact on the Earth's rotation than the melting of polar ice sheets.

New year, new you?

Can you change your personality? Cognitive neuroscientist Dr. Christian Jarrett has researched techniques and methods that he claims can help you become more confident, outgoing, and fulfilled.

plus

  • Worst ideas of the 21st century: Hindsight is a wonderful thing. Here are some of the most promising innovations of the past 25 years that failed miserably.
  • 21st century image: The world is full of wonders, and high-definition cameras allow you to see them in more detail than ever before. Check out our favorite images from the first 25 years of the 21st century.
  • Q&A: Answers to the best pub quiz trivia. This month: Can I build a death ray in my garden? How far back in time could I travel and still be able to breathe? How can I see Saturn in the night sky? How many abs can I get? What is the biggest snowman ever built? How can polar bears smell food from so far away?

No. 414 Released on Tuesday, December 27, 2024

Don’t forget that BBC Science Focus is also available on all major digital platforms. There are versions for Android, Kindle Fire and Kindle e-readers, as well as an iOS app for iPad and iPhone.

Source: www.sciencefocus.com

Evidence of Indigenous Canines in Jamestown Colony during the 17th Century Unearthed through Ancient DNA Analysis

Multiple studies have demonstrated that European colonization of the Americas caused the extinction of most mitochondrial lineages of North American dogs between 1492 and the present, and that they were replaced by European lineages. Historical records indicate that colonists imported dogs from Europe to North America, and that dogs became objects of interest and exchange as early as the 17th century. However, it is unclear whether the oldest archaeological dogs found from the colonial period were of European, Native American, or mixed ancestry. To determine the ancestry of dogs from the Jamestown Colony in Virginia, scientists sequenced ancient mitochondrial DNA (mtDNA) from six archaeological dogs dating from 1609 to 1617.

Lithograph “Indian Dog with Rabbit” by John Woodhouse Audubon.

Europeans and Native Americans treasured dogs as pets, used them for similar tasks, and as symbols of identity.

As a result, the dogs reflected the tensions between European and Native American cultures: settlers described Native American dogs as mongrels to emphasize their perception that Native Americans would not breed or own dogs.

Indigenous peoples perceived European dogs as a direct threat to their existence and took steps to restrict their use.

“Previous research had suggested that there were many indigenous dog species in the continental United States, but that they had gone extinct,” said Ariane Thomas, an anthropologist at the University of Iowa.

“We wanted to understand what it meant: when did it happen, were the dogs culled, were they in competition with European dogs or were they sick?”

Dr. Thomas and her colleagues focused on the Jamestown Colony in Virginia because of the number of dog remains found at the site and the evidence of Native American influence there.

They were able to identify and analyze 181 bones representing at least 16 different dogs.

Of these, the researchers selected 22 specimens spanning multiple points in Jamestown’s early settlement, from 1607 to 1619.

To better understand the ancestry of these dogs, they extracted and sequenced ancient mtDNA.

Based on body size estimates alone, the researchers found that most of the Jamestown dogs weighed between 10 and 18 kg (22 and 39 pounds), comparable to modern beagles and schnauzers.

Additionally, many of the dog bones bore signs of human damage, including burn marks and cuts.

“Cut marks and other signs of butchery found on the dogs indicate that some of these dogs were eaten,” Dr Thomas said.

“This suggests that when settlers arrived, they did not have enough food and had to rely on the native dogs of the area.”

“Furthermore, DNA sequencing demonstrated that at least six of the dogs showed evidence of Native American ancestry.”

“Our findings indicate that there were indigenous dogs in the region and that they did not quickly become extinct when Europeans arrived.”

“While it is not surprising that dogs could be identified with Native American ancestry, our results suggest that settlers and Native American tribes may have been exchanging dogs and had little concern about potential interbreeding.”

The findings were published in the journal American Antiquity.

_____

Ariane E. Thomas et al. The Dogs of Tsenacomoco: Ancient DNA Reveals the Presence of Local Dogs at Jamestown Colony in the Early Seventeenth Century. American Antiquity, published online May 22, 2024; doi: 10.1017/aaq.2024.25

Source: www.sci.news