New Evidence Shows Humans Mastered Fire 400,000 Years Ago, Earlier Than Previously Believed

“This site, dating back 400,000 years, represents the earliest known evidence of fire-making not just in Britain and Europe but anywhere in the world,” stated Nick Ashton, co-author of the study and curator at the British Museum. He noted that this discovery pushes back the timeline of when our ancestors might have first made fire by approximately 350,000 years.

Researchers are uncertain about the uses of fire by these hominin ancestors. They may have roasted meat, crafted tools, or shared narratives under its glow.

Understanding when our ancestors mastered the use of fire is crucial to unraveling the complexities of human evolution and behavior.

One hypothesis suggests that the ability to start fire contributed to the increase in brain size among early humans, as cooking facilitates easier digestion and boosts caloric intake. Another theory posits that controlling fire may have fostered social gathering spots at night, boosting social behavior and cognitive evolution.

“We know brain size was increasing towards its current capacity during this period,” remarked Chris Stringer, research head in human evolution at London’s Natural History Museum and another author of the Nature study. “The brain is energetically costly, consuming about 20 percent of the body’s energy. Thus, the ability to use fire enhances nutrient absorption from food, provides energy for the brain, and allows for the evolution of larger brains.”

Stringer emphasized that this finding does not signify the beginning of fire usage among humans but is merely the earliest instance researchers can confidently point to. Other early indications of fire use have been found in regions of South Africa, Israel, and Kenya, though these are contentious and open to interpretation.

From an archaeological standpoint, it’s challenging to ascertain the cause of wildfires or whether they were initiated by humans.

“The key question is whether they collected it from a natural source, managed it, or created it themselves. On the surface, this appears to be a robust case suggesting that the group knew how to start fires,” noted Dennis Sandgathe, a senior lecturer in the archaeology department at Simon Fraser University in Canada, who was not part of the study.

In the recent Nature study, researchers highlight the presence of deposits with fire residue, fire-cracked stone tools including a flint handaxe, and two small fragments of pyrite likely brought to the site by humans for fire-making, as indicated by geological analysis.

The prehistoric handaxe was discovered near a 400,000-year-old fire site that researchers believe was frequently used by early Neanderthals.
Road to Ancient Britain Project

Other outside researchers expressed skepticism.

Much of the evidence presented is “circumstantial,” wrote Wil Roebroeks, a professor emeritus of paleolithic archaeology at Leiden University in the Netherlands, in an email.

Roebroeks pointed out that later Neanderthal sites, dating to around 50,000 years ago, showed flint tools with wear signs indicating they had been struck against pyrite to produce sparks, an indication of humans creating fire. That kind of evidence isn’t present in the current study.

“While the authors conducted a thorough analysis of the Barnham data, they seem to be overstating their claims by suggesting this is the ‘earliest evidence of fire-making,’” Roebroeks noted.

For our ancestors, fire was vital for warmth, nutrition, deterring predators, and even melting resins used in adhesives.

However, Sandgathe emphasized that the evolution of fire-starting is not a straightforward path; it included sporadic adaptations and innovations. Evidence exists that early groups who learned to create fire sometimes lost that ability or ceased its use for cultural reasons.

“We must be cautious not to generalize any single instance … as proof that from this moment forward everyone will know how to start a fire,” Sandgathe remarked, referencing nearly 100 modern hunter-gatherer groups that have been meticulously observed. Some lacked the ability to generate fire.

“It’s probable that the art of fire-making was discovered, lost, rediscovered, and lost again across various groups over time. Its history is undoubtedly intricate.”

Source: www.nbcnews.com

Incredible Methods to Detect Parkinson’s Disease Years Earlier

Parkinson’s disease is currently the fastest-growing neurological disorder in the United States: some 90,000 individuals are diagnosed each year, a staggering 50% increase since the mid-1980s. The situation mirrors global trends, with 25 million people expected to be living with the disease by 2050, roughly double today’s figure.

In summary, this is a significant issue. However, these numbers aren’t entirely surprising, considering longer life spans and growing populations. What is truly alarming, and frankly, unsettling, is how unprepared we are for this impending wave.

The available treatments are limited. Diagnostic tools are inadequate. Honestly, we still don’t really understand what causes Parkinson’s disease.

Yet, before you plunge into the depths of neurodegenerative despair, there is hope. Scientists worldwide are actively working to change the narrative surrounding Parkinson’s.

In particular, researchers are revolutionizing how we can detect Parkinson’s disease. Armed with cutting-edge technologies, AI, and a fundamentally evolving understanding of disease manifestation throughout the body, they’re aiming to detect it decades before any symptoms present themselves, rather than years.

Presently, there is no single definitive test for Parkinson’s disease. Instead, doctors diagnose it based on physical symptoms like tremors, slow movement, and muscle stiffness, often requiring assessments of tasks such as writing and speaking.

“Neurodegenerative disease today is what cancer was 50 years ago,” states Professor Hermona Soreq, a leading researcher in next-generation diagnostic tools. “We often finalize a diagnosis only when the nerve cells involved are already dead, leaving us unable to properly treat the patient.”

But what if there were a way to diagnose Parkinson’s disease before it could do any significant harm? What if it could be caught on its way, before brain cells face irreversible damage?

This is no longer just a theory. In fact, there are multiple methods emerging.

AI Desk Accessories

Not all breakthroughs in diagnostics require a blood sample; some new innovations could be found right on your desk.

At the University of California, Los Angeles, Professor Jun Chen’s lab claims to have developed a diagnostic pen that detects Parkinson’s disease by analyzing your writing.

This unique pen’s soft tip is crafted from an innovative magnetoelastic material that alters the magnetic field in response to pressure or bending—a phenomenon previously known in rigid metals but now applied to soft polymers, creating a new type of highly sensitive and user-friendly sensor.

“Utilizing magnetoelastic effects with soft materials represents a new operational mechanism,” Chen explains. “It can translate small biomechanical pressures, like arterial vibrations, into high-fidelity electrical signals.”

The pen, filled with magnetized ink, captures movements occurring both on paper and in the air, subsequently sending this data to a computer. Here, AI models analyze specific patterns linked to Parkinson’s motor symptoms.

Smart pens can be especially beneficial in countries where affordable diagnostic tools are needed—UCLA Jun Chen Lab

In a pilot study, the system successfully distinguished individuals with Parkinson’s disease from healthy controls with over 96% accuracy. Even better, Chen believes this pen can be mass-produced for merely $5 (£3.70).

“We have already filed for a patent and aim to commercialize this pen,” Chen states. “Simultaneously, we are working on optimizing it to improve our diagnostics’ accuracy.”

If handwriting isn’t your preferred method, Chen’s team has you covered. They’ve also created a Smart Keyboard utilizing the same principles.

This keyboard tracks subtle changes in pressure and rhythm as users type—often imperceptible to the naked eye—and relays that information to machine learning algorithms.

Initial tests indicate that it can identify characteristic motor abnormalities in Parkinson’s disease, and the team is combining this technology with a mobile app for continuous remote monitoring.

Together, these intelligent desk tools offer a glimpse into what Chen describes as the “personalized, predictive, preventive, participatory” future of Parkinson’s healthcare; a future where diagnosis is as simple as taking notes or sending emails.

This portable, soft keyboard employs magnetic elasticity to detect Parkinson’s disease and sends results to your smartphone—UCLA Jun Chen Lab

Parkinson’s Eye Test Detects Changes Two Decades in Advance

Picture diagnosing Parkinson’s disease during a routine eye exam, potentially decades before symptoms manifest. This is the promise of new non-invasive techniques developed by Victoria Soto Linan and her colleagues at Laval University in Canada, using an established eye test known as electroretinography (ERG).

According to Soto Linan, this eye test serves as a “window to the brain,” since the retina is part of the central nervous system. Issues like blurred vision and diminished contrast sensitivity manifest long before the well-known symptoms of tremors and stiffness.

Soto Linan’s team collected data on how the retina responds to light flashes, both from mice engineered to develop Parkinson’s-like symptoms and from newly diagnosed human patients.

They identified unique retinal signals demonstrating “sick signatures,” particularly in women. Crucially, this weakened signal appeared in the mice prior to any behavioral disease signs.

This leads Soto Linan to believe that this eye test could detect Parkinson’s as much as 20 years before symptoms arise.
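As a rough illustration of the kind of comparison behind a “sick signature,” here is a toy sketch in Python; the amplitude values and group sizes are invented for illustration, not data from the study.

```python
import statistics

# Illustrative ERG b-wave amplitudes in microvolts. The numbers are invented,
# but the logic mirrors the study's claim: a measurably weakened retinal
# response appears before any behavioural symptoms do.
healthy_uv = [210, 198, 205, 220, 201]
parkinsons_model_uv = [150, 162, 148, 171, 155]

# Fractional reduction in mean response amplitude between the two groups.
drop = 1 - statistics.mean(parkinsons_model_uv) / statistics.mean(healthy_uv)
print(f"mean b-wave amplitude reduced by {drop:.0%}")
```

In practice the discriminating signal would come from full ERG waveforms rather than a single summary number, which is why the team wants to hand the analysis off to machine learning.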


And unlike other early diagnostic methods, this one is already well ahead of the game.

“ERGs are now employed in clinics to diagnose eye diseases,” she explains. “They also have the major advantage of being non-invasive.”

The patient sits before a dome that flashes lights while the device records how the retina responds. The test could easily be integrated into a few minutes of an annual vision exam.

The team is currently refining the testing process, with hopes of linking it to machine learning algorithms that will accelerate results, perhaps even making the system portable enough to run on smartphones.

While the research is still in its early stages, its potential ramifications are enormous. As Soto Linan states, “This tool could identify at-risk individuals up to 20 years before symptoms emerge. Imagine how much less damage could be done by then.”

“Even if there is no treatment available, early intervention can often improve the quality of life in the long run.”

Detecting Parkinson’s Through Vocal Patterns

Can your voice indicate Parkinson’s disease before your physical body does? Recently, preprint research has explored whether AI can identify Parkinson’s simply by analyzing a person’s speech.

Around 90% of individuals with Parkinson’s develop motor speech disorders known as dysarthria, which can lead to issues like irregular pitch and breath control.

Globally, over 8.5 million individuals live with Parkinson’s disease—Getty

These vocal changes often arise earlier than more noticeable motor symptoms like tremors, thus serving as promising early indicators.

The research team collected brief audio recordings from participants, including 33 individuals with the disease. Their data served to train four different AI models to recognize disease-related vocal patterns. When tested on new recordings from the same participants, the models identified Parkinson’s with an accuracy exceeding 90%.

These changes are subtle and occur early, and researchers suggest that speech-based assessments could provide low-cost, non-invasive diagnostic options.
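A minimal sketch of the idea, assuming a single invented speech feature (pitch jitter) and a naive threshold rule; the real study trained four AI models on far richer acoustic features, so this is illustrative only.

```python
import random
import statistics

random.seed(0)

# Synthetic pitch-jitter values (%): dysarthria tends to increase
# cycle-to-cycle pitch variation, so higher jitter hints at Parkinson's.
# Both distributions here are invented for illustration.
controls = [random.gauss(0.5, 0.1) for _ in range(40)]   # healthy speakers
patients = [random.gauss(1.0, 0.2) for _ in range(40)]   # speakers with PD

# Naive classifier: threshold halfway between the two group means.
threshold = (statistics.mean(controls) + statistics.mean(patients)) / 2

correct = sum(j < threshold for j in controls) + sum(j >= threshold for j in patients)
accuracy = correct / (len(controls) + len(patients))
print(f"toy accuracy: {accuracy:.0%}")
```

Even this crude rule separates well-separated synthetic groups; the hard part in real speech is that the group distributions overlap heavily, which is what the trained models are for.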

Blood Tests for Diagnosing Parkinson’s

In April 2025, Soreq and her colleagues—including her son—announced a groundbreaking new study.

The findings were surprising; they revealed a simple and inexpensive blood test utilizing PCR technology (remember this from COVID-19?) that can accurately detect Parkinson’s disease a few years prior to symptom onset.

This test works by measuring the ratio between two markers that Soreq and her team discovered in human blood.

Specifically, individuals with Parkinson’s exhibit abnormally high levels of certain molecules known as transfer RNA (tRNA) fragments, identifiable by a specific repeating pattern called conserved sequence motifs.

A new blood test can detect early Parkinson’s by analyzing the unique imbalance of small RNA molecules in your blood—Credit: Getty

Simultaneously, the team uncovered reduced levels of tRNA associated with mitochondria (the “powerhouses” of cells, responsible for producing most of your body’s energy) in the blood of Parkinson’s patients.

“We proposed that if there’s an increase in one sequence and a decrease in another, we could calculate the ratio and identify a probable diagnosis,” says Soreq.

If this ratio exceeds a specific threshold, it strongly indicates a diagnosis.
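The arithmetic behind a two-marker PCR ratio can be sketched as follows; the function name, Ct values, and cutoff are illustrative assumptions, not figures from the study.

```python
# Hypothetical sketch of the ratio idea. qPCR reports a Ct (cycle threshold)
# value, where LOWER Ct means MORE starting material; each cycle roughly
# doubles the target, so a Ct difference of d is about a 2**d fold difference.

def marker_ratio(ct_nuclear_trf: float, ct_mito_trna: float) -> float:
    """Relative abundance of nuclear tRNA fragments vs mitochondrial tRNAs."""
    return 2 ** (ct_mito_trna - ct_nuclear_trf)

CUTOFF = 4.0  # illustrative decision threshold, not the published one

# Example: the nuclear fragments amplify 3 cycles earlier than the
# mitochondrial tRNAs, i.e. roughly 8-fold more abundant.
score = marker_ratio(ct_nuclear_trf=22.0, ct_mito_trna=25.0)
print(score, score > CUTOFF)  # 8.0 True
```

Using a ratio of two markers from the same sample is what keeps the test cheap: it cancels out differences in blood volume and extraction efficiency, so only two standard PCR runs are needed.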

According to Soreq, a traditional diagnosis of Parkinson’s can cost up to $6,000 (£4,400). The two PCR tests required for their method? Only $80 (£60).

“This is monumental. It makes a substantial difference,” she states. With some luck, the team anticipates this will become widely available within the next decade, potentially providing a crucial lifeline for patients globally.


Source: www.sciencefocus.com

Experts Warn of an Earlier, Shorter Foliage Season

This autumn, New England’s renowned leaf spectacle may not last as long as leaf peepers hope. Following a summer marked by drought and fluctuating rainfall, experts anticipate that colors will emerge early, shine brightly, and fade more quickly than usual.

Timing is not just essential for Instagram-worthy shots. Annually, millions flock to New York, Vermont, Massachusetts, New Hampshire, and Maine to hike, drive, and explore under the vibrant canopy, contributing an estimated $8 billion to the local economy, according to the US Forest Service.

However, this year, scientists say the iconic display is less predictable, with sporadic bursts of color replacing the usual weeks of vibrant waves of red, orange, and gold.

“Bright, Short, Early” season

Jim Salge, a fall foliage forecaster for Yankee magazine, expects the transition to be “bright, short, and fast.” Some leaves have already turned brown before showcasing their vibrant hues.

“Traditionally, we observe the change moving from north to south and from inland areas toward the coast, but as trees become stressed and turn rapidly, we expect to see more of a patchwork pattern this year,” Salge noted.

When trees do not receive adequate water, they become “stressed,” impairing the process of photosynthesis, which converts sunlight into energy. Conversely, excessive water can suffocate roots.

For optimal viewing, Salge suggests heading to western Maine, southern New Hampshire, and northern Massachusetts, as well as the Green Mountains in Vermont.

Peak colors are expected to shift to Vermont, New Hampshire, and Western Maine by early October, with higher elevations predicted to peak about a week earlier than usual.

“The silver lining about New England is that if you miss it, you can always head further south,” he said. “If it’s too early, go north or ascend to the mountains.”

Travelers can keep track of leaf changes with tools like the Peak leaf map by Yankee Magazine and I Love New York’s weekly reports.

Why are the leaves changing?

Climate change has generally pushed peak foliage later over recent decades, but this year’s dry summer has accelerated the timeline.

“Ideally, our forests would get mild rain spread evenly throughout the year,” explained Mukund Rao, an assistant professor at Columbia University’s Lamont-Doherty Earth Observatory. “Instead, a series of extreme storms followed by dry spells delivers water too rapidly for the soil to absorb.”

Vibrant leaf colors thrive on warm days and cool nights, but stressful conditions for trees can hasten leaf drop. Stressed or unhealthy trees often exhibit shorter transitions and dull foliage, Rao mentioned. In contrast, urban trees typically retain color longer, as buildings and pavement hold heat while streetlights provide extra illumination.

Additional threats include fungal diseases from heavy spring rains and diseases affecting beech trees.

“We are witnessing invasive insects altering forests and decimating various tree species, alongside invasive plants disrupting native growth patterns,” Salge stated.

Tracking changes

To make his predictions, Salge relies on weather forecasts and phenology data, the tracking of seasonal life-cycle events.

Notably, Polly’s Pancake Parlor in Sugar Hill, New Hampshire, has been monitoring local foliage since 1975. Records indicate that peak colors appeared for two weeks in late September that year; however, in 2024, it shifted to just two days in early October.

The US National Phenology Network gathers and shares observations from across the country. Its Nature’s Notebook app invites volunteers to document seasonal changes, bolstering over 200 scientific studies, according to director Theresa Crimmins.

“We have a general understanding of nature,” Crimmins remarked. “However, when focusing on specific species in particular locations, there remains much we do not comprehend.”

The revamped version of the app, launching this spring, allows users to upload photos for even one-time observations.

“More people can now become citizen scientists,” Salge commented. “Their observations of the world contribute valuable data.”

Source: www.nbcnews.com

Key Wrist Bone Emerged in Bird Ancestors Millions of Years Earlier Than Previously Believed

A group of paleontologists from Yale University and Stony Brook University made a significant discovery while studying dinosaur fossils, including two bird species found in the Gobi Desert, Mongolia.

This scene illustrates the oviraptorid dinosaur Citipati, startled as it rests on sand dunes. The creature raises its arms in a threat display, exposing its wrists and emphasizing the small, displaced pisiform bone (highlighted in blue x-ray). Image credit: Henry S. Sharp.

For years, the identity of a particular carpal bone in the bird wrist was a scientific enigma, until researchers determined it to be the pisiform.

This bone, a sesamoid that initially forms within a tendon much like the kneecap, migrated from its original position in the wrist and took the place of another carpal bone known as the ulnare.

Its position in modern birds creates a linkage that allows the wing to fold automatically when the elbow bends.

The bone’s large V-shaped notch allows for the alignment of hand bones to prevent dislocation during flight.

Consequently, this bone plays a crucial role in the bird’s forelimb and is integral for flight.

“The pisiform in modern birds is a rare type of wrist bone that initially forms within muscle tendons, like the kneecap, but eventually takes the place of one of the ‘normal’ wrist bones, the ulnare,” commented one researcher.

“It is closely associated with the muscle tissue of the arm, linking flying muscle movement to wrist articulation when integrated into the wrist.”

“This integration is particularly vital for wing stabilization during flight.”

In their recent study, Dr. Bhullar and his team analyzed two Late Cretaceous fossils: a troodontid (bird-like predators related to Velociraptor) and Citipati cf. osmolskae (an oviraptorid with a long neck and toothless beak).

“We were fortunate to have two exquisitely preserved theropod wrists for this analysis,” said Alex Ruebenstahl, a paleontologist at Yale University.

“Wrist bones are small and, even when well preserved, tend to shift during decay and fossilization, which complicates interpreting their original positions.”

“Observing this small bone in its correct position enabled me to thoroughly interpret the fossil wrists we had on hand, as well as those from previous studies.”

James Napoli, a vertebrate paleontologist and evolutionary biologist at Stony Brook University, noted:

“While it’s unclear how many times dinosaurs evolved flight, it’s fascinating that experiments with flight appear only after this change to the wrist joint.”

“This adaptation may have established an automated mechanism found in present-day birds, although further research on dinosaur wrist bones is necessary to validate this hypothesis.”

Placing their findings within an evolutionary framework, the authors concluded that this rearrangement was not unique to birds: it was already in place among theropod dinosaurs by the origin of Pennaraptora, a group of theropods that includes dromaeosaurids such as Velociraptor and the oviraptorosaurs.

Overall, this group of dinosaurs exhibited bird-like features, including the emergence of feathered wings, and flight is thought to have evolved within it at least twice, if not up to five times.

“The evolutionary replacement of the ulnare was a gradual process occurring much deeper in history than previously understood,” stated the researchers.

“In recent decades, our understanding of theropod dinosaur anatomy and evolution has expanded significantly, revealing many classical ‘bird-like’ traits such as thin-walled bones, larger brains, and feathers.

“Our findings suggest that the avian wrist construction follows a pattern that can be traced back to the origin of Pennaraptora.”

The team’s paper was published in the journal Nature on July 9, 2025.

____

JG Napoli et al. Theropod wrist reorganization preceded the origins of bird flight. Nature, Published online on July 9, 2025. doi:10.1038/s41586-025-09232-3

Source: www.sci.news

Finds from the Bronze Age indicate that market economies may have originated earlier than previously believed

Bronze Age metal hoard from Weisig, Germany

J. Lipták/Landesamt für Archäologie Sachsen

Bronze Age Europeans earned and spent money in much the same way we do today, indicating that the origins of the “market economy” are much older than expected.

That’s the controversial conclusion of a new study that challenges the view that elites were the dominant force in Bronze Age economies and suggests that human economic behaviour may not have changed much over the past 3,500 years or more.

“We tend to romanticize European prehistory, but the Bronze Age was not just a fantasy world where townsfolk and peasants existed only as a backdrop for great lords,” says Nicola Ialongo, a professor at Aarhus University in Denmark. “It was a very familiar world, with family, friends, social networks, markets, jobs, and ultimately having to figure out how to make ends meet.”

Bronze Age Europeans, who lived from 3300 to 800 BCE, were not meticulous bookkeepers like people in some other ancient societies, such as Mesopotamia. But Ialongo and Giancarlo Lago, a researcher at the University of Bologna in Italy, suggest that the trove of metal they left behind may hold important insights into their daily lives and the roots of modern economic behavior.

Lago and Ialongo analyzed more than 20,000 metal objects from Bronze Age hoards in Italy, Switzerland, Austria, Slovenia and Germany. These objects came in many different forms, but around 1500 BCE they began to be standardized by weight, which is why many experts regard them as a type of pre-monetary currency.

“The discovery of widespread systems of measurement and weight allows us to model things that have been known for centuries in ways that have never been modeled before,” Ialongo says. “This not only gives us new answers to old questions, but it also gives us new questions that no one has asked before.”

The team found that the weight values in their vast sample followed the same statistical distribution as the daily expenses of a modern Western household: small everyday expenses, represented by lighter pieces, dominated the consumption pattern, while larger expenses, represented by heavier pieces, were relatively rare. This pattern is similar to that found in the average modern wallet, with many small bills and very few large bills.
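The shape of that claim is easy to see with a toy simulation; the log-normal parameters and gram cutoffs below are invented for illustration, not values from the paper.

```python
import random

random.seed(42)

# Stand-in for the ~20,000 catalogued fragment weights (grams). The paper's
# claim is that the distribution is right-skewed like modern household
# spending: many light pieces, few heavy ones. A log-normal is a common
# model for such expenditure data; these parameters are invented.
weights = [random.lognormvariate(mu=2.0, sigma=1.0) for _ in range(20_000)]

light = sum(1 for w in weights if w < 10)   # small "everyday" pieces
heavy = sum(1 for w in weights if w >= 50)  # rare large pieces

print(f"pieces under 10 g: {light / len(weights):.0%}")
print(f"pieces 50 g and over: {heavy / len(weights):.0%}")
```

The many-light/few-heavy shape that falls out of such a skewed distribution is the qualitative pattern the authors compare against modern household expense data.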

Lago and Ialongo interpret their find as evidence that the Bronze Age economic system was regulated by market forces of supply and demand, with everyone participating in proportion to how much they earned. This hypothesis contrasts with the influential view put forward by anthropologist Karl Polanyi in the 1940s, who characterized the modern economy, based on monetary gain, as a new phenomenon distinct from ancient economies centered on barter, gift exchange, and social status.

Richard Blanton, a researcher at Purdue University in Indiana, called the study credible: “I think this argument will stimulate debate among archaeologists and economic anthropologists who have for decades relied on erroneous assumptions about the antiquity of market economies,” he said.

“I think this paper adds useful fuel to that criticism,” Blanton says, “and to me it sheds entirely new light on the function of bronze deposits and the potential use of bronze coins as a unit of exchange.”

However, Erica Schoenberger, a researcher at Johns Hopkins University in Maryland, is skeptical of the team’s conclusions. “It’s dangerous to assume that ordinary people in premodern times used money in normal economic activities,” says Schoenberger. “For example, medieval English peasants only got money for selling their produce when lords began to demand money in lieu of rents or taxes in kind. They gave most or all of that money directly to the lords. They sold to get money, but they didn’t use it to buy things they needed. We’re still a long way from modern economic behavior [in the Middle Ages].”

Lago and Ialongo hope that their work will inspire other experts to carry out similar studies on artefacts from different regions and cultures. They suggest that market economies are a natural development across time and cultures, and that such systems are not something new or unique that has emerged in Western societies over the past few centuries.

“Technically, we haven’t proven that the Bronze Age economy was a market economy,” Ialongo says, “we simply have no evidence that it wasn’t. And we’re just pointing out a contradiction: why is everyone so convinced that there wasn’t a market economy when everything we see can be explained by a market economy model? In other words, if the simplest explanation works well enough, why should we have to imagine a more complex one?”


Source: www.newscientist.com

New understanding suggests LUCA, the last common ancestor of all life, emerged earlier than previously believed

Illustration showing LUCA possibly being attacked by a virus

Scientific Graphic Design

The organisms that gave rise to all life on Earth evolved much earlier than previously thought – just a few hundred million years after Earth formed – and may have been more sophisticated than previous assessments had suggested.

The DNA of all living organisms today, from E. coli to the blue whale, shares many similarities, suggesting that we can trace our origins back to a universal common ancestor, LUCA, billions of years ago. While many efforts have been made to understand LUCA, a study taking a broader approach has revealed surprising results.

“What we're trying to do is bring together representatives from different disciplines to develop a comprehensive understanding of when LUCA existed and what its biological characteristics were,” says Philip Donoghue at the University of Bristol, UK.

Genes that are currently present in all major lineages of life may have been passed down uninterrupted from LUCA, which could help us understand what genes our ancient ancestors had. By studying how these genes changed over time, we should be able to estimate when LUCA lived.

In reality, this is a lot more complicated than it sounds, as genes are lost, gained, and swapped between branches. Donoghue says the team created a complex model that took this into account, to work out which genes were present in LUCA. “We've found a much more sophisticated organism than many have previously claimed,” he says.

The researchers estimate that 2,600 protein-coding genes come from LUCA, up from previous estimates of as few as 80. The team also concludes that LUCA lived around 4.2 billion years ago, much older than other estimates and surprisingly close to the formation of Earth 4.5 billion years ago. “This suggests that the evolution of life may have been simpler than previously claimed, because evolution happened so quickly,” Donoghue says.

The earlier date is largely due to the team's improved methodology, but also because, unlike others, they don't assume that LUCA could have existed only after the Late Heavy Bombardment, when Earth was hit so hard by space debris that any new life that emerged could have been wiped out. Based on rocks returned from the Moon, the period has been put at 3.8 billion years ago, but there's a lot of uncertainty around that number, Donoghue says.

Their reconstruction suggests that LUCA had genes that protected it from ultraviolet damage, which leads them to believe that it likely lived on the ocean's surface. Other genes suggest that LUCA fed on hydrogen, which is consistent with previous findings. The team speculates that LUCA may have been part of an ecosystem with other types of primitive cells that are now extinct. “I think it's extremely naive to think that LUCA existed on its own,” Donoghue says.

“I think this is compelling from an evolutionary perspective,” says Greg Fournier, a researcher at the Massachusetts Institute of Technology. “LUCA is not the beginning of the story of life, but merely the state of the last common ancestor that we can trace back to using genomic data.”

The results also suggest that LUCA had a primitive version of the bacterial defense system known as CRISPR to fight viruses. “Even 4.2 billion years ago, our earliest ancestors were fighting viruses,” says team member Edmund Moody, also at the University of Bristol.

Peering into the distant past is fraught with uncertainty, and Donoghue is the first to admit that his team may have missed the mark. “We've almost certainly got it all wrong,” he says. “What we're trying to do is push the envelope and create the first attempt to synthesize all of the relevant evidence.”

“This won't be the last word,” he said, “and it won't be our last word on this subject, but we think it's a good start.”

Patrick Forterre, a researcher at the Institut Pasteur in Paris, France, who coined the term LUCA, also believes that the organism did not live in isolation. “But the claim that LUCA lived before the Late Heavy Bombardment 3.9 billion years ago seems to me completely unrealistic,” says Forterre. “I'm convinced that their strategy for determining the age and gene content of LUCA has several flaws.”


Source: www.newscientist.com

Cyprus settled by hunter-gatherers much earlier than previously believed

Settlement of the eastern Mediterranean islands, long portrayed as isolated, inaccessible, and unattractive, has generally been considered a Neolithic phenomenon, beginning only when farming populations spread from the mainland under demographic pressure. New research led by Professor Corey Bradshaw from Flinders University shows that Cyprus may instead have been settled by hunter-gatherers about 14,000 to 13,000 years ago, earlier than previously recognized. The process must have involved a small number of large-scale migration events (hundreds to thousands of people), implying intention and organization on the part of these early people.



Bradshaw et al. used the latest archaeological data, hindcast climate projections, and age-structured demographic models. They demonstrate evidence of an early arrival on Cyprus (14,257 to 13,182 years ago), with two to three major migration events occurring within 100 years needed to keep the risk of extinction low, and estimate that a large group (1,000 to 1,375 people) arrived. Image credit: Bradshaw et al., doi: 10.1073/pnas.2318293121.

In researching when Cyprus was first occupied by humans, Professor Bradshaw and his colleagues discovered that the large Mediterranean islands were an attractive and preferred destination for Paleolithic people.

Their findings contradict previous research that suggested Mediterranean islands would have been inaccessible and inhospitable for Pleistocene hunter-gatherer societies.

The archaeologists combined archaeological data, climate estimates, and demographic modeling to reconstruct the early settlement of Cyprus.

Analysis of archaeological dating from the 10 oldest sites across Cyprus suggests that first human habitation dates to between 14,257 and 13,182 years ago, much older than previously thought.

“We show that the island was then rapidly settled,” the researchers said. “Climate modeling shows that this early population arrived alongside increases in temperature, precipitation, and environmental productivity sufficient to sustain large hunter-gatherer populations.”

Based on demographic models, the team infers that large groups of hundreds to thousands of people arrived in Cyprus over two or three major migration events within 100 years.

“This settlement pattern suggests systematic planning and the use of advanced vessels,” Professor Bradshaw said.

Within 300 years, or 11 generations, Cyprus' population grew to a median of 4,000 to 5,000 people.
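As a back-of-envelope check on these figures (this is an illustrative calculation, not the study's stochastic demographic model, and the founding and final population numbers are taken from the article's reported ranges), the implied growth rate can be computed from a constant per-generation growth factor:

```python
# Illustrative growth-rate check using the article's figures (assumptions:
# all founders arrive at the start, growth is constant per generation).
founders = 1_375       # upper estimate of the founding population
final_median = 4_500   # midpoint of the reported 4,000-5,000 median range
generations = 11       # about 300 years

# Solve final = founders * g**generations for the per-generation factor g.
g = (final_median / founders) ** (1 / generations)

# Equivalent annual growth factor over the same 300-year span.
annual = g ** (generations / 300)

print(f"per-generation growth factor ~ {g:.3f}")        # ~1.11, i.e. ~11% per generation
print(f"equivalent annual growth ~ {(annual - 1) * 100:.2f}%")
```

Roughly an 11 percent increase per generation (about 0.4 percent per year) would suffice, a modest rate, consistent with the authors' conclusion that the island could readily sustain an expanding hunter-gatherer population.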

Dr Theodora Moutsiou, an archaeologist at James Cook University and the University of Cyprus, said: “This result suggests that, far from being inhospitable places for Paleolithic hunter-gatherer societies, Cyprus, and perhaps other Mediterranean islands, would have been attractive destinations.”

“It has been argued that the dispersal and settlement of humans on Cyprus and other eastern Mediterranean islands was driven by rapid climate change, with coastal regions inundated by post-ice-age sea-level rise and demographic pressures forcing farmers off the mainland to new locations; that is, a matter of necessity rather than choice.”

“This interpretation arose from significant gaps in the archaeological record of Cyprus, resulting from differences in the preservation of archaeological materials, preservation bias, uncertainties associated with dating, and limited DNA evidence,” said Dr Christian Reepmeyer, an archaeologist at the Australian Research Council Centre of Excellence for Australian Biodiversity and Heritage, the German Archaeological Institute, and James Cook University.

“Our research, based on more archaeological evidence and advanced modeling techniques, changes that.”

“The new findings highlight the need to reconsider the question of early human migration in the Mediterranean and to test the validity of perceived early settlement dates in the light of new technologies, field-survey methods, and data,” said Professor Bradshaw.

The research was published in a paper in the Proceedings of the National Academy of Sciences.

_____

Corey J.A. Bradshaw et al. 2024. Demographic models predict end-Pleistocene arrival and rapid expansion of pre-agropastoralist humans in Cyprus. PNAS 121 (21): e2318293121; doi: 10.1073/pnas.2318293121

Source: www.sci.news

Wildfire season starting earlier and extending further

Wildfire season in Alberta, Canada typically starts on March 1st. This year, the season was officially declared open on February 20th, more than a week early.

Over 150 wildfires are currently burning in parts of Western Canada. Meanwhile, firefighters in the Texas Panhandle have been battling the largest wildfire in the state’s history for over a week. This fire is part of a trend of recent wildfires starting earlier than expected.

Although winter fires are not uncommon in these regions, scientists believe that global warming is worsening the conditions that lead to these winter wildfires.

According to wildfire expert Mike Flannigan from Thompson Rivers University in British Columbia, Canada, “As temperatures rise, we are seeing conditions that are more conducive to fires. A longer burn period means more chances for fires to occur.”

The ongoing drought in Western Canada is fueling numerous fires in British Columbia and Alberta. Even in areas where drought is not a major issue, the impacts of climate change are being felt.

In Texas, authorities are investigating whether a utility company was responsible for the recent historic fires, which burned over a million acres. Extreme temperatures, dry grass, and high winds created ideal conditions for the fires to spread rapidly.

While global warming may contribute to the conditions favoring wildfires, it is challenging to directly attribute individual events to climate change. Weather, landscapes, and ecosystems all interact in complex ways to influence fire behavior in different locations.

Climate change is leading to warmer environments that make plants drier, increasing the risk of fires. Scientist Nathan Gill from Texas Tech University explained, “While we can’t point to any specific event as caused by climate change, conditions are changing, making similar events more likely in the future.”

This trend is expected to result in longer fire seasons and more winter fires in the years to come, as we continue to live in a more fire-prone world.

“As we face a more flammable world, we should anticipate more occurrences like this,” Flannigan concluded.

Source: www.nbcnews.com