Massive Dark Matter Cloud Detected Near Our Solar System: What’s at Stake?

Dark Matter Discovery

Possible Large Clump of Dark Matter Near Our Galaxy

Credit: Alamy

A significant discovery indicates the presence of a gigantic dark matter cloud adjacent to our solar system. These clouds, previously unidentified in the Milky Way, have been detected thanks to precise cosmic clocks known as pulsars.

Current cosmological models propose that galaxies are enveloped in diffuse clouds of dark matter called halos, with smaller subhaloes scattered throughout. However, the elusive nature of dark matter, which neither emits, absorbs, nor reflects light, complicates the detection of these halos and subhalos.

To quantify this dark matter phenomenon, Sukanya Chakrabarti and her research team at the University of Alabama in Huntsville leveraged pairs of rapidly spinning neutron stars known as pulsars. These cosmic clocks emit beams of light at consistent intervals, allowing researchers to measure variations in their trajectories when they are influenced by a large nearby mass.

Given that dark matter interacts with ordinary matter solely through gravity, an adjacent dark matter subhalo would alter the orbit of neighboring pulsars. This is precisely what Chakrabarti and her collaborators identified approximately 3,000 light years from our solar system. “Our observations detected a pair of pulsars whose motions indicate an unexpected gravitational pull from an unseen object,” comments Philip Chan from the University of Wisconsin-Milwaukee.

The research revealed that this gravitational influence originated from an object approximately 60 million times more massive than the Sun and spanning hundreds of light years. After mapping the location against stellar data, no correlations with known celestial bodies were found. If validated, this object could be a unique example of dark matter.
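The scale of the pull involved can be illustrated with a back-of-the-envelope Newtonian estimate. This is not the team's pulsar-timing analysis; the 300-light-year separation is an assumed round number standing in for "hundreds of light years":

```python
# Back-of-the-envelope Newtonian estimate (not the researchers' analysis):
# the pull a pulsar would feel from the candidate subhalo described above.
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30       # solar mass, kg
LIGHT_YEAR = 9.461e15  # one light-year in metres

M_subhalo = 60e6 * M_SUN   # ~60 million solar masses (figure from the article)
r = 300 * LIGHT_YEAR       # assumed separation: "hundreds of light years"

a = G * M_subhalo / r**2   # Newtonian acceleration toward the clump
print(f"induced acceleration ~ {a:.1e} m/s^2")
```

The result, on the order of a billionth of a metre per second squared, shows why only the most precisely timed pulsars can reveal such an object.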

This potential dark matter subhalo could be the only instance of such size in our local galactic vicinity. “There may only be one or two of these large features nearby, depending on dark matter models,” suggests Alice Quillen at the University of Rochester in New York. “Different dark matter theories propose varying distributions of these structures.”

Mapping such structures is what catalyzed Chakrabarti’s interest in subhalo research. “Our objective is to map as many subhaloes as we can throughout the galaxy, and we’re just beginning to achieve that. Ultimately, we aim to elucidate the nature of dark matter,” she asserts.

However, pulsar binaries are scarce; only 27 known pairs offer timing precise enough to measure gravitational acceleration. This scarcity explains why the subhalo remained undetected until now. “Given the finite number of pulsars, we are exploring alternative methods to monitor them using a broader array of objects,” states Zhang. If successful, this could be a breakthrough in understanding the true nature of dark matter.


Source: www.newscientist.com

Webb Uncovers Massive Helium Cloud Emanating from WASP-107b

By utilizing highly precise spectroscopic observations from the Near Infrared Imager and Slitless Spectrograph (NIRISS) on board the NASA/ESA/CSA James Webb Space Telescope, astronomers have identified helium gas escaping from WASP-107b, a super-Neptunian exoplanet located in the Virgo constellation, approximately 212 light-years away.

Artist’s impression of exoplanet WASP-107b. Image credit: University of Geneva / NCCR PlanetS / Thibaut Roger.

WASP-107 is an active K-type main-sequence star situated roughly 212 light-years away in the Virgo constellation.

Discovered in 2017, WASP-107b is among the least dense known exoplanets, categorized by astrophysicists as a “superpuff” or “cotton candy” planet.

This exoplanet has an orbit significantly closer to its star than Earth is to the Sun, completing its orbit every 5.7 days.

While this planet has one of the coldest atmospheres measured for a transiting exoplanet, at about 500 degrees Celsius (932 degrees Fahrenheit), it remains much hotter than Earth.

This elevated temperature results from tidal heating linked to its slightly elliptical orbit, which may help explain how WASP-107b can be so inflated without invoking exotic formation scenarios.

“A planet’s atmosphere can sometimes dissipate into space,” explained Yann Carteret, an astronomer at the University of Geneva, alongside colleagues.

“On Earth, we lose just over 3 kg of matter (primarily hydrogen) every second.”

“This phenomenon, known as atmospheric escape, is especially significant for astronomers studying exoplanets in close proximity to their stars. Such planets experience intense heating, making them particularly vulnerable to this effect.”
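To put the quoted 3 kg-per-second figure in perspective, a quick sketch shows how negligible this is on Earth's scale. The atmospheric mass used below is a standard reference value, not a figure from the article:

```python
# Scale of the quoted escape rate: ~3 kg (mostly hydrogen) lost per second.
# The atmospheric mass is a standard reference value, not from the text.
RATE_KG_PER_S = 3.0
SECONDS_PER_YEAR = 365.25 * 24 * 3600
ATMOSPHERE_MASS_KG = 5.1e18   # approximate total mass of Earth's atmosphere

loss_per_year = RATE_KG_PER_S * SECONDS_PER_YEAR        # ~9.5e7 kg per year
years_to_deplete = ATMOSPHERE_MASS_KG / loss_per_year   # tens of billions of years
print(f"{loss_per_year:.2e} kg lost per year; "
      f"{years_to_deplete:.1e} years to shed one atmosphere")
```

At that rate Earth would need tens of billions of years to shed its atmosphere, which is why escape matters for close-in exoplanets but not for us.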

With data from Webb’s NIRISS instrument, astronomers observed a substantial flow of helium within WASP-107b’s exosphere.

This helium cloud partially obscures the star’s light even before the planet transits in front of the star.

“Our atmospheric escape model indicates a helium flow both in front of and behind the planet, extending nearly 10 times the planet’s radius in the direction of its orbit,” Carteret stated.

Alongside helium, astronomers confirmed the existence of water and various trace chemicals (including carbon monoxide, carbon dioxide, and ammonia) in WASP-107b’s atmosphere.

These findings provide essential insights for piecing together the planet’s formation and migration history.

The research suggests that the planet initially formed farther from its star than its current orbit before migrating inward, which may account for both its inflated atmosphere and its ongoing gas loss.

“Atmospheric escape on Earth is too weak to have a significant impact on our planet,” noted Vincent Bourrier, an astronomer at the University of Geneva.

“However, it could explain the absence of water on our neighbour Venus.”

“Thus, understanding the mechanisms involved in this process is crucial, as it could erode the atmospheres of certain rocky exoplanets.”

Details of these findings were published in the journal Nature Astronomy.

_____

V. Krishnamurthy et al. Continuum helium absorption from both the leading and trailing atmospheric tails of WASP-107b. Nature Astronomy, published online December 1, 2025; doi: 10.1038/s41550-025-02710-8

Source: www.sci.news

What is Cloud Seeding and Could It Solve Iran’s Drought Crisis?


Riverbed near Latian Dam, one of Tehran’s main water sources, Iran

BAHRAM/Middle East Images/AFP via Getty Images

Iran faces an extreme drought, prompting the president to suggest a possible evacuation of Tehran. To mitigate the situation, cloud-seeding aircraft will start operations on November 15, set to persist through the rainy season until May. However, experts caution that this method is complex and unlikely to significantly alleviate the water crisis on its own.

How critical is the drought in Iran?

Iran’s rainfall this year is down by 85 percent, with Tehran registering only 1 mm. Local reservoirs are in a dire condition, with 32 dams barely holding 5 percent of their capacity, and satellite imagery shows some areas completely parched.

Residents of Tehran have cut their water usage by approximately 10 percent, yet this reduction falls short. Water pressure is being reduced overnight, and officials are preparing to fine households and businesses that exceed their water limits. The president indicated that if rains do not increase by December, 14 million people in Tehran might need to relocate.

What is causing the drought?

Experts attribute the drought, which has persisted for five years, to climate change. This autumn is the driest Iran has seen in 50 years, with temperatures reaching above 15°C (59°F) in Tehran, a city usually experiencing snowfall in November.

However, mismanagement is largely to blame. Kaveh Madani, a former environmental ministry official, describes Iran as “water bankrupt.” The government has expanded agriculture in arid regions excessively, stressing water resources. A staggering number of illegal wells—around half a million—have been drilled by desperate farmers, draining groundwater supplies.

What exactly is cloud seeding?

Cloud seeding originated in the 1940s with scientists like Bernard Vonnegut, brother of author Kurt Vonnegut. The process involves dispersing particles that encourage water droplets in clouds to precipitate as rain. While some studies have experimented with salt, the focus has generally been on spraying silver iodide into upper-level mixed-phase clouds. When supercooled liquid droplets encounter this crystalline material, they crystallize into ice that ultimately falls as rain or snow.

Determining how much precipitation cloud seeding actually produces is challenging, because there is no way to know how much rain a cloud would have dropped on its own.

“The natural variability of clouds is significant, making it tough to quantify the impacts of seeding,” says Andrea Flossmann from the University of Clermont Auvergne, France. “Clouds may appear, but not all of them bring rain.”

A 2014 experiment conducted in Wyoming indicated that cloud seeding could enhance precipitation by 5 to 15 percent.

Is there hope for resolving the drought?

Previously, Iran accused Israel and the United Arab Emirates of “stealing” rain via cloud seeding, but it is now taking matters into its own hands. It plans to deploy cargo planes, drones, and ground-based generators, essentially furnaces that loft seeding particles into the air, commonly installed in mountainous regions.

On November 15, seeding targeted clouds that formed over the watershed surrounding Lake Urmia, which has been shrinking for over two decades and is now largely a salt flat littered with decaying boats. Initial reports suggest that areas west of the lake received up to 2.7 centimeters of rain the following morning, as indicated by a precipitation map from the University of California, Irvine.

Success in cloud seeding depends on the presence of moisture-rich clouds, which might be sparse in arid Iran, where large bodies of water are limited.

“Cloud seeding becomes particularly challenging during droughts, owing to the extremely dry atmosphere that may not support moisture-laden clouds,” notes Karen Howard, a scientist with the U.S. Government Accountability Office.

Yet, in recent days, rain-bearing systems have moved into Iran from the Black Sea, causing localized flooding in regions such as Ilam and Kurdistan on November 16.

Cloud seeding might help “extract a few additional droplets” from these weather systems, according to Armin Sorooshian at the University of Arizona. “While it won’t create flooding or completely solve the drought issue, it may provide a minor benefit.”


Source: www.newscientist.com

Massive Plasma Cloud Erupted from a Star 130 Light-Years Away


Artist’s impression of a coronal mass ejection in a star

Olena Shumahalo/Callingham et al.

Astronomers have successfully identified the first clear evidence of a coronal mass ejection (CME) from a star outside of our solar system. This CME, a plasma cloud from a star located 130 light-years away, was observed using radio telescopes here on Earth.

Coronal mass ejections happen when solar storms propel bubbles of magnetized plasma into space. While such eruptions from our Sun can create auroras on Earth, they can also be powerful enough to disrupt the atmosphere of Venus, which lacks a protective magnetic field.

For decades, scientists have detected signs of CMEs in far-off stars, but until now, they were unable to confirm that this material truly escapes the star’s gravitational and magnetic grip, rather than simply being temporarily displaced and then drawn back in.

Joe Callingham and his team at the Netherlands Institute for Radio Astronomy discovered these emissions using the Low Frequency Array (LOFAR) radio telescope. The radio bursts emitted by a CME can only be captured once the ejection has travelled fully away from its host star, in this case StKM 1-1262.

This research group also employed the XMM-Newton space-based X-ray telescope to assess the temperature, rotation, and luminosity of the host star.

Callingham emphasized that this new evidence conclusively affirms prior speculations that CMEs indeed occur in distant stars. “Some will say we’ve seen indications for the last 30 years, and they’re right, but we’ve never been able to prove it definitively,” he remarked. “We are discussing mass being expelled and lost from the star, which has been a topic of ongoing debate.”

The energetic particles carried by such ejecta could pose a significant threat to potential life forms nearby. According to researcher Anthony Yeates from Durham University in the UK, it is crucial to integrate insights on the frequency and intensity of CMEs from distant stars into models assessing the habitability of exoplanets. “If exoplanets orbit such stars, the repercussions for life there could be devastating,” he added.


Article modified on November 12, 2025

Updated star distance from Earth.


Source: www.newscientist.com

Astronomers Uncover New Planetary Nebula in the Large Magellanic Cloud

Astronomers have identified a faint planetary nebula during a spectroscopic examination of stars in NGC 1866, a massive young globular cluster in the Large Magellanic Cloud, a satellite galaxy of the Milky Way. This nebula, designated Ka LMC 1, is situated near the core of NGC 1866.



This image shows NGC 1866 overlaid with a false-color representation from the MUSE data cube, highlighting the ionized shell of planetary nebula Ka LMC 1 as a red ring. The grayscale inset details the sizes of the ionization shells of singly ionized nitrogen [N II] and doubly ionized oxygen [O III]. A magnified Hubble image reveals a pale blue star at the center, likely the hot central star of Ka LMC 1. Image credit: AIP / M.M. Roth / NASA / ESA / Hubble.

NGC 1866 is located at the edge of the Large Magellanic Cloud, approximately 160,000 light-years from Earth.

This cluster, also referred to as ESO 85-52 and LW 163, was discovered by Scottish astronomer James Dunlop on August 3, 1826.

Surprisingly, NGC 1866 is a young globular cluster positioned close enough for individual star studies.

In a recent spectroscopic investigation of NGC 1866, astronomers analyzed spectra captured by the MUSE Integral Field Spectrometer on ESO’s Very Large Telescope.

They made an unexpected and intriguing discovery: the ionized shell of a planetary nebula.

A subsequent study utilized images from the NASA/ESA Hubble Space Telescope to explore the nature of the object, which has been named Ka LMC 1.

“Planetary nebulae signify a late phase in a star’s evolution, during which the star has exhausted the hydrogen fuel in its core, expands as a red giant during a shell-burning phase, and eventually sheds most of its mass into a large, expanding shell. The remaining core then contracts and heats up before eventually cooling to become a white dwarf,” explained lead author Dr. Howard Bond, an astronomer at Pennsylvania State University and the Space Telescope Science Institute, along with his colleagues.

“Once the core surpasses 35,000 degrees, the shell ionizes and becomes visible through emission lines at specific wavelengths.”

The research team noted that Hubble images depict the hot central star of the Ka LMC 1 nebula.

“Ka LMC 1 is a genuine enigma. A young star cluster aged 200 million years implies that its progenitor star must be significantly massive,” noted astronomer Professor Martin Roth from the Potsdam Leibniz Institute for Astrophysics, the Institute for Physics and Astronomy at the University of Potsdam, and the German Center for Astrophysics.

“However, such a star would quickly evolve towards a cooling white dwarf stage.”

“Reconciling the age of the planetary nebula’s expanding shell with the theoretical evolutionary trajectory of its central star has been challenging.”

“This object undoubtedly demands further detailed observations to clarify its characteristics.”

“It presents a rare opportunity to observe star evolution over a timeframe that usually spans millions, if not billions, of years.”

“Yet, the evolution of massive central stars occurs in merely a few thousand years, making it possible to align with the timeline of the nebula’s expansion.”

The findings were published on November 7, 2025, in the Publications of the Astronomical Society of the Pacific.

_____

Howard E. Bond et al. 2025. A Faint Planetary Nebula Accidentally Discovered in the Massive Young LMC Star Cluster NGC 1866. PASP 137, 114202; doi: 10.1088/1538-3873/ae1664

Source: www.sci.news

OpenAI Enters $38 Billion Cloud Computing Agreement with Amazon

OpenAI has secured a $38 billion (£29 billion) agreement to leverage Amazon’s infrastructure for its artificial intelligence offerings, part of a broader initiative exceeding $1 trillion in investments in computing resources.

This partnership with Amazon Web Services provides OpenAI with immediate access to AWS data centers and the Nvidia chips utilized within them.

Last week, OpenAI CEO Sam Altman stated that the company is committed to an investment of $1.4 trillion in AI infrastructure, highlighting concerns over the sustainability of the expanding data center ecosystem, which serves as the backbone of AI applications such as ChatGPT.

“To scale frontier AI, we need large-scale, dependable computing,” Altman remarked on Monday. “Our collaboration with AWS enhances the computing ecosystem that fuels this new era and makes sophisticated AI accessible to all.”

OpenAI indicated that this deal will provide access to hundreds of thousands of Nvidia graphics processors for training and deploying its AI models. Amazon plans to incorporate these chips into its data centers to enhance ChatGPT’s performance and develop OpenAI’s upcoming models.

AWS CEO Matt Garman reaffirmed that OpenAI is continuously pushing technological boundaries, with Amazon’s infrastructure forming the foundation of these ambitions.

OpenAI aims to develop 30 gigawatts of computing capacity, enough to supply power to approximately 25 million homes in the U.S.
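As a quick sanity check on those two figures, 30 gigawatts spread across 25 million homes corresponds to an average draw of about 1.2 kilowatts per home, in line with typical US household consumption:

```python
# Sanity check on the quoted comparison: 30 GW across 25 million US homes.
capacity_watts = 30e9   # 30 gigawatts (figure from the article)
homes = 25e6            # 25 million homes (figure from the article)

watts_per_home = capacity_watts / homes
print(watts_per_home)   # 1200.0 -> about 1.2 kW per home
```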

Recently, OpenAI declared its transformation into a for-profit entity as part of a restructuring effort that values the startup at $500 billion. Microsoft, a long-time supporter, will hold roughly 27% of the new commercial organization.

The race for computing resources among AI firms has sparked worries among market analysts regarding financing methods. The Financial Times reported that OpenAI’s annual revenue is approximately $13 billion, a figure dwarfed by its $1.4 trillion in infrastructure commitments. Other data center deals OpenAI has entered include a massive $300 billion agreement with Oracle.

During a podcast with Microsoft CEO Satya Nadella, Altman addressed concerns regarding spending, stating “enough is enough” when prompted by host Brad Gerstner about the disparity between OpenAI’s revenue and its infrastructure costs.

Altman claimed that OpenAI generates revenue “well above” the reported $13 billion but did not disclose specific figures. He added: “Enough is enough…I believe there are many who wish to invest in OpenAI shares.”

Analysts at Morgan Stanley have forecast that global data center investment will approach $3 trillion from now until 2028, with half of this spending expected to come from major U.S. tech firms, while the remainder will be sourced from private credit and other avenues. The private credit market is an expanding segment of the shadow banking industry, raising concerns for regulators such as the Bank of England.



Source: www.theguardian.com

Amazon Sees Biggest Cloud Growth Since 2022 Following Major Outage

For the first time since its cloud computing unit suffered a significant failure that disrupted services from smart beds to banks, Amazon has reported its financial results.

Despite the global outage, Amazon Web Services (AWS) continues to thrive, reporting 20% year-over-year revenue growth for the quarter. Wall Street analysts had predicted AWS would generate net revenue of $32.42 billion in the third quarter; Amazon’s actual reported figure was $33 billion.

“AWS is growing at a rate not seen since 2022,” CEO Andy Jassy mentioned in a statement during the earnings call.


Following the third-quarter earnings report that exceeded analysts’ forecasts, the company’s stock surged by approximately 9% in after-hours trading.

The earnings announcement underscored Amazon’s ambition to compete more effectively with corporations that have successfully capitalized on the AI boom. Amazon’s stock performance has trailed behind some major tech competitors, and its e-commerce operations are particularly vulnerable to the far-reaching and unpredictable tariff policies of the Trump administration compared to companies driven by software.

Valued at roughly $2.4 trillion, Amazon significantly outperformed Wall Street’s expectations, largely due to the expansion of its cloud computing services. Analysts had anticipated earnings of $1.58 per share on net sales of $177.82 billion, whereas Amazon announced sales of $180.17 billion and earnings per share of $1.95.

AWS is facing mounting rivalry from alternative providers like Google Cloud and Microsoft Azure, the latter of which has established a partnership with OpenAI and reported robust growth in its cloud segment, boosting its stock prices.

Nevertheless, AWS remains a crucial component of the modern Internet, and the extent of its influence was inadvertently highlighted earlier this month when a glitch in its cloud services rendered websites, apps, cutting-edge products, and critical communication systems, including electronic health records, inoperable. The outage affected millions and lasted several hours, revealing how integral Amazon’s services are to everyday life.

During the earnings call, Amazon executives promoted the integration of AI tools like shopping assistant Rufus into its services. They also discussed Zoox’s plans to expand its robotaxi business, with self-driving service trials scheduled to commence in Washington, D.C., later this year.

Earlier this week, Amazon announced plans to cut 14,000 jobs at its headquarters, with more layoffs anticipated across the organization. This decision was publicly communicated through a blog post titled “Staying Agile and Continuing to Strengthen Our Organization,” which cited advancements in AI as a key reason, stating that the company aims to “function like the world’s largest startup.”

“We must remember that the world is rapidly evolving,” the Amazon post noted. “This generation of AI represents the most transformative technology since the Internet, allowing businesses to innovate faster than ever before.”


Jassy indicated in a blog post earlier this year that the company’s investments in AI would lead to a “reduction in personnel for some roles currently held.”

However, during a conference call with investors, Jassy clarified that the significant layoffs were not driven by AI, asserting that they stemmed from “culture” and that the company is focusing on a more flexible, startup-like approach.

“The announcement we made a few days ago wasn’t primarily financially driven, and it’s not primarily AI-driven either. It’s fundamentally about our culture,” Jassy stated.



Source: www.theguardian.com

Webb Observes Sagittarius B2 in an Interstellar Molecular Cloud

Positioned at the core of the Milky Way galaxy, Sagittarius B2 is an immense molecular cloud of gas and dust, boasting around 3 million times the mass of the sun.



The Sagittarius B2 molecular cloud, where stars, gas, and cosmic dust shimmer in near-infrared light, as captured by Webb’s NIRCam instrument. Image credit: NASA / ESA / CSA / STScI / A. Ginsburg (University of Florida) / N. Budaiyev (University of Florida) / T. Yu (University of Florida) / A. Pagan (STScI).

The distance from Earth to Sagittarius B2 is roughly 27,000 light years, while it sits just 390 light years from the center of the Milky Way.

This is the largest and most active star-forming cloud in our galaxy, producing half of the stars birthed in the central region even though it contains merely 10% of the star-forming material in that area.

“Sagittarius B2 is situated just hundreds of light years away from the supermassive black hole located at the galaxy’s center, right at the heart of star formation.”

“Webb’s infrared observations can penetrate some of the dense clouds present, uncovering young stars alongside the warm dust enveloping them.”

“Examining Webb’s findings aids in unraveling the long-standing enigma surrounding the star formation process and why Sagittarius B2 generates stars at a rate surpassing other galaxy centers.”

“Interestingly, one of the most striking elements of Webb’s imagery of Sagittarius B2 is the regions that remain dark.”

“These seemingly vacant areas of space are so tightly packed with gas and dust that even Webb cannot see through them.”

“These dense clouds are the progenitors of future stars and are too young to emit light themselves.”



Webb’s MIRI (Mid-Infrared Instrument) view of the Sagittarius B2 region in mid-infrared light, revealing bright, warm dust. Image credit: NASA / ESA / CSA / STScI / A. Ginsburg (University of Florida) / N. Budaiyev (University of Florida) / T. Yu (University of Florida) / A. Pagan (STScI).

With the high resolution and sensitivity of Webb’s MIRI (Mid-Infrared Instrument), this area has been revealed in remarkable detail, showcasing luminous cosmic dust heated by a massive, young star.

The red area labeled Sagittarius B2 North (located to the right in these Webb images) is among the most molecularly abundant regions known, yet astronomers have never observed it with such clarity before.

The difference lies in the longer wavelengths MIRI detects, even within the infrared spectrum, and the contrast between images from Webb’s MIRI and NIRCam (Near-Infrared Camera) makes it evident.

“The luminous gas and dust emerge dramatically in mid-infrared light, though everything except for the brightest stars vanishes from sight,” the astronomer noted.

“In contrast to MIRI, vibrant stars take center stage in Webb’s NIRCam images.”

“Further investigations into these stars will yield insights into their masses and ages, aiding astronomers in comprehending the intricacies of star formation within this dense, dynamic galactic core.”

“Has this activity persisted for millions of years? Or has an unknown process triggered it recently?”

“We anticipate that Webb will illuminate the reasons behind the disproportionate star formation at the centers of galaxies.”

“While there is ample gas throughout this region, its overall star-forming productivity is not on par with that of Sagittarius B2.”

“Sagittarius B2 contains only 10% of the galaxy-centric gas but is responsible for 50% of the stars.”
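Taken together, the two shares quoted imply that Sagittarius B2 converts gas into stars roughly five times more efficiently than the galactic-centre region as a whole. A one-line check:

```python
# The quoted shares imply a star-formation efficiency roughly five times
# the average for the galactic-centre region.
gas_share = 0.10    # Sagittarius B2's share of the region's gas (from the article)
star_share = 0.50   # its share of the stars formed there (from the article)

relative_efficiency = star_share / gas_share
print(relative_efficiency)
```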

Source: www.sci.news

Hubble Discovers Cloudy Star Clusters in the Large Magellanic Cloud

A stunning new image captured by the NASA/ESA Hubble Space Telescope reveals a star cluster known as N11, located within the Large Magellanic Cloud.



This Hubble image depicts star cluster N11. Image credits: NASA/ESA/Hubble/C. Murray/J. Maíz Apellániz.

“This scene is part of the Large Magellanic Cloud, a dwarf galaxy situated approximately 160,000 light-years away in the constellations Dorado and Mensa,” the Hubble astronomer stated.

“With a mass equivalent to 10-20% of that of the Milky Way, the Large Magellanic Cloud is the most substantial of the numerous small galaxies orbiting our galaxy.”

“The Large Magellanic Cloud hosts several significant stellar nurseries, where gas clouds like those portrayed in this image converge to form new stars.”

This latest Hubble image illustrates a segment of N11, the second-largest star-forming region within the Large Magellanic Cloud.

“The Tarantula Nebula, which ranks as the largest and most active star-forming region in the Large Magellanic Cloud, is a frequent target for Hubble,” the astronomer noted.

“We observe bright young stars illuminating gas clouds and sculpting masses of dust using their powerful ultraviolet rays.”

“This image represents observations spaced about 20 years apart, highlighting Hubble’s enduring legacy,” they added.

“The initial observations took place between 2002 and 2003 and provided exceptional sensitivity and resolution with what was then new technology, the Advanced Camera for Surveys.”

“We directed Hubble towards the N11 star cluster and accomplished something unprecedented: cataloging all the stars in the young cluster, from those with 10% of the Sun’s mass up to those 100 times more massive.”

“The subsequent observations utilized one of Hubble’s newer instruments, the Wide Field Camera 3.”

“These new images highlighted the dusty clouds that fill the cluster, offering a fresh perspective on cosmic dust.”

Source: www.sci.news

Microsoft Probes Israeli Military’s Use of Azure Cloud Storage

Microsoft is examining how Unit 8200, the Israeli military surveillance unit, utilizes the Azure Cloud Storage platform.

Top executives are quickly assessing the data that Unit 8200 stores on Azure following a Guardian investigation that disclosed how intelligence agencies can amass extensive records of intercepted Palestinian mobile communications through cloud services.

Collaborative inquiries with Israeli and Palestinian media outlets, including +972 Magazine and the Hebrew publication Local Call, found that Unit 8200 employs a specialized area within Azure that archives millions of daily calls from Gaza and the West Bank.

Sources linked to the investigation indicate that vast amounts of data from this cloud-based repository are being leveraged to pinpoint and identify bombing targets in Gaza.


The report, released on Wednesday, raised alarms among Microsoft’s leadership regarding whether some employees in Israel fully disclose their knowledge of Unit 8200’s use of Azure, according to insiders.

Insiders say internal efforts are under way to assess the type of data stored in Microsoft’s data centers, and to reconsider how this information is used by Israeli forces in the ongoing war in Gaza.

In May, Microsoft acknowledged that a review of its work with the Israeli military had found “no evidence to date” that Azure was “used to target or harm people” in Gaza. These findings reportedly relied in part on assurances provided by Israeli staff.

However, some senior executives at the company’s U.S. headquarters have recently started to doubt the accuracy of the information provided by certain Israeli employees managing their military relationships.

A source familiar with internal discussions noted that management struggled to verify some of the claims from Israeli staff, questioning whether employees felt they were beholden to their military rather than their employer.

Using leaked documents, the Guardian discovered multiple employees associated with managing the Unit 8200 project, identifying them as reservists for the elite wiretapping unit, akin to the U.S. National Security Agency.

Despite these concerns, senior executives have yet to reopen the official review of the company’s ties with the Israeli military conducted earlier this year, which was spurred by reports highlighting the military’s reliance on Microsoft technology during attacks in Gaza.

A Microsoft spokesperson stated, “As indicated in a prior independent investigation, the company takes these allegations seriously. Upon receiving new information, we are committed to verifying it and taking appropriate actions.”

According to documents reviewed by the Guardian, Microsoft executives are aware of plans to transition a substantial amount of sensitive, classified intelligence data to Azure as the company collaborates with the unit to implement advanced security measures in its cloud services.

Nevertheless, the company maintains that executives did not know Unit 8200 was using Azure to retain intercepted Palestinian communications; the spokesperson said earlier this week that Microsoft has no insight into the data stored within a customer’s cloud environment.

In a statement following the investigation, a spokesperson for the Israel Defense Forces (IDF) said the military values Microsoft’s support on cybersecurity, but that Microsoft is not a partner of the IDF in data storage or processing and does not collaborate with it on such work.

Several sources within Microsoft indicated that the IDF’s statement surprised the company’s leadership, since it is no secret that Microsoft provides cloud storage to the military under a contract with the Israeli Ministry of Defense.

After the disclosure of the military’s reliance on Microsoft, a worker-led campaign called No Azure for Apartheid issued a series of demands, including an appeal for the company to sever all of its publicly known ties with the Israeli military.

Abdo Mohamed, an organizer with the campaign who was terminated by the company last year, said that Microsoft CEO Satya Nadella and other executives have maintained that, when entering the partnership in 2021, the company did not anticipate benefiting from the suffering of the Palestinian people.

Source: www.theguardian.com

Hubble Observes Reflection Nebula in the Taurus Molecular Cloud

Astronomers employing the NASA/ESA Hubble Space Telescope have captured an impressive image of the reflection nebula GN 04.32.8.



This Hubble image showcases GN 04.32.8, a reflection nebula located in the constellation Taurus, approximately 480 light years away. The color composite was assembled from images captured in visible and near-infrared light, using data obtained through two filters; a different hue is assigned to the monochromatic image from each filter. Image credit: NASA/ESA/Hubble/G. Duchêne.

GN 04.32.8 is situated around 480 light years away in the constellation of Taurus.

Also referred to as DG 41, it forms a small segment of the stellar nursery known as the Taurus Molecular Cloud.

“Reflection nebulae are clouds of dust in space that, unlike other types of nebulae, do not produce their own light,” the astronomers stated.

“Light from nearby stars illuminates them by scattering off the dust.”

“Due to this scattering effect, many reflection nebulae, including GN 04.32.8, appear blue.”

GN 04.32.8 is illuminated by a trio of bright stars at the heart of the Hubble image, chief among them the variable star V1025 Tauri.

“One of the stars partially overlapping the nebula is another variable star, HP Tau, a T Tauri star like many of the other young stars in the broader Taurus molecular complex,” the astronomers explained.

“It’s not unusual for T Tauri stars to inhabit such prolific stellar nurseries, as they are very dynamic and turbulent in their early evolutionary stages.”

“The three stars are known as HP Tau, HP Tau G2, and HP Tau G3 and are believed to be gravitationally bound, forming a triple system.”

“Observers with keen eyes might spot a small, squashed orange spot just outside the center, beneath the nebula, crossed by dark lanes,” the researchers noted.

“This marks a newly formed protostar, concealed within a protoplanetary disk that obscures some of its light.”

“Disks like these are ideal subjects for study because they are oriented edge-on from our viewpoint.”

“We are utilizing Hubble to investigate this phenomenon and to glean insights about the types of exoplanets that could form within such disks.”

Source: www.sci.news

ALMA Detects Molecular Activity in the Largest Known Oort Cloud Comet

Astronomers using the Atacama Large Millimeter/submillimeter Array (ALMA) have observed comet C/2014 UN271 (Bernardinelli-Bernstein).



Artist’s depiction of comet C/2014 UN271. Image credit: NSF/AUI/NRAO/M. Weiss.

C/2014 UN271 was identified by astronomers Pedro Bernardinelli and Gary Bernstein in images captured in 2014 as part of the Dark Energy Survey.

The comet spans approximately 140 km (85 miles) in diameter, making it the largest comet known, roughly ten times the size of a typical comet.

To date, little has been understood about the behavior of such cold, remote celestial bodies.

Recent ALMA observations revealed intricate, dynamic jets of carbon monoxide gas erupting from the comet’s nucleus, offering the first direct evidence of what drives its activity so far from the sun.

“With these measurements, we can begin to understand how this massive, icy world works,” said Dr. Nathan Roth, an astronomer at American University and NASA’s Goddard Space Flight Center.

“We are observing explosive outgassing patterns that raise new inquiries about the comet’s evolution as it journeys deeper into the inner solar system.”

ALMA observed C/2014 UN271 by detecting light from carbon monoxide gas in its atmosphere, along with thermal radiation from its nucleus, while the comet was still far from the sun.

Thanks to ALMA’s exceptional sensitivity and resolution, the scientists were able to isolate the very faint signals emitted by such a cold, distant object.

Building on previous ALMA observations that characterized the comet’s substantial nucleus, the new findings used the thermal signal to refine estimates of the comet’s size and of the dust surrounding its nucleus.

Their measurements of the nucleus size and dust mass are in line with earlier ALMA observations, affirming it as the largest Oort cloud comet ever identified.

ALMA’s precision in measuring these faint signals enabled this research, providing a clearer picture of this distant, icy giant.

“This discovery not only represents the first detection of molecular outgassing in a record-setting comet but also offers a rare glimpse into the chemical and dynamic processes of objects from the farthest reaches of the solar system,” the astronomer added.

“As C/2014 UN271 approaches the sun, we anticipate more frozen gases will start to evaporate, revealing further insights into the comet’s primitive composition and the early solar system.”

“Such discoveries help address fundamental questions about the origins of Earth and its waters, as well as the formation of life-supporting environments elsewhere.”

The team’s research paper was published in the Astrophysical Journal Letters.

____

Nathan X. Roth et al. 2025. First detection of molecular activity in the largest known Oort cloud comet: ALMA imaging of C/2014 UN271 (Bernardinelli-Bernstein). ApJL 986, L22; doi: 10.3847/2041-8213/add526

Source: www.sci.news

Trekking the Cactus to Clouds Route in Palm Springs, California

A steep trail near the top of the aerial tram in Palm Springs was carpeted ankle-deep with spongy fallen needles and wind-strewn pine cones. It was also shady, and although the going was tough, it felt wonderful after the first seven miles of the Cactus to Clouds hike, which had offered little birdsong and none of the restorative balm of leaves.

I had already climbed 7,549 vertical feet, with about 3,000 more to go to reach the top of Mount San Jacinto, the granite mountain that towers just west of Palm Springs, California.

This was my third attempt at the Cactus to Clouds hike, and the mid-November day I had chosen brought perfect conditions. The 22-mile route rises from the desert floor to a 10,834-foot peak, then descends six miles and 2,400 vertical feet to the tram station, where a $14 ride carries you the rest of the way down.

I started trekking at an elevation of 482 feet near the Palm Springs Museum just after sunrise, at 6:41 a.m., carrying enough water to reach the first water source, a ranger station at 8,400 feet, and packing several jackets to cope with the wild temperature swings between bottom and top.

People have many reasons to attempt Cactus to Clouds. It is one of the most biodiverse hikes in the country, and a rare wilderness experience on the edge of an urban area. But more than anything, there is the pure audacity of the hike. Each of my previous ascents was motivated by the idea that, after a diagnosis of multiple sclerosis, a divorce and the completion of treatment for stage 3 breast cancer, committing to this challenge proved I was not feeling sorry for myself. And if I made it? Well, that was proof that I could handle anything.

This time my MS was in remission, my marriage was fulfilling, and I had been cancer-free for almost nine years. More than 15 years after my first Cactus to Clouds, the hike was finally about the landscape, not me. I had tucked pocket guides to the plants of the San Jacinto Mountains into my pack and planned to take time to stop and smell the cedars.

The full Cactus to Clouds route suits only the very fit, but you can also hike just a section of it. Walk a mile from the trailhead to the picnic area and turn back. Or take the tram to the top station and climb from there to the summit of Mount San Jacinto. You can also trek to the top of the tram, ride it down and save the summit for another day, as I did this time because of sudden leg pain. Each option offers endless views of the Coachella Valley, along with cactus wrens, Cooper’s hawks and golden eagles.

Mount San Jacinto’s height is not particularly remarkable, but the difference in altitude from base to summit is. The mountain rises approximately 10,400 feet over roughly six horizontal miles. That abrupt relief explains the unusual diversity of plant life, which passes through four life zones, and it makes the ascent roughly the climbing equivalent of Mount Everest.

The trail begins among desert species such as barrel cactus and creosote, then climbs through chaparral, scrub oak and manzanita. After that come mixed pine forests. At the summit, the trees are shorter and more huddled, stunted by altitude and the elements; limber pines thrive here in places where almost nothing else can.

The environment and the life it supports look very different at the top of Mount San Jacinto than in the Colorado Desert section of the Sonoran below, Davis, a researcher at the Center for Conservation Biology at the University of California, Riverside, told me a few days before the hike. But the most interesting areas, she said, are the transitions between zones. “That’s where the most biodiversity is.”

Davis was right. Between 7,000 and 8,000 feet, the greenery erupted into an incredible array of shapes, textures, hues and sizes. It was like hiking through a bowl of mixed salad greens: cedar, manzanita, oak and agave, with Jeffrey pines adding a hint of butterscotch to the air.

The conditions that make Cactus to Clouds prominent also make it dangerous. Hikers on the trail have died of dehydration and exposure, about one a year, said Eric Holden, a volunteer with Riverside Mountain Rescue, one of four search and rescue teams that respond to hikers in trouble on the route.

To protect both hikers and rescuers, Mount San Jacinto State Park closes parts of the route in summer (triple-digit temperatures at the bottom) and in winter (snow and icy conditions at the top). In 2024, the trail closed in early July and reopened on November 4.

There is little shade for the first 7,000 feet and no water for the first ten miles. One of the biggest killers, Holden said, is hikers who realize too late that they lack the fitness for the climb: it might be cool at a 3 a.m. start, but by the time they feel unwell they are hiking in temperatures that can climb above 100 degrees.

In winter, snow can make the trail difficult to follow. In particular, in the last 1,000 feet below the top of the tram, hikers have become stranded on cliffs. I stayed on the path through this section by following my GPS tracks from previous Cactus to Clouds hikes.

Backpacker magazine has called Cactus to Clouds the fifth-hardest day hike in America. Most hikers take 12 to 16 hours. I tried to distract myself from the relentless climb by identifying exactly which species of manzanita was scratching my legs, but it only did so much to divert me. When I finally sat down, it was on a flat rock 6,000 feet above the trailhead, next to either a pinkbracted or a greenleaf manzanita.

At least I was hiking uphill. Most veteran hikers understand that while ascending is hard work, going downhill is the real punishment. Therein lies part of the appeal of Cactus to Clouds: much of the descent happens on the tram.

“We were dedicated hikers and were always looking for challenges,” said Sue Birnbaum, one of the six members of the Coachella Valley Hiking Club who in 1993 became the first people to complete what is now the annual Cactus to Clouds Challenge.

The challenge linked three existing trails, the Museum, Skyline and Mount San Jacinto Peak Trails, into an extraordinary super day hike crossing a mixture of local, state and federal land and property of the Agua Caliente Indian Reservation, ending at the tram, which has been operating since 1963.

One paradox of Cactus to Clouds: it is better if there are no clouds, which obscure the scenery and sometimes the trail itself. This ascent held a different kind of paradox for me, too. It was the first time I had skipped the summit leg of the trail, yet as I listened to my body and spared my legs, I felt a surge of pride.

Against a panorama of the distant Pacific, the sparkling Salton Sea and the fearsome sacred peak of Tahquitz, spectacular cauliflower-shaped clouds floated thousands of feet below. About 90 minutes before sunset, they caught and reflected the shades you would usually see in a scoop of mango ice cream.

Despite the warm orange radiating across the sky, the summit was freezing and raked by gusts of wind. I was bundled in three jackets, including a hooded puffy coat, and would have happily accepted another. So as much as I wanted to linger, I wanted even more to avoid frostbite.

I picked my way off the rocks back onto the trail and turned toward the tram, passing a cluster of shrubs I had identified as bush chinquapin. My pocket guide said the fruit tastes like chestnuts. The next time I do Cactus to Clouds, I will be sure to stop and try it.

The trailhead is just north of the Palm Springs Museum, approximately 6.5 miles from the base of the tram. After the hike, you can use a ride service such as Uber or Lyft to return to the trailhead parking lot.

Source: www.nytimes.com

Study reveals varied cloud layers, temperature hot spots and shifting carbon chemistry on a world beyond our solar system

New observations from the NASA/ESA/CSA James Webb Space Telescope reveal three distinct atmospheric features (clouds, hot spots and changes in carbon chemistry) on the rapidly rotating, free-floating planetary-mass object SIMP J013656.5+093347.

Artist’s impression of SIMP 0136. Image credit: NASA/ESA/CSA/J. Olmsted, STScI.

SIMP J013656.5+093347 (SIMP 0136 for short) is a rapidly rotating, free-floating object located just 20 light years from Earth.

It has a mass of about 13 Jupiter masses, does not orbit a star, and may instead be a low-mass brown dwarf.

Because it is isolated, SIMP 0136 can be observed directly, free of light contamination or variability caused by a host star.

Its short rotation period of just 2.4 hours also makes a full survey very efficient.

The team already knew that SIMP 0136 varies in brightness, said Allison McCarthy, a doctoral student at Boston University.

“We also thought there could be temperature variations, chemical reactions, and perhaps auroral activity affecting the brightness, but we weren't sure.”

Using Webb's NIRSpec instrument, the team captured thousands of individual spectra extending to 5.3 microns, one after another over more than three hours, as the object completed one full rotation.

Immediately afterward, Webb's MIRI instrument collected hundreds of measurements of light between 5 and 14 microns, one every 19.2 seconds, over another rotation.

The result was hundreds of detailed light curves, each showing the change in brightness of a very precise wavelength (color) as different sides of the object rotated into view.

“It was incredible to watch the entire spectrum of this object change over just a few minutes,” said Dr. Johanna Vos, an astronomer at Trinity College Dublin.

“Until now, we only had a small slice of the near-infrared spectrum from Hubble and a few brightness measurements from Spitzer.”

The astronomers noticed almost immediately that there were several distinct light-curve shapes.

At any given time, some wavelengths were growing brighter while others were dimming or not changing at all.

A number of different factors must be driving the brightness variations.

“Imagine looking at the Earth from afar,” said Dr. Philip Muirhead, also of Boston University.

“Looking at each color individually gives you different patterns that tell you something about the surface and the atmosphere, even if you can't make out individual features.”

“As the oceans rotate into view, blue increases; changes in brown and green tell you something about the soil and vegetation.”

To understand what could cause variability in SIMP 0136, the team used an atmospheric model to show where each wavelength of light is occurring in the atmosphere.

“The different wavelengths provide information about the different depths in the atmosphere,” McCarthy said.

“We began to realize that the wavelengths with the most similar light-curve shapes also probed the same depths, which reinforced the idea that they must be caused by the same mechanism.”
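The matching step McCarthy describes can be sketched as a simple correlation analysis: wavelength channels whose light curves rise and fall together are assigned to one group, on the assumption that they probe the same depth and mechanism. A minimal illustration in Python; the function, threshold and synthetic signals below are invented for demonstration and are not from the study:

```python
import numpy as np

def group_by_correlation(light_curves, threshold=0.9):
    """Greedily group channels whose light curves have Pearson
    correlation above `threshold`.
    light_curves: array of shape (n_channels, n_times)."""
    corr = np.corrcoef(light_curves)  # channel-by-channel correlation matrix
    groups = []
    unassigned = list(range(len(light_curves)))
    while unassigned:
        seed = unassigned.pop(0)
        group = [seed]
        # channels well correlated with the seed join its group
        for ch in unassigned[:]:
            if corr[seed, ch] > threshold:
                group.append(ch)
                unassigned.remove(ch)
        groups.append(group)
    return groups

# Synthetic demo: two underlying mechanisms (phase-shifted sinusoids)
t = np.linspace(0, 2.4, 200)                        # one 2.4-hour rotation
deep = 1 + 0.02 * np.sin(2 * np.pi * t / 2.4)       # "deep cloud" signal
high = 1 + 0.01 * np.sin(2 * np.pi * t / 2.4 + 2)   # "high altitude" signal
curves = np.array([deep, 1.5 * deep, high, 0.8 * high])
print(group_by_correlation(curves))  # → [[0, 1], [2, 3]]
```

Channels 0 and 1 share one curve shape and channels 2 and 3 share another, so they split into two groups, mirroring how shared light-curve shape is taken as evidence of a shared driving mechanism.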

For example, one group of wavelengths originates deep in the atmosphere, where there may be patchy clouds made of iron particles.

A second group comes from higher clouds, thought to be made of tiny grains of silicate minerals.

Both of these light curve variations are related to the patchiness of the cloud layers.

A third group of wavelengths originates at very high altitudes, far above the clouds, and appears to track temperature.

Bright hot spots may be associated with auroras previously detected at radio wavelengths, or with hot gas upwelling from deeper in the atmosphere.

Some light curves cannot be explained by clouds or temperature, but instead show variations related to atmospheric carbon chemistry.

There could be pockets of carbon monoxide and carbon dioxide rotating in and out of view, or chemical reactions altering the atmosphere over time.

“We don't fully understand the chemistry part of the puzzle yet,” Dr. Vos said.

“But these results are really exciting because they show that the abundances of molecules like methane and carbon dioxide can change from place to place and over time.”

“If you're looking at an exoplanet and have only one measurement, you need to consider that it may not be representative of the entire planet.”

The findings were published in the Astrophysical Journal Letters.

____

Allison M. McCarthy et al. 2025. JWST weather report from the isolated exoplanet analog SIMP 0136+0933: pressure-dependent variability driven by multiple mechanisms. ApJL 981, L22; doi: 10.3847/2041-8213/ad9eaf

Source: www.sci.news

Astronomers Detect Compelling Evidence of a Supermassive Black Hole in the Large Magellanic Cloud

The supermassive black hole at the heart of the Large Magellanic Cloud, a small satellite galaxy of the Milky Way, has a mass of approximately 600,000 solar masses.



Artist’s impression of a hypervelocity star ejected from the Large Magellanic Cloud (shown at right). If a binary star system gets too close to a supermassive black hole, intense gravity tears the pair apart: one star is captured in a tight orbit around the black hole, while the other is flung outward at extreme speed, often exceeding thousands of kilometers per second, becoming a hypervelocity star. The inset diagram illustrates this process: the orbital path of the original binary is displayed as an interwoven line, one star is captured by the black hole (near the center of the inset), and the other is ejected into space (bottom right). Image credit: CfA/Melissa Weiss.

“Our Milky Way’s halo contains a small number of stars moving faster than the local escape speed, on orbits that will carry them into intergalactic space,” said Dr. Jesse Han of the Harvard & Smithsonian Center for Astrophysics and colleagues.

“One mechanism for generating such hypervelocity stars is the Hills mechanism: when a close binary star system wanders near a supermassive black hole, one star can be captured while the other is ejected at speeds reaching thousands of kilometers per second.”

In their new study, the astronomers traced with high precision the paths of 21 hypervelocity stars in the Milky Way’s outer halo.

They classified these stars with confidence, finding that seven of them are consistent with having been flung out of the center of the Milky Way.

The other nine stars, however, are consistent with having been ejected from the center of the Large Magellanic Cloud, about 160,000 light years away from us.

“Cosmologically speaking, it's astonishing to realize there's another supermassive black hole just down the block,” Dr. Han said.

“Black holes are so stealthy that this one has been hiding under our noses this whole time.”

The researchers discovered the Large Magellanic Cloud’s black hole using data from ESA's Gaia mission.

They also drew on an improved understanding, recently obtained by other astronomers, of the dwarf galaxy’s orbit around the Milky Way.

“We knew these hypervelocity stars had been around for a while, but Gaia gave us the data we needed to figure out where they actually came from,” says Dr. Kareem El-Badry, an astronomer at Caltech.

“Combining these data with a new theoretical model of how these stars move, we made this incredible discovery.”

“The only explanation we can come up with for these data is the presence of a monster black hole in our neighboring galaxy,” said Dr. Scott Lucchini, an astronomer at the Harvard & Smithsonian Center for Astrophysics.

A paper reporting this finding will be published in the Astrophysical Journal.

____

Jiwon Jesse Han et al. 2025. Hypervelocity stars trace a supermassive black hole in the Large Magellanic Cloud. ApJ, in press; arXiv: 2502.00102

Source: www.sci.news

The Radcliffe Wave: the solar system’s past encounter with a massive cloud of gas and dust

A visualization of the Radcliffe Wave, a chain of dust and gas clouds (marked here) running through the Milky Way; the Sun, approximately 400 light years away, is marked in yellow

Alyssa A. Goodman/Harvard University

Our solar system passed through a vast wave of gas and dust about 14 million years ago, darkening Earth’s night sky. The wave may also have left a trace in our planet’s geological record.

Astronomers have previously discovered large, ocean-like waves of stars, gas and dust in the Milky Way that ripple up and down over millions of years. One of the closest and best studied is the Radcliffe Wave, about 9,000 light years long and only about 400 light years from the solar system.

Now, Efrem Maconi at the University of Vienna and his colleagues have discovered that the Radcliffe Wave was once far closer still, sweeping past the solar system 11 to 18 million years ago.

Maconi and his team used data from the Gaia space telescope, which has tracked billions of stars across the Milky Way, to identify recently formed groups of stars within the Radcliffe Wave and the dust and gas clouds from which they formed.

Using these stars, they traced the clouds’ orbits back in time to reveal their past locations and show how the entire wave has moved. They also calculated the solar system’s past path, rewinding the clock 30 million years, and found that the wave and our sun made their closest approach about 12 to 15 million years ago. It is difficult to pin down exactly when the crossing began and ended, but the team believes the solar system was within the wave’s extent around 14 million years ago.
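Rewinding orbits like this amounts to integrating the equations of motion backward in time within a model of the galaxy's gravitational potential. A toy sketch in Python, assuming a simple logarithmic (flat-rotation-curve) potential and made-up initial conditions rather than the team's actual Galactic model:

```python
import numpy as np

V0 = 220.0  # circular speed in km/s (a typical Milky Way value; illustrative)

def accel(pos):
    # Logarithmic halo potential Phi = V0^2 * ln|r| gives a flat
    # rotation curve; the acceleration points back toward the center.
    return -V0**2 * pos / np.dot(pos, pos)

def integrate(pos, vel, dt, steps):
    """Leapfrog (kick-drift-kick) integration; pass dt < 0 to rewind."""
    pos, vel = np.array(pos, float), np.array(vel, float)
    for _ in range(steps):
        vel += 0.5 * dt * accel(pos)
        pos += dt * vel
        vel += 0.5 * dt * accel(pos)
    return pos, vel

# A sun-like circular orbit at 8 kpc (time unit: kpc per km/s)
pos0, vel0 = [8.0, 0.0], [0.0, V0]

# Rewind the clock, then run forward again: leapfrog is time-reversible,
# so the forward pass recovers the present-day state, a basic sanity
# check on this kind of orbit reconstruction.
past_pos, past_vel = integrate(pos0, vel0, dt=-1e-4, steps=1000)
now_pos, _ = integrate(past_pos, past_vel, dt=1e-4, steps=1000)
print(np.allclose(now_pos, pos0))
```

The real analysis differs in using a realistic multi-component Galactic potential and the measured positions and velocities of the clouds and the sun, but the backward-integration idea is the same.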

This would have made Earth’s galactic environment darker than it is today, as we currently sit in a relatively empty region of space. “If we are in a dense region of the interstellar medium, that means that the light coming from the stars is dimmed,” says Maconi. “It’s like being in a foggy day.”

The encounter may have left evidence in Earth’s geological record by depositing radioactive isotopes in the crust, though given how long ago it happened, these would be difficult to detect, he says. Identifying such galactic encounters is useful, because explaining Earth’s geological record is an ongoing problem, says Ralph Schoenrich at University College London.

More speculatively, the crossing appears to have occurred during a period of cooling known as the mid-Miocene. Maconi says the two could be linked, but that this would be difficult to prove. Schoenrich thinks it is unlikely. “The rule of thumb is that geology outweighs the influence of the universe,” he says. Major climate change, he adds, requires much bigger drivers, such as moving continents or disrupting ocean currents.


Source: www.newscientist.com

Space Flower Spotted by Gemini South Telescope in the Large Magellanic Cloud

Using the Gemini South telescope, astronomers at NSF's NOIRLab have imaged a beautiful nebula around the young open cluster NGC 2040.

The nebula surrounding NGC 2040 resembles a vivid rose in this image taken with the Gemini South telescope. Image credit: International Gemini Observatory / NSF NOIRLab / AURA / J. Miller & M. Rodriguez (International Gemini Observatory/NSF NOIRLab) / T.A. Rector (University of Alaska Anchorage/NSF NOIRLab).

NGC 2040 lies about 160,000 light years from Earth in the constellation of Dorado.

This cluster, also known as Cal 60 or ESO 56-164, resides within the Large Magellanic Cloud, a satellite dwarf galaxy of the Milky Way.

NGC 2040 is a so-called OB association, a loose stellar grouping that typically contains between 10 and 100 stars of spectral types O and B.

“O- and B-type stars live short lives of just millions of years, burning very hot before exploding as supernovae,” NOIRLab astronomers said in a statement.

“The energy released by these massive stellar explosions feeds into the formation of NGC 2040's structures, while the ejected material seeds the growth of the next generation of stars.”

Most of the stars in our galaxy are thought to have been born in OB associations like NGC 2040.

“NGC 2040 is part of a larger structure of interstellar gas known as LH 88, one of the most active star-forming regions within the Large Magellanic Cloud,” the astronomers said.

“Thousands of new stars will be born in the region over the millions of years to come.”

“When the O and B stars end their lives in supernovae, they enrich the cluster with elements like carbon, oxygen and iron,” they said.

“Along with the rich hydrogen in the cluster, these elements provide the components necessary for the formation of new stars, planets, and perhaps life.”

“The bright stars seen in the image are widely separated, but their movements through space are similar, indicating that they have a common origin.”

“The layered, nebulous structure of LH 88 is the remains of stars that have already died.”

“The delicate petals of the rose were shaped by both supernova shock waves and the stellar winds of the O and B stars.”

Source: www.sci.news

Webb Observes Ethereal Herbig-Haro Object in the Taurus Molecular Cloud

Using the Near-Infrared Camera (NIRCam) and the Mid-Infrared Instrument (MIRI) aboard the James Webb Space Telescope, astronomers have taken a stunning image of an edge-on protoplanetary disk around the Herbig-Haro object HH 30, which lies in the dark cloud LDN 1551 in the Taurus Molecular Cloud.

This Webb/NIRCam/MIRI image shows the Herbig-Haro object HH 30.

A Herbig-Haro object is a small, bright patch of nebulosity associated with protostars in star-forming regions.

These structures were first observed in the 19th century by the American astronomer Sherburne Wesley Burnham, but were not recognized as a distinct type of emission nebula until the 1940s.

The first astronomers to study them in detail were George Herbig and Guillermo Haro, for whom they were later named.

Herbig-Haro objects form in very specific circumstances: hot gas ejected by a newborn star collides with surrounding gas and dust at speeds of up to 250,000 km per hour (155,000 mph), creating bright shock waves.

They come in a wide range of shapes, but the basic configuration is usually the same: twin jets of heated gas, ejected in opposite directions from a forming star, stream through interstellar space.

“HH 30 is an example in which this outflowing gas takes the form of a narrow jet,” the astronomers said.

“The source star sits at one end of the jet, hidden behind an edge-on protoplanetary disk that is illuminated by the star.”

Using Webb's instruments, the researchers investigated HH 30 in great detail.

They also analyzed data from the NASA/ESA Hubble Space Telescope and the Atacama Large Millimeter/submillimeter Array (ALMA).

“Long-wavelength data from ALMA trace the location of millimeter-sized dust grains in a narrow region in the disk's central plane,” they said.

“Short wavelength infrared data from Webb reveals the distribution of smaller dust grains.”

“These grains are one millionth of a meter across, about the size of a single bacterium.”

“Large dust grains are concentrated in the densest parts of the disk, while small grains are much more widely distributed.”

“Combined with ALMA's sharp radio-wavelength vision, Webb's observations show that large dust grains must migrate within the disk and settle into a thin layer,” they added.

“The formation of narrow, dense layers of dust is an important step in planet formation.”

“In these dense regions, dust grains clump together into pebbles and, ultimately, into planets themselves.”

“In addition to revealing the behavior of dust grains, the Webb, Hubble and ALMA images reveal several distinct structures nested within one another.”

“A high-speed jet of gas appears at a 90-degree angle from the narrow central disc.”

“The narrow jet is surrounded by a wider, cone-shaped outflow.”

“Enclosing the conical outflow is a broad nebula that reflects light from the young star embedded in the disc.”

“Together, these data reveal that HH 30 is a dynamic place, where tiny dust grains and huge jets alike play a role in the formation of new planets.”

The survey results will be published in the Astrophysical Journal.

____

R. Tazaki et al. 2025. JWST Imaging of Edge-on Protoplanetary Disks. IV. Mid-infrared Dust Scattering in the HH 30 Disk. ApJ, in press; arXiv: 2412.07523

Source: www.sci.news

Study finds that decreased cloud cover may lead to heatwaves and high temperatures

Overview

  • Global temperatures over the past two years have been even warmer than climate scientists expected.
  • A new study offers a possible reason: reduced cloud cover.
  • The study suggests that this decline may be a result of global warming, which could mean the Earth is heating up even faster than scientists thought.

Over the past two years, temperatures around the world have risen far more than scientists expected, raising the question of whether hidden climate dynamics lie behind the sudden change.

Last year brought the hottest summer on record, and 2024 was likely to be hotter still. Even after accounting for the expected effects of greenhouse gas pollution and El Niño (a natural pattern that generally raises temperatures), researchers could not explain roughly 0.2 degrees Celsius (0.36 degrees Fahrenheit) of the warming observed in 2023.

A new study offers a possible explanation: cloud cover has decreased over the past two years, meaning more sunlight now reaches and heats the Earth's surface rather than being reflected back into space.

The research, published in Science on Thursday, suggests that this dynamic, an overall decrease in the planet's albedo, is likely the cause of the temperature anomaly observed in 2023.
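For a sense of scale, the extra absorbed sunlight from a small albedo drop can be estimated from the globally averaged solar input. This is a back-of-envelope sketch with illustrative numbers, not values taken from the study:

```python
def albedo_forcing_w_per_m2(delta_albedo, solar_constant=1361.0):
    """Global-mean radiative forcing (W/m^2) from a drop in planetary albedo.

    Sunlight averaged over the whole sphere is solar_constant / 4
    (about 340 W/m^2); a lower albedo means that much more of it is
    absorbed instead of reflected back to space.
    """
    return (solar_constant / 4.0) * delta_albedo

# An illustrative (assumed) albedo drop of 0.001, i.e. 0.1 percentage point:
print(round(albedo_forcing_w_per_m2(0.001), 2))  # ~0.34 W/m^2
```

Even a few tenths of a watt per square metre sustained globally is why a seemingly tiny change in reflectivity can leave a measurable imprint on surface temperature.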

“This is broadly consistent with the recently observed further increase in absorbed solar radiation,” said study author Helge Goessling, a climate physicist at the Alfred Wegener Institute in Germany.

The expected cloud behavior in a warmer world is one of the most difficult aspects of the climate system to study and model. Answering questions about it will help scientists more accurately determine how sensitive the Earth is to greenhouse gas emissions.

If the decrease in low-level cloud cover is not a coincidence, it likely means the Earth is warming even faster than scientists thought.

“It’s not really clear yet how much of this variation is variability that will disappear again,” Goessling said. “This increases the likelihood of greater-than-expected warming.”

The new study is based on analysis of climate models and NASA satellite data on Earth’s reflectivity. It outlines three possible reasons for the decline in low cloud cover but draws no conclusions about how much each factor contributes.

One option is that natural processes temporarily deviate from normal, causing a decrease in cloud cover. For example, natural fluctuations may be causing sea surface temperatures to rise more than expected, thereby changing the physics of how clouds form.

The second possibility is a change in maritime transport regulations. In 2020, the International Maritime Organization imposed limits on the sulfur content allowed in marine fuels. Some scientists believe that reducing the number of sulfur particles polluting the atmosphere may have the unintended effect of suppressing ocean cloud formation.

“They act as condensation nuclei for clouds, so they can make clouds brighter and last longer,” Goessling said of the sulfur particles.

A third option is that unidentified feedback loops within the climate system are causing clouds to decrease due to global warming.

If the latter two possibilities turn out to be the main culprits, it would mean that the climate is more sensitive to anthropogenic pollution than many scientists thought, and that humanity is therefore closer to exceeding the emissions targets set by world leaders than previously realized. (The term “climate sensitivity” refers to how much the Earth would warm if the concentration of carbon dioxide and other greenhouse gases in the atmosphere doubled.)
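The parenthetical definition can be made concrete with the standard logarithmic approximation for CO2 forcing. The 3 °C-per-doubling value below is an assumed, illustrative sensitivity, not a figure from the study:

```python
import math

def warming_for_co2(c_ppm, c0_ppm=280.0, sensitivity_c=3.0):
    """Equilibrium warming (deg C) for a CO2 concentration c_ppm.

    Uses the standard logarithmic approximation: each doubling of CO2
    relative to the preindustrial level c0_ppm adds sensitivity_c degrees.
    sensitivity_c = 3.0 is an assumed illustrative value.
    """
    return sensitivity_c * math.log(c_ppm / c0_ppm, 2)

print(warming_for_co2(560.0))            # one doubling -> 3.0 by construction
print(round(warming_for_co2(420.0), 2))  # roughly today's concentration -> 1.75
```

Under this crude approximation, today's roughly 420 ppm would eventually commit the planet to about 1.75 °C of warming over preindustrial levels; a higher true sensitivity scales that number up proportionally, which is what is at stake in the cloud-feedback question.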

Still, many questions remain, said Zeke Hausfather, director of climate research at financial firm Stripe and a researcher at Berkeley Earth.

“It remains to be seen whether these changes in cloud behavior are short-term fluctuations that will return to more normal conditions over time, or whether they represent new and ongoing changes to the climate system,” he said by email.

According to the National Oceanic and Atmospheric Administration, the average surface temperature of land and ocean in 2023 was about 2.12 degrees Fahrenheit above the 20th century average.

Efforts by world leaders to reduce greenhouse gas emissions remain insufficient. Global temperatures are on track to rise by more than 3 degrees Celsius (5.4 degrees Fahrenheit) on average, far exceeding the 1.5 degrees Celsius (2.7 degrees Fahrenheit) goal set by the Paris Agreement.

Source: www.nbcnews.com

VLTI captures high-resolution images of red supergiant star in Large Magellanic Cloud

Astronomers using ESO’s Very Large Telescope Interferometer (VLTI) have taken a zoomed-in image of the dusty red supergiant star WOH G64.

This image, taken by the GRAVITY instrument on ESO’s Very Large Telescope Interferometer, shows the red supergiant star WOH G64. Image credit: ESO / Ohnaka et al., doi: 10.1051/0004-6361/202451820.

WOH G64 is located in the constellation Dorado, about 160,000 light-years away.

The star, also known as IRAS 04553-6825, 2MASS J04551048-6820298, or TIC 30186593, is part of the Large Magellanic Cloud, one of the smaller galaxies orbiting the Milky Way.

WOH G64 is approximately 2,000 times larger than the Sun and is classified as a red supergiant star.

“We discovered an egg-shaped cocoon that tightly surrounds this star,” said Dr. Keiichi Ohnaka, an astrophysicist at Andrés Bello University in Chile.

“We’re excited because this could be related to the rapid ejection of material from a dying star before it explodes into a supernova.”

“Astronomers have taken zoomed-in images of, and characterized, about two dozen stars in our own Milky Way galaxy, but countless stars in other galaxies are so far away that observing even one of them in detail has been very challenging.”

Artist’s reconstruction of the red supergiant star WOH G64. Image credit: ESO/L. Calçada.

Dr. Ohnaka and his colleagues have long been interested in WOH G64.

In 2005 and 2007, they used VLTI to learn more about the star’s properties and continued their research in the years since. However, the actual appearance of this star remained elusive.

To obtain the desired images, they had to wait for the development of one of VLTI’s second-generation instruments, GRAVITY.

After comparing the new results with other previous observations of WOH G64, they were surprised to find that the star had become fainter over the past decade.

Professor Gerd Weigelt, an astronomer at the Max Planck Institute for Radio Astronomy, said: “We found that this star has undergone significant changes over the past 10 years, giving us a rare opportunity to witness the life of a star in real time.”

During the final stages of their lives, red supergiant stars like WOH G64 shed their outer layers of gas and dust in a process that lasts thousands of years.

Dr Jacco van Loon, director of the Keele Observatory at Keele University, said: “This star is one of the most extreme of its kind, and any dramatic change could bring it closer to an explosive demise.”

“These ejected materials may also be responsible for the dimming and the unexpected shape of the dust cocoon around the star,” the astronomers said.

The new image shows that the cocoon is elongated, surprising researchers, who had expected a different shape based on previous observations and computer models.

They believe the cocoon’s egg-like shape could be explained either by the star’s shedding of material or by the influence of an as-yet-undiscovered companion star.

As the star dims, taking further close-up images will become increasingly difficult, even for VLTI.

Nevertheless, a planned upgrade to the telescope’s instrumentation, GRAVITY+, promises to change this soon.

“Similar follow-up observations with ESO’s instruments will be important for understanding what is happening inside this star,” said Dr. Ohnaka.

The team’s paper was published in the journal Astronomy & Astrophysics.

_____

K. Ohnaka et al. 2024. Imaging the innermost circumstellar environment of the red supergiant WOH G64 in the Large Magellanic Cloud. A&A 691, L15; doi: 10.1051/0004-6361/202451820

Source: www.sci.news

Microsoft’s Cloud Business Experiences Double-Digit Growth as AI Innovation Propels Company Forward

Microsoft reported better-than-expected profits on Wednesday, driven by growth in its Azure cloud business, as five of the “Magnificent Seven” tech giants reveal their quarterly results this week.

“AI-driven transformation is reshaping jobs, outputs, and workflows across all roles, functions, and business processes,” stated Satya Nadella, the company’s CEO, in a press release. Nadella mentioned on an earnings call that Microsoft’s AI business is set to surpass a $10 billion annual run rate next quarter, making it the fastest-growing business in company history to achieve this milestone.

Microsoft’s focus on artificial intelligence garnered attention, with significant investments in Azure, the company’s rapidly expanding division. According to a press release, the division’s revenue grew by 22%. A day earlier, Google’s parent company Alphabet reported a nearly 35% year-on-year growth in its cloud business, reaching $11.35 billion, surpassing analyst forecasts.

Nadella announced that Azure now boasts 39,000 customers, marking an 80% increase year over year. The company has established AI data centers in over 60 regions globally, and Azure-OpenAI usage has more than doubled in the last six months.

Microsoft’s stock price rose in after-hours trading. Earnings per share were $3.30, exceeding the anticipated $3.10, with revenue standing at $65.59 billion compared to the expected $64.51 billion.

Microsoft’s financial outlay has risen significantly with its emphasis on AI. On Wednesday, the company’s data center finance leases surpassed $108 billion in pre-commencement lease payments.

With soaring investments, Microsoft’s power requirements have grown sharply in recent years. As part of a project to power its extensive data center fleet, the company is backing the restart of Pennsylvania’s Three Mile Island nuclear power plant, known for a partial reactor meltdown in 1979, and has struck a deal to acquire all of the plant’s power generation capacity over the next two decades.

However, investors remain cautious about the significant AI bets made by tech giants and seek greater clarity on when these investments will yield returns. The “Magnificent Seven” companies – Alphabet, Amazon, Apple, Meta, Microsoft, Nvidia, and Tesla – with a combined market capitalization of $12 trillion and representing one-fifth of the S&P 500 index, have underperformed the market over the past quarter, with a cumulative 3.5% decline since July.

In a note to investors, Wedbush analyst Dan Ives characterized this quarter as a pivotal test for Microsoft and Azure amid heightened competition in the AI ecosystem.

“Our assessments of Microsoft this quarter are positive as we believe Redmond is taking the lead and accelerating Azure cloud deals with robust momentum into 2025 and beyond,” Ives remarked, referencing Microsoft’s headquarters location in Washington state. “We maintain an ‘outperform’ rating.”

Source: www.theguardian.com

Observation of the diffuse nebula NGC 261 in the Small Magellanic Cloud by the Hubble Space Telescope

NGC 261 is located within the Small Magellanic Cloud, one of the Milky Way's closest neighbors.



This Hubble Space Telescope image shows the diffuse nebula NGC 261, about 200,000 light-years from Earth in the constellation Tucana. Image credit: NASA / ESA / L.C. Johnson, Northwestern University / Gladys Kober, NASA and The Catholic University of America.

NGC 261 is a diffuse nebula located about 200,000 light-years away in the constellation Tucana.

The object, also known as Brook 42, ESO 29-12, and IRAS 00447-7322, was discovered on September 5, 1826 by Scottish astronomer James Dunlop.

“The ionized gas glowing from within this diffuse region characterizes NGC 261 as an emission nebula,” the Hubble astronomers said.

“The stars are so hot that they irradiate the surrounding hydrogen gas, giving the clouds a pinkish-red glow.”

The Hubble Space Telescope has turned its keen eye to NGC 261 to study how efficiently stars form within molecular clouds, extremely dense regions of gas and dust.

“These clouds are often composed of large amounts of molecular hydrogen and are the cold regions where most stars form,” the researchers explained.

“But molecular hydrogen is poorly radiative, making it difficult to measure this fuel for star formation in stellar nurseries.”

“Because molecular hydrogen is so difficult to detect, scientists instead track other molecules present within the molecular cloud.”

“The Small Magellanic Cloud contains a gas-rich environment of young stars, as well as traces of carbon monoxide, a molecule that correlates with hydrogen and is often used to confirm the presence of such clouds.”

The new composite image, taken with Hubble's Advanced Camera for Surveys (ACS) and Wide Field Camera 3 (WFC3), shows the southwestern part of the Small Magellanic Cloud, where NGC 261 resides.

“The combined powers of the ACS and WFC3 instruments allowed us to probe the star formation properties of the nebula through its carbon monoxide content at visible and near-infrared wavelengths,” the scientists said.

“This work helps us better understand how stars form in our host galaxy and in our Galactic neighbours.”

Source: www.sci.news

Microsoft surpasses sales expectations, yet stock price dips due to slow growth in cloud services

In its latest quarterly earnings report, Microsoft exceeded analysts’ expectations by reporting a 15% increase in revenue year over year on Tuesday. However, growth in Azure, the company’s flagship cloud-computing service, fell short, leading to a 7% drop in Microsoft shares during after-hours trading.

Expectations for the fourth-quarter earnings report were high, especially for cloud services: Azure revenue grew 29%, falling short of the 30% to 31% analysts had expected. The miss added to a recent decline in stock prices across major technology companies.

In the earnings report, CEO Satya Nadella aimed to instill confidence in the company’s performance.

Nadella stated in the earnings call, “This year’s strong performance demonstrates our innovation and the ongoing trust our customers have in Microsoft. As a platform company, we prioritize meeting our customers’ mission-critical needs at scale while leading in the AI era.”

Microsoft’s significant investments in artificial intelligence in recent years reflect a strategic move to dominate the tech industry with AI-enabled services. Backing ChatGPT developer OpenAI solidifies Microsoft’s position as a key player in commercializing generative AI.

Despite the growing questions surrounding the revenue potential of big tech companies’ pivot to AI, other factors like speculation about a Federal Reserve rate cut have helped calm investors as enthusiasm for big tech fades after a period of rising stock prices driven by AI optimism.

Microsoft faced challenges this month amid a global technology outage caused by a flawed software update from cybersecurity firm CrowdStrike affecting Windows systems. An unrelated outage on Microsoft’s Azure cloud service on Tuesday also caused network connectivity issues in multiple countries.

Source: www.theguardian.com

Webb finds hourglass-shaped molecular cloud surrounding protostar

Astronomers using the MIRI (Mid-Infrared Instrument) camera on the NASA/ESA/CSA James Webb Space Telescope have captured striking new images of the molecular cloud L1527.

L1527, shown in this image from Webb's MIRI instrument, is a molecular cloud enveloping the IRAS 04368+2557 protostar. The more diffuse blue light and filamentary structures in the image come from organic compounds called polycyclic aromatic hydrocarbons (PAHs), while the red in the center of the image is a thick layer of energetic gas and dust that surrounds the protostar. The white intermediate regions are a mix of PAHs, ionized gases, and other molecules. Image courtesy of NASA / ESA / CSA / STScI.

L1527, also known as LDN 1527, is located about 447 light-years from Earth in the constellation Taurus.

The young protostar, called IRAS 04368+2557, is embedded in a molecular cloud that is part of a star-forming region in the constellation Taurus.

IRAS 04368+2557 is a relatively young star, only 100,000 years old.

Given its age and brightness in the far-infrared, the star is likely a class 0 protostar, the earliest stage of star formation.

IRAS 04368+2557 has an edge-on disc with two misaligned parts.

The inner and outer parts of the disk have slightly different orbital planes and are connected at 40 to 60 AU (astronomical units) from the protostar, but the disk is point-symmetric with respect to the location of the protostar.

Webb's previous observations of L1527 with NIRCam (Near-Infrared Camera) allowed astronomers to peer into the region, where the molecular cloud and protostar appeared in opaque, vibrant colors.

Both NIRCam and MIRI show the effects of outflows that shoot out in opposite directions along the protostar's rotation axis as the protostar consumes gas and dust from the surrounding cloud.

These outflows take the form of bow shock waves relative to the surrounding molecular cloud and appear as filament-like structures throughout the molecular cloud.

They also energize, or excite, the material around them, causing the areas above and below them to glow, imprinting bright hourglass structures in the molecular cloud.

“But unlike NIRCam, which primarily images light reflected from dust, MIRI can probe how these outflows affect the thickest dust and gas in the region,” the Webb astronomers said in a statement.

“The blue region that takes up most of the hourglass represents carbonaceous molecules called polycyclic aromatic hydrocarbons.”

“The IRAS 04368+2557 protostar itself is shown in red, along with the dense mixture of dust and gas that surrounds it.”

“Meanwhile, MIRI revealed white regions just above and below the protostar, which are not as clearly visible in the NIRCam view.”

“This region is a mixture of hydrocarbons, ionized neon, and thick dust, indicating that the protostar pushes this material over great distances as it messily consumes gas and dust from the disc.”

“As IRAS 04368+2557 continues to age and emits energetic jets, it will consume, destroy or push aside much of this molecular cloud, and much of the structure seen here will begin to disappear.”

“Eventually, when the accumulation of mass stops, this impressive spectacle will come to an end and the star itself will become more clearly visible to optical telescopes.”

“Combining both near-infrared and mid-infrared analyses will shed light on the overall behavior of this system, including how the central protostar is influencing the surrounding region.”

“Other stars in Taurus, the star-forming region in which L1527 resides, may form in exactly this way, which could lead to the disruption of other molecular clouds, either preventing the formation of new stars or promoting their development.”

Source: www.sci.news

Our Solar System passed through a frigid interstellar cloud approximately 2 million years ago, new research reveals.

Cold, dense clouds in the Milky Way’s interstellar medium are about four to five orders of magnitude denser than their diffuse counterparts. A team of astronomers from Boston University, Harvard University, and Johns Hopkins University has found evidence that two to three million years ago our solar system encountered one of these clouds, which may have been dense enough to disrupt the solar wind.



Opher et al. mapped the interstellar material through which the Sun has traveled over the past few million years, indicating the presence of cold, dense clouds that could have had dramatic effects on the heliosphere. Image credit: NASA/JPL-Caltech.

Most stars generate winds that move through the surrounding interstellar medium.

This motion creates a cocoon that protects the planet from interstellar material. The Sun’s cocoon is the heliosphere.

It’s made up of a constant stream of charged particles called the solar wind, which extends far beyond Pluto, enveloping the planets in what astronomers call a “local bubble.”

It protects us from radiation and galactic rays that can alter DNA, and scientists think it’s part of the reason why life on Earth evolved.

A cold interstellar cloud compressed the heliosphere, temporarily placing Earth and other planets in the solar system outside of its influence, according to a new study.

“Our paper is the first to quantitatively show that there was an encounter between the Sun and something outside our solar system that affected Earth’s climate,” said Professor Merav Opher of Boston University.

“Stars move, and this paper shows that not only do they move, but they undergo dramatic changes.”

To study this phenomenon, Professor Opher and her colleagues essentially went back in time, using advanced computer models to visualize where the Sun, the heliosphere and the rest of the solar system were located two million years ago.

They also mapped the path of the Local Ribbon of Cold Clouds, a series of large, dense and very cold clouds made mainly of hydrogen atoms.

Their simulations showed that one of the clouds near the edge of the ribbon, the Local Lynx of Cold Clouds, may have collided with the heliosphere.

Had this happened, Earth would have been fully exposed to interstellar matter, in which gas and dust mix with elements left over from exploded stars, such as iron and plutonium.

Normally, the heliosphere filters out most of these radioactive particles, but without protection they could easily reach Earth.

This is consistent with geological evidence showing increased levels of the isotopes iron-60 and plutonium-244 in the oceans, the moon, Antarctic snow and ice cores from the same period, according to the paper.

This timing also coincides with temperature records indicating a cold period.

“It is rare for our cosmic neighbors outside our solar system to have an impact on life on Earth,” said Harvard University professor Avi Loeb.

“It’s exciting to discover that our passage through dense clouds millions of years ago may have exposed the Earth to much greater amounts of cosmic rays and atomic hydrogen.”

“Our findings open a new window into the evolution of life on Earth and its relationship with our cosmic neighbours.”

“External pressure from the Local Lynx of Cold Clouds could have continuously blocked the heliosphere for hundreds to millions of years, depending on the size of the cloud.”

“But as soon as Earth left the cold cloud, the heliosphere engulfed all the planets, including Earth.”

“It’s impossible to know exactly what effect the cold clouds had on the Earth, such as whether they caused ice ages.”

“But there are other cool clouds in the interstellar medium that the Sun likely encountered in its first few billion years.”

“And we’ll probably encounter many more over the next million years or so.”

The authors are currently working to determine where the Sun was 7 million years ago, and beyond.

Pinpointing the position of the Sun and cold cloud systems millions of years ago is made possible by data collected by ESA’s Gaia mission, which has produced the largest 3D map of the galaxy ever, showing in unprecedented detail how fast stars move.

“This cloud is certainly part of our past, and if we passed through something this massive, we would have been exposed to interstellar material,” Prof Opher said.

“This is just the beginning. We hope this paper opens the door to further exploration of how the solar system was influenced by outside forces in the ancient past, and how these forces may have shaped life on Earth.”

The paper was published today in the journal Nature Astronomy.

_____

M. Opher et al. A possible direct exposure of the Earth to the cold dense interstellar medium 2–3 Myr ago. Nat Astron, published online June 10, 2024; doi: 10.1038/s41550-024-02279-8

Source: www.sci.news

Three million years ago, a cosmic cloud left Earth exposed to interstellar space

Illustration of the protective bubble around the Sun (yellow dot) and Earth (blue dot)

Harvard Radcliffe Institute

Two to three million years ago, the solar system encountered galactic-scale turbulence and collided with dense interstellar clouds, potentially altering both the Earth's climate and evolution.

Only recently have researchers been able to map the Sun's orbit through the Galaxy, particularly in relation to the relatively dense hydrogen clouds that pass through the interstellar medium, the vast expanse of space between star systems.

Now, Merav Opher and her research team at Boston University in Massachusetts have found evidence that one of these clouds, the Local Lynx of Cold Clouds, likely intersected with the Sun's heliosphere.

The heliosphere is a protective cocoon or bubble formed by the solar wind pushing out to the edge of the solar system. Within the heliosphere, the planets are protected from the worst of the galaxy's radiation.

The new study proposes that as the solar system passed through the interstellar cloud, the heliosphere retreated from it and moved inward toward the Sun. The researchers think that the heliosphere may have shrunk so much that Earth was outside the protective cocoon provided by the solar wind, perhaps for around 10,000 years.

Opher and her colleagues used data from the European Space Agency's Gaia satellite to map the location of the dense, cold clouds and the Sun's past orbit.

Opher says the heliosphere's encounter with the cold cloud coincides with deposits of plutonium-244 and radioactive iron-60 in Antarctic ice, deep-sea cores and lunar samples. These elements, which originated in distant supernovae, would have been captured in interstellar clouds and deposited while Earth was outside the heliosphere.

“There are signs of an increase in these elements in records going back two to three million years, providing compelling evidence that the Sun did in fact pass through the cloud around two million years ago,” Opher says. “The exposure of Earth to a cloud of cold interstellar material and the associated increase in atmospheric hydrogen and radiation almost certainly had a major impact on Earth and its climate.”

Sarah Spitzer, a researcher at the University of Michigan, says the paper provides “compelling” evidence that the heliosphere encountered a much denser interstellar cloud two to three million years ago. As the solar system passed through that dense, cold cloud, Earth would have been outside the heliosphere and directly exposed to the interstellar environment, she says.

“Understanding this can teach us about the impact interstellar material has had on life on Earth in the past,” Spitzer says, “but it also helps us better understand the impact the heliosphere has on life on Earth today, what would happen if Earth were exposed to interstellar material again in the future, and when that might happen.”

Evan Economo, a researcher at Japan's Okinawa Institute of Science and Technology, says it is intriguing to consider how encounters in “our nearby space” could have influenced the environment experienced by life on Earth.

“The heliosphere is part of the extended environment experienced by life on the Earth's surface, influencing climate and radiation from space,” he says. “If we had been outside the heliosphere for a period of time, it could have altered the evolutionary trajectory of a wide range of life, including humans. Such connections are highly speculative at this point, but they provide us with new research directions.”


Source: www.newscientist.com

Sui collaborates with Google Cloud to drive Web3 advancement through improved security, scalability, and AI features

Chainwire, Palo Alto, CA, April 30, 2024

The collaboration is focused on addressing key Web3 challenges through data-driven insights, AI-powered development tools, and zero-knowledge proofs.

Sui, the layer-1 blockchain and smart contract platform created and launched by the core research team behind Facebook’s Libra and Diem projects, is working with Google Cloud, through a partnership with Mysten Labs, to support the future of Web3. This joint effort will focus on enhancing security, scalability, developer tools, and user experience across a variety of Web3 and AI-powered applications.

Key initiatives of this partnership include:

  • Data-driven dApps: By integrating Sui’s blockchain data into a BigQuery public dataset, Google Cloud strengthens its commitment to the Sui developer ecosystem, providing developers with powerful analytical tools, new insights, and the potential for innovative dApps.
  • AI-assisted development: Sui used Vertex AI, Google Cloud’s generative AI platform, trained on the Move programming language, to help Web3 developers with debugging and code generation. Mysten Labs is also using Google Cloud’s AI and cloud capabilities to create a new AI-based code-auditing tool that identifies security vulnerabilities in Rust, Move, TypeScript, and Solidity, helping teams quickly find and remediate vulnerable code.
  • Seamless user experience: Sui’s zkLogin technology is already demonstrating the potential of this collaboration by bridging the gap between traditional (Web2) and decentralized (Web3) applications. zkLogin is a cryptographic primitive that uses OAuth credentials from Google and other trusted Web2 platforms to enable frictionless authentication for crypto wallets and dApps on Sui.
  • Robust and scalable infrastructure: Sui leverages Google Cloud’s globally scalable infrastructure to strengthen network security and sustain high transaction throughput, ensuring high performance and seamless scalability.

“Working with Google Cloud allows us to accelerate the development of secure, scalable, and user-centric Web3 experiences,” said Evan Cheng, CEO and co-founder of Mysten Labs. “Google Cloud’s infrastructure and cutting-edge AI capabilities complement Sui’s unique technological strengths, enabling developers of all types to create next-generation decentralized applications on Sui.”

“Sui’s advanced blockchain technology and commitment to a user-friendly experience make it a valuable collaborator for Web3 and Web2 developers looking to deliver innovative applications that are easy to adopt,” said Amit Zavery, Vice President, General Manager, and Head of Platform at Google Cloud.

Google Cloud is also committed to supporting the Sui developer community through its Web3 startup program. The initiative is designed to provide resources such as Google Cloud credits, access to a Discord channel with Web3 experts, foundation grants, and global events, all of which will accelerate the development of innovative applications on Sui.

About Sui

Sui is a first-of-its-kind Layer 1 blockchain and smart contract platform designed from the ground up to make digital asset ownership fast, private, secure, and accessible to everyone. An object-centric model based on the Move programming language enables parallel execution, sub-second finality, and rich on-chain assets. With horizontally scalable processing and storage, Sui supports a wide range of applications at low cost and unmatched speed. Sui is a step-function advancement in blockchain and a platform that allows creators and developers to build amazing user-friendly experiences. Learn more: https://sui.io

About Mysten Labs

Mysten Labs is a team of leading experts in distributed systems, programming languages, and cryptography, whose founders were senior executives and lead architects of pioneering blockchain projects. Mysten Labs’ mission is to create the foundational infrastructure for Web3. Learn more: https://mystenlabs.com

About Google Cloud

Google Cloud is the new way to the cloud, offering AI, infrastructure, developer, data, security, and collaboration tools built for today and tomorrow. Google Cloud provides a powerful, fully integrated, and optimized AI stack with unique global infrastructure, custom-built chips, generative AI models and development platforms, and AI-powered applications to help transform organizations. Customers in more than 200 countries and territories rely on Google Cloud as their trusted technology partner.

Contact

Global Communications Manager
Lexi Wangler
Mysten Labs
lexi.wangler@mystenlabs.com

Source: www.the-blockchain.com

UAE government denies cloud seeding occurred prior to Dubai floods

DUBAI, United Arab Emirates – The National Meteorological Center, the government task force responsible for cloud seeding missions in the United Arab Emirates, stated that they did not carry out any weather modification techniques in anticipation of severe storms that led to flooding in places like Dubai.

According to CNBC, the group confirmed that they did not deploy pilots for seeding operations before or during the storm that affected the UAE on that fateful Tuesday.

Omar Al-Yazidi, deputy director of the NCM, emphasized that no seeding operations were conducted during the storm, explaining that seeding must be carried out before rain falls; once severe thunderstorm conditions have developed, it is too late.

On Tuesday, the UAE experienced heavy rainfall. The NCM reported that Al Ain received 10 inches (254 mm) of rainfall, while Dubai saw more than 100 mm. Al Ain’s total is significantly higher than the average annual rainfall in the UAE, which ranges from 140 to 200 mm.
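The mixed units in these figures can be reconciled with a quick conversion (1 inch = 25.4 mm), which shows that the Al Ain total, unlike the reported Dubai figure, exceeds even the high end of the UAE's annual average:

```python
MM_PER_INCH = 25.4  # definition of the international inch

al_ain_mm = 10 * MM_PER_INCH   # 10 inches reported at Al Ain
dubai_mm = 100                 # "over 100 mm" reported in Dubai
annual_avg_mm = (140, 200)     # typical UAE annual rainfall range, in mm

print(f"Al Ain storm total: {al_ain_mm:.0f} mm")  # 254 mm
print(al_ain_mm > annual_avg_mm[1])               # True: more than a full average year
print(dubai_mm > annual_avg_mm[0])                # False: below even the low annual figure
```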

NCM’s statement contradicted previous reports suggesting that some of the rain was caused by cloud seeding. The process of cloud seeding has been a crucial aspect of the UAE’s efforts to combat water scarcity since the 1990s, with over 1,000 hours of cloud seeding conducted annually.

The increase in precipitation is attributed to climate change, with a study projecting a 30% increase in precipitation by 2080. This shift in weather patterns has posed challenges for the UAE’s infrastructure, especially in managing drainage systems during heavy downpours.

The UAE government issued warnings through the National Emergency Crisis and Disaster Management Authority, advising residents to adhere to safety guidelines due to the extreme weather conditions.

Despite the region’s unique weather patterns, the UAE’s drainage systems struggled to handle the unprecedented rainfall, leading to flooding in many areas, including Dubai Airport.

Source: www.nbcnews.com

Cloud Seeding: An Explanation and its Potential Role in the Dubai Floods

Driver abandons car after rainstorm in Dubai, United Arab Emirates, April 17

Christopher Pike/Bloomberg/Getty

Record rainfall has hit the Arabian Peninsula this week, causing flooding in Dubai, Abu Dhabi and other coastal cities in the United Arab Emirates. The extreme weather sparked speculation on social media that the UAE’s long-standing cloud seeding program may have played a role. However, cloud seeding almost certainly does not have a significant impact on flooding.

How unusual was the recent rain in the Arabian Peninsula?

According to the state-run Emirates News Agency, it was the most extreme rainfall event in the UAE since record-keeping began in 1949. From April 15 to 16, some parts of the country received more than their normal annual rainfall in a 24-hour period. Heavy rain in desert regions is uncommon, but not unheard of; the UAE saw heavy rain and flooding in 2016, for example.

Drainage systems in the UAE’s coastal cities were overwhelmed, causing flooding. Dramatic images of planes taxiing through standing water at Dubai International Airport have been widely shared online.

In neighboring Oman, at least 18 people died in flash floods. Parts of Bahrain, Qatar and Saudi Arabia also experienced unusual rainfall.

What is cloud seeding? Did it affect extreme rain?

Cloud seeding is a way of increasing precipitation that dates from around the 1940s. It involves dispersing particles such as silver iodide into clouds from aircraft or rockets, or burning them from stations on the ground. Supercooled water droplets form around these particles and fall to the ground as rain or snow.

Since 2002, the UAE has maintained one of the largest cloud seeding programs in the world. Planes regularly fly cloud-seeding missions in an effort to increase freshwater resources in arid regions.

Speculation that cloud seeding may be responsible for the recent rains was fueled when meteorologists at the UAE’s National Center of Meteorology (NCM) told Bloomberg News that planes had flown seeding missions over the country in the days before the storm. However, the NCM later stated that no seeding was carried out during the storm itself.

“We take the safety of our employees, pilots and aircraft very seriously,” the agency said. “The NCM does not conduct cloud seeding operations during extreme weather conditions.”

Even if cloud seeding had taken place during the storm, it would have had at most a small, localized effect on precipitation. The extent of the rainfall across several countries, together with the generally limited influence of cloud seeding, suggests it almost certainly did not play a significant role. “There is no technology that can create or even significantly alter this type of rainfall event,” said Maarten Ambaum at the University of Reading, UK, in a statement.

He noted that cloud seeding would have little impact on clouds that were already predicted to bring rain to the region. And that assumes that cloud seeding is effective at all.

“Many claims of successful cloud seeding are false, scientifically flawed, or actually fraudulent,” says Andrew Dessler at Texas A&M University. “This makes most atmospheric scientists very skeptical about cloud seeding.”

What weather factors were behind the rain?

The extreme precipitation was caused by a large storm known as a mesoscale convective system. Suzanne Gray, a researcher at the University of Reading, says these occur “when many individual thunderstorms coalesce to form a single large high-level cloud shield.”

Forecasters had predicted a high risk of flooding in the area for at least a week before the storm. Writing on X, Jeff Berardelli, a meteorologist at WFLA-TV in Florida, linked the storm to a blocking pattern created by a slow-moving jet stream.

Has climate change made rain worse?

Further analysis is needed to link this particular event to climate change, but climate change likely played a role.

“These types of heavy rainfall events are likely to become more extreme with climate change, as a warming atmosphere retains more water vapor,” Ambaum said. Rising temperatures can also alter atmospheric circulation patterns, changing where and how precipitation falls.

For example, a recent study found that the same type of storm that caused this extreme rainfall has occurred in the region 95 times since 2000, most frequently over the Arabian Peninsula in March and April. The study also found that the duration of these storms over the UAE has increased since 2000, which may be linked to rising temperatures.

A separate climate modeling study predicts that annual rainfall in the UAE will increase by 10 to 25 percent by mid-century, characterized by more intense precipitation events.


Source: www.newscientist.com

Hubble Observes Large Globular Cluster in Large Magellanic Cloud

Astronomers using the NASA/ESA Hubble Space Telescope have captured detailed images of the globular star cluster NGC 1651 in the constellation Mensa.

This Hubble image shows the globular cluster NGC 1651 about 162,000 light-years away in the constellation Mensa. Image credits: NASA / ESA / Hubble / L. Girardi / F. Niederhofer.

Globular clusters are densely packed spherical clusters of hundreds of thousands or even millions of stars.

They are among the oldest known objects in the universe and are preferentially associated with the oldest components of galaxies.

There are at least 150 such objects in our Milky Way, and several more may be hidden behind the galaxy's thick disk.

NGC 1651 is part of the Large Magellanic Cloud, the largest and brightest of the Milky Way's satellite galaxies.

First discovered by British astronomer John Herschel on November 3, 1834, this globular cluster is located approximately 162,000 light-years away in the constellation Mensa.

NGC 1651, also known as ESO 55-30 or LW 12, has a diameter of 120 light years.

“A remarkable feature of this image is that NGC 1651 nearly fills the entire frame, even though the globular cluster is only about 120 light-years in diameter,” Hubble astronomers said.

“In contrast, there are many Hubble images that feature entire galaxies, tens or even hundreds of thousands of light-years in diameter, that more or less fill the entire image.”

The color image of NGC 1651 was created from observations made with Hubble's Wide Field Camera 3 (WFC3) in the near-infrared and optical portions of the spectrum.

“A common misconception is that Hubble and other large telescopes can observe objects of vastly different sizes by zooming in, much as we would with a camera on Earth,” the astronomers said.

“However, while smaller telescopes may have the option to zoom in and out to some extent, larger telescopes do not.”

“Each telescope instrument has a fixed 'field of view' (the size of the area of the sky that can be observed in a single observation).”

“For example, the ultraviolet/visible light channel of WFC3, the channel and instrument used to collect the data in this image, has a field of view approximately one-twelfth the diameter of the Moon as seen from Earth.”

“That is the size of the region of sky WFC3 can capture each time it makes an observation.”

“There are two reasons why Hubble is able to observe objects with such widely different sizes,” the researchers said.

“First, the distance to an object determines how big it appears from Earth, so an entire galaxy that is relatively far away can take up the same amount of sky as a relatively nearby globular cluster like NGC 1651.”

“In fact, a distant spiral galaxy lurks just to the left of the cluster in this image. It's undoubtedly much larger than the cluster, but here it appears small enough to blend in with the foreground stars.”

“Second, multiple images across different parts of the sky can be mosaicked to create a single image of an object too large for Hubble's field of view.”
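The figures quoted above can be sanity-checked with the small-angle approximation. A minimal sketch, assuming the article's numbers (a 120-light-year cluster at 162,000 light-years) plus a Moon of roughly 31 arcminutes apparent diameter, a commonly quoted average that is not stated in the article:

```python
import math

def angular_size_arcmin(diameter_ly: float, distance_ly: float) -> float:
    """Apparent size in arcminutes via the small-angle approximation."""
    return math.degrees(diameter_ly / distance_ly) * 60

# NGC 1651: ~120 light-years across at ~162,000 light-years (figures from the article)
cluster = angular_size_arcmin(120, 162_000)

# WFC3 UVIS field of view: about one-twelfth of the Moon's ~31-arcminute diameter
wfc3_fov = 31 / 12

print(f"NGC 1651 apparent size: {cluster:.2f} arcmin")   # ~2.55 arcmin
print(f"WFC3 field of view:     {wfc3_fov:.2f} arcmin")  # ~2.58 arcmin
```

The two numbers nearly coincide, which is why the cluster almost fills a single WFC3 frame.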

Source: www.sci.news

Webb discovers unique helium cloud surrounding GN-z11 in its Halo

GN-z11 is an extremely bright galaxy seen as it was just 420 million years after the Big Bang, making it one of the oldest and most distant galaxies ever observed.

This two-part graphic shows evidence of a clump of helium gas in the halo surrounding the galaxy GN-z11. The small box at the top right shows the position of GN-z11 on the sky; the box in the center shows a magnified image of the galaxy. The left-most box shows a map of helium gas in GN-z11's halo, including a clump that is not visible in the infrared colors shown in the center panel. The spectrum in the bottom half of the graphic shows the distinct “fingerprint” of helium within the halo. The full spectrum shows no evidence of other elements, suggesting that the helium clump must be fairly pristine, made of hydrogen and helium gas left over from the Big Bang, with little contamination from heavier elements produced by stars. Theory and simulations predict that pockets of such pristine gas should survive in the halos of particularly massive galaxies of these epochs, and that they may collapse to form clusters of Population III stars. Image credit: NASA / ESA / CSA / Ralf Crawford, STScI.

GN-z11 is an early but moderately massive galaxy located in the constellation Ursa Major.

First discovered by the NASA/ESA Hubble Space Telescope in 2016, the galaxy is seen as it was when the universe was just 420 million years old, about 3% of its current age.

GN-z11 is about 25 times smaller than the Milky Way, with only 1% of the mass of stars in our galaxy.

Remarkably, this galaxy is home to a supermassive black hole of approximately 1.6 million solar masses that is rapidly accreting matter.

Using the Near-Infrared Spectrograph (NIRSpec) instrument on the NASA/ESA/CSA James Webb Space Telescope, astronomer Roberto Maiorino of the University of Cambridge and colleagues detected a gaseous clump of helium in the halo surrounding GN-z11.

“The fact that we don't see anything but helium suggests that this mass must be fairly pure,” Maiorino said.

“This is what theory and simulations predicted for particularly massive galaxies of these epochs: there should be pockets of pristine gas left over in the halo, and these may collapse and form Population III star clusters.”

Finding never-before-seen “Population III stars” (first generation stars formed almost entirely of hydrogen and helium) is one of the most important goals of modern astrophysics.

These stars are expected to be very massive, very bright, and very hot.

Their expected characteristics are the presence of ionized helium and the absence of chemical elements heavier than helium.

The formation of the first stars and galaxies marked a fundamental change in the history of the universe, during which it evolved from a dark and relatively simple state into the highly structured, complex environment we see today.

“In future Webb observations, we hope to probe GN-z11 more deeply and strengthen our case for Population III stars potentially forming within the halo,” the astronomers said.

The results are published in the journal Astronomy & Astrophysics.

Source: www.sci.news

Webb’s Observation of a Massive Star-Forming Complex in the Large Magellanic Cloud

Using the Mid-Infrared Instrument (MIRI) aboard the NASA/ESA/CSA James Webb Space Telescope, astronomers have captured stunning images of N79, a region of ionized interstellar hydrogen in the Large Magellanic Cloud.

This Webb image shows the star-forming region N79, located 163,000 light-years away in the Large Magellanic Cloud. Image credit: NASA / ESA / CSA / Webb / M. Meixner.

N79 is a massive star-forming complex spanning about 1,630 light-years in the generally unexplored southwestern region of the Large Magellanic Cloud, a neighboring dwarf galaxy about 163,000 light-years from us.

This region is often considered a younger version of 30 Doradus, also known as the Tarantula Nebula.

N79 has had a star formation efficiency more than twice that of 30 Doradus over the past 500,000 years.

This particular image centers on one of three giant molecular cloud complexes called N79 South (S1 for short).

The distinctive “starburst” pattern surrounding this bright object is a series of diffraction spikes.

“All telescopes that use mirrors to collect light, like Webb, have this form of artifact resulting from the design of the telescope,” Webb astronomers said.

“For Webb, the six largest spikes appear because of the hexagonal symmetry of Webb's 18 primary mirror segments.”

“Such patterns are only noticeable around very bright and compact objects, where all the light comes from the same place.”

“Galaxies, though they may also appear very small to our eyes, are dimmer and more spread out than a single star, so they do not show this pattern.”

“At the longer wavelengths of light captured by MIRI, Webb's view of N79 shows glowing gas and dust in the region.”

“This is because mid-infrared light can reveal what is happening deep inside the clouds, whereas shorter wavelengths are absorbed or scattered by dust grains in the nebula. Some still-embedded protostars also appear in this region.”

Star-forming regions such as N79 are of interest to astronomers because their chemical composition is similar to that of giant star-forming regions observed in the early universe.

“The star-forming regions of our Milky Way galaxy are not producing stars at the same ferocious rate as N79 and have a different chemical composition,” the astronomers said.

“Webb now offers us the opportunity to compare and contrast observations of star formation in N79 with deep telescopic observations of distant galaxies in the early universe.”

“These observations of N79 are part of the Webb program to study the evolution of circumstellar disks and envelopes of forming stars over a wide range of masses and at different evolutionary stages.”

“Webb's sensitivity allows us to detect for the first time disks of planet-forming dust around stars of the same mass as the Sun at distances in the Large Magellanic Cloud.”

Source: www.sci.news

Utilizing Cloud Technology as the Building Blocks for AI: Driving Business Innovation with Azure

The cloud has transformed the way the world operates, yet it doesn’t always receive the recognition it deserves for enabling valuable digital experiences.

Businesses of all sizes, including companies like ASOS that run on the Microsoft Azure cloud platform, leverage cloud technology to power their operations and deliver customized customer experiences.

One of the most impactful aspects of cloud technology is its ability to democratize access to computing power, allowing disruptors and innovators to enter new industries.

Tosca Colangeli, general manager at Microsoft UK, emphasizes that the cloud makes digital transformation accessible, enabling businesses to scale as needed and disrupt traditional industries.

Cloud technology grants businesses access to computing power over the Internet, eliminating the need for private physical servers on their premises. It also provides enhanced security and flexibility, allowing for tailored resource allocation and disaster recovery.

Pay for what you use
Colangeli highlights the importance of paying for resources as needed, allowing businesses to adapt to peak trading periods and manage disaster recovery efficiently.

Cloud flexibility has revolutionized business operations and innovation, especially in the face of the rapid changes driven by AI.

ChatGPT took two months to reach 100 million users

According to Gartner, global cloud spending is projected to account for over 45% of total enterprise IT spending by 2026.

In light of the increasing demand for cloud and AI services, businesses must embrace the cloud as the cornerstone of their operations and embed it within their organizational culture.

Colangeli emphasizes the need for businesses to focus on data as a top priority, as the success of AI is heavily reliant on quality, accessible data.

Additionally, using AI responsibly and adopting a flexible, adaptable approach will be crucial for businesses to stay competitive in the evolving technological landscape.


Source: www.theguardian.com

Innovate with Azure: 5 ways developers’ lives are made easier with cloud tools

There is an apt “spot the difference” advertisement for Microsoft Visual Studio that used to run in a computing magazine.

On the left side of the image is the desk of an “unhappy” software developer. It is midnight. Under the glare of a single desk lamp, mangled piles of paper litter the scene and yellow sticky notes dangle from a computer screen.

The right side shows the desk of a “happy” software developer. It is clean and tidy, with a calendar full of social events, and it is a sunny day outside.

The message was clear: developers feel less stressed when they use C++, a popular programming language that allows them to develop software regardless of the platform.

Almost two decades later, technology has advanced, but software developers still face many of the same challenges. How do you handle the entire lifecycle of creating a product for your customers, from designing the software and making sure it works and is secure, to deploying and maintaining the finished product?

Scaling up and down is smooth
Cloud platforms greatly simplify developers' lives, from planning, writing, editing, and debugging code to delivering it live to customers, even when they do not know what the final requirements will look like.

Previously, software companies delivered applications and services to customers and then had to guess how much infrastructure was needed to support product delivery and maintenance.

This problem can be further exacerbated by the fact that customers may be struggling to meet highly fluctuating demand from end users.

Azure's scalable cloud platform solves this problem by hosting your software and automatically scaling it as demand rises or falls, allowing you to focus on what you do best: software development.

Consider the case of Live Tech Games (LTG). The company faced the challenge of supporting 10,000 users simultaneously playing a live football-tournament multiplayer mobile game online, a number it needed to increase to 500,000.

“During the 2022 FIFA World Cup, in partnership with ITV, we ran over 30 live tournament games with thousands of players competing simultaneously, and thanks to Azure everything went perfectly,” says LTG co-founder and co-CEO Samuel Worsley.

Develop and debug all in one place
Software development is a complex process, especially when there are many cooks involved and everyone, including the boss, has different ideas about which coding language the software should be written in.

“If you like working in one programming language and your colleague likes working in another, that's not a problem with Visual Studio,” says Denise Dourado, Microsoft UK director of digital and application innovation, data and artificial intelligence.

Visual Studio is an integrated development environment that allows developers to write, edit, and debug code in one place instead of moving from tool to tool.

“Without Visual Studio, you end up designing code in one place and having to debug it in another, which takes time,” says Dourado.

The worst thing for developers, she says, is that when they switch from one tool to another, they worry about what will happen to their code, making the development process take a long time.

Auditing software is easy
You've built your software, but now you need to deploy and maintain it, which can be a huge headache. This is where Azure DevOps Services comes in.

Azure DevOps can be used collaboratively by multiple people to track changes made to software source code and release them to customers in a live environment.

In addition, at a time of global economic uncertainty, there is a real need to reduce costs and improve efficiency while providing added value to customers whose businesses are undergoing significant digital transformation.

Mobile phone company Vodafone realized that it had too many computer systems within the company and many different code repositories for all of its software. Some control systems relied on Word documents to manually track source code changes.

“At the time, it took us a long time to be able to automatically roll back a deployment, see which specific line of code caused the problem, link it to the developer, and understand why the change was made,” says Ben Connolly, head of digital engineering at Vodafone.

“That has been revolutionized by the power of Azure DevOps. It's all pipelines now, and it's much more auditable than what we had before.”

Improved productivity
Artificial intelligence (AI) is also increasingly helping developers solve pain points. GitHub, the world's largest code repository and social software development community, leverages AI through GitHub Copilot.

“GitHub Copilot provides enhanced features for developers, including suggesting code improvements, identifying common coding errors, and helping developers understand a codebase faster, which is especially valuable for developers new to a project,” adds Dourado.

According to GitHub, 46% of new code is now written by AI, overall developer productivity has increased by 55%, and 75% of developers on the platform report that they can now concentrate on more creative aspects of software development.

Software development has been democratized
Microsoft wants to take things a step further by “democratizing” the IT industry and making it possible for people without a software development background to build software, Dourado said. “We're bringing in users from the business environment and allowing them to build programs without necessarily understanding code,” she added.

Azure Cognitive Services, for example, can automate document processing, improve customer service, help identify the root causes of anomalies, and extract insights from content.

“The world of technology is constantly changing, especially around things like AI, and if you are a developer, AI may not be your core skill. We're trying to make it possible to leverage AI without spending months or years learning it,” says Dourado.

“What's important to me is that I don't have to struggle to learn new technology and have more freedom to be creative and take advantage of the features of the application.”



Source: www.theguardian.com

Unlocking the Power of Your Business Data: 4 Ways Azure Cloud Drives Employee Engagement and Customer Insights

As organizations move their computing infrastructure to the cloud, they are harnessing the power of data like never before. Cloud-based services use artificial intelligence (AI) to make data easier to access, search, and understand. Instead of data being the property of a team of data scientists and analysts, new cloud-based tools and technologies are opening up this area of expertise to a wider range of employees across the organization.

Leighton Searle, director of Azure Solutions UK at Microsoft, believes there are significant benefits for businesses running their IT infrastructure on the Microsoft Azure cloud platform. “The huge potential of new generative AI technologies has placed a renewed emphasis on the quality and availability of data in organizations,” he says. When companies move their data to the Microsoft Azure cloud platform, they can instantly access tools and services that unlock its value for both employees and customers, leading to culture change and a deeper embedding of data throughout the business.

Searle identifies four areas where the cloud is accelerating the use of data, empowering employees, and increasing productivity.

1 Empower employees to make data-driven decisions
Searle emphasizes that the cloud is helping to democratize data, allowing employees to access data directly in their daily workflows rather than being locked into management reports or separate line-of-business applications. “To unlock the value of data and realize its potential, data must be accessible to the people who need it,” Searle says. “That could mean giving contact-center agents a scannable overview of all previous customer interactions, transactions, and support calls, plus instant access to a company-wide knowledge base of specialist information, to deliver a world-class customer experience. Or it could mean enabling mobile mechanics to identify parts and stock availability from a photo taken with a mobile phone.”

Almost every role can benefit from timely, secure and relevant data, Searle says. When staff become responsible for managing data related to their role, they need tools that simplify the process. Data visualization tools help employees create simple representations of data to glean insights and improve customer experiences. Heathrow Airport, for example, uses the Microsoft Power BI data visualization tool through Microsoft Teams to transform data from its management systems into visualizations that staff can read easily. These allow employees to see at a glance how airport passenger numbers are changing in real time, preparing staff for the peaks and troughs rather than just reacting to them.

2 Breaking down data silos
To achieve a higher degree of data democracy, data must be made available throughout the organization, rather than being locked away in a central repository. “With the right guidance, governance, and guardrails in place, you can enable the rest of your business and provide access to the data you need,” Searle says.

For example, a group of five south London boroughs formed the South London Partnership and built a shared data platform on Azure. This includes sharing data from internet of things (IoT) sensors that monitor at-risk residents, and Searle estimates the partnership has already saved four lives. IoT sensors also monitor air quality and flood risk. “The ability of cloud technology to share data while maintaining data rights and privacy has allowed us to break down data silos,” Searle says.

3 Building AI and modern search to accelerate your business
Customer and employee expectations have changed as AI-driven experiences play a larger role in daily life. Along with a good data foundation and a good data culture, these experiences are quickly becoming critical to both employee and customer retention. Employees need internal apps that let them search data and dig deep into organizational knowledge, whether it is held in “structured” tables and graphs or in “unstructured” formats such as documents and images.

Searle points out that the Azure Cognitive Search platform and the Azure OpenAI Service allow users to type ordinary natural-language queries into a search bar. The query is processed by an AI-powered system that looks across all data sources and returns a natural-language summary from the most relevant ones, which can be used for validation or further research. He says this type of AI-driven experience will help companies derive insights and make data-driven decisions intuitively and at unprecedented speed.

For example, Cambridge and Peterborough NHS Foundation Trust moved its computing infrastructure to the cloud and used Azure Cognitive Search to make patient records easier for clinicians to search. The trust uploaded all of its records to Azure, including all kinds of unstructured data: handwritten notes, doctors' notes, scanned images, and photographs.

Clinicians said they were “blown away” to find that Azure Cognitive Search could search across these diverse formats, allowing them to quickly find handwritten notes and records from the previous year.

4 Building a data-driven culture
From frontline workers to boardroom executives, all employees should be open to incorporating data into the way they work, Searle says. He believes everyone can learn from, and contribute to enriching, the data that flows through the organization. Employees who are involved in managing their own data are also better placed to reduce the risk of bias and incorrect assumptions in data-driven decision-making.

Searle believes that democratizing data requires a significant change in corporate culture. Departments across the organization, such as human resources, marketing, operations, sales, and finance, each play a critical role in managing the data they generate and consume. For example, business users of organizational data are in the best position to set data security and access policies and to manage the data so it can be used with confidence by other parts of the business.

The pace of change can be daunting for leaders at all levels. To help organizations upskill, Microsoft has partnered with European business school INSEAD to create the AI Business School.

Searle outlines the steps businesses need to take to get the most out of their data: "We securely bring data into the Microsoft cloud, lead from the top to build a data-driven culture across the organization, and move quickly to projects that deliver business value. This positive experience ripples throughout the business and helps embed data-driven approaches at greater scale."


Source: www.theguardian.com

Innovate with Azure: 5 Key Factors to Ensure Your Business’ Cloud Platform is Future-Proof



The world is on the brink of a productivity revolution

The world is on the brink of a productivity revolution, as artificial intelligence (AI) creates a new wave of opportunity for businesses of all sizes. Whether it's using chatbots or more advanced AI, uncovering deeper insights into customer needs, or speeding up product development, no company wants to miss out on the improved outcomes AI can bring. For some organizations, emerging generative AI tools such as ChatGPT and DALL-E are increasingly making the business case for adopting AI strategies to generate content and images. But while business leaders want to maximize the benefits of the technology, they also need to understand the broader responsibilities that come with it, including considerations around data privacy, unintentional bias, and copyright infringement, even as the opportunities rapidly evolve. To help board executives and IT leaders drive success with their AI strategies, Michael Wignall, director of infrastructure for Microsoft's Azure Business Customer Success unit, recommends five steps leaders should take before leveraging AI.

1. Make AI part of a broader cloud computing strategy

First and foremost, Wignall says companies should consider working with established technology providers such as Microsoft Azure, because AI works best as part of a broader cloud computing strategy, in which IT operations run in data centers operated by an outside provider. "AI is born in the cloud. To take advantage of this wave of innovation, you need to be in the cloud," he adds. He points to the three main components of AI: computing power, data, and algorithms, all of which are best provided through cloud services. He believes companies should adopt a "cloud-native" approach, where the entire AI infrastructure is built on a cloud platform. Such an approach offers many benefits: reduced costs, because you pay only for the resources you use rather than maintaining and updating expensive on-premises equipment; flexibility and scalability, since customers can easily add or remove resources as needed; access to enhanced security tools that better detect, assess, and alert on threats to customer data; and, with data held in the cloud, easy backup and quick restoration in the event of a failure or disaster.
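The pay-only-for-what-you-use benefit is easy to see with a little arithmetic. The sketch below compares a metered cloud bill against a fixed on-premises outlay; all prices and hours are made-up illustration values, not Azure pricing.

```python
# Illustrative cost comparison: metered cloud usage vs. a fixed on-prem outlay.
# All figures are hypothetical.

def cloud_cost(hours_used, rate_per_hour):
    """Pay only for the hours actually consumed."""
    return hours_used * rate_per_hour

def on_prem_cost(hardware, annual_maintenance, years):
    """Fixed capital outlay plus upkeep, paid regardless of utilization."""
    return hardware + annual_maintenance * years

# A bursty workload: 1,200 compute-hours per year at a hypothetical $3/hour.
yearly_cloud = cloud_cost(1200, 3.0)
three_year_on_prem = on_prem_cost(25000, 2000, 3)
print(yearly_cloud * 3, three_year_on_prem)
```

For a workload that runs only part of the time, the metered model comes out well ahead; the calculus changes for workloads that run flat-out around the clock, which is why utilization matters in any cloud-vs-on-prem decision.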

2. Find the data

Next, businesses need a solid understanding of where their data resides within the organization, and they need to move it to the cloud platform. The success of AI depends on analyzing relevant data at scale. To fine-tune AI for best performance, it should be powered by the company's own data: customer lists, inventory, sales information, financial data, and other key datasets. "It's important to make sure your data platform and data strategy are the best they can be, and that you know where your data is and how to access it," Wignall said.

3. Protect your data

Once the cloud infrastructure is in place and the associated data has been migrated, the next critical step is to secure that data. With all of a company's important data in one place (the cloud), it needs protection against multiple threats, including hackers. "Make sure you're protected with best-in-class security features, clearly defined policies and governance around who can access your data, and the ability to audit how your data is handled," he said.

4. Decide which functions and tasks to use AI for

Once the infrastructure, data, and security are in place, companies can move on to determining the best uses for AI, such as automating office processes, extracting insights from data, and handling copywriting and a variety of other tasks. For the past five years, general AI has provided so-called "cognitive services" such as data analysis and product recommendations. Generative AI takes the technology to a new level. With a few keystrokes, users can create content such as reports, ads, images, copy, automated emails, and personalized user communications. Generative AI can also analyze large volumes of documents, call center logs, and financial results and summarize the information concisely and precisely.

5. Implement a responsible AI policy

Once a company takes these steps, it is ready to deploy an AI strategy. Before launching, however, companies should ensure they have responsible AI policies in place across the board: that AI is free from embedded bias, that there is good governance around its use, that AI is used ethically, and that there are no unintended or undesirable consequences. Microsoft provides responsible AI policy guidance along with tools to check for bias, filter out inappropriate data, and run sentiment checks that scrutinize output. Ultimately, however, it is essential that companies put responsible AI policies of their own in place. While many organizations are just beginning their AI journey, Wignall summarizes the mindset companies should adopt when considering AI: partnership is key; cloud is key; prioritize the business benefits that matter to your organization; and start today.


Source: www.theguardian.com

Akron Energy secures $110 million investment to expand Bitcoin mining operations and launch AI cloud services in Norway

Data center infrastructure company Akron Energy has closed a $110 million private funding round to expand its business, CEO Josh Payne exclusively tells TechCrunch.

The round was led by Bluesky Capital Management with participation from Kestrel 0x1, Nural Capital, and Florence Capital.

The company was founded in 2021 and started with a 5-megawatt site in Australia. Since then, its output has grown to over 130 MW, and it has expanded to other countries and regions such as the United States and Europe.

"These sites are attractive to both Bitcoin miners and AI [or] machine learning clients that require very high-powered computing," Payne said. For context, Nuclear Regulatory Commission statistics show that 1 megawatt can power roughly 400 to 900 homes per year.
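The rule of thumb quoted above (1 MW serves roughly 400 to 900 homes per year) makes it easy to put Akron's stated 130 MW of capacity in household terms. A one-function sketch:

```python
# Applying the quoted rule of thumb: 1 MW powers roughly 400-900 homes per year.

def homes_powered(megawatts, homes_per_mw=(400, 900)):
    """Return the (low, high) estimate of homes served by a given capacity."""
    low, high = homes_per_mw
    return megawatts * low, megawatts * high

low, high = homes_powered(130)
print(f"130 MW could power roughly {low:,} to {high:,} homes per year")
```

By this estimate, the company's current capacity is comparable to the annual electricity demand of a mid-sized city.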

Approximately $80 million will be used to acquire an additional 200 megawatts of capacity across new data centers in Ohio, North Carolina, and Texas, as part of the company's plan to increase its total megawatt capacity by 130% by mid-2024. This is in addition to an existing 100-megawatt facility in Ohio that Akron purchased in June, Payne noted.

"The United States is an attractive market for us in many ways, primarily due to huge domestic customer demand, a mature and robust energy industry with multiple flexible deregulated markets, strong political and regulatory stability, and attractiveness to institutional investors," Payne said. "The United States has a wealth of underutilized and stranded generation assets that are connected to some of the lowest-cost power sources in the world, many of which are renewable."

Payne said the majority of the company's U.S. data center portfolio serves institutional-grade Bitcoin mining companies. "We are essentially landlords who own the underlying infrastructure assets."

Akron's business model is focused on strategically acquiring distressed data center assets around the world. "The current and future demand for data center capacity of all types seen around the world, especially in the United States, is unprecedented and huge. These are energy-intensive platforms that require significant amounts of electrical infrastructure."

The remaining $30 million will be used to develop an artificial intelligence cloud services project at Akron's data center in Norway, serving the generative AI and large language model training markets. "Over the past year, we have seen a significant acceleration in market demand for generative AI and large language model applications," he said.

However, there is a lack of specialized physical infrastructure to power the computing behind most of these products. Akron aims to fill that gap by providing the underlying infrastructure layer the AI sector relies on.

Over the past year, with spot ETF approval looming on top of Bitcoin's potential growth and adoption in mainstream institutional markets, there has been a "meteoric rise" in AI applications, and specialized data centers such as Akron's are "poised to continue to grow exponentially," Payne said.

Source: techcrunch.com

Automating Kubernetes configuration with ScaleOps for Reduced Cloud Costs

One of the benefits of using Kubernetes to handle container orchestration is that containers are ephemeral, lasting as long as needed and then disappearing. This was supposed to help solve the resource allocation problem, since containers only need to run long enough to process jobs. However, as Kubernetes environments become increasingly complex, another problem arises as engineering teams must manually modify Kubernetes configurations to accommodate changing needs.

Additionally, workloads are often over-allocated to ensure they continue to run through usage spikes, which can result in unnecessarily high cloud charges. ScaleOps, an early-stage startup, wants to solve this problem. Rather than guessing and constantly adjusting static allocations, it built a system that dynamically sets configurations based on requirements at any given time. Today, the company announced a $21.5 million Series A.

Yodar Shafrir, co-founder and CEO of ScaleOps, said he often saw this overallocation problem when working at his previous company. As a result, a lot of engineering time was spent configuring resources, often resulting in high cloud charges.

“The companies we work with today are seeing 70% to 80% wasted on over-provisioned containers,” Shafrir told TechCrunch. “So we realized that the only way to free our engineers from this repetitive configuration and free them to focus on what really matters is to fully automate the resource allocation process.”

The company has created a dashboard that shows businesses what workloads are currently running and how much they could save by letting ScaleOps configure them automatically. Shafrir said customers typically start small, with a single workload, to see how it works. Then, once they see the results, they toggle on automation to save even more.

ScaleOps resource allocation dashboard.

Image credits: ScaleOps

He sees an opportunity for growth as companies look to save on cloud fees. The company was founded in 2022 and has signed dozens of paying customers since launching its product earlier this year, managing thousands of Kubernetes clusters with ScaleOps. Customers include Wiz, Coralogix, and Outbrain. ScaleOps currently has 30 employees, and Shafrir plans to double that number by the end of next year.

The company’s $21.5 million Series A was led by Lightspeed Venture Partners, NFX, and Glilot Capital Partners.

Source: techcrunch.com

Understanding the Law of X: A Guide for Cloud Leaders on Balancing Growth and Profits

As interest rates return to historical norms, the world has refocused on cost of capital and free cash flow generation. Companies are working hard to adhere to traditional heuristics like the Rule of 40 (the idea that the sum of revenue growth rate and profit margin should equal 40% or more, a metric that Bessemer helped popularize). Executives at both private and public cloud companies often reason that free cash flow (FCF) margins are just as important as growth, if not more so, and treat the trade-off as 1:1. Many finance executives love the Rule of 40 for its clarity, but placing equal emphasis on growth and profitability in late-stage businesses is flawed and leads to bad business decisions.

Our view

For companies with adequate FCF margins, growth must remain a top priority. There are good reasons to emphasize efficiency, but the traditional Rule of 40 math breaks down once a company approaches break-even and generates positive free cash flow.

The world has hyper-rotated to an FCF-margin mindset instead of a growth mindset, which runs counter to efficient business growth. Long-term models show that growth should be valued at least two to three times more than FCF margin, even in tight markets.

An equal emphasis on growth and profitability in late-stage businesses is flawed and leads to bad business decisions.

Why?

An increase in margin has a linear effect on value, but an increase in growth rate can have a compounding effect on value. We provide detailed calculations below, but when we backtest the relative importance of growth and FCF margins, the correlation with public market valuations confirms it. Actual ratios vary widely in the short term (ranging from about 2x to about 9x over the past few years), but over the long term, valuations typically weight growth two to three times more heavily than profitability.
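The linear-vs-compounding distinction is easy to demonstrate with a small worked example. Both hypothetical companies below pass the Rule of 40 as defined earlier (growth rate plus FCF margin of at least 40%), yet after five years the growth-weighted one is far larger; the revenue figures are illustrative, not from the article.

```python
# Why growth compounds while margin is linear, using the Rule of 40 definition
# above. All company figures are hypothetical.

def rule_of_40(growth_pct, fcf_margin_pct):
    """Traditional heuristic: growth rate + FCF margin >= 40%."""
    return growth_pct + fcf_margin_pct >= 40

def revenue_after(revenue, growth_rate, years):
    """Growth compounds: each year's gain builds on the last."""
    return revenue * (1 + growth_rate) ** years

# Two companies, both passing the Rule of 40 today on $100M revenue:
# one all-growth (40% growth, 0% margin), one balanced (10% growth, 30% margin).
fast = revenue_after(100, 0.40, 5)
slow = revenue_after(100, 0.10, 5)
print(rule_of_40(40, 0), rule_of_40(10, 30))   # True True
print(round(fast, 1), round(slow, 1))          # 537.8 161.1
```

Five years out, the high-growth company has more than five times its starting revenue while the balanced one has barely 1.6x, which is why a 1:1 trade-off between growth and margin undervalues growth.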

Even the most conservative financial planners can safely weight growth up to 2x relative to profitability for late-stage private companies. Publicly traded companies with a low cost of capital can use multiples of up to 2-3x (as long as growth is efficient).

Image credits: Bessemer Venture Partners

Source: techcrunch.com