At its core, a star forms when gravity gathers matter tightly enough to ignite nuclear fusion at its center, without the star generating so much energy that it blows itself apart. The balance between gravity pulling inward and radiation pressure pushing outward is known as hydrostatic equilibrium. This balance also constrains how large a star can grow; the resulting ceiling, known as the Eddington mass limit, is believed to lie between roughly 150 and 300 solar masses.
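For readers who want the underlying physics, the balance and the limit described above have standard textbook forms (these are general results, not equations taken from the study itself):

```latex
% Hydrostatic equilibrium: at every radius r, the pressure gradient
% balances the pull of the mass m(r) enclosed within that radius
\frac{dP}{dr} = -\,\frac{G\, m(r)\, \rho(r)}{r^{2}}

% Eddington luminosity: the maximum luminosity a star of mass M can emit
% before radiation pressure overcomes gravity
% (m_p = proton mass, sigma_T = Thomson scattering cross-section)
L_{\mathrm{Edd}} = \frac{4\pi G M m_{\mathrm{p}} c}{\sigma_{\mathrm{T}}}
```

The mass limit quoted in the article follows from the second relation: because luminosity rises steeply with mass, a star much above a few hundred solar masses would exceed its own Eddington luminosity and shed its outer layers.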
Rotation gives a star extra support for its structure: in a spinning body, the outer layers experience an outward centrifugal effect, which acts alongside radiation pressure to counter gravity's inward pull. Recently, a group of scientists investigated how the rotation of giant stars has affected their lifetimes throughout cosmic history. Massive stars drive key cosmic phenomena, and understanding their end stages can shed light on the universe's development, including the creation of black holes and supernovae.
The researchers employed grid-based modeling software called the Geneva Stellar Evolution Code, also known as GENEC. This tool helped simulate stellar behavior and long-term evolution based on initial characteristics. GENEC treats a star as a multi-layered system and tracks the movement of matter across these layers over time.
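GENEC itself is a sophisticated one-dimensional stellar-evolution code, but the shell-based bookkeeping the article describes can be caricatured in a few lines. The sketch below is purely illustrative (none of the names or numbers come from GENEC): it divides a toy constant-density star into thin radial shells and integrates the hydrostatic-equilibrium relation outward, recording the enclosed mass and pressure change per shell.

```python
# Toy illustration of shell-based stellar bookkeeping (NOT the GENEC code):
# integrate hydrostatic equilibrium, dP/dr = -G*m(r)*rho/r**2, outward
# across discrete shells for a star of uniform density.

PI = 3.141592653589793
G = 6.674e-11        # gravitational constant, SI units
M_SUN = 1.989e30     # solar mass, kg
R_SUN = 6.957e8      # solar radius, m

def toy_star(mass=M_SUN, radius=R_SUN, n_shells=1000):
    rho = mass / (4.0 / 3.0 * PI * radius**3)   # uniform density assumption
    dr = radius / n_shells
    shells = []                                  # one record per layer, innermost first
    m_enclosed = 0.0
    for i in range(n_shells):
        r_mid = (i + 0.5) * dr                   # shell midpoint radius
        dm = 4.0 * PI * r_mid**2 * rho * dr      # mass of this thin shell
        m_enclosed += dm
        dP = -G * m_enclosed * rho / r_mid**2 * dr  # pressure drop across shell
        shells.append({"r": r_mid, "m_enclosed": m_enclosed, "dP": dP})
    return shells

layers = toy_star()
# Enclosed mass at the surface should recover the input mass (prints ~1.0).
print(layers[-1]["m_enclosed"] / M_SUN)
```

A real code like GENEC solves the full set of coupled structure equations (energy generation, transport, composition changes, rotationally driven mixing) on a similar layered grid, stepping the whole profile forward in time.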
Two primary variables in their simulations were the star’s rotation status and its initial mass, which ranged from 9 to 500 solar masses. The researchers noted that current science portrays very massive stars, those exceeding 100 solar masses, as inherently unstable and unpredictable. To account for this, the team analyzed results for these colossal stars using two other models.
To understand how the fates of giant rotating stars have changed over time, the researchers examined the proportion of elements heavier than hydrogen and helium in stars (their metallicity). They reasoned that since the early universe just after the Big Bang contained few metals while the modern universe contains far more, metallicity can serve as a proxy for cosmic epoch. By analyzing spinning stars with low metallicity, they sought insights into the lifespans of the early universe’s rotating stars.
Following the GENEC simulations, the researchers observed distinct differences in the fates of rotating versus non-rotating stars. Spinning massive stars were more likely to collapse into black holes while being less prone to massive supernova eruptions or transitioning into dense neutron stars. The research indicated that very massive, non-rotating stars with low metallicity tend to explode as supernovae, whereas those with high metallicity collapse into black holes.
The researchers proposed that this intricate relationship arises because rotating stars tend to have more of their material mixed, increasing the fusion potential in their cores. However, this rotational dynamic can also lead to the ejection of more outer material, ultimately reducing the fusion resources available in the core.
An additional complicating factor arises from the frequent occurrence of multiple massive stars in close proximity, forming a binary system. In these scenarios, stars can exchange mass, either gaining or losing material. The researchers suggest that because massive stars in binary systems may shed mass before their lifetimes conclude, their model could underestimate the frequency of massive stars evolving into neutron stars rather than exploding or collapsing into black holes.
In summary, the team concluded that rotation intricately influences star evolution. While rotation increases the likelihood of a massive star undergoing certain outcomes, such as collapsing into a black hole, factors like composition and initial mass significantly affect its destiny. Acknowledging the multitude of variables, the researchers emphasized that the next phase in understanding massive stars’ fates should focus on identifying stars in binary systems.
I cannot recall the exact moment my TikTok feed presented me with a video of a woman cradling her stillborn baby, but I do remember the wave of emotion that hit me. Initially, it resembled the joyous clips of mothers holding their newborns, all wrapped up and snug in blankets, with mothers weeping—just like many in those postnatal clips. However, the true nature of the video became clear when I glanced at the caption: her baby was born at just 23 weeks. I was at 22 weeks pregnant. A mere coincidence.
My social media algorithms seemed to know about my pregnancy even before my family, friends, or doctor did. Within a day, my feed transformed. On both Instagram and TikTok, videos emerged of women documenting their journeys from the moment they took a pregnancy test. I began to “like,” “save,” and “share” these posts, feeding the algorithm and signaling my interest, and it responded with more. But it didn’t take long for the initial joy to be overtaken by dread.
The algorithm quickly adapted to my deepest fears related to pregnancy, introducing clips about miscarriage stories. In them, women shared their heartbreaking experiences after being told their babies had no heartbeat. Soon, posts detailing complications and horror stories started flooding my feed.
One night, after watching a woman document her painful birthing experience with a stillbirth, I uninstalled the app amidst tears. But I reinstalled it shortly after; work commitments and social habits dictated I should. I attempted to block unwanted content, but my efforts were mostly futile.
On TikTok alone, over 300,000 videos are tagged with “miscarriage,” and another 260,000 are linked under related terms. One video, titled “Live footage of me finding out I had a miscarriage,” has garnered almost 500,000 views, while videos of women giving birth to stillborn babies have amassed millions of views between them.
Had I encountered such content before pregnancy, I might have viewed the widespread sharing of these experiences as essential. I don’t believe individuals sharing these deeply personal moments are in the wrong; for some, these narratives could offer solace. Yet, amid the endless stream of anxiety-inducing content, I couldn’t shake the discomfort of the algorithm prioritizing such overwhelming themes.
“I ‘like,’ ‘save,’ and ‘share’ the content, feeding it into the system and prompting it to keep returning more”…Wheeler while pregnant. Photo by Kathryn Wheeler
When I discussed this experience with others who were pregnant at the same time, I found nods of recognition and similar stories. They too recounted their own personalized concoctions of fear, as their algorithms zeroed in on their particular anxieties. Being bombarded with such harrowing content had quietly expanded the range of what we considered a normal level of worry. This is what pregnancy and motherhood are like in 2025.
“Some posts are supportive, but others are extreme and troubling. I don’t want to relive that,” remarks 8-month-pregnant Cerel Mukoko. Mukoko primarily engages with this content on Facebook and Instagram but deleted TikTok after becoming overwhelmed. “My eldest son is 4 years old, and during my pregnancy, I stumbled upon upsetting posts. They hit closer to home, and it seems to be spiraling out of control.” She adds that the disturbing graphics in this content are growing increasingly hard to cope with.
As a 35-year-old woman of color, Mukoko noticed specific portrayals of pregnant Black women in this content. A 2024 analysis of NHS data indicated that Black women faced up to six times the rate of severe complications compared to their white counterparts during childbirth. “This wasn’t my direct experience, but it certainly raises questions about my treatment and makes me feel more vigilant during appointments,” she states.
“They truly instill fear in us,” she observes. “You start to wonder: ‘Could this happen to me? Am I part of that unfortunate statistic?’ Given the complications I’ve experienced during this pregnancy, those intrusive thoughts can be quite consuming.”
For Dr. Alice Ashcroft, a 29-year-old researcher and consultant analyzing the impacts of identity, gender, language, and technology, this phenomenon began when she was expecting. “Seeing my pregnancy announcement was difficult.”
This onslaught didn’t cease once she was pregnant. “By the end of my pregnancy, around 36 weeks, I was facing stressful scans. I began noticing links shared by my midwife. I was fully aware that the cookies I’d created (my digital footprint) influenced this feed, which swayed towards apocalyptic themes and severe issues.” Now, with a 6-month-old, the experience continues to haunt her.
The ability of these algorithms to hone in on our most intimate fears is both unsettling and cruel. “For years, I’ve been convinced that social media reads my mind,” says 36-year-old Jade Asha, who welcomed her second child in January. “For me, it was primarily about body image. I’d see posts of women who were still gym-ready during their 9th month, which made me feel inadequate.”
Navigating motherhood has brought its own set of anxieties for Asha. “My feed is filled with posts stating that breastfeeding is the only valid option, and the comment sections are overloaded with opinions presented as facts.”
Dr. Christina Inge, a Harvard researcher specializing in tech ethics, isn’t surprised by these experiences. “Social media platforms are designed for engagement, and fear is a powerful motivator,” she observes. “Once the algorithm identifies someone who is pregnant or might be, it begins testing content similar to how it handles any user data.”
“For months after my pregnancy ended, my feed morphed into a new set of fears I could potentially face.” Photo: Christian Sinibaldi/Guardian
“This content is not a glitch; it’s about engagement, and engagement equals revenue,” Inge continues. “Fear-based content keeps users hooked, creating a sense of urgency to continue watching, even when it’s distressing. Despite the growing psychological toll, these platforms profit.”
The negative impact of social media on pregnant women has been a subject of extensive research. A systematic review examining social media use during pregnancy highlights both benefits and challenges. While it offers peer guidance and support, it also concludes that “issues such as misinformation, anxiety, and excessive use persist.” Dr. Nida Aftab, an obstetrician and the review’s author, emphasizes the critical role healthcare professionals should play in guiding women towards healthier digital habits.
Pregnant women may not only be uniquely vulnerable social media consumers; studies show they often spend significantly more time online. A research article published in the journal Midwifery last year indicated a marked increase in social media use during pregnancy, peaking around week 20. Moreover, 10.5% of participants reported symptoms of social media addiction, as defined by the Bergen Social Media Addiction Scale.
In the broader context, Inge proposes several improvements. A redesigned approach could push platforms to feature positive, evidence-based content in sensitive areas like pregnancy and health. Increased transparency regarding what users are viewing (with options to adjust their feeds) could help minimize harm while empowering policymakers to establish stronger safeguards around sensitive subjects.
“It’s imperative users understand that feeds are algorithmic constructs rather than accurate portrayals of reality,” Inge asserts. “Pregnancy and early parent-child interactions should enjoy protective digital spaces, but they are frequently monetized and treated as discrete data points.”
For Ashcroft, resolving this dilemma is complex. “A primary challenge is that technological advancements are outpacing legislative measures,” she notes. “We wander into murky waters regarding responsibility. Ultimately, it may fall to governments to accurately regulate social media information, but that could come off as heavy-handed. While some platforms incorporate fact-checking through AI, these measures aren’t foolproof and may carry inherent biases.” She suggests using the “I’m not interested in this” feature may be beneficial, even if imperfect. “My foremost advice is to reduce social media consumption,” she concludes.
My baby arrived at the start of the year, and I finally had a moment to breathe as she emerged healthy. However, that relief was brief. In the months following my transition into motherhood, my feed shifted yet again, introducing new fears. Each time I logged onto Instagram, the suggested reels displayed titles like: Another baby falls victim to danger, accompanied by the text “This is not safe.” Soon after, there was a clip featuring a toddler with a LEGO in their mouth and a caption reading, “This could happen to your child if you don’t know how to respond.”
Will this content ultimately make me a superior, well-informed parent? Some might argue yes. But at what cost? Recent online safety legislation emphasizes the necessity for social responsibility to protect vulnerable populations in their online journeys. Yet, as long as the ceaseless threat of misfortune, despair, and misinformation assails the screens of new and expecting mothers, social media firms will profit from perpetuating fear while we continue to falter.
“Rachel, I have some unfortunate news,” the text read. “They are planning to dismantle the loom tomorrow.”
Rachel Halton still doesn’t know who made the decision in October 2022 to eliminate the $160,000 jacquard loom, which had been the foundation of RMIT’s renowned textiles and textile design course for two decades.
Standing at 3 meters tall and weighing over half a tonne, the loom was an intricate machine made of polished wood, steel, compressed air, and mechatronics. It served as both a grand tribute to the textile industry’s golden age and a modern tool for weaving intricate fabrics from strands of thread. Halton couldn’t bear the thought of it ending up in a landfill.
The Jacquard Loom uses punch cards—an early form of coding—to guide the lifting and dropping of threads.
Photo: Stuart Walmsley/Guardian
“It was my day off, and I jumped out of bed and rushed over,” recalls Halton.
The loom was unique in the Southern Hemisphere and one of only a few globally. Halton acquired it for the university’s Brunswick campus in the early 2000s soon after she began teaching there. It “expanded artistic possibilities,” she states. Students enrolled specifically to work with it, and international artists visited to weave on it. It became integral to Halton’s creative process.
Upon her arrival on campus that October morning, she was determined to “rescue it from the brink.”
“He severed it right in front of me,” Halton recounts. “It felt like I was pulling the plug on a family member’s life support.”
Many shared her sentiment, prompting a grassroots effort to save the loom as news spread about its impending removal. A passionate collective of weavers, educators, students, and alumni rallied to find it a more suitable home, all while carefully disassembling it for transport to a compassionate technician’s workshop, eventually settling on a former student’s living space.
Textile artist Daisy Watt, part of that collective, describes the event as a “telling snapshot of the challenges” facing higher education in arts and crafts.
Warp and Weft
The loom’s cumbersome name belies its significance. Traditional jacquard looms use punch cards (rows of holes in cardboard slips, an early form of coding) to control which vertical (warp) threads are lifted as the horizontal (weft) thread passes through. The Arm AG CH-3507 loom can be operated manually or by computer, giving total control over every thread and opening up almost limitless design possibilities.
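The punch-card principle is easy to sketch in code. The toy below is entirely illustrative (it is not how the CH-3507 or any real jacquard head is driven): each card row is treated as a bit pattern, with a 1 meaning “lift this warp thread” for one pass of the weft.

```python
# Toy punch-card weaving: each "card" row is a bit pattern saying which
# warp (vertical) threads to lift for one pass of the weft (horizontal) thread.
# Purely illustrative; a real jacquard head drives thousands of hooks.

def weave(cards, warp_count):
    """Render each card row as one line of 'fabric':
    '#' = warp lifted (warp thread shows), '.' = warp dropped (weft shows)."""
    fabric = []
    for card in cards:
        assert len(card) == warp_count, "one hole position per warp thread"
        fabric.append("".join("#" if lifted else "." for lifted in card))
    return fabric

# A plain tabby weave: alternate warp threads lift on alternate picks.
tabby = [[(w + pick) % 2 for w in range(8)] for pick in range(4)]
for row in weave(tabby, 8):
    print(row)   # rows alternate: .#.#.#.# then #.#.#.#.
```

Because any pattern of holes is allowed, any lift sequence (and hence any weave structure) can be encoded, which is exactly why the punch card is often cited as an ancestor of modern computing.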
Watt collaborates with technician Tony De Groot to restore the loom.
Photo: Stuart Walmsley/Guardian
Watt has a “deep connection” to the loom. Not only did she invest countless hours during her time at RMIT, but she also housed it for months post-rescue. Self-taught in coding, she is now updating its electronics. Given its roots in Jacquard punch card technology, it feels as though the loom is intertwined with the fundamentals of modern computing.
“We often think of crafting as separate from technology, yet this embodies the beautiful chaos of that intersection,” Watt explains. “Effective crafting technology revolves around creating beauty.”
Instructor Lucy Adam notes that when the loom was acquired, RMIT offered textile design as part of its arts diploma.
In 2008, RMIT shifted from offering a diploma to a Certificate IV training package, part of a wider and controversial national restructuring of vocational education. This approach omitted traditional curricula in favor of job-focused “competency units” directed by industry, all under stringent regulation.
Government officials defended these reforms as necessary for streamlining qualifications and eliminating underperforming training providers. However, educators and union representatives warned that this would dilute educational quality, resulting in a systemic decline in skill development which labor theorist Harry Braverman described as a shift from “conscious skilled labor” to rudimentary tasks.
Testimonies from RMIT’s textile design faculty indicate this was indeed the outcome despite their best efforts.
De Groot inspects educational materials recovered from the loom.
Photo: Stuart Walmsley/Guardian
The program has become “very dry and at the lowest common denominator,” according to Adam. Resources have been cut back significantly, and student interaction time has halved. Despite the loom’s educational potential, there was insufficient time to teach students how to operate it adequately. Halton endeavored to integrate it into student projects as much as possible, personally overseeing its setup, disassembly, and maintenance.
In her Master’s thesis, Adam scrutinized the effects of these changes on vocational education and noted that competency checklists missed the essence of trade disciplines like textile design, ceramics, cooking, metalworking, woodworking, and other fields that marry technical skills with artistic expression.
“Unless you are an exceptionally skilled educator capable of circumventing the banality, you’re relegated to an archaic teaching model,” she argues.
Artist and educator John Brooks echoes the concerns about the restrictive course structure, highlighting that even basic tasks like starting or shutting down a computer are now considered part of the evaluation requirements. “With so much focus on compliance, we compromise the fundamental skills we aim to teach,” he laments.
Adam remembers a student lamenting their training package, saying it felt like “filling out a visa application repeatedly.” “It truly saddened me,” she reflects. “Where does real learning take place? Where can you learn it?”
The loom’s new location in Ballarat.
Photo: Stuart Walmsley/Guardian
This trend isn’t confined to TAFE. Ella*, a third-year student from the University of Tasmania, shares with Guardian Australia that advanced 3D media courses, particularly in her areas of focus—furniture, sculpture, or time-based media—cease after the first year. There are also no offerings in art history.
“It significantly affects students’ understanding of contemporary art,” Ella asserts. Her instructor is striving to “revitalize” the course.
Professor Lisa Fletcher, representing the Faculty of Arts at the University of Tasmania, emphasizes the institution’s commitment to arts education, stating they aim to equip students with “strong and sustainable skills,” while actively seeking feedback as they regularly evaluate their art degrees.
Crafting the Future
The loom is currently housed in an incubator space in Ballarat, where organisations like the loom-rescue collective can operate for minimal fees. The city is dedicated to preserving rare and endangered craft techniques. Certain crafts have nearly disappeared; stained glass work, for instance, once close to extinct in Australia, has seen a revival thanks to a handful of artists who reintroduced it into the TAFE system and launched a course at a Melbourne polytechnic. Such revivals, however, are rare.
Watt and fellow weavers aspire for looms to be accessible once more, allowing others to learn, teach, and create. As Brooks puts it, the less prevalent these skills become, the fewer opportunities there will be to acquire them. “We’re in danger of losing them altogether.”
An RMIT spokesperson mentioned that the university had to remove the looms as part of an upgrade to ensure students had access to “reliable and modern equipment” that prepares them for the workforce. Presently, the space previously occupied by the looms is dedicated to military-funded textile initiatives, requiring security clearance for entry. Last year, RMIT stopped accepting enrollments for the Certificate IV in Textile Design after state government funding for the course was withdrawn.
Yet, there is a glimmer of hope. Adam remains determined; she recently proposed a new diploma that has been approved. Despite the growing constraints, she isn’t alone in her endeavors at the university. As of this writing, the institution is set to acquire new equipment—a modest yet promising $100,000 computer-controlled Jacquard loom.
Destiny was released 10 years ago, an eternity in the world of video games. It's also one of the most compelling games of the decade, and at times one of the least. On the surface, it's a gorgeous online space shooter made by Bungie, the creators of the Xbox classic Halo. Gather some friends, deploy somewhere in the shimmering landscapes of a future solar system, and shoot people, aliens, and robots to earn better loot.
None of this is unprecedented, and maybe that's the point. You could say that Destiny's touchstones are Halo for its gunplay, World of Warcraft for its persistent online space, and (admittedly, this is a bit odd) the immortal British retailer Marks & Spencer. This last point holds because Destiny is a game of fluctuating fortunes that seems to fascinate everyone involved in video games, whether they actually play it or not. Just as many in the UK secretly know whether M&S is currently trending up or down (there is no middle ground), everyone in the games industry knows whether Destiny is doing well or not. Is it doing better than it has in years? Or is it in decline compared with where it was two, five, or seven years ago? Destiny is always an uneasy conversation topic.
Amazingly, this has been the case since before the very beginning. Destiny got off to an unfortunate start: it was revealed as a business proposition long before it was revealed as a fictional universe, announced in terms of SKUs and Q1 financial forecasts rather than as a fun gunfighting world dreamed up by some of the best combat designers in the industry. When the first game finally arrived, it was seen as a beautiful epicenter of action surrounded by something that felt hastily produced: an early star whose dust and gas hadn't yet fully coalesced. Sure, if you had the right shotgun you'd go into battle and the whole world would sing with you, but the story and lore were scattered across the game's surface as a series of trading cards, as if Homer had inscribed the Iliad on a collection of beer mats and hidden them across various battlefields.
A great action game… Destiny was shown on a curved screen at E3 in Los Angeles in June 2014. Photo: Michael Nelson/EPA
But here's the thing: people just couldn't stop playing Destiny. From the start, nights spent online with friends couldn't have been more fun: join in, blow up stuff, win stuff, and compare your gains. Leveling up felt like something meaningful here. New loot had real personality. Set pieces unfolded beneath skyboxes so vast and intense they reminded us that, spaceships aside, Bungie's soul has always been deeply romantic.
Part of the game's enduring appeal is a series of striking images: the funereal hulk of the Traveler, an artificial moon, floating in the sky above the world's last city. Claw-like eruptions of Martian rock illuminated by sunlight turned into a barium haze through the airborne dust. But from the beginning, Bungie's games also seeped into the real world: players could view their builds outside the game, millions of raid-party WhatsApp groups sprung up overnight, and websites and YouTube channels were devoted to everything from leveling tips to reconstructing the story of a Frankenstein-style soap opera.
So for the last decade, playing Destiny has meant arguing about the game, getting annoyed and uninstalling it, then reinstalling it and spending the night engrossed all over again. The existence of conspiracy theories shows that the game means something to people. The loot cave that nearly collapsed the in-game economy within the first few months: was it a bug or an intentional design flaw? The raid areas with cheese spots, where players can dish out massive damage without putting themselves in danger: a sign of an unstable map, or of a savvy developer generating a different kind of buzz?
Inevitably, by the time Destiny 2 came out in 2017, people were nostalgic even for the Grimoire lore cards. Since then, there have been ups and downs: the death of a major character that everyone talked about, and expansion pricing that everyone talked about too. People get tired of the grind, think the raids are unfair, and understandably complain about the store, but they also, understandably, buy Destiny: The Official Cookbook. Complicating things is the fact that Destiny has been steeped in nostalgia from the get-go. A final point of connection to M&S: Destiny is an institution.
Few would dispute that Destiny is, and always has been, a great action game. At its heart is a core of charismatic gunplay, and what radiates outward from there is an evocative, unforgettable twist of sci-fi, combined with Bungie's long-standing talent for sad, flashy naming conventions. This is the studio that brought us the Halo levels “Pillar of Autumn” and “Silent Cartographer.” It's no wonder that “Destiny Weapon Name or Roxy Music Deep Cut?” remains a reliable drinking game. (It works both ways; it's easy to imagine Bungie releasing Sentimental Fool and Mother of Pearl SMGs.)
Striking image…Destiny 2. Photo: Activision
Still, there are fluctuations. The latest expansion was hailed as one of the best in a while, but player numbers haven't increased significantly since then. Over time, Bungie has gone from facing questions about the cost of cosmetics to serious allegations about its internal culture; the studio has changed owners and recently suffered layoffs. Last week, Destiny 2's Steam player numbers hit an all-time low.
Still, we talk about it; the game is always in the news (Bungie announced it would be publishing a developer blog discussing the game's future). Many of us still feel nostalgia for a game that was born out of nostalgia, and those two things create a powerful allure. I remember when I first played Destiny 2, long after everyone I knew had cooled off from their obsession. I found a game that kept me entertained for a few minutes, but those minutes could easily turn into hours. I also found a world that felt as if it were covered in blue plaques telling of a painter from long ago who once holidayed here.
After all, Destiny the game benefits greatly from Destiny the conversation around it. When I first met Devrim Kay, Destiny's gentlemanly sniper, in person, I already knew so much about him that I could have been his biographer. I felt I was in the presence of a celebrity, even though he was just another quest-giver.