Environmental and social challenges are urgent, yet many nations grapple with underfunding and political stalemates. Imagine if we could innovate ways to tackle these issues effectively and economically without the burden of partisan politics!
Nearly two decades ago, we and our colleagues in the behavioral sciences believed this was a real possibility. The idea was simple: many social problems stem from individuals making “poor” choices, whether unhealthy eating, smoking, or polluting the environment. Traditional approaches rely on taxes or bans; we proposed a gentler, psychologically informed alternative. By rethinking how choices are presented, we could encourage healthier and more sustainable options while still leaving the alternatives available.
“Nudges” were viewed as potential solutions, suggesting that societal issues could be mitigated through slight shifts in individual behavior. For instance, to combat obesity, we might reduce portion sizes and reposition salad bars at the forefront of cafeterias. To address climate concerns, why not default homeowners to renewable energy options?
Initially, it appeared we were on the verge of a nudge revolution. A team of researchers, including ourselves, sought to identify subtle modifications in “choice architecture” that could spur behavioral changes and ultimately result in major societal impacts. This presents a golden opportunity to leverage psychological insights for transformative progress.
Fast forward almost 20 years and progress has stalled, leaving many disappointed. When nudges do yield results, the effects are small, short-lived, and often fail to scale. Worse, treating individual behavior as the primary lens on societal problems can empower corporate interests: it lets them resist the traditional but effective policy tools, such as taxation and regulation, that would reshape the underlying rules and incentives driving societal behavior and threaten their bottom line.
In hindsight, this outcome shouldn’t have surprised us, though it certainly did at the time. Human psychology has remained fundamentally unchanged, so the social dilemmas we face must arise from systemic shifts, not individual choices: 200 years of fossil fuel reliance, or the surge of ultra-processed foods over recent decades. Individuals acting alone cannot resolve problems like carbon emissions or unhealthy eating patterns. Moreover, a focus on individual behavior risks distracting policymakers and the public from recognizing the need for systemic reforms and policy-driven solutions.
Misdiagnosing the problem in this way allows companies resistant to regulation to champion individual-level responses that seem effective but fall short. This is already observable in attention-grabbing concepts like our personal “carbon footprint.” That framing didn’t emerge from environmental movements or NGOs; it originated in a massive early-2000s PR campaign by BP, one of the world’s largest fossil fuel corporations.
No matter the social or environmental challenge at hand, those opposing comprehensive change often redirect the responsibility back to individuals. As behavioral scientists, we must avoid this trap moving forward.
Behavioral scientists Nick Chater and George Loewenstein explore these themes in their new book, On You (WH Allen), released on January 27th.
Training without interruption fosters self-control
Miljko/Getty Images
Olivia Remes, a mental health researcher at the University of Cambridge and author of the book This is How You Grow After Trauma, provides insights into developing a healthier mindset. Her research spans mental health in high-risk environments, such as the construction industry, and efforts to help women from disadvantaged backgrounds reduce anxiety. Here, she shares three evidence-backed strategies to bolster your well-being, strengthen self-control, and help you achieve your goals.
1. Engage in Small Daily Actions to Cultivate Self-Control
The findings from my research team at the University of Cambridge, alongside years of seminars and discussions, indicate that self-control is a crucial aspect of happiness. It fosters a sense of calm and enhances life satisfaction. Self-control involves the ability to think, act, or behave in accordance with your intentions, even when faced with challenges. Similar to a muscle, the more you exercise self-control, the stronger it becomes.
Numerous studies support this notion. In one research effort, participants faced self-control tasks, such as maintaining a grip on an object or avoiding thoughts about polar bears—a challenging directive. Post-experimental analysis revealed that those who engaged in daily self-control activities for two weeks displayed improved resilience in subsequent tasks compared to a control group. This suggests that consistent practice in self-control, even in minor ways, leads to lasting improvements across various life domains.
2. Avoid Daydreaming
Studies reveal that we often spend over a third of our day daydreaming, which can hinder productivity. While some daydreaming has its benefits, researchers from Harvard found that wandering minds are typically less happy. Their research revealed that distraction from current tasks negatively impacted participants’ overall well-being, even if the daydreams were pleasant. Being aware of the drawbacks of mind wandering empowers you to refocus your thoughts on tasks that enhance your happiness and productivity.
3. Craft Your Life Script
As you reflect on your goals for the year, consider what your life script would entail. What habits do you wish to cultivate? What achievements do you aspire to reach? Assess your current satisfaction in key life areas on a scale of 1 to 10, honing in on those you wish to improve. Focus on small, actionable steps to increase your satisfaction levels, as sustainable changes are key. For example, specify your goals with clarity, such as “I want to exercise for 20 minutes in the morning” rather than vague aspirations.
Research indicates that motivation often follows action. Therefore, start small—set achievable goals that lead you toward greater aspirations. Incremental progress can facilitate significant life changes.
Every Version of You by Grace Chan was the November selection for the New Scientist Book Club
The New Scientist Book Club delved deeper into the complexities of the mind during its November selection, transitioning from neurologist Masud Hussain’s insights on brain damage to Grace Chan’s thought-provoking exploration in Every Version of You, which imagines a reality where individuals upload their consciousness to a digital utopia.
Follow the story of Tao Yi and her boyfriend Navin—among the pioneers who have transitioned their minds to Gaia, a digital haven, even as it faces the repercussions of climate change. Every Version of You captivated my fellow book club members, myself included, as it tackled profound themes such as humanity, the essence of home, climate change, and the process of grieving.
“It was an incredible experience. Probably the best choice the club has ever made,” stated Glen Johnson in our Facebook group. “My familiarity with Avatar extends only to the first movie, so… [I] found the beginning a little perplexing,” shared Margaret Buchanan. “While I resonate with the desire to escape the chaos we’ve created on Earth, I found Tao Yi’s struggle to hold onto her identity very relatable.”
Judith Lazell found the novel to be “very enjoyable” and noted her admiration for Chan’s portrayal of the realities faced by a young adult in 21st-century Australia.
However, with our book club comprising over 22,000 members, positive feedback wasn’t universal. “I loved the book, but the ending felt unclear,” remarked Linda Jones, and Jennifer Marano expressed her dissatisfaction with certain plot elements. “The environmental crisis depicted was quite distressing,” she conveyed. “After finishing, I felt unfulfilled. There was an implication that humanity’s upload to Gaia could allow regeneration back on Earth, yet there was no explanation of how the failing digital world they escaped was maintained.”
Every Version of You lingered in my thoughts for months (I revisited it in May), prompting contemplation on the ethical dilemma of uploading my consciousness. As Chan mentioned in an interview, I’ve leaned toward the belief that it’s not a viable option for me, though discussions around this are ongoing within the group. “In the current state of our world, no, but if we faced the same degradation as in this novel, my stance might shift,” reflected Steve Swan.
Karen Sears offered a unique perspective on the topic. “Initially, I resolved to hold off on uploading until I fully understood Gaia’s framework, politics, and protocols,” she explained. “Then, after injuring my knee, my outlook transformed a bit. It made me reconsider how I would feel about staying in a world that became increasingly difficult to navigate.”
One element I appreciated in the book was its sensitive treatment of disability through Navin’s struggles in reality, which fueled his desire for the escape that Gaia represented. This was approached with care, as noted by Niall Leighton.
“It’s commendable that Chan addresses disability and marginalization issues (especially given some past criticisms of her work!), but I’m curious to see if she has even deeper insights,” noted Niall in response to Karen. “If we question the continuity of consciousness, what does the choice to upload truly signify? Today’s significant dilemmas revolve around alleviating physical and psychological suffering and the societal structures that render life challenging for individuals with disabilities.”
Niall’s review of the book acknowledged his mixed feelings. He wrote that “this multi-dimensional narrative tackles numerous contemporary issues, engaging my intellect and meeting my expectations for a compelling sci-fi tale. Grace Chan exhibits a strong commitment to plot and character development.” But he contrasted this with his personal preferences: “It falls within publishing’s seemingly unquenchable thirst for novels that plunge us into dystopian realities.”
A few members echoed that weariness with dystopias. “While it’s readable, I can’t say I particularly enjoyed it. It leans towards a dystopian vision of the future, and we’ve encountered several of those this year—Boy with Dengue Fever and Circular Motion,” noted David Jones.
Phil Gursky shared that the book “impressed itself upon my heart over time (initially, I wasn’t sure I’d finish it).” He found it a familiar narrative of a world succumbing to climate change, yet it kept him engaged. “A quick aside: A reality where everyone is perpetually online reminds me of my commute on the O-train in Ottawa, where I was the only one engrossed in a physical book instead of fixated on my phone!” Note to Phil: I too notice fellow readers on the London Underground, grateful I’m not alone.
Members have mentioned their desire to avoid another dystopia. However, science fiction often envisions futures, presenting compelling contrasts to our current existence. We hope our December selection resonates with you, even though it carries a utopian streak: Iain M. Banks’s The Player of Games, which follows another of his novels, Consider Phlebas, in our book club vote. Set in a multicultural interstellar society of humans and machines, it follows the formidable Jernau Morat Gurgeh, a champion game player who challenges the merciless Empire of Azad at a notoriously intricate game whose victor is crowned emperor.
Here’s an excerpt from the beginning of the novel, along with an intriguing analysis by Bethany Jacobs, a fellow sci-fi writer and admirer of Banks, who delves into his exceptional world-building capabilities. And please join our Facebook group, if you haven’t already, to share your insights on all our readings.
Topics:
Science Fiction/
New Scientist Book Club
Tim Berners-Lee at a rack in the CERN computer center
Maximilian Bryce/CERN
Tim Berners-Lee holds a map of the internet that fits on a single page, featuring around 100 blocks linked by various arrows. These blocks encompass blogs, podcasts, group messages, and abstract themes like creativity, collaboration, and clickbait, providing a unique depiction of the digital realm from the inventor of the World Wide Web.
“Most of them are good,” he remarked during our conversation at New Scientist‘s London office, reflecting on the web’s successes and failures. This map serves as a guide for others and a reminder that only a small fraction of the Internet is deemed detrimental to society. The top-left quadrant illustrates Berners-Lee’s concerns, with six blocks marked “Harmful,” including names like Facebook, Instagram, Snapchat, TikTok, X, and YouTube.
In the last 35 years, Berners-Lee’s creation has evolved from just one user (himself) to approximately 5.5 billion users, constituting about 70% of the global population. It has transformed communication and shopping, making modern life unimaginable without it. However, the list of emerging challenges continues to expand.
Issues like misinformation, polarization, and election interference have become staples of online discourse, contrasting sharply with Berners-Lee’s vision of a collaborative utopia. In his memoir, This is for Everyone, he reflects, “In the early days of the web, joy and wonder were abundant, but today’s online experience can induce just as much anxiety.”
It’s natural for the web’s architect to feel a sense of disappointment regarding humanity’s use of his creation, yet he remains hopeful for the future of the internet. As one of the foremost technology visionaries (with a plethora of accolades and honors), he shares insights on what went awry and how he envisions solutions.
Invention of the Web
The World Wide Web’s origin story is one of being in the right place at the right time. In the late 1980s, Berners-Lee was working on computing and networking at CERN, the particle physics laboratory near Geneva, Switzerland, where he pondered better ways to manage documents.
Most systems forced users into rigid organizational structures and strict hierarchies. Berners-Lee envisioned a more flexible approach, permitting users to link documents freely. Hyperlinks existed for internal references, and the Internet was already available for file sharing—why not merge the two concepts? This simple yet transformative idea birthed the World Wide Web.
Berners-Lee first proposed the idea in 1989, and eventually convinced his supportive supervisors to let him pursue it fully. Within months, he had created HTML, the markup language for web pages; HTTP, the protocol for transferring them; and the URL, the means of locating them. The final code ran to just 9,555 lines, and by the end of the year the web was born.
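Those three inventions still describe how every page loads today. As an illustrative sketch (using Python's standard library, not anything from the original CERN code), here is each one in miniature: a URL parsed into its parts, the HTTP request a browser would send for it, and a hyperlink pulled out of a scrap of HTML:

```python
# Illustrative sketch of the web's three core pieces: a URL names a
# document, an HTTP request fetches it, and HTML's <a href> hyperlink
# lets any document point at any other.
from html.parser import HTMLParser
from urllib.parse import urlparse

url = urlparse("http://info.cern.ch/hypertext/WWW/TheProject.html")

# The HTTP GET request a browser would send for that URL:
request = f"GET {url.path} HTTP/1.0\r\nHost: {url.netloc}\r\n\r\n"

# A fragment of HTML containing one hyperlink:
page = '<p>See <a href="http://example.org/notes.html">my notes</a>.</p>'

class LinkExtractor(HTMLParser):
    """Collects the href target of every <a> tag it encounters."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links += [value for name, value in attrs if name == "href"]

parser = LinkExtractor()
parser.feed(page)

print(url.netloc)                # info.cern.ch
print(request.split("\r\n")[0])  # GET /hypertext/WWW/TheProject.html HTTP/1.0
print(parser.links)              # ['http://example.org/notes.html']
```

The flexibility Berners-Lee wanted lives in that last step: any page can link to any other, with no central registry and no fixed hierarchy.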
“CERN was an extraordinary place to innovate the web,” he states. “Individuals from around the world, driven by a genuine need to communicate and document their experiences, came together there.”
The inaugural website was hosted on Berners-Lee’s work computer, adorned with a “Do Not Turn Off” sign and instructions for engaging with the web. More web servers emerged, leading to exponential growth: “In the first year, it grew tenfold; in the second year, another tenfold; and by the third, yet another tenfold.” He recalls, “Even then, I sensed we were onto something significant.”
Initially, most web pages were crafted by academics and developers, but soon, everyone began using them to share a wide array of content. Within a decade, the landscape blossomed into millions of websites, hundreds of millions of users, and the inevitable rise of dot-com ventures.
The Spice Girls with their website in 1997.
David Corio/Redferns
Despite the web’s immense potential for profit, Berners-Lee believed it should remain free and open to realize its full capabilities. This was a challenge, as CERN had legitimate grounds to claim royalties on the software being developed. Berners-Lee advocated for his superiors to release this technology openly, and by 1993, after much negotiation, the comprehensive source code of the Web was made available, complete with a disclaimer: CERN relinquishes all intellectual property rights to this code—the web will be royalty-free forever.
Early Days
For its initial years, the web flourished. There was a notorious stock market crash at the turn of the millennium, driven largely by speculative venture capital rather than the web itself; piracy was rampant and malware ever-present. But the web was fundamentally open, free, and enjoyable. “People loved the web; they were simply happy,” Berners-Lee recounts in his memoir.
He captured the essence of this era, believing the web held the potential to foster new forms of collaboration among people. He coined the term “intercreativity” to describe the creative synergy of groups rather than individuals. Wikipedia, with around 65 million English pages edited by 15 million contributors, exemplifies what he envisioned for the web. He notably positions it on his map and describes it as “probably the best single example” of his aspirations.
However, the optimistic phase of the web was not to extend indefinitely. For Berners-Lee, the turning point came in 2016, marked by the Brexit vote and the election of Donald Trump. “At that moment, discussions arose about how social media could be manipulated to influence voters against their interests. In essence, the web became an instrument of manipulation driven by larger entities,” he shared.
Traditionally, political movements communicated their messages to the public openly, allowing for critique and discussion. However, by the mid-2010s, social media enabled “narrowcasting,” as Berners-Lee describes it, allowing political messages to be tailored into numerous versions for various audiences. This complicates tracking who communicated what and makes it harder to counter misinformation.
The extent of this microtargeting’s impact on elections remains debated. Numerous studies have tried to quantify how such messaging alters public opinion and voting behavior, generally uncovering only modest effects. Regardless, these trends contribute to Berners-Lee’s broader concerns about social media.
He emphasized that social media platforms are incentivized to maintain user engagement, which leads to the creation of “addictive” algorithms. “People are naturally drawn to things that evoke anger,” he states. “When social media feeds users misinformation, it’s more likely to garner clicks and ensnare users longer.”
Quoting author Yuval Noah Harari, he argues that the creators of “harmful” algorithms should be held accountable for their recommendations. “It’s particularly essential to undermine systems designed to be addictive,” Berners-Lee argues. He admits that imposing restrictions contradicts his usual free-and-open philosophy, and views it as a last resort. Social media can unify individuals and disseminate ideas, yet it also poses unique risks that warrant change, as he writes in his latest book: “This must evolve somehow.”
Nonetheless, he harbors an optimistic view of the web’s potential trajectory. While social media, despite its captivating nature, represents merely a fragment of the internet landscape, Berners-Lee contends that addressing these issues should be part of a broader strategy aimed at enhancing the web overall, with a focus on reclaiming digital sovereignty.
A Plan for Universal Web Access
To further this goal, Berners-Lee has dedicated the last decade to developing a new framework that returns control to the individual. Presently, personal data is managed by disparate internet platforms. For instance, it’s challenging to share a video from Snapchat on Facebook, or a post from LinkedIn to Instagram: the user creates the content, yet each company retains ownership of it.
Berners-Lee’s concept advocates for consolidating data into a singular data repository known as a pod (short for “personal online data store”), which the user controls, rather than having information dispersed across various platforms. This pod can hold everything from family images to medical records, with users determining what to share. This isn’t merely theoretical; he co-founded a company, Inrupt, that aims to bring this vision to life.
Berners-Lee using an early version of the website and web browser he invented, at CERN in 1994
CERN
He is particularly enthusiastic about merging data wallets with artificial intelligence. For example, when searching for running shoes, current AI chatbots require detailed guidance to offer suitable recommendations. However, if an AI accesses a user’s data wallet, it can understand all past measurements, training history, and potentially spending behavior, leading to more accurate suggestions.
Berners-Lee advocates that AI should serve users, not large tech corporations. His goal isn’t to create individual AIs but to establish safeguards within software. Data wallets are part of the solution, along with an idea that AI should adhere to a kind of digital Hippocratic oath to avoid causing harm. He envisions AI acting as “your personal assistant,” providing tailored support.
While recommending appropriate running shoes may not address the web’s most pressing challenges, Berners-Lee possesses an exceptional ability to envision potential before others. Data wallets might seem mundane today, yet just decades ago, hyperlink-based document management systems were equally obscure. His passion for bettering the world drives him, as he believes enhancing the data ecosystem is crucial to achieving that goal.
All these developments suggest Berners-Lee envisions a fundamental shift for the web. He believes we must transition from an “attention economy,” dominated by competing clicks, to an “intention economy,” where users express their needs and companies—and AI—strive to fulfill them. “This is more empowering for the individual,” he asserts.
Such a transformation could redistribute power from tech giants to users. Some might think such a reversal unlikely, especially with the ongoing trends of tech dominance and the pervasive “doomscrolling” culture. However, Berners-Lee has a proven history of spotting opportunities others miss, and ultimately, he is the architect of the roadmap.
An academic is reportedly concealing prompts in preprint papers for artificial intelligence tools, encouraging these tools to generate favorable reviews.
On July 1st, Nikkei reported that it had found such prompts in research papers from 14 academic institutions across eight countries, including Japan, South Korea, China, Singapore, and the United States.
The papers, found on the preprint platform arXiv, have not yet gone through formal peer review, and most pertain to the field of computer science.
In one paper seen by the Guardian, hidden white text was placed immediately below the abstract.
Nikkei also reported on other papers that included the phrase “Don’t emphasize negativity,” with some offering precise instructions for the positive reviews expected.
The trend appears to trace back to a social media post by Jonathan Lorraine, a Canada-based research scientist at Nvidia, who suggested that authors facing harsh conference reviews from LLM-assisted reviewers could embed AI prompts of their own in response.
If humans peer-review the paper, the hidden prompts pose no problem; as one professor behind such a manuscript put it, they are a counter to “lazy reviewers” who rely on AI to do their peer-review work for them.
Nature surveyed 5,000 researchers in March and found that nearly 20% had tried using a large language model (LLM) to make their research faster and easier.
Biodiversity academic Timothée Poisot at the University of Montreal revealed on his blog in February that he suspected one peer review of his work was “blatantly written by an LLM,” because it contained telltale ChatGPT output: “here is a revised version of the improved review.”
“Writing a review using LLM indicates a desire for an assessment without committing to the effort of reviewing,” Poisot states.
“If you begin automating reviews, as a reviewer, you signal that providing reviews is merely a task to complete or an item to add to your resume.”
The rise of widely accessible commercial large language models poses challenges for various sectors, including publishing, academia, and law.
Last year, the journal Frontiers in Cell and Developmental Biology gained media attention for publishing an AI-generated image of a rat with anatomically impossible features.
Individuals who incorporate emojis in their messages to friends demonstrate greater attentiveness and responsiveness, independent of the specific emoji used.
Globally, emojis are utilized over 10 billion times daily, infusing emotional depth into digital exchanges. Nevertheless, the true impact these symbols have on conversational interpretation remains uncertain. While they are often seen in a positive light, emojis can sometimes lead to miscommunication. Recently, Eun Huh from the University of Texas at Austin sought to evaluate how emojis shape the perceptions of their senders.
In her research involving 260 U.S. participants, subjects viewed 15 text-based interactions and were prompted to envision them as dialogues with their closest friends. These conversations either featured emoji-enhanced responses or were solely text-based. After reviewing these exchanges, participants were surveyed on their sentiments toward the message sender.
Participants tended to perceive messages containing emojis as being more engaging compared to text-only responses. This perception of heightened responsiveness contributed to a more favorable view of the sender and suggested a stronger relational bond. Interestingly, this effect was consistent regardless of the emoji type, with no significant distinction between those representing emotions, like facial expressions, and neutral emojis.
“Emojis wield considerable power in either bridging or widening the psychological gap between the sender and the receiver,” said Shubin Yu at HEC Paris. However, his own research finds that while emojis enhance casual exchanges among friends, their use in serious contexts can backfire, making the sender appear less competent.
Nonetheless, Yu suggests that this issue is minimal in China, where “even during significant crises, sending emojis is acceptable.” He argues that emojis hold more significance in East Asian cultures, where nonverbal cues are essential for gauging tone in face-to-face conversations, contrary to more literal Western communication styles. Thus, in China, utilizing emojis during emergencies can convey warmth and make individuals feel more at ease.
There are around 500 million people worldwide living with diabetes. This condition, characterized by elevated blood sugar levels, can lead to various health complications such as periodontal disease, nerve damage, kidney disease, blindness, amputation, heart attack, stroke, and cancer if not managed properly.
Current treatments for diabetes include medications, insulin, and lifestyle changes. However, a new approach involving stem cell transplants shows promise in reversing the disease. The first successful treatment of a woman with type 1 diabetes using stem cells from her own body was recently reported. Previously dependent on significant insulin doses, she is now able to produce insulin naturally.
Similarly, a 59-year-old man with type 2 diabetes was able to reduce his reliance on insulin after a stem cell transplant in April. Although there are still challenges in scaling up this treatment, the initial results have been impressive.
Cancer vaccine
Ian Sample
Following the success of mRNA vaccines against COVID-19, scientists are now exploring the use of similar technology to develop cancer vaccines. These vaccines aim to train the immune system to target and destroy cancer cells by producing specific proteins.
Clinical trials for personalized mRNA cancer vaccines have shown promise in melanoma and other types of cancer. These vaccines may have long-lasting effects and could potentially prevent cancer recurrence in high-risk individuals.
AI helps detect cancer faster
Robert Booth
Artificial intelligence is being increasingly used to improve the early detection of serious illnesses such as lung cancer and brain tumors. This technology has shown to enhance diagnostic accuracy and efficiency, leading to better outcomes for patients.
Webb telescope reveals the cosmos
Hannah Devlin
The James Webb Space Telescope has provided stunning images of the universe, shedding light on the origins of stars, black holes, and the evolution of the cosmos. This powerful telescope opens up new possibilities for scientific discoveries and understanding the mysteries of the universe.
Renewable energy accelerates
Jillian Ambrose
The world is witnessing a rapid shift towards renewable energy sources. Projects utilizing renewable energy are expected to expand at a much faster pace, surpassing previous growth rates. This transition is crucial in reducing reliance on fossil fuels and combating climate change.
If you like video games, playing them might not be something you need to worry about.
Asia Vision/Getty Images
Despite being an oft-maligned pastime, playing video games actually seems to make people happier, a finding that comes from a unique study taking advantage of the peak of the COVID-19 pandemic.
“I think that if you enjoy a hobby, it has a positive effect on your health,” says Hiroyuki Egami at Nihon University in Japan.
In 2019, the World Health Organization added “gaming disorder” to the International Classification of Diseases. However, studies of how playing video games affects mental health have produced mixed results, with many unable to establish causation. Studies that aim to prove causation are usually conducted in controlled laboratory environments, which “are far removed from the experience of actually playing video games,” says Peter Etchells at Bath Spa University in the UK, who was not involved in the latest study.
But between 2020 and 2022, Egami and his colleagues had a rare opportunity to investigate the causal effects of video games on people's happiness in the real world. At the time, game consoles were in short supply, so lotteries were held in parts of Japan where people could enter to receive either a PlayStation 5 or a Nintendo Switch console.
The researchers surveyed 8,192 people aged between 10 and 69 who had entered such lotteries. Respondents answered questions about their gaming habits and levels of psychological distress, an indicator of mental health.
Egami and his colleagues found that people who won the lottery had slightly higher mental health scores than those who didn't, but that their scores plateaued once they exceeded about three hours of total play time per day.
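The logic of this lottery design can be sketched in a few lines: because consoles were allocated at random, a simple gap in average scores between winners and non-winners estimates the causal effect of console ownership. The numbers below are invented for illustration and are not the study's data:

```python
# Toy difference-in-means estimate for a lottery natural experiment.
# All values are invented for illustration; they are NOT the study's data.

won       = [1, 1, 1, 1, 0, 0, 0, 0]    # 1 = won a console in the lottery
wellbeing = [6.1, 5.8, 6.4, 6.0,        # winners' well-being scores
             5.5, 5.9, 5.4, 5.6]        # non-winners' well-being scores

winners     = [w for g, w in zip(won, wellbeing) if g == 1]
non_winners = [w for g, w in zip(won, wellbeing) if g == 0]

# Random assignment means the two groups differ only by chance, so the
# gap in their averages is an unbiased estimate of the causal effect.
effect = sum(winners) / len(winners) - sum(non_winners) / len(non_winners)
print(round(effect, 3))
```

Real analyses also adjust for covariates and for winners who never bought the console they won, but the random draw is what does the causal work, replacing the self-selection that plagues ordinary surveys of gamers.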
The team also used machine learning models to analyze the data and found that the effects varied by console type and owner demographics. For example, younger people who own a Nintendo Switch saw greater benefits compared to older people. The team also found that people without children saw greater benefits from owning a PlayStation 5 than those with children.
“This highlights the need to be nuanced and specific about what we measure and how we measure it if we want to understand how video games affect us,” Etchells said, though he said participants self-reported the amount of time they played, which may not be accurate.
Etchells and Egami also note that the data was collected during the peak of the COVID-19 pandemic, which may have influenced people's video game habits and health. Further research using this methodology could reveal whether the trends hold in other contexts.
Creative hobbies give us a sense of self-expression and progress.
Botanical Vision/Alamy
Engaging in arts and crafts improves mental health and the sense that life is worth living, with positive effects equal to or greater than those associated with being employed.
Decades of research have shown that health, income, and employment status are key predictors of people’s life satisfaction. But researchers at Anglia Ruskin University in the UK wanted to explore what other activities and circumstances might improve mental health. “Crafts are accessible, affordable, and already popular, so we were interested in finding out whether they have health benefits,” says Helen Keyes.
Keyes and her colleagues analyzed more than 7,000 responses to an annual survey that asks people in England about their involvement in activities such as arts and culture, sport, and internet use. Participants were also asked about their levels of happiness, anxiety, loneliness, and life satisfaction, and whether they feel their life is worth living.
More than a third of participants said they had done at least one arts or crafts activity in the past year, including pottery, painting, knitting, photography, filmmaking, woodworking, and jewelry making. The researchers found that engaging in arts and crafts was associated with higher scores across measures of mental health, even after accounting for factors such as health and employment status.
Although the increase was small (about 0.2 on a 10-point scale), crafting was a stronger predictor of feeling that life was worth living than factors that are harder to change, such as having a job.
“There’s something about making things that gives you a sense of progress and self-expression that you can’t get in a job,” Keyes says. “You can take real pride in what you make, and you can see the progress in real time.” The positive effect of creative activities on people’s sense that life is worthwhile was 1.6 times greater than that of being in employment.
Arts and crafts also increased happiness and life satisfaction, but did not produce significant changes in reported loneliness, which may be because many crafts can be done alone.
Promoting and supporting arts and crafts could serve as a preventative mental health strategy on a national scale, Keyes says: “When people do these activities, they enjoy them. It’s an easy win.”
A recent study has shown that eating baby carrots as few as three times a week can boost levels of beneficial carotenoids in the skin of young adults.
These findings suggest that making small changes to your diet can have a significant impact on your health.
Researchers at Samford University conducted a study that revealed how incorporating baby carrots into your diet can increase carotenoids in your skin, which have various health benefits. When baby carrots were combined with a multivitamin containing beta-carotene, the levels of carotenoids in the skin increased even more significantly.
Carotenoids are compounds responsible for the vibrant colors of fruits and vegetables like red, orange, and yellow. They can only be obtained through diet and are used as an indicator of fruit and vegetable intake.
According to Mary Harper Simmons, a master’s student in nutrition at Samford University and the study’s author, higher carotenoid intake means higher levels of antioxidants in the body, reducing inflammation and lowering the risk of chronic conditions such as cardiovascular disease.
Previous research has shown that consuming three times the recommended daily intake of fruits and vegetables for three weeks can increase carotenoids in the skin. This study aimed to create a convenient snack rich in carotenoids that people enjoy.
Results of the study were presented at the American Society for Nutrition’s annual meeting in Chicago. Participants were randomly assigned to dietary intervention groups: baby carrots, a multivitamin supplement, or a combination of both. The group that ate baby carrots saw a 10.8% increase in skin carotenoid scores, while the combination group saw a 21.6% increase.
Going forward, the research team plans to explore different populations and other carotenoid-rich foods like sweet potatoes and green leafy vegetables.
About our experts
Mary Harper Simmons: A master’s student in nutrition at Samford University and presenter of the talk “Effect of a 4-week intervention with baby carrots or a multivitamin supplement on skin carotenoid scores in young adults” at the NUTRITION 2024 conference.
My daughter is one of the kids the U.S. Surgeon General has warned about: children who have become “unwitting participants” in a “decades-long experiment,” whose “near-constant” use of social media poses mental health risks, feeding poor sleep, depression, and anxiety.
Before sixth grade, my daughter saved up her dog-walking money to buy a phone. She found a used iPhone 13 Mini on Craigslist. I set high expectations for her to get good grades, keep her room clean, take out the trash, etc. Little did I know then that the iPhone would systematically undermine her ability to accomplish these tasks and so much more.
When my daughter walked under an inflatable arch into her classroom on her first day of middle school, I took comfort in the fact that I could reach her. Like most parents, I associated my cell phone with safety, not danger. I didn’t know that social media developers were controlling her next swipe, or that her “human future” was being sold to the highest bidder, enriching the richest corporation in the history of mankind.
I learned the hard way: through my daughter’s lies and manipulation, her failing grades, the “zebra stripe” scars on her arms.
Her school photo from sixth grade captures my daughter in her “emo” phase: feather earrings, Pink Floyd T-shirt, crooked smile. The innocence of the photo was quickly replaced by selfies: selfies with pursed lips making a peace sign; selfies with her head tilted to one side, half-face, full-body; selfies in bed. Her camera roll records her degradation: selfies of her crying, selfies with swollen eyes, selfies of her unable to leave her bedroom.
By spring semester, my daughter’s grades were slipping. I assumed she had ADHD, so I took her to a psychiatrist for a psychiatric evaluation. The afternoon sun filtered through the faux-wood blinds, casting strips of light on the black hoodie she always wore. The doctor’s questions started out predictably: Can’t concentrate in class? Can’t finish your homework? Can’t sleep? Then the interview took a scary turn. Do you feel like your life isn’t worth living? Have you ever hurt yourself? Have you ever wanted to die?
My eyes widened as I watched my child’s profile and she answered, “Yes.” It tore my guts out.
Doctors diagnosed her with depression and anxiety. Further testing revealed that she spent 80% of her attention on gaining the approval of her peers. No wonder she was failing math. It was a miracle she was passing her classes when only 20% of her brain was dedicated to school.
The doctor prescribed therapy and Lexapro. These helped, but no one told me about the epidemic of cell phone addiction among middle schoolers. I later learned that my daughter is part of the first generation of 10-to-14-year-olds to grow up actively using social media. Among these girls, suicide rates have risen 151% and self-harm 182%. Her treatment assumed that her suffering was personal, not structural. In our country, we prescribe drugs to solve this societal crisis.
At the time, I was unaware of this and allowed my daughter to continue using social media. One day, I got a text message from another mother. I stared at the screen, wondering why this mother was sending me such a revealing selfie. Then I noticed a mole on the woman’s chest. It was my daughter’s.
When I showed the photo to my daughter, she gasped. She handed over her phone. I discovered she had circumvented screen limits and been on social media until the early hours of the morning. She had sent the image on Snapchat to someone named PJ. He claimed to be a 16-year-old boy, but his responses were so graphic I suspected he was older. I was horrified to learn that a cell phone is a two-way street and a platform that adults can use to abduct and traffic children.
I had a family meeting with my daughter, her father, and my mother-in-law. We agreed that my daughter would delete her social media accounts and get rid of her phone until the new school year started. After a summer of traveling, relaxing in person, and spending time with family, my daughter’s energy returned. The bags under her eyes faded, and she stopped sighing, shrugging, and rolling her eyes. She woke up and laughed. Sometimes she even wanted me to hug her.
It was hard to give my daughter’s phone back before seventh grade, but we had made a commitment. I wanted to reinforce her good behavior. I created new rules: no social media, no devices in the bedroom, turn off the phone at 8 p.m. Charge the phone on the kitchen counter. I bought an alarm clock and a sound machine. We endured a digital detox. My daughter started playing soccer. My insomnia was cured. We joined a gym and worked out together.
But within a few months, my daughter had relapsed again. Little lies, big lies. A friend’s mother sent me an email with selfies of her daughters vaping and hanging out at the mall with boys they’d never met. We had another family meeting.
“This might seem weird, but maybe my daughter doesn’t need a cell phone,” my mother-in-law said.
The words rippled through my mind. Why hadn’t I thought of that? Cell phones were destroying my daughter, yet I couldn’t imagine life without them. I had stayed loyal to the idea of the cell phone, its ideals. I was hooked on mine, too.
When I told my daughter she had lost her cell phone until high school, she threw a tantrum, protesting that she was the only child in her class without one. But once the tantrum subsided, she began to regain her composure. Then, within a few weeks, signs of her addictive behavior began to reappear.
I found an iPhone charger in the outlet next to her bed. She said it was to charge her AirPods. She threw herself on the ground to stop me from searching under the bed. Then one night, lying in bed thinking, it occurred to me: my daughter had two phones. She had cracked her Mini on a weight machine while working out, so I had bought her a new iPhone 13. I had confiscated the 13, but not the Mini.
When I asked her the next morning, she said, “I sold it to a friend at school.” She couldn’t tell me who she sold it to or how much she got for it.
“I’ll find it,” I said, making an I’m-watching-you gesture. Distraught but projecting calm confidence and a little humor, I went through backpacks and drawers, rifled through pockets, entered rooms unannounced, and tried to catch her in the act. My daughter remained calm the whole time I searched. I began to wonder if I had gone completely crazy. I bought a metal detector.
Then one night, I walked into my daughter’s room. She jumped up and pulled back the comforter. I ran to the bed and reached under the covers. The charging cord! My fingers traced its length to the plugged-in phone.
We stared at the Mini in my hands, the Snapchat app glowing beneath the cracked screen, and she looked at me, her eyes wide and filling with tears.
That night, my heart pounded against the pillow as I scrolled through her social media. Her communications were urgent and earnest. She begged one boy in particular, Damien, to get back to her. When he didn’t respond, she said she was depressed and began sexting him and sending him pictures of her breasts.
Through my sister, I found the answer in Johann Hari’s Stolen Focus. The book explores why and how our attention spans are declining: “The phones we own and the programs that run on them have been purposefully designed by the smartest people in the world to capture and hold the most of our attention.” Of course. My daughter was young and vulnerable to this manipulation. She measured her self-worth within a system that was attention-addicted and attention-starved at the same time. She had internalized an algorithm where provocative content wins. “The more outrageous something is, the more attractive it is,” Hari writes.
Our social experiment is being replicated in homes across the country. As parents, we want our kids to be safe. We want them to be able to contact us if a shooter comes to their school. But the biggest danger is inside the cell phone, not outside it.
One of the reasons our kids are addicted to their phones is because we are. My friends complain of insomnia, but they can’t imagine leaving their phones outside of their bedrooms. Addressing my kids’ phone use means addressing my own. I have to restrain myself from texting while driving. I’ve also stopped rushing to the charging station each morning to check if I’ve missed any messages.
After seventh grade, my daughter became that kid. Without a phone, she’s the kid who dribbles a soccer ball in the living room, races down the street on her skateboard, makes the honor roll and joins the track team. The kid who waves her hands while chatting with friends, braids her hair, falls asleep reading a book.
These days, we use my phone to plan outings together, listen to audiobooks, and sing along to her songs and mine (Shakira, Sade, Ice Cube, Fugees). Last weekend, we drove up the Pacific Coast Highway to visit family. As the June gloom settled over the shoreline, my daughter and I bodysurfed into the crashing waves. “Again!” she said, jumping up, enthralled by the feeling of the waves rolling under her belly.
My daughter is not the only child like this. A woman I met recently confiscated her 11-year-old son’s phone after she discovered him sexting. Since his school began requiring phones to be put away during the day, the once phone-sick middle-schooler has built community and learned to focus in class. The trend is spreading fast: children in the UK now learn mostly in “no-phone environments” following guidance from the Department for Education.
It will take both individual and systemic changes to check cell phone use. I’m curious to see where things stand when my daughter reaches high school.
Influencers like Andrew Tate have become synonymous with “toxic masculinity,” using a combination of motivational scoldings, fast cars, and demonstrations of sexual prowess to attract large audiences of young men and boys.
But what about the other side of the coin? Is anyone creating content with healthier messages for the same audience? And do men and boys even want to hear it?
Jago Sherman, head of strategy at the Goat Agency, an influencer subsidiary of marketing giant WPP, says there are plenty of male creators promoting self-love, self-expression, education, and the fight against knife crime, but they don’t always make the headlines.
“People like Andrew Tate are using social media to make far-reaching, unsubstantiated claims, as if they are providing a ‘quick-fix’ answer to a very complex problem. The problem, of course, is that these statements are most often not true, or are opinions disguised as facts.”
In a social environment where creators compete for attention, this ‘shock factor’ content that can be consumed and understood very quickly can sometimes perform better than longer, thought-provoking, neutral content.
Against this backdrop, Labour last week announced plans to promote a more positive vision of masculinity. Under the proposal, schools would develop mentors from among their own students to help counter the misogynistic vision promoted by Tate and others, and pupils would be supported in class to think more critically about what they see on screen.
Andrew Tate has been described as appearing to provide “off-the-cuff answers to very complex problems”. Photo: Robert Ghement/EPA
Some men offering a more positive vision of masculinity have already broken out and become famous in their own right. Fitness influencers like Joe Wicks, whose career began with his Instagram posts as The Body Coach, may not grab teenage boys the way shock content does, but simple advice delivered in a friendly, almost relentlessly cheerful manner can still garner millions of followers.
Perhaps the biggest symbol of this more assertive approach to masculinity is the philanthropic work of Russ Cook, known to many by his online moniker, the Hardest Geezer. If all goes to plan, he will complete his year-long attempt to run the length of Africa in April. Cook has raised around £200,000 for the Running Charity and Sandblast, and has amassed nearly 1 million followers across his various social platforms, conclusively proving the aptness of his username in the process.
But there’s an asymmetry in some of the debate around toxic influencers, said Saul Parker, founder of the Good Side, which works with charities and brands to help them achieve positive goals. While young women are encouraged to seek out positive role models for their own benefit, young men are often encouraged to seek them out in order to treat women better. That framing risks ignoring the harm toxic influencers do to boys and young men themselves, and undermines efforts to encourage them to find better people to learn from.
“There’s a generation of young men who were born into a very difficult conversation about patriarchy and its impact on women’s lives,” Parker says. “As a result, they’re in a place where they feel like third-class citizens. And accepting that young men are having a bit of a hard time and need help is difficult, especially on the left.”
This matters because focusing on misogyny, rather than on the broader traditional masculine norms in which the “manosphere” thrives, risks overlooking a second generation of pernicious post-Tate influencers. Boys learn through repetition that repeating the casual misogyny of someone like Tate in public is unacceptable; asked about him, they say they don’t like the way he talks about women, but often insist you should listen to him for “the other stuff.”
“David Goggins is the kind of guy we’re facing right now,” Parker said. “He’s a former Navy SEAL, he’s huge on every social platform, and all his content is about self-discipline and self-motivation. He says things like wake up in the morning, go to the gym, take a cold shower, be a man. But he never talks about women or sex.”
“Taking women out of the equation doesn’t make it any less of a problem. He just doesn’t have anything nasty to say, so it’s hard to find sharp points.”
In other words, attracting boys to a more positive vision of masculinity won’t happen by default. But nor should we lose hope. There is nothing inherent in boyhood that makes only toxic messages stick, and with a little work, better role models can break through.
Will 2024 be boom or bust for big tech companies? By one estimate, the industry has seen more than 7,500 layoffs since the start of the year, a spate of pink slips that many had hoped would end after the deep job cuts of 2023.
But as earnings season for major U.S. tech companies begins this week, some analysts are predicting strong numbers. This set of quarterly results may indicate that the industry has shed its pandemic-era hiring overhang and reorganized around cloud computing and AI, with cuts falling in areas where the outlook is less positive. Analysts bullish on AI say we are at the beginning of a tech bull market.
Since the beginning of this year, Google has laid off more than 1,000 employees in various departments. The job cuts are small compared to January last year, but Google CEO Sundar Pichai warned that more layoffs are coming. He told employees in an internal memo last week that Alphabet was “removing layers to simplify execution and increase speed in some areas.”
“We have ambitious goals and will invest in big priorities this year,” Pichai said in the memo, which was obtained by The Verge.
“The reality is that we have to make difficult choices to create the capacity for this investment,” he added, though the reductions “are not the size of last year's cuts and will not impact every team.”
The Alphabet Workers Union called the layoffs “needless” in a Wednesday post on X (formerly Twitter).
Amazon also announced new layoffs affecting hundreds of employees in its Prime Video and Amazon MGM Studios divisions. This is part of a move away from excessive spending on entertainment and a refocus on core priorities such as online shopping logistics and new businesses such as AI.
At Meta, where more than 20,000 layoffs were made last year, departmental cuts appear to have slowed, but have not stopped. Instagram eliminated its management layer in mid-January, cutting 60 technical program managers. Last year, the company announced it was adding employees to support “priority areas” and changing its workforce to include more “high-cost technical roles.”
And that may be the true story of the technology industry in 2024. If Wedbush analyst Dan Ives is right, the layoffs are almost complete and earnings season will be a time for a “popcorn break.”
“Not only will there be companies that benefit from the AI revolution, there will also be companies at a disadvantage. So companies will need to reduce costs in non-revenue-generating areas and double down on AI,” he says.
“This is more of a redistribution than anything else because 95% of the cost savings are in the rearview mirror. But the strong will get stronger and the weak will be exposed.”
But who holds the winning hand? Apple may be hoping to boost lagging sales with this month's launch of the Vision Pro headset and, down the line, new iPhones with generative AI capabilities. China's economic downturn has forced the company to cut prices on many smartphones and hope for a recovery.
Last week, Bank of America Securities analyst Wamsi Mohan expressed optimism about Apple's year ahead, suggesting that “promising AI capabilities” could lead to “an enhanced multi-year iPhone upgrade cycle.”
Ives said increased demand for enterprise software and cybersecurity, along with a surge in demand related to major AI projects, will be key this earnings season and will remain so as the AI revolution gains momentum.
Winners have already emerged. Last week, Microsoft surpassed Apple as the world's most valuable company for the first time since 2021, with a market capitalization of nearly $3 trillion. Microsoft cut 16,000 jobs from its 232,000-strong workforce last year, but Wedbush recently calculated that Microsoft's lead in AI could boost the company's revenue by $25 billion by 2025.
“The move to cloud and AI is having a huge impact on technology, including the reallocation of jobs and many changes at Apple and Google,” Ives said. “AI monetization has begun with Nvidia and Microsoft, and we believe we are seeing the start of a new tech bull market that began in the summer of 2023.”
Many of us have felt some amount of stress over the past few years. Exhibit A for me: my teeth. A recent trip to the dentist confirmed that I had been clenching my jaw for months during the pandemic, the result of the normal stress of deadlines compounded by the demands of two young children (with four broken bones between them).
A cracked tooth is small fry. Last year, the American Psychological Association found that two-thirds of people in the US reported feeling more stressed due to the pandemic, and predicted “a mental health crisis that could have serious health and social consequences for years to come.” Increased risks of diabetes, depression, and cardiovascular disease are all associated with high stress levels. Just thinking about it makes me feel stressed.
But maybe we just need to think about stress differently. At least, that's the surprising conclusion of researchers studying the mind-body relationship. They say there are natural benefits to feeling stressed, and that if we change our “stress mindset,” we may be able to turn things around so that stress has a positive impact on our lives. Fortunately, there are some simple hacks that can help, with payoffs including improved physical health, clearer thinking, greater mental resilience, and increased productivity.
There's no denying that too much stress can harm your body and mind. In the West, it has been linked to all six leading causes of death: cancer, heart disease, liver disease, accidents, lung disease, and suicide. It can also weaken your immune system, leaving you more susceptible to infections.
In the dynamic world of cryptocurrencies, industry leaders are optimistic about the beginning of a new bullish phase, with hopes rising for Bitcoin to reach an all-time high of over $100,000 in 2024.
Bitcoin has experienced an impressive rally of over 120% this year alone, and many enthusiasts believe this upward momentum will continue into next year.
Last week, Bitcoin ended around $37,450. Markets have experienced considerable volatility this week in the wake of the US Department of Justice’s settlement with Binance, the world’s largest cryptocurrency exchange. The announcement of the settlement and the resignation of Binance’s CEO caused the market to briefly decline, with BTC trading at $35,700 at one point. The negative sentiment was quickly followed by positive news, such as Binance not facing further regulatory action, contributing to a newfound stability in the market.
The new week opened with BTC trading at $40,665, a fresh high for the year.
2023 looks like the year of laying the groundwork for the coming bull market; expectations for 2024 and 2025 are high.
Despite the crypto industry facing challenges such as coin crashes, project failures, bankruptcies, and criminal trials, the recent high-profile cases involving exchanges like FTX and Binance are seen by some as a turning point. Some industry players believe the speculative phase is nearing an end, allowing a transition to constructive development and problem-solving in the cryptocurrency space.
The speculative phase appears to be over, leaving room for actual builders to focus on technology and problem-solving.
Attention now turns to positive developments. First, there is excitement about the potential approval of a Bitcoin exchange-traded fund (ETF). If approved, it could attract larger traditional investors and could be an important milestone in Bitcoin’s mainstream adoption.
The second notable development is the Bitcoin halving scheduled for May 2024. This event occurs every four years and cuts the rewards to miners in half, thereby limiting the supply of Bitcoin. Historically, this has been the catalyst for new rallies in the crypto market.
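The halving schedule described above follows from two well-known protocol constants: an initial block subsidy of 50 BTC and a halving every 210,000 blocks, which works out to roughly four years at ten minutes per block. A minimal sketch of the reward calculation:

```python
HALVING_INTERVAL = 210_000  # blocks between halvings
INITIAL_SUBSIDY = 50.0      # BTC per block when the network launched in 2009

def block_subsidy(height: int) -> float:
    """Miner reward (in BTC) for a block at the given height."""
    halvings = height // HALVING_INTERVAL
    return INITIAL_SUBSIDY / (2 ** halvings)

# The May 2024 halving (the fourth) drops the reward from 6.25 to 3.125 BTC.
print(block_subsidy(0))        # 50.0
print(block_subsidy(840_000))  # 3.125
```

Each halving mechanically slows the rate at which new Bitcoin enters circulation, which is why the event is watched as a potential supply-side catalyst.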
Investors are closely monitoring these developments, with particular focus on potential ETF approval and the upcoming halving. Matteo Greco, research analyst at Fineqia International (CSE:FNQ), a listed digital asset and fintech investment business, pointed out:
“Approval of a US-based Bitcoin spot ETF is likely not only to bring in capital inflows but also to inject significant liquidity into the market, fostering more stable prices and facilitating more favorable trading in both digital assets and the financial products that incorporate them.”
Bold predictions for Bitcoin in 2024 have already surfaced, with some ETF proponents forecasting that Bitcoin could reach $100,000 by the end of 2024, a roughly 160% increase from the current price.
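As a rough sanity check on that figure (taking an illustrative current price of about $38,500, in the range reported above), the percentage increase works out as follows:

```python
current_price = 38_500   # illustrative BTC price, roughly the level cited above
target_price = 100_000   # the predicted end-of-2024 price

pct_increase = (target_price - current_price) / current_price * 100
print(f"{pct_increase:.0f}%")  # prints "160%"
```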
Moreover, Matrixport, a cryptocurrency financial services firm, expects the price to reach $63,140 by April 2024 and a whopping $125,000 by the end of next year. Its report highlights an expected drop in inflation and possible interest rate cuts by the Federal Reserve as factors that could push Bitcoin to new all-time highs in 2024.
As the cryptocurrency landscape evolves, industry leaders and investors alike are looking forward to a transformative year full of potential milestones and new heights for Bitcoin.