Tim Berners-Lee at a rack in the CERN computer center
Maximilien Brice/CERN
Tim Berners-Lee holds a map of the internet on a single page: around 100 blocks linked by arrows of various kinds. The blocks cover blogs, podcasts, and group messages, along with abstract themes like creativity, collaboration, and clickbait, a singular depiction of the digital realm from the inventor of the World Wide Web.
“Most of them are good,” he remarked when we spoke at New Scientist’s London office, surveying the web’s successes and failures. The map serves as a guide for others and as a reminder that only a small fraction of the internet is genuinely detrimental to society. The top-left quadrant captures Berners-Lee’s concerns: six blocks marked “Harmful,” bearing names like Facebook, Instagram, Snapchat, TikTok, X, and YouTube.
In the 35 years since its creation, Berners-Lee’s invention has grown from a single user (himself) to approximately 5.5 billion, about 70% of the global population. It has transformed how we communicate and shop; modern life without it is hard to imagine. Yet the list of problems it has spawned keeps growing.
Misinformation, polarization, and election interference have become staples of online discourse, a far cry from Berners-Lee’s vision of a collaborative utopia. In his memoir, This Is for Everyone, he reflects: “In the early days of the web, joy and wonder were abundant, but today’s online experience can induce just as much anxiety.”
It would be natural for the web’s architect to feel let down by what humanity has done with his creation, yet he remains hopeful about the internet’s future. As one of technology’s most celebrated visionaries (the accolades and honors run long), he shared his thoughts on what went wrong and how he would fix it.
Invention of the Web
The World Wide Web’s origin story hinges on being in the right place at the right time. In the late 1980s, Berners-Lee was working in the computing and networking division at CERN, the particle physics laboratory near Geneva, Switzerland, where he found himself pondering better ways to manage documents.
Most existing systems forced users into rigid organizational structures and strict hierarchies. Berners-Lee envisioned something more flexible, in which users could link documents however they liked. Hypertext links already existed for references within documents, and the internet was already available for sharing files between machines; why not merge the two? That simple yet transformative idea became the World Wide Web.
Berners-Lee had been pitching the idea since 1989, and his supportive supervisors eventually let him pursue it in earnest. Within months, he had produced HTML, the markup language for writing web pages; HTTP, the protocol for transferring them; and the URL, the means of locating them. The whole thing came to just 9,555 lines of code. By year’s end, the web was born.
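Those three inventions still underpin every page load. As a rough illustration (in TypeScript, using a modern runtime’s fetch API rather than Berners-Lee’s original 1990 code), here is how they fit together:

```typescript
// A minimal sketch of URL, HTTP, and HTML working together.

// The URL names where a document lives; this one is the web's first site.
const url = new URL("https://info.cern.ch/hypertext/WWW/TheProject.html");

// HTTP is the protocol that transfers the document across the internet...
const response = await fetch(url);
const html = await response.text();

// ...and HTML is the markup the page is written in: the <a href=...>
// anchor tags inside it are the hyperlinks that made the web a web.
console.log(html.slice(0, 200));
```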
“CERN was an extraordinary place to invent the web,” he says. “Individuals from around the world, driven by a genuine need to communicate and document their experiences, came together there.”
The inaugural website was hosted on Berners-Lee’s work computer, adorned with a “Do Not Turn Off” sign and instructions for using the web. More web servers appeared, and growth became exponential. “In the first year, it grew tenfold; in the second year, another tenfold; and by the third, yet another tenfold,” he recalls. “Even then, I sensed we were onto something significant.”
Initially, most web pages were crafted by academics and developers, but soon everyone was using the web to share all manner of content. Within a decade, there were millions of websites, hundreds of millions of users, and the inevitable rise of dot-com ventures.
The Spice Girls with their website in 1997.
David Corio/Redferns
Despite the web’s immense commercial potential, Berners-Lee believed it had to remain free and open to realize its full capabilities. That was no small ask, as CERN had legitimate grounds to claim royalties on the software being developed. Berners-Lee lobbied his superiors to release the technology openly, and in 1993, after much negotiation, the web’s complete source code was made public with a declaration that CERN relinquished all intellectual property rights to it. The web would be royalty-free forever.
Early Days
For its first years, the web flourished. Yes, there was a notorious stock market crash at the turn of the millennium (driven largely by speculative venture capital rather than the web itself), piracy was rampant, and malware was ever-present, but the web was fundamentally open, free, and fun. “People loved the web; they were simply happy,” Berners-Lee writes in his memoir.
He captured the essence of that era in his belief that the web could foster new forms of collaboration, coining the term “intercreativity” for the creative output of groups rather than individuals. Wikipedia, with around 65 million English pages edited by 15 million contributors, exemplifies what he envisioned for the web. It features prominently on his map, and he describes it as “probably the best single example” of his aspirations.
The web’s optimistic phase was not to last, however. For Berners-Lee, the turning point came in 2016, with the Brexit vote and the election of Donald Trump. “At that moment, discussions arose about how social media could be manipulated to influence voters against their interests. In essence, the web became an instrument of manipulation driven by larger entities,” he says.
Traditionally, political movements broadcast their messages to the public openly, where they could be critiqued and debated. By the mid-2010s, however, social media enabled what Berners-Lee calls “narrowcasting”: tailoring a political message into countless versions for different audiences. That makes it hard to track who said what to whom, and harder still to counter misinformation.
The extent of this microtargeting’s impact on elections remains debated. Numerous studies have tried to quantify how such messaging alters public opinion and voting behavior, generally uncovering only modest effects. Regardless, these trends contribute to Berners-Lee’s broader concerns about social media.
Social media platforms, he emphasizes, are incentivized to keep users engaged, which leads to “addictive” algorithms. “People are naturally drawn to things that evoke anger,” he says. “When social media feeds users misinformation, it’s more likely to garner clicks and ensnare users longer.”
Citing the author Yuval Noah Harari, he argues that the creators of “harmful” algorithms should be held accountable for what those algorithms recommend. “It’s particularly essential to undermine systems designed to be addictive,” Berners-Lee says. He admits that imposing restrictions runs against his usual free-and-open philosophy, and views it as a last resort. Social media can unite people and spread ideas, yet it poses risks distinctive enough to warrant change, as he writes in his latest book: “This must evolve somehow.”
Nonetheless, he is optimistic about where the web could go. Social media, for all the attention it commands, is only one corner of the internet, and Berners-Lee contends that fixing it should be part of a broader strategy for improving the web as a whole, one focused on reclaiming digital sovereignty for users.
A Plan for Universal Web Access
To that end, Berners-Lee has spent the past decade developing a new framework that puts control back in the hands of the individual. At present, personal data is scattered across platforms that each manage it separately: it is difficult, for instance, to share a video from Snapchat on Facebook, or a post from LinkedIn on Instagram. The user creates the content, yet each company retains ownership of it.
Berners-Lee’s alternative consolidates that data into a single repository called a pod (short for “personal online data store”) that the user controls, rather than leaving information dispersed across platforms. A pod can hold everything from family photos to medical records, with the user deciding what to share and with whom. This isn’t merely theoretical: he co-founded a company, Inrupt, to bring the vision to life.
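As a rough sketch of the concept (illustrative TypeScript only; this is not Inrupt’s actual API or the Solid protocol), a pod is a single user-owned store in which read access is granted explicitly, per resource and per app:

```typescript
// Illustrative only: a user-owned store with per-resource access grants.
class Pod {
  private resources = new Map<string, unknown>();
  private grants = new Map<string, Set<string>>(); // resource path -> app IDs

  write(path: string, data: unknown): void {
    this.resources.set(path, data); // the owner can always write
  }

  grantRead(path: string, appId: string): void {
    if (!this.grants.has(path)) this.grants.set(path, new Set());
    this.grants.get(path)!.add(appId); // consent is explicit and revocable
  }

  read(path: string, appId: string): unknown {
    if (!this.grants.get(path)?.has(appId)) {
      throw new Error(`${appId} has no access to ${path}`);
    }
    return this.resources.get(path);
  }
}

// The same photo can be shared with two apps; neither one owns it.
const pod = new Pod();
pod.write("/photos/family.jpg", "<image bytes>");
pod.grantRead("/photos/family.jpg", "photo-viewer");
pod.grantRead("/photos/family.jpg", "social-app");
console.log(pod.read("/photos/family.jpg", "photo-viewer"));
```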
Berners-Lee using an early version of the website and web browser he invented, at CERN in 1994
CERN
He is particularly enthusiastic about combining data wallets with artificial intelligence. Ask a current AI chatbot to find you a pair of running shoes, for example, and it needs detailed guidance before it can make suitable recommendations. An AI with access to your data wallet, by contrast, would already know your past measurements, your training history, and perhaps your spending habits, and could offer far more accurate suggestions.
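A sketch of what that might look like (the wallet fields and function here are hypothetical, invented for illustration, and reflect nothing of Inrupt’s actual design):

```typescript
// Hypothetical wallet shape: the point is that consented personal data
// replaces the lengthy back-and-forth a chatbot would otherwise need.
interface Wallet {
  shoeSizeEu: number;        // past measurements
  weeklyMileageKm: number;   // training history
  recentPurchases: string[]; // spending behavior
}

// Build one well-grounded query instead of twenty clarifying questions.
function buildShoePrompt(wallet: Wallet): string {
  return [
    "Recommend a pair of running shoes.",
    `EU shoe size: ${wallet.shoeSizeEu}.`,
    `Typical weekly mileage: ${wallet.weeklyMileageKm} km.`,
    `Recently bought: ${wallet.recentPurchases.join(", ")}.`,
  ].join("\n");
}

console.log(buildShoePrompt({
  shoeSizeEu: 43,
  weeklyMileageKm: 40,
  recentPurchases: ["trail shoes (2023)", "running socks"],
}));
```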
Berners-Lee argues that AI should serve users, not large tech corporations. His goal isn’t to build everyone their own AI, but to establish safeguards within the software. Data wallets are one part of the solution; another is the idea that AI should adhere to a kind of digital Hippocratic oath to avoid causing harm. He envisions AI acting as “your personal assistant,” providing tailored support.
Recommending suitable running shoes may not solve the web’s most pressing problems, but Berners-Lee has an exceptional record of seeing potential before others do. Data wallets might seem mundane today; a few decades ago, hyperlink-based document management systems seemed equally obscure. What drives him is a desire to better the world, and he believes fixing the data ecosystem is crucial to that goal.
All of this points to the fundamental shift Berners-Lee envisions for the web: a transition from an “attention economy,” in which companies compete for clicks, to an “intention economy,” in which users express what they need and companies, along with AI, strive to fulfill it. “This is more empowering for the individual,” he asserts.
Such a transformation would redistribute power from tech giants to users. Some might call that unlikely, given the entrenched dominance of big tech and the pervasive culture of doomscrolling. But Berners-Lee has a proven history of spotting opportunities others miss, and he is, after all, the one holding the map.
Source: www.newscientist.com
