The World Wide Web just turned 35, so please stop calling it the internet
The web has changed vastly since its inception in 1989, with big data, AI and faster connections transforming how people use it
It's unclear whether Sir Tim Berners-Lee grasped the magnitude of what he was setting in motion when he authored his 1989 paper, "Information Management: A Proposal". But it was undoubtedly a transformative moment, one that has reshaped society and business in profound ways.
35 years on from that modest proposal, interconnected systems all around the world power large-scale big data analytics workloads, cloud-enabled quantum computing and artificial intelligence (AI) agents built into everyday software, such as Microsoft's Copilot. There may yet be further room for growth, with the metaverse and holographic projection possibly next in line as data transmission capabilities increase over the coming years.
Although the web was first proposed in Berners-Lee's paper, its building blocks had been laid a few years beforehand, when the US Department of Defense's Arpanet adopted TCP/IP. That network eventually evolved into the internet that carries the web we use today, but it was a simple idea then, and it pales in comparison with the intricately connected systems that now govern every aspect of our lives.
The post-AI internet
The amount of data on the internet, for example, has surged, especially in the last few years. In 2018, IDC predicted that the world's data would swell from 33ZB (a zettabyte is one billion terabytes) to 175ZB by 2025. Other estimates put the figure even higher, with Statista suggesting 64ZB in 2020, ballooning to 181ZB by next year.
The form that data takes also matters. With high-speed connections becoming mainstream, video has reportedly grown to the point where it now represents 53.72% of all content online.
Now, however, there are also rising concerns over how much of the internet is real. A "shocking amount" of the content on the web is machine-generated, according to scientists at Amazon Web Services (AWS) in a recently published paper: more than half (57.1%) of all sentences on the web have been translated into two or more other languages, suggesting AI tools such as large language models (LLMs) were used to create and translate them. It reinforces reporting by 404 Media showing that Google News is inadvertently promoting AI-generated content.
If Sir Tim has his way, the web's future may rest on a new protocol known as Solid. Spearheaded by his company Inrupt, it leans on Web 3.0 principles and prioritizes user privacy. Whether the idea takes off, however, remains to be seen.