17 million websites, 360 million users, and almost no way to interact
By 2000, over 17 million websites existed and 360 million people were browsing them. Almost none of those users could do anything but read.
This was Web 1.0: a global library where you could look but not touch. The infrastructure we take for granted today—URLs, HTTP, HTML—was built during this era. So was the centralization that Web 3 now tries to undo.
The first website went live on August 6, 1991. It looked like a text document with blue hyperlinks—because that's essentially what it was.
Early websites had severe limitations: pages were static documents, and visitors had no way to contribute, comment, or publish anything of their own.
The internet was a collection of documents. You could read them. That was it.
Three technologies made it work:
HTML (1991): HyperText Markup Language let browsers combine text, images, and links into pages. Functional for distributing content, but limited in presentation.
CSS (1994): Cascading Style Sheets, proposed in 1994 and standardized as CSS1 in 1996, separated content from presentation. Designers could finally control how pages looked without restructuring the HTML.
JavaScript (1995): Created at Netscape, it added interactivity to otherwise static pages: dropdown menus, form validation, dynamic content. Microsoft shipped a compatible implementation in Internet Explorer, which went on to control over 90% of the browser market, and that ubiquity made JavaScript the standard.
These three still dominate web development today.
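To make the split of responsibilities concrete, here is a minimal sketch of a late-1990s-style guestbook page that uses all three: HTML for structure, an inline CSS block for presentation, and a few lines of JavaScript for form validation. The page, the /cgi-bin/sign endpoint, and the field names are hypothetical, chosen only to illustrate the division of labor, not taken from any real site.

```html
<!DOCTYPE html>
<html>
<head>
  <title>Guestbook</title>
  <!-- CSS: presentation rules kept apart from the markup they style -->
  <style>
    body { font-family: serif; }
    a    { color: blue; }  /* the classic blue hyperlink */
    .err { color: red; display: none; }
  </style>
  <!-- JavaScript: a small client-side check before the form is submitted -->
  <script>
    function validate(form) {
      // Stop submission if the visitor left the name field empty
      if (form.visitorName.value === "") {
        document.getElementById("nameError").style.display = "inline";
        return false;
      }
      return true;
    }
  </script>
</head>
<body>
  <!-- HTML: the document structure the reader actually sees -->
  <h1>Sign the guestbook</h1>
  <p>Read the <a href="entries.html">existing entries</a>.</p>
  <form action="/cgi-bin/sign" method="post" onsubmit="return validate(this)">
    <input type="text" name="visitorName">
    <span id="nameError" class="err">Please enter a name.</span>
    <input type="submit" value="Sign">
  </form>
</body>
</html>
```

In practice the style and script blocks would often live in separate .css and .js files, which is exactly the separation of content from presentation that CSS introduced.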
The early web carried utopian aspirations: free information, borderless connectivity, decentralized access. Tim Berners-Lee and others envisioned a democratized medium where anyone could publish and anyone could read.
That vision ran into economics.
Building and maintaining websites required infrastructure, expertise, and money. Large organizations—CERN, IBM, Microsoft, universities—did the heavy lifting. The physical and digital infrastructure they built created gaps that later companies would fill in monopolistic ways.
The late 1990s saw explosive growth. Investors threw money at anything with ".com" in the name. The dot-com bubble inflated, then burst in 2000.
But bubbles, despite their reputation, drive adoption. The infrastructure built during the frenzy didn't disappear when stock prices crashed. It became the foundation for what came next.
The companies that survived or emerged from the wreckage—Google (1998), Amazon (1994), later Facebook (2004)—learned from Web 1.0's limitations. They built platforms where users didn't just consume content. They created it.
Web 1.0 was necessary infrastructure. It proved the concept: global, instant information access worked. The protocols still run the internet. The idealism—free, decentralized, open—still animates movements like Web 3.
But the era also demonstrated how quickly open systems centralize. The web was supposed to democratize publishing. Instead, a handful of companies became the new gatekeepers. That pattern would intensify dramatically in Web 2.0.
The foundations were laid. What got built on them is the subject of Web 2.0.