The Information Age: From the Ashes of the World War to Leaps Across the Abyss
In February 1948, in a Sydney laboratory, an engineer named Trevor Pearcey wrote a sentence that still resonates: "It is not inconceivable that an automatic encyclopaedic service operated through the existing telephone system will one day exist."
The Internet. Predicted from Australia. Forty years before the World Wide Web.
This period — from 1945 to 2010 — is when dreams became machines. Shannon's circuits took form in microprocessors. Turing's universal machine became the personal computer.
Boolean logic became the Internet. And the Dartmouth dream — simulating human intelligence — passed through summers of euphoria and winters of disillusionment before being reborn, transformed.
But this era was also one of leaps across the abyss.
Africa, where fixed-line networks never reached most people, jumped directly to mobile payments. M-Pesa preceded Apple Pay.
India, having missed the hardware turn, leaped toward software services — the Y2K bug became its launchpad.
Taiwan invented the "pure-play foundry" and became the world's silicon shield.
Israel, a six-year-old nation surrounded by enemies, built one of the world's first computers and became the "Startup Nation."
You will traverse six continents.
Sixty-five years of history. From the secret laboratories of Bletchley Park to the Tel Aviv apartments where ICQ was born. From the ruins of World War II to the servers of Google Maps, born in Sydney.
Everywhere, the same question: who inherits the digital revolution — and who is excluded from it?
The AI winter is over.
The summer now opening will be the longest in history. But to understand where we are going, we must first understand where we came from.
Welcome to A Brief History of AI, season 5.