How Computers Shrank From Rooms to Pockets

HOOK (0:00-0:30)

[IMAGE: Early ENIAC computer filling entire room] This room-sized monster from 1945 had less computing power than the calculator app on your phone. Seriously.

[IMAGE: Modern smartphone] That supercomputer in your pocket is, by most estimates, about a million times more powerful than all the computing power NASA used to put astronauts on the moon in 1969.

[FOOTAGE: Time-lapse of computer evolution] So how did we go from machines that filled entire buildings to devices that fit in our pockets in just a few generations? That's what we're exploring today.

INTRODUCTION (0:30-1:00)

Hey everyone! Today we're diving into the history of computers. We're going back to a time when 'computer' meant a person doing complex math calculations by hand, and tracing the path all the way to the pocket supercomputers we carry around today.

[TEXT ON SCREEN: The History of Computers]

Understanding how we got here isn't just about appreciating old technology – it's about seeing the incredible human ingenuity that built our digital world. From mechanical calculators to artificial intelligence, we'll cover the most important milestones that brought us to today.

And yes, we'll explain why your grandparents still call tech support "the Geek Squad" even when they're calling Apple. Some things never change.

PRE-ELECTRONIC COMPUTING (1:00-2:30)

[IMAGE: Ancient abacus] Humans have been trying to make calculation easier since, well, forever. The abacus dates back to around 2700 BCE – that's nearly 5,000 years ago. Imagine trying to calculate your taxes on this thing!

[IMAGE: Antikythera mechanism] But the real mind-blower from ancient times is the Antikythera mechanism, discovered in a shipwreck off Greece. Dating to around 100 BCE, this mechanical device could predict astronomical positions and eclipses with remarkable accuracy. It's essentially a 2,000-year-old analog computer.

[IMAGE: Pascal's calculator] Fast forward to 1642, when French mathematician Blaise Pascal invented a mechanical calculator to help his tax collector father. I guess complaining about taxes is truly timeless.

[FOOTAGE: Animation of Babbage's Difference Engine] The real breakthrough in mechanical computing came from Charles Babbage in the 1830s. His Difference Engine was designed to calculate mathematical tables automatically, and his later Analytical Engine design contained all the essential elements of modern computers: an input device, memory, a processor, and output. The only problem? Babbage never managed to build either one; 19th-century precision engineering and his funding simply weren't up to the job.

[IMAGE: Portrait of Ada Lovelace] Enter Ada Lovelace, daughter of the poet Lord Byron and a brilliant mathematician. She wrote what's considered the first computer algorithm for Babbage's Analytical Engine – before the machine even existed! Lovelace understood that computers could manipulate symbols and not just numbers, essentially predicting modern computing. Not bad for someone born in 1815!

EARLY ELECTRONIC COMPUTERS (2:30-4:30)

[FOOTAGE: ENIAC computer with programmers] The first fully electronic general-purpose computer was ENIAC – the Electronic Numerical Integrator and Computer – completed in 1945. It weighed 30 tons, contained nearly 18,000 vacuum tubes, and drew around 150 kilowatts of power. The story that it dimmed the lights across Philadelphia whenever it was switched on is probably a legend, but it stuck for a reason. Talk about an energy hog!

[IMAGE: ENIAC programmers] ENIAC's first programmers were six women: Kathleen McNulty, Jean Jennings, Betty Snyder, Marlyn Wescoff, Frances Bilas, and Ruth Lichterman. They had to physically rewire the machine to program it. Imagine having to rearrange your computer's hardware every time you wanted to run a different program!

[IMAGE: Harvard Mark I] Meanwhile, IBM was working with Harvard on the Mark I, an electromechanical computer that used relays instead of vacuum tubes. It sounded like "a roomful of ladies knitting" according to witnesses. Much more pleasant than today's data centers, which sound like jet engines preparing for takeoff.

[FOOTAGE: Colossus computer] World War II dramatically accelerated computer development. At Bletchley Park in England, Alan Turing helped design the electromechanical Bombe machines that cracked the Enigma cipher, while engineer Tommy Flowers built Colossus, an electronic machine that broke the even tougher Lorenz cipher. Historians estimate that Bletchley's codebreaking shortened the war by around two years. Next time someone questions the value of computers, remind them of that.

[IMAGE: Von Neumann architecture diagram] John von Neumann's 1945 paper on computer architecture established the stored-program concept – where both data and instructions are stored in the computer's memory. Almost all computers today still use von Neumann architecture. If he had patented it, he'd be making Apple look poor.
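
To make the stored-program idea concrete for an on-screen graphic, here's a minimal Python sketch (an illustration, not any historical machine): instructions and data sit side by side in one memory list, and a tiny fetch-decode-execute loop runs them. The instruction names are invented for this demo.

```python
# Toy stored-program machine: instructions and data share the same memory,
# which is the core of the von Neumann architecture. Opcodes here are made up.
memory = [
    ("LOAD", 6),      # 0: copy the value at address 6 into the accumulator
    ("ADD", 7),       # 1: add the value at address 7
    ("PRINT", None),  # 2: show the accumulator
    ("HALT", None),   # 3: stop
    None,             # 4: unused cell
    None,             # 5: unused cell
    2,                # 6: data
    3,                # 7: data
]

accumulator = 0
pc = 0  # program counter: which memory cell to execute next

while True:
    op, arg = memory[pc]  # fetch the next instruction from memory
    pc += 1
    if op == "LOAD":
        accumulator = memory[arg]
    elif op == "ADD":
        accumulator += memory[arg]
    elif op == "PRINT":
        print(accumulator)  # prints 5
    elif op == "HALT":
        break
```

Because the program lives in memory like any other data, changing what the machine does just means writing different values into that list – no rewiring required, which is exactly what ENIAC's programmers didn't have.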

MAINFRAMES TO MINICOMPUTERS (4:30-6:00)

[FOOTAGE: IBM mainframe with operators] The 1950s and '60s were the era of mainframes – massive computers that served entire organizations. IBM dominated this market with machines like the IBM 704 and later the System/360 series.

[IMAGE: IBM System/360] The System/360, introduced in 1964, was the first family of computers that could run the same software across different models. Before this, upgrading your computer meant rewriting all your software. Imagine having to buy all new apps every time you got a new phone!

[IMAGE: Transistor close-up] What made smaller computers possible was the invention of the transistor at Bell Labs in 1947. Transistors replaced vacuum tubes, which were bulky, energy-hungry, and prone to burning out – kind of like that one string of Christmas lights where if one bulb goes out, they all do.

[B-ROLL: Close-up of integrated circuit manufacturing] By the 1960s, multiple transistors could be integrated onto single silicon chips, creating integrated circuits. These dramatically reduced the size, cost, and power consumption of computers while increasing their reliability.

[IMAGE: DEC PDP-8] Companies like Digital Equipment Corporation (DEC) began producing minicomputers like the PDP-8, introduced in 1965. At $18,000 – about $160,000 in today's money – it was still expensive, but much cheaper than mainframes that cost millions. The PDP-8 was about the size of a refrigerator rather than a room. Progress!

PERSONAL COMPUTING REVOLUTION (6:00-8:00)

[IMAGE: Altair 8800] The first personal computer that gained widespread popularity was the Altair 8800, featured on the cover of Popular Electronics in 1975. It had no keyboard or monitor – you programmed it using switches and lights. Not exactly user-friendly.

[FOOTAGE: Young Bill Gates and Paul Allen] Two young guys named Bill Gates and Paul Allen saw that Altair cover and immediately recognized an opportunity. They called the manufacturer and claimed they had a BASIC programming language ready for it – which wasn't true. They then frantically created it in a few weeks, and Microsoft was born. Sometimes the best business strategy is "fake it 'til you make it."

[IMAGE: Apple I computer] Meanwhile, in a garage in California, Steve Wozniak and Steve Jobs were creating the Apple I computer. When it debuted in 1976, it was basically just a circuit board – you had to add your own case, keyboard, and monitor. But at least it was more user-friendly than the Altair.

[FOOTAGE: Xerox PARC GUI demonstration] The real revolution in computing usability came from Xerox's Palo Alto Research Center (PARC). They developed the graphical user interface (GUI) with windows, icons, menus, and a mouse. Famously, Steve Jobs visited PARC in 1979 and, well, "borrowed" these ideas for Apple's products.

[IMAGE: First Macintosh computer] The 1984 Macintosh was the first commercially successful computer with a GUI. Its famous Super Bowl ad showed a woman smashing a screen representing "Big Brother" IBM. Talk about dramatic marketing!

[IMAGE: IBM PC] But IBM wasn't going away. The 1981 IBM PC established a standard architecture that other manufacturers could clone, creating a massive ecosystem of compatible hardware and software. IBM designed it in just one year by using off-the-shelf parts and an operating system licensed from a tiny company called Microsoft. That decision would later come back to haunt IBM.

[FOOTAGE: MS-DOS command line] Microsoft's DOS operating system dominated the business world through the 1980s, though it required users to memorize commands like "C:\>DIR" instead of just clicking icons. Computer users in the '80s weren't just users – they were part-time programmers whether they wanted to be or not.

INTERNET AGE (8:00-9:00)

[ANIMATION: ARPANET growth map] While personal computers were evolving, another revolution was brewing: networking. The ARPANET, funded by the U.S. Department of Defense, connected its first four computer nodes in 1969. Its decentralized design drew on Cold War research into networks that could survive an attack, though the ARPANET itself was built mainly so researchers could share scarce, expensive computers. Paranoia and practicality both lead to innovation!

[IMAGE: TCP/IP diagram] The protocols that power today's internet – TCP/IP – were standardized in 1983. They allow different types of computers to communicate, kind of like a universal language for machines.
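
If you want a quick on-screen demo of that "universal language," here's a minimal Python sketch of two programs exchanging a message over TCP on a single machine – the port number (50007) and the text are arbitrary choices for the example.

```python
import socket
import threading

# Set up the listening side first so the client can't connect too early.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 50007))  # localhost; port chosen arbitrarily
server.listen(1)

def serve_one_client():
    conn, _ = server.accept()        # wait for one connection
    with conn:
        data = conn.recv(1024)       # read the client's bytes
        conn.sendall(b"Hello, " + data)

thread = threading.Thread(target=serve_one_client)
thread.start()

# The "client" side: connect, send bytes, read the reply.
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as client:
    client.connect(("127.0.0.1", 50007))
    client.sendall(b"1983!")
    print(client.recv(1024).decode())  # prints: Hello, 1983!

thread.join()
server.close()
```

The same handful of calls – connect, send, receive – works whether the endpoints are a phone and a web server or two processes on one laptop, which is the whole point of a shared protocol.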

[FOOTAGE: Tim Berners-Lee] But the internet didn't become user-friendly until 1989, when Tim Berners-Lee at CERN proposed the World Wide Web. He invented URLs, HTML, and HTTP – basically all the tech that lets you click links and view web pages. And unlike many innovators in this story, he decided to make it all free and open to everyone.

[IMAGE: Mosaic browser] The first popular web browser was NCSA Mosaic, released in 1993. Its creator, Marc Andreessen, went on to develop Netscape Navigator, which dominated the early web. Microsoft initially dismissed the internet, but then frantically played catch-up with Internet Explorer. The resulting "browser wars" were brutal.

[FOOTAGE: Dot-com era offices with extravagant perks] The late 1990s saw the dot-com boom, with companies reaching billion-dollar valuations without making a penny in profit. Pets.com spent millions on Super Bowl ads only to go bankrupt a few months later. The party ended with the dot-com crash in 2000, proving that the laws of economics still apply to the internet, unfortunately.

MODERN ERA (9:00-10:00)

[IMAGE: First iPhone] If the PC was the first computing revolution and the internet was the second, mobile represents the third. When Steve Jobs introduced the iPhone in 2007, he was essentially putting a computer more powerful than those that ran entire businesses in the 1980s into people's pockets.

[FOOTAGE: Amazon Web Services data center] Meanwhile, cloud computing was changing how businesses use technology. Amazon Web Services, launched in 2006, allowed companies to rent computing power instead of buying their own servers. Today, Netflix, Airbnb, and countless other services run on AWS infrastructure.

[B-ROLL: Modern AI applications] Artificial intelligence, particularly machine learning, has exploded in the last decade. Systems like GPT-4 can generate human-like text, while computer vision systems can recognize objects better than humans in some tasks. We've come a long way from calculators!

[FOOTAGE: Quantum computer] And the future? Quantum computing promises to tackle problems that are practically impossible for traditional computers. Thanks to quantum superposition, a quantum computer's qubits can encode a huge number of possibilities at once, letting certain algorithms find answers far faster than any classical machine could. It's a bit like having parallel universes doing your computing for you. No pressure, future engineers.

CONCLUSION & REFLECTION (10:00-10:45)

[MONTAGE: Computer evolution timeline] From the abacus to quantum computers, our computing journey has been remarkably short – just a few human lifetimes. The pace of change continues to accelerate. Gordon Moore observed in 1965 that the number of transistors on a chip was doubling at a steady rate – roughly every two years in the version that became known as Moore's Law – and that prediction held for decades.
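
For an on-screen back-of-the-envelope, here's a short Python sketch that assumes one clean doubling every two years, starting from the Intel 4004's roughly 2,300 transistors in 1971 – an idealized illustration of the trend, not a precise history.

```python
# Idealized Moore's Law: one doubling per two-year step from the Intel 4004.
start_year = 1971
transistors = 2_300          # approximate transistor count of the Intel 4004

for year in range(start_year, 2023, 2):
    transistors *= 2         # 26 doublings between 1971 and 2023

print(f"Predicted transistor count by 2023: {transistors:,}")
# About 154 billion – the same ballpark as today's largest consumer chips
# (Apple's M2 Ultra is reported at roughly 134 billion transistors).
```

The point for the montage isn't the exact number; it's that a steady doubling turns a few thousand into over a hundred billion within a single working lifetime.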

[IMAGE: Comparison of 1970s room-sized computer with modern smartphone] But beyond the technical specifications, computers have fundamentally changed how we live. They've democratized access to information, connected people across continents, and automated tasks that once required hundreds of human hours.

[FOOTAGE: People using computers in daily life] I find it fascinating that at each stage, many experts failed to see the next paradigm shift coming. IBM's Thomas Watson allegedly said in 1943 that there might be a world market for "maybe five computers." Ken Olsen of Digital Equipment claimed in 1977 that "there is no reason for any individual to have a computer in his home." Microsoft's Steve Ballmer laughed at the iPhone in 2007.

The lesson? The future of computing will probably surprise us all. And that's what makes this field so exciting.

VIEWER ENGAGEMENT (10:45-11:00)

[TEXT ON SCREEN: What computing device couldn't you live without?] What piece of technology in your life would be hardest to give up? Your smartphone? Your laptop? Or maybe you're one of those people who still has a working Commodore 64 in your basement?

Let me know in the comments below. And if you've ever tried explaining basic computer concepts to your grandparents only to have them call in "the Geek Squad" anyway, you might enjoy my next video on the generation gap in technology adoption.

Until next time, remember that someone, somewhere is still using Windows 95... and they probably need your help.

[OUTRO ANIMATION: Channel logo]

References:

  1. Bashe, C. J., Johnson, L. R., Palmer, J. H., & Pugh, E. W. (1986). IBM's Early Computers. MIT Press.
  2. Berners-Lee, T. (2000). Weaving the Web. HarperBusiness.
  3. Ceruzzi, P. E. (2003). A History of Modern Computing. MIT Press.
  4. Dyson, G. (2012). Turing's Cathedral. Pantheon Books.
  5. Freeth, T., et al. (2006). Decoding the ancient Greek astronomical calculator known as the Antikythera Mechanism. Nature, 444(7119), 587-591.
  6. Freiberger, P., & Swaine, M. (2000). Fire in the Valley. McGraw-Hill.
  7. Hafner, K., & Lyon, M. (1998). Where Wizards Stay Up Late: The Origins of the Internet. Simon & Schuster.
  8. Haigh, T., Priestley, M., & Rope, C. (2016). ENIAC in Action. MIT Press.
  9. Hiltzik, M. (2000). Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age. HarperCollins.
  10. Hollings, C., Martin, U., & Rice, A. (2018). Ada Lovelace: The Making of a Computer Scientist. Bodleian Library.
  11. Isaacson, W. (2014). The Innovators. Simon & Schuster.
  12. McCullough, B. (2018). How the Internet Happened. Liveright.
  13. Merchant, B. (2017). The One Device: The Secret History of the iPhone. Little, Brown and Company.
  14. Riordan, M., & Hoddeson, L. (1997). Crystal Fire: The Invention of the Transistor and the Birth of the Information Age. W.W. Norton & Company.
  15. Russell, S. (2019). Human Compatible: Artificial Intelligence and the Problem of Control. Viking.
  16. Schein, E. H. (2004). DEC Is Dead, Long Live DEC. Berrett-Koehler Publishers.
  17. Swade, D. (2001). The Difference Engine: Charles Babbage and the Quest to Build the First Computer. Viking.
  18. Tedlow, R. S. (2018). IBM: The Rise and Fall and Reinvention of a Global Icon. MIT Press.
  19. Webb, A. (2019). The Big Nine: How the Tech Titans and Their Thinking Machines Could Warp Humanity. PublicAffairs.
