
From Abacuses to Algorithms: A History of the World’s First Computers

Computers are so deeply embedded in our daily lives today that it’s nearly impossible to envision a world without them. Yet the laptops and smartphones we take for granted represent merely the latest incarnation of computing technology – a story more than two thousand years in the making. By examining 10 of the earliest instruments that shaped concepts of computation and automated calculation, we can trace how primitive counting devices gave way to the advanced electronics that now dominate the planet.

What is a Computer?

To track the origins of computing, we must first define what exactly constitutes a "computer." At its most essential, a computer is a device that accepts input data, processes it according to predefined instructions or algorithms, stores results as needed, and ultimately produces a desired output. The processing capacity and speed, interface methods, memory systems, and even energy sources powering computers have evolved enormously over centuries. However, that fundamental model of receiving, manipulating, and expressing information remains unchanged. Computational tools allowing humans to solve problems more efficiently have been pursued since ancient civilizations first formalized mathematics. Our modern digital electronics merely represent the latest and most spectacular chapter in this perpetual quest to cultivate ever more advanced information processing machines.
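
To make that model concrete, here is a minimal Python sketch of the input–process–store–output loop. The function name and the doubling task are purely illustrative and are not drawn from any historical machine.

```python
# Minimal illustration of the input -> process -> store -> output model.
# The function name and the doubling task are purely illustrative.

def run_computer(inputs, program):
    """Apply a predefined instruction (the 'program') to input data,
    store intermediate results, and return the output."""
    storage = []                 # rudimentary "memory" for results
    for value in inputs:
        result = program(value)  # process according to the instruction
        storage.append(result)   # store as needed
    return storage               # produce the output

if __name__ == "__main__":
    # Input data plus an algorithm (double each value) yields the output.
    print(run_computer([1, 2, 3], lambda x: 2 * x))  # -> [2, 4, 6]
```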

205 BC to 87 BC – The Antikythera Mechanism: Ancient Greek Computer

The Antikythera Mechanism is widely considered the world’s oldest analog computer as well as the earliest known example of a sophisticated scientific calculator. Estimated to have been constructed between 205 BC and 87 BC, this remarkable device consisted of a complex clockwork of bronze gears housed in a wooden case and operated by a hand crank. Found aboard an ancient shipwreck off the Greek island of Antikythera, it flummoxed scholars upon its discovery in 1901, but decades of research since have gradually decoded its mysteries.

Through meticulous reconstruction and testing, scientists now understand the Antikythera Mechanism’s purpose. As the operator rotated its crank, the device modeled the apparent motions of the sun, the moon, and possibly the planets using a network of interlocking gears. By tracking those bodies against the zodiac signs of its built-in astronomical calendar, the mechanism could predict upcoming eclipses, forecast new moons, and even follow the traditional four-year cycle of the ancient Olympic games. This level of sophistication confounded researchers who had expected only simple artifacts from the wreck, and the device’s existence proves the ancient Greeks possessed far greater technical skill and astronomical knowledge than formerly imagined.
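
As a rough illustration of the principle – not a reconstruction of the actual gear trains – the Python sketch below treats the crank as advancing one solar year per turn and uses the Metonic ratio (235 lunar months ≈ 19 solar years) and the 223-month Saros eclipse cycle, both of which the mechanism is believed to encode, to drive hypothetical dial pointers.

```python
from fractions import Fraction

# Schematic of how fixed gear ratios can turn crank rotations into calendar
# pointers. The Metonic ratio and the Saros cycle are real astronomical
# cycles; the dial model itself is purely illustrative.
LUNAR_MONTHS_PER_YEAR = Fraction(235, 19)   # Metonic cycle
SAROS_MONTHS = 223                          # months between similar eclipses

def pointer_positions(crank_years):
    """One full crank turn = one solar year; return dial angles in degrees."""
    months = crank_years * LUNAR_MONTHS_PER_YEAR
    return {
        "calendar_deg": float(crank_years * 360) % 360,
        "moon_deg": float(months * 360) % 360,
        "saros_deg": float(months / SAROS_MONTHS * 360) % 360,
    }

# After 19 crank turns (one Metonic cycle) the lunar pointer returns to zero.
print(pointer_positions(Fraction(19, 1)))
```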

1936 to 1938 – Z1: First Freely Programmable Digital Computer

While the Antikythera Mechanism represents a remarkable early computing accomplishment, Germany produced the first known programmable computer in the modern sense of the word. Civil engineer Konrad Zuse began work on the Z1 in 1936 in his parents’ living room, financing the project privately and completing the machine by 1938.

The Z1 was a binary digital computer, mechanical in operation apart from the electric motor that drove it. Its calculations were carried out by thousands of thin, machine-cut metal plates performing automated logical and arithmetic operations. Users could program it without rewiring the machine by means of a punched tape that directed its floating-point binary arithmetic and basic Boolean logic, operating on 22-bit floating-point words. Regrettably, the original machine was destroyed in a 1943 air raid during World War II. Nevertheless, Konrad Zuse’s Z1 clearly stands out as one of the most groundbreaking milestones on the path towards fully programmable digital computation.
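
The sketch below illustrates the general idea of packing a number into a small binary word as a sign, exponent, and mantissa. The 1/7/14-bit field split is invented for illustration and is not a reconstruction of Zuse’s actual 22-bit format.

```python
# Illustrative binary floating-point packing in the spirit of small-word
# machines like the Z1. Field widths are invented; special cases such as
# zero and rounding overflow are ignored in this sketch.
SIGN_BITS, EXP_BITS, MANT_BITS = 1, 7, 14

def encode(value):
    """Pack a real number into sign, biased exponent, and mantissa fields."""
    sign = 1 if value < 0 else 0
    value = abs(value)
    exponent = 0
    while value >= 2:            # normalize mantissa into [1, 2)
        value /= 2
        exponent += 1
    while 0 < value < 1:
        value *= 2
        exponent -= 1
    mantissa = round((value - 1) * (1 << MANT_BITS)) if value else 0
    return sign, exponent + (1 << (EXP_BITS - 1)), mantissa   # bias exponent

def decode(sign, biased_exp, mantissa):
    exponent = biased_exp - (1 << (EXP_BITS - 1))
    value = (1 + mantissa / (1 << MANT_BITS)) * 2 ** exponent
    return -value if sign else value

print(encode(6.5))              # e.g. (0, 66, 10240)
print(decode(*encode(6.5)))     # -> 6.5
```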

1937 to 1942 – Atanasoff Berry Computer Introduces Electronic Digital Logic

While still relying on mechanical elements for calculation, the German Z1 represented enormous progress towards modern computing concepts. However, the next breakthrough in computing history would move calculation from mechanical parts into pioneering electronic digital circuitry.

Starting in 1937, Iowa State College physics professor John Vincent Atanasoff began designing a specialized computer optimized to solve the systems of linear algebraic equations frequently encountered in physics research, later enlisting graduate student Clifford Berry to help build it. This goal inspired them to attempt the world’s first electronic digital computer – one that would calculate via electronic means rather than mechanical gears or relays.

Atanasoff formulated the overall logic and architecture for the machine, while Berry focused primarily on turning those ideas into electronic hardware. Over the next five years they assembled roughly 300 vacuum tubes, capacitor-based drum memory, and punched card mechanisms into the Atanasoff Berry Computer (ABC). By 1942, the ABC could successfully solve small systems of simultaneous linear equations and was designed to handle as many as 29 equations in 29 unknowns. In 1973, the US federal court ruling in Honeywell v. Sperry Rand recognized the ABC as the first electronic digital computer and credited Atanasoff with originating several principles still central to modern machines.
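
The kind of problem the ABC targeted can be sketched in a few lines of modern code. The Gaussian-elimination routine below captures the mathematical gist of eliminating variables from equation pairs; the ABC, of course, worked with binary values on capacitor drums and punched cards rather than Python lists.

```python
# Gaussian elimination sketch of the kind of problem the ABC targeted:
# solving simultaneous linear equations a * x = b.

def solve(a, b):
    """Solve a * x = b for x, where a is an n x n list of lists."""
    n = len(b)
    m = [row[:] + [rhs] for row, rhs in zip(a, b)]   # augmented matrix
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[pivot] = m[pivot], m[col]          # partial pivoting
        for row in range(col + 1, n):
            factor = m[row][col] / m[col][col]
            m[row] = [x - factor * y for x, y in zip(m[row], m[col])]
    x = [0.0] * n
    for row in range(n - 1, -1, -1):                 # back substitution
        known = sum(m[row][c] * x[c] for c in range(row + 1, n))
        x[row] = (m[row][n] - known) / m[row][row]
    return x

# Example: x + y = 3 and 2x - y = 0  ->  x = 1, y = 2
print(solve([[1, 1], [2, -1]], [3, 0]))
```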

1943 to 1945 – Colossus Series Decrypts Secret Communications in World War II

By the early 1940s, programmable digital computers still remained rudimentary experiments confined to a few laboratories. However, World War II soon made computing technology an urgent wartime priority on both sides of the Atlantic. Coded military communications spurred intense efforts to construct specialized computers dedicated to cryptanalysis – decoding encrypted messages to extract vital intelligence.

Britain’s war-driven codebreaking effort received an enormous boost in computing power starting in 1943. Post Office engineer Tommy Flowers led construction of Colossus – a codebreaking machine incorporating over 1,500 vacuum tubes – for the codebreakers at Bletchley Park’s famous Station X. By early 1944, the completed Colossus Mark 1 gave British intelligence a crucial edge in deciphering Germany’s high-level teleprinter traffic. Nine further Colossus machines followed over the next year, and together the series processed some 63 million characters of intercepted German messages by the war’s end. Historians now widely credit this classified computer series with significantly shortening the war for the Allies.
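
Colossus’s core job can be caricatured as combining an intercepted ciphertext stream with a candidate key-wheel pattern and counting how often a statistical condition holds, with counts well above chance hinting at a correct wheel setting. The sketch below shows that count-and-compare idea with invented bit streams; it is not a model of the actual Lorenz cipher or of Colossus’s circuitry.

```python
# Schematic of the "combine and count" operation at the heart of Colossus-
# style cryptanalysis. The bit streams here are invented for illustration.

def score(cipher_bits, wheel_bits, offset):
    """Count zero bits after XOR-ing the cipher with the wheel at an offset."""
    n = len(wheel_bits)
    return sum(
        1
        for i, c in enumerate(cipher_bits)
        if c ^ wheel_bits[(i + offset) % n] == 0
    )

cipher = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1]
wheel = [1, 0, 1, 1, 0, 0]

# Try every wheel start position and report the best-scoring candidate.
best = max(range(len(wheel)), key=lambda off: score(cipher, wheel, off))
print(best, score(cipher, wheel, best))
```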

1944 – Harvard Mark I Supports Allied Technology Development

As the Colossus computers tipped the codebreaking scales for the British, the Allies were also getting a boost from computing innovations across the Atlantic Ocean. Beginning in 1939, Harvard physicist Howard Aiken collaborated with IBM to design the IBM Automatic Sequence Controlled Calculator. Driven by electromechanical relays rather than electronics, this room-sized complex of gears, shafts, clutches and dials could automatically perform long sequences of arithmetic operations after being programmed via paper tape and switch settings.

Finally completed in 1944, the machine was presented to Harvard University. There it supported the Allied war effort by running ballistics calculations for weapons development plus computations to aid radar and anti-submarine tactics. Its endurance and reliability during intense wartime service cemented the Harvard Mark I’s legacy as one of the most influential early milestone projects paving the path towards practical computers.

1945 – ENIAC: The First General-Purpose Electronic Digital Computer

While special-purpose computers like Colossus and the Harvard Mark I achieved impressive capabilities, the next breakthrough pursued a digital computer designed for more general use. Starting in 1943, physicist John Mauchly and engineer J. Presper Eckert began developing ENIAC at the University of Pennsylvania’s Moore School of Electrical Engineering. Funded by the US Army, whose immediate need was calculating ballistics firing tables, ENIAC was nevertheless designed for unprecedented versatility.

Finally completed at the end of 1945, the room-filling ENIAC contained over 17,000 vacuum tubes along with roughly 7,000 crystal diodes, 1,500 relays, 70,000 resistors, and 10,000 capacitors, organized into twenty accumulators plus dedicated multiplication, division, and square-root units, at a cost of nearly $500,000. Its 40 panels occupied about 1,800 square feet, weighed roughly 30 tons, and required forced-air cooling to carry away the heat of the tubes. Despite that bulk, ENIAC’s electronic digital design let it perform around 5,000 additions per second – roughly 1,000 times faster than the Harvard Mark I.

Reliability challenges caused ENIAC to initially operate only 43% of the time on average before improvements increased uptime. Nevertheless, its electronic architecture successfully demonstrated far swifter, more versatile computation than prior electromechanical approaches. ENIAC stands today as the first general-purpose electronic digital computer, and its influence runs through all that followed.

1949 – Manchester Mark 1 Stores Software as Data

Flexible programmability was another key pursuit in early computing: machines that could take on new tasks without hardware changes. By 1947, researchers at the University of Manchester were exploring the stored program concept, in which software instructions reside in memory alongside data and are executed from there. Prior systems like ENIAC could be rewired or reconfigured, but they did not yet treat machine instructions as changeable data in memory.

Engineers Freddie Williams and Tom Kilburn led the Manchester effort, drawing on ideas about instruction storage and program execution advanced by mathematicians including Alan Turing. On June 21st, 1948 their small prototype – the Small-Scale Experimental Machine, nicknamed the "Baby" – became the world’s first computer to successfully run a program stored in its own memory, an enormous milestone. Refinements continued until its full-scale successor, the Manchester Mark 1, was operational in April 1949. Despite primitive capabilities and a tiny memory, these machines pioneered the stored-program concept now fundamental to all modern software and hardware.
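
The essence of the stored-program idea can be sketched with a tiny fetch-and-execute loop in which instructions sit in the same memory as data. The three-instruction "machine language" below is entirely made up for illustration and bears no relation to the Manchester machine’s actual order code.

```python
# Minimal fetch-decode-execute sketch of the stored-program idea:
# instructions live in the same memory as data, so they can be loaded or
# modified like any other values. The instruction set is invented.

memory = [
    ("LOAD", 5),    # address 0: accumulator <- memory[5]
    ("ADD", 6),     # address 1: accumulator += memory[6]
    ("STORE", 7),   # address 2: memory[7] <- accumulator
    ("HALT", 0),    # address 3: stop
    None,           # address 4: unused
    20, 22, 0,      # addresses 5-7: data words
]

accumulator, pc = 0, 0
while True:
    op, addr = memory[pc]       # fetch the next instruction from memory
    pc += 1
    if op == "LOAD":
        accumulator = memory[addr]
    elif op == "ADD":
        accumulator += memory[addr]
    elif op == "STORE":
        memory[addr] = accumulator
    elif op == "HALT":
        break

print(memory[7])   # -> 42
```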

1950 – The Automatic Computing Engine Anticipates Microcode

Inspired by the stored-program ideas circulating in the mid-1940s, Cambridge mathematician and physicist Maurice Wilkes began work in 1946 on another pioneering computer dubbed the Electronic Delay Storage Automatic Calculator (EDSAC). Operational by 1949, the EDSAC relied on vacuum tube circuits to read software instructions from a mercury delay line memory and perform calculations under their control. Meanwhile, Alan Turing’s own 1945 design for a stored-program machine – the Automatic Computing Engine (ACE) – was realized at Britain’s National Physical Laboratory as the Pilot ACE, which ran its first program in 1950.

The Pilot ACE reflected Turing’s philosophy of keeping the hardware minimal and pushing complexity into the coding: its instruction set exposed very primitive operations that programmers combined into more elaborate routines. That approach anticipated microcode, the technique Maurice Wilkes formalized in 1951, in which a processor’s visible instruction set is implemented as sequences of simpler micro-operations held in a changeable control store rather than permanently wired logic. Storing instruction behaviour as data meant engineers could reprogram a computer’s fundamental operations without rebuilding its hardware, and microcoded central processors went on to dominate computing. The Pilot ACE itself stands as one of the earliest fully functional stored-program computers.
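
The microcode idea itself is easy to sketch: each programmer-visible instruction is defined as a short sequence of simpler micro-operations held in a changeable table. The toy example below invents its own micro-operations and instruction names purely for illustration; it is not drawn from the ACE or any other historical machine.

```python
# Toy illustration of microcode: visible instructions expand into sequences
# of micro-operations stored in a table (the "control store"). All names
# here are invented for illustration.

registers = {"ACC": 0, "TMP": 0}

micro_ops = {
    "load_tmp":   lambda arg: registers.update(TMP=arg),
    "negate_tmp": lambda arg: registers.update(TMP=-registers["TMP"]),
    "add_tmp":    lambda arg: registers.update(ACC=registers["ACC"] + registers["TMP"]),
}

# Changing these sequences redefines what the visible instructions do,
# without touching the micro-operations ("hardware") themselves.
microcode = {
    "ADDI": ["load_tmp", "add_tmp"],                # ACC += immediate
    "SUBI": ["load_tmp", "negate_tmp", "add_tmp"],  # ACC -= immediate
}

def execute(instruction, operand):
    for step in microcode[instruction]:
        micro_ops[step](operand)

execute("ADDI", 10)
execute("SUBI", 4)
print(registers["ACC"])   # -> 6
```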

1951 – UNIVAC I Ushers in Commercialization of Digital Computers

As breakthrough computing systems matured under the guidance of pioneers like Turing, Wilkes, Eckert, and Mauchly, attention turned to making them practical outside the laboratory. The vacuum tube circuits behind ENIAC and its counterparts – while innovative for their era – consumed enormous electricity, demanded extensive cooling, and suffered frequent failures. Turning such machines into dependable products required engineering for reliability and serviceability, not just raw speed.

The Eckert-Mauchly Computer Corporation took up that challenge, designing UNIVAC I under contract to the US Census Bureau beginning in 1947. Still a vacuum tube machine of roughly 5,000 tubes, UNIVAC I paired its circuitry with mercury delay line memory and magnetic tape drives in place of punched cards, reducing size, energy use, and component counts compared with ENIAC-style designs. Upon completion in 1951, it offered faster speeds, better durability, and dramatically lower operating costs than competing systems while still meeting its designers’ promises to the Census Bureau. These economies for the first time made cost-effective commercialization of a general-purpose digital computer financially feasible.

Within its first years of service, UNIVAC I quickly proved its business worth and accelerated digital technology’s movement into the mainstream. In one famous early demonstration, it correctly predicted the winner of the 1952 presidential election for CBS News viewers – cementing the computer’s usefulness in the public eye. Burroughs contracted it for banking purposes, insurance providers bought systems to automate actuarial statistics, and Ford Motor Company purchased a UNIVAC for accounting tasks. Ultimately selling 46 systems priced around $1 million each, the UNIVAC I signaled that digital computers could be a viable business rather than mere scientific curiosities or wartime necessities.

1951 – Ferranti Mark 1 Hits the Commercial Marketplace

While the UNIVAC team was proving that computer technology could thrive beyond laboratory environments, Britain was forging its own path to commercial systems on the other side of the Atlantic. University of Manchester professors Freddie Williams and Tom Kilburn had kickstarted their nation’s computing breakthroughs with the pioneering Williams-Kilburn tube – the cathode-ray-tube memory at the heart of the Manchester Mark 1 prototype. However, bringing any computer fully to market demanded enormous additional work solving reliability issues, cost roadblocks, and productization problems.

Williams and Kilburn therefore partnered with British electronics firm Ferranti, which received a government contract in 1948 to transform the academic project into sales-worthy commercial hardware. Adding company engineers and investing heavily in further development, Ferranti gradually converted the prototypes into a sturdy product. The first Ferranti Mark 1 was delivered to the University of Manchester in February 1951, making it the world’s first commercially available general-purpose electronic computer for businesses and universities.

Priced between $140,000 and $550,000 depending on configuration, roughly nine Ferranti Mark 1 systems sold after the product’s debut – mostly to universities and research establishments. While hardly a blockbuster by today’s standards, these pioneering installations represented enormous progress in turning computer technology into a viable product rather than a bespoke one-off project – not just in Britain but internationally. In an era when digital computation remained unproven, unreliable, and understandably scarce in business, the Ferranti Mark 1 helped solidify computers’ commercial momentum and seed developments that continue flourishing today.

Reflecting on the computing breakthroughs spanning from ancient Greek mechanisms to 1950s electronics, we’ve traced an incremental chain of ideas that not only improved calculation efficiency but also gradually built the foundations of modern computer theory. From programmable logic to stored programs and microcode, each successive wave of innovation leveraged the progress already achieved while pushing boundaries towards more expansive potential.

Computing history represents iterative refinement above all – a stepwise traversal along the perpetual path towards tomorrow’s solutions. Every modern laptop, smartphone, and data center that now outpaces the machines described here merely embodies the latest chapter in this chronicle. An even more wondrous sequel undoubtedly awaits just over the horizon. And with pioneers stretching back centuries having already overcome countless hardships to reach the horizons visible in their own time, imagining the seemingly impossible seems wholly reasonable today.