A History of Digital Computers

Ancient History

People have been using mechanical calculating devices for millennia; the Antikythera mechanism is one such device. Originally, people must have used their fingers to count. You say that ten fingers are too few for the job? I tell you that with those ten fingers you can count to over 1,000 using binary representation. In that case, the fingers (also called digits) served as elements of the original digital computer. We have ten digits and our standard number system is base ten, so we call the ten symbols of our number system (0, 1, 2, 3, 4, 5, 6, 7, 8, 9) digits.
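The binary claim is easy to check: treating each finger as one bit gives 2^10 = 1,024 distinct patterns, so ten fingers can count from 0 up to 1,023. A minimal sketch (the function name is my own invention):

```python
# Counting on ten fingers in binary: each finger is one bit, so ten
# fingers can represent every value from 0 through 2**10 - 1 = 1023.
def fingers_to_number(fingers):
    """Interpret a list of 10 raised (1) / lowered (0) fingers as a binary number."""
    value = 0
    for bit in fingers:
        value = value * 2 + bit
    return value

# All ten fingers raised is the maximum count:
print(fingers_to_number([1] * 10))  # 1023
print(2**10 - 1)                    # 1023
```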


An abacus. Image released into the public domain by its author, HB; from Wikimedia Commons.

People must soon have realized that anything could represent a number. It did not have to be a finger; why not a pebble or a bead? And so, voilà, they invented the abacus.

The Babbage Difference Engine

One invention of our modern world, the electronic digital computer, has profoundly transformed human life on the planet.


A portrait of Jacquard woven in silk on a Jacquard loom, requiring 24,000 punched cards to create (1839). Public domain, from Wikimedia Commons.

The idea of a computing machine was dreamed up in 1821 by the English mathematician Charles Babbage. He built a mechanical model of such a device, which he called a difference engine. This machine became a prototype for the adding machines that were developed commercially later. Babbage also conceived of programming such a machine with punched cards, an idea he got from the programmable Jacquard loom, developed by Joseph Marie Jacquard, a merchant weaver and inventor.

Here we see the marvelous interplay between the ideas of merchants and academics. Good ideas come from many sectors of society. They may come from people who are too busy and preoccupied with business and market competition to fully develop them. Later, these ideas may be picked up by people who have the leisure and resources to work on things that may not return immediate financial gains. Yet in these refuges from commercial traffic and bottom-line mentality, their work may hatch returns so big that they create whole new dimensions in commerce. Such was the case in the world of computers.

The Manhattan Project

Richard Feynman, the famous theoretical physicist and Nobel Prize winner, worked on the Manhattan Project at Los Alamos, New Mexico.

"As a junior physicist, he was not central to the project. The greater part of his work was administering the computation group of human computers in the theoretical division (one of his students there, John G. Kemeny, later went on to co-design and co-specify the programming language BASIC). Later, with Nicholas Metropolis, he assisted in establishing the system for using IBM punched cards for computation."—Wikipedia

In those days, the digital computer was not widely available for scientific calculations, so the necessary computations were done by people with calculating machines, which owed their existence to Babbage. The individuals involved in such efforts were called "computers". At Los Alamos, the computers were women, each assigned to perform specific arithmetical operations on designated parts of an involved calculation laid out on a worksheet. When finished, they passed the worksheet to another computer, who took the results for further computation on other parts of the worksheet. Feynman organized the scientific computations required by the project into a "flow chart" describing how they were to be processed by the room full of women computers. In this respect, Feynman was a scientific programmer of a multi-processor computer consisting of a room full of women. Again, the ideas of complex processing and organization were derived from the assembly lines of Henry Ford.

Electronic Digital Computers

Computers became more affordable after they began to be developed and used by universities to support government-related research work. John von Neumann, a genius mathematician, is credited with the design that underlies modern digital computers, called the von Neumann architecture, in which both programs and the data they operate on are stored in the same memory. The United States Army Ordnance Corps, Research and Development Command, financed the design and construction of one of the first electronic digital computers, ENIAC, which was built at the University of Pennsylvania for the use of the Ballistic Research Laboratory (BRL) at Aberdeen Proving Ground, Maryland.
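The essence of the von Neumann architecture is that instructions and data share one uniformly addressed memory, so a program can be loaded, inspected, or even modified just like any other data. A toy sketch of the idea (the instruction set here is invented purely for illustration):

```python
# A toy stored-program machine: instructions and data live in the SAME
# memory, addressed uniformly. The four-instruction set is hypothetical.
memory = [
    ("LOAD", 6),   # addr 0: load memory[6] into the accumulator
    ("ADD", 7),    # addr 1: add memory[7] to the accumulator
    ("STORE", 8),  # addr 2: store the accumulator into memory[8]
    ("HALT", 0),   # addr 3: stop
    None, None,    # addr 4-5: unused
    40,            # addr 6: data
    2,             # addr 7: data
    0,             # addr 8: result goes here
]

pc, acc = 0, 0                 # program counter and accumulator
while True:
    op, arg = memory[pc]       # fetch the next instruction from memory
    pc += 1
    if op == "LOAD":
        acc = memory[arg]
    elif op == "ADD":
        acc += memory[arg]
    elif op == "STORE":
        memory[arg] = acc
    elif op == "HALT":
        break

print(memory[8])  # 42
```

Because the program sits in the same memory as its operands, the machine fetches each instruction with the same mechanism it uses to fetch data, which is exactly what distinguished stored-program computers from earlier machines wired or plugged for a single task.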

"Though ENIAC was designed and primarily used to calculate artillery firing tables for the United States Army's Ballistic Research Laboratory, its first programs included a study of the feasibility of the hydrogen bomb." —Wikipedia article

I was drafted into the Army after I got a BS degree in physics, and was assigned to work at BRL as a scientific programmer. By then, computer design had advanced through several generations to the ORDVAC, which I programmed. The BRLESC was just being tried out when I completed my stint in the military. After that, I did not program a mainframe computer again; but later I programmed mini- and microcomputers, which became plentiful and ubiquitous.

