© 2017 by Fernando Caracena
Introduction
The history of the development of computers, chiefly in terms of their hardware, has been discussed in a previous post on the History of Digital Computers. However, the development of digital computers owes its existence to the great body of thought in mathematics that made their design possible. The future of computing will involve much more mathematics at a still higher level, to the extent that future generations will have to know far more mathematics than we can even imagine just to stay engaged with the progress of digital computing.
The Problem of Motivation
The traditional approach to teaching mathematics has the problem of motivating students. The teaching of mathematics at the lower levels has some serious errors, which hobble potential students of higher mathematics. Think about the school teacher trying to teach kids how to do arithmetic, especially now, using the illogical approach of "Common Core". The reality is that we have in the hand-held calculator a genie who can perform math-magic at our command. Just enter the numbers and the desired operation, press "enter" and voila! The poorest and the best math students alike will sit in the back of the room through dull lectures on a moot subject, finding other ways of entertaining themselves.
There is a lot that is exciting about mathematics, especially now that we have the computer available to do a lot of the grunt work. It was not so easy when I was going to school. Our main tools for doing the number crunching were paper and pencil. The only computing device widely available then was the slide rule. The mark of a real nerd was the slide rule hanging in its holster strapped to his belt.
Algorithms
You could say that the real computational tools we had were not mechanical devices but algorithms. We still had to do all the detailed calculations ourselves, but following algorithms kept the work well organized so that we did not get lost in the details. Algorithms were just part of the tool kit of mathematics, part of the bag of tricks called concepts, which we learned eagerly. In high school, we learned many mathematical concepts, not in math class but on our own, through our own reading and discussions. My high school friends and I spent many happy hours browsing through the stacks of the local book store, called the "Book Nook". There we might pick up a new book by George Gamow, our favorite science author. At the Book Nook, we also picked up our favorite works of fiction, the science fiction stories of Isaac Asimov.
Digital Computers are Math Machines
Computers are basically idiot savants. They are fast because they operate at nearly the speed of light over short distances. Integrated circuits (ICs) engraved on small chips allow designers to pack many circuits into a very small space. Modern digital computers consist of large numbers of ICs coupled together in such a way that their operation parallels arithmetic addition and subtraction on binary numbers. Gottfried Leibniz, way ahead of the computer age, published a study of binary arithmetic in 1679. In this way, curious mathematicians developed the essential conceptual tools for computers long before the machines themselves existed. In 1847, George Boole formulated logic in terms of the algebra of binary numbers, called Boolean algebra. Using this algebra, computer designers were able to incorporate logic into the operation of electronic switching circuitry. Binary logic plus binary arithmetic make digital computers powerful math machines, but one additional element makes them the powerful machines that they are now: the programs that direct their operations make this mathematics very versatile, amenable to higher forms of logic and to the application of algorithms.
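To make the connection between Boole's logic and Leibniz's arithmetic concrete, here is a minimal sketch, in Python, of binary addition built entirely from Boolean operations. The half-adder and full-adder are standard textbook constructions; the code is purely illustrative of what the hardware gates do, not a model of any particular circuit.

```python
def half_adder(a, b):
    """Add two bits; return (sum, carry)."""
    return a ^ b, a & b          # XOR gives the sum bit, AND the carry

def full_adder(a, b, carry_in):
    """Add two bits plus an incoming carry; return (sum, carry_out)."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, c1 | c2

def add_binary(x, y, width=8):
    """Ripple-carry addition of two integers, one bit at a time."""
    result, carry = 0, 0
    for i in range(width):
        a, b = (x >> i) & 1, (y >> i) & 1
        s, carry = full_adder(a, b, carry)
        result |= s << i
    return result

print(add_binary(0b0110, 0b0011))  # 6 + 3 = 9
```

Nothing here but AND, OR, and XOR, yet it adds numbers, which is the whole point: arithmetic emerges from logic.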
Separating Computer Programming from the Wiring
Engineers were the early computer builders. The way they saw it, programming was part of the wiring. As you can imagine, programming by rewiring was a great drag on writing programs. These engineers invented a shortcut to rewiring: switches, through which they could toggle in programs. Next they found a more convenient, automated way to feed instructions into the computer from some form of storage, such as paper tape or punched cards. At that time, there was a macho saying among such computer engineers, "Real programmers toggle in their code"; but in this case, the wimps won. In order to make the process of programming a computer more convenient, the computer engineers were forced to consult with mathematicians.
We owe the basic logical design of modern computer architecture to two mathematical geniuses: Alan Turing and John von Neumann. The abstract concept called a Turing machine provided the theory for the design of the modern digital computer. This design was elaborated into what was called the von Neumann architecture, in which both data and computer code are stored in the computer memory.
In those days, computer memory was a precious resource that had to be used efficiently. The engineers developed a method of storing programs on paper tape or on punched cards. Holes in these media allowed electrical contacts to form in automated card and tape readers, which initiated a series of electrical impulses that were fed into the computer memory.
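A toy sketch can make the von Neumann idea concrete: instructions and data sit side by side in one memory, and a fetch-execute loop walks through them. The three-instruction machine below is invented purely for illustration and corresponds to no historical design.

```python
# Instructions and data share the same memory, per von Neumann.
memory = [
    ("LOAD", 7),     # address 0: load memory[7] into the accumulator
    ("ADD",  8),     # address 1: add memory[8] to the accumulator
    ("HALT", None),  # address 2: stop
    None, None, None, None,
    40,              # address 7: data
    2,               # address 8: data
]

acc = 0              # accumulator register
pc = 0               # program counter
while True:
    op, arg = memory[pc]   # fetch the next instruction from memory
    pc += 1
    if op == "LOAD":
        acc = memory[arg]
    elif op == "ADD":
        acc += memory[arg]
    elif op == "HALT":
        break

print(acc)  # 42
```

Because the program lives in the same memory as the data, a program can in principle read or even rewrite other programs, which is what makes assemblers and compilers possible at all.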
The ABCs of Computer Programming
The early card- and tape-based codes were sets of instructions written in terms of hexadecimal numbers. This base-16 notation is much more compact than binary bits, but is readily convertible into binary notation. The programmers who used hexadecimal instructions were programming in what was called machine language. Those early programmers were not just coding in the abstract, but had their heads in the computer hardware itself; for example, read "The Story of Mel, a Real Programmer". They were translating a program into operations involving basic electronic computer processes. Later this task was automated by a subsequent level of abstraction called assembly code, which was more understandable to humans. Assemblers translated this code into machine instructions.
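Here is a small illustration, in Python, of why hexadecimal serves as a compact shorthand for binary: each hex digit stands for exactly four bits, so the translation is mechanical in both directions.

```python
word = 0b10100111           # an 8-bit pattern

print(f"{word:08b}")        # 10100111  (binary: 8 digits)
print(f"{word:02X}")        # A7        (hexadecimal: 2 digits)

# Going back the other way: parse the hex, regroup into bits.
value = int("A7", 16)
print(f"{value:08b}")       # 10100111 again
```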
Levels of Programming
The ability to program computers to do progressively more has been achieved through levels of abstraction that remove the program designer and coder farther and farther from the details of the computer hardware.
Machine language provides the lowest level of programming, 1GL. The next level up is provided by assembly language, 2GL. The third level, 3GL, is populated with numerous languages such as Fortran, C, C++, BASIC, and Pascal. These languages are usually compiled into machine code, which involves the programmer in directing other computer-related tasks such as compiling and linking code. Programmers have found various ways of making this extra preparation of programs more automated, such as by writing shell scripts and Makefiles. The fourth level of abstraction, 4GL, further simplifies the mechanics of running programs, but it still leaves a lot of drudgery in the process of writing and running programs.
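As a rough sketch of the bookkeeping that shell scripts and Makefiles automate, the following script (written in Python only for consistency with the other examples here) compiles a set of source files and links them into an executable. The file names, the output name, and the "cc" compiler command are placeholders for illustration, not a real project.

```python
import subprocess

sources = ["main.c", "util.c"]   # hypothetical source files

# Compile each source file into an object file...
for src in sources:
    obj = src.replace(".c", ".o")
    subprocess.run(["cc", "-c", src, "-o", obj], check=True)

# ...then link the object files into an executable.
objects = [s.replace(".c", ".o") for s in sources]
subprocess.run(["cc", *objects, "-o", "myprog"], check=True)
```

A real Makefile adds one more refinement this sketch lacks: it rebuilds only the files whose sources have changed.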
Reinventing the Wheel
Programming of the recent past involved a lot of redundancy. How many do-loops and while-wend loops does a programmer have to write? It gets old very fast, and it is like constantly reinventing the wheel; it becomes the digital equivalent of crocheting doilies. We dream of advanced computer systems like the one on board the fictional Starship Enterprise. How many more levels of programming languages do we need to get there from here? The way there is through higher levels of applied mathematics, as for example in the use of matrices. Subroutines handle all the details of matrix mathematics, including all kinds of loop structures, in a way that is invisible to the user.
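The matrix example can be made concrete with Python's NumPy library: the explicit triple loop and the single matrix expression below compute the same product, but in the second form the loops are buried inside the library routine, invisible to the user.

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[5.0, 6.0], [7.0, 8.0]])

# The hand-rolled way: every loop written out by the programmer.
C = np.zeros((2, 2))
for i in range(2):
    for j in range(2):
        for k in range(2):
            C[i, j] += A[i, k] * B[k, j]

# The higher-level way: the loops live inside the library routine.
D = A @ B

print(np.allclose(C, D))  # True
```

One line of matrix algebra replaces three nested loops; that is the kind of climb up the ladder of abstraction that higher mathematics offers the programmer.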