For twenty-eight years, during the nineteenth century, an English schoolteacher,
William Shanks, spent his evenings computing the first 707 digits of the number
π ... and made an error in the 528th place. It took
years - and he made an error. He had used multiplication tables and tables of
logarithms; each table had taken years to generate - and each was filled with
errors. The people who generated the tables, and indeed Shanks himself, were
"computers" in the original sense of the word.
Shanks used the series
π/4 = 4 tan⁻¹(1/5) - tan⁻¹(1/239).
Shanks' error was uncovered in 1944 when D.F. Ferguson discovered that Shanks had omitted
two terms in the series! In fact, it was noticed (even in Shanks' time) that there was a
suspicious shortage of 7s among the 707 digits. After the correction, the sequence of digits
passed all statistical tests for randomness.
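For the curious, here is a minimal sketch (in Python, with scaling and guard digits that are my own choices, not Shanks' hand method) of how Machin's series can be summed with ordinary integer arithmetic:

    # A sketch only: sum Machin's series with big integers, scaled by a power of ten.
    def arctan_inv(x, scale):
        """arctan(1/x) * scale, summed term by term (the Gregory series)."""
        power = scale // x              # (1/x)^1, scaled
        total, k = power, 1             # the k-th term carries sign (-1)^k
        while power:
            power //= x * x             # (1/x)^(2k+1), scaled
            term = power // (2 * k + 1)
            total += -term if k % 2 else term
            k += 1
        return total

    def pi_digits(digits):
        scale = 10 ** (digits + 10)                                   # ten guard digits
        pi = 4 * (4 * arctan_inv(5, scale) - arctan_inv(239, scale))  # Machin's formula
        return pi // 10 ** 10                                         # drop the guard digits

    print(pi_digits(50))    # 31415926535897932384626433... (decimal point omitted)

Shanks' twenty-eight years of evenings now take a fraction of a second.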
Lack of speed and reliability was the bane of "computers" ... and would
be for another century.
In the seventeenth century, mechanical calculators for performing addition/subtraction
(and multiplication/division, with some help from the operator) had been devised by the
German astronomer and mathematician Wilhelm Schickard. The French mathematician
Blaise Pascal later built such a device and, later still, in 1673, the German mathematician
Gottfried Wilhelm Leibniz (who, with Isaac Newton, co-invented the Calculus: see
Newton & Leibniz) built a mechanical
multiplier.
Leibniz was to write of Pascal's device:
"... it facilitates only additions
and subtractions, the difficulty of which is not very great ..."
In the early nineteenth century the eccentric Cambridge mathematician
Charles Babbage
(1791-1871) employed two human computers to generate mathematical tables for
the Astronomical Society, checking the work of one against the other in order to
identify errors.
There's an old mariner's adage that goes something like:
"Never take 2 chronometers; either take 1 or 3."
About 1820, frustrated by the time necessary and the inevitable errors, he conceived of a
"Difference Engine" which would compute the values of functions for equally spaced
values of the independent variable, using the Method of Differences.
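The Method of Differences is easy to illustrate. Here is a little Python sketch (my own, not Babbage's notation): once the leading differences of a polynomial have been set, every further value needs nothing but additions - which is all the Engine's wheels and carries had to do.

    # A sketch of the Method of Differences for tabulating a polynomial.
    def difference_table(f, x0, step, order):
        """Initial value and finite differences of f at x0."""
        values = [f(x0 + i * step) for i in range(order + 1)]
        diffs = []
        while values:
            diffs.append(values[0])
            values = [b - a for a, b in zip(values, values[1:])]
        return diffs                                  # [f(x0), first difference, second difference, ...]

    def tabulate(diffs, count):
        """Generate successive function values using additions only."""
        diffs = list(diffs)
        out = []
        for _ in range(count):
            out.append(diffs[0])
            for i in range(len(diffs) - 1):           # one "turn of the crank"
                diffs[i] += diffs[i + 1]
        return out

    f = lambda x: x * x + x + 41                      # a quadratic: second differences are constant
    print(tabulate(difference_table(f, 0, 1, 2), 6))  # [41, 43, 47, 53, 61, 71]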
[Image: Charles Babbage]
[Image: Part of the "Engine"]
It was a mechanical device replete with wheels, gears and levers and a crank which would
successively generate the function values automatically - without human intervention
except to turn the crank - once the function definition had been set into the machine. Errors
in navigation tables had been the cause of ships running aground, and astronomical observations
required accurate tables, so Babbage's Difference Engine was of great importance
... and the British government funded the project.
By 1833, ten years after the project had begun, there was little to show for their
investment so the government withdrew funding and all work on the Engine ended.
Undeterred, Babbage designed a more ambitious, more versatile "Analytical Engine".
It would consist of three parts:
- The STORE where numbers were stored, or "remembered".
- The MILL where arithmetical operations on numbers taken from the STORE would
be performed.
- The SEQUENCE MECHANISMS which would select the proper numbers from the STORE and
instruct the MILL to perform the proper operations.
It was, indeed, an automatic, general purpose, programmable computer complete
with CPU and input/output and programs and conditional (IF ... THEN) branching and
micro-programming - and it printed its answers. Further, it took its instructions from
punched cards!
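Here is a toy Python sketch of those three parts; the card format and the operations are invented for illustration and are not Babbage's.

    # STORE holds numbers; the MILL does arithmetic; a sequence of cards drives both.
    STORE = [0.0] * 20                       # numbered storage columns

    def mill(op, a, b):
        """The MILL: perform one arithmetical operation."""
        return {'+': a + b, '-': a - b, '*': a * b, '/': a / b}[op]

    def run(cards):
        """Each card: (operation, source1, source2, destination), all STORE addresses."""
        for op, src1, src2, dst in cards:
            STORE[dst] = mill(op, STORE[src1], STORE[src2])

    STORE[1], STORE[2] = 6.0, 7.0
    run([('*', 1, 2, 3),                     # STORE[3] = STORE[1] * STORE[2]
         ('+', 3, 1, 4)])                    # STORE[4] = STORE[3] + STORE[1]
    print(STORE[3], STORE[4])                # 42.0 48.0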
[Image: Punched cards]
Punched cards were an idea Babbage extracted from the textile industry, where Joseph M.
Jacquard, in 1805, used punched cards to control which threads of warp were lifted
above and which below the shuttle. In 1886 a U.S. statistician, Herman Hollerith,
would also use punched cards in statistical and accounting machines, to expedite
the taking of the U.S. census. (It had taken seven years to perform the clerical work
for the 1880 census.) For Babbage, "Operation" cards identified the operation
to be performed and "Variable" cards located the operand in the STORE.
Charles Babbage was a well-known scientific and political figure and held parties ... often!
His guests might include Darwin or Dickens or Longfellow.
Apparently, Babbage would read Tennyson (among other things). When he read the lines
"Every moment dies a man. Every moment one is born." he wrote Tennyson to inform him
that "... this calculation would keep the world's population in perpetual equipoise."
Babbage suggested: "Every moment dies a man. And one and a sixteenth is born."
His guests would come, be entertained ... and ignore
his partially completed Engine.
[Image: Countess of Lovelace]
In 1833, one guest alone understood: the
Princess of Parallelograms, the seventeen-year-old daughter of Lord Byron, destined
to become the Countess of Lovelace, Babbage's Enchantress of Numbers, the world's
first computer programmer, the beautiful and brilliant
Augusta Ada. We owe to her a
detailed account of both the hardware and software for the Engine.
In the 1970s, the U.S. Defense Department contracted for a programming language to permit
all of its computers to "talk" to each other ... a precursor to the Internet. The language
was called Ada ... named in her honour.
Alas, lack of funds prevented the completion of Babbage's Analytical Engine.
On October 18, 1871, Charles Babbage died, disappointed, frustrated, embittered.
During the years 1935 - 1945, the word "computer" took on its modern meaning of a device
which computes rather than a person who computes.
In the early 1930s, in Berlin, a civil engineering student named Konrad Zuse, in typical
student fashion (!), was too lazy to perform the endless calculations necessary for his studies.
He designed a machine based upon the binary number system using telephone relay switches as
on-off devices. Earlier machines had been decimal, incorporating the digits 0 through 9
on toothed wheels. For years, computers would be decimal.
(see Zuse)
It took four switches to add 1+1, in binary. Zuse's machine filled a small room - his parents'
living room! By 1939 he was the world's leading computer designer.
Later, in a TV interview, Zuse said, "I was too lazy ... so I invented the computer."
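Here, in Python rather than relays, is a sketch of the idea: binary addition built out of nothing but on-off logic, a full adder rippling its carry along. (The code models the logic, not Zuse's actual switch count.)

    # A full adder from AND, OR and XOR, then ripple-carry addition of bit lists.
    def full_adder(a, b, carry_in):
        """Add two bits plus a carry; return (sum_bit, carry_out)."""
        s = a ^ b ^ carry_in
        carry_out = (a & b) | (carry_in & (a ^ b))
        return s, carry_out

    def add_binary(x_bits, y_bits):
        """Ripple-carry addition of two equal-length bit lists (least significant bit first)."""
        carry, result = 0, []
        for a, b in zip(x_bits, y_bits):
            s, carry = full_adder(a, b, carry)
            result.append(s)
        result.append(carry)
        return result

    print(add_binary([1, 0, 0], [1, 0, 0]))   # 1 + 1 -> [0, 1, 0, 0], i.e. binary 10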
[Image: Punched tape (movie film)]
Then came World War II and he received unlimited funds from the German military. Yet, to
overcome war-time shortages, his "programs" were on punched tape made from discarded movie
film. By 1941 he had built Babbage's dream machine: an automatic, programmable, general
purpose computer, using binary arithmetic and, unfortunately, electromechanical relays.
They were slow, taking up to five seconds to perform a simple multiplication.
A friend visiting his laboratory suggested using a switch from the new electronics industry:
the vacuum tube. As his successors in computer science would do, to the present day,
Zuse drew up a proposal for government funding: it would be a two-year project and would
result in a computer 1000 times faster than the earlier version.
Two years? Hitler knew the war would be won in less than two years - so the project wasn't
funded. It would be many more years before the world knew of these developments in Germany.
Zuse managed to sneak his latest model, the Z4, out of Germany, accompanied by rocket scientist
Wernher von Braun. He later started a computer company that was bought out by Siemens.
In 1939, the Bell Telephone Laboratories (using a design of Bell mathematician George Stibitz,
who invented floating point arithmetic) had built a binary calculator which they
called a "Complex Computer" (to perform the troublesome arithmetic associated with
complex/imaginary numbers). In a 1940 demonstration, users in New Hampshire sent problems
via teletype to the machine in New York.
When the U.S. entered the war, one of the most distressing problems was the time
necessary to supply firing tables for the latest weaponry. There was a drastic shortage of human computers
to calculate the trajectory of shells. "Firing Tables", incorporating the effects of wind,
temperature, angle of elevation, etc. came from test firings at the Aberdeen Proving
Grounds in Maryland. Hosts of female computers were employed; it took 30 - 40 minutes
with a desk calculator to compute a single trajectory. It would take four years for a single
female computer to compile one Firing Table, incorporating 1800 trajectories!
[Image: Shell trajectories]
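To get a feel for what each of those 1800 trajectories involved, here is a crude Python sketch: step a shell's flight forward in time with a simple drag term. The constants are purely illustrative - nothing here comes from an actual firing table.

    # A toy trajectory: Euler steps with gravity and a crude quadratic drag.
    import math

    def trajectory(v0, angle_deg, drag=0.0001, dt=0.01, g=9.81):
        """Return (range in metres, flight time in seconds) for one shell."""
        vx = v0 * math.cos(math.radians(angle_deg))
        vy = v0 * math.sin(math.radians(angle_deg))
        x = y = t = 0.0
        while y >= 0.0:
            speed = math.hypot(vx, vy)
            vx -= drag * speed * vx * dt              # drag opposes the motion
            vy -= (g + drag * speed * vy) * dt
            x += vx * dt
            y += vy * dt
            t += dt
        return x, t

    for elevation in (15, 30, 45, 60):                # a few rows of a toy "firing table"
        rng, tof = trajectory(800.0, elevation)
        print(f"{elevation:2d} deg: range {rng/1000:6.1f} km, time {tof:5.1f} s")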
Howard H. Aiken of Harvard, inspired by Babbage's ideas, worked with International
Business Machines to build, in 1944, the Harvard-IBM Automatic Sequence-Controlled Calculator
(ASCC, or Harvard Mark I). It was a decimal machine with 73 storage registers
handling 23 decimal digits and performed additions in 1/3 second, multiplications in 6 seconds
and worked 24 hours a day solving urgent military-computation problems ... but it contained
gears and wheels and relay switches and failed to take full advantage of the speed inherent
in vacuum tubes. Four years later, on January 27, 1948, IBM was to demonstrate the
Selective Sequence Electronic Calculator (SSEC) with 13,500 tubes ...
and 21,400 electromechanical relays!
In a 1981 book, R. Moreau (IBM-France) claims the SSEC to be the world's first computer.
Earlier machines, including the ENIAC, he calls universal calculators.
The problem of speed was taken up by John W. Mauchly, a physicist, and
J. Presper Eckert, a young engineer, during the war years from 1942 to 1946.
Both were at the Moore School of Electrical Engineering, University of Pennsylvania. With
army funding, they designed a fully electronic computer with some 18,000 vacuum tubes.
In spite of the unreliability of vacuum tubes (they burned out - rapidly!) and a prediction
that their computer would break down every five seconds (the old reliability problem),
the army went ahead; it was desperate.
The machine was built, filling a 50 x 30 foot room with tens of thousands of electrical
components in addition to the 18,000 tubes, and some half million soldered joints ...
and they called it an Electronic Numerical Integrator And Computer (ENIAC).
They demonstrated ENIAC's awesome computing power (with its panel of flashing lights
specially wired to impress the cameras of the world) by calculating the trajectory of a shell
that would take 30 seconds to reach its target - and they did the calculation in 20 seconds.
Alas, it wasn't finished until 1946, after the war ended.
"Programming" required setting some 6000 switches and incorporating hundreds of cables,
effectively rebuilding the machine for each problem. Unfortunately, ENIAC was required for the
war effort so building the machine according to this early design had to continue unabated.
Before it was put into service,
Eckert and Mauchly realized the need for stored program computers and they left the
University of Pennsylvania to start the first commercial enterprise to build computers:
The Eckert-Mauchly Computer Corporation. They had in mind a stored program
Electronic Discrete Variable Automatic Computer, the EDVAC.
John von Neumann, a Hungarian who came to Princeton in 1930
(and perhaps the most distinguished mathematician of the day), learned of ENIAC/EDVAC,
visited Eckert and Mauchly and wrote a 101 page First Draft of a Report on EDVAC,
laying the theoretical groundwork for how such a machine should be built.
Eckert-Mauchly were not happy! Who really invented the computer? Who really had patent
rights? E-M applied for a patent in 1947. Battles ensued for years between E-M, von Neumann,
the Moore School and, later, patent infringement battles with IBM and CDC and Honeywell and
Sperry Rand, etc., etc. The ENIAC patent was finally awarded in 1964. In 1972, a
judge named Earl Larson, being of sound mind and in possession of some 40,000 documents (including
a copy of von Neumann's infamous Report), ruled that the computer was invented by
(are you ready for this?)
John Vincent Atanasoff ... but that's another story.
However, it would be years before the first stored program Eckert-Mauchly computer would
be available. Desperately short of funds, their corporation was absorbed by Sperry Rand and,
based upon their EDVAC design, the Universal Automatic Computer (UNIVAC) was built.
The first UNIVAC production model was used by the U.S. government for the 1950 census. Grace
Hopper was the premier programmer for Eckert-Mauchly and, subsequently, the Univac
division of Rand.
In July, 1948, the Manchester University Computer (M.U.C.) was unveiled. The innovative
memory, using cathode ray tubes, was designed by a British radar engineer, Freddie Williams,
freed from wartime activities. Although small, it was the first fully electronic stored
program computer. It took 25 minutes to decide whether 2^127 - 1 was a prime
number ... a problem clearly designed to impress the public!?
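Incidentally, 2^127 - 1 is a Mersenne number, and the standard way to settle such a question (today, at least - the machine's own program isn't described here) is the Lucas-Lehmer test. A few lines of Python now answer, in well under a second, what took the M.U.C. 25 minutes:

    # Lucas-Lehmer test: 2**p - 1 is prime iff the last term s is 0, starting from s = 4.
    def lucas_lehmer(p):
        """True if 2**p - 1 is prime (p an odd prime)."""
        m = (1 << p) - 1
        s = 4
        for _ in range(p - 2):
            s = (s * s - 2) % m
        return s == 0

    print(lucas_lehmer(127))   # True: 2**127 - 1 is indeed prime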
The newspapers were delighted, yet there was rampant scepticism as to what a "no purpose"
computer could do. A machine to calculate shell trajectories? Okay. A machine to generate star
maps? Okay. But a machine without a specific purpose?
[Image: Headlines]
Inspired by the EDVAC design (and having attended lectures at the Moore School, obtaining a
copy of von Neumann's Report), a young Cambridge physicist, Maurice Wilkes, directed
the building of a "user friendly" Electronic Delay Storage Automatic Calculator, the EDSAC.
The "delay storage" referred to an acoustic delay line: oscillating
quartz crystals generated pulses in tubes of mercury and the pulses were recycled to
provide memory.
(In place of mercury, Turing suggested gin and tonic because the speed of propagation was
relatively insensitive to temperature changes!)
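The principle is easy to mimic. Here is a Python sketch of a delay line as a recirculating buffer - an analogy only, with no mercury involved:

    # Bits travel down the "tube", emerge, and are immediately fed back in;
    # a stored bit is only accessible at the moment it comes out the far end.
    from collections import deque

    class DelayLine:
        def __init__(self, length):
            self.line = deque([0] * length)       # pulses in transit

        def tick(self):
            """One time step: the emerging pulse is read and recirculated."""
            pulse = self.line.popleft()
            self.line.append(pulse)
            return pulse

        def write(self, bit):
            """Overwrite the pulse that was just fed back in."""
            self.line[-1] = bit

    memory = DelayLine(32)
    memory.write(1)                               # store a single 1 at the tail of the line
    print([memory.tick() for _ in range(32)])     # the 1 re-emerges one full circulation later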
The EDSAC was a fully electronic, general purpose, stored program computer, in operation
by 1949 (before EDVAC, upon which the design was based). It contained some 3000 vacuum tubes,
could perform additions in 1.4 milliseconds and emphasized "programming". Indeed, EDSAC
could access a library of programs called (would-you-believe) subroutines,
including what was thought impossible at the time: a subroutine for numerical integration
which (by calling an "auxiliary" subroutine) could be written without knowledge of the function
to be integrated!
A problem: whenever a tape was read, the subroutine might not be loaded into the same memory
locations, so certain memory addresses had to be changed. (Each instruction involved an operation to be
performed and a single memory address identifying the location of the operand. Later, multiple
address instructions would become popular.) This problem was overcome by preceding each piece of
code with a set of "coordinating orders", making it self-relocatable. Later, the Cambridge team
devised symbolic addresses whereby the programmer used some label and the computer assigned the
memory locations (avoiding the need to change any explicit addresses which might appear in
the code).
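Both tricks are easy to caricature in a few lines of Python; the instruction format below is invented for illustration and is not the real EDSAC order code.

    # First relocation (add a load address to every address field),
    # then symbolic labels that a loader resolves to actual locations.
    def relocate(code, base):
        """code: list of (operation, address) pairs, addresses relative to 0."""
        return [(op, addr + base) for op, addr in code]

    def assemble(code, base):
        """Resolve symbolic labels ('x', 'y', ...) to actual storage addresses."""
        labels = {}
        next_free = base + len(code)                 # place data just after the code
        resolved = []
        for op, addr in code:
            if isinstance(addr, str):
                addr = labels.setdefault(addr, next_free + len(labels))
            resolved.append((op, addr))
        return resolved, labels

    print(relocate([("LOAD", 0), ("ADD", 1)], base=100))   # [('LOAD', 100), ('ADD', 101)]
    routine = [("LOAD", "x"), ("ADD", "y"), ("STORE", "x")]
    print(assemble(routine, base=200))                     # 'x' and 'y' land at 203 and 204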
But Wilkes found few who appreciated the problems in programming ... until he visited the U.S.
in 1950 and met Grace Hopper's group. (On the way he stopped off at the University of Toronto
to give a talk at Kelly Gotlieb's invitation.) During his U.S. trip he also argued with
Howard Aiken (of Harvard-IBM ASCC fame) re the advantages of a binary computer.
(Recall that the ASCC was a decimal machine).
Although EDSAC solved a variety of problems (including a nonlinear differential equation
proposed "in characteristically barbed fashion" by the distunguished statistician
R.A. Fisher ... much to Fisher's surprise), in order to avoid the suspicions associated
with analog machines, EDSAC spent some time computing prime numbers (which no analog
machine could do). It impressed at least one skeptic: the distinguished
number theorist L.J. Mordell (who later received an honorary degree from the University
of Waterloo).
In 1951, Wilkes co-authored the first book on computer programming. Feeling that such a book
would be risky, Addison-Wesley offered no royalties until 1000 copies were sold.
They were, by the following year.
By 1950, MIT had nearly finished the Whirlwind computer which featured, for the first time,
a graphical output terminal, light pen interaction with the operator, data communications
over telephone lines and, later, a magnetic core memory.
MIT was taken by surprise by U of Pennsylvania's ENIAC and switched from building an analog
machine to a digital one. The magnetic core made possible, for the first time, very large
memories ... in spite of von Neumann's claim, in 1955, that a memory over 10,000 words, for
the Univac LARC, was a "waste of the government's money".
In 1954, IBM had completed the Naval Ordnance Research Computer (NORC) which performed
additions in 15 microseconds and multiplications in 31 microseconds. It handled numbers in
the form of "words" consisting of 16 decimal digits: 13 for the number, 1 for the sign, 2
for the exponent. It had eight magnetic tape units with 1200 feet of tape at 510 characters
per inch, feeding information to the central computer at 70,000 characters per second. "Words"
were available, from memory, in 8 microseconds.
W.J. Eckert, head of IBM's Watson Scientific Computing Laboratory, had supervised the
construction of NORC. He said, "... we expect a billion operations between errors ..."
Thomas John Watson (1874 - 1956) was a business executive who, in 1914, became
president of the "Computing-Tabulating-Recording Company" (which grew out of Hollerith's
"Tabulating Machine" company). In 1924, the company was renamed: International Business
Machines. surprise!
In 1953 the UNIVAC was practically the only computer available to a commercial firm. Within
two years IBM would sell more than half the computers in the U.S. and within another two years
the computer industry would be known as IBM and the Seven Dwarfs.
Sperry-Univac was the first dwarf. Some Sperry-Univac people jumped ship and created the
second dwarf: Control Data Corporation. One who jumped ship was Seymour Cray
who built the CDC 1604 - the world's most advanced large computer. Eventually, Cray left CDC
to build his own computers.
The CDC people who had left Univac had designed the Univac 1103.
CDC's headquarters were at 501 Park Ave.
1103 + 501 = 1604
I hate to show my age, but the 1604 was the computer that churned out
the numbers for my PhD thesis. I spent many a night, just me, alone with the 1604 ...
Computers could now perform complex calculations with both speed and reliability.
One of the most forward-looking thinkers of the day thought scientists had missed the point.
To use computers for arithmetic was a waste.
[Image: Alan Turing]
A. M. Turing
had published a paper in 1936: "On Computable Numbers, with an
Application to the Entscheidungsproblem". In it he had defined a "theoretical machine"
(to become known as a "Turing Machine") which, in principle, could compete with a human in
performing cerebral (logical) tasks. During the war, Turing put these ideas to work; he was
engaged in breaking the German codes at the highly secret Bletchley Park.
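The "theoretical machine" itself is startlingly simple: a tape, a read/write head, and a table of rules. Here is a minimal Python sketch; the example rule table is mine, not Turing's.

    # A tiny Turing machine: rules map (state, symbol) -> (write, move, next_state).
    def run_turing(tape, rules, state="start"):
        tape = dict(enumerate(tape))                 # sparse tape; blank cells read as " "
        head = 0
        while state != "halt":
            symbol = tape.get(head, " ")
            write, move, state = rules[(state, symbol)]
            tape[head] = write
            head += 1 if move == "R" else -1
        return "".join(tape[i] for i in sorted(tape))

    rules = {                                        # flip every 1 to 0, halt on a blank
        ("start", "1"): ("0", "R", "start"),
        ("start", "0"): ("0", "R", "start"),
        ("start", " "): (" ", "R", "halt"),
    }
    print(run_turing("1101", rules))                 # "0000 " - every 1 flipped to 0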
(Bill Tutte, a distinguished professor at the U of Waterloo,
worked at Bletchley Park during WWII.)
In particular, Turing was put in charge of breaking the ENIGMA, which encoded messages concerning
daily German troop movements not by a fixed transformation from message to code, but
by continuously altering the characters using rotating toothed wheels. Turing devised an
electromechanical machine, the "Bombe", for deciphering the messages. Later, to attack the still
more complex Lorenz teleprinter cipher, an electronic, special purpose computer, the
COLOSSUS, was designed by the engineer Tommy Flowers and built for Bletchley Park. The first
COLOSSUS was working by the end of 1943, when work on ENIAC had just begun. It was highly
successful, some believing that "COLOSSUS won the war". Secrecy prevented any knowledge of its
capabilities for some thirty years.
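The idea behind the stepping wheels is easy to illustrate. Here is a toy Python sketch of a single rotor that advances after every letter - grossly simplified, since the real ENIGMA used several rotors, a reflector and a plugboard:

    # A fixed substitution would be easy to break; stepping the wheel changes
    # the mapping at every keypress. (The wiring below is the oft-quoted Rotor I.)
    import string

    ROTOR = "EKMFLGDQVZNTOWYHXUSPAIBRCJ"

    def encipher(text, start=0):
        out = []
        for i, ch in enumerate(text):
            offset = (start + i) % 26                          # the wheel has stepped i times
            inp = (string.ascii_uppercase.index(ch) + offset) % 26
            out.append(ROTOR[inp])
        return "".join(out)

    print(encipher("AAAA"))   # EKMF: the same plaintext letter comes out differently each time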
My 1960 Collier's Encyclopedia identifies the 1944 Harvard-IBM Mark I as the first
general purpose automatic digital computer.
Turing was convinced that computers could do far more than carry out a sequence of
instructions, far more than perform mere arithmetical operations: computers could learn!
After the war he designed the ACE computer to simulate human thought. The ACE "Pilot" model was
completed in 1950 and was regarded by many as the world's most advanced computer.
Alas, the ACE was used solely for scientific number crunching. Turing was disgusted with
this and with the bureaucracy ... and joined the University of Manchester where he designed
a machine to work in conjunction with the Manchester University Computer (M.U.C.) to do
"creative word processing", including love letters!
[Image: M.U.C. poem]
In 1950, Turing proposed a test of intelligence: for five minutes you ask questions
of an unknown "device", either machine or human (presumably intelligent). If you are not
70% sure that the device is a machine then you must concede that the device has some
intelligence. Turing predicted that machines would pass such a test of intelligence
by the year 2000.
Mamma mia! That's NOW ... as I type this history! Gotta do some Net surfin'
to see what's happened in this regard!
(See the Loebner Prize)
[Image: Math & Computer building, U of Waterloo]
By 1957 (when the CS dept. at UW begins) there were dozens of suppliers of computing devices,
all designed to perform rapid and accurate calculations. In addition to "automatic
digital computers" there were, at that time, a variety of other machines, not
digital computers,
each of which could handle just one of: bookkeeping, data sampling, radar fire control, file searching, flight
simulation, machine control, telephone switching, navigation, music generation, game playing ...
Alas, Turing never lived to see the myriad uses to which today's fully electronic,
automatic, general purpose, stored program computers
* are put ... including all of the above!
*
It's necessary to stress this description of the beast
since arguments concerning "who invented what?" - and they continue unabated to this day -
might be circumvented by just such elaborate prefixes ... as well as a clear definition of the
word "computer".
In 1954, having been prosecuted for homosexuality, Alan Mathison Turing
committed suicide.
In this bright, new and enlightened millennium, this wouldn't happen. Would it?