Sunday, May 28, 2006

My second computer: The Bendix G-15

After I wrote my first few programs for Intercom 1000, the other students frequenting the DPL made it very clear that Real Programmers wrote machine code [1, 2]. By this, they meant instructions for the hardware that ran the Intercom interpreters, the Bendix G-15D computer. [3, 4]

The G-15's instruction set architecture was almost the antithesis of Intercom 1000's ISA. It was the reduced instruction set approach taken to an extreme [5]. "2.5 address" instructions. ("The modified type of two-address code is unusual and difficult to classify.") Two-bit operation codes. No index registers. No floating point. I/O in hexadecimal. All the reference material a working programmer needed was condensed onto two sides of a 3" x 6" Reference Card [6].

There were three basic reasons to write machine code:
1) Speed. Peak instruction rate was 1,800 instructions per second (or up to 3,600 fixed point additions), compared to Intercom's peak of 30 IPS. Of course, these peak rates were only achieved under very special circumstances, but a good programmer could obtain a substantial speedup in machine code.
2) Space. An Intercom interpreter occupied about half the G-15's 63,307 bits (just under 8 KB) of program-addressable memory. Programmers wanted all of it.
3) Difficulty. The difference between the results achieved by good programmers and great programmers was dramatic, and mediocre programmers couldn't hack machine code at all. None of this "learn it in four hours" nonsense. Any program you can write, I can improve.

The G-15's primary designer was Prof. Harry Huskey, who had just returned from a sabbatical in England, where he worked on the Pilot ACE with Alan Turing. The G-15 embodies a lot of Turing's ideas. Brian Randell and I found it has some remarkable similarities to the DEUCE computer, another ACE successor.

There were lots of interesting facets to G-15 programming that I will comment on in separate posts. Here I will summarize some of the basic properties of the hardware.

The G-15 was a remarkably physically small computer, perhaps the smallest commercially available electronic digital computer of its time. It was usually described as "refrigerator sized," with a basic system comprising a 3' x 3' x 5' cabinet [7], plus a modified IBM electric typewriter for input/output. It weighed 1,500 pounds and drew 3 KW of power. [8] This small size was a consequence of the simplicity of its logic (implemented with 450 vacuum tubes plus 3,000 germanium diodes [9]) and its use of a small magnetic drum [10] (rather than, e.g., mercury delay lines) for its memory.

In addition to its ISA, a significant contributor to the G-15's hardware simplicity was its use of bit-serial two's-complement arithmetic. Instead of adding numbers a word (28 bits plus sign), or a decimal digit, at a time, it operated on them one bit at a time, combining single bits from two sources with a possible carry bit, immediately writing the result bit to the destination and setting the carry bit for the next "bit time" [11]. Computation speed required a high bit rate, and the G-15 operated with a fast-for-the-time 105 KHz clock. The time units programmers thought in were the word time, 275 microseconds, and the drum cycle time, 29 milliseconds.
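As a rough illustration (a sketch of the general technique, not Bendix's actual logic), serial addition of two 29-bit two's-complement words, least significant bit first as footnote [11] describes, can be written out in Python:

```python
def bit_serial_add(a, b, word_bits=29):
    """Add two word_bits-wide two's-complement values one bit at a time,
    least significant bit first, the way a serial adder does: combine
    one bit from each source with the carry, write the sum bit to the
    destination immediately, and keep the new carry for the next
    "bit time"."""
    carry = 0
    result = 0
    for i in range(word_bits):           # one "bit time" per iteration
        s = ((a >> i) & 1) + ((b >> i) & 1) + carry
        result |= (s & 1) << i           # result bit written at once
        carry = s >> 1                   # carry into the next bit time
    return result                        # carry out of the top bit is dropped

# Two's-complement example: (-3) + 5 = 2 within a 29-bit word.
mask = (1 << 29) - 1
print(bit_serial_add((-3) & mask, 5 & mask))   # prints 2
```

The arithmetic of the timing works out: at 105 KHz, 29 bit times come to roughly 276 microseconds, matching the 275-microsecond word time, and 108 word times come to roughly 30 milliseconds, close to the 29-millisecond drum cycle.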

An unusual feature was that the buffered input/output system operated concurrently with computation. A program could place up to 108 words in the buffer, start I/O, and resume computation, later testing or waiting for ready if it needed to determine completion or to initiate another I/O operation. In addition to typewriter I/O at up to 10 hexadecimal characters [12] per second, the basic system was equipped with a 500 cps photoelectric paper tape reader and a 17 cps paper tape punch.

What we would today call the memory hierarchy consisted of a single one-word accumulator, three double-precision registers (which naturally required two word times to access, and were also used for multiplication and division), a cache of four four-word lines, a main memory of twenty 108-word lines, and paper tape as the backing store.

Optional peripherals included adaptors for magnetic tape drives [13], card readers and punches, faster paper tape readers, graphical plotters, and, perhaps most unusually, a digital differential analyzer, which provided hardware support for numerical solution of differential equations, with 54 integrators and 54 constant multipliers.

About 400 G-15s were eventually manufactured and sold. That was significantly fewer than the number of IBM 650s, but enough to make it one of the most widespread computers of the early 1960s. Due to its broader distribution, some people consider it a stronger claimant than the IBM 610 for the title of "the first commercially viable personal computer."


[1] The term microcode was not yet widely used.
[2] With exceptions for quick-and-dirty programs, and programs that were floating-point intensive.
[3] We always called it the G-15, but there had been an early version that was labeled G-15A.
[4] See Ed Thelen's site for much historical information, particularly his scanned-in Ballistic Research Laboratories Report No. 1115 on Domestic Electronic Digital Computers, March 1961. The Computer History Museum has an earlier BRL report, No. 971, December 1955, with a section on the G-15. The BRL reports are the source for much of the information I give about the G-15, but since the information for the later report was collected from users, rather than manufacturers, it is neither authoritative nor entirely consistent. Where there are discrepancies, I have tended to use the figures that correspond most closely with my memories.
[5] Although the terms RISC and CISC had not yet been coined.
[6] But advanced programmers could and did study the logic diagrams of the hardware to take advantage of unintended features.
[7] Home refrigerators have gotten larger as computers have gotten smaller.
[8] It was air-cooled and did not officially require air conditioning. However, the DPL had an exhaust vent (like those above many kitchen ranges) to suck hot air outside, and also had what was probably the only window air conditioner on the PUC campus.
[9] Compare to the contemporary (and comparably powerful) IBM 650 with 2,000 tubes and 5,000 diodes. And to the 230 million transistors in a recent Pentium D chip.
[10] Possibly reflecting its delay-line heritage, data was read and rewritten on each drum revolution, rather than being stored permanently, as in modern magnetic disks.
[11] Efficient carry propagation required that numbers be stored and read least significant bit first.
[12] The six non-decimal digits were u, v, w, x, y, and z. To this day, I find it easier to recall the underlying bit pattern for, say, x, than for D.
[13] A real improvement over paper tape as backing store. Not dramatically faster to read, but much faster to write, and it could be rewritten, and didn't have to be moved by hand from punch to reader.


Thursday, May 25, 2006

The Ithaca Address

On September 30, 1976, [1] I was asked to give the after-dinner speech at a workshop hosted by Cornell University to discuss the WOODENMAN version of the DoD's requirements for the language that eventually became Ada.

Let Prof. John Williams [2] set the stage:
Just after dinner on the first evening of the workshop, a tall gaunt and bearded man [3] rose quietly and moved toward the front of the hall. He looked tired and worn as though exhausted by his long arduous journey from the night before. As he turned to speak, a hush fell upon the room. And with a soft and solemn voice, he began,

"Four score and seven weeks ago,
ARPA brought forth upon this community a new Specification,
conceived in desperation,
and dedicated to the proposition
that all embedded computer applications are equal.
Now we are engaged in a great verbal war,
testing whether that specification, or any specification so conceived and so dedicated,
can long be endured.
We are met on a great Battlefield of that war.
We have come to dedicate a Proceedings of that Battle,
as a final resting place for those Papers
that here gave their Ideas that that Specification might live.
It is altogether fitting and proper that we should do this.
But, in a larger sense,
we cannot dedicate - we cannot authorize -
we cannot enforce - this Specification.
The brave men, military and civilian,
who funded this Specification,
have authorized it far above our poor power to add or subtract.
The world will little note, nor long remember what we say here,
but it can never forget what they did elsewhere.
It is for us the experts, rather,
to be dedicated here to the unfinished work
which they who fought here have thus far so nobly advanced.
It is rather for us to be here dedicated to the great task remaining before us,
- that from these honored Papers, we take increased devotion to that cause for which they gave the last full measure of devotion
- that we here highly resolve that these Papers shall not have been written in vain
- that this Specification, under DoD, shall have a new birth of reason
- and that programming
of common problems,
by common programmers,
in common languages,
shall not perish from the earth."
Updated 5/30/07: Corrected attribution of scene-setting remarks.

[1] Roy Levin commented: "I have trouble believing the date in the opening sentence."
To which I replied: "I had to keep the 'four score and seven' to make the source instantly recognizable. Woodenman was just a couple of years into the project, so I doubt I was wrong by more than a binary order of magnitude."
To which he replied: "I was referring to September 31."
So I fixed it.
[2] As later quoted by William A. Whitaker, Col. USAF, Ret.
[3] At that time I had had a beard for almost a decade, and it was still very dark. I was often told that I "looked like Abraham Lincoln." Now, I'm more often asked if I'm Amish.


Sunday, May 14, 2006

My first computer program

One evening in the summer of 1959 (probably during July), I was hanging around PUC's DPL soaking up the ambiance and the reflected glory of the programmers.

An older student, who was running an Intercom 1000 program, exclaimed, "This is the slowest-converging series I've ever seen!" He was printing every tenth partial sum, and after more than sixty printouts (600 terms), the sums were still getting bigger, although more and more slowly.

I asked what that series was, and learned that he was trying to find the limit of the sum of the reciprocals of 2i + 1, for i from one to infinity. I confidently assured him that this series was never going to converge, because it was essentially equivalent to the harmonic series, which was well-known to be divergent, but very slowly divergent.

"Well, it can't be divergent, because it's the solution to a problem that Martin Gardner gives in SCIENTIFIC AMERICAN's Mathematical Games."

Naturally, I asked to see the problem, which was to determine the resistance of an infinite, but regular, network of one-Ohm resistors. I became convinced that he was using the wrong formula. I believed that the correct formula involved a continued fraction, rather than an infinite sum. He was unwilling to admit his error, and when I urged him to at least program my formula and try it, he sulkily handed me the Intercom 1000 manual and said, "Program it yourself."

This was my gentle introduction into the art of programming. I retreated to a corner of the Lab, read the manual, and quickly learned enough to write out a simple loop to evaluate successive approximations to my continued fraction. Having no idea how rapidly the fraction might converge, I decided to print out every tenth approximation, and run the program until that stopped changing.

With some trepidation, I typed my program into the computer, and started it running. It immediately began printing the same number over and over again, as fast as the typewriter would go.

I assumed that I had made some horrible blunder to cause it to keep printing the same value, so I stopped it and retreated. However, I could find nothing wrong with my program. So I modified it to print out every iteration, rather than every tenth, and discovered that this continued fraction converged really fast, and had reached the limits of double-precision accuracy well before the tenth iteration. So my first try at my first program was correct, but I wound up debugging it anyhow.

Being correct the first time has been rare enough in my programming experience that I would remember this occasion vividly, even if it hadn't been my first program.

Afterword: The next issue of SCIENTIFIC AMERICAN had Martin Gardner's solution to the problem. By looking at the problem in a slightly different way, he came up with a closed-form solution (one plus the square root of three) that produced the same numerical result as my continued fraction, but which could have been calculated with pencil and paper (even without the aid of a table of square roots) in less time than either of us had spent on our programs.
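The original Intercom programs are long gone, so the following sketch is a hypothetical modern reconstruction, not either of our actual formulas. One continued-fraction recurrence whose limit is the afterword's 1 + √3 comes from adding one section of the one-Ohm ladder at a time, and it converges as fast as the story says; the divergent sum of reciprocals of 2i + 1, by contrast, just keeps creeping upward:

```python
import math

# Assumed recurrence (one plausible continued fraction for an infinite
# resistor ladder; the original formula is lost):  x <- 2 + x / (1 + x)
x = 2.0
for _ in range(10):
    x = 2.0 + x / (1.0 + x)
# The error shrinks by roughly a factor of (1 + x)**2 (about 14) per
# iteration, so double precision is exhausted well before the tenth
# step -- matching the story of the unchanging printout.
print(x)                      # ~2.7320508..., i.e. 1 + sqrt(3)
print(1.0 + math.sqrt(3.0))

# The divergent attempt: partial sums of 1/(2i + 1) grow like ln(i)/2,
# essentially the harmonic series, so they never settle.
s = sum(1.0 / (2 * i + 1) for i in range(1, 601))   # 600 terms
print(s)                      # still growing, already past 1 + sqrt(3)
```

The contrast is the whole point of the anecdote: ten iterations of the fixed-point recurrence pin down twelve digits, while six hundred terms of the series have not even stopped growing.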

A little knowledge is a dangerous thing.
Insight often trumps calculation.



In 1959 there cannot have been very many places in the world where a 16-year-old entering freshman was given free access to an electronic digital computer. Among small liberal arts colleges, Pacific Union College may have been unique in providing such access. I had the great good fortune to be there, for no better reason than that it was the alma mater of my parents.

The Data Processing Laboratory (DPL) was established by the Alumni Research Foundation (ARF) under an arrangement where the ARF provided the equipment and PUC provided the space (in the basement of the library) and the operating budget. [1][2] The initial equipment was a Bendix G-15D General Purpose Digital Computer, acquired under favorable terms from the Bendix Computer Division. During my student years, various additional pieces of equipment were added, including a Bendix CA-2 accessory to interface the G-15 with an IBM card reader/punch (545) and tabulator (402) [4], Ampex magnetic tape drives, a Bendix PA-3 pen plotter, and eventually a second G-15.

As so often happens, the younger generation absorbed the new technology much more avidly than the older. Hence, most of what I learned about programming was not from the faculty, but from fellow students, especially Milton Barber, Curtis Lacy, and J. Mailen Kootsey. [3] And hardware was the province of Takashi Yogi, who not only kept everything running (a non-trivial task in the days of vacuum-tube computers), but designed and built new hardware, such as the interface from the G-15 to the mag tapes.

DPL had an informal arrangement with BCD to send a couple of students down to El Segundo each summer to work as Junior Mathematicians, which I did in 1961 and 1962. Naturally, we were convinced that we knew as much as the professional programmers, were smarter, harder-working, and produced better code. But we learned a lot, despite our lofty attitudes.

The last time I was on the PUC campus, in 2003, Takashi and I sought out the site of the DPL. The original sign was still on the door, and it did not look like anything had taken the places of the DPL and RRL [5]. Nor did the college seem to have recognized in any way that anything important had happened there.

[1] Strong encouragement and leadership came from a small group of faculty members who were themselves alumni, most notably Profs. Ivan Neilsen and Ted Benedict.
[2] As best as I can tell, neither the ARF nor the DPL have survived as recognizable entities.
[3] By my senior year I was teaching the computer programming courses.
[4] The DPL was also where I first learned to wire plugboards for IBM tabulating equipment--a skill I have not needed for decades.
[5] During this same period, Prof. Neilsen also directed the construction, installation, and operation of a 6 MeV linear accelerator, a "little brother" to the Stanford Linear Accelerator (SLAC). Same technology, but just a yard long, rather than two miles. The accelerator itself was in a "cave" under the library parking lot. The entrance to Radiation Research Laboratory was off the same basement corridor as the DPL.


Friday, May 12, 2006

My first computer: Intercom 1000

The first electronic digital computer that I was ever close to was also the first that I programmed and the first that I operated (summer 1959): Intercom 1000. Intercom 1000 was a virtual machine [1], like today's Java Virtual Machine, implemented by an interpreter running on the hardware of the Bendix G-15 General Purpose Digital Computer. [2]

Intercom 1000 was intended as the source language in which the programmer worked, not as the output of a compiler. Its manual ran to 30 uncrowded pages--including instructions for operating various peripherals--and it "could be learned in just four hours."

Intercom 1000's instruction set was similar to many that were popular in the 1960s. It had a single accumulator and 43 single-address instructions. It allowed two levels of subroutine call. More novel for the time, it also had nine index registers. But its Big Feature was floating point arithmetic. Single-precision had five digits of mantissa, with exponents from 10^-38 to 10^38, and was represented internally in a 29-bit word. For double precision, the same instruction set ran on a different interpreter that provided a twelve-digit mantissa. Although internal computation was in binary, I/O was automatically in decimal, and could be concurrent with computation.

Instructions could be typed on the console or stored in memory. Instructions and single-precision numbers occupied one word, double-precision numbers used two. The single-precision interpreter allowed 1,200 words for programs and data, the double-precision version, 1,000. That corresponds to not quite 5 kilobytes, and 4 KB for double-precision.

Intercom 1000 could execute 10 floating point operations per second, and up to 30 of some of the simpler instructions. That's 10 FLOPS, not 10 KiloFLOPS, MegaFLOPS, GigaFLOPS, TeraFLOPS, or PetaFLOPS. Bendix literature bragged that this was "faster than the floating point system of any computer in the G-15 price range" ($50,000 then, equivalent to about $280,000 today).

It might seem that not much interesting could be done with such limited resources, but compared with slide rules and mechanical calculators, it was an enormous advance in speed, precision, capacity, and reliability. PCs (pocket calculators) were still a decade in the future.



[1] The term "virtual machine" was not used back then. "Intercom" was a contraction of "interpretive compiler." Terminology was not yet standardized.

[2] Intercom 1000 was not intentionally designed to be machine-independent (another term not used then), but it basically was.

[3] The Computer History Museum website dates Intercom 1000 to 1955 or 1956, but other evidence suggests that it was released in 1958, and was fairly new when I first encountered it.

[4] There were several later Bendix (and CDC) Intercoms: 500, 510, 512, and 550, all hewing to the same basic design, but with better speed and/or more instructions (e.g., to support alphanumeric I/O).


Wednesday, May 10, 2006

Nine miles in the snow!

As technology and society advance, it seems to be the perennial habit of old-timers to tell young whipper-snappers that they just don't know how tough it was in the old days.

I will not be able to completely avoid such remarks in this blog. In many important respects, computing technology has advanced by at least nine (decimal) orders of magnitude during the period I have been involved with it. This phenomenon is often lumped under the heading of "Moore's Law," but is actually more widespread, including, for example, advances in magnetic disk capacity that have outpaced Moore's Law for the past decade (even though the imminent demise of that technology was widely predicted in the 1970s).

Large quantitative changes in almost anything are accompanied by significant qualitative changes in what is conceivable, possible, reasonable, and usual. Think of all the differences in transportation that resulted from the change from traveling on foot to riding a horse to driving a car to flying in a jet plane. And remember that an airplane outperforms a bicycle or a covered wagon by somewhat less than three orders of magnitude (i.e., less than 15 years of Moore's Law). I will be discussing changes that have been dramatically greater than that.

Truth be told: In many ways it was fun, rather than a hardship, to work with systems small enough that one person could thoroughly understand both the hardware and all the software running on it. One of my recurrent themes will be that we have gone from a situation where we mastered the machines to one where they too often master us.

(For the record, I started school in Honolulu, and never trudged through any snow to school, unless you count my days as a professor at the University of Toronto, where the bulk of my commuting involved the estimable Toronto Transit Commission.)


Tuesday, May 09, 2006


Those who cannot forget the past are condemned to remember it.
--Jane Ace
I have encountered many interesting people, machines, and software systems since I was first introduced to a computer in 1959, and have been involved in some interesting situations. I've been encouraged by friends to write some of these down.

At present I have neither the time nor the ambition to organize these memories into a book, as Severo Ornstein did in Computing in the Middle Ages: A View from the Trenches 1955-1983, or even to write structured memoirs, like Per Brinch Hansen's A Programmer's Story and Lynn Conway's Retrospective--although I admire them all and commend them to your attention. This collection is closer in spirit to Multicians, but I am using Blogger to handle the housekeeping.

I plan to post memories incrementally, in the form of a blog that contains anecdotes and historical context from my past, rather than the kind of day-to-day concerns contained in my contemporary blog, Nothing is as simple as we hope it will be.

There will not necessarily be any chronological order to my posts.

I will try to explain in modern terms any period jargon that I use. Several of my stories will involve the Bendix G-15D computer, and will probably make more technical sense if you understand the rudiments of its architecture.

My intent is to be as accurate as memory allows, but not to attempt to compete with scholarly journals such as IEEE Annals of the History of Computing, either in fact-checking or in documentation ("just the good stuff").

I welcome corrections or amplifications from readers whose knowledge or documentation is more complete than my own--or that supplements it. If you have comments, please email me and/or post comments on particular blog entries. I may go back and update previous entries in the light of new information or the remembrance of something else relevant.

Enjoy! (If you please.)