Adam Kalsey is asking people to write about their early computing experiences, so of course I’ll have to goof off and write about mine instead of packing for our Europe trip ;-). I was thinking of starting a series of notes for future biographers, anyway…

Sometime in 1967, while browsing at a local library, I stumbled upon Elliott Organick’s “A FORTRAN Primer”, and immediately realized this was hot. I promptly bought Organick’s more up-to-date “FORTRAN IV” and proceeded to learn it forward, backward and sideways. As I had no computer available, I typed my programs on long rolls of paper on an old Olivetti Linea typewriter and tried to single-step and debug them by hand. I remember doing factorials with many digits and other number puzzles from Martin Gardner’s column in Scientific American.
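
For anyone wondering what “factorials with many digits” meant in practice: a 16- or 32-bit integer gives out somewhere between 8! and 13!, so the trick was to keep the number as an array of decimal digits and propagate the carries yourself. Here is a minimal sketch of that idea, in modern Python rather than the FORTRAN of the day (a reconstruction of the technique, not the original program):

```python
# Multi-digit factorial without big integers: keep the number as an
# array of decimal digits (least-significant first) and carry by hand.
def factorial_digits(n):
    digits = [1]
    for k in range(2, n + 1):
        carry = 0
        for i, d in enumerate(digits):
            carry, digits[i] = divmod(d * k + carry, 10)
        while carry:
            carry, low = divmod(carry, 10)
            digits.append(low)
    return "".join(str(d) for d in reversed(digits))

print(factorial_digits(30))  # 265252859812191058636308480000000
```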

The next year I casually mentioned the matter to my math teacher, who immediately sent me to the local university’s Engineering School, where they had an IBM 1130 mainframe. This was housed in a large air-conditioned room. The IBM 1131 CPU used magnetic core memory: 8K words of 16 bits each (later expanded to 16KW). The clock frequency was 280 kHz. The CPU also housed a 500KW magnetic disk cartridge drive and a keyboard with a Selectric-type “golf ball” printer. Other peripherals were the IBM 1442 card read-punch, the IBM 1132 line printer, a pair of paper tape read/punch units, and my personal favorite, the IBM 1627 plotter.

I immediately enrolled in keypunch and FORTRAN classes (with a special dispensation, as I wasn’t a student), and began to pester the local staff to cadge computing time. After first getting the factorial calculator to run, I started to write a program for the plotter, inspired by yet another Scientific American article; over several months it evolved into a complex kludge that drew an arbitrary number of (possibly intersecting) ellipsoids in 3D space from any vantage point, with hidden-line removal. Being unaware of existing hidden-line removal algorithms, I tried to solve it by trigonometry, which worked but became extremely slow for the more interesting cases.
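
The brute-force idea is easy to reconstruct in outline: sample points along each ellipsoid’s outline, cast a sight line from the eye to each point, and check whether any ellipsoid crosses that line, which boils down to solving one quadratic per ellipsoid per point. Here is a rough sketch of the visibility test (Python rather than FORTRAN, axis-aligned ellipsoids only, and a reconstruction of the approach, not the original code):

```python
import math

def hidden(point, eye, ellipsoids, eps=1e-6):
    """Return True if the segment eye->point passes through any ellipsoid.

    Each ellipsoid is (center, semi_axes), axis-aligned (a simplification).
    """
    px, py, pz = point
    ex, ey, ez = eye
    dx, dy, dz = px - ex, py - ey, pz - ez
    for (cx, cy, cz), (a, b, c) in ellipsoids:
        # Substitute the ray E + t*D into ((x-cx)/a)^2 + ((y-cy)/b)^2 + ((z-cz)/c)^2 = 1
        ox, oy, oz = ex - cx, ey - cy, ez - cz
        A = (dx / a) ** 2 + (dy / b) ** 2 + (dz / c) ** 2
        B = 2 * (ox * dx / a**2 + oy * dy / b**2 + oz * dz / c**2)
        C = (ox / a) ** 2 + (oy / b) ** 2 + (oz / c) ** 2 - 1
        disc = B * B - 4 * A * C
        if disc <= 0:
            continue  # sight line misses this ellipsoid entirely
        t1 = (-B - math.sqrt(disc)) / (2 * A)
        t2 = (-B + math.sqrt(disc)) / (2 * A)
        # Occluded if the ellipsoid is crossed strictly between eye and point
        if eps < t1 < 1 - eps or eps < t2 < 1 - eps:
            return True
    return False
```

Every sampled point gets tested against every ellipsoid, which is exactly why the more interesting scenes crawled.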

The next year I entered the school officially as an Electrical Engineering student, and was promptly drawn to a free systems analysis course to be offered by IBM. This was a 2-hour-a-day, every-weekday, 9-month course sponsored by the university; 20 students were selected from over 200 applicants, and I placed second. The course was excellent, and the two best students were offered an internship at the university’s main computing center, so I made sure to place first…

CECOM, the computing center, at the time had an even older mainframe: the IBM 1401. The CPU had 4000 bytes of core memory; each byte had 6 BCD data bits, a parity bit, and a “word mark” bit to flag the end of a variable-length field; the clock frequency was about 83 kHz. The only peripherals were a card read-punch and a line printer, and programming was done in Autocoder (assembly) or machine language. It was already obsolete and was soon replaced by an IBM 360/40, itself replaced a few years later by a Burroughs B6700, which remained in use for 13 years. Amazingly, I can’t locate any photo or reference manual of this machine.
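
That word-mark bit deserves a short explanation: a 1401 field had no fixed length; an instruction addressed the field’s low-order character and the machine kept consuming characters leftward until it reached the one whose word mark was set. A toy model of the scheme (Python, purely illustrative; the real machine did this in hardware on BCD characters):

```python
# Toy model of 1401 variable-length fields: memory maps an address to a
# (character, word_mark) pair.  A field is addressed at its low-order
# (rightmost) position and scanned leftward until the word-marked character.
def read_field(memory, addr):
    chars = []
    while True:
        ch, word_mark = memory[addr]
        chars.append(ch)
        if word_mark:        # the word mark terminates the field
            break
        addr -= 1
    return "".join(reversed(chars))

# A five-character field "00123" at addresses 10..14,
# with the word mark on its leftmost character.
memory = {10: ("0", True), 11: ("0", False), 12: ("1", False),
          13: ("2", False), 14: ("3", False)}
print(read_field(memory, 14))   # -> "00123"
```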

The B6700 was huge. The CPU had 800K of semiconductor (static) memory, which was state-of-the-art at the time and had an 800 ns access time, if I recall correctly. It also had a 10MB fixed disk drive for virtual memory and operating system bootstrap; this had one magnetic head per track, with several huge platters revolving on a horizontal axis. There were half a dozen magnetic tape units, removable disk packs (100MB each), and several fast line printers and card readers; later on, about a dozen video terminals were installed. The B6700 had a very interesting architecture, with 51-bit words: 48 data bits, which could be interpreted as 6 characters, plus 3 tag bits that defined the word format. There were different formats for instruction words, address pointers, integers and floats, strings, and stack pointers. The machine was stack-oriented and had no assembly language; the MCP operating system was written in an Algol dialect called ESPOL. As we had full source code for the MCP and for the compilers, I had a merry time – for several years, it turned out – hacking around and learning about operating system and compiler design.
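
To make the tag business concrete: the top 3 bits of every word told the hardware what the other 48 bits were, so operands, descriptors and control words could be told apart at run time. A small sketch of decoding such a word (Python; the tag names in the table are illustrative rather than the exact B6700 assignments):

```python
# Decode a 51-bit B6700-style word: 3 tag bits above 48 data bits.
# The tag-to-meaning table is illustrative; the real B6700 assignments
# differ in detail.
TAG_MEANINGS = {
    0: "single-precision operand",
    2: "double-precision operand",
    5: "descriptor (pointer to an array or string)",
    7: "program code word",
}

def decode_word(word):
    tag = (word >> 48) & 0b111       # top 3 bits select the format
    data = word & ((1 << 48) - 1)    # low 48 bits are the payload
    chars = data.to_bytes(6, "big")  # the same 48 bits seen as 6 characters
    return tag, TAG_MEANINGS.get(tag, "control/other"), data, chars

# A tag-0 word holding the integer 1977:
print(decode_word((0 << 48) | 1977))
```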

In 1977 I acquired an Apple II and left the mainframe world. More in the next chapter…