Harvey Cohn

Born December 27, 1923, New York City; mathematical researcher noted for innovative uses of computers in number theory, particularly in algebraic number theory and modular functions; specific computations involve class numbers, genus, and systems of modular equations (using computer algebra).

Education: BS, mathematics and physics, City College of New York, 1942; MS, applied mathematics, New York University, 1943; PhD, mathematics, Harvard University, 1948.

Professional Experience: Wayne State University (Detroit) 1948-1956; Washington University (St. Louis), 1956-1958, director of Computer Center, 1957-1958; University of Arizona, 1958-1971, head of Mathematics Department, 1958-1967; distinguished professor chair at City College of New York and the Graduate Center (CUNY), 1971; Stanford University, 1953-1954; member, Institute for Advanced Study, Princeton, 1970-1971; lektor, University of Copenhagen, 1976-1977.

Awards: Putnam Prize Fellowship at Harvard, 1946; Townsend Harris Alumni Award, 1972.

In my Townsend Harris high school years in New York, 1936-1939, I found myself honored as a mathematics prodigy but troubled by the idea that the honor might be of dubious value in earning a living. At the College of the City of New York, 1939-1942, my feelings were accentuated. The gospel of the purity of mathematics was echoed dogmatically, yet defensively, by teachers whom I otherwise admired greatly. Their attitude seemed encapsulated in the famous "Mathematician's Apology" of G.H. Hardy, which they recommended that I read. [Hardy, G.H., A Mathematician's Apology, Cambridge Univ. Press, Cambridge, UK, 1940, pp. 139-143.]

This book appalled me as the ultimate in snobbery for its theme that "if real mathematicians do no good they also do no harm." Indeed, Hardy used "real" provocatively to mean "pure," as compared with "trivial," which meant "applied." He seemed to say that a gentleman never works with his hands, and I felt this arrogance expressed the academic workings of the British class system, which Americans were sanctimoniously taught to disdain, particularly since they had not yet formalized their own academic class privileges.

At that time some of the best mathematical brains, nationally, were in the academic engineering departments, and they were effective rivals of the pure mathematicians. At City College, it was necessary to go to the Electrical Engineering Department to find a course in matrices or in applications of complex analysis, and to the Physics Department to learn of the applications of differential equations. This was not unusual; it was their purity, their avoidance of real applications, that gave mathematics departments their sense of class.

I never accepted the folklore of the mainstream mathematician "purists," who viewed non-appreciation of their calling as a form of ennoblement, as though it were some idealistic political cause. On the other hand, I did not believe the mathematical "crowd-pleasers," who said that everything abstract shall eventually be applied. (They proved to be largely correct, but I understood even then that they were making very uninformed guesses, couched in generalities which can only be proved right, never wrong.) I ended up as an applied mathematician in spirit, specializing in number theory, which looked very "applied," since at that time the interest created by a theorem concerning numbers lay in its numerical examples.

The advent of computing still took time. First I had an exciting two-year adventure, 1942-1944, with Richard Courant, which included an MS in applied mathematics at New York University in 1943. Courant had only an embryonic "Applied Mathematics Group" (quickly renamed "Graduate Center of Mathematics"), which was unpretentiously housed in a corner of the Judson Women's Dormitory at Washington Square. It was the forerunner of the present lavish Courant Institute in Weaver Hall. The classrooms were in the main buildings, but the library was housed at the dormitory and furnished mainly with Courant's own books and reprints, one of many aspects of a "zero-budget" operation. He formed a triumvirate with K.O. Friedrichs and J.J. Stoker, but, also because of budget, they had to be supported in part not as mathematicians but as engineering teachers. Also present was Donald A. Flanders, whom I was later to meet at the Argonne National Laboratory. Courant obviously was thinking ahead in terms of his world-renowned institute at Göttingen, which he himself had largely created but had to abandon when the Nazis took power. Not surprisingly, he did have a computing center, consisting of a bank of desk calculators in search of users. To look busy, he employed his wife Nina, and as director he gave a start to the career of Eugene Isaacson, who stayed on to become a more meaningful director of the computing center for the Courant Institute.

The environment seemed mathematically felicitous, particularly in seeing a mathematician of Courant's stature so lacking in pretension of "purity." Courant reminded us continually that Gauss had always tested results numerically with extravagant precision (like 40 decimals), and indeed that Gauss gained fame by tracking asteroids numerically as they disappeared behind the sun and reemerged. This numerical work inspired some of Gauss's deepest research in number theory and function theory. Clearly there had to be a moral here for aspiring mathematicians. Numerical work was also an important aspect of Courant's public relations campaign to obtain government support from the Office of Scientific Research and Development (OSRD). He was not at all modest about the eternal relevance of his famous work with K.O. Friedrichs and H. Lewy on numerical solutions of partial differential equations. [Courant, R., K.O. Friedrichs, and H. Lewy, "Über die partiellen Differenzengleichungen der mathematischen Physik," Math. Annalen, Vol. 100, 1928, pp. 32-74.] He also made the ethnically generous point that the British mathematicians should not be put down as mere "problem solvers," compared with French and German "theorists," since the British had a special instinct for numerical answers, as he saw in such work as that of Rayleigh and Southwell. One of his favorite associates was a very cheerful self-styled "Scotsman," J.K.L. MacDonald of Cooper Union, who carried asymptotics to a higher state of the art. He developed techniques to estimate Bessel functions for "medium-sized" arguments, not being content with the usual asymptotic behavior at infinity. MacDonald's proclivity for numerical tricks seemed to go well with his love of gadgets, which he collected and even built. When he died in a private airplane crash during World War II, I felt he had tried one gadget too many. Looking back, I must acknowledge Courant's prescience about the hold the British would have on computation within the next five years because of their skill with numerical work. I thought of Richard Courant's endeavor as a "shoestring" operation, and such efforts seemed to occur very often in academic computing. What most entrepreneurs, including me, lacked was an "irresistible force" to overcome the "immovable administration."

After service in the Navy (1944-1946) as an electronic (radar) technician's mate, I felt I had acquired an understanding of electronics of the future. Unquestionably the "Captain Eddy" (Navy) training program was the start for many of my generation who became attracted to computing. (The total effect of this program on American computer skills might prove comparable with the Fulbright Program if an accounting were made.)

The automation used even in the earliest tracking radar was like science fiction. Some of my immediate Navy classmates whose names I noticed later in programs of computing meetings include James Butler, Fernando Corbató, and Saul Padwo (and there must have been many others).

Even after I returned to "pure" number theory at Harvard for my PhD (in 1948, as a Putnam Fellow), I must have been influenced greatly by Norbert Wiener's doctrines of cybernetics and the Second Industrial Revolution. [Wiener, N., Cybernetics, John Wiley and Sons, New York, 1948.] In fact, his work was talked about more by laymen than by mathematicians, who tended to look instead to John von Neumann, who was writing in more technical terms on automata. [von Neumann, J., Collected Works, Vol. 5, General and Logical Theory of Automata (1948), Macmillan, New York, 1963, pp. 288-328.] But for whatever motivation, I felt machines had to "think," at least in routine fashion. (Maybe it was negative of me to want machines to think; maybe I did not consider mathematicians to be doing as much creative thinking as they pretended.) At any rate, I was beginning to become obsessive on this matter of "thinking."

From the wonders of naval radar, I looked to fancy electronics. The only machines I knew of in any functional detail beyond desk calculators were "analogue computers" and some of Derrick H. Lehmer's "garage-made" models for number theory, which were used in prime factorization. [Lehmer, D.H., "A Photo-Electric Number Sieve," Amer. Math. Mo., Vol. 40, 1933, pp. 401-406.] All I saw, however, were mathematicians "riding hobby horses." I had heard enough about the electronic ENIAC to be unimpressed by the Harvard Mark I, which gave itself away, with its noise background, as a mere relay machine.

Far from "thinking," Mark I was routinely calculating tables of Bessel functions, or something similar, when I saw it in operation.

Wayne State University, Detroit (1948-1956)

My first academic position was in the Mathematics Department at Wayne (now Wayne State) University in Detroit (1948-1956), which was upgrading its curriculum to a PhD program by recruiting research-oriented faculty, even from as far away as Harvard. In view of the many recalcitrant non-research faculty members in all departments, I soon realized that Wayne was an unfortunate choice for me, except that computing unexpectedly happened to me in the form of two persons unlikely for Wayne. One was Arvid Jacobson, who pursued applied mathematics with literally religious zeal for what he called "industrial community service." His career was extraordinary. He spoke with an accent acquired from a Finnish-speaking community in northern Michigan. He became involved with, but later renounced, Communism after a disastrous imprisonment in Finland. He then returned to Christianity with messianic zeal, always speaking of "spiritual leadership and community service" with a combination of religious and secular meanings. Since he was known to (and cited as a Communist by) Whittaker Chambers, his past came back to haunt him in the McCarthy era, but his many friends (including me) were adamant in vouching for him. His outlook was scarred by his past suffering. He saw his enemies in Detroit as an "Anglo-Saxon clique," and carried this paranoia to the point that it explained his vote for Eisenhower over Stevenson in 1952.

Jacobson worked in the automobile industry before obtaining his PhD at the University of Michigan in his forties. He knew most of the industrial research people in Detroit. Then came his calling. I must say I had to believe in him after I saw his miracles. He got a completely indifferent university whose administration feared progress to accept the largess of Detroit industries which feared mathematics (two characteristic fears). He formed his Industrial Mathematics Society (IMS) somewhat before the (now) better-known SIAM. Arvid soon became ready for computing. He first brought Vannevar Bush's original Differential Analyzer, scrapped by MIT, to Wayne as a preliminary but ineffective start. Ultimately, however, he brought the Burroughs UDEC (Universal Digital Electronic Computer) to Wayne University. This was the first stored-program machine I saw (1953). It had 1,000 ten-digit words of drum storage, did about 30 multiplications per second, and had all of 10 instructions (one per digit). The UDEC was loaded by paper tape with punched holes, sensed mechanically, not optically. The machine was so slow that even the small problems I had in mind were not practical for it, given the normal downtime of vacuum-tube computers, possibly every half hour.

Also, coincident with the UDEC, Wayne University acquired Joe Weizenbaum. His friendship was very valuable to me. From him I learned what a stored program was, and I soon learned what a Turing machine was. Now I felt that I knew the meaning of a "giant brain." Joe Weizenbaum was a psychologist with an amazing fondness for mathematics. He went on to fame at MIT (with ELIZA).

Arvid Jacobson stayed on at Wayne to perform the further miracle of attracting rather prominent persons to serve as director of the computing program (through industrial donations). They served in succession, all leaving dissatisfied.

Among the names I can recall are Harry Huskey (who had been associated with Derrick H. Lehmer at Berkeley), Elbert Little (who was an industrial consultant), and, after I left (1956), Wallace Givens, who had been at the Oak Ridge National Laboratory but did not want to continue living in the South as a matter of political choice. In the long run, sadly, Arvid Jacobson quit in despair and went into private consulting in 1959. He summed up his career as "pushing a great big sponge." The last straw, I am told, was a donnybrook involving what should have been a routine but predictably unpleasant problem: choosing a new machine to replace the IBM-650, which followed the UDEC. (The IBM-650 remains the great performer in this memoir.) At the same time as the UDEC was set up at Wayne University, in 1953, John Carr set up the MIDAC at the University of Michigan and got it operating very quickly. (It was a faster machine by a factor of ten; it was a miniaturized version of the National Bureau of Standards SEAC, and it was also better supported, by the Air Force.) The MIDAC even had an optical tape reader. I had the distinction in 1953 of being the first to run a "real" program (not an exercise) successfully on that machine. The program was in pure mathematics, on cubic units, and I was ecstatic to find that Derrick H. Lehmer (whom I had not yet met) encouraged me to publish it in 1954 in Mathematical Tables and Aids to Computation (MTAC) (Cohn, 1954), the forerunner of today's Mathematics of Computation.

Thus in 1953 I began my career as a professional parasite, using anybody else's machine, which was always better and bigger. This was easier then than now, since most managers of machine installations were easily flattered and very obliging, and bureaucracy was not as intimidating as today (with batches, accounts, and priorities). I used many machines unknown today, such as MIDAC, UDEC, ORDVAC, EDVAC, SEAC, Univac, CDC-3600, and George (the Argonne Laboratory machine). Ironically, for my first 18 years of research using computers, the "home" facilities were never adequate; that changed only in 1971, when I achieved my current position.

I also learned of the potentially seductive appeal of computing. In 1955 I was not alone in wanting to leave Wayne University, nor was I averse to taking a nonacademic job (although my wife, Bernice, who claimed telepathic foresight, insisted I never could!). The three papers I wrote about using computers seemed to have had more weight than my ten papers in pure mathematics. I was interviewed by several computation laboratories, including the Ballistic Research Laboratories (Aberdeen, Md.), Argonne National Laboratory, Burroughs Research Laboratory (Philadelphia), IBM (Columbia), and Oak Ridge National Laboratory, but the MIT interview stands out in my mind. That was the job I really wanted. Philip Morse was a man whose work (including textbooks) at MIT in mathematical physics and operations research (in World War II) was legendary, and I was overwhelmed that he and Jay Forrester wanted to arrange a joint appointment for me with his laboratory and the MIT Mathematics Department. In the interview, he made me aware of the current interest in speeding up Monte Carlo and simulation computations; he felt number theory was important for randomness research.

When the Mathematics Department would not come up with its share, Morse discussed a job, using "soft" money, which was outside grants and would not lead to tenure. This still interested me, but clearly he would not offer it because he knew that I had achieved tenure at Wayne University and was becoming "established" in mathematics (despite my obvious dissatisfaction).

The most active period of my involvement with computers was about to begin. In 1953 I had a small US Army Ordnance grant to do algebraic number theory. It was not large enough to cover computing, so I obtained access to the computing facility at the US Army's Ballistic Research Laboratories at Aberdeen, Maryland. In 1949 I had corresponded with John Giese, one of the mathematicians at Aberdeen, on a problem in aerodynamics using numerical methods I had learned while taking my master's degree with Courant much earlier. Giese's connection with the laboratories seemed a logical lead, and it brought me to his colleague Saul Gorn. Saul had a PhD from Columbia in algebra, and he quickly took an interest in the problem which led me there. It was the discovery of integral solutions in a, b, c to the equation

n(a³ + b³ + c³) + m(a²b + b²c + c²a) + k(ab² + bc² + ca²) + l·abc = 1

with specially selected integral parameters n, m, k, l chosen so as to make the cubic polynomial factorable in the real domain. If the problem looked messy and limited in appeal, the method of solution involved an even messier three-dimensional version of continued fractions. The details later appeared in a paper jointly with Saul Gorn (Cohn and Gorn, 1957).
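To convey the flavor in modern terms, here is a minimal sketch of my own (not the Cohn-Gorn method, which used a messier three-dimensional generalization of continued fractions): a brute-force search for solutions of the cubic equation above. The parameters n = 1, m = k = 0, l = -3 are chosen purely for illustration, because a³ + b³ + c³ - 3abc factors as (a + b + c)(a² + b² + c² - ab - bc - ca).

# A hedged Python sketch; the parameter choice is illustrative only.
def form(n, m, k, l, a, b, c):
    """Evaluate the ternary cubic form of the text at integers (a, b, c)."""
    return (n * (a**3 + b**3 + c**3)
            + m * (a**2 * b + b**2 * c + c**2 * a)
            + k * (a * b**2 + b * c**2 + c * a**2)
            + l * a * b * c)

def search(n, m, k, l, bound):
    """Brute force over |a|, |b|, |c| <= bound; return solutions of form = 1."""
    rng = range(-bound, bound + 1)
    return [(a, b, c) for a in rng for b in rng for c in rng
            if form(n, m, k, l, a, b, c) == 1]

print(search(1, 0, 0, -3, 10))   # finds, e.g., (1, 0, 0) among others

Such a search finds solutions but cannot explain them; the continued-fraction method produced them systematically, which is why the published paper speaks of cubic units.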

It was this unlikely problem which made me appreciate Saul Gorn's vision of computing beyond number crunching.

He was looking to create a machine-independent programming system. His logistics were very simple: since our problem in number theory was approved by the Army, he could obtain the time to work on his "universal code" using our problem as the pretext. It became his cause célèbre as well as mine. In the summer of 1954, he programmed ORDVAC at Aberdeen's Ballistic Research Laboratories to translate this program from universal code to internal code and to run it successfully. These were heady days of self-realization. We were not upset to find that Peter Swinnerton-Dyer had simultaneously done a similar calculation at Cambridge University, England, by a different method, because he had written his program in more direct code, with less automation. His work was never published. Saul had now shown new potentialities for mathematics to evolve its own language.

During that same summer (1954) I worked at the AEC computing facility of the Courant Institute (NYU). This was sponsored by Remington-Rand with personal support of its honorary president, former General Douglas MacArthur, for whom Courant had great praise and with whom he claimed to have had very profitable discussions. (Since Courant was famous for speaking tentatively and mumbling subliminally, we in the laboratory amused ourselves trying to reconstruct the dialogue and imagining the strain on the dignified and redoubtable general.) The installation was built around the Univac and very much influenced by Grace M. Hopper. I now became aware of a larger movement to regard computers as thinking machines and accordingly to regard "thinking" as a concept with different degrees of depth.

Grace Hopper set the moral tone with such maxims as: if you do it once, it is permissible to do it by hand; if you do it twice, it is questionable whether to do it by hand; but if you do it more than twice, it had best be done by machine. Her Univac A-2 compiler, indeed, meant that translation to machine (assembly) code and the assignment of memory would never again have to be done by hand. She even refined the compiler to sense systematic typographical errors and correct them (with the author's permission each time). She was also one of Saul Gorn's spiritual leaders. Some of the speakers at the Courant Institute that summer told of efforts at MIT to produce automated algebra (not yet called MACSYMA). I was delighted to find computing beginning to mean something "cybernetic." This computing group at the Courant Institute was independent of Courant (I saw him only once), but it clearly was part of his master plan. Yet I knew the computer would not be permitted to be the tail that wags the dog. Courant was determined to keep numerical analysis dominant over machine usage. Combinatorial computing was represented by George Dantzig and Ray Fulkerson, who were summer visitors. The only other guest I recall from the summer of 1954 was Wallace Givens. I had first met him at the Oak Ridge National Laboratory in 1953, and I was destined to keep running into him. He had just finished his famous work on the computation of eigenvalues. [Givens, W., "Numerical Computation of the Characteristic Values of a Real Symmetric Matrix," Oak Ridge Nat. Lab. Report ORNL-1574, 1954.]

He had a sobering thought on it later. He said that, aside from its mathematical value, his method had demonstrably saved so much machine time that if he did nothing else, he would still have deserved whatever salary he was paid. This idea was vicariously comforting to me too, and it was also symbolic of the new age, representing the capacity of the machine to amplify man.

In the summer of 1956, preparatory to leaving Wayne University, I worked at the National Bureau of Standards in Washington, D.C. It was my great opportunity to learn what computing people thought, particularly about one another. Having been influenced by both Saul Gorn and Grace Hopper, I felt my interest would have a cutting edge. I was even more fortunate in having the sponsorship of John Todd and Olga Taussky (who shared my interests in algebraic number theory). John Todd was the guru of traditional numerical analysts, who were looking only to make their well-established skills more effective by enlisting the computer. More than that, he was aware of the special groups of computer abstractionists. Although he was distant from them, he did not disparage ideas which he did not share. That was not true of everyone on the NBS staff, however.

The bureau had inherited an active group of famous Works Progress Administration (WPA) table-makers from the 1930s who seemed to set the tone. I remember Irene Stegun, Ida Rhodes, and Henry Antosiewicz particularly. Phrases were bandied about which were largely derogatory to computing machines and too numerous to recount. A small sample: "programming is garbage," "a machine is just a big slide rule," "we don't teach flowcharts," "programmers just don't understand error estimates," or, best of all, "don't trust a subroutine you didn't write." Obviously "computer science" was oxymoronic there. I was disappointed but not surprised to find that Saul Gorn and Grace Hopper were not considered relevant either. The members of John Todd's and Olga Taussky's crew included mathematicians who functioned independently of computing, such as Philip Davis, Everett Dade, and Morris Newman. They tried to be "political centrists" on computing, but they must have felt caught in the middle when I argued, and sometimes harangued, about computer progress.

I was still gratified enormously that John and Olga appreciated me sufficiently to have me represented on both the theorist's and the user's sides of computing in one of John's handbook-type books (Cohn 1962, 1962a). John also wanted me to join his group permanently later on, despite my offbeat attitudes toward the role of computers. I could not accept his offer because of my growing instinctive fear of the nonacademic world. Although computing was no more secure in the academic world, I felt more comfortable with its irrationalities, and the university did have an irrational love-hate relationship with the computer!

The pattern seemed to be nationwide, maybe worldwide. Computer science did not exist the way mathematics did (with a 2,500-year history) and had no obvious home academically. Even more so, the computer scientist, whether in mathematics or engineering, was regarded as a nouveau riche technician whose pay unfortunately came out to be insufferably high compared with the prevailing salary scales.

Washington University, St. Louis (1956-1958)

My next permanent job began September 1956 at Washington University in St. Louis in the Mathematics Department.

The chairman, Holbrook MacNeille, was a capable administrator and a former executive director of the American Mathematical Society. He was intent on helping to prepare the university for computing. The then-current Computer Center consisted of a leased IBM-650 housed in an old shack (formerly a student theater) and run by the Industrial Engineering (IE) Department. This arrangement was based on a promise to run it "efficiently" in terms of producing some university income (which was just about forbidden under the "noncommercial" terms of the IBM academic discount). The IE Department members were known, however, to be using the machine only to produce unrecorded consulting income for themselves. This was doubly bad news because they usually took program decks and manuals away with them on house calls, leaving the laboratory denuded. The provost knew of this, too, and had made those IE Department members promise to resign from the center as soon as a better arrangement could be made. He then asked me if I would take over the Computer Center to make it academically responsible. Although I felt Washington University was a respectable institution, the Computer Center was not then an attractive proposition. (Among other things, the building was like a barn, with an unfinished interior and wooden partitions which made the occupants feel like cows, and there was no washroom.)

Before I could give an answer, a climactic incident in my life occurred in February 1957, the last one linking me to my mentor Saul Gorn. He was now at the Moore School of the University of Pennsylvania, trying to form a Computer and Information Sciences Department, the first of its kind. He recommended me for the position of head of the department; this time the appointment was to be supported by Mathematics. I went to Philadelphia for an interview, which seemed like a triumphal entry, as I knew many members of the Mathematics Department, chiefly Hans Rademacher's number theory group, who had related research interests and would welcome my presence. Clearly I was to be made the offer, but a lingering doubt somehow arose in my mind. Saul, likewise, had doubts about the adequacy of the support. In fact, he pointed out to me that the only support available was a one-year grant from NSF, with no supporting funds promised by the university. I verified that the administration would promise nothing more to me, even as an outside candidate, than to Saul. We were both used to computing centers not having the respect of colleagues in more established disciplines. We naturally wondered if nonacademics such as the administration would respect us as little.

The answer to our fears came a day after I returned to St. Louis, when I received a call from the telephone operator in Philadelphia asking me to accept a collect call from the vice president of the University of Pennsylvania! I sensed the usual disrespect for computing personnel and declined to accept charges until the VP offered not one but three improbable reasons why he had no phone credit. The phone call continued with a discussion of salary, starting lower than my salary at Washington University and working its way up to parity with very little effort on my part. I suppose I disappointed Saul Gorn by declining the offer. He took it himself, mainly to get the program started, and tried to get me to come as a visitor, but I had changed jobs too recently to accept something temporary. He stepped down as head after a year, and I saw him only a few times afterwards for pleasant reminiscences.

The result was that I could now accept the position of director of the Computer Center at Washington University rather honorably, since Brook MacNeille would support me as much as possible and accept me back in Mathematics if I later wanted to quit. (The administration still would not install a minimal $300 washroom!) I served as director from May 1957 to July 1958, but with no illusions: I accepted the fact that the position was for image rather than development. Ominously, the center remained administratively in Engineering.

Things started badly for my directorship. Right off, IBM (which was silent when the Computer Center was mismanaged) now started to enforce its rule against nonacademic income for the university. This led to an ugly incident.

The Bemis Bag Company of St. Louis was starting to work with IBM on computing, and one of their staff formally invited our laboratory to set up a demonstration with a practical problem in linear programming. It was a classic allocation of differently priced paper stocks to various end products subject to supply restrictions. The problem was so classic that we even happened to have the right program deck for the IBM-650. We were already rejoicing over the prospect of outside money for the laboratory when IBM intervened with Bemis, telling them we were not allowed to do it because the problem was "commercial" not "academic." The IBM staff high-handedly brought the Bemis group to their headquarters to do the demonstration without telling us.

Thus ended our first and only outside job.
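For readers curious about the flavor of such a demonstration, in modern terms the "classic allocation" problem is a few lines for a linear-programming library. Everything below (stock names, prices, supplies, demands) is invented for illustration; it is a sketch of the problem class, not of the actual 1957 card deck.

# A hedged sketch of a paper-stock allocation as a linear program.
from scipy.optimize import linprog

# Variables: tons of each stock assigned to each product.
# x = [kraft->bags, kraft->sacks, sulfite->bags, sulfite->sacks]
cost = [30, 30, 45, 45]        # $/ton: kraft cheaper than sulfite (invented)

A_ub = [[1, 1, 0, 0],          # kraft used   <= 100 tons on hand
        [0, 0, 1, 1]]          # sulfite used <=  80 tons on hand
b_ub = [100, 80]

A_eq = [[1, 0, 1, 0],          # bags demand:  70 tons, met exactly
        [0, 1, 0, 1]]          # sacks demand: 90 tons, met exactly
b_eq = [70, 90]

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * 4)
print(res.x, res.fun)          # optimal allocation and minimum cost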

The lack of income was not alleviated even by the meager "legitimate" income from grants; most grants that used the laboratory were tiny and seldom in the sciences, so big money was clearly absent.

The provost and his advisers soon reacted in shock at the failure of income. Everyone was very "forgiving" of me, although they announced that the Computer Center had incurred a loss of $80,000 and would have to be discontinued. (This figure involved "creative bookkeeping" since that was more than twice the budget!) Moreover, they came up with the desperate idea of replacing the IBM-650 Computer Center with some cheaper (short-lived) IBM desk machine with an external paper tape for programs but no internal memory. The machine and center would then go back into obscurity in the Industrial Engineering Department. It seemed I had no say.

At this point, however, the Russians came charging to the rescue like the Red Army Cavalry. They sent up not one but two Sputniks, the second one with a dog. I quickly tried (with no success) to find a relevant scientific program to run, but had to resort to an exercise program to keep the lights flashing for the many visitors.

Nevertheless, our IBM-650 became a prime newspaper, television, radio, and newsreel photo opportunity of St. Louis. One of our programmers, Pat Zwillinger, even brought along Athena, her racing greyhound, to pose standing with her paws on the console of the machine, looking at the flashing lights and eagerly panting for a message from the Russian canine in space. The St. Louis Post-Dispatch failed to use that picture, preferring some serious ones with us humans. There were also laudatory newspaper editorials, which now served to make the center impregnable. Some of my more political friends solicited statements of support from Sen. Stuart Symington of Missouri and Rep. Melvin Price of Illinois, both with influential congressional committee memberships. We also received unsolicited offers from some of the financial angels of Washington University to put up a building to replace our shack (I assume with a washroom). This was dangerous in itself, since financial angels were not supposed to talk directly with faculty. Edward U. Condon, of Bureau of Standards fame, was chairman of Physics, and he insisted on lending his professional weight to the cause of having a scientifically competent laboratory. (His influence was considerably augmented by his public stand against McCarthyism.)

I was not surprised to be asked finally to continue as center director, but I was more than disappointed at the reduced budget. I kept thinking that my idol, Courant, could sell ice to Eskimos, while I could not sell computing even with two Sputniks racing overhead. I now knew that I was not enough of a promoter to deal with the situation. MacNeille did attempt to rescue the situation by trying to get the Computer Center moved administratively into Mathematics, where it would have a larger budgetary base, but the idea was premature, and the Mathematics Department was not impressed. Also, the National Science Foundation could not help much, because the Sputniks had created demands far beyond its budget for that year. I must have appeared unsportsmanlike, but I asked to be relieved of the position of director in April 1958.

I also resigned from Washington University at that time, taking solace in the fact that no one wished me to leave, but that a better position was awaiting me at the University of Arizona in the fall of 1958.

I left for my next job with very fond memories of my colleagues and my staff at Washington University, which also included programmers Alan V. Lemmon, Joe Paull, and David Tinney (all part-time students), and one full-time technical supervisor, Robert Carty. It is to my eternal regret that I did not keep up with my enthusiastic staff, whom I also regarded as cherished friends. Those on my staff were in a sense typical of the coming generation of computing. At the age of 35, I knew only the mathematical stereotypes for the younger generation, so I was still in for an education on what species of man was evolving under the influence of computer science. Fortunately, the more distinctive species of hackers and nerds were not fully evolved at the time. Alan V. Lemmon was an undergraduate mathematical prodigy who came to the Computer Center to inquire about computers reluctantly, and only because one of his professors sent him. He timidly confessed ignorance about machines and programs. I gave him an IBM-650 programming manual to look at. The manuals in those days were written in real English prose, so I felt it was not unfair to ask him to read it through (it was only about 50 pages) and to come back when he understood the program on the last page.

That program was depicted on an IBM card which cleared the memory except for the zero-cell and the accumulators. He came back the next day with several cards, one of which cleared all memory and accumulators, and others which did special tricks, like labeling the memory so that the unused cells could be identified in a memory dump. He now works for the GTE (General Telephone and Electronics) Laboratories in Waltham, Mass. Not all neophytes were that startling, but cases of "genetic computer readiness" in the younger generation are so common today that I can believe in Lamarck's adaptive theory of evolution. Bob Carty was a much different stereotype. He was a totally rigid personality, a strictly observant Catholic, who saw in machines a kind of model of law and order. He had been a Marine and an FBI agent, and even then was an auxiliary sheriff; also, he believed in the rights of gun owners and had his own arsenal, including many types of handcuffs. He was a very serious and effective worker with no special compulsion for polemics or evangelism. Yet he was very social and made a large number of political contacts on our behalf. Bob left when I did, in 1958, to work in a supervisory capacity at the Wright Air Force Computing Center in Dayton, Ohio, but I lost track of him subsequently. I often think of him today when I observe that programming and software systems are very attractive, as exercises in logic, to a variety of rigid religious groups who would feel considerably less comfortable with the humanistic scientific culture which both motivated computing and made it technically possible.

Joe Paull was originally a chemical engineer, and I must note that a remarkable number of chemical engineers switched to computing in the early stages (maybe second only to mathematicians). The flood tide of physicists as machine users with money was yet to come. Patricia Zwillinger had been an honors student of mathematics at Wellesley, and her motivation for continuing in mathematics was largely due to computing. She and her husband had an overriding interest in animals, however, which caused them to move west. David Tinney had little motivation for any career. He was raised in comparative luxury (his father was Calvin Tinney, the humorist). His motivation for that brief period of the Computer Center also came from computing.

It was hard to resist the attraction of consulting. I worked for the IBM SBC (Service Bureau Corporation) in St. Louis for about six months from October 1956 to April 1957 as part of my enthusiastic search for what "real people" do in everyday computing. Surprisingly, it had some very interesting moments. It gave me the opportunity to visit large commercial and industrial computing installations, which only whetted my appetite for "more and more." I learned that most businesses claimed to have an "inventory routine," which did not function, owing to the noncooperation of their accounting departments. (This was not unlike the behavior of pure mathematicians, who feared a similar degradation of their role.) The most delightful moment was an official house call that a group of us from SBC made on Oscar Johnson, a shoe tycoon in St. Louis. He greeted us at the door of an impressive mansion in a private enclave with two sports cars and two Afghan hounds the size of ponies sitting on his driveway. He invited us in for drinks in a dining room with Queen Anne chairs, which he boasted were still safe to sit on.

He had been using the SBC to gather data for his "programmed trading" in the stock market. His method was (typically) the following: he had the SBC gather data daily on each of hundreds of favorite issues and print out, to four-decimal accuracy, the results of computations in four-column format:

NAME OF ISSUE    a/b    b/a    (a/b) × (b/a)

Here a and b were data taken from the financial pages. If I recall, a was balance shares (shares bought minus shares sold) and b was total shares traded, all arranged in order of current yield. He noticed, of course, that for some mysterious reason the last column was close to 1.0000, undoubtedly proving the stability of the market, but seldom precisely 1.0000. For high figures like 1.0002 he would buy and for low figures like 0.9997 he would sell, but (I give him credit) he did use some additional judgment. I suppose it was my duty to tell him politely that he was starting with a random criterion, but I held back because of the rejoinder he would surely be too polite to make: "How many sports cars, Afghan hounds, and Queen Anne chairs do you have?" The problem for the men from SBC was far from mathematical, although I believe they knew enough algebra. The problem was to keep him from getting his own IBM-650 and giving up their service. I believe they succeeded by stressing that his basement, however attractive, would require remodeling with a possible change of decor for the heavier air conditioning.
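The arithmetic behind his "criterion" is easy to reproduce. In exact arithmetic, (a/b) × (b/a) is identically 1; but when each quotient is first printed to four decimals and then multiplied, the product wobbles around 1.0000 with round-off noise, which is exactly what he was trading on. The share counts below are invented for illustration:

# Round-off, not the market: quotients printed to four decimals
# no longer multiply back to exactly 1. (Invented sample data.)
for a, b in [(12345, 67890), (31416, 27183), (404, 11)]:
    q1 = round(a / b, 4)          # a/b as printed, four decimals
    q2 = round(b / a, 4)          # b/a as printed, four decimals
    print(a, b, q1, q2, round(q1 * q2, 4))   # last column: near 1.0000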

One other SBC problem, which stands out in my mind for its pioneering value, was an on-line use of the IBM-650 for scheduling the ecological use of water power for generating electricity. The idea was to use power during the dead hours of the night to pump water uphill into reservoirs, so as to ensure availability during the heavy usage of the day. This very successful program was first written in flowchart form by an enthusiast in the power company named Estil Mabuse. The SBC easily wrote a program for him, facilitated by his clear thinking. He apparently just learned about programming ad hoc. He told me later that he put in so much overtime playing with the program that it threatened his marriage.

I also acquired a more professional consulting arrangement with the AEC Laboratory at Argonne, Illinois. It lasted from 1956 to 1969 and it permitted me to see the evolution of computing at a very dedicated and active organization. The original director of the mathematics division who initiated the arrangement was Donald A. Flanders. Later directors were William Miller and Wallace Givens (again), who symbolized the evolving discipline. "Moll" Flanders was a member of Courant's original crew before going to the Oak Ridge National Laboratory. He had an old-fashioned gentle character, which seemed to make him find automation menacing.

He lived in the world of hand computation. We used George, which happened to be Argonne's proprietary (binary) machine. The normal usage was to enter data in decimal form (called "binary coded decimal"), which the machine converted to binary for internal purposes. But Moll converted input and output from decimal to binary and back by hand, using tables, obviously suspicious of an untraditional operation. Nevertheless, he gave the laboratory the high-level mathematical character which served as an asset later on. He had his own problems of depression and erratic behavior, which endangered his clearance from time to time. The fact that his brother, Sen. Ralph Flanders of Vermont, was a leading and outspoken opponent of McCarthyism made the concerns of security look somewhat political. Yet his personal problems were real, culminating in suicide in 1956.

Bill Miller, his successor, was a computational nuclear physicist from Purdue. He was a supreme administrator, with a healthy agenda for all kinds of growth, service, and expansion. His professional input was important, given the emerging role of physics in computing, but there were many users in physics who were lacking in "cybernetic" instincts and overly endowed with money for machine time. Several times the machines were tied up overnight by physicists who looked for errors by running their programs in "single-cycles," one instruction at a time, or, otherwise expressed, at less than one-millionth of the running speed. Bill ultimately left in 1962 to become a vice president of Stanford University. I felt he should eventually have become president of Stanford University, but he became the target of student anger over the Vietnam War. This was an especially cruel blow for him because he was honestly sympathetic and liberal-minded. He did become president of the Stanford Research Institute.

Wallace Givens took over as the next director of the mathematics division. He now had a full-time appointment at Northwestern University in Evanston, Illinois, but he had a flexible leave arrangement which lasted indefinitely. His work in numerical analysis and his knowledge of hardware made him the director most in tune with the functioning of the mathematics division of the laboratory.

The bread-and-butter work was with other science divisions at the Argonne National Laboratory, but the mathematical program was not restricted to servicing. There was a large number of inspiring visitors in pure and applied mathematics, sometimes only distantly related to computing, but all computer-progressive. My work in number theory led to at least one research paper a year, always using the computers. All in all, I was able to enjoy an environment with a broad, healthy vision of computing to offset the usual narrow academic vision. Some of my closer associates were James Butler, Joe Cook, Burt Garbow, and Bob Buchal. I knew them less well than my crew at Washington University, but I recall things that again are somewhat characteristic of the personnel at computing laboratories. Jim Butler and Burt Garbow had not gotten their BS degrees, but were "redeemed" through their ability at computing. Joe Cook, on the other hand, was a very sophisticated mathematical physicist with a PhD from the University of Chicago under Irving Segal. He had an aversion toward academic stereotypes, which prevented him from obtaining a university position for which he would have been uniquely qualified and, I believe, in great demand.

All of these associates (and others at the Argonne National Laboratory) were involved in imaginative research projects going beyond what usually arises in a purely academic context. The most imaginative work I did there, however, would be trivial for a PC today, namely the representation of the three-dimensional boundaries of a four-dimensional object in two-dimensional cross sections, all printed with character pixels (like '*') on an unsophisticated line-printer (Cohn 1965). I only wish that the university computing centers I knew personally were that well supported and could deal in ideas as visionary as the Argonne National Laboratory.
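As a measure of how trivial that once-imaginative job has become, here is a toy of the same general kind (not the Hilbert fundamental-domain computation of Cohn 1965): fix two of the four coordinates on the unit 3-sphere x² + y² + z² + w² = 1 and print the resulting two-dimensional cross section in character pixels, line-printer style.

# Character-pixel cross section of a four-dimensional object's boundary.
# Slicing the 3-sphere at fixed z and w leaves the circle x^2 + y^2 = r^2.
def slice_at(z, w, width=31, height=15, thickness=0.12):
    r2 = 1.0 - z * z - w * w               # squared radius of slice circle
    for i in range(height):
        y = 1.0 - 2.0 * i / (height - 1)
        row = ""
        for j in range(width):
            x = -1.0 + 2.0 * j / (width - 1)
            near = r2 > 0 and abs(x * x + y * y - r2) < thickness
            row += "*" if near else " "
        print(row)

slice_at(z=0.3, w=0.4)                     # one planar section of the 3-sphere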

My impatience was undoubtedly reflected in a talk I gave at a meeting of the American Mathematical Society in June 1963 on computing. The invitation came from John Todd, who chaired the meeting. From sentiments I had expressed at the NBS in 1956, he knew I looked upon computing as more than the sum of individual tricks. Instead of simply boasting of the wonders, I asked for a more sober distribution of effort toward new areas which could stimulate development. The title, "Purposeful and Unpurposeful Computing," conveyed the message (Cohn 1963). This was my last public venture at influencing computing. From that point on, it would all be private, through my own research or through my influence as a mathematician.

Later Positions (after 1958)

By the time I took my next job, as head of mathematics at the University of Arizona, Tucson, in August 1958, I knew I was not a good promoter. My moment was past. I could serve the cause of computing only as an outsider doing mathematics and building up a compatible PhD department. Happily, I did not appreciate the main difficulty immediately, namely the low salaries. I tended to discount this because the Southwest was so beautiful (and cheap) as a place to live. I did not count on the major difficulties in recruiting created by the egos of most professional mathematicians, expressed in the appeal of the "established" departments with which I had to compete.

Nevertheless, the creation of the new Mathematics Department in Tucson, Arizona, was my only great administrative glory. Of course, one factor was working in my favor, which would not have been present in computing. This was the major university in Arizona and, whether the administration liked it or not, they needed a Mathematics Department. It soon became the largest department in the university.

The location was no hindrance in getting money. Our senator, Carl Hayden, was chairman of the Senate Appropriations Committee. With or without his influence (one never knows), we obtained many departmental grants from the government.

Also, Arizona was, fortunately, notoriously behind in its social thinking: a department head was a boss, not a chairman of the board, so he could act decisively when necessary. Problems with the administration arose only when computing entered, because it was an innovative budget concept and involved equipment.

I did like the idea that the West was known for healthy, independent thinking, but unfortunately this meant I had to contend with skeptical administrators who did not think anything called "Mathematics" could be trusted to do anything useful; in fact to them "Applied Mathematics" was a type of engineering (if it existed at all).

Although I had the largest departmental budget in the university, I could not get permission from the administration for even the smallest computer. We did, however, get a free Teletype connection to the GE mainframe (running BASIC) in Phoenix in 1965.

It was installed without university funds; only the phone bill was charged. The administration would not let the Mathematics Department pay the phone bill, but Southwestern Bell assumed the money had to be there, so they let us run up some $3,000 over two years before the service was terminated. Computing was officially a small activity of the Engineering College. The engineering dean was very mathematically oriented and had made no objection to the participation of the Mathematics Department in computing. The reluctance came from my own dean, of Liberal Arts, who possibly also thought Mathematics was too big a department for his comfort. My attempts to play up the future destiny of computers were stagnant for a while, but suddenly they became counterproductive. The engineering dean was replaced in 1965 by a fortunately short-tenured one who had experience only in physics laboratory supervision. He got the novel idea of countering the engineers' (and his own) fear of mathematics by removing the source of fear.

This involved a catalogue proposal to reduce the Engineering mathematics requirements to advanced-placement calculus (in high school), supplemented in college by a short calculus course taught not by the Mathematics Department but by engineers in the Computing Center. My skepticism, of course, was predictable.

To show that mathematics courses were superfluous, the engineering dean gave a demonstration to an "impartial committee" appointed by the Faculty Senate, purporting to show how calculus problems could be "solved" by computer without mathematics courses. I had the privilege of observing with the committee. The demonstration was a procedure more reminiscent of convention centers than universities.

It included signs in large letters on tripods explaining, even to the most ignorant, exactly what problem was being solved at each moment, and even signs announcing when each problem had been solved. The only things missing were the models in miniskirts carrying the signs. The main operation was a charade of students looking on a shelf for the appropriate deck of IBM cards for each problem and loading them into, you guessed it, the IBM-650.

In today's world of terminals, interactive demonstrations are quite legitimate and very much in vogue, but they are scarcely advertised as freeing engineers from the study of college mathematics.

I politely sat through this sideshow, trying to look dignified and interrupting only to remark that obviously somebody had to do the mathematics for the programs. It was so silly to have to say this that I could scarcely refrain from laughing, but this select committee voted that a "new method had just been unveiled which will make the University of Arizona famous." The few votes favorable to mathematics came from such unlikely sources as the deans of agricultural engineering and architecture, not from the heavy mathematics users, and surely not from the non-science departments. Of course I had to react. I quickly gathered data from colleagues who were on national engineering evaluation committees, and I sent word privately to the university president that we might become famous by having the Engineering College disaccredited. The president (who was experienced enough to consider the source of his advice as well as the rhetoric) assured me it was never necessary for me to have worried. He soon made a statement to the faculty senate that "an engineer has to have a lot of mathematics to do his arithmetic," and he summarily canceled the calculus-without-mathematics notion before its popularity could swell to even more gigantic proportions. This did not improve my political popularity with my bewildered colleagues. After I stepped down as head of the department (in 1967) and was no longer a public figure, the Mathematics Department was quietly permitted to start what later became a decent computation laboratory in mathematics. In fact, a university computing laboratory was simultaneously created in its own building.

Over and over again, I saw that the fruits of the innovators were enjoyed, unappreciatively, by the succeeding generations. The dream is not for the dreamer.

I spent a delightful summer vacation in 1963 teaching at John Green's Institute for Numerical Analysts at UCLA. Considering that the University of California at Los Angeles was central to the development of computers (SWAC) as well as curricula in numerical analysis, I expected computing to be treated better than at the National Bureau of Standards. Again, however, the emphasis was on using computers to make numerical analysts look better, not to inspire any new attitudes in mathematics. My only cybernetic soulmate turned out to be Charles Coleman, whom I had briefly met at the National Bureau of Standards in 1956. He had also become so engrossed in computing that he did not complete his bachelor's degree at the University of Virginia. He worked later for IBM at Yorktown Heights. I did not engage in polemics at UCLA, as I had experience with the intransigence of numerical analysts, but I thought to myself that computing might have been better respected by American academics if it were not so "American" looking. To be a great scientist even in 1963, it was not necessary to have an "accent," but it helped.

I came to my present chair appointment at my alma mater City College, now part of CUNY, in 1971. My main purpose is to supervise doctoral students and to teach computing as numerical methods in analysis for undergraduates and as number theory or cryptography for graduates, including computer scientists. In any case, I am now on the sidelines enjoying the achievements of others, with more time to write papers using computers.

I am involved with many computing laboratories, owing to the multiple bureaucratic structure enjoyed by the City University, but when "bad things" happen to the laboratories, they are not my problem. I no longer have any input into sources of computer power and destiny. My encounters with computing still continue, but in increasingly satisfactory form from the scientific viewpoint. Machines are faster, and programming aids (and even programming assistants) are available. The NSF (National Science Foundation) also helps to make computing relatively easy for me now.

Epilog

Looking back, probably I am pretending to have had a mutual relationship with computing like that of James Boswell to Samuel Johnson. In fact it may have been more like that of the rooster to the sunrise. I am not modest, however, about having pioneered the intensive, innovative use of computers in a large number of classical mathematical problems. My presentation, therefore, is that of a mathematician who uses computing rather than one who creates hardware, software, or systems. There are many such, but I also claim to have been an early "true believer" who felt that computers are more than devices to aid mathematicians; rather, they are devices which must change the nature of mathematics. To use a classical analogy, when rational mechanics was introduced in the seventeenth century to an environment of scientists using geometry, it was not just another application of geometry but the entrance cue for calculus. In retrospect it may be said that the introduction of computers has been an entrance cue for many fields which seem to vary with fashion, like information theory, complexity theory, knowledge theory, and so on, and others which surely shall follow. I am not advocating any one of these viewpoints. To the contrary, I make no choice, because my involvement was a search for a future which I did not fully understand, nor do I now. I must count on others to find it. [An enlarged version of Cohn 1994.]

BIBLIOGRAPHY

Biographical

Cohn, H., "Reminiscences of a True Believer," Ann. Hist. Comp., Vol. 16, No. 1, 1994, pp. 71-76.

Significant Publications

Cohn, H., "Numerical Study of Signature Rank of Cubic Cyclotomic Units," Math. Tables Aids to Comput., Vol. 8, 1954, pp. 186-188.

Cohn, H., "Use and Limitations of Computers," in John Todd, ed., Surveys in Numerical Analysis, McGraw-Hill, New York, 1962, pp. 208-221.

Cohn, H., "Some Illustrative Computations in Algebraic Number Theory," in John Todd, ed., Surveys in Numerical Analysis, McGraw-Hill, New York, 1962, pp. 543-549.

Cohn, H., "Numerical Survey of the Floors of Various Hilbert Fundamental Domains," Math. of Comp., Vol. 19, 1965, pp. 594-605.

Cohn, H., and Gorn, S., "Computation of Cyclic Cubic Units," NBS J. Res., Vol. 59, 1957, pp. 155-168.

UPDATES

Harvey Cohn died on May 16, 2014. (THVV, 2017)


Original content Copyright © 1995 by the Institute of Electrical and Electronics Engineers Inc.
New content Copyright © 2013-2023 by the IEEE Computer Society and the Institute of Electrical and Electronics Engineers Inc.
All rights reserved. This material may not be reproduced or redistributed without the express written permission of the copyright holder.