Computational Chemistry


The Development of Computational Chemistry in the United Kingdom

Stephen J Smith and Brian T Sutcliffe

1997

Reviews in Computational Chemistry Vol 10

This is an excerpt from a much longer paper tracing the early history of Computational Chemistry in the UK through the Atlas years to the period when support had moved to Daresbury. Of the original plan to move the Atlas Laboratory to Daresbury, four Atlas people actually moved, Vic Saunders, Martyn Guest, Pella Machin and Mike Elder.

Emerging from the 1950s

So, at Cambridge, as at Manchester, there was a great deal of computational quantum chemistry underway by the late 1950s, and indeed there was a great deal of computational everything by 1960. However, the UK university system in general, and all computer-based study in particular, were on the verge of a great transformation.

The change in the UK universities was (at least in part) in response to a report by a committee chaired by Lord Lionel Robbins, the Director of the London School of Economics. The committee reported late in 1963 and recommended that the number of UK universities be increased from the existing 25 or so (depending on how they are counted), dealing with some 118,000 students (about 4% of the age group), to about 50, to deal with about twice as many students. There were to be consequent changes in other institutions of higher education, and because all of these are included in later tabulations, the comparable figure for the total number of students in 1962/63 is 216,000 (about 8.8% of the age group). By 1971 there were about 457,000 students in higher education (about 11% of the age group).

As far as computers were concerned, in the early and mid-1950s, the politically powerful scientific figures in the UK, such as Hartree, Blackett, and Darwin (Superintendent of the National Physical Laboratory), firmly believed that three or four digital computers would be quite enough for the nation's needs. Thus there had been no real thought in the UK about the possibility of the mass production of machines. But by 1960 it was clear that the demand for computing resources just in Universities and research establishments was soon going to outstrip the supply. It was clear, too, that it was rapidly going to become impossible for every user to run his own jobs on the machine and that thought had to be given to setting up a proper service to run the machine effectively, without, it was hoped, alienating the user. It was also realised that the development of a system (or monitor), together with assembler and higher level languages, could ease both the task of the programmer and those responsible for running the machine.

There were also contemporary technological developments, and perhaps the most important of these was the transistor. This United States invention meant that it was no longer necessary to use valves (tubes) in anything like the numbers that had been used heretofore, and their replacement by transistors brought a consequent increase in reliability and a diminution in power requirements. Thus, although machine design was still discrete-component, so that each board had to be individually assembled, machines became smaller and easier to keep cool. Also, more automatic methods of producing ferrite core store were developing, so that such store, though still expensive, became the fast store of choice. Just as wealth in primitive societies was measured in the size of the herds, so in the computer-owning society wealth came to be measured in the size of the core store.

The story in the United States is quite similar to that in the UK up to this point, but from this time on it can be said, admittedly in a rough-and-ready way, that the United States won the battle for the expansion of higher education and for computer mass production. So the research groups involved in the UK in the 1960s were to be much smaller than their United States counterparts, and the computers that they had at their disposal, on the whole, were smaller and less reliable than those in comparable United States undertakings.

The 1960s

It is quite difficult to pick a path through this period, say, 1962-1972, in the UK. Up to 1962, computational chemistry is easy enough to characterise. It consisted simply of X-ray crystallography and of molecular quantum chemistry, which in turn consisted chiefly of atomic and molecular electronic structure calculations. But after 1962 it became much more varied. Simulation work of various kinds began to be computerised. This work ranged from the simulation of reaction kinetics through the beginnings of reaction dynamics of various kinds and on to statistical mechanics. There were also developments in the computational simulation of spectra of all kinds. Programs for ESR and NMR and molecular vibration-rotation spectra and the like all began to be developed. The development of direct methods in X-ray crystallography also began to lead to computational simulation approaches to the decoding of X-ray data. In such work, the computer model replaced the ball-and-stick models that had begun to be used in the 1950s to elucidate protein crystal structures. This approach reached its full fruition, however, only in the late 1970s and 1980s, with the development of powerful graphics workstations.

There began also the first attempts at computer-aided instrumentation, so that the output from a given experimental setup could be processed directly on a computer, without the need for recording and separate transcription of the results. This was a factor that contributed to the development of networks toward the end of this period. Not only were these developments being made, but chemical information was also being computerised, and many database-type developments were begun. There was also a move to develop computer-aided synthesis, particularly organic synthesis, though this was chiefly pursued in the United States and only to a limited extent in the UK. The first early steps in what would now be recognized as chemometrics were also being taken.

It is not possible to do justice to all these developments in a brief review like this, and what follows will undoubtedly be slanted toward the development of computational quantum chemistry. But the experience of that field is not incongruous with that of some of the other fields mentioned above, though it is rather different from the experience of the information retrieval and processing side of things. Given this, it is probably prudent to start from what would be widely agreed to be a really seminal occasion for all computational quantum chemistry, not just that in the UK, namely the Boulder Conference of 1959. The proceedings of this meeting are recorded in Reviews of Modern Physics, 32, which appeared in 1960.

It was at that meeting that Charles Coulson offered the view that quantum chemists came in two types, "group I (electronic computers) and group II (non-electronic computers), though a friend has suggested as alternatives the ab initio-ists and the a posteriori-ists." The present concern is not so much with whether he turned out to be right or not, but that he (with his usual perceptiveness) recognised that a sea change was occurring in the subject and that the computer had become an absolutely essential tool in the development of one kind of quantum chemistry. In fact, many of those in group II used semi-empirical methods to perform simple calculations by hand, and since that time the computer has actually come to dominate in calculations of this kind too.

What was clear to any reader of that issue of the journal was the enormous extent of computer use in the United States and the variety of machines available to the chemical community, often, though not always, by means of collaboration with military agencies and the defence industry. Thus much of the Chicago work was done on a Univac 1103 at Wright-Patterson Air Force Base near Dayton, Ohio. Especially noteworthy among such work was the systematic LCAO-MO SCF study undertaken by Ransil on all the first-row diatomics. Some work by Nesbet was done on the IBM 704 at the Martin Aircraft Company in Baltimore. The work of Goodfriend, Birss, and Duncan on formaldehyde was performed on an IBM 650 (a great workhorse in many United States efforts about this time) at the US Army Ordnance Ballistic Research Laboratories. The work from MIT used the Whirlwind, which was a machine engineered and built at MIT. The UK computational contribution was made chiefly by Boys, both alone and with his student, Julian Foster. The calculations reported were all performed on EDSAC at Cambridge.

Young people in the UK who were interested in developing computational quantum chemistry were therefore, not unreasonably, strongly attracted to a period of work in the United States, where there were many more computing facilities than there were in the UK and, so it seemed, much more interest in such work. It should perhaps be recorded that quantum chemistry in the UK was at this time dominated by the groups at Oxford and at Cambridge, the first under Charles Coulson and the second under Christopher Longuet-Higgins (who had succeeded John Lennard-Jones in 1954). Both these powerful figures remained cool about computational work in quantum chemistry, and though neither was Luddite or anti-computer, both regarded it as appropriate to make anyone who wished to use a computer explain precisely why, and what advantage was expected. They were both anxious (and not foolishly anxious, either) that calculation not be used as a substitute for thought. They were not alone in the UK in their anxiety, and in the United States a similar anxiety was often expressed, perhaps most forcefully and effectively by Peter Debye.

However, it was clear that in the United States Robert S. Mulliken at Chicago and John C. Slater at MIT (among others) presided over groups in which computational electronic structure work of all kinds was encouraged. Thus for a period in the early 1960s, quite a lot of UK quantum chemistry was actually done in the United States as young people came from the UK on postdoctoral research fellowships to work in United States labs. The kind of thing that happened can be exemplified by considering developments at the Solid State and Molecular Theory Group at MIT between 1962 and 1964. (This is, of course, to recount only a small portion of the history of a distinguished group of much longer duration and wider interests than might appear from the present account.)

In the SSMTG Quarterly Progress report for January 1963, J. C. Slater notes in his introduction that since October the group has been joined by two postdoctoral workers from England and one from Canada "who have extensive experience in molecular calculations." All three had been attracted to MIT not only for its computational facilities (it had an IBM 709 as a group machine in the Collaborative Computational Laboratory run by a very competent, experienced, and helpful staff) but also by the presence there of M. P. (Michael) Barnett, an expatriate from the UK, who was at that time the guru of many-centre integrals over Slater orbitals. It was generally agreed that, in implementing molecular calculations of any kind, it was the evaluation of the three- and four-centre electron repulsion integrals over Slater orbitals that caused the real bottleneck. The only plausible method of tackling them was by means of the Barnett-Coulson expansion, an expansion that was known, however, to have notoriously erratic convergence properties. Although Barnett and his co-workers (among whom were Russ Pitzer, Don Ellis, and Joe Wright) had made some progress in doing three-centre integrals generally and four-centre ones in special cases, it rapidly became clear to the new arrivals that any progress was going to be very slow indeed.

In these circumstances M. C. (Malcolm) Harrison, one of the Englishmen, was able to convince the molecular-structure people that the use of gaussian, rather than Slater, orbitals was the way forward. He also convinced them of the utility of the LCAO-MO-SCF approach and of the subsequent Configuration Interaction (CI) calculation to allow for electron correlation. He claimed no originality for these ideas, attributing them to his teacher at the University of Leeds, C. M. (Colin) Reeves who, he said, had begun to develop them as a student of Boys. The group therefore divided up the work of programming and brought in graduate students to help. The whole thing was organised and held together by Malcolm Harrison. He wrote software for file-handling on the tape units, which forced the collaborators to use a common interface, and he was ruthless in castigating bad and sloppy habits of programming (although he had a soft spot for computed "GOTO"s). Thus developed, in just under a year, the initial phase of POLYATOM, the first automatic program system to perform LCAO-MO-SCF calculations quite generally. Although molecules were not uppermost in J. C. Slater's mind at this time (he was developing Xα theory and beginning its computational implementation), he was enormously supportive of this work. This was particularly so when things went wrong in the computing, as they sometimes did, and so much very costly computer time was expended to no avail. He took such setbacks calmly and paid the computing bills without demur and without visiting retribution on the heads of the perpetrators, remarking that this was a price that had to be paid in developing new things. It must be admitted that his calmness and reasonableness were quite unexpected by those involved, for he occasionally had a fearsome temper and quite a short fuse.
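The advantage of gaussian over Slater orbitals that Harrison was urging rests on what is now called the Gaussian Product Theorem: the product of two gaussians on different centres is itself a single gaussian on an intermediate centre, which is why the three- and four-centre repulsion integrals that defeated the Barnett-Coulson expansion become tractable. A minimal numerical sketch of the theorem (the exponents and centres are arbitrary illustrative values, not from any calculation of the period):

```python
import numpy as np

# Two unnormalised s-type gaussian orbitals on different centres A and B.
# The Gaussian Product Theorem: their product is a single gaussian on an
# intermediate centre P, with a constant prefactor K. Slater orbitals
# have no such property, which is why multi-centre integrals over them
# were so hard.

a, b = 0.8, 1.3                    # gaussian exponents (illustrative)
A = np.array([0.0, 0.0, 0.0])      # centre of first orbital
B = np.array([1.0, -0.5, 2.0])     # centre of second orbital

def gauss(r, alpha, centre):
    """Unnormalised s-type gaussian exp(-alpha * |r - centre|^2)."""
    d = r - centre
    return np.exp(-alpha * np.dot(d, d))

# Predicted combined gaussian: centre P and prefactor K
P = (a * A + b * B) / (a + b)
K = np.exp(-a * b / (a + b) * np.dot(A - B, A - B))

# Check the theorem at an arbitrary point
r = np.array([0.3, 0.7, -1.1])
lhs = gauss(r, a, A) * gauss(r, b, B)
rhs = K * gauss(r, a + b, P)
print(abs(lhs - rhs) < 1e-12)      # the two sides agree
```

Because every product of basis functions collapses in this way, a four-centre integral over gaussians reduces to a two-centre one, and that in turn has a closed form.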

The first phase of POLYATOM was reported on by Barnett at the Sanibel meeting held in January 1963, and though the report was more about what was planned than what had by that time happened, it is nevertheless an accurate account of what eventually was to happen. By the end of 1963 the program had become widely distributed, without guarantee, and though it was not quite bug-free initially, it rapidly became so with use; the first papers using it were published two years after the completion and checking of the system.1 The system was described in an MIT Collaborative Computational Laboratory report, and it went on to further development in the United States under the direction of Jules Moskowitz (who had been one of the original MIT SSMTG molecular calculations group) at New York University (NYU) and at Bell Telephones. It is in this more developed form ("Version 2") that the program was most widely distributed in the late 1960s and up to the mid-1970s. But by that time the UK participants in the development had moved on to other things, and Slater's Canadian postdoctoral worker mentioned earlier had become a large-scale user of the system and no longer a developer.

In 1963 the Chicago group and Enrico Clementi were taking the first steps toward what was to become IBMOL, and for a time there was a friendly rivalry between them and the developers of POLYATOM; but as IBMOL began its IBM-based development, it drew steadily ahead of POLYATOM in the level of support and the facilities provided. Even today it is possible to find programs employing POLYATOM ideas and, occasionally, the vestiges of POLYATOM code can be recognised. POLYATOM and IBMOL thus began a line of program systems for molecular electronic structure calculations whose current representatives are GAUSSIAN, GAMESS, and so on.

The CI program continued its development in Jules Moskowitz's group at NYU from late 1963 until late 1964. Although a users' manual was issued then, it did not attract much use or attention for some years to come. This was essentially because of the difficulty of transforming the two-electron integrals from a basis of AOs to a basis of MOs on machines that had only tapes as backing-store and rather small fast-access store. Only with the development of large disc storage and the development of chaining algorithms did the approach really become a starter. It was first effective some 10 years later, as part of the ATMOL package and also as part of the MUNICH package. These developments will be considered in context.
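The bottleneck referred to here is what is now called the four-index transformation. Done as a single contraction it scales as N^8 in the basis size; done one index at a time, as the chaining algorithms did, it scales as N^5, but each quarter-transformation must revisit the whole partially transformed array, which is straightforward with random-access disc storage and painful with tapes. A schematic modern sketch (the dimension and arrays are random illustrative data):

```python
import numpy as np

# AO -> MO transformation of the two-electron integrals (pq|rs).
# Naively this is one four-index contraction costing O(N^8); doing it
# one index at a time ("chaining") costs four O(N^5) passes, each of
# which sweeps the whole partially transformed array.

N = 6                                     # basis size (illustrative)
rng = np.random.default_rng(0)
ao = rng.standard_normal((N, N, N, N))    # AO-basis integrals (pq|rs)
C = rng.standard_normal((N, N))           # MO coefficients, one MO per column

# Naive single-step transformation, O(N^8)
mo_direct = np.einsum('pi,qj,rk,sl,pqrs->ijkl', C, C, C, C, ao)

# Quarter-transformations, one index at a time: four O(N^5) steps
t = np.einsum('pi,pqrs->iqrs', C, ao)
t = np.einsum('qj,iqrs->ijrs', C, t)
t = np.einsum('rk,ijrs->ijks', C, t)
mo_chained = np.einsum('sl,ijks->ijkl', C, t)

print(np.allclose(mo_direct, mo_chained))  # same result, far cheaper
```

On a tape-based machine each of the four passes meant rewinding and re-reading the whole integral file, which is why the method only "became a starter" once large discs were available.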

It is appropriate to note here, too, that at about the same time as the developments recorded above were taking place, some senior UK people who made great contributions to computational chemistry also went to work in the United States. In the field of quantum chemistry perhaps the most notable figures were M. J. S. (Michael) Dewar and J. A. (John) Pople. Some too went to Canada, among them P. R. (Phil) Bunker, who was to contribute to computational spectroscopy at the National Research Council of Canada. It would be possible to expand this list of persons who, to use a term fashionable at the time in the UK, went down the "brain drain" from the UK. But since they all made their contributions from their adoptive countries, it would not be appropriate to claim them for computational chemistry in the UK.

However, it would be wrong to give the impression that all UK computational chemistry done between 1962 and 1965 was done abroad. There was a continuing effort at Cambridge as described by Handy. There was also continued work at Manchester, especially in X-ray crystallography, but the quantum chemistry side of things was diminished by the departure of two of the senior figures, Huw Pritchard and Bill Byers Brown, both eventually going to North America. But development was difficult, for there was a chronic shortage of computer power in universities and civilian research institutes in the UK, and this shortage was not mitigated by the kind of collaborations with the military and defence sectors that were so usual in the United States. Such collaborations were generally forbidden on security grounds, and although this is understandable in the political context of the times, all involved knew that, in practice, it was a quite daft prohibition. Things would thus have been pretty grim for computational chemistry but for two developments.

The first was that of the Atlas Computer Laboratory, which opened in summer 1964. The second was the publication of "A Report of a Joint Working Group on Computers for Research" in January 1966. This last is always called the Flowers Report, after its chairman B. H. (Brian) Flowers, who was then Langworthy Professor of Physics (in succession to Blackett) at Manchester. He was later to become a government science adviser, chairman of the Science Research Council (SRC) (one of the successors to the Department of Scientific and Industrial Research), chairman of the Computer Board, Rector of Imperial College, London, and eventually Lord Flowers.

The Atlas Computer Laboratory

Atlas Lab had its origins in the Atomic Energy Research Establishment at Harwell, and indeed the lab was, and is still, sited next door to Harwell. This section owes much to a talk that J. (Jack) Howlett, the first director of the lab, gave at the Warwick meeting of the IEE in July 1993. The Theoretical Physics Division there was heavily involved in computing for reactor design. The head of this division in the immediately postwar period was Dr. Klaus Fuchs, who was later found to have passed secrets to the USSR during the war. This was no doubt one of the reasons for the political reluctance to permit academic computational collaboration with the military and defence agencies. By the middle 1950s the division was using the computers then available, including, by 1958, a Mercury. But, in practice, for serious reactor design calculations they needed to use the IBM 704, which was situated at the very high-security Atomic Weapons Research Establishment at Aldermaston, and for advanced reactor design it was clear that they were going to have to use a machine of power comparable to IBM's proposed Stretch machine.

In these circumstances it was natural for the computational group in the Division to press for a large machine. It was not unreasonable of them to suggest that it should be designed and built in the UK. The ambitions of the group were supported by the Director of Harwell, Sir John Cockcroft, and negotiations led to a proposal that Ferranti should build, to a Manchester University design, a super-computer, eventually to be called Atlas. The proposed machine would satisfy all the computing requirements of the Atomic Energy Authority, both at Harwell and elsewhere, and still have spare capacity. It was proposed that the spare capacity be made available to UK universities generally for work needing large-scale computational support. This provision was to be without charge and as of right. The proposal was so expensive that it required the specific approval of the Minister of Science, and it was submitted to him. The response was favourable and, in one respect, surprising: it was decreed that the machine be run not by Harwell but under the auspices of the then newly created National Institute for Research in Nuclear Science (NIRNS). (NIRNS was created in an attempt to encourage civilian research in nuclear and high energy physics, areas that were proving just too expensive for research on an individual university basis.) A special laboratory for the machine was to be built outside the wire at Harwell, adjacent to the site of the proposed Rutherford High Energy Laboratory (RHEL), which was also part of the NIRNS remit. It was to provide services to the Atomic Energy Authority and to the universities as proposed, but the Atomic Energy Authority should pay for the services it received. It turned out that the Atomic Energy Authority made no use at all of the services of the Atlas lab, and so the lab quickly became a completely civilian facility, eventually coming under the wing of the SRC after the demise of NIRNS (1965).

The decision to go ahead was made late in 1961, and the building of the lab was completed by spring 1964, and the Atlas machine was delivered in April 1964. By then, Ferranti had sold its computer business to International Computers and Tabulators, a firm that was later (1968) to become International Computers Limited (ICL) on acquiring the computer interests of all other UK manufacturers.

It may be of interest to record that the part of the lab built to hold the machine was on two floors. All the equipment that only the engineers needed to touch was on the ground floor, and the operations section with the tape drives, the printers, and so on was on the first floor. The space needed reflected the enormous size of the machine, for it took 14 truckloads to deliver it all, and 3 months were needed to install and test it. The total power consumption of the machine and its attendant cooling facilities was about 150 kVA. None of these figures is out of line with contemporary machines of comparable power. (The machine delivered roughly 350 kFLOPS.) The machines of the time were power-hungry monsters that took a lot of maintenance and needed highly skilled operators.

The machine came on-line for users in October 1964 running a single shift, but the uptake by users was so quick and great that by early 1965 the machine was being run around the clock. The machine was run in batch mode. Users submitted jobs, usually on punched cards, to the operators, of whom there were six to eight on a shift; the jobs were run, and finally the output, together with the input cards, was returned. Atlas has a pretty good claim to be the first machine with an operating system, in the form of the Atlas Supervisor. This program controlled the I/O, scheduled the jobs, did automatic transfers between main and backing store (paging), and kept job statistics. Atlas also ran a number of compilers, among them FORTRAN, and the symbolic assembler language was very similar to FAP.
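The Supervisor's "automatic transfers between main and backing store" are what would now be called demand paging. A generic modern sketch of the idea (Atlas's actual replacement policy was a more subtle "learning" algorithm; plain least-recently-used eviction is shown here, and the page contents are dummy values):

```python
from collections import OrderedDict

# Demand paging in miniature: a small set of core "frames" backed by a
# drum. Accessing a page not in core causes a page fault; if core is
# full, the least-recently-used page is written back to the drum to
# make room. This is an illustrative policy, not Atlas's own.

class PagedStore:
    def __init__(self, frames):
        self.frames = frames          # pages that fit in core at once
        self.core = OrderedDict()     # page -> contents, in LRU order
        self.drum = {}                # backing store
        self.faults = 0

    def access(self, page):
        if page in self.core:
            self.core.move_to_end(page)              # mark recently used
            return self.core[page]
        self.faults += 1                             # page fault
        if len(self.core) >= self.frames:
            victim, data = self.core.popitem(last=False)  # evict LRU page
            self.drum[victim] = data                 # write back to drum
        self.core[page] = self.drum.pop(page, 0)     # fetch (0 if new)
        return self.core[page]

store = PagedStore(frames=2)
for p in [1, 2, 1, 3, 2]:     # page 2 is evicted, then faulted back in
    store.access(p)
print(store.faults)            # -> 4
```

To the programmer, core and drum together looked like one large store, which was the Supervisor's great convenience: no more hand-coded overlays.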

Because of the exigencies of a batch mode of operation, it was necessary for a user to be present at the lab at least during the early stages of program development and debugging. Because the lab was in a fairly remote rural setting (Oxford was the nearest large town, about 15 miles away), workers from different computational disciplines were at close quarters with each other and rather isolated from mundane concerns. There was thus much discussion of computing techniques in a cross-disciplinary environment while waiting for a run-slot. Once programs were ready for production running, card-decks could be sent by post to the lab, where they were run and returned with the output. The lab had a small staff of experienced and capable support programmers who would look at any job that failed and assess if the error involved was trivial and could be corrected on site and the job rerun. One person, Mike Claringbold, seemed to many users of the time to have almost supernatural powers of error perception and correction, and his skills certainly added immeasurably to the effectiveness of remote operation.

The lab also began to develop support groups of staff for particular computational enterprises. The ones relevant to computational chemistry began to develop in the early 1970s and will be considered later, but to understand the relationship between the lab and the computational developments more generally in the UK, it is necessary to consider the Flowers Report and that to which it led.

The Flowers Report

Although the UGC had a subcommittee on computing, it was not fully seized of the urgency of the need for computing capabilities in the universities. However, the Committee for Scientific Policy, a body comprising the great and the good which advised government on national scientific policy, told the government of the day (1964) that there was a real need for computers in universities, and a need that should be met as quickly as possible. The government was sympathetic to the Committee for Scientific Policy's message, for it sought to stimulate the UK computer industry on the advice of the NRDC. Thus when Brian Flowers proposed that a joint committee of the UGC and the Committee for Scientific Policy be set up "To assess the probable computer needs during the next 5 years of users in Universities and civil research establishments receiving support from Government funds," the proposal was acted upon by the government by setting up such a committee and appointing Brian Flowers chairman. The committee (which among other distinguished members contained Dr R. F. (Bob) Churchhouse, Head of the Programming Group at Atlas lab) set about its work with great dispatch, travelling around the country taking evidence from interested parties. A report was presented in June 1965, and the government gave general approval to the proposals in December 1965. This approval included the setting up of the Computer Board, of which Brian Flowers became the first chairman in 1967.

The committee recommended that the UK be treated as a collection of regions with a hierarchical system of computer provision within a region. There were to be regional computer centres, and each university in the region was to have a similar smaller computer, the size being determined essentially by the size of the physical science and engineering departments in the university. Any university user had access as of right to the machine at the regional centre. The government accepted the recommendations and very quickly implemented them so that, under the direction of the Computer Board, by 1967 almost all universities had a computer and a computer centre, and the regional centres had been established, usually at the largest university in the region. Computational chemistry in the UK was thus in with a chance to achieve state-of-the-art research.

Emerging from the 1960s

Initially the computers provided by the Computer Board were for research work only and not for teaching. However, that distinction did not persist into the late 1960s, and the computers became a general educational resource. It was not permitted that the computers so provided be used for university administration and, since they were required to provide an effective service, there were very strict limits on the extent to which the then burgeoning tribe of computer scientists were allowed to monkey about with the machines.

As a general rule the initial provision was of UK-manufactured machines. The smaller ones were comparable to the IBM 709 and the larger ones were of about IBM 7090 or 7094 power. It would probably be agreed now that most of the machines provided were not wonderfully satisfactory, either in terms of hardware reliability or in software provision. By the late 1960s most of them had been got to work in a satisfactory manner, but, alas, this was just as their manufacturers were going out of business. After 1968, whatever UK computer a university had, it had to deal with ICL, which had taken over all the other UK computer manufacturers. Thus to the anguish of the users over the machines themselves was added the difficulty of dealing with an essentially monopoly supplier.

The period from 1967 up to 1972 or so was a period in which computational chemists felt themselves to be struggling against machine limitations, but at least they were able to do some computational work in their home institutions. They could also use the regional centres where, in some cases United States-made machines were available, and, if they had SRC grants, they could use Atlas. So though it was a period of frustration, it was also a time of progress. That progress may be typified by developments in computational quantum chemistry and related enterprises, and no attempt will be made to cover crystallographic computing or the burgeoning interest in databases and bibliographic developments. What follows is a perhaps rather impressionistic attempt to convey the nature of the computational chemistry enterprise in the UK at this time.

At Manchester computational quantum chemistry began to develop strongly again with the building of a group under Dr (now Prof.) Ian Hillier. Among the many things of interest that originated in that group at about this time was probably the first computational realisation of a "direct method" in solving the LCAO-MO-SCF equations, which appeared in the paper by Hillier and Saunders.35 The work was performed on Manchester's own Atlas, which was the prototype for the machine at Atlas lab, and somewhat smaller than it. This work is also interesting because the (gaussian) integral evaluations were carried out using a program written in FORTRAN IV by Vic Saunders, which was based on the integral programs in IBMOL released as QCPE 92 (see below for QCPE). This work can be regarded as the start of the ATMOL system. It was developed further at Atlas lab, which Dr Saunders was to join in 1970.

The paper of the Manchester lab appeared almost simultaneously with one by Roger Fletcher36 on the use of direct minimisation techniques in MO calculations. Fletcher had been a student of Reeves at Leeds at the same time as Harrison. With Reeves he developed a widely used conjugate gradient method of minimisation, which is often called the Fletcher-Reeves method. Fletcher's interest in computational chemistry proved to be only a passing one, and he went on to have a distinguished career in optimisation theory itself. However, the Fletcher-Reeves method was used in LCAO-MO work originating from both the quantum chemistry group in York and the group in Leicester.
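For readers unfamiliar with it, the Fletcher-Reeves method builds successive search directions from gradients alone, each new direction mixing the new negative gradient with the previous direction through the coefficient beta = (g_new . g_new) / (g_old . g_old). A minimal sketch on a small quadratic objective (the matrix and vector here are arbitrary illustrative values, not from either paper):

```python
import numpy as np

# Fletcher-Reeves conjugate gradients, minimising the quadratic
# f(x) = 0.5 x^T Q x - b^T x, whose gradient is Qx - b and whose
# minimiser solves Qx = b. For an n-dimensional quadratic with exact
# line searches, the method converges in at most n steps.

Q = np.array([[4.0, 1.0], [1.0, 3.0]])   # symmetric positive definite
b = np.array([1.0, 2.0])

x = np.zeros(2)
g = Q @ x - b                  # gradient at the starting point
d = -g                         # first direction: steepest descent
for _ in range(2):             # n = 2 steps suffice here
    alpha = (g @ g) / (d @ Q @ d)      # exact line search along d
    x = x + alpha * d
    g_new = Q @ x - b
    beta = (g_new @ g_new) / (g @ g)   # the Fletcher-Reeves coefficient
    d = -g_new + beta * d              # new conjugate direction
    g = g_new

print(np.allclose(Q @ x, b))   # converged to the minimiser of f
```

In SCF applications the attraction was the same as in this toy case: only gradients are needed, with no storage of a Hessian, which mattered greatly on the small core stores of the day.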

It is perhaps not fair to count these as the very first uses of gradient methods, for nowadays that term is more associated with geometry optimisation in electronic structure calculations. Credit for the first in the latter category must surely go to the paper of Gerratt and Mills, work originating from the group at the University of Reading. The chief computational chemistry interest of this group was molecular spectroscopy. It was while working at Reading that J. Watson developed the modern form of the Eckart Hamiltonian, which was to provide the basis for most subsequent computational work on the interpretation of molecular rotation-vibration spectra.

In Cambridge, EDSAC2 was still in use but in January 1965 a TITAN computer was installed. This was essentially a kit-built Atlas, and the story of its accession to Cambridge is told in an entertaining manner by Wilkes. Boys and Handy had begun work on the transcorrelated method, a method, albeit nonvariational, for incorporating the interelectronic distances into the wave function. A series of papers resulted culminating in 1969 with the use of the technique on the LiH molecule. In 1967 Christopher Longuet-Higgins resigned as Professor of Theoretical Chemistry and took up a Royal Society professorship in Edinburgh to study artificial intelligence. In 1969 A. D. (David) Buckingham became Professor, coming from Bristol where he had also been Professor of Theoretical Chemistry. With his advent the study of intermolecular forces began to develop strongly, its computational aspects owing much to the ideas and energy of Anthony Stone.

Both at Manchester and at Cambridge there were the beginnings of developments in computational molecular dynamics and scattering theory, and these disciplines were developing in a lively way in London University too. Molecular dynamics was developed particularly by Konrad Singer and Ian McDonald at Royal Holloway College.

Computational work had also begun to flourish in the group of Charles Coulson in the Mathematical Institute at Oxford. Not only was molecular electronic structure work done there but also heavy-particle scattering, and quite a lot of that aspect of the work can be discovered by reading the book by Levine, who was in the Mathematical Institute at the time. Computational work had also begun in the Physical Chemistry Laboratory chiefly with Peter Atkins and his students with their interests in NMR and ESR simulation, and Mark Child and his students, with their interests in semiclassical scattering theory.

Oxford was not alone in having a department of mathematics as its centre of computational quantum chemistry: the same was true at the University of Nottingham, where the Applied Mathematics department under Prof. George Hall made important contributions to many aspects of computational chemistry. A typical computational paper from that group at about that time is the one by David Brailsford and Brian Ford on the ionization potentials of the linear alkanes. The paper is interesting not only for its content but also for the future careers of the authors. David Brailsford went on to be professor and head of the department of Computer Science at Nottingham, and Brian Ford became director of the Numerical Algorithms Group (NAG), a group that made enormous contributions to software development in the 1970s and 1980s and indeed still does. There was also scattering theory at Nottingham, but not strictly of chemical relevance.

During the late 1960s Roy McWeeny left the University of Keele for the University of Sheffield and that rapidly became a centre of computational quantum chemistry as well as a centre for spectral simulation of all kinds. McWeeny's group was remarkable in having its own computing facilities, an uncommon occurrence at the time.

It was much more usual at the time for researchers to use their local computing facilities, supplemented by the regional centre facility and sometimes Atlas. An example of such use is in a paper originating from the University of Birmingham by Deryk Davies, which is additionally of interest because all available levels of computing resource were involved in its execution and because it is one of the earliest UK uses of programs provided by the Quantum Chemistry Program Exchange (QCPE).

That which has been written above perforce ignores the work of many in the field in the UK, for at least 10 other institutions could have been mentioned at which work in computational chemistry of this kind was being carried out by individuals or very small groups. Failure to mention them explicitly is not intended in any way as a slight on their work; it is simply hoped that what has been written fairly typifies the sort of work that was going on in the UK, without traducing anyone.

The point of the story has been to show that the number of persons involved in the endeavour in the UK was rather small (perhaps a score or so of quantum chemists, with perhaps a comparable number in dynamics and simulation, although there were probably about five or six times as many crystallographers), and that their efforts were fragmented. It was natural therefore that attempts would be made to encourage collaborations so that the UK might more effectively contribute to computational chemistry at the international level. The development of such collaborations is the theme for the 1970s. It is also the case that from about 1973 onwards the story of computational chemistry in the UK becomes, to a large extent, the story of computational chemistry on United States designed and built machines, and this change and its difficulties are another theme for the 1970s.

The 1970s

In 1970 Vic Saunders was appointed to the staff at Atlas lab with the remit of supporting computational chemistry generally and in particular computational quantum chemistry. He was joined in this enterprise in 1972 by M. F. (Martyn) Guest. In a similar manner, first M. (Mike) Elder and then P. A. (Pella) Machin were appointed for the support of computational crystallography, from which developed the support for chemical databases and database manipulation. There were, of course, many other appointments in support of other computational enterprises, but to follow the development of these two through the 1970s typifies the developments in computational chemistry more generally. In saying this, however, a caveat must be entered for quantum chemistry because two outstanding and extremely influential figures in the subject died early in the decade. Frank Boys died at the age of 61 in October 1972, and Charles Coulson died, after a long period of debilitating illness, at the age of 63 in January 1974. There can be no doubt that their intellectual presence in the subject was much missed as was the consummate political skill of Charles Coulson.

There were also some institutional changes at Atlas lab and at RHEL that affect the story. Although the computing facilities for the high energy physics community at RHEL had been provided on a UK machine (Orion), by 1966 that machine was proving inadequate for its needs, and RHEL won the argument that any future machine should be from IBM on the grounds that this was necessary to collaborate effectively with CERN in Geneva. In 1967 Orion was replaced by an IBM 360/75, and this was replaced in 1971 by a 360/195. The Atlas machine was closed down in 1973, and though every effort was made to replace it by a UK machine of suitable power, this proved impossible. So in 1975, on the retirement of its first (and only) director, Jack Howlett, Atlas lab was merged for organisational purposes with RHEL, to form the Atlas computing division of the Rutherford Appleton Laboratory (RAL), and computing work was transferred to the 360/195. There was a large upheaval in 1977 when five of the Atlas lab staff who had been particularly concerned with computational chemistry support were transferred to what had been a nuclear physics laboratory, running an electron synchrotron called NINA at Daresbury near Warrington in northwest England. The developments at Daresbury are important for the story after 1977, but need not be told now. There were similar changes in computing equipment both at Cambridge and at Manchester, but consideration of these too can be delayed.

The Meeting House Developments

During 1971 and 1972 there was much discussion in the Science Board of the SRC on the computational needs of theoretical physics and chemistry. It was the view of some that the way forward was to set up institutes researching these subjects and to concentrate the personnel and computing power at these institutes. Such an idea for theoretical chemistry institutes was floated by Brian Flowers (who was then Chairman of SRC) but it rather soon sank. An idea that seemed more buoyant was that of "Meeting Houses", whose origin lay probably with Prof. R. (Ron) Mason. He was Professor at the University of Sheffield until 1971, when he moved to the University of Sussex. From 1969 he was Chairman of the Chemistry Committee of the Science Board, and from 1972 to 1975 he was Chairman of the Science Board itself. He went on to be Chief Scientist at the Ministry of Defence, as Sir Ronald Mason KCB, and to hold some important positions in public life. The fruition of his idea as far as computational chemistry is concerned begins with a memorandum by Jack Howlett to the Science Board in October 1972, which is worth quoting from rather extensively. Howlett wrote:

It seems to us that the way one would go about actually setting up and conducting a "Meeting House" on any particular topic is likely to be largely independent of the topic. We suggest the following: an underlying assumption is that every project which is undertaken is expected to have some visible product, often no doubt a computer program, but in other cases (or in the same case) possibly a new method of attack on some problem or an understanding of some phenomenon.

  1. We invite a small number - say 4 - of very distinguished scientists whose views would command respect to meet in the Laboratory for a day, to discuss the subject amongst themselves and to suggest areas for study, and names of people knowledgeable in those areas.
  2. We get as many as possible of the people suggested in (i) to come to the Laboratory for more detailed discussions amongst themselves and also with members of our own staff and possibly a Science Board nominee. These may go on for a few days, and should lead to the specification, in broad terms, of one or more problems that are to be tackled and to the setting up of a small group to be responsible for each. These groups may or may not include Laboratory staff and are not permanent, but disband when the project is completed.

    There would be an interval between (i) and (ii) whilst either a member of staff worked full-time on getting familiar with the subject or someone who was already knowledgeable was found who was willing to join the project for its duration. . . .

  3. The groups then settle down with members of the Laboratory to plan out the work in as much detail as seems sensible, and to decide on the resources required, e.g., manpower, machine time, special storage needs, special software support, special hardware.
  4. ....
  5. Having made. . . as good arrangements as we can, we and our collaborators start work. We should hope to be free of too much administrative control and reporting, but progress would be reported to the appropriate bodies . . . and to the 4 Wise Men (if I may refer to them thus) if, as one would hope, they were willing to keep a general eye on the work. The working groups would of course be free - be encouraged in fact - to arrange seminars at the Laboratory and to contribute to such meetings arranged in universities.

In all this we have placed the "Meeting House" concept in the domain of the Science Board, but the idea has aroused interest among the other SRC Boards, notably the Engineering Board.

In fact, the idea was taken up by the Engineering Board too, in the formation of what were eventually called Special Interest Groups, but it is the positive response of the Science Board that is of interest here and that led through the first meeting house project to the development of the Collaborative Computational Projects (CCPs).

Jack Howlett proposed, and the Science Board accepted, that the first meeting house should be one which would study "molecular correlation errors in theories which surpass the Hartree-Fock theory in accuracy", and nominated as the four wise men Professors Bransden (Durham), Burke (Queen's University Belfast), Coulson (Oxford), and McWeeny (Sheffield). Coulson was too ill to serve, and in his place J. N. (John) Murrell of Sussex was chosen. The wise men met first at the lab on 6 February 1974 with Roy McWeeny in the Chair. They first agreed to recommend that the title for the project overall should be "The Atlas Computer Lab Meeting House in Theoretical and Computational Physics and Chemistry" and further recommended that a specific project should be started, to be known as Project I: "Correlation in Molecular Wave Functions." It was recognised that if this project took off there would be a good case for other projects to start quickly thereafter. The other topics suggested were: continuum wave functions, atomic/molecular collisions, and chemical reactivity. For the moment, it was agreed just to start Project I by setting up a small working group to guide its progress and to report back to the four wise men (who, doubtless uncomfortable with their title, named themselves the "Steering Panel").

The first Project I working group consisted of two of the wise men (McWeeny and Murrell) and J. (Joe) Gerratt (Bristol), N. C. (Nicholas) Handy (Cambridge), M. A. (Mike) Robb (Queen Elizabeth College, London), and B. T. (Brian) Sutcliffe (York). The working group was to meet in March and to prepare a report for the Steering Panel giving a scientific programme and also indicating the scale of expenditure (manpower, computer resources, and travel grant funding) that would be required of SRC. But before considering the scientific consequences in more detail, we give an aside on the cultural history that may perhaps be illuminating.

It is not only the historical content that is interesting in what is recorded above. The manner of the development typifies a mode of thought, reflected in a proposal for the structure of an organisation, that was perhaps peculiarly English and of its time. At least it was certainly characteristic of UK intellectual and organisational life (including scientific endeavours) before the 1980s. The organisation is top-down and the thinking essentially elitist. The nature of the structures proposed makes it overwhelmingly likely that those lower down (admittedly only those chosen by the top) will rapidly become involved and actually set the intellectual and organisational programme of the enterprise, but the programme will be strongly guided by the elite. Such organisations, like benevolent despotisms, are not without their good features, but like benevolent despotisms, they must be tempered by the "occasional assassination or well judged suicide" to remain responsive. They are also quite intolerable to those not chosen by the elite and who are thus excluded from participating in the programme setting. Such modes of thought and forms of organisation vanished from the UK in the 1980s. They were replaced by structures justified by populist rhetoric and operating a peculiar version of Lenin's "democratic" centralism, in which the idea of top-down organisation meant that the central committee set the targets for the workers to achieve. It follows then that extreme caution should be exercised in attempting to learn any contemporary lessons from the history recorded here.

The first meeting of the Working Group took place at Atlas lab on 28 March 1974. At this stage the Atlas had been decommissioned, and it was pretty clear to the participants that any future computing would be on the IBM 360/195 at RHEL. This is implicit in all that is said from now on. The agenda for that meeting began as follows:

I THE SCIENTIFIC SCOPE OF THE PROJECT

Note: The steering panel felt that its general intentions for Project 1 could best be conveyed to the working group in the form of a list of topics worthy of investigation:

  1. Valence Bond Theory and Its Variants
  2. Multiconfigurational Self-Consistent Field Theory
  3. Geminal and Group Function Methods
  4. Large-scale Configuration Interaction
  5. Transcorrelated Wave Functions
  6. Many Body Perturbation Theory and Green's Function Methods
  7. Time-Dependent Hartree-Fock Theory, Response Functions, and Related Methods

The meeting was a lively one, as can be gauged from the minutes ("Dr Handy expressed shock. . .", "Prof. Murrell was generally sceptical. . .", "Dr Robb pointed out. . ."), and it appears that, at some stage, Prof. Murrell emerged as the Chairman. But what is really interesting is that there was a serious and extended discussion of whether a package of programs for quantum chemical purposes was the way forward and, if it was, how such a package should be implemented so as to be most useful to the community. In the end it was agreed to build on what was already available at Atlas lab and to link the existing SCF and MC-SCF packages via a stand-alone four-index transformer to a CI package to be developed from the POLYATOM code. It was agreed too that a Postdoctoral Research Associate (PDRA) would be of great benefit to the project.

The discussion begun at this first meeting was continued at a second one on 8 May. A little before this second meeting took place, from 8 to 11 April 1974, Atlas lab sponsored a symposium, held at St. Catherine's College, Oxford, entitled "Quantum Chemistry: The State of the Art." In fact, it covered rather more than just quantum chemistry; it included some scattering theory too. The proceedings make interesting reading as an account of many of the contemporary concerns of computational chemistry in the UK. The symposium was also seminal, in that it exemplified a pattern for useful and effective small meetings that was later much used to good effect by the CCPs.

From the minutes of the second meeting can be seen the sort of tensions developing that are perhaps inseparable from any attempt at a collaborative venture aimed at increasing the public good. The meeting was devoted to discussing what work the PDRA, once appointed, should undertake. It was recognised that any worthwhile PDRA would wish to be independent, but anxiety was expressed about the possibility of appointing someone who turned out to be a maverick. In the end it was agreed to present a selection of topics, each the particular interest of a member or members of the working group, and to appoint an individual with the possibility of any of these in mind. W. R. (William) Rodwell was appointed as the PDRA during the summer of 1974, in fact to develop the variational CI approach, both in the configuration-driven form and in the (then) recently invented integral-driven approach originating with Bjorn Roos and Per Siegbahn in Sweden. With this appointment, naturally enough, some members of the working group became more involved in the actual mechanics of implementing the code, while others became less so. At one level it could be looked on as a matter of winners and losers in the working group, but the stresses did not break up the group, for all seemed to hope for the production of code that would be of widespread use to the community.

The group did not meet again until October 1975 and by that time Atlas lab had become a division of RAL with all Project 1 work being done on the IBM 360/195, although a UK machine (an ICL 1906) was in use for other Atlas division projects. At that meeting a progress report of work done by Project 1 was presented. The configuration-driven CI had been developed from the MUNICH CI package developed by G. H. F. Diercksen. Brian Sutcliffe was involved in both MUNICH and the developments at Atlas, as was M. F. (Michael) Chiu. The integral-driven CI had been developed from the MOLECULE program of Roos and Siegbahn. A stand-alone four-index transformer had been developed by Vic Saunders, and the integral and MO-SCF packages which now constituted ATMOL had been interfaced with the CI and transformation packages. In many ways these were extremely satisfactory developments because anyone who had a suitable IBM machine (and Cambridge had acquired a 360/165 in 1974) could simply port the code and run it. Furthermore, any UK worker could apply to SRC for a grant of time on the RAL machine and, if successful, could run the program suite on the chosen problem, with the support of the Atlas lab group.
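
The role of the four-index transformer is worth making concrete. An SCF code produces two-electron repulsion integrals over atomic orbitals, while a CI code needs them over molecular orbitals; transforming all four indices at once costs on the order of N^8 operations, but doing one index at a time reduces this to four N^5 quarter-transformations. A schematic illustration in modern Python/NumPy (nothing like the tape-based FORTRAN of the period, of course):

```python
import numpy as np

def four_index_transform(eri_ao, C):
    """Transform two-electron integrals (pq|rs) from the AO basis to the
    MO basis, one index per pass: four N^5 steps instead of one N^8 step."""
    g = np.einsum('pqrs,pi->iqrs', eri_ao, C)   # first quarter-transformation
    g = np.einsum('iqrs,qj->ijrs', g, C)        # second
    g = np.einsum('ijrs,rk->ijks', g, C)        # third
    g = np.einsum('ijks,sl->ijkl', g, C)        # fourth
    return g

# Check against the naive all-at-once contraction on a tiny random case.
rng = np.random.default_rng(0)
n = 4
eri = rng.standard_normal((n, n, n, n))        # stand-in AO integrals
C = rng.standard_normal((n, n))                # stand-in MO coefficients
stepwise = four_index_transform(eri, C)
naive = np.einsum('pqrs,pi,qj,rk,sl->ijkl', eri, C, C, C, C)
```

The real difficulty of the day was not the arithmetic but that the intermediate arrays had to be sorted and streamed from magnetic tape or disc, which is why the transformer merited a stand-alone program.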

However, the codes were pretty machine-dependent and certainly could not be used as "black boxes." In these respects the codes were no better (and no worse) than any others available in the United States and elsewhere. Computational chemistry codes became portable in a routine way only in the 1980s, when their operation also became transparent to users, with the widespread use of free-format input, made interactive through interfaces with suitable graphics. In fact, free-format input was actually a feature of the ATMOL suite, and so it was rather ahead of its time.

Given the era, however, it is not surprising that quite a lot of the discussion at the 1975 meeting was devoted to porting the codes to other machines. Of course, the need for porting was not a new need. Martyn Guest had actually ported the ATMOL code to the CDC 6600 at the Manchester Regional Centre (UMRCC) even before he joined Atlas lab. But porting was thought particularly urgent as the Regional Computer Centres, envisaged by Flowers, were now up and running with a new generation of machines, and it was agreed that the porting should be done by the permanent staff at Atlas who were involved in the project. There was also much discussion on what should be tackled next. The Chairman put forward the view that the Project should be closed down early in the New Year. But other members of the working group had their own agendas for continuation. No agreement was reached on these agendas and a whole-day meeting, a month or so hence, was proposed, at which advocates of particular developments could concentrate on persuading the working group.

Two other aspects of the meeting bear noting. The first was the realisation that a powerful interest had not been given a voice on the working group and that it was therefore important to incorporate it before too long, so Ian Hillier of Manchester was invited to join. The next was a discussion of relations with, and possible support for, the Centre Européen de Calcul Atomique et Moléculaire (CECAM), an institute in Paris run by Carl Moser. In fact the UK had joined what was then called the Common Market (whose successor is the European Union (EU)) on New Year's Day 1973, but there had also been a referendum in 1975 on whether the UK should stay in on the existing terms. Since the referendum had gone in favour of staying in, the idea of Europe-wide science was being seriously considered, even though historically SRC had not favoured such an idea. This was chiefly because joint ventures meant budget commitments related to a basket of foreign currencies, whose exchange rates were beyond SRC's control. (SRC had been very badly burned over the UK contributions to CERN in the early 1970s because of currency fluctuations.) Subsequently Dr Hillier agreed to join the working group (and indeed was to succeed John Murrell as Chairman in 1980), but SRC declined to support CECAM, a position not changed until 1979.

The whole-day meeting took place on 4 December 1975, at which Joe Gerratt spoke for Valence Bond techniques, Dr B. T. (Barry) Pickup of the University of Sheffield spoke for Green's Function methods, Mike Robb and Vic Saunders spoke for Coupled Pair Many-Electron Theories (CPMET), and Nicholas Handy spoke for Direct (that is, integral-driven) CI methods. The upshot was that Joe Gerratt was invited to inaugurate a pilot study on the viability of the VB method and that, in collaboration with Dr Robb, a large-scale CPMET project should be undertaken. These proposals were considered by the Steering Panel, who recommended them to the Science Board, where they were accepted at the spring 1976 meeting. Thus Project 1 was continued for another 3 years.

But large changes were in the offing. Proposals had been made to reorganise the SRC permanent laboratories, and these were about to be acted on. Their consequence for this story is the shifting of the computational chemistry focus from Atlas to Daresbury Laboratory, as mentioned above. But before going on to consider the developments after the change of location, it is useful to back-track a little and consider the chemical database work done at Atlas lab.

The Chemical Database Developments

The efficient and effective inversion and manipulation of chemical databases formed one of the most difficult and intellectually challenging problems that faced computational chemists in the 1970s, not least because of the essentially pictorial nature of chemical structure information and the need to specify the connectivity of significant fragments. The first work in this area in the UK was undertaken at Atlas lab by Mike Elder and Pella Machin, in collaboration with O. S. Mills of the University of Manchester, using the Cambridge Structural Database (CSD) developed at the Cambridge Crystallographic Data Centre (CCDC) under the direction of Olga Kennard. An account of the state of crystallographic databases generally in the mid-1980s can be found in the IUCr monograph. The work that begins here develops in the 1980s into a Chemical Database Service, provided through the SRC and its successor, the SERC, for the UK chemical community, making available databases of NMR, IR, and mass spectra as well as reaction-oriented databases.

In 1974, when this work was begun, the CSD consisted of about 25,000 entries, each entry having a bibliographic element, a connectivity element, and a data element. The data element, giving atom positions and so on, was (and is) by far the largest element in an entry. The UK workers used the CSD files as inverted by a program developed at the National Institutes of Health (NIH) in Maryland, and thus began by using as the information retrieval system the Crystal Structure Search and Retrieval (CSSR) program, which also originated there. Both these programs owed their origins to the work of Richard Feldman at NIH. Essentially, the NIH file inversion program was incorporated into the UK suite and then updated independently of any NIH developments, as and when the CCDC conventions for file specification were changed and updated. The CSSR program was also taken over, and it developed into a program to handle not only CSD data but also that from other chemical databases. Thus Feldman's ideas and philosophy much influenced the development of the UK codes.
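
The essence of the file inversion such a program performs can be sketched simply. An inverted file turns "entry lists its fragments" records into a "fragment lists its entries" index, so that a connectivity query becomes a set intersection rather than a scan of every entry. A toy illustration in modern Python, with invented entry identifiers and fragment keys (the real CSD derives such keys from its connectivity tables):

```python
from collections import defaultdict

def build_inverted_file(entries):
    """Invert entry -> fragments records into a fragment -> entries index."""
    index = defaultdict(set)
    for entry_id, fragments in entries.items():
        for frag in fragments:
            index[frag].add(entry_id)
    return index

def search(index, required):
    """Return the entries containing ALL the required fragment keys."""
    hit_sets = [index.get(frag, set()) for frag in required]
    return set.intersection(*hit_sets) if hit_sets else set()

# Hypothetical records: three entries, each tagged with fragment keys.
entries = {
    "AAA001": {"phenyl", "carboxyl"},
    "BBB002": {"phenyl", "amide"},
    "CCC003": {"carboxyl", "amide"},
}
index = build_inverted_file(entries)
hits = search(index, {"phenyl", "carboxyl"})   # only AAA001 carries both keys
```

The index must of course be rebuilt (or patched) whenever the file conventions change, which is exactly the maintenance burden described above.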

By 1977 when Drs. Elder and Machin had also moved to Daresbury, a set of extremely effective programs had been developed, whose utility to the community was increased by the early developments in networking that were taking place, albeit limited by modern standards. It was also clear that such work was of great commercial interest and that there was going to be commercial competition. Thus database developments were perhaps 6 or 7 years in advance of the rest of computational chemistry in having to face up to the dilemma of operating an enterprise, set up initially only with the public good in mind but readily developed as a source of revenue and perhaps in competition with a commercially provided alternative. This theme is one to be returned to later.