1969 Press Releases

1969

Man who Counted Taxis

Electronics Weekly: 08.01.69

The award of a CBE to Jack Howlett, director of the Atlas Computing Laboratory at Chilton, is a splendid opportunity to examine the phenomenon of personal spin-off.

More of this characteristic has been generated by Jack Howlett than by most of the grizzled eminences in the computer field, and in himself he represents yet another example of the influence of Prof Hartree on the UK computer scene.

Jack is also a reminder of the strange things that happened to numerical men in the pre-computer era.

In the days when Howlett worked in the research department of the LMS railway (Hartree was a railway enthusiast) it is said that he once carried out an operations research assignment on the facilities at Euston Station. This involved him in sitting outside the station for long stretches and counting the taxis as they arrived and left.

He also uncovered the dark secret that the travelling public's immediate destination on leaving a train was, in the majority of instances, the publicly provided plumbing nodes.

In 1948, Howlett and Harwell started an association which has continued on and off to this day, although the current connection between Chilton and atomic physics is mostly one of geographical disposition.

But the Mercury computer brought him into contact with the most lively set of European computer users at the time and in 1959 it was fitting that the first destination chosen for the ambitious Atlas computer project was the Chilton site, following the Manchester University prototype machine.

The services rendered by this centre to university research workers have led to Chilton being associated with many advanced ideas in applications programming, and the running in of the ambitious software for the large-scale computers in the UK that aim to combine batch processing with remote terminal facilities.

ICL struggle to smooth out 1900 flaws

Electronics Weekly: 21.05.69

ICL's race against time to erase the impression made by the Bryant-Baylis report for the Science Research Council on comparative computer performances in Fortran applications is rousing many sectors of the UK's scientific community into a collaborative response.

The issue has suddenly become a live one because of the Continental backwash felt by ICL in its endeavour to present the modified large 1900 series machines, the 1906A and 1908A, as powerful number crunchers.

The European view in some influential quarters is that the Bryant-Baylis report damns the architecture of the 1900 series sufficiently to rule out the investment of money needed to improve the performance of the machine in the scientific computing field.

In the ten months since the report came out covering the Atlas, CDC 6600, PDP-10, Univac 1108, IBM 360/75, System 4-50 and the 1907 in typical Fortran program standards, a number of pressures have built up to remedy the poor showing of the 1900 in Fortran work.

Little option

The UK's government-funded computer installations have been given little option but to judge the computers they use in the light of the total cost/performance ratios exposed in the Bryant-Baylis report. The report was equally critical of the IBM 360 architecture for Fortran work but judged the H02 compiler issued for the 360 as a very considerable improvement over the H00 version.

The System 4-50 compiler was rated as producing better code than the H00 360 compiler, but was not thought to be as good as the H02 version.

The future of the 1900 series cannot be guaranteed entirely by the design developments envisaged for the 1908A, as this is too far in the future to encourage firm plans for scientific establishments. ICL have introduced a new Fortran compiler since the report was issued which improves on a number of characteristics.

The use of subscripts within the code has, for instance, been amended. Some users regard these changes as important, while other establishments dismiss them as trivial.

The vital section of the report, written by P. Bryant and M. H. J. Baylis, said: "The current compiler, even recognising the machine's limitations, is not a good piece of work, but it would not be easy to improve it to any great extent without considerable effort."

The effort to change the whole premise on which the Bryant-Baylis report is being used against ICL in the European scientific market is coming from a number of sources. The use of the expanded facilities embodied in the 1906A is being closely studied in a number of centres, including the Atlas Computing Laboratory, for which the report was originally written.

The work of Professor Barron at Southampton University in fundamentally restructuring the basis of current compiling techniques on the 1900 is attracting support from other government 1900 installations and in one instance personnel may be seconded to work with Prof Barron's team.

The Atlas Lab itself is producing special purpose software to handle peripherals of advanced design, and is also concerned with improving the practical compatibility for Fortran across a variety of machines.

In the recent past the tendency to insist on UK computing hardware for UK government-dependent users has hardened. It is noticeable however, that the Concorde project for airframe testing by RAE is using a PDP-10 computer.

The 1908A design has the advantage that a combination of new compiler techniques and the extra registers it possesses may make the 360 style products seem the stumbling block to progress by the time it finally emerges.

The future of social research

Computer Weekly: 26.06.69

Social scientists and statisticians who do not fully understand computer disciplines may hold back the future development of computer applications in the social sciences.

This warning comes from Mr J. E. Hailstone, head of the operations group at the Science Research Council's Atlas Computer Laboratory, in a recent Social Science Research Council newsletter.

Mr Hailstone said that there were two schools of thought on the future of social research. One believed that present methods and facilities, dictated as they were by the needs of the natural sciences, might impose too rigid an approach to social research.

The other held that there would soon be an explosive demand for computing in the social sciences which would revolutionise those areas of study in the same way that a computerised mathematical model of the atmosphere has revolutionised meteorology.

"Only ten per cent of current projects in the Atlas Laboratory involve social science research, and the demand for computer time showed an even smaller share of the total at seven per cent," he said.

Mr Hailstone added that computing time at university computing centres was normally free, as it was at the Atlas Laboratory, so the low usage was not a function of the amount of money available.

He said that even if computer time was readily available those engaged on research might be ill-equipped to use it.

There were three main dangers, he added: first, the relatively low general level of mathematical and statistical appreciation of a problem; second, the common difficulty of all computer users in formulating a good method of solution; third, an unwillingness to understand the rigours of the scientific method.

The computer is frequently called upon to do massive arithmetic, he said, but the user who has little or no idea of what is to be done with the results is merely transforming his data from one form to another equally, if not more, complex than the original.

He said that many courses in statistics did not prepare students to think of how to design and analyse their experiments, but presented them with a cook-book of techniques and only a vague idea of how to use them.

"Social scientists and even physical scientists are not yet adequately versed in numerical methods," Mr Hailstone claimed, and he said that the computer was not a substitute for thinking.

He said that a solution could be found in better education and awareness of the many pitfalls in handling numerical information.

A footnote to Mr Hailstone's argument came in a letter to the Times from Dr Imre Lakatos, reader in logic at LSE, who linked the gap between the promise of the social sciences and the actual achievements with student unrest.

Dr Lakatos said that students had sensed the predominance of empiricism, and added that the prevailing gap between the messianic (relevant) promise and the modest (irrelevant) achievement in the social sciences provoked rather than calmed student revolutionaries.

Super-computer rivals are given a battering

Electronics Weekly: 27.08.69

THE SUPER-COMPUTER ambitions of the European-controlled computer makers received a battering from last week's announcement by IBM of the 360/195 to be delivered in the first quarter of 1971. The delivery date is perilously near to hand for even the established house in the field - Control Data Corporation with its 7600 and 6700.

Only about a dozen of the 7600 machines can be handed over before the date quoted by IBM, and CDC will have to make up its mind very smartly as to whether the 195 is a paper tiger or not.

The ICL largest machine on offer currently, the 1908A, is not due to be delivered before 1972/3, although a number of rival approaches to amassing computer power through networks of machines are brewing around Europe.

PRICE INDICATES POWER

The power of the 195 can be indicated by quoting its price, in the region of £3 million to £5 million for typical "buys," and the IBM opinion that it is roughly twice the power of a model 85.

The 195 is not an upgraded version of model 85 technology, however. The 195 represents a major commitment by IBM to the monolithic integrated circuit. The buffer memory of the model 195, which can run up to 32,000 bytes, is an IC memory and may not suffer from the development problems which have been rumoured to have stuck to the 85.

The cycle times of the 195 processor units and the buffer memory have been geared to 54 nanoseconds. The raw speed of the CDC 7600 is higher than this and the 7600 still uses discrete components.

The applications listed as target areas for marketing the model 195 are global weather forecasting, big airline reservations projects and time-sharing networks of an outsize sort.

The faintly curious aspects of the IBM announcement are the coincidental facts that the model is currently reserved to the US and Canadian markets, yet the tenders for the UK Meteorological Office project have just surfaced.

The inference is that IBM is prepared to give up the Met job, but has declared war on CDC for all subsequent bids.

For European hopes in the super-computer market, however, the prospects are dimmed considerably by the move from IBM. The role of CDC in Europe has been ambiguous, and indeed they supply so many of the peripherals to this market that they are looked on more as a friend than a foe.

But the role of IBM is thoroughly unambiguous and many 360/65 installations, such as in the Rutherford Laboratory, will almost certainly move to the 85 and now - to the 195. The pricing of the model 195 is interesting in that it exactly parallels the 7600 price brackets. The strength of CDC in the market is based on the wide range of models now available from them with particular attributes in special sectors of the large machine market.

Separate pricing from IBM will be a real market advantage in dealing with the special characteristics of super-computer buyers, and the three firms with a stake in this market are all closely aligned in this respect.

The last of the three is Burroughs, who have still to pull the fat of the 8500 series out of the fire. There are a number of indications that the 8500 is beginning to turn the corner and to emerge as a real alternative.

On the other hand, if Burroughs don't make it quick, they will lose the chance altogether, and as for ICL . . .

The main store of the 195 has been meshed into the buffer store concept by memory accesses that grab 64 bytes at a time and stream them into the buffer store. The cycle time of the main store is lazily ticking over at 0.756 microseconds.
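The claim that the main store is "lazily ticking over" can be checked with a little arithmetic: one 64-byte transfer per 0.756-microsecond cycle works out to roughly 85 MB/s of streaming bandwidth. A minimal sketch, using only the figures quoted in the article (how the transfers were interleaved in practice is not stated here and is left out):

```python
# Back-of-the-envelope throughput of the 360/195 main store, assuming
# one 64-byte grab per 0.756-microsecond main-store cycle (both figures
# are from the article; interleaving details are not modelled).
BYTES_PER_ACCESS = 64
CYCLE_TIME_S = 0.756e-6

bytes_per_second = BYTES_PER_ACCESS / CYCLE_TIME_S
print(f"{bytes_per_second / 1e6:.1f} MB/s")  # roughly 84.7 MB/s
```

Slow per cycle, but the 64-byte-wide grabs keep the fast IC buffer store fed despite the leisurely main-store cycle time.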

The weakness of the structure rests with the program changes which will be needed from users if they are to take advantage of the parallelism of the 195. The mystic phrase used by IBM is that most 360 programs will run on the 195. The reluctance of IBM in the large machine market appears to have been finally overcome.

Number theory: Now computers find new results

Computer Weekly: 18.09.69

R. F. Churchhouse

Bob Churchhouse (Centre) at the Conference
© UKRI Science and Technology Facilities Council

OVER a hundred leading mathematicians gathered in Oxford to hear some of the more striking developments in the application of computers to outstanding problems in number theory. With group theory (the topic of the first Atlas Symposium in 1967), number theory is one of two areas of pure mathematics in which computers are helping to uncover new results.

Both this year's and the 1967 symposia were sponsored by the Science Research Council and ICL, and organised by the Atlas Computer Laboratory.

Of the 12 countries represented, the largest contingents were from the US (44), Britain (33) and France (12).

The interest aroused among number theorists by computers is strikingly illustrated by the illustrious names among the participants: Mordell, Cassels, and Swinnerton-Dyer from this country, Lehmer, Marshall Hall, John Todd and Olga Taussky from the USA, Erdos of Hungary, Serre from France and many more.

The symposium programme was divided into 17 sessions of approximately two hours each. In each session three or four papers were presented. Most of the sessions were devoted to papers relating to some particular class of problems, although there were a few sessions on miscellaneous topics. Thus there were sessions on algebraic number theory, diophantine equations, partitions, the Zeta function, modular forms, primality and others.

Impact

From the computational point of view, one of the most interesting developments brought out at the symposium was the realisation that a combination of Baker's Theorem and multi-length arithmetic can be used to settle questions of solvability for some types of diophantine equations.

Baker's Theorem, which was first proved about five years ago, is of fundamental importance because it provides for the first time an upper bound for the magnitude of the solutions of some diophantine equations. The bound is, in general, very large: for example, if x, y, N are integers satisfying

x² - 2y² = N

then Baker's Theorem tells us that x, y cannot exceed (300000N)²³.

Large though this number is, it is quite possible to use it to find all possible solutions of the equation by a combination of number theoretic methods, which cut down the number of possibilities enormously, and some multi-length arithmetic. The full impact of this combination is yet to be seen.
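The point can be illustrated with a toy brute-force solver for the equation above (a hypothetical sketch, not the methods presented at the symposium; in earnest work, congruence conditions and multi-length arithmetic would prune the search up to the Baker bound):

```python
from math import isqrt

def solve_pell_like(N, y_max):
    """Find non-negative integer solutions (x, y) of x^2 - 2*y^2 = N
    by direct search over y up to y_max.

    Baker's Theorem guarantees all solutions lie below an explicit
    (astronomically large) bound; here we simply search a small range.
    """
    solutions = []
    for y in range(y_max + 1):
        x_squared = N + 2 * y * y
        if x_squared < 0:
            continue
        x = isqrt(x_squared)        # exact integer square root
        if x * x == x_squared:      # perfect square => a solution
            solutions.append((x, y))
    return solutions
```

For N = 1 and y up to 100 this finds (1, 0), (3, 2), (17, 12) and so on: since 3² - 2·2² = 9 - 8 = 1, each pair does satisfy the equation.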

Swinnerton-Dyer gave an account of some very interesting work he has been doing on calculating the successive minima of the product of three linear forms. This is a classical problem in the geometry of numbers. The first two minima were obtained by Davenport about 30 years ago. Swinnerton-Dyer has now found the first 16 minima and it seems as if they are not approaching a finite limit (unlike the case of the product of two forms). This is an area where computers could be of considerable value but the problems are very difficult, calling for a deep understanding of the theory and very skillful programming.

Several papers were devoted to the Riemann Zeta function. Perhaps surprisingly, no one reported any more zeros on the critical line, and the record still stands at 3,500,000. Bateman talked about his search for linear relations between the zeros; none were found.

The proceedings, edited by Atkin and Birch, are being published by Academic Press.

The next Atlas Laboratory Symposium will be held in Oxford in August, 1970 on Numerical problems in radiative transport theory.

© Chilton Computing and UKRI Science and Technology Facilities Council webmaster@chilton-computing.org.uk
Our thanks to UKRI Science and Technology Facilities Council for hosting this site