
Engineering Computing Newsletter: Issue 24, July 1990

Editorial

This month there is a wide range of articles, from items of general information to more detailed technical pieces. I welcome news of Community Club activities, now gaining momentum under the EASE Programme; we plan to have regular coverage of such activities. Community activities benefit everyone in one way or another, and for this reason contributions from anyone on any subject are encouraged.

If you have not seen this Newsletter before and wish to get on the mailing list, please send your name and address, preferably by e-mail, to me.

Sheila Davidson

MMED Community Club

Community Club for Modelling and Management of Engineering Data

The first meeting of the MMED Community Club was held at UMIST on Monday 4 June and was attended by about 25 people. The first part of the meeting consisted of a number of presentations; the second part was a discussion of the work programme of the club. Below is a short summary of the meeting.

Tony McClelland (University of Strathclyde) presented the work on the Neutrabas project. The aim of this project is to build a data model for information relating to ship building, in particular steel structures. This data model is then implemented in a system for storing and manipulating this data, and the architecture of this system was explained.

Simon Corne (CADDETC, Leeds) talked about the exchange of CAD data, in particular about the IGES standard. IGES is probably the standard most widely used, but nevertheless has a number of shortcomings. Some of these are the inefficiency of the IGES exchange file, the ambiguous definitions of some entities, and the lack of explicit conformance requirements. It is hoped that the emerging STEP standard will overcome most of these problems.

Jan Maciejowski (University of Cambridge) explained his ideas for a project which would be suitable for the club to undertake. The project would be the development of an information model in the area of control engineering. Reasons for undertaking this activity are that the project would act as a pilot project from which experience could be gained which would be of wide interest, there is a clear and well defined need for such a model, and control engineering has strong interrelationships with other engineering disciplines.

Chris Osland (Rutherford Appleton Laboratory) outlined the techniques used in the standard describing the CGM (Computer Graphics Metafile). The standard consists of one part which gives the basic definitions, and three other parts giving three different encodings. The different encodings are developed with a view to making the file either easy to read for humans, easy to read for computers, or to produce a file which is as short as possible.

The second part of the meeting was devoted to a discussion of the work programme for the Club. Interest was expressed in:

Finally, at the end of the meeting of the Club the important decision was taken to set up a core group of members which will expand the ideas listed above into a work programme which can be formally submitted.

Jan Van Maanen, RAL

The Catalogue of Artificial Intelligence Techniques

The third edition of the Catalogue of Artificial Intelligence Techniques is now available. The catalogue provides a quick guide to the most important computing techniques developed in the field of Artificial Intelligence (AI). It consists of a brief description of each technique together with a reference to enable interested readers to get more information. It is intended both for AI researchers exploring beyond their area of immediate interest and for researchers from other fields who suspect that AI might have some solutions to a problem in their own field.

The original version of the catalogue was built in 1983 as part of the UK SERC-DTI, IKBS Architecture Study. It was adopted by the UK Alvey Programme and, during the life of the programme, was both circulated to Alvey grant holders in hard copy form and maintained as an on-line document. A version designed for the international community was published as a paperback by Springer-Verlag. Now that the Alvey Programme has finished, it is being made available to engineering and IT researchers through the EASE programme. During this time the catalogue has undergone constant revision and refinement. We have recently completed the third edition, and hope that it has now matured into a comprehensive reference book of techniques in AI.

By AI techniques we mean algorithms, data (knowledge) formalisms, architectures and methodological techniques, which can be described in a precise, clean way. The catalogue entries are intended to be non-technical and brief, but with a literature reference. The reference may not be the 'classic' one. It will often be to a textbook or survey article. The border between AI and non-AI techniques is fuzzy. Since the catalogue is to promote interaction, some techniques are included because they are vital parts of many AI programs, even though they did not originate in AI.

We have not included in the catalogue separate entries for each slight variation of a technique, nor have we included descriptions of AI programs tied to a particular application, nor of descriptions of work in progress. The catalogue is not intended to be a dictionary of AI terminology, nor to include definitions of AI problems, nor to include descriptions of paradigm examples. We have deliberately limited it to major AI techniques in order to keep it short and coherent.

By circulating the catalogue among a new audience we hope to make AI techniques more widely known and to promote their use. A new field, like AI, can often hold out a tantalising promise to outsiders, but simultaneously baffle them with new jargon, especially when the jargon is not even used consistently! By boiling the bewildering array of new AI techniques down into a slim, concise volume, we hope to assist new users overcome the jargon barrier and make use of the new tools.

Alan Bundy, University of Edinburgh

EMR Contract Award

Invitations to bid for EMR contracts have appeared in the Newsletter over recent months.

The Computing Facilities Committee has now decided that the following contracts should be awarded:

Geoff Lambert, RAL

The Mathematica Computer Algebra Package Pages

Introduction

Mathematica, described by its authors as A System for Doing Mathematics by Computer, is essentially a computer algebra package with graphics capabilities. It is a C language program of approximately 150,000 lines which, at the Rutherford Appleton Laboratory, has been installed on artois, the STELLAR GS2000 Graphics Supercomputer.

Anyone wishing to use Mathematica should refer to the Mathematica book (Wolfram, S. Mathematica: A System for Doing Mathematics by Computer, Addison-Wesley, 1988), and be familiar with Chapter 0 and at least the first few sections of Chapter 1. In this article I shall attempt to give a very brief overview of the three types of calculation performed by Mathematica (numerical, symbolic and graphical) and an idea of the facilities available.

Numerical Calculations

Mathematica performs numerical calculations, and returns results accurate, by default, to six significant figures. The user may, however, specify any degree of precision. Mathematica will return, where possible, an exact result to a numerical calculation; for example, 2^100 is given exactly as a 31-digit integer. Mathematica can also return a rational result to a calculation; typing

1/3 - 1/4

returns the value 1/12.
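For comparison, the same exact-arithmetic behaviour can be mimicked with Python's standard fractions module; a minimal sketch (Python, not Mathematica syntax):

```python
from fractions import Fraction

# Exact rational arithmetic: 1/3 - 1/4 is kept as a fraction, not a float
result = Fraction(1, 3) - Fraction(1, 4)
print(result)            # 1/12

# Integer arithmetic is exact too: 2**100 has 31 decimal digits
digits = len(str(2 ** 100))
print(digits)            # 31
```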

In addition to the elementary transcendental functions (exponentials, logarithms, square roots, and the (inverse) trigonometric and hyperbolic functions), Mathematica contains many special functions and orthogonal polynomials. Examples are the Euler gamma function and the Legendre polynomials. Mathematica also performs numerical integration; typing

NIntegrate[ Sin[Sin[x]], {x, 0, Pi} ]

gives 1.78649, the numerical value of the integral from 0 to π of sin(sin x) dx.
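For readers without Mathematica, the same integral can be checked with a composite Simpson's rule; a self-contained Python sketch (the function name is mine, not a Mathematica facility):

```python
import math

def simpson(f, a, b, n=1000):
    """Composite Simpson's rule over n (even) subintervals of [a, b]."""
    if n % 2:
        n += 1
    h = (b - a) / n
    total = f(a) + f(b)
    for i in range(1, n):
        total += f(a + i * h) * (4 if i % 2 else 2)
    return total * h / 3

value = simpson(lambda x: math.sin(math.sin(x)), 0.0, math.pi)
print(value)  # approximately 1.78649
```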

As well as real arithmetic, Mathematica can deal with complex numbers and complex arithmetic. The built-in constant I stands for the imaginary number, square root of -1.

Symbolic Calculations

Algebraic Formulae

One of the most important things about Mathematica is that it can do symbolic as well as numerical calculations, and can therefore work with algebraic formulae as well as with numbers. Some of its features include:

Mathematica allows the user to define variables and functions, eg, x = 5, or g(x) = sin 2π. It also allows the recursive definition of functions. Variable and function definitions can be saved in files for later use.

Solving Algebraic Equations

Mathematica solves algebraic equations in one variable, any set of simultaneous linear equations, and also a large class of simultaneous polynomial equations. Mathematica always tries to give explicit closed-form solutions to equations. For example, it will give the solutions

x = (-16 ± 4 * sqrt(15))/2

to the quadratic equation

x^2 + 16x + 4 = 0

Mathematica also gives the numerical approximation to the solution:

x = -0.254033, -15.746

to the above equation.
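The numerical approximation quoted above is just the quadratic formula evaluated in floating point; a short Python sketch for checking it by hand (the helper name is mine):

```python
import math

def solve_quadratic(a, b, c):
    """Real roots of a*x^2 + b*x + c = 0 via the quadratic formula."""
    disc = math.sqrt(b * b - 4 * a * c)
    return ((-b + disc) / (2 * a), (-b - disc) / (2 * a))

roots = solve_quadratic(1, 16, 4)
print(roots)  # approximately (-0.254033, -15.745967)
```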

Mathematical Operations

There are a number of mathematical operations performed by Mathematica. Examples are:

Lists

Another important feature of Mathematica is the idea of a list, eg

{1, 2, 3} or {a, b, c}

One common use of a list is to represent a vector. A matrix is represented as a list of lists, eg

{{1,2,3}, {4,5,6}}

Many matrix and vector operations can then be performed, eg inversion, multiplication, and the calculation of determinants and eigenvalues.

Another use of a list is to represent a set. Using lists as sets, Mathematica can perform the standard set-theoretic operations: union, intersection, and complement.
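These list-based representations carry over directly to Python; a brief sketch of vectors, matrices and set operations for comparison (again Python, not Mathematica syntax):

```python
# Vectors and matrices as (nested) lists, in the spirit of Mathematica's {...}
v = [1, 2, 3]
m = [[1, 2], [3, 4]]

dot = sum(x * y for x, y in zip(v, v))       # dot product: 1 + 4 + 9 = 14
det = m[0][0] * m[1][1] - m[0][1] * m[1][0]  # 2x2 determinant: 4 - 6 = -2

# Lists as sets: union, intersection and complement (set difference)
a, b = {1, 2, 3}, {2, 3, 4}
print(dot, det, a | b, a & b, a - b)
```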

Mathematica Packages

Mathematica contains a number of files or packages with definitions of Fourier transforms, Laplace transforms, etc. Typing

<<Laplace.m 

reads the package Laplace.m on Laplace transforms. On artois the packages are in the directory /usr/math/Packages; this directory is automatically added to the user's search path when Mathematica is run.

Another nice feature of Mathematica is that it can write out expressions in C or FORTRAN form for direct inclusion in a program, or in TEX form for inclusion in a document.

Within Mathematica, there is an on-line help facility. Typing ?Name shows information on Name, whilst typing ??Name shows more detailed information.

Graphics

Mathematica has facilities for producing both two- and three-dimensional graphics. Two Mathematica graphics functions are now described, together with examples of the options available to them. The function Plot generates a two-dimensional plot of a function of one variable, eg

f(x) = sin x, 0≤x≤2π

The curve is generated by sampling the function at a limited number of points. Mathematica always tries to plot functions as smooth curves, and consequently samples more points in regions where functions vary more rapidly. The user may specify the minimum number of points, and the maximum factor by which to subdivide the domain in sampling the function, using the respective options PlotPoints and PlotDivision. The function Plot3D generates a three-dimensional plot of a function of two variables, eg

f(x, y) = sin x sin y, 0≤x≤2π, 0≤y≤2π

The surface is generated by sampling the function at a specified number of points in each direction; the default number is 15, but the user may specify the number using the option PlotPoints. The pictures generated by Plot3D can be thought of as simulated photographs, with the camera positioned, by default, slightly above and away from the surface. The user may, however, specify the position of the camera using the option ViewPoint.
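The adaptive sampling idea behind Plot can be sketched as recursive subdivision: wherever the function strays from the straight line joining two sample points, sample more finely between them. This is my own illustration of the general technique, not Mathematica's actual algorithm:

```python
import math

def adaptive_sample(f, a, b, tol=0.01, depth=8):
    """Sample points for f on [a, b], subdividing where f bends away from a chord."""
    # Probe at two interior points against the chord through (a, f(a)) and (b, f(b))
    chord = lambda x: f(a) + (f(b) - f(a)) * (x - a) / (b - a)
    m1, m2 = a + (b - a) / 3, a + 2 * (b - a) / 3
    if depth == 0 or (abs(f(m1) - chord(m1)) < tol and abs(f(m2) - chord(m2)) < tol):
        return [a, b]
    mid = (a + b) / 2
    left = adaptive_sample(f, a, mid, tol, depth - 1)
    right = adaptive_sample(f, mid, b, tol, depth - 1)
    return left[:-1] + right   # drop the duplicated midpoint

points = adaptive_sample(math.sin, 0.0, 2 * math.pi)
# Points cluster wherever the chord approximation of sin x is poor
```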

The surfaces generated by Plot3D are shaded, by default, using a scale GrayLevel, according to height: the higher the point on the surface, the lighter the shading. Many other options are available however. Reading in the Mathematica package Colors.m allows the user to shade the surface using the function HSBColor (hue, saturation, brightness), with the hue or colour varying according to height. The value of the hue varies along the visible spectrum from 0 (red) to 1 (violet).
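The hue-by-height idea can be imitated with Python's standard colorsys module; a rough sketch (the 0.8 cut-off approximating violet is my choice, since HSV hue wraps back to red at 1.0):

```python
import colorsys

def height_to_rgb(z, z_min, z_max):
    """Map a surface height to an RGB colour, hue running red (low) to violet (high)."""
    hue = 0.8 * (z - z_min) / (z_max - z_min)   # 0.0 = red, 0.8 ~ violet
    return colorsys.hsv_to_rgb(hue, 1.0, 1.0)

print(height_to_rgb(0.0, 0.0, 1.0))  # (1.0, 0.0, 0.0): pure red at the lowest point
```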

All Mathematica graphics output is generated as PostScript, which can be displayed in a window on the screen, saved in a file for later use, or printed on a laser printer.

This article is a shortened version of Rutherford Appleton Laboratory Mathematical Software Group Note MSGN/10/90.

George Goodsell, RAL

TEX Archive

Introduction

UMIST provide a small archive of software associated with the TEX text formatting system for SERC grant holders. The software allows the user to format a document, preview it on a workstation using SunView or X-windows and print it on a PostScript printer. Additional facilities for extraction of references from bibliography databases, spelling checking and inclusion of pretty printed source listings are also provided.

Archive Contents

With the advent of TEX version 3.0 the opportunity has been taken to reorganise the archive. The current contents are as follows:

Web

This is a complete copy of the Web system conceived by Donald Knuth. This is used for building:

Fonts

Four sets of fonts are provided in this section:

Macros

These include the style files for LATEX, bibliography style files for BIBTEX and input files required for building formats for TEX, LATEX and SLITEX.

Dvips

This is used to convert from device independent format to PostScript. Numerous options are offered, including the ability to include encapsulated PostScript within the converted output file. The program requires the laserwriter fonts.

SeeTEX

This consists of a pair of previewers, one for SunView and one for X-windows. The latter is the more developed, allowing printing of all or selected pages within a previewed document and much customisation using the X-windows resource manager. The SunView previewer requires access to the laserwriter fonts, the X-window previewer access to the X-window fonts.

Additional Tools

Three additional tools are provided.

Tgrind

This is equivalent to the UNIX command vgrind. It is used to pretty print shell scripts and program source code. It can cope with both Bourne and C-shell scripts and a variety of languages including C, C++, Fortran and Lisp.

Detex

This is equivalent to the UNIX command deroff. It removes TEX commands from a file.

Ispell

An interactive speller. It not only detects bad spelling but, where possible, offers replacements for the offending word.

The latter two tools can be used standalone; however, they work best in conjunction with the Gnu Emacs editor.

How to Get the Sources

There are two mechanisms for obtaining the sources, mail and FTP. The former is used for small items and the latter for large. The table shows the approximate sizes and whether the item can be fetched by mail. All the sizes are for the sources in compressed tar format.

Using Mail

There are a number of functions that can be carried out using mail, besides fetching sources. You should mail to:

info-server@uk.ac.umist.cns 

The information server sends out files in mail messages, in response to a request contained in a mail message that you have sent it. Requests are of the form:

request: subject 
topic: topic within that subject 
request: end 

As an example suppose you want to be mailed information about TEX in the subject catalogue. You would send a message of the form:

request: catalogue 
topic: tex 
request: end 

and the TEX information would be mailed back to you. The key words supported by the information server are:

request, topic and line-limit

These can be upper or lower case, or a mixture. They are separated from the remainder of the line by tabs, spaces or a colon; the colon is optional.
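A request body of this shape is easy to generate programmatically; a small Python helper sketch (the function name is mine, not part of the info-server):

```python
def info_request(subject, *topics, line_limit=None):
    """Build the body of an info-server mail request."""
    lines = []
    if line_limit is not None:
        lines.append("line-limit %d" % line_limit)
    lines.append("request: %s" % subject)
    lines.extend("topic: %s" % t for t in topics)
    lines.append("request: end")
    return "\n".join(lines)

print(info_request("catalogue", "tex"))
# request: catalogue
# topic: tex
# request: end
```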

Line-limit is for use by people who have mail systems that can only deal with small messages. Consider the following request:

line-limit 1000 
request: catalogue 
topic: dvips 
request: end 

This would mail out the dvips information in 1000-line chunks (not including message header information). The line limit must lie between 1000 and 200000000. The default is to send the file in one message. Everything after the final request: end is ignored.

Using FTP

In order to protect the security of the machine an account has been set up which allows anonymous access using Blue Book FTP. You should use a username of guest with the password set to your electronic mail address, for example colin@umist. If you are transferring compressed tar archives (the usual case) then these will need to be fetched in binary format. You should use a binary word size of 8 bits. The following command would be used to fetch Web, assuming you were using Sun coloured book software:

hhcp -b uk.ac.umist.cns:"<TEX>web.TAR.Z" web.TAR.Z

Alternatively, if you are using the Camnet software you should use the command:

cpt -b "<TEX>web.TAR.Z"@uk.ac.umist.cns web.TAR.Z

The names of the items in the archive are given in the table.

Item Size Mail FTP Name to Fetch
Web 1880 No <TEX>web.TAR.Z
Fonts 3020 No <TEX>fonts.TAR.Z
Macros 390 Yes <TEX>macros.TAR.Z
Dvips 120 Yes <TEX>dvips.TAR.Z
SeeTex 420 Yes <TEX>SeeTex.TAR.Z
Tgrind 28 Yes <TEX>tgrind.TAR.Z
Detex 28 Yes <TEX>detex.TAR.Z
Ispell 232 Yes <TEX>ispell.TAR.Z
Colin Walls, UMIST

Had Trouble Getting Through to JANET?

Hints for Getting Mail Through Various Gateways to and from JANET

A number of gateways on the JANET network make it possible to send electronic mail to a vast number of places around the world. In some cases this is simple, but in others it can be very messy. Each individual gateway produces information on how to use it; though what the email user really wants is the information collected together in one place, with details of how to get further, gateway specific information. The Hints for Getting Mail Through Various Gateways to and from JANET is an attempt to meet this need.

The document was originally edited by Tim Clark of Warwick University Computing Services; initially as a list of hints for Computing Services staff at the Midlands Universities. As subsequent issues reached wider audiences, many more contributions of hints were received. The latest version, issue 7, released on 23 March 1990, is the result of receiving hundreds of contributions.

The document is particularly aimed at staff responsible for providing electronic mail facilities, to enable them to configure their mail systems to automatically route a substantial amount of international mail, and to advise their mail users. Nevertheless it contains a substantial amount of information which will be of interest and use to the determined electronic mail user who is prepared to experiment in order to find a reliable route.

Readers are warned that the editor has not personally tested all of the suggestions, nor should the existence of a suggestion be construed as implying that it is correct, or even legal, to route mail as described.

The Joint Network Team has now taken over the editing of the hints and will be producing the next issue. Those wishing to be notified about new issues should send mail to mail-hints-request@uk.ac.rutherford in order to get added to the notification list.

Details of how to obtain the current issue are contained in Chapter 2 of the document which is reproduced below.

Getting a Copy of this Document

The document exists in a number of different formats:

mail-gateways.7.txt
A readable text file which can be displayed on dumb printers and VDUs.
mail-gateways.7.ms
The format the document is maintained in, suitable for use on systems with tbl and ditroff (or troff) with the ms macros.
mail-gateway.7.latex
LaTeX input. Suitable for use on systems with LaTeX and TeX.
mail-gateway.7.ps
PostScript. Suitable for sending direct to a PostScript printer.

Fetching Files Direct from Warwick University

To do this you need access to NIFTP and JANET. Do an NIFTP setting the FILENAME to the appropriate file chosen from the list above. Specify it in upper case or lower case, but do not use mixed case. Set the USERNAME to anon (or ANON), and do not specify a USERNAME PASSWORD. The site is UK.AC.WARWICK.CU. If this name is not known to your NIFTP (it was registered in the NIFTP context on 22 March 1990), UK.AC.WARWICK.SOL can be used until its expected departure in June 1990.

Getting the Files via Electronic Mail

Nottingham University Computer Science Department have kindly agreed to hold the document and make it available with their 'info-server'. Send electronic mail to info-server@uk.ac.nott.cs with the body of the message:

Request: sources 
Topic: filename chosen from list above 
Request: end 

The topic line can be repeated for each file you wish to collect eg:

Request: sources 
Topic: mail-gateways.7.txt 
Topic: mail-gateways.7.ps 
Request: end 

It will also supply a list of the files available in this category if sent the mail:

Request: catalogue 
Topic: mail-gateways 
Request: end 

For additional information, eg how to get the files in several chunks (if your mailer does not like receiving huge files), just send an empty message to it (or, if your mailer refuses to send an empty message, send something the info-server will not recognize, eg 'boo!'). If all else fails, mail me (see my 'signature' at the end), telling me what you want and what has gone wrong, and I will mail you a copy.

Obtaining a Paper Copy

If you really do want a paper copy, then write to me at the address at the end of this document.

Tim Clark, University of Warwick

AI Support for Engineers

At the beginning of 1990 AIAI won the DTI Award for collaboration with industry. The award was for our collaboration over a 3-year period with ICL and mostly related to our Study Programme training initiative. With the prize money from the DTI Award we will be re-equipping our practical training facility. Tenders have gone out to various hardware suppliers and over the summer months new X-based systems will be installed.

We have taken this opportunity to rationalise our courses and set in place our training strategy for the 1990s. The main difference you will see is the disappearance of product-specific courses and the introduction of more techniques and methods based courses.

We have developed a suite of Knowledge Based Systems courses that takes the student through the complete life cycle of knowledge engineering. The courses start at fundamental knowledge representation level, go through knowledge elicitation to the final knowledge engineering stage. The KBS suite of courses comprises:

Knowledge Based Systems Skills

Fundamentals of knowledge representation are explained and the programming paradigms practised. This is the entry level course in the KBS suite.

Knowledge Elicitation

Teaches the practical techniques of getting information from experts, whether this is by using the traditional interviewing techniques or the psychology-based techniques used in multi-dimensional scaling.

Knowledge Engineering

Students are taken through the KBS life cycle and taught the analytical skills necessary to design and develop a KBS. (This was our most popular course last session!)

Additionally, we are concentrating our efforts on enhancing the language based courses. Over the summer months effort will be put into developing a suite of AI Language courses. This suite already contains:

Common Lisp

During 1989/90 the 5-day Common Lisp course has been updated and its new style presentation has been well received by all participants. This course will continue in its current format for the next academic year.

Introduction to Prolog

This course is a popular course for beginners. It was specifically developed as an introductory course to Prolog to encourage non-programmers to make a start in this area.

Parlog

The 3-day course is for those system builders who have a requirement to either assess the applicability of a parallel language or to make use of parallelism in their applications. Not only does the course develop an understanding of logic programming in Parlog, but also demonstrates the use of Parlog as a rapid prototyping tool.

Fuller descriptions of the new courses are given below.

Knowledge Based Systems Skills

This 3-day course teaches the student how to represent knowledge in a KBS. The student is taken through the major formalisms of knowledge representation and inference. Practical sessions teach the student:

Exemplars of each formalism are discussed. At the end of the course the student will be able to:

Other features are covered in the course, including reasoning with uncertainty and providing explanations.

This course is designed as a practical introduction to KBS and as such is the entry level course for our suite of KBS courses.

Knowledge Elicitation

Knowledge elicitation is a difficult task made more difficult by a lack of understanding by the knowledge engineer of the range of elicitation techniques available, and the poor understanding of the psychology of expert cognition. This 2-day course aims to overcome the first of these two major difficulties.

The participants are introduced to the practical techniques of eliciting knowledge from experts. The teaching material covers the different techniques that have been developed and surveys the currently available knowledge elicitation tools. The student is taught:

This course was developed by Nigel Shadbolt and Mike Burton from the AI Group in the Department of Psychology, Nottingham University. Both Nigel and Mike are internationally renowned experts in this area.

Revised Course Schedule

Due to support from SERC via their EASE Programme some training places are available free of charge to academic engineers.

Ann Macintosh, AIAI

Forthcoming Events

Oxford/Berkeley Engineering Programme State of the Art Topics 2-13 July 1990, University of Oxford

Turbomachinery Aerodynamics An Advanced Short Course 18-14 July 1990, University of Cambridge

ICALP - 17th International Colloquium on Automata, Languages and Programming 16-20 July 1990, University of Warwick

Workshop on Visualisation in Computational Fluid Dynamics (CFD), Thursday 19 July 1990, Atlas Colloquium, RAL

Molecular Dynamics on Parallel Computers One-Day Seminar, 19 September 1990 at RAL

A key area in the application of parallel computers to molecular modelling is in the field of molecular dynamics simulation. Several groups have been working on the problem of bringing user-friendly molecular dynamics packages to the molecular modeller so as to make possible semi-interactive simulations.

The Molecular Modelling Transputer Applications Community Club (MMTACC) is holding a one-day seminar at which speakers from leading laboratories in the parallel computing field will update the community on the latest developments in this important area. Speakers will include: S L Fornili (Palermo), G S Pawley (Edinburgh), U C Klomp (Shell Thornton), A R C Raine (Cambridge) and P D Adams (Edinburgh).

The Molecular Dynamics seminar will be held at the Rutherford Appleton Laboratory on 19 September and will be preceded by a short open meeting of the MMTACC.

TCP/IP Protocols: A Practical Introduction 2-3 October, University of Manchester

The course is suitable for technical staff who intend to become involved in setting up TCP/IP-based computer networks such as Sun or HP workstation configurations, in maintaining and trouble-shooting these networks, or in writing protocol code. The course will consist of lecture sessions, interspersed with practical exercises which illustrate details of the different concepts.

Engineering Interactive Computing Facility, Part 2

Multi-User Mini Assessment 1976-78: Enter Prime and GEC

Two multi-user minis (MUMs), a Prime 400 and a GEC 4070, were purchased in late 1976 for the planned assessment following a rigorous tender exercise, which included an interactive benchmark involving teams of 6 real people running a script. This turned out to be a very revealing way of assessing the systems. An attempt was also made to form a subjective view of the machines, to judge the user interfaces. In fact a P300 was originally ordered from Prime but this was changed to a P400 - the first sign that Prime regarded SERC as a very important new customer. The Prime 400 was installed on 3 December 1976 and an internal user service started at the beginning of January 1977.

The configuration of the Prime 400, which cost £81,302 including VAT, was as follows:

Prime had already decided that their machines would be limited to 8 memory slots, arguing that chip capacity would increase fast enough for this not to be a problem. In SERC's case this proved to be a very perceptive view of the future and has never caused any problems.

The operating system at the time is best described by the following paragraphs, taken from a progress report dated 1 September 1977 on the assessment activity:

The version of the operating system (PRIMOS) that was delivered was known as REV (revision) 11. REV 13 is now available and, although functionally a considerable improvement, it gives a greater store and processor overhead. For this reason we have delayed using it in service until Prime lend us an extra 32K words store. Prime claim that REV 14 and REV 15 will restore the lost performance.

Because of the very rapid rate of software development within Prime, great efforts are being taken to ensure that system modification at RAL will not lead to an unacceptable maintenance load in future revisions of the system.

Implementation of the Technical Group Report 1976-1978: Phase 1 of the ICF

Phase 1 of the ICF

The main departures from the original plan were the abandonment of the large central computer at RAL, an increase to 7 in the number of existing minicomputers enhanced, and some acceleration of the MUM programme.

The Prime 400 was substantially upgraded when the assessment was complete and used to run a central service at RAL, supporting a user population of 100, to partially substitute for the large central machine.

The assessment of the Prime and GEC machines took place during 1977. A benchmark was produced which included a number of existing Fortran programs from users (ie engineers). The real engineering programs were provided by the Engineering Departments of Cambridge, Glasgow and Leeds Universities and were run on the two machines using Tektronix terminals (4006, 4010 or 4014) via 1200/75 bps modems on the public telephone network.

The recommendations from the assessment were as follows:

In a non-demanding environment, the GEC 4070 is more cost-effective than the P400. Therefore, GEC 4070s should be purchased to go to sites with this characteristic workload.

However, one in six of the observed programs appear to fall into the category where large arrays are being continually accessed. These programs will perform much better on the P400. (Note the P400 processor speed was twice that of the GEC 4070.) The type of problem area where such programs have been observed include:

In environments where such programs are found, the GEC 4070 is not recommended. In this case, the P400 is much more cost-effective.

The P400 was expected to be capable of supporting up to 8-10 simultaneous users. Some specific comments from the assessment of the Prime 400 are given overleaf. The close involvement of the two manufacturers in the assessment was seen as being of considerable benefit to the whole exercise and assured the successful outcome in both cases.

A specific policy decision was made following the assessment to both distribute and support the software (operating system and applications) centrally from RAL. This was a very important and central aspect of the ICF, ensuring that engineering researchers were able to concentrate on their research. A machine from each supplier was to be provided at RAL in support of the systems development activities.

The acceptance of Prime as one of the recommended suppliers meant that one of the existing minicomputers to be upgraded was a Prime 300, at Nottingham University. This was replaced by a Prime 400, with the Prime 300 processor being transferred to RAL as a possible development machine. In fact it was put in a cupboard and never used!

All the other hardware recommendations were implemented according to the Technical Group Report, with one additional item, namely the purchase of a Floating Point Systems AP-120B array processor for assessment. This was attached to the P400 at RAL.

On the applications side, Special Interest Groups (SIGs) were established covering the following areas:

The Groups were charged with identifying essential applications to be purchased or developed during Phase 2 of the ICF.

Changes to PRIMOS 1976-1978

During the assessment stage a number of 'essential' changes and additions were made to PRIMOS. These included:

A list of future developments was also identified which would be implemented in the second phase of the Interactive Computing Facility (1979 to 1984). These were:

Some of the remaining deficiencies, eg batch, X25, incremental dumping and filestore budgeting, were being addressed by Prime in 1978.

Some Specific Comments from the Assessment of the Prime 400

Reliability

The only significant problem with the initial configuration was the 80 Mbyte discs which gave problems at a rate of about one a week. Eventually Prime replaced them both and the new ones have given no trouble. The only other problems have been with upgrades whether major, eg new memory, or minor, eg CPU modification. Invariably there is a period of unreliability (one or two incidents a week) with such upgrades. After this there is no further trouble, and it is believed that a stable system would give no more than one incident a month. However, there have been a number of lineprinter paper jams. Software breaks on the Prime are virtually unknown.

Failures

These figures represent the number of failures which have caused the systems to halt during service periods over the 12 months of the assessment.

                    Prime   GEC
Hardware: memory      19     12
Hardware: disc        26      3
Hardware: other       25      0
Software              10      5

Most of these incidents were clustered in time, and it is common to have several weeks with no incidents on either machine.

Generally these machines are very reliable compared with any mainframes previously experienced.

Documentation and Support

The standard of documentation on the Prime used to be very poor: only in the last few months has good quality documentation, orientated towards the user, begun to emerge. It was necessary to produce a substantial user manual for the machine locally.

Prime automatically supply the source of all software: this makes it easy to enhance or to cure locally trivial bugs in utilities.

Initially Prime software support in the UK suffered because of the remoteness from the parent company. This appears to be no longer the case: in fact a UK development organisation is being set up. Relations with Prime have been excellent, although RAL appears to be treated as something of a special customer.

Mike Jane, RAL

The next article in this series will cover Phase 2 of the ICF (1979-1984).
