ARCLIGHT
Issue 7: December 1987

Cray Grant Holders, December 1987

Approved Grant holders and their allocations

Grant Holder Title Hours
Dr G G Balint-Kurti, Thtl Chemistry, Bristol Theory of molecular photofragmentation 60
Dr I M Barbour, Physics, Glasgow QCD at finite density 200
Dr M Beale, Electrical Eng, Manchester Multiple access protocols for optical fibre local area networks 25
Prof T J M Boyd et al, Physics, UC, North Wales Modelling parametric instabilities in laser produced plasmas 200
Prof PG Burke et al, DAMTP, Queen's, Belfast Atomic and molecular physics calculations 468
Prof PG Burke et al, DAMTP, Queen's, Belfast Electron and positron molecule collision calculations 104
Prof C R A Catlow, Chemistry, Keele Supercomputer simulations in solid state chemistry 450
Prof C R A Catlow, Chemistry, Keele Computer modelling studies of diffusion processes in silicate minerals 75
Dr S J Cowley, Mathematics, Imperial Numerical solution of unsteady 3D laminar boundary-layer flow over bodies at incidence 30
Dr R G Evans, CCD, RAL Cray Scientific Support Group 40
Dr R G Evans, CCD, RAL Cray Scientific Support Group 20
Dr R G Evans, CCD, RAL Cray Scientific Support Group 25
Dr P S Excell, Electrical Eng, Bradford Probabilistic factors in radiative electromagnetic compatibility analyses 4
Dr R C Fawcett, ASR division, RAL Hartree-Fock and Dirac-Fock atomic structure programs 120
Dr D R Flower et al, Physics, Durham Studies of atomic collision processes 35
Dr P W Fowler, Chemistry, Exeter Ab initio studies of ions in solids 45
Dr J Gerratt et al, Theoretical Chemistry, Bristol Study of molecular states and molecular processes using spin-coupled VB theory 18
Dr A D Gosman, Mechanical eng, Imperial Prediction of turbulent heat transfer in complex geometries (CEGB/SERC co-funding) 60
Dr D G Gregory-Smith, Engineering, Durham Validation of viscous turbomachinery calculation methods 100
Dr D Gubbins, Earth Sciences, Cambridge Numerical models of the geodynamo 200
Dr B L Gyorffy, Physics, Bristol Magnetic scattering of X-rays 50
Dr B L Gyorffy, Physics, Bristol Superconductivity above 77 degrees Kelvin 100
Prof M G Haines, Physics, Imperial PIC simulations of beat wave excitation of relativistic plasma waves 100
Dr N C Handy, Chemistry, Cambridge Quantum chemistry, molecular vibrations and spectroscopy 400
Prof I H Hillier, Chemistry, Manchester Quantum chemistry calculations 400
Dr D M Hirst, Chemistry, Warwick Calculation of potential energy surfaces for small molecules and ions 100
Dr J Hock, HEP Division, RAL Monte Carlo renormalisation group study of SU(3) Gauge Theory on lattices 160
Dr R Holdaway, S & A division, RAL Orbit determination and prediction for satellites 100
Dr B W Holland, Physics, Warwick Theoretical studies in heterogeneous catalysis 50
Dr J M Hutson, Chemistry, Durham Intermolecular potentials from the infrared spectra of Van der Waals complexes 100
Dr J E Inglesfield, TCS Division, Daresbury Protein crystallography - Chmn: Prof ACT North 15
Dr J E Inglesfield, TCS Division, Daresbury Simulation of bulk systems - Chmn: Dr J Clarke, UMIST 20
Dr J E Inglesfield, TCS Division, Daresbury Heavy particle dynamics - Chmn: Dr JNL Connor, Manchester 15
Dr J E Inglesfield, TCS Division, Daresbury Correlated wave functions - Chmn: Prof IH Hillier, Manchester 15
Dr J E Inglesfield, TCS Division, Daresbury Astronomical spectra - Chmn: Prof MJ Seaton, UCL 15
Dr J E Inglesfield, TCS Division, Daresbury Electronic structure of solids - Chmn: Dr BL Gyorffy 15
Dr J E Inglesfield, TCS Division, Daresbury Computational study of surfaces - Chmn: Prof JB Pendry, Imperial 15
Dr J E Inglesfield, TCS Division, Daresbury TCS support of SRS experiments and CCP's 150
Prof J C Inkson, Physics, Exeter Real space calculations of response functions in insulators 25
Prof J C Inkson, Physics, Exeter Numerical studies of heterostructure based systems 20
Dr R A James, Astronomy, Manchester Numerical simulations of disc galaxies and of interacting galaxies 100
Dr M R Jane, Informatics Div, RAL ECF Applications support 150
Dr R Jones, Physics, Exeter Structure and motion of defects in semiconductors 50
Dr R G Jordan et al, Physics, Birmingham Surface and bulk electronic structure in rare earth metals 150
Dr P Killworth, Hooke Institute, Oxford FRAM: Fine Resolution Antarctic Model 8400
Prof A E Kingston, DAMTP, Queen's, Belfast Continuum states of atoms and molecules - Chmn: Prof AE Kingston 15
Prof B E Launder, Mechanical Engineering, UMIST Momentum and heat transport in idealised stator and rotor passages 278
Prof D C Leslie, Nuclear Engineering, QMC, London Large eddy simulation of thermal striping (CEGB/SERC co-funding) 21
Dr A Mackinnon, Physics, Imperial Computer simulation of transport in disordered solids 30
Dr P A Madden, Physical Chemistry Laboratory, Oxford Computer simulation studies of material properties 60
Prof J D McConnell, Earth Sciences, Oxford The micro-hydrodynamics and thermodynamics of water in clays 80
Dr J A McInnes, Computer Sci, Strathclyde Numerical calculations of Ohmic and Non-Ohmic Hopping Conductivities of 1D, 2D and 3D systems 250
Dr M E McIntyre, DAMTP, Cambridge UK Universities Global atmospheric modelling project 9000
Prof C Michael, DAMTP, Liverpool Glueball spectra 300
Dr H Nikjoo, Radiobiology Unit, MRC, Chilton Monte Carlo track structure calculations 200
Dr D Nunn, Electronics, Southampton Numerical simulations of wave particle interactions in space plasmas 70
Dr S C Parker et al, Chemistry, Bath Modelling of stability and sorption in alumino-silicate catalysts 700
Dr M C Payne, Physics, Cambridge Ab-initio investigation of the SIGMA = 5 (001) twist boundary in germanium 700
Prof J B Pendry, Physics, Imperial Determining the structure of disordered surfaces 60
Dr J Peraire, Civil Engineering, UC, Swansea Development of an unstructured grid based code for 3D aerodynamic applications 180
Prof G J Pert, Physics, York CCP in plasma physics 240
Prof G J Pert, Physics, York Rayleigh-Taylor instability with self-generated magnetic field 750
Dr D Prandle, IOS, Bidston Obs North Sea Programme - proposed by the Marine Sciences Committee: Chmn Prof JH Simpson 7000
Dr D Rees, Phys & Astronomy, UCL Global thermospheric and ionospheric modelling 60
Dr W G Richards, Physical Chemistry, Oxford Study of interaction energy between aromatic molecules 20
Dr S J Rose, Laser Division, RAL Central Laser Facility scientific computation on the Cray XMP 20
Dr P J Rous, c/o TCM Group, Cambridge Dynamical theories of electron spectroscopy 20
Prof J S Rowlinson, Physical Chemistry Laboratory, Oxford Computer simulation of fluids in pores and cavities 80
Dr C T C Sachrajda, Physics, Southampton Operator matrix elements from lattice calculations 400
Dr N S Scott, Computer Sci, Queen's, Belfast Parallel algorithms for the calculation of recoupling coefficients 6
Prof M J Seaton et al, Physics & Astronomy, UCL Atomic data for opacity calculations 510
Dr G P Srivastava, Physics, Ulster Pseudopotential total energy and force calculations 60
Dr J P W Stark, Aeronautics & Astronautics, Southampton Space vehicle flow field and glow simulation 200
Dr J B Staunton, Physics, Warwick Magnetic and compositional ordering and dynamical susceptibility in metals and alloys 200
Dr J B Staunton, Physics, Warwick Study of magnetic anisotropy and dynamical susceptibility based on relativistic s.p.b. structure 30
Dr A P Sutton et al, Metallurgy & Material Science, Oxf Computer simulation of defects and interfaces in semiconductors 700
Dr F W Taylor, Atmospheric Phy, Oxford Atmospheric and Oceanographic research at the Robert Hooke Institute 80
Prof C Taylor et al, Civil Eng, UC, Swansea Numerical modelling of 3D turbulent flow with swirl in 180 deg. bends - A Finite Element Model 70
Dr J Tennyson, Physics & Astronomy, UCL Calculated vibration-rotation states of small molecules 220
Dr M Teper, Thtl Physics, Oxford Physics of SU(3) lattice gauge theory 850
Dr I J Thompson, Eng Mathematics, Bristol Coupled reaction channel methods in nuclear scattering problems 9
Dr O R Tutty, Aeronautics & Astronautics, Southampton Boundary layer stability 750
Dr D D Vvedensky, Physics, Imperial Monte Carlo simulations of growth by molecular beam epitaxy 40
Dr J F Wheater, Thtl Physics, Oxford Properties of field theories on random lattices 290
Dr D H Williams, Chemical Lab, Cambridge Advanced models for computer assisted drug design 35
Prof O C Zienkiewicz, Civil Eng, UC, Swansea Error estimates and adaptive finite element analysis for engineering design and analysis 30

The Cray Filestore

The Cray permanent filestore is a major aspect of the Cray service.

It is managed by the COS archiving system, which is used to maintain copies of all datasets in the Masstor M860 automatic cartridge store and to delete the on-disk versions after a period of inactivity. Such "migrated" datasets are reloaded transparently and automatically to disk when required (ACCESSed). Currently, only about 20% of the filestore resides on disk at any one time, in order to leave adequate disk space for work datasets and short term files.
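The migration policy is easy to picture as a small simulation. The following toy model, written in Python purely for illustration (COS itself of course works nothing like this internally), mimics the migrate-on-inactivity and reload-on-ACCESS behaviour just described; the one-week threshold and all the names are assumptions, not actual COS parameters.

    import time

    MIGRATE_AFTER_SECONDS = 7 * 24 * 3600   # hypothetical inactivity window

    class Dataset:
        def __init__(self, name):
            self.name = name
            self.on_disk = True              # disk copy currently present?
            self.last_access = time.time()

    class Filestore:
        def __init__(self):
            self.datasets = {}

        def save(self, name):
            # SAVE: catalogue the dataset; a copy also goes to the M860 store.
            self.datasets[name] = Dataset(name)

        def housekeeping(self, now):
            # Archiver pass: drop the disk copy of anything idle too long.
            # The M860 copy (not modelled here) is always retained.
            for ds in self.datasets.values():
                if ds.on_disk and now - ds.last_access > MIGRATE_AFTER_SECONDS:
                    ds.on_disk = False       # "migrated": disk space reclaimed

        def access(self, name):
            # ACCESS: transparent, automatic reload from the M860 if migrated.
            ds = self.datasets[name]
            if not ds.on_disk:
                ds.on_disk = True            # staged back to disk for the user
            ds.last_access = time.time()
            return ds

    fs = Filestore()
    fs.save("RESULTS1")
    fs.housekeeping(now=time.time() + 8 * 24 * 3600)  # over a week passes idle
    print(fs.access("RESULTS1").on_disk)              # True: reloaded on ACCESS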

A filestore should be reliable and able to withstand hardware and software failures with minimal disruption to users. To this end, considerable care and attention has been given to ensuring the integrity of the Cray filestore.

Datasets can be reloaded from the M860 if the on-disk copy is lost due to a disk hardware failure. The master dataset catalogue can also be recovered from the M860 if it is lost from disk.

A copying program has been developed by the Atlas Centre which copies Cray data onto IBM 3480 cartridge tapes. This is used to maintain backup copies of the entire Cray filestore, from which datasets can be recovered in case of M860 cartridge failure.

The maximum impact of a failure of any one piece of hardware would be the loss of changes made to datasets, and of datasets created, since the previous backup run. A full backup is run each evening.

The Cray system is currently allocated about one third of the Masstor M860 device, allowing a maximum total Cray permanent filestore of about 26 Gigabytes. The whole of the remainder of the Masstor is divided between the CMS and MVS systems, each of whose filestores is about 12 Gbytes.

The main constraint on the Atlas Centre mainframe filestores is the size of the M860, which can be partitioned flexibly between the different operating system filestores. Users who use more than one of the operating systems should therefore store data in whichever system is most appropriate for the application. User Support Group (US@UK.AC.RL.IB) are able and willing to move your space allocation between systems, so that you do not have to move data artificially between them - after all, the space all comes out of the same M860 device.

The M860 is almost full, so Atlas Centre mainframe users, especially those projects owning more than 100 megabytes of filestore (combined between CMS, MVS and COS, excluding users' own tapes), are asked to review their use of filestore and reduce it where they can without restricting their work.

Some projects involve so much data that direct use of magnetic tapes by users is more appropriate than filestore files. As a guide, users should use tapes for storing data at the Atlas Centre only if the amount of data is large (hundreds of megabytes - more than one tape full). For smaller amounts, it is generally more convenient, both for users themselves and for the system as a whole, to use the ordinary filestores of the operating systems.

We wish to divide the available filestore capacity of the Masstor M860 flexibly and usefully between the users. Although the standard filespace allocation for new Cray users, and for those with little filespace at the time allocations were introduced, is 50 megabytes, it is clear that filestore requirements for different projects vary widely. We also recognise that a user's filestore requirements can vary considerably with time.

User Support Group (US@UK.AC.RL.IB) will respond positively to requests for increased filestore allocations, even of large quantities (Gbytes), especially if the space is only required for a limited period.

Steps are being taken to increase the M860 space available to all systems, such as improving the efficiency with which MVS uses the M860.

Enhancements to the filestore allocation system are being considered. Possibilities include a facility (an ACCT subfunction) to transfer some of your own allocation to another user; a sketch of the idea is given below. The space allocation systems in CMS and COS are currently different; possibly they should be made the same, or even combined into a single space allocation per user.
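To make the transfer idea concrete, here is a minimal sketch, in Python rather than anything that runs on the mainframes, of what such an allocation-transfer operation might look like. The facility does not yet exist; the user names, the conservation rule and the checks are all hypothetical illustrations.

    # Hypothetical sketch only: no such ACCT subfunction exists yet.
    # Allocations are in megabytes; the names are invented for illustration.
    allocations = {"USER1": 50, "USER2": 50}   # standard 50 Mbyte allocations

    def transfer_allocation(donor, recipient, mbytes):
        # Move part of one user's filespace allocation to another user.
        # The total allocated space is conserved, so the overall M860 budget
        # is unaffected; only its division between users changes.
        if mbytes <= 0 or allocations[donor] < mbytes:
            raise ValueError("donor cannot give that much allocation")
        allocations[donor] -= mbytes
        allocations[recipient] += mbytes

    transfer_allocation("USER1", "USER2", 20)
    print(allocations)   # {'USER1': 30, 'USER2': 70}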

If you have comments or suggestions about the Cray filestore and space allocations, please pass them on to User Support Group (US@UK.AC.RL.IB) so they can be considered.

David Rigby - Systems Group, RAL.

First Cray User Meeting October 1987

The first Cray User Meeting took place on Thursday 15 October 1987 at the Rutherford Appleton Laboratory. Over 60 people attended to hear a number of presentations on a variety of topics in the morning. The afternoon was devoted to a general discussion period during which a number of interesting issues were raised.

Chairman's Introduction - Professor G J Pert

Professor Geoff Pert welcomed everyone to the meeting and highlighted the three main reasons for holding it.

Introduction to JRCSU - Dr B W Davies

Dr Brian Davies started his talk with a look back at the history of the Atlas Centre, whose first computer was the original Ferranti Atlas 1, the supercomputer of the 1960s. This machine cost £3M (£20M at today's prices) and arrived in 20 lorries. In comparison, the Cray cost only £13M and took just one lorry to deliver, but is about 1000 times more powerful.

In describing where this meeting fitted into the infrastructure of other meetings, he pointed out that the Chairman of the Cray User Meeting attends the Atlas Centre Supercomputing Committee (ACSC), chaired by Professor Forty, along with representation from the two other National Centres. The ACSC reports to the joint Policy Committee set up by the Advisory Board for Research Councils (ABRC), the University Grants Committee (UGC) and the Computer Board (CB).

On the future of the Cray, Dr Davies commented that although there was sufficient funding to cover the cost of running the Cray, no money had yet been allocated for upgrading it. As was highlighted in the discussion in the afternoon, the lack of disk space was already causing concern to users. The usefulness of the Cray will be judged on the scientific results it produces, and it will be very important for users to publish their successes wherever possible. Publicity material is being produced at the Atlas Centre, and photographs, diagrams, reports and any other material are urgently required. Any user who can help out should contact User Support and Marketing Group at RAL.

User Support and Training - Mr P C Thompson

Providing training and support to a whole new community on a new computer has taken a great deal of effort this year. Several courses have been organised, a User Guide written, and both the Service Line and the Program Advisory Office have seen a steady increase in queries as users have become established on the Cray. Mr Paul Thompson gave a note of warning to prospective Cray users, asking them to ensure that they add a request for funds (£100 should be sufficient) to cover computer manuals along with their request for computer time, as manufacturer's manuals cannot be supplied free of charge to users.

A list of all the software available on the Cray was presented, along with a request that users who wish to use any piece of software not currently available on the Cray should contact User Support and Marketing Group.

Cray Operating Software - Dr T G Pett

Dr Tim Pett talked about the future migration from COS to UNICOS, a move which may be forced on us within two or three years if Cray reduce support for COS. This was one topic raised in the afternoon discussion. Any changes or additions to COS will be advertised in Arclight. Tim Pett ended with a request for volunteers! A pilot scheme has been run providing a VAX front-end to the Cray. This is now to be extended and Tim would like to hear from anyone who would like to take part in this extended service.

Graphics Services - Mr C D Osland

Mr Chris Osland described some of the services available and some of those planned.

Discussions are taking place with ULCC aimed at providing film output. Two routes will be used; output up to a size yet to be determined will be sent online while larger jobs will go to tape and be transferred by courier.

Plans for a video output facility at RAL are complete and are awaiting funds.

A Silicon Graphics IRIS 3130 workstation will provide very powerful graphics facilities to users able to visit the Atlas Centre. Following an investigation by a Working Party under the IUSC, a bulk deal has been made between the Computer Board and UNIRAS for their presentation graphics system; Chris Osland asked users whether they required UNIRAS on the Cray.

Cray Scientific Support - Dr R G Evans

The Advanced Research Computing Unit will comprise a small number of scientists from different areas of science who will be available to help Cray users with problems particularly related to their science. They will also be carrying out a certain amount of research themselves. The aim is to get each Funding Board in SERC represented along with representation from the other Research Councils. The process of getting this unit up to strength is proceeding.

Dr Evans is very keen to make contact with users in new subject areas not covered by the current Computational Collaborative Projects (CCPs), to see if new CCPs should be set up.

As was mentioned earlier in the graphics talk, there are facilities at the Atlas Centre which are available for users during visits. Cray users wishing to visit the Centre for a number of days will be very welcome, and anyone wishing to do so should contact Dr Evans. One of the facilities users may wish to make use of is the FORGE optimising tool, which runs on a SUN and the Cray. A SUN is on order and we will be getting a month's free trial of this expensive (£20,000 per single-user licence) system.

Discussion

Several points were raised during this lively session, for which users were obviously happy to miss afternoon tea when the session over-ran! The major points are highlighted below.

  1. Publicity of the work on the Cray is going to be vitally important in the long term, both to make cases for extra facilities on the existing Cray and to pave the way for the next academic supercomputer. Could users please send to Cray User Support and Marketing Group a copy of any published report which refers to work performed on the JRCSU. Any other photogenic material relating to the science will also be very useful. Something along the lines of an Annual Report will need to be produced.
  2. Documentation written for Cray users needs to include information on the performance monitoring systems available. Dr John Gordon informed users that a short introduction to microtasking is nearing completion; it will be advertised in Arclight when complete.
  3. There was "strong encouragement" to the three National Centres to collaborate and standardise wherever possible. For example, graphics meta files should be transferable between the machines. (Chris Osland, in answer to this question, mentioned the International Standard for metafiles which has now been defined.)
  4. The prospective move to UNICOS caused some discussion. First, the reasons to move were listed:
    • Cray have decided that UNIX is the operating system for their "state-of-the-art" computers.
    • It is the only operating system which offers the opportunity to run the same interface on different front-ends.
    • COS is apparently locked into 24-bit addressing (16 Mwords).
    • UNICOS offers the possibility of remote procedure calls from a front-end graphics workstation to the Cray.
    There are several disadvantages, though:
    • UNICOS requires more disk space,
    • it requires more memory,
    • it generates more I/O, and
    • it is more expensive.

    ULCC are interested in investigating UNICOS but cannot run COS and UNICOS on their machine. On the X-MP/48 there are also problems. COS and UNICOS filestores are incompatible, and the two operating systems cannot share CPUs, so during development and testing one CPU will have to be set aside for UNICOS. Users were generally cautious about UNICOS (reports from the States have not been favourable) and the Atlas Centre have stated that they are happy to wait until UNICOS has become more mature before moving.

  5. Machine usage caused a great deal of discussion. The limited amount of disk space available has already caused problems - and this is before the machine is fully loaded. Users with special disk space problems are urged to talk to User Support and Marketing Group. Large amounts of space can be allocated for a short period of time if necessary. Some users are experiencing problems with their programs being held up due to contention for the Solid State Storage Device (SSD). Again, users experiencing problems are asked to contact User Support, who are investigating these contentions; the problem should ease with the introduction of COS 1.16. Priority 0 time was introduced to enable users who had already been approved to make full use of the Cray while usage builds up. This will automatically be reduced as the number of approved users increases.

    The intention of User Support & Marketing Group is to talk to the users with some of the larger programs to see if they can be made to run faster and more efficiently.

    Remote access was mentioned, and the advice to anyone wishing to ship large amounts of data is to talk to their local network representatives to ensure that optimum "window" size is being achieved over JANET.

Future Meetings

It was decided that these meetings should be held about twice a year, mainly at RAL but with occasional visits to other sites. They should follow a similar format, possibly with presentations from one or two users. The date of the next meeting will be advertised in Arclight, and users will be invited through this medium.

Jacky Hutchinson, Central Computing Department, Rutherford Appleton Laboratory

Numerical Modelling of the Middle Atmosphere

At present, there is considerable interest in the middle atmosphere, the region of our atmosphere extending from approximately 10 km above the surface to over 100 km. Ozone, a minor constituent of the atmosphere, forms a layer which peaks at about 25 km. Despite its low abundance, it is essential in absorbing solar ultraviolet radiation in the region of 280-300 nm. DNA absorbs strongly in this region, and any change to the ozone layer could have important biological consequences. With the discovery in recent years of significant ozone depletions in south polar regions, the observation and theoretical interpretation of middle atmospheric structure have taken on renewed importance.

Activities at Cambridge in the field of middle atmospheric science include the use of a two-dimensional numerical model of the dynamics and chemistry of this region. The model calculates the winds, thermal structure, and chemistry of constituents affecting the balance of ozone as a function of altitude and latitude. The numerical complexity of the problem of solving many differential equations as a function of time makes the Cray X-MP the ideal vehicle for carrying out these calculations. Despite the use of vectorisation, a calculation carried out for one model year requires about 10 minutes of CPU time.
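For readers unfamiliar with such models, the sketch below shows, in Python, the basic operation involved: time-stepping a production/loss equation for a constituent at every point of a 2D (latitude, altitude) grid. The grid size, rates and time step are invented for illustration; the real Cambridge model is far more elaborate, not least in also transporting the fields with the computed winds.

    import numpy as np

    # Hypothetical grid and rates, chosen only so the toy runs quickly and
    # stably; they are not the values used in the Cambridge model.
    nlat, nalt = 36, 60
    o3 = np.full((nlat, nalt), 5.0e12)    # ozone number density, molecules/cm^3
    prod = np.full((nlat, nalt), 1.0e6)   # photochemical production, /cm^3/s
    loss = np.full((nlat, nalt), 2.0e-7)  # first-order loss frequency, /s

    dt = 900.0                            # 15-minute step; loop = one model year
    for step in range(int(365 * 86400 / dt)):
        # Forward-Euler chemistry update at every grid point at once; the
        # transport terms (winds) of the real model are omitted here.
        o3 += dt * (prod - loss * o3)

    # The field should sit at the photochemical steady state prod/loss.
    print(o3[0, 0], prod[0, 0] / loss[0, 0])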

Calculations of possible future perturbations of the ozone layer due to increases in trace gases (e.g. chlorofluorocarbons, methane, and nitrous oxide) are currently being made. Recent international treaties have regulated the growth of certain compounds in an effort to limit their impact on the ozone layer. Models such as ours are being used to investigate how successful such controls are likely to be. These computations require examining the state of the atmosphere up to 100 years into the future, which would be a mammoth task on a scalar machine!

In regard to the observed depletions of Antarctic ozone and other reported possible global depletions, there has been strong interest in comparing past observed trends with model calculations. Such work is also being done with our numerical model in collaboration with other groups worldwide. We are also actively involved in the Universities Global Atmospheric Modelling Project (described briefly in Arclight No. 6), which is developing a state-of-the-art three-dimensional atmospheric model to investigate many of the outstanding dynamical and chemical problems in both the troposphere and stratosphere.

In summary, the RAL Cray X-MP is being exploited to examine many problems at the forefront of middle atmospheric research. We are currently investigating methods to improve the efficiency of the model code using devices such as the SSD, and ways of transporting large quantities of model results back to Cambridge for graphical examination and interpretation.

Richard Eckman, Department of Physical Chemistry, Cambridge University

Using the Cray X-MP/48 to study the structure of nucleons and pions

In this note I briefly describe a project currently being undertaken in collaboration with G. Martinelli of CERN. The purpose of this project is to compute a number of fundamental quantities necessary for the understanding of the structure of hadrons (strongly interacting particles), in particular of the pi mesons and of the nucleons. Among these quantities are the momentum distributions of quarks and gluons inside these particles, as well as their electromagnetic form factors (e.g. the magnetic moment of the proton). In addition we shall compute the strong corrections to a number of processes which would signal the presence of new physics (e.g. proton decay in grand unified theories). It is the uncertainty in the strong interaction corrections which is generally the largest contribution to the theoretical error for these processes. We stress that these computations of fundamental quantities in strong interaction physics, now made possible by the advent of modern computers, are performed from first principles with no model assumptions or free parameters. They will also be the first computations of these quantities, so the corresponding theoretical structure must be developed in parallel with the computation.

The calculations are performed by putting QCD (quantum chromodynamics, the theory of the strong nuclear force) on a lattice of space-time points. We choose to work on a lattice which has 20 points in the x direction, 10 points in the y and z directions and 40 points in the time direction (such an asymmetric choice is dictated to us by physics considerations). Most of the CPU time is used in determining how a quark propagates in a given gluon background field (with or without a pion or proton source). This involves the inversion of an 11,520,000-dimensional sparse matrix, which is done by iteration using the Gauss-Seidel method. The corresponding program is fully vectorised, using both standard and privately tailored scatter/gather routines. Indeed, since the project requires us to run 180 of these jobs, it would not be feasible on a scalar machine. The SSD allows us to use I/O heavily, gaining a large reduction in the required memory. A typical job uses about 3 hours of CPU time before the required precision is reached, needs 1 Mwords of memory and moves 100 million sectors to and from the SSD. Each job produces a quark 'propagator', which is the basic ingredient in the computations of all the quantities we are interested in. The propagator is contained in a dataset of 23 Mwords, which is stored on tape.
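For readers unfamiliar with the solver named above, the following is a minimal sketch of Gauss-Seidel iteration, written in Python on a tiny dense, diagonally dominant system so that it is self-contained and converges. The production code instead applies the method to the sparse 11,520,000-dimensional matrix described above, fully vectorised on the Cray.

    import numpy as np

    def gauss_seidel(A, b, tol=1e-10, max_iter=10000):
        # Solve Ax = b by sweeping through the unknowns, always using the
        # newest available values of x - the defining Gauss-Seidel step.
        n = len(b)
        x = np.zeros(n)
        for _ in range(max_iter):
            x_old = x.copy()
            for i in range(n):
                s = A[i, :i] @ x[:i] + A[i, i + 1:] @ x[i + 1:]
                x[i] = (b[i] - s) / A[i, i]
            if np.linalg.norm(x - x_old) < tol:
                break
        return x

    # Small diagonally dominant test system, so the iteration converges.
    A = np.array([[4.0, 1.0, 0.0], [1.0, 4.0, 1.0], [0.0, 1.0, 4.0]])
    b = np.array([1.0, 2.0, 3.0])
    print(gauss_seidel(A, b), np.linalg.solve(A, b))   # the two should agree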

To date we have generated the propagators required for the structure of the pion. With these we have studied the momentum distribution of the quark and antiquark in the pion and its electromagnetic form factor. The results are extremely encouraging, being in agreement with experimental data and having small statistical errors. At present, in addition to continuing the studies of the pion's structure, we are generating the propagators required for the calculations involving the nucleons.

In summary, we repeat that we are exploiting the Cray X-MP/48, and in particular its vector architecture and the fast I/O possible with the SSD, to study the structure of pi mesons, protons and neutrons from first principles!

Chris Sachrajda, University of Southampton