Issue 10: February 1998
- Launch of W3C-LA
- Optimise your Results with LANCELOT
- Human Computer Interfaces for the year 2015
- The TAP-EXTRA Project
- Atlas Users Meeting
NOTE: many of the URLs referenced in this newsletter are no longer active.
Launch of W3C-LA
The UK launch of the World Wide Web Consortium's Leveraging Action initiative (W3C-LA) took place in the Great Room of the Royal Society for the Arts, London, on 3 December. This symposium was organised jointly by Computer Weekly, DISC and CLRC, Rutherford Appleton Laboratory, with Bob Hopgood as Chairman and many other DCI staff heavily involved in preparation, registration and demonstrations. Some 150 senior executives from leading UK companies heard Jean-Francois Abramatic, W3C Chairman, open the proceedings by describing the Web as a people-to-people communication medium, a machine-to-people framework for computing knowledge and a machine-to-machine information infrastructure. He said the challenges were to prevent market fragmentation, develop a powerful architecture, guarantee accessibility and develop best practices. He then presented an overview of the W3C, covering its history, achievements and ongoing work and emphasising the many benefits for member organisations.
He was followed by star attraction Tim Berners-Lee, inventor of the Web, and now W3C Director, who described his personal dream of a hypertext system which would be "personally cool", enable social efficiency and understanding and allow the exploitation of computing power in real life. His vision now was for the Web to become a single universal space encompassing PCs to TVs, hypertext, video on demand and spanning the personal to the global, the scribbled to the polished. Getting somewhat carried away he even coined a new word "intercreativity" meaning building things together in the Web. Not forgetting the industrial audience he went on to talk of the importance for trade of the "Web of Trust", the whole security infrastructure from keys and signed documents to personal information, which will allow electronic commerce to take place safely and securely.
The afternoon was devoted to the explanation of the work of the three W3C areas covering the User Interface, Architecture, and Technology and Society Domains, presented by Bert Bos, Philipp Hoschka and Josef Dietl respectively. Topics covered included HTML, XML, Style Sheets, Graphics, Fonts, Mathematical Markup, HTTP, Multimedia, PICS (Platform for Internet Content Selection), RDF (Metadata) and Digital Signatures. These talks were complemented throughout the day by demonstrations which illustrated many of these new developments and gave a flavour of how they would impact the Web in the future.
If you would like more information on W3C please read the article on W3C in this issue.
Tim Pett, DCI, CLRC
Optimise your Results with LANCELOT
Legend has it that, way back in the dark ages, Sir Lancelot and his companion Knights of the Round Table were charged by God to return the Holy Grail to Camelot. In these days of more plentiful light, we are pleased to announce that LANCELOT is still with us and more than willing to help you in locating your own grail. That is, if you are interested in solving large-scale minimisation problems. So, save yourself sleepless nights and let our sleepless knight do the job for you.
One of the main research interests of DCI's Numerical Analysis Group has been the solution of large-scale linear and non-linear programming problems, that is, in finding the smallest or largest value of a possibly non-linear function of a large number of variables where the variables may be required to satisfy a set of linear or non-linear constraints. Such problems arise quite naturally in a vast number of applications in science, engineering, planning and economics.
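In the standard notation of the optimisation literature (not taken from the article itself), the class of problems described above can be written as:

```latex
\min_{x \in \mathbb{R}^n} \; f(x)
\quad \text{subject to} \quad
c_i(x) = 0 \ (i \in \mathcal{E}), \qquad
c_i(x) \ge 0 \ (i \in \mathcal{I}),
```

where $f$ and the constraint functions $c_i$ may all be non-linear, and the number of variables $n$ may run to many thousands.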
The LANCELOT project is an international collaboration between Nick Gould (CLRC), Philippe Toint (FUNDP, Namur, Belgium) and Andy Conn (IBM, Yorktown Heights, NY, USA) to develop widely applicable software for solving such problems. (In case you are curious, LANCELOT is an acronym for "Large And Non-linearly Constrained Extended Lagrangian Optimisation Techniques".)
A guiding aim of the project has always been that the algorithms incorporated in the software are based on solid theoretical foundations. The underlying theoretical behaviour of the methods has been, and is continuing to be, published in international research journals. The aim has been to be able to solve genuinely non-linear problems with up to, say, 10,000 variables on a reasonable workstation, and this goal is now being realised.
Release A of the software is complete. The package works in two phases. Firstly, users must describe their minimisation problems in a standardised input language. This description is translated into data and Fortran programs which are then compiled and linked to the main LANCELOT load module. Secondly, users select from a number of options which control the minimisation method used. The minimisation program then starts to solve the problem and, with luck, the desired solution rapidly follows.
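LANCELOT's own input language and Fortran internals are not reproduced here, but the "Extended Lagrangian" idea behind the solver can be illustrated with a minimal Python sketch. Everything below (the two-variable problem, the step sizes, the iteration counts) is invented for illustration; a tiny problem stands in for the large-scale ones the package targets. Each outer iteration minimises the augmented Lagrangian for a fixed multiplier estimate, then updates the multiplier from the remaining constraint violation.

```python
# Toy augmented-Lagrangian iteration (an invented example, not LANCELOT code).
# Problem: minimise f(x1, x2) = (x1 - 1)^2 + (x2 - 2.5)^2
#          subject to  c(x1, x2) = x1 + x2 - 3 = 0.

def grad_f(x1, x2):
    return 2.0 * (x1 - 1.0), 2.0 * (x2 - 2.5)

def c(x1, x2):                          # equality constraint, want c = 0
    return x1 + x2 - 3.0

def solve(x1=0.0, x2=0.0, rho=10.0, lam=0.0, outer=20, inner=400, step=0.01):
    """Minimise the augmented Lagrangian f + lam*c + (rho/2)*c^2 for
    fixed lam (inner gradient-descent loop), then update lam from the
    remaining constraint violation (outer loop)."""
    for _ in range(outer):
        for _ in range(inner):
            w = lam + rho * c(x1, x2)   # multiplier estimate at current x
            g1, g2 = grad_f(x1, x2)
            x1 -= step * (g1 + w)       # d c / d x1 = 1
            x2 -= step * (g2 + w)       # d c / d x2 = 1
        lam += rho * c(x1, x2)          # first-order multiplier update
    return x1, x2, lam
```

For this toy problem the iterates approach the solution (0.75, 2.25) with multiplier 0.5; the real package replaces the naive inner loop with sophisticated trust-region methods able to exploit problem structure at large scale.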
LANCELOT has already been used to solve problems in the water, gas, oil and aircraft industries and there are versions of LANCELOT for most Unix systems. If you are interested in finding out more about LANCELOT, please contact Nick Gould.
Nick Gould, DCI, CLRC
Human Computer Interfaces for the year 2015
CLRC is a member of the European TACIT network which has been awarded a grant of 1.3 million ECU by the European Commission (EC) to investigate techniques for continuous interaction between computers and their users. The project, which started in January 1998, funds eight postdoctoral researchers for three years in DCI at Rutherford Appleton Laboratory, CLRC, the universities of Sheffield and York in the UK, Parma in Italy, and Joseph Fourier in Grenoble, France, and the research institutes of FORTH in Crete, CNUCE in Italy and DFKI in Germany. The proposal was one of 20 funded by the EC Training and Mobility of Researchers (TMR) Programme in the area of mathematics and computing from about 200 submissions.
It is predicted that the available power of desktop computers will continue to double every 18 months, while network communications, which currently use only 0.01% of the capacity of optic fibre, will increase dramatically in the next few years. By 2015 such machines will be able to support image and speech recognition so that they can passively observe their environments, and interpret the actions and dialogue of computer users employing techniques similar to those used by humans. It is also expected that intelligent agents, as well as human users and other resources, will be represented in three-dimensional virtual worlds that present the computer's view of the world to users. The project will develop a theory of such continuous interaction techniques as a result of research to classify and describe the structure of such techniques, formally model their semantics and temporal properties, and evaluate architectures for implementing these novel technologies.
Michael Wilson, DCI, CLRC
The World Wide Web Consortium
The World Wide Web Consortium (W3C) was founded in 1994 to lead the evolution of the Web while retaining its interoperability. Its mission is to evolve the Web as a robust, scalable and adaptive infrastructure.
Why Join W3C?
Organisations perform best if they have the best information crucial to their future. Few organisations can ignore the impact of the Web on today's business opportunities: national borders are fast melting and world trading by even the smallest niche businesses is becoming feasible. By joining W3C your organisation can maximise the potential of its business while helping W3C to achieve its mission.
How to Join W3C
If you are interested in joining W3C or would like more information, contact W3C's UK Office at RAL Tel: 01235-446822, e-mail: email@example.com
For further information about W3C visit the web site at: http://www.w3.org/
The TAP-EXTRA Project
The French city of Bordeaux is very prone to flooding at times of heavy rainfall. The company Suez Lyonnaise des Eaux is responsible for the flood control systems, and runs a 24-hour control centre called RAMSES, where experienced operators monitor information received from sensors on the pumps, retention basins and other plant, and take actions to reduce the risk of flooding.
The TAP-EXTRA project, partly funded by the European Commission under the ESPRIT Programme, has recently been successfully completed by DCI and Suez Lyonnaise des Eaux. Its objective was to assist the control room staff in their tasks, by enhancing one of their information systems with what are called "co-operative, explanatory capabilities" - in essence, adapting the system output to be clearer and more helpful to the users and tailored to their tasks.
The system in question is called Aleph, and it filters and combines the large number of incoming alarm signals from sensors (for example, that a pump has failed to start) into more meaningful reports. In TAP-EXTRA, an extra layer of co-operative, explanatory interpretation has been added, specifically aimed at assisting the users with their most pressing tasks. The three major user needs were identified as:
- risk assessment, especially in long-duration winter rains
- "mental model mis-match", that is, pointing out to the user cases when their view of the situation may not match reality, for example if an emergency pump continues to run when normally it should have shut off
- an overview of the situation, particularly for the engineer called out in times of crisis.
These facilities were all added to the Aleph system using a software toolkit and methods developed in a previous ESPRIT project, I-SEE, in which DCI was also a leading partner. The development involved producing "domain models" representing knowledge about the drainage system, and "explanation techniques" for building up explanatory output to the user relating to the three needs. The I-SEE software runs as a separate process and generates explanatory text output in French which expands and gives context to the messages produced by the Aleph system.
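The I-SEE toolkit and the TAP-EXTRA domain models themselves are not shown in the article, but the "mental model mis-match" idea can be sketched in a few lines of Python. All names, rules and thresholds below are hypothetical: the sketch simply compares each plant item's reported state with the state an operator would expect from the wider situation, and produces an explanatory message when the two disagree.

```python
# Toy illustration of "mental model mis-match" detection (invented rules,
# not the I-SEE software). An emergency pump is expected to run only
# while its retention basin is above the high-water threshold.

def expected_state(item, situation):
    """Expected state of a plant item given the wider situation."""
    if item["kind"] == "emergency_pump":
        above = situation["basin_level"] > situation["high_threshold"]
        return "running" if above else "stopped"
    return item["reported"]   # no expectation encoded: trust the report

def mismatches(items, situation):
    """Return explanatory messages for items whose reported state
    differs from what the operator would expect."""
    msgs = []
    for item in items:
        exp = expected_state(item, situation)
        if exp != item["reported"]:
            msgs.append(
                f"{item['name']}: reported '{item['reported']}' but expected "
                f"'{exp}' (basin level {situation['basin_level']} m vs "
                f"threshold {situation['high_threshold']} m)"
            )
    return msgs
```

A pump reported as running while its basin is below the threshold would thus yield one message pointing out the discrepancy; the real system additionally generated fluent French text giving the context of each alarm.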
An important part of the development was a series of very thorough user trials based on real situations. One engineer remarked that he had learnt more about handling crises from the TAP-EXTRA trials than he had from real life.
Simon Lambert, DCI, CLRC
Atlas Users Meeting
The meeting of Atlas scientific computing users on 24 October at the Royal Institution attracted about 40 attendees to hear the latest news and discuss issues relating to the services on the Cray J90, the DEC 8400 (Columbus) and the new NERC Fujitsu VPP 300.
Roger Evans (CLRC) described the recent service levels, which had been very good on the J90; the DEC 8400 had also been excellent following its upgrade to Digital Unix 4.0b.
John Gordon (CLRC) described recent changes: Cray no longer supported the old Fortran 77 compiler, and any performance or compatibility issues with the Fortran 90 compiler should be reported to User Support so that they could be forwarded to Cray for action. The NQS queues on the J90 had been modified to allow for the memory upgrade from 4 GByte to 8 GByte, and users were invited to comment on further changes that were desirable. The file system layout on the Fujitsu had caused initial problems: Fujitsu's large file system did not cope well with small files, and the temporary disk area (/tmp) had had to be split into smaller areas because the poor small-file performance was adversely affecting the language compilers.
On both the J90 and Columbus there were periods of inefficient memory utilisation, mainly because users were not specifying memory limits on their job requests; NQS and LSF then reserve the maximum for the job class even though the actual usage may be much less.
Peter Oliver (CLRC) described the recent updates to the applications software suites including the new NAG Mk17 library. This brought comment from the audience that NAG's change to the implementation of single/double precision on the J90 (which is strictly a 64 bit machine) was an unnecessary headache for users maintaining portable codes.
Alison Wall (EPSRC) gave a short description of the state of play on the HPC'97 project, which is now progressing well, with shortlisting in December, benchmarking through February 1998 and the intention to install a major new service in the summer of 1998.
Lois Steenman-Clarke (UGAMP, Reading) described the new Fujitsu VPP 300 (reported in Issue 7) and initial impressions of the service. Up to two EPSRC projects were offered the opportunity to evaluate the VPP 300 in the period up to March 1998, and discussions at the meeting led to an atomic physics project (UCL / Queens University Belfast) and a materials science project emerging as the two best suited to the machine.
Steve Parker (Bath) then described more user experiences of the VPP 300 and its use for the simulations of minerals at high temperature and pressure in geophysical modelling. The aims are to predict thermodynamic quantities, stability, atom transport and creep rates, and the surface structure and how it interacts with the environment.
The methods used are similar to other materials science models using either interatomic potentials or electronic structure calculations. Bulk properties agree well with measurements but the geophysical environment can make some subtle differences important where they lead to a change of stable form at a given pressure.
Grain boundary effects and the effects of hydration on surfaces can be modelled for simple materials such as CaO and MgO, and the new Fujitsu VPP 300 would allow this work to be extended to more interesting materials such as carbonates and silicates, even including impurities.
With a combination of the increased single-processor performance of the VPP 300 and some optimisation by Fujitsu, their model now ran 40 times faster than its original J90 version.
In the open discussion period the need to increase the capacity and capability of the DEC 8400 became clear; a memory upgrade and a second chassis have now been procured, giving about a doubling of capability. CLRC would work with EPSRC and a subset of users to define future requirements in this area.
The extra memory on the J90 was discussed and additional changes to the NQS queues were agreed as was a general increase in the CPU time limits for interactive work.
File systems filling up on the J90 remained a continuing irritation; CLRC would obtain price estimates for a large increase in disk capacity and would look at alternative strategies for aggressively clearing out 'temporary' files.
The next meeting will be at RAL, provisionally on 23 April 1998.