Central Computer and Telecommunications Agency

A later career review confirmed that John Freebody was promoted to Staff Engineer and set the task of founding the Technical Support Unit.

At that time the telecommunications engineering staff comprised 8 dealing with Systems Evaluation, 6 with Peripheral Equipment and 10 in the areas of Accommodation,[4] Testing and Maintenance.

Details of names, grades, qualifications, salaries and relevant experience can be found in Hansard, Volume 717, debated on Tuesday 27 July 1965.

The contracts also included requirements to run on-site, and sometimes pre-delivery, acceptance trials of a specified format, designed and supervised by engineering staff.

The problem was solved by Roy Longbottom who, at various promotion levels between 1968 and 1982, was responsible for designing and supervising acceptance trials of the larger scientific systems.

The first candidate was an IBM 360 Model 65 at University College London in 1971, followed in 1972 by trials on all mainframes, minicomputers and supercomputers covered by CCTA contracts.

Later that year, the top-end systems tested were the $5 million scalar supercomputers: the CDC 7600 at the University of London Computer Centre and the IBM 360/195 at the UK Meteorological Office.

[7] Detailed analysis of fault returns, hands-on observations during acceptance trials and system appraisal activities led to a deeper understanding of reliability issues, published in a 1972 Radio and Electronic Engineer journal paper titled “Analysis of Computer System Reliability and Maintainability”, with probability considerations.
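The probability considerations behind such reliability analysis can be illustrated with the standard quantities this kind of work rests on; the following is a generic sketch, not the 1972 paper's own model:

```python
import math

def availability(mtbf_hours: float, mttr_hours: float) -> float:
    """Steady-state availability: the fraction of time the system is up,
    given mean time between failures and mean time to repair."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

def survival_probability(mtbf_hours: float, mission_hours: float) -> float:
    """Probability of completing a run of mission_hours with no failure,
    assuming exponentially distributed times between failures."""
    return math.exp(-mission_hours / mtbf_hours)

# Example figures (illustrative only): 200 h MTBF, 4 h mean repair time.
a = availability(200.0, 4.0)
p = survival_probability(200.0, 24.0)
print(f"availability={a:.4f}, P(24 h fault-free)={p:.4f}")
```

The exponential failure model is the usual first approximation in such analyses; real fault-return data often departs from it, which is one reason detailed analysis was worthwhile.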

This included a tour of the production factory and discussions with higher level engineering, design and quality control staff.

The first in what became the standard format was “System Summary Notes” (range 5000 to 6999), starting in 1967 with systems such as early IBM 360 mainframes and the Digital Equipment Corporation PDP 8 minicomputer, and continuing up to the last issue in 1980.

[15] In 1972 Harold Curnow wrote the Whetstone Benchmark in the FORTRAN programming language, based on the work of Brian Wichmann of the National Physical Laboratory.

On achieving 1 MWIPS, the Digital Equipment Corporation VAX-11/780 minicomputer became accepted as the first commercially available 32-bit computer to demonstrate 1 MIPS (Millions of Instructions Per Second),[19] a rating not really appropriate for a benchmark dependent on floating point speed.

This had an impact on the Dhrystone Benchmark, the second accepted general purpose computer performance measurement program, with no floating point calculations.

[20] The latter includes hardware functions for exponential, logarithmic or trigonometric calculations, as used in two of the eight Whetstone Benchmark tests, where these can dominate running time.
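A test of the kind described, where transcendental function calls dominate the running time, might look like the following illustrative sketch (Python stands in for the original FORTRAN/C, and the arithmetic is merely modelled on the Whetstone style, not taken from it):

```python
import math
import time

def trig_test(n: int) -> float:
    """Whetstone-style loop dominated by trigonometric and arctangent
    calls; illustrative only, not the actual benchmark code."""
    x = 0.5
    y = 0.5
    for _ in range(n):
        # Each pass spends nearly all its time in sin/cos/atan,
        # so hardware support for these functions dominates speed.
        x = math.atan(math.sin(x) * math.cos(x) / (math.cos(x + y) + 1.0))
        y = math.atan(math.sin(y) * math.cos(y) / (math.cos(x + y) + 1.0))
    return x + y

start = time.perf_counter()
result = trig_test(100_000)
elapsed = time.perf_counter() - start
print(f"result={result:.6f}, elapsed={elapsed:.3f}s")
```

On a machine with hardware transcendental functions such a loop runs many times faster than with software library implementations, which is exactly why those two of the eight Whetstone tests could dominate running time.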

The new versions, in the C programming language, included the new CCTA automatic calibration function to run for a specified finite time, still applicable 50 years later.
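The automatic calibration idea can be sketched as follows; this is an assumed reconstruction in Python, not CCTA's actual code. The pass count is doubled until the workload runs for at least the requested time, and the speed is then reported from the final measurement:

```python
import time

def calibrate_and_run(workload, target_seconds: float = 1.0) -> float:
    """Run workload(passes) with a doubling pass count until one run
    lasts at least target_seconds, then return passes per second.
    Illustrative reconstruction of the calibration idea, not CCTA code."""
    passes = 1
    while True:
        start = time.perf_counter()
        workload(passes)
        elapsed = time.perf_counter() - start
        if elapsed >= target_seconds:
            return passes / elapsed
        # Too quick to time reliably: double the pass count and retry.
        passes *= 2

def simple_workload(n: int) -> float:
    """A stand-in compute loop (hypothetical, for demonstration)."""
    s = 0.0
    for i in range(1, n + 1):
        s += 1.0 / i
    return s

rate = calibrate_and_run(simple_workload, target_seconds=0.1)
print(f"{rate:.0f} passes/second")
```

The appeal of this scheme is that the same source runs for a sensible, bounded time on hardware spanning many orders of magnitude in speed, which is why it remains applicable decades later.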

[24] He became a member of the Technical Subgroup of the National Policy Committee on Advanced Research Computers and the Universities’ Benchmark Options Group.

The latter involved leading a party to the USA, including discussions with Jack Dongarra and Frank McMahon, authors of Linpack and the Livermore Loops respectively, key benchmarks of the day for scientific applications.

In 1992, the Science and Engineering Research Council requested CCTA to provide independent observation and reporting on benchmarking a new supercomputer for University of London Computer Centre, comprising a large sample of typical user applications.

The aforementioned performance consultancy covered more than 45 projects between 1990 and 1993, mainly for data processing applications, with systems from 18 manufacturers, including mainframes, minicomputers and PCs.

[26] The next, on Database System Benchmarks and Performance Testing, was presented at a Conference on Parallel Processors at NPL in 1992, providing a warning of the dangers for the supercomputer community, and was published in a later book.

[27] Finally, a new approach to performance management was suggested, based on the assumption that initial sizing estimates would be incorrect and that actions should be considered at each stage of procurement; it was presented at the UKCMG Conference, Brighton, in 1992.

In this case, the reasons were identified by measuring CPU, input/output, communications and memory utilisation of a number of transactions, using the UNIX SAR performance monitor.
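Measurements of this kind can still be made with the sar utility; the flags below are from the modern sysstat implementation and may differ in detail from the UNIX variant used at the time:

```shell
# Sample utilisation at 1-second intervals, 5 samples per report.
sar -u 1 5        # CPU utilisation (%user, %system, %iowait, %idle)
sar -b 1 5        # I/O and transfer rates
sar -r 1 5        # memory utilisation
sar -n DEV 1 5    # network device (communications) statistics
```

Running such samples while a transaction executes, as described above, attributes its cost to CPU, input/output, communications or memory.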

[1] The benchmark comprised six real representative programs with disk and magnetic tape input/output, covering updates, sorting, compiling and multi-stream operation, measuring CPU and elapsed times and the number of data transfers.

These were freely available and produced in conjunction with the CompuServe Benchmarks and Standards Forum (see Wayback Machine Archive),[30] covering PC hardware from 1997 to 2008.

The report includes the following comparison with the first version of the Raspberry Pi computer based on average Livermore Loop speeds, as this benchmark was used to verify performance of the first Cray 1.

That quotation was reproduced in numerous Internet posts, some including a reference to the author having worked for “the UK Government Central Computer Agency”, as quoted in the report.

In December 1998, the DfEE moved its server from CCTA at Norwich to NISS (National Information Services and Systems) in Bath when it relaunched its website.

[41] Between 1989 and 1992, CCTA's "Strategic Programmes" Division undertook research on exploiting Information Systems as a medium for improving the relationship between citizens, businesses and government.

Two major TV documentaries were produced by CCTA – "Information and the Citizen" and "Hymns Ancient and Modern" which explored the business and political issues associated with what was to become "e-government".

These were aimed at widening the understanding of senior civil servants (the Whitehall Mandarins) of the significant impact of the "Information Age" and identifying wider social and economic issues likely to arise from e-government.