Descriptors provide essential memory protection, security, and safety, catching all attempts at out-of-bounds access and buffer overflow.
From 1959 to 1961, W.R. Lonergan was manager of the Burroughs Product Planning Group, which included Barton, Donald Knuth as a consultant, and Paul King.
According to Lonergan, in May 1960, UCLA ran a two-week seminar ‘Using and Exploiting Giant Computers’ to which Paul King and two others were sent.
Paul King took the ideas back to Burroughs and it was determined that virtual memory should be designed into the core of the B5000.[6]
In the mid-1960s, other vendors, including RCA and General Electric, developed machines that supported virtual memory, with operating systems oriented towards time-sharing.
IBM did not support virtual memory in machines intended for general data processing until 1972.
While the details have changed since 1964, the idea of the descriptor remains essentially the same up to the current Unisys ClearPath MCP (Libra) machines, which are direct descendants of the B5000.
This is the main weakness of Pascal as defined in its standard, but it has been removed in many commercial implementations of Pascal, notably the Burroughs implementations (both the University of Tasmania version by Arthur Sale and Roy Freak, and the later implementation on the Burroughs Slice compiler system by Matt Miller et al.). In a program in the Burroughs/Unisys MCP environment, an array is not allocated when it is declared, but only when it is touched at run time for the first time; thus arrays can be declared, and the overhead of allocating them avoided, if they are never used.
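A minimal C sketch of this deferred allocation, purely as a software analogue: the `lazy_array` type and `lazy_touch` function are invented for illustration, since on MCP machines the deferral happens in the descriptor mechanism itself, with no such code appearing in the program.

```c
#include <stdio.h>
#include <stdlib.h>

/* Hypothetical software analogue of MCP deferred array allocation:
   the backing store is committed only on first touch. */
struct lazy_array {
    size_t length;   /* declared bound, known at declaration time */
    int   *data;     /* NULL until the array is first touched     */
};

int *lazy_touch(struct lazy_array *a, size_t i)
{
    if (a->data == NULL)                          /* first touch  */
        a->data = calloc(a->length, sizeof(int)); /* allocate now */
    if (a->data == NULL || i >= a->length)
        return NULL;               /* no memory, or out of bounds */
    return &a->data[i];
}

int main(void)
{
    struct lazy_array a = { 1000000, NULL }; /* declared, not yet allocated */
    int *p = lazy_touch(&a, 42);             /* allocation happens here     */
    if (p != NULL) {
        *p = 7;
        printf("%d\n", *p);
    }
    free(a.data);
    return 0;
}
```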
Deferring allocation in this way spares the programmer the great burden of filling programs with error-prone memory-management code, which is crucial in mainframe applications.
In fact, many buffer overruns in apparently otherwise correctly running C programs have been caught when those programs were ported to the Unisys MCP architecture.[8]
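The following illustrative fragment shows the kind of latent defect involved (it is not drawn from any particular ported program): on conventional hardware the off-by-one write usually lands silently in adjacent memory and the program appears correct, whereas descriptor bounds checking traps the first out-of-range index.

```c
#include <stdio.h>

int main(void)
{
    int buf[8];

    /* Off-by-one: the loop iterates 9 times over an 8-element array.
       On most conventional machines the write to buf[8] silently
       clobbers a neighbouring memory location and the program seems
       to run correctly; under descriptor-based bounds checking the
       access to buf[8] is trapped at run time. */
    for (int i = 0; i <= 8; i++)   /* should be i < 8 */
        buf[i] = i;

    printf("%d\n", buf[0]);
    return 0;
}
```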
C, like Pascal, is also implemented using the Slice compiler system, which uses a common code generator and optimizer for all languages.
Such object allocation is higher level than C's malloc and is best implemented with a modern efficient garbage collector.
With the advent of the A Series in the early 1980s, the meaning of this field was changed to contain the address of a master descriptor, which meant that 1-megabyte data blocks could still be allocated, but that machine memory could be greatly expanded to gigabytes or perhaps terabytes.
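The indirection can be pictured as a two-level lookup, sketched below in schematic C; the structure layouts and names are invented for illustration and do not reproduce the actual A Series word formats.

```c
#include <stddef.h>

/* Schematic two-level descriptor lookup.  On the A Series a copy
   descriptor holds the address of a master descriptor rather than
   the memory address itself; the master holds the block's actual
   location.  These layouts are illustrative only. */
struct master_descriptor {
    void  *base;     /* actual location of the data block */
    size_t length;   /* block length in elements          */
};

struct copy_descriptor {
    struct master_descriptor *master;  /* address of the master */
};

/* Resolve element i through both levels, bounds-checked as the
   hardware would check it. */
void *resolve(const struct copy_descriptor *c, size_t i, size_t elem_size)
{
    const struct master_descriptor *m = c->master;
    if (i >= m->length)
        return NULL;               /* bounds violation: fault */
    return (char *)m->base + i * elem_size;
}
```

One consequence of the indirection is that relocating or paging a block requires updating only the master descriptor, however many copy descriptors refer to it.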
Reloading blocks from virtual memory on disk can significantly degrade system performance and is not the fault of any specific task.
On B5000 machines, 'other p-bits' indicate a system problem, which can be solved either by better balancing the computing load across the day or by adding more memory.
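A schematic of the presence-bit mechanism behind these interrupts: the structures and the `load_from_disk` stub below are hypothetical stand-ins for what the hardware and the MCP actually do.

```c
#include <stdbool.h>
#include <stdio.h>
#include <stdlib.h>

/* Schematic presence-bit (p-bit) handling.  A descriptor whose
   p-bit is off refers to a block that is not in main memory;
   touching it raises a p-bit interrupt and the operating system
   brings the block in.  All names here are illustrative. */
struct descriptor {
    bool   present;   /* the p-bit                        */
    void  *base;      /* valid only while present is true */
    size_t length;    /* block length in bytes            */
};

/* Stub standing in for the operating system's disk-read service. */
static void *load_from_disk(struct descriptor *d)
{
    return calloc(d->length, 1);   /* pretend the block was read */
}

/* Every access goes through the descriptor; an absent block is
   reloaded transparently, at the performance cost noted above. */
void *access_block(struct descriptor *d)
{
    if (!d->present) {                    /* p-bit interrupt path */
        d->base    = load_from_disk(d);   /* reload from disk     */
        d->present = (d->base != NULL);
    }
    return d->base;                       /* normal access resumes */
}

int main(void)
{
    struct descriptor d = { false, NULL, 4096 };
    if (access_block(&d) != NULL)         /* triggers the reload path */
        puts("block made present");
    free(d.base);
    return 0;
}
```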
The last and perhaps most important point to note about descriptors is how they affect the complementary notions of system security and program correctness.
C, in particular, marks the end of strings in the most primitive and error-prone way, using a null byte as an end-of-string sentinel in the data stream itself.
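The hazard is easy to demonstrate. In the deliberately broken fragment below, strlen scans memory for a zero byte that is not inside the array, so it reads past the end of the buffer; on a descriptor machine the scan would fault at the segment boundary rather than wander into adjacent data.

```c
#include <stdio.h>
#include <string.h>

int main(void)
{
    /* Exactly filled: no room is left for the '\0' terminator. */
    char name[5] = { 'B', '5', '0', '0', '0' };

    /* strlen scans for a zero byte that is not inside name[], so
       it reads past the end of the array.  On most hardware this
       is silent undefined behaviour; under descriptor bounds
       checking the first read beyond the segment is trapped. */
    printf("%zu\n", strlen(name));   /* undefined result */
    return 0;
}
```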
During indexing operations, pointers are checked at each increment to make sure that neither the source nor the destination block is out of bounds.
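In software terms the check amounts to validating both pointers against their descriptors' bounds on every step, roughly as in the sketch below; the `bounded_ptr` type is invented for illustration, since the real checks are performed by the hardware's indexing operators, not by generated code like this.

```c
#include <stddef.h>

/* Illustrative bounded pointer: the block's limit travels with the
   pointer, mimicking a descriptor, and every increment is validated
   the way the hardware validates indexing operations. */
struct bounded_ptr {
    char  *base;     /* start of the block        */
    size_t limit;    /* number of valid bytes     */
    size_t offset;   /* current position in block */
};

/* Copy n bytes, checking source and destination at each increment;
   returns 0 on success, -1 the moment either block would be
   overrun (where the hardware would raise a fault). */
int bp_copy(struct bounded_ptr *dst, struct bounded_ptr *src, size_t n)
{
    for (size_t i = 0; i < n; i++) {
        if (src->offset >= src->limit || dst->offset >= dst->limit)
            return -1;
        dst->base[dst->offset++] = src->base[src->offset++];
    }
    return 0;
}
```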
Each memory segment containing integrity-sensitive data, such as program code, is stored in tag 3 words, making an uncontrolled read, let alone modification, impossible.
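Tag checking can be rendered schematically in C as below; the layout is a simplification, since real tags are extra hardware bits carried with every word that ordinary instructions can neither read nor write as data, which is what makes the protection unforgeable.

```c
#include <stdint.h>
#include <stdio.h>

/* Schematic tagged word.  Hardware stores a small tag with every
   word; tag 3 marks code and other integrity-sensitive words.
   This layout is illustrative: real tags live outside the word's
   data bits and cannot be forged by software. */
enum { TAG_DATA = 0, TAG_CODE = 3 };

struct tagged_word {
    unsigned tag;
    uint64_t value;
};

/* Operand fetch: only plain data words may be read as operands;
   touching a tag-3 word faults instead of yielding its contents. */
int read_operand(const struct tagged_word *w, uint64_t *out)
{
    if (w->tag != TAG_DATA)
        return -1;                /* uncontrolled read: fault */
    *out = w->value;
    return 0;
}

int main(void)
{
    struct tagged_word code = { TAG_CODE, 0x123456 };
    uint64_t v;
    if (read_operand(&code, &v) != 0)
        puts("fault: operand access to a code word refused");
    return 0;
}
```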