The VGA analog interface standard has been extended to support resolutions of up to 2048 × 1536 for general usage, with specialized applications pushing resolutions higher still.
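To see why analog bandwidth becomes the limiting factor at such resolutions, a rough back-of-envelope estimate of the required pixel clock helps. The sketch below is illustrative only: the 25% blanking overhead is an assumption, and real VESA GTF/CVT timings differ.

```c
#include <stdio.h>

/* Rough pixel-clock estimate: active pixels per frame times refresh
   rate, inflated by an assumed blanking overhead (real GTF/CVT
   timings vary). */
int main(void) {
    const double width = 2048.0, height = 1536.0, refresh_hz = 60.0;
    const double blanking_overhead = 1.25;  /* assumption */
    double pixel_clock_mhz =
        width * height * refresh_hz * blanking_overhead / 1e6;
    printf("%.0fx%.0f @ %.0f Hz needs a pixel clock of roughly %.0f MHz\n",
           width, height, refresh_hz, pixel_clock_mhz);
    return 0;
}
```

Under these assumptions the pixel clock comes out near 236 MHz, which helps explain why resolutions in this range represent a practical ceiling for the analog interface.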
The term "array" rather than "adapter" in the name denoted that it was not a complete independent expansion device, but a single component that could be integrated into a system.
[11] Unlike the graphics adapters that preceded it (MDA, CGA, EGA and many third-party options) there was initially no discrete VGA card released by IBM.
The first commercial implementation of VGA was a built-in component of the IBM PS/2, in which it was accompanied by 256 KB of video RAM and a new DE-15 connector that replaced the DE-9 used by previous graphics adapters.
As the VGA began to be cloned in great quantities by manufacturers who added ever-increasing capabilities, its 640 × 480, 16-color mode became the de facto lowest common denominator of graphics cards.
Third-party "multisync" CRT monitors were more flexible, and in combination with "super EGA", VGA, and later SVGA graphics cards using extended modes, could display a much wider range of resolutions and refresh rates at arbitrary sync frequencies and pixel clock rates.
This 640 × 480 mode remains an option in Windows XP and later versions[citation needed] via the boot menu's "low resolution video" option and per-application compatibility-mode settings, even though newer versions of Windows default to 1024 × 768 and generally do not allow any resolution below 800 × 600 to be set.
The need for such a low-quality, universally compatible fallback has diminished since the turn of the millennium, as screens and adaptors that follow the VGA signalling standard but cannot show anything beyond the original resolutions have become increasingly rare.[citation needed] The standard VGA monitor interface is a 15-pin D-subminiature connector in the "E" shell, variously referred to as "DE-15", "HD-15", and, erroneously, "DB-15(HD)".
Some higher-end monitors and video equipment instead accepted the same VGA signal over five separate BNC connectors (red, green, blue, horizontal sync, and vertical sync). With BNC, the coaxial wires are fully shielded end-to-end and through the interconnect, so that virtually no crosstalk and very little external interference can occur.
The VGA is backward compatible with the EGA and CGA adapters, but supports extra bit depth for the palette when running in these modes.
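As a minimal, hedged illustration of that extra depth: EGA specifies each color channel with 2 bits (four levels), while the VGA DAC accepts 6 bits per channel (64 levels), so an EGA channel level can be expanded by simple scaling. The helper name below is hypothetical.

```c
/* Hypothetical helper: expand an EGA per-channel level (2 bits,
   0-3) to a VGA DAC per-channel value (6 bits, 0-63). */
static unsigned char ega_level_to_vga_dac(unsigned level) {
    return (unsigned char)(level * 63u / 3u);  /* 0, 21, 42, 63 */
}
```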
Typically, these starting segments are 0xA0000 for EGA/VGA graphics modes (64 KB), 0xB0000 for the monochrome text mode (32 KB), and 0xB8000 for the color text mode and CGA-compatible graphics modes (32 KB). A typical VGA card also provides the port-mapped I/O range 0x3B0–0x3DF (0x3B0–0x3BB for monochrome modes, 0x3C0–0x3CF shared, and 0x3D0–0x3DF for color modes). Due to the use of different address mappings for different modes, it is possible to have a monochrome adapter (i.e. MDA or Hercules) and a color adapter such as the VGA, EGA, or CGA installed in the same machine.
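To illustrate how this port-mapped I/O and memory mapping are used in practice, here is a minimal, hedged sketch in C. It assumes a freestanding x86 environment (a DOS program or hobby kernel) where direct port access and a flat view of physical memory are available; the outb() helper is a conventional wrapper around the x86 OUT instruction, not part of any standard library.

```c
#include <stdint.h>

/* Hypothetical port-write helper: a thin wrapper around the x86 OUT
   instruction (GCC inline-assembly syntax). */
static inline void outb(uint16_t port, uint8_t value) {
    __asm__ volatile ("outb %0, %1" : : "a"(value), "Nd"(port));
}

/* Program one entry of the VGA DAC palette through ports 0x3C8
   (write index) and 0x3C9 (data). Each channel is 6 bits (0-63). */
void vga_set_palette_entry(uint8_t index, uint8_t r, uint8_t g, uint8_t b) {
    outb(0x3C8, index);
    outb(0x3C9, r);   /* red, then green, then blue */
    outb(0x3C9, g);
    outb(0x3C9, b);
}

/* In mode 13h (320x200, 256 colors) the framebuffer appears as a
   linear 64 KB window at physical address 0xA0000; with identity-
   mapped memory a pixel write is a single byte store. */
void vga_put_pixel(int x, int y, uint8_t color) {
    volatile uint8_t *fb = (volatile uint8_t *)(uintptr_t)0xA0000;
    fb[y * 320 + x] = color;
}
```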
There is a trade-off of extra complexity and a performance loss in some types of graphics operations, but this is mitigated by other operations becoming faster in certain situations. Software such as Fractint, Xlib and ColoRIX also supported tweaked 256-color modes on standard adaptors, using freely combinable widths of 256, 320, and 360 pixels and heights of 200, 240, and 256 lines (or 400, 480, and 512), extending still further to 384- or 400-pixel columns and 576 or 600 (or 288 or 300) lines.
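These tweaked modes worked by reprogramming individual VGA sequencer and CRTC registers after setting a standard BIOS mode. As a hedged sketch, reusing the assumed outb() helper above and taking the register values from Michael Abrash's widely documented "Mode X" technique, the first step of "unchaining" memory into four separately addressable planes looks roughly like this:

```c
/* Sketch of the classic "unchain" step used by tweaked 256-color
   modes (per Abrash's Mode X). Assumes mode 13h is already active
   and an outb() port-write helper as sketched earlier. */
void vga_unchain(void) {
    /* Sequencer Memory Mode register (index 0x04): clear chain-4 so
       all four 64 KB planes become addressable. */
    outb(0x3C4, 0x04);
    outb(0x3C5, 0x06);

    /* CRTC Underline Location (index 0x14): disable doubleword mode. */
    outb(0x3D4, 0x14);
    outb(0x3D5, 0x00);

    /* CRTC Mode Control (index 0x17): switch to byte addressing. */
    outb(0x3D4, 0x17);
    outb(0x3D5, 0xE3);
}
```

The nonstandard widths and heights then came from further rewrites of the CRTC horizontal and vertical timing registers, which is also what made some of these modes troublesome on fixed-frequency monitors.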
The highest-resolution modes were used only in special, opt-in cases rather than as standard, especially where high line counts were involved. These modes were also outright incompatible with some monitors, producing display problems such as picture detail disappearing into overscan (especially in the horizontal dimension), vertical roll, poor horizontal sync, or even a complete lack of picture, depending on the exact mode attempted.
The development of SVGA was led by NEC, along with other VESA members including ATI Technologies and Western Digital.