Video standards associated with IBM-PC-descended personal computers are shown in the diagram and table below, alongside those of early Macintosh and other makes for comparison.
Over the course of the early-to-mid-1990s, "SVGA" became a quasi-standard term in PC games, typically referring to a 640×480 resolution with 256 colours (8 bpp) at a 60 Hz refresh rate.
Later, larger monitors (15" and 16") allowed use of an SVGA-like 832×624 resolution (at 75 Hz), a "binary" half-megapixel mode whose pixel count falls just short of 2^19, that was eventually used as the default setting for the original, late-1990s iMac.
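As a rough check on these figures, a minimal Python sketch (illustrative only; the names used are not from any standard) works out the framebuffer and pixel-count arithmetic for the two modes just mentioned:

```python
# Pixel-count arithmetic for the modes discussed above.

def pixels(width: int, height: int) -> int:
    """Total pixel count of a display mode."""
    return width * height

# "SVGA" as loosely used in mid-1990s PC games: 640x480 with 256
# colours (8 bpp) needs one byte per pixel of framebuffer.
fb_bytes = pixels(640, 480) * 1  # 8 bpp = 1 byte per pixel
print(f"640x480 @ 8 bpp: {fb_bytes} bytes ({fb_bytes // 1024} KiB)")

# 832x624 is a "binary" half-megapixel: its pixel count falls just
# short of 2**19, i.e. half of a binary megapixel (2**20).
print(f"832x624: {pixels(832, 624):,} pixels vs 2**19 = {2**19:,}")
```

This prints 307,200 bytes (300 KiB) for the 8 bpp framebuffer, and 519,168 pixels against 2^19 = 524,288, within about 1% of the binary half-megapixel.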
The high-resolution 1024×768 mode introduced with the 8514/A became a de facto general standard in a succession of computing and digital-media fields for more than two decades, arguably more so than SVGA. It persisted through successive IBM and clone video cards and CRT monitors (a multisync monitor's grade being broadly determinable by whether it could display 1024×768 at all, or show it interlaced, non-interlaced, or "flicker-free"), LCD panels (the standard resolution for 14" and 15" 4:3 desktop monitors and a whole generation of 11–15" laptops), early plasma and HD ready LCD televisions (albeit stretched to a 16:9 aspect ratio and showing down-scaled material), professional video projectors, and most recently, tablet computers.
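The "stretched" presentation on early 16:9 plasma and LCD sets follows directly from the arithmetic: a 1024×768 grid has a 4:3 storage aspect ratio, so filling a 16:9 screen with it implies non-square pixels. A minimal sketch of that calculation, assuming the grid is simply scaled to fill the screen:

```python
from fractions import Fraction

# A 1024x768 panel has a 4:3 pixel grid; early 16:9 plasma/LCD TVs
# stretched that grid across the full screen width, so each pixel
# was displayed wider than it was tall.
storage_aspect = Fraction(1024, 768)           # reduces to 4:3
display_aspect = Fraction(16, 9)
pixel_aspect = display_aspect / storage_aspect
print(pixel_aspect)  # 4/3: each pixel shown ~1.33x wider than tall
```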
Although the common standard prefixes super and ultra do not indicate specific modifiers to base standard resolutions, several others do: quarter (q) and quad (Q) respectively divide and multiply the pixel count by four (halving or doubling each dimension), hex(adecatuple) (H) multiplies it by sixteen, and wide (W) extends the width to give a wider aspect ratio. These prefixes are also often combined, as in WQXGA or WHUXGA, with levels of stacking not hindered by the same consideration towards readability that led to the decline of the added "X", especially as there is not even a defined hierarchy or value for the S/X/U/+ modifiers.
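To make the stacking concrete, here is a minimal Python sketch under the assumed, informal rules above (Q doubles both dimensions, H doubles them twice); because the wide (W) step is irregular across families, the 16:10 wide bases are given directly rather than derived:

```python
# Illustrative sketch of prefix stacking (assumed rules, not a
# formal standard): Q ("quad") doubles both dimensions (4x the
# pixels); H ("hexadecatuple") doubles them twice (16x the pixels).
BASES = {
    "XGA":   (1024, 768),   # 4:3
    "WXGA":  (1280, 800),   # 16:10 widening of XGA
    "UXGA":  (1600, 1200),  # 4:3
    "WUXGA": (1920, 1200),  # 16:10 widening of UXGA
}

def quad(w: int, h: int) -> tuple[int, int]:
    """Q: double each dimension, quadrupling the pixel count."""
    return (2 * w, 2 * h)

def hexa(w: int, h: int) -> tuple[int, int]:
    """H: quadruple each dimension, sixteenfold pixel count."""
    return (4 * w, 4 * h)

w, h = quad(*BASES["WXGA"])
print(f"WQXGA  = {w}x{h}")   # 2560x1600

w, h = hexa(*BASES["WUXGA"])
print(f"WHUXGA = {w}x{h}")   # 7680x4800
```

Applying the prefixes in a different order, e.g. widening QXGA's 2048×1536 rather than quadrupling WXGA, gives different figures, which illustrates the lack of a defined hierarchy noted above.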