In addition to its 3D capabilities, the Vérité contained an IBM VGA-compatible display controller and served as a traditional 2D/GUI accelerator under the Windows operating system.
In the book Masters of Doom, Carmack cited bad experiences with programming the Vérité as the reason for id's shift away from proprietary APIs toward the industry-standard OpenGL.
At least four companies sold Vérité-based boards: the Creative Labs 3D Blaster PCI, the Sierra Screamin' 3D, the Canopus Total 3D, and the Intergraph Reactor (later renamed Intense 3D 100).
While the ATI Rage, S3 ViRGE, and Matrox Mystique delivered 3D acceleration of questionable benefit, id Software's vQuake and Eidos's Tomb Raider were influential in fueling consumer interest in 3D rendering hardware.
The Vérité (and Voodoo) ports added 16-bit color rendering, bilinear filtering, per-polygon MIP mapping, and edge anti-aliasing to the game's 3D visuals.
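Of the features listed above, bilinear filtering is the easiest to illustrate: instead of using the single nearest texel (which produces blocky textures under software rendering), the hardware blends the four nearest texels by their fractional distances. The following is a minimal grayscale sketch of the technique in Python, not actual Vérité code; the function name and texture layout are illustrative assumptions.

```python
def bilinear_sample(texture, u, v):
    """Sample a 2D texture (list of rows of grayscale values) at
    fractional texel coordinates (u, v) using bilinear filtering.

    This is an illustrative sketch of the general technique, not
    the V1000's hardware implementation."""
    h, w = len(texture), len(texture[0])
    # Integer texel coordinates of the top-left neighbor, clamped at edges.
    x0, y0 = int(u), int(v)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    # Fractional position within the 2x2 texel neighborhood.
    fx, fy = u - x0, v - y0
    # Blend horizontally along the top and bottom rows, then vertically.
    top = texture[y0][x0] * (1 - fx) + texture[y0][x1] * fx
    bottom = texture[y1][x0] * (1 - fx) + texture[y1][x1] * fx
    return top * (1 - fy) + bottom * fy


tex = [[0, 10],
       [20, 30]]
print(bilinear_sample(tex, 0.5, 0.5))  # center of the 2x2 block: 15.0
```

Sampling exactly at a texel coordinate returns that texel unchanged; sampling between texels returns a distance-weighted average, which is what smooths the blocky appearance of magnified textures.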
Released in time for Christmas 1996, both vQuake and Tomb Raider demonstrated the V1000's 3D hardware to be both faster and better-looking than software rendering on even the most powerful host CPU.
At the other extreme, in standard MCGA/VGA resolutions or "Mode X", the V1000's performance was embarrassingly slow; older MS-DOS games (such as Doom) ran at near-slideshow speeds, even on a top-of-the-line host CPU (Pentium 166 MHz).
In VESA VBE 2.0 display modes, the Vérité's speed was outstanding, comparable to other top-rated cards of the era (such as the Matrox and ARK Logic PCI VGA chipsets).
The V2100 only saw implementation on one board, the Diamond Multimedia Stealth II S220 PCI with 4 MB SGRAM, which was offered at $100 initially but quickly dropped to $50 due to limited demand.
Rendition and Hercules were at one point cooperating on a "Thriller Conspiracy" project which combined a Fujitsu FXG-1 "Pinolite" geometry processor with a V2200 core to create a graphics card with a full transform and lighting (T&L) engine years before Nvidia's GeForce 256 or ATI's Radeon.
Although this embedded memory design was later used in Micron's AMD Athlon chipset codenamed "Mamba", the actual graphics chip never surfaced.
Micron previewed specifications for the "SuperChip2" motherboard chipset.[3]

Rendition built a thorough list of supported games by encouraging developers large and small to make use of their free APIs.
[citation needed] Rendition used standard-cell libraries developed by SiArch (licensed through Synopsys at that time) for digital logic synthesis, in which sophisticated software automatically maps the chip design onto the library's gates and simulates the resulting circuit.
Chips are verified in simulation before manufacture ("SPICEd", after the SPICE circuit simulator), because faults are extremely difficult to detect in finished microchips at modern trace widths, even with highly accurate instrumentation.
In the Vérité's case, the result was excessive resistance combined with a weak bus-hold cell, which ate into the allowable noise margin and violated the static discipline required of good digital logic design.
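The static discipline mentioned here is the requirement that a logic family's output levels leave positive noise margins against its input thresholds. A minimal sketch of that arithmetic, using hypothetical TTL-like threshold values in millivolts (not Rendition's actual figures):

```python
def noise_margins(vol, voh, vil, vih):
    """Compute the static-discipline noise margins of a logic family.

    vol: worst-case output low voltage   voh: worst-case output high voltage
    vil: maximum input voltage read low  vih: minimum input voltage read high
    (all in millivolts here).

    NM_L = VIL - VOL and NM_H = VOH - VIH; both must be positive, or
    small disturbances (e.g. extra series resistance from a weak
    driver) can flip a logic level intermittently.
    """
    nm_low = vil - vol
    nm_high = voh - vih
    return nm_low, nm_high


# Hypothetical TTL-like levels in mV: VOL=400, VOH=2400, VIL=800, VIH=2000.
nm_low, nm_high = noise_margins(400, 2400, 800, 2000)
print(nm_low, nm_high)  # 400 400 -> 0.4 V of margin on each side
```

Excess resistance in a weak driver raises the effective VOL (or lowers VOH) at the far end of a net, shrinking one of these margins toward zero; that is exactly the kind of fault that shows up only intermittently on real silicon, as described below.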
[citation needed] These faults manifested as an intermittent bug that appeared in the lab on real silicon but not in high-level functional, RTL, or gate-level simulations.
The root cause was only determined after months of investigation, simulations, and test case development in the lab, which narrowed the problem to a very confined space.
At that point, the chip was run live under a scanning electron microscope using the oscilloscope probe mode to find the problem net between the NOR3 gate and the scan-flop.
The company was eventually purchased by Micron, who kept the development team intact as a source of embedded graphics solutions for their own line of motherboards.
Rendition's engineers were initially excited by the prospect of utilizing Micron's embedded DRAM technology for a high-end graphics processor, but such a product never surfaced commercially.