The phase instruction allows a shader program to operate in two separate "phases" (two passes through the hardware), effectively doubling the maximum number of texture addressing and arithmetic instructions and potentially reducing the number of rendering passes an effect requires.
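A minimal sketch of a Direct3D 8.1 ps.1.4 pixel shader using the phase marker follows; the register choices and the math are illustrative, not taken from the source. Everything before "phase" runs in the first pass; the second pass gets a fresh instruction budget and can re-read a texture using a value computed in the first pass (a dependent read):

```cpp
// Illustrative ps.1.4 shader split into two phases.
const char *kPs14PhaseSketch =
    "ps.1.4\n"
    "texld  r0, t0\n"      // phase 1: sample texture stage 0
    "mul    r0, r0, c0\n"  // phase 1 arithmetic: scale by constant c0
    "phase\n"              // second pass through the pipeline begins here
    "texld  r1, r0\n"      // phase 2: dependent read, coords taken from r0
    "mul    r0, r0, r1\n"; // phase 2 arithmetic; r0 holds the final color
```

Such a string would typically be assembled with D3DXAssembleShader and bound via IDirect3DDevice8::CreatePixelShader and SetPixelShader from the DirectX 8 SDK.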
Nvidia's GeForce4 Ti series offered a more accurate anisotropic filtering implementation, but with a greater performance impact.
The chip also introduced hardware support for tessellation of N-patches (higher-order surfaces), called Truform, which can automatically increase the geometric complexity of 3D models.
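As a rough illustration (assuming an initialized IDirect3DDevice8 from the DirectX 8 SDK; the helper function name and segment count here are hypothetical), Truform's N-patch tessellation is exposed through the D3DRS_PATCHSEGMENTS render state:

```cpp
#include <d3d8.h>

// Hypothetical helper: ask the driver to tessellate N-patches (Truform).
void EnableTruform(IDirect3DDevice8 *device)
{
    float segments = 4.0f;  // illustrative subdivision level; 1.0 disables it
    // Render states take DWORD values, so the float is passed bit-for-bit.
    device->SetRenderState(D3DRS_PATCHSEGMENTS, *(DWORD *)&segments);
}
```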
At launch, the card's performance was below expectations and it had numerous software flaws that caused problems with games.
For example, ATI's drivers were found to be detecting the executable "Quake3.exe" and forcing the texture filtering quality to a much lower level than the card normally produced, presumably to improve performance.
[1] HardOCP was the first hardware review web site to bring the issue to the community, and proved its existence by renaming all instances of "Quake" in the executable to "Quack."[2]
However, even with the Detonator4 drivers, the Radeon 8500 was able to outperform the original GeForce3 (which the 8500 was intended to compete against) and, in some circumstances, its faster revision, the GeForce3 Ti500, the higher-clocked derivative Nvidia had rolled out in response to the R200 project.
Later, driver updates helped to further close the performance gap between the 8500 and the Ti500, while the 8500 was also significantly less expensive and offered additional multimedia features such as dual-monitor support.
The Radeon 8500 LE became popular with OEMs and enthusiasts due to its relative affordability, and overclockability to 8500 levels.
[4] Though the GeForce4 Ti4600 took the performance crown, it was a top-of-the-line solution priced at almost double the Radeon 8500 (MSRP of US$350–399 versus US$199), so it did not compete with it directly.
The delayed release of the potentially competitive GeForce4 Ti4200, plus ATI's initiative in rolling out 128 MB versions of the 8500/LE, kept the R200 line popular in the mid-to-high performance segment of the market.
[5] Over the years, the dominant market position of the GeForce 3/4 meant that few games targeted the superior DX8.1 PS 1.4 feature level of the R200, but those that did could see significant performance gains over DX8, as certain operations could be processed in a single pass instead of several.
November 2001 saw the release of the All-In-Wonder Radeon 8500 DV, with 64 MB of memory and, like the 8500 LE, a lower clock speed.
Notably, overclockers found that the Radeon 8500 and Radeon 9000 could not reliably overclock to 300 MHz without additional voltage; the more complex R250, built on the same manufacturing process, would almost certainly have had similar issues, resulting in poor chip yields and thus higher costs.
[8][9] ATI, perhaps mindful of what had happened to 3dfx when they took focus off their "Rampage" processor, abandoned the R250 refresh in favor of finishing off their next-generation DirectX 9.0 card, which was released as the Radeon 9700.
This proved to be a wise move, as it enabled ATI to take the lead in development for the first time instead of trailing NVIDIA.
[11][12] Nvidia's response was the GeForce4 4200 Go (NV28M), launched in late 2002, which offered the same feature set and similar performance as the desktop GeForce4 Ti4200 (NV28); it was thus DirectX 8 compliant while being significantly faster than the Mobility Radeon 9000.
However, aside from a reduced clock speed, the GeForce4 4200 Go had thermal output similar to its desktop counterpart and lacked the power-saving circuitry of the MX-based GeForce4 4x0 Go series and the Mobility Radeon 9000, making it unpopular with laptop OEMs.
The R200 series of Radeon graphics cards is supported by the Amiga operating system, Release 4 and higher.