DRDRAM was initially expected to become the standard in PC memory, especially after Intel agreed to license the Rambus technology for use with its future chipsets.
However, RDRAM got embroiled in a standards war with an alternative technology—DDR SDRAM—and quickly lost out on grounds of price and, later, performance.
To emphasize the advantages of the DDR technique, this type of RAM was marketed at its effective transfer rate, twice the actual clock rate, since data is transferred on both edges of the clock; the 400 MHz Rambus standard was therefore named PC-800.
This was significantly faster than the previous standard, PC-133 SDRAM, which operated at 133 MHz and delivered 1066 MB/s of bandwidth over a 64-bit bus using a 168-pin DIMM form factor.
Compared to other contemporary standards, Rambus exhibited higher latency, heat output, manufacturing complexity, and cost.
RDRAM includes additional circuitry (such as packet demultiplexers) on each chip, increasing manufacturing complexity compared to SDRAM.
RDRAM was also up to four times more expensive than PC-133 SDRAM due to a combination of higher manufacturing costs and high license fees.
PC-2100 DDR SDRAM, introduced in 2000, operated with a clock rate of 133 MHz (266 MT/s) and delivered 2100 MB/s over a 64-bit bus using a 184-pin DIMM form factor.
In 2002, Intel released the E7205 Granite Bay chipset, which introduced dual-channel DDR support (for a total bandwidth of 4200 MB/s) at a slightly lower latency than competing RDRAM.
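The bandwidth figures quoted for these standards all follow from the same arithmetic: peak bandwidth in MB/s is the transfer rate in MT/s times the bus width in bytes, times the number of channels. A minimal sketch (the function name is illustrative; marketed names round the raw numbers, e.g. 2128 MB/s is sold as PC-2100):

```python
def peak_bandwidth_mb_s(transfer_rate_mt_s, bus_width_bits, channels=1):
    """Peak theoretical bandwidth in MB/s: MT/s x bytes per transfer x channels."""
    return transfer_rate_mt_s * (bus_width_bits // 8) * channels

# PC-133 SDRAM: 133 MHz, one transfer per clock, 64-bit bus
print(peak_bandwidth_mb_s(133, 64))       # 1064; marketed as 1066 (133.33 MHz nominal)

# PC-800 RDRAM: 400 MHz clock, two transfers per clock (800 MT/s), 16-bit channel
print(peak_bandwidth_mb_s(800, 16))       # 1600

# PC-2100 DDR SDRAM: 133 MHz clock, double data rate (266 MT/s), 64-bit bus
print(peak_bandwidth_mb_s(266, 64))       # 2128; marketed as 2100

# E7205 Granite Bay: two DDR266 channels
print(peak_bandwidth_mb_s(266, 64, channels=2))  # 4256; marketed as 4200
```

Note that RDRAM reaches its bandwidth over a much narrower bus clocked far higher, which is exactly the trade-off behind its latency and signaling-complexity drawbacks described above.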
In 2001, benchmarks showed that single-channel DDR266 SDRAM modules could closely match dual-channel 800 MHz RDRAM in everyday applications.
RDRAM allowed the Nintendo 64 to be equipped with a large amount of memory bandwidth at relatively low cost, since its narrow bus simplified the board design.
Because an unpopulated Rambus channel must still be terminated, a Jumper Pak dummy unit is included with the console to fill the memory expansion port when no Expansion Pak is installed.[14] With its high bandwidth, RDRAM offered a potentially faster user experience than competing DRAM technologies.