ATI designed the graphics architecture for Microsoft’s Xbox 360, and for months ahead of its acquisition by AMD, the company’s management boasted that its engineers were experts at designing the type of unified shader architecture envisioned by DirectX 10. Imagine our surprise, then, when the R600 not only hit the market several months after Nvidia’s take on unified architecture, but also turned out unable to compete with Nvidia’s top two GPUs.
AMD, for its part, says that not competing with Nvidia at the high end is all part of its master plan, and that it would rather focus on the “mainstream” graphics market, where most people actually buy new videocards. And so it has positioned the ATI Radeon HD 2900 XT in this PowerColor card to compete with cards based on Nvidia’s GeForce 8800 GTS with 640MB frame buffers. If you believe that, we’ve got some prime real estate in Afghanistan you might be interested in. No, we’re convinced AMD ran into design problems it just could not resolve.
The Radeon HD 2900 XT is indeed faster than Nvidia’s 640MB 8800 GTS (but not the insanely fast 8800 Ultra or the only slightly tamer 8800 GTX). It’s also street-priced about $50 higher—but that’s about what we’d expect from the faster component. What we didn’t expect is a GPU that sucks down nearly as much electrical power as an 8800 Ultra while delivering benchmark results that are about 50 percent lower. (The HD 2900 XT requires both a six-pin and an eight-pin cable connection to your power supply.)
Power consumption isn’t something you think about every day, but with energy prices soaring, you should know that our test rig (see the footnote in our benchmark chart) draws 175 watts from the wall with a single Radeon HD 2900 XT at idle. That number jumped to a shocking 318 watts while benchmarking Quake 4 and increased to a staggering 515 watts when we dropped a second card in our Bad Axe II motherboard for CrossFire testing. A single 8800 Ultra, for the sake of comparison, sucked down 192 watts at idle and 320 watts under load in the same motherboard.
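If you’re wondering what those wall-socket readings mean for your electric bill, the arithmetic is simple. Here’s a minimal sketch using our measured whole-system numbers; the electricity rate and daily duty cycle are assumptions we made up for illustration, so substitute your own.

```python
# Back-of-the-envelope energy math using the whole-system wall draw we
# measured. The rate and hours below are illustrative assumptions only.
RATE_PER_KWH = 0.10        # dollars per kWh -- check your utility bill
GAMING_HOURS_PER_DAY = 3   # assumed duty cycle
IDLE_HOURS_PER_DAY = 5     # assumed duty cycle

def annual_cost(idle_watts: float, load_watts: float) -> float:
    """Yearly electricity cost for a rig that idles and games daily."""
    daily_kwh = (idle_watts * IDLE_HOURS_PER_DAY
                 + load_watts * GAMING_HOURS_PER_DAY) / 1000
    return daily_kwh * 365 * RATE_PER_KWH

# Measured at the wall in our Bad Axe II test rig
print(f"Single HD 2900 XT rig: ${annual_cost(175, 318):.2f} per year")
print(f"Single 8800 Ultra rig: ${annual_cost(192, 320):.2f} per year")
```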
Another glaring problem with the 2900 XT is the absence of any driver support for AMD’s new Unified Video Decoder, which is designed to handle high-definition video decoding in hardware. Without UVD, the 2900 XT must rely on the host CPU for much of that workload. To be fair, none of Nvidia’s 8800-series cards feature that company’s second-generation PureVideo HD engine either (you must step down to the GeForce 8600 to get it), but at least Nvidia has the excuse that its faster designs predate the 8600 by several months.
It’s also telling that the 2900 XT is considerably slower than either the 8800 GTX or the 8800 Ultra, despite having 2.5 times as many stream processors (320 versus 128). This fact, combined with AMD’s FUBAR driver support for UVD and the card’s massive power footprint, strengthens our opinion that the 2900 XT is just not the part AMD intended to ship.
Those foibles aside, this card boasts some impressive architecture, including a true 512-bit memory interface (the best Nvidia can offer is a 384-bit interface, and that narrows to 320 bits on the 8800 GTS this card competes with). The chip also has a built-in programmable tessellation unit—again, based on technology already present in the Xbox 360—but this feature won’t be of much real-world use until it’s exposed in DirectX 10 (or will that be DirectX 11… or 12?). Back in the real world, even though the stream processors in Nvidia’s part are clocked at more than twice the speed of its core, the 8800 GTS couldn’t outrun the 2900 XT: PowerColor’s product delivered single- and dual-card benchmark numbers 15 and 25 percent higher, respectively, than equivalent 8800 GTS configurations could produce.
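To see just how little raw unit counts predict, multiply each part’s shader count by its shader clock from the spec chart at the end of this review. This quick sketch tallies raw unit-cycles only and deliberately ignores how much work each unit retires per clock (the two architectures differ wildly there), which is exactly why the paper ratio and the measured 15-to-25-percent gap diverge so sharply.

```python
# Paper shader throughput: unit count times shader clock, from the spec
# chart. This counts raw unit-cycles only and ignores per-unit work per
# clock, which differs wildly between these two architectures.
hd2900xt = 320 * 743e6   # 320 stream processors at 743MHz
gts640 = 96 * 1.1e9      # 96 stream processors at 1.1GHz

print(f"HD 2900 XT:  {hd2900xt / 1e9:.1f} billion unit-cycles/sec")
print(f"8800 GTS:    {gts640 / 1e9:.1f} billion unit-cycles/sec")
print(f"Paper ratio: {hd2900xt / gts640:.2f}x")  # ~2.25x on paper...
# ...versus a measured 15 to 25 percent advantage in our benchmarks.
```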
The 2900 XT, of course, supports CrossFire—AMD’s technology for operating two videocards in a single PC. And as with the latest spins of the X1000 series, the master/slave concept has been eliminated: All HD 2000 series GPUs have a compositing chip baked right into the silicon. AMD has also jettisoned the external cables that previous-gen CrossFire cards used for communication—replacing them with simple ribbon cables that fit inside the case. As with Nvidia’s SLI technology, however, you can operate only one display while in dual-videocard mode.
Two of PowerColor’s HD 2900 XT cards running in CrossFire are indeed faster than a single 8800 Ultra, but a pair of them will cost you slightly more than a single Ultra. And if you swing Nvidia’s way, you can always drop in a second Ultra for even more insane performance (see our review of the XFX 8800 Ultra XXX Edition).
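Curious how well that second card scales? Dividing our dual-card scores by our single-card scores (numbers taken straight from the benchmark chart below) gives a quick read; 2.0x would be perfect scaling.

```python
# CrossFire scaling from our benchmark chart: dual-card fps divided by
# single-card fps; 2.0x would be perfect scaling.
results = {  # game: (single-card fps, CrossFire fps)
    "3DMark06 Game 1":   (21.5, 41.9),
    "3DMark06 Game 2":   (20.6, 41.9),
    "Quake 4":           (75.7, 127.8),
    "FEAR":              (63.0, 111.0),
    "Supreme Commander": (28.8, 38.9),
}
for game, (single, dual) in results.items():
    print(f"{game:<18} {dual / single:.2f}x scaling")
```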
It would be easy to dismiss this card as a whiff, but it’s really not a bad product, and it’ll be a whole lot better if AMD can unlock its UVD circuits.

Need to capture analog video? The HD 2900 XT is outfitted with VIVO (video-in/video-out), a feature Nvidia doesn't offer in its better cards.
www.powercolor.com

There are some good bones in this architecture.

The ratio between power draw and performance is way out of whack; no driver support for HD-video decode in hardware.
SPECS
                         | Radeon HD 2900 XT | GeForce 8800 GTS
No. of Shader Units      | 320               | 96
Core Clock Speed         | 743MHz            | 500MHz
Shader-Unit Clock Speed  | 743MHz            | 1.1GHz
Frame Buffer             | 512MB             | 640MB
Memory Speed             | 828MHz            | 800MHz
Memory Interface         | 512-bit           | 320-bit
BENCHMARKS
                          | GeForce 8800 GTS | PowerColor HD 2900 XT | HD 2900 XT CrossFire
3DMark06 Game 1 (fps)     | 20.8             | 21.5                  | 41.9
3DMark06 Game 2 (fps)     | 19.6             | 20.6                  | 41.9
Quake 4 (fps)             | 65.6             | 75.7                  | 127.8
FEAR (fps)                | 51.0             | 63.0                  | 111.0
Supreme Commander (fps)   | 24.3             | 28.8                  | 38.9
The PowerColor card posted the best single-card score in every test. Cards were installed in an Intel D975XBX2 motherboard with a 2.93GHz Intel Core 2 Extreme X6800 CPU and 2GB of Corsair DDR2 RAM.
