The folks at Tech Report have a surprising take on the GeForceFX 5200. They have a better take on it than I can put into a few words, so here’s the text of the email I was sent:
When NVIDIA announced its NV31 and NV34 graphics chips, I have to admit I was a skeptic. The chips, which would go on to power NVIDIA’s GeForce FX 5600 and 5200 lines, respectively, promised full DirectX 9 features and compatibility to the masses. Who could resist?
Me, at least initially. Perhaps I still had a bitter taste in my mouth after the recycled DirectX 7 debacle that was the GeForce4 MX, or maybe it was NVIDIA’s unwillingness to discuss the internal structure of its graphics chips. Maybe it was merely the fact that I didn’t believe NVIDIA could pull off a budget graphics chip with a full DirectX 9 feature set without cutting corners somewhere.
Or maybe I’m just turning into a grumpy old man.
Well, NVIDIA may have pulled it off. Now that I have Albatron’s Gigi FX5200P graphics card in hand, it’s time to take stock of what kind of sacrifices were made to squeeze the “cinematic computing” experience into just 45 million transistors. Have NVIDIA and Albatron really produced a sub-$100 graphics product capable of running the jaw-dropping Dawn demo and today’s 3D applications with reasonably good frame rates? How does the card stack up against its budget competition?