XFX Geforce GTX 260 Review

Hello people.

We have a treat for you today. We are taking a look at the XFX Geforce GTX 260, one of the two GT200 series cards recently launched by NVIDIA. It's the cheaper of the two GT200 cards and goes head to head against ATI's Radeon HD4870 in the market today.

This is very significant to the enthusiast market. NVIDIA has a long history of releasing a card that is almost as good as its flagship but costs significantly less. The "GT" variant was that card in the past. This time around, NVIDIA has decided to drop its usual naming and go with a different scheme. They had run out of 4-digit numbers anyway with the 9800 series ;)

So this scaled-down version is now called the GTX 260, a slightly crippled sibling of the big brother GTX 280.

Today we are going to put this card through some rigorous game testing and IQ testing as well. So let's have a look.
box1.jpg

[BREAK=The GT 200]

GT 200 Architecture

NVIDIA is coming off a successful year in which it totally dominated the GPU market, outperforming ATI's graphics cards. The real success story is the G92 core. It was basically a tweaked, die-shrunk version of the G80, but that made it affordable, and the G92-based 8800GT reached the masses at a very attractive price.

With the GT200, NVIDIA decided to stick to the same fab process but built a chip carrying a massive 240 shaders. This makes it the biggest chip that TSMC has ever built. And when something like this happens, one can expect trouble with yields, at least to begin with.

Here is a picture of the GT200 architecture. I will not go into in-depth detail here, just give a brief idea.

architecture.jpg


You have the pixel shader, setup/raster unit, vertex shader and geometry shader units up top. Next in line is the shader array. This is where the brute force is generated. The unified shader array consists of 10 shader blocks, and each block contains 24 shader units/cores.

Here is a close-up of an individual shader block.

shader%20block.jpg


On the GTX 280 all 10 shader blocks are enabled, while on the GTX 260 you get 8 of them, for a total of 192 shader processors. (I have indicated the units disabled on the GTX 260 with a red border so that you get an idea of what you are losing.)
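If you like that multiplication spelled out, here is a quick back-of-the-envelope sketch. The 24-cores-per-block figure comes from the block layout in the diagram above (3 SMs of 8 SPs each); treat it as my reading of NVIDIA's published diagrams rather than gospel.

[CODE]
# Shader-count arithmetic for the GT200 family, as described above.
# Each "shader block" carries 3 SMs x 8 SPs = 24 shader cores.

CORES_PER_BLOCK = 3 * 8       # 24 shader units per block

gtx280_blocks = 10            # all blocks enabled
gtx260_blocks = 8             # two blocks disabled

print("GTX 280:", gtx280_blocks * CORES_PER_BLOCK, "shader processors")  # 240
print("GTX 260:", gtx260_blocks * CORES_PER_BLOCK, "shader processors")  # 192
[/CODE]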

Next are the L2 cache blocks and then the memory.
The GT200 is a major specification bump from the G92. It does not add many new features over the G92; rather, it gives you more of everything you got in the G92. Everything is basically multiplied and tweaked.

NVIDIA is still sticking to its stance of not adopting DX10.1 here as well; the GTX 260 and 280 are both still DX10.0 cards.
Let's quickly run through the specifications.
[BREAK=Specifications]
Specifications

Manufacturing Process: 65nm

Stream Processors: 192

Texture Units: 64

Memory Interface: 448-bit

Memory: 896MB GDDR3

Core Clock: 576MHz

Memory Clock: 999MHz (1998MHz effective)

Pixel Fillrate: 16.1 GPixel/s

Texture Fillrate: 40.3 GTexel/s

Transistors: 1,408M

Die Size: 24x24mm (576mm²)

Additional Features: HDCP ready, PCI Express 2.0, Tri-SLI ready, CUDA and PhysX

As you can see, the memory is still GDDR3 and the memory interface is a little odd at 448-bit. There is a simple explanation for this: the GTX 280 has a full-blown 512-bit bus thanks to an 8-channel 64-bit memory interface, while the GTX 260 uses a 7-channel interface (7 x 64 = 448-bit).

All specifications are slightly lower than the GTX 280's. NVIDIA uses the same core for the GTX 260, so it can take defective cores which do not meet GTX 280 specifications and put them to use.
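For the curious, the headline spec-sheet numbers fall straight out of this channel arithmetic. A quick sketch follows; note that the 28-ROP count (4 ROPs per 64-bit memory channel) is the commonly quoted GT200 configuration, an assumption on my part rather than something printed on the box.

[CODE]
# Deriving the headline spec-sheet numbers from the 7-channel layout.
# Assumption: 4 ROPs per 64-bit memory channel, i.e. 28 ROPs on GTX 260.

channels = 7
bus_width_bits = channels * 64          # 448-bit
rops = channels * 4                     # 28 ROPs
core_clock_mhz = 576
mem_clock_mhz = 999                     # GDDR3 -> double data rate

pixel_fill = rops * core_clock_mhz / 1000.0                    # GPixel/s
bandwidth = (bus_width_bits / 8) * mem_clock_mhz * 2 / 1000.0  # GB/s

print(f"Bus width:        {bus_width_bits}-bit")
print(f"Pixel fillrate:   {pixel_fill:.1f} GPixel/s")  # ~16.1
print(f"Memory bandwidth: {bandwidth:.1f} GB/s")       # ~111.9
[/CODE]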

NVIDIA is supposed to release the official PhysX software bundle in the coming week. If I still have the card at that time, I will try to add PhysX benchmarking to this review.

As far as CUDA goes, there are currently limited ways in which you can use the raw processing power of this card. There is the Folding@Home GPU client, which runs on NVIDIA cards. Third-party applications for GPU-accelerated video conversion also exist, but they are not free.

Enough of this purely technical stuff. Let’s have a look at the XFX GTX 260.
[BREAK=The package]

The Package

Sticking to the tradition of huge boxes, the XFX GTX 260 comes in a big black-and-green box advertising all the features. I kind of like the box.

Box.png


The back of the box has the usual key-feature list.

box-back.jpg


Opening the box reveals a nicely packed card, covered by foam on all sides and secured in place. No chance of damage here.

box%20opened.jpg


XFX is bundling Assassin's Creed with this card, which is a nice game indeed. Here is a snap of what's included in the box.

bundle.jpg


You get a basic bundle: driver and game CDs, an HDTV box, a DVI-to-VGA converter, an audio cable and manuals.

[BREAK=The Card]
The Card

The card itself is huge. Absolutely huge. I thought the HD4870 was huge, but this monster is very, very big indeed.

card%20front.jpg


You get a nice big dual-slot cooler bearing the green XFX logo, and it indeed runs nice and quiet.
card%20front%20angle%202.jpg


card%20fan.jpg


The back of the card is completely covered by a metal plate, so the entire PCB is wrapped in the heatsink-fan assembly.

card%20back.jpg


You get 2 DVI ports and an HDTV/S-Video out connector on the back plate of the card. The dual-slot cooler has an external exhaust which dumps the heat outside the cabinet.

card%20plate.jpg

Unlike the GTX 280, the GTX 260 makes do with 2 regular 6-pin PCI Express connectors. This means you don't need a power supply with the 8-pin PCI Express connector that the GTX 280 requires.
card%20top%20view.jpg


This card sports a Tri-SLI connector, covered with a plastic cap that keeps dust from sneaking inside when SLI is not in use.

SLI%20connector.jpg


I wanted to strip this card down and have a look at the core and the PCB, but it proved to be one of the most difficult cards to take apart. The back plate is held in place by a series of interlocks which are not easy to release: you unlock one, and as you move to the second, the first snaps back into place. Due to time restrictions I will do this at a later date, if I get enough time before sending the card back.

Let's move on to the system we used to test this baby and the methods used for benchmarking the gaming suite.
[BREAK=Test Setup & How we tested]
Test Setup.

We have a decent, powerful setup here to test this card.

Processor: Intel C2D E7200 @ 3.99GHz

Memory: OCZ Platinum 4GB DDR2 1000

Motherboard: Gigabyte X48-DQ6

Power Supply: Tagan BZ800

Cabinet: Custom Modified Cooler Master CM690

Graphics Cards: XFX GTX 260, Sapphire HD4870

Games Used: Crysis, Mass Effect, Race Driver GRID, World in Conflict

Synthetic Benchmarks Used: Futuremark 3DMark Vantage

Drivers: Forceware 177.70 for GTX 260, Catalyst 8.7 Beta for HD4870

Operating System: Windows Vista Ultimate x64

How we Tested

We are testing the latest DirectX 10 card, which meant we went with Windows Vista. There is no point in testing this card in a DirectX 9 environment.

For World in Conflict, we used the game's built-in benchmarking utility. For Crysis we used Crysis Benchmark Tool 1.0.0.5. Mass Effect and GRID do not come with any built-in benchmark tool, so for those we used good old FRAPS. Every attempt was made to replicate the exact same gameplay.
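For the FRAPS-based games, the min/avg/max numbers in the graphs are simple reductions of the FRAPS frametimes log. Here is a minimal sketch of that reduction; the log layout assumed here (cumulative per-frame timestamps in milliseconds) is illustrative, not gospel.

[CODE]
# Minimal sketch: reducing a FRAPS frametimes log to min/avg/max FPS.
# Assumes one cumulative millisecond timestamp per rendered frame.

def fps_stats(timestamps_ms):
    """Return (min, avg, max) FPS from cumulative per-frame timestamps."""
    deltas = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    per_frame_fps = [1000.0 / d for d in deltas if d > 0]
    elapsed_s = (timestamps_ms[-1] - timestamps_ms[0]) / 1000.0
    avg_fps = len(deltas) / elapsed_s
    return min(per_frame_fps), avg_fps, max(per_frame_fps)

# Example: five frames logged over ~132 ms
lo, avg, hi = fps_stats([0.0, 33.3, 66.1, 99.8, 132.4])
print(f"min {lo:.1f} / avg {avg:.1f} / max {hi:.1f} FPS")
[/CODE]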

We picked the drivers which seemed to work best on our system. I personally don't care whether a driver is beta or official WHQL. I did test other beta drivers for the GTX 260 and settled on 177.70, as this driver performed much better in games like Mass Effect and about the same in the others.

The drivers of both cards were set to the maximum quality settings for game testing.
Now let's have a look at the competition, the card against which the GTX 260 will be battling in the market.
[BREAK=The Rival]
The Rival.

ati%20vs%20nvidia.jpg


Well, no points for guessing the card against which we will be testing the GTX 260. It's obvious, it's big, it's red and it's ATI.

2%20cards.jpg


The HD4870.
It's the direct competition to this card. The ATI Radeon HD4870 is actually priced lower in India, at Rs.19,000/- (approximate street price in Mumbai).

Let’s now move onto the benchmarks.

[BREAK=Mass Effect]
Mass Effect.

330487f88ce46d08.jpg


This game uses Unreal Engine 3, which is the reason we dropped BioShock and UT3 from our benchmark suite. The graphics and environment feel like a trademark Unreal Engine 3 game, but the facial expressions are extremely accurate and detailed, with some stunning graphics effects. The gameplay does take some getting used to, but overall this game is a great benchmark.

In this game we jumped directly to 1680x1050 resolution, as the game is not really taxing on these modern GPUs. FRAPS was used to benchmark the game.

Here are the numbers.

Mass%20Effect.png


As you can see, without AA/AF the cards perform almost identically. But when you turn AA/AF on, for some weird reason the ATI card suffers a huge performance loss. The GTX 260 blows it away in this game.

To investigate, I ran the benchmark on the ATI card at 8XAA and even at higher/lower resolutions, and to my surprise the numbers remained the same. There was no performance drop with 8XAA, and reducing/increasing the game resolution did not help at all. So there is definitely a bug in ATI's drivers for this game with AA. We had to force AA from CCC, as the game has no built-in setting for it. Maybe the way ATI's AA works is creating this problem, and it might be fixed in the future. But for now, it's a clear victory for the GTX 260.

[BREAK=Race Driver GRID]
Race Driver Grid

330487f88ce32041.jpg


One of the new racing games from Codemasters. Codemasters has long had a reputation for churning out outstanding racing games, and GRID is no exception: a beautiful game with some great cars and circuits.

This game is very hard to benchmark. There is no built-in benchmark system, and races are fully dynamic, so you have to drive around the circuit and measure FPS using FRAPS. I tried my best to drive as carefully and uniformly as possible: one lap of the circuit starting at the back of the grid with all the cars in front of me. All in-game settings were maxed out, 16XAF was enabled, and AA was set from within the game.

GRID.png


As you can see, without AA the GTX 260 trails the HD4870 by about 12FPS. With 8XAA enabled, the gap opens up further; there is absolutely no performance hit on the HD4870 with AA enabled. One might think there is something wrong with these results. Even I did, but after inspecting the images, AA was in fact working. 4X or even 8X AA makes almost zero difference to the HD4870, and free 8XAA is great. The GTX 260 lags behind by 17FPS. A clear win for the HD4870.
[BREAK=World In Conflict]
World In Conflict

33047fa0b6d9dd06.jpg

World in Conflict is one of the best strategy games to come out recently. The graphics and gameplay are both absolutely stunning, and it's one of those games that is really CPU and GPU intensive.

We used the game's built-in benchmark. For this test, the graphics settings were set to Very High in the game, which enables the DX10 render path and also 4X AA.

We even tested the cards with 8X AA. For ATI, 8X AA was forced from CCC.

WIC.png


The results are very close; there is not much to choose between the two. With 8XAA the HD4870 does outperform the GTX 260. In fact, the 4870 takes almost no hit with 8XAA enabled; just the maximum FPS is lower.
[BREAK=Crysis]

Crysis.

33047fa0bc2759b4.jpg


Oh yes, the ever-so-debated game. Many people call it badly coded; many curse it for being just a technology demonstration. But surely no review can ignore or eliminate this game from its gaming tests ;) The game is the nemesis of GPUs.

We used this game to really stress the cards. This game allowed me to use a 1920x1080 setting, so you will find results at that resolution here as well.

Crysis1.png


Crysis2.png

As is clear from the results above, the HD4870 manages to outperform the GTX 260 at every setting. It must be noted that Crysis gets a nice boost in performance from the 8.7 beta drivers; the WHQL release's performance is lower. But like I said, I don't really care about WHQL status.
[BREAK=3DMark Vantage]

3DMark Vantage

330487f894350f63.jpg


This is the latest 3D benchmark from Futuremark, and its first DX10 benchmark: a set of synthetic CPU and GPU tests to evaluate system performance. Though synthetic in nature, it is a good benchmark for relative comparison.

For 3DMark Vantage, we decided not to force the high quality settings in the NVIDIA control panel. It's a synthetic benchmark, and both ATI and NVIDIA optimize their drivers for 3DMark, so we set everything to default.

Vantage.png


Here the HD4870 does manage a small lead, but it's nothing significant.

Let's move on to the image quality tests.
[BREAK=Image Quality]

For the image quality test, we selected Crysis and GRID. WIC and Mass Effect didn't offer anything to differentiate the cards. Crysis was a different story: there is a noticeable difference. I am posting comparative screenshots of Crysis below.



Observe the hand. In the second set of screenshots, though the difference is not much, observe the branches of the coconut tree and the clouds.

NVIDIA GTX 260 Screenshot


ATI HD4870 Screenshot


There is some graphical difference between these 2 cards. The clouds appear more prominent on the HD4870, and the rendering of the hand glove is totally different between the cards. I will leave it to you to judge, as different people prefer different things.

In GRID, again the difference is not much: a slight variation in the rendering of the road surface and tyre burn marks.
NVIDIA GTX 260 Screenshot


ATI HD4870 Screenshot


[BREAK=Overclocking, Temperature and Noise]
Overclocking, Temperature and Noise.

Now on to overclocking. The GTX 260 overclocked easily to 700/1200 with the shaders in sync. For Vantage I was even able to complete a run with the core clock in excess of 700MHz, but Crysis and other games were not stable above 700MHz.
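To put that overclock in perspective, here is the quick percentage math, assuming the 700/1200 figures are core/memory against the stock 576/999:

[CODE]
# Quick percentage math on the overclock (assuming 700/1200 = core/memory).
stock = {"core": 576, "memory": 999}    # MHz, from the spec page
oc = {"core": 700, "memory": 1200}      # MHz, the stable result above

for domain in stock:
    gain = (oc[domain] / stock[domain] - 1) * 100
    print(f"{domain}: {stock[domain]} -> {oc[domain]} MHz (+{gain:.1f}%)")
# core: +21.5%, memory: +20.1%
[/CODE]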


The overclock results in a nice jump in the Vantage score, as you can see, and even in games you see a decent performance increase. But again, an overclocked HD4870 managed to produce similar results, matching and even outperforming this card.

In Crysis the card managed to touch 24.5FPS at Very High settings @ 1680x1050. That is not bad, but you can't ignore the HD4870 here, which managed in excess of 27FPS.

Temperatures were never an issue for this card. The stock cooler kept it cool enough: the card idled at 46-48°C, and at full load it was doing 75°C, which is not bad at all.

The fan is not noisy at all. Even at 70-80% it was perfectly tolerable. I don't think the average user needs an aftermarket cooler for this card; you can happily live with the stock cooler (and you probably won't dare to change it anyway, as it's very difficult to get off the card :p).
[BREAK=Multimedia and Cinema Experience]
Multimedia & Cinema experience.

This has become a major factor today. Unfortunately the GT200 has nothing new to add to its feature set here; it offers the same dynamic contrast driver options as all G92 cards do. You can find those covered in one of my previous reviews here:
XFX 790i Ultra SLI & XFX 9800GX2 Review - Reviews and Previews - TechEnclave

Unfortunately I do not own or have access to a Blu-ray drive for the PC to test the decoding capabilities of this card.

All regular x264-encoded HD files ran smoothly with the EVR of Media Player Classic Home Cinema, using less than 5% of the CPU. This is why Vista makes a great platform for those looking for good multimedia functionality.

Though the HD4870 has better vibrance/contrast and more natural-looking colours, after enabling dynamic contrast and a little fiddling with NVIDIA's Digital Vibrance, the GTX 260 was able to match the Radeon's image quality. No complaints on this front.

HDTV users will be familiar with the scaling issue on NVIDIA cards. When driving an HD LCD TV over a digital output like HDMI, NVIDIA cards have scaling problems. I experienced this with the 8800GT, and I see the same thing with the GTX 260.

NVIDIA's scaling pushes the borders of the picture out of the viewing area; it does not crop the video signal properly. This is not a big issue to be honest, as it can be corrected from the NVIDIA control panel. But consider playing 1080p videos on one of these TVs: switching a native-720p TV into its 1080p mode makes movies look better even though the resolution is non-native. And here the GTX 260 runs into the scaling issues I have described, so you are forced to stay content with the native 720p mode, or manually change the scaling from the control panel each time you change resolutions.

This is not the case with the ATI HD series. Every HD series card I have tested, including the budget HD3450, was able to scale the image properly without me needing to fiddle with the TV settings or CCC.
[BREAK=Conclusion]

Conclusion.

What we have seen in the past month has been absolutely fantastic to observe. The limits of graphics card performance have been rewritten with the introduction of this new generation of cards.

But what is important to see is the way this whole market has been turned on its head. For the first time in years, NVIDIA has had to reduce prices on its cards in the international market because of pressure from ATI. This is good for the consumer.

Now, about the card. The numbers are in front of you. As an upgrade from an 8800 Ultra or 8800GT/GTS, the GTX 260 offers mind-boggling performance. It's all nice and rosy. But the HD4870 proves to be a spoilsport for NVIDIA's GTX 260. If ATI had not been able to bring the HD4870 to market at this price, we would still be shelling out $400-450 for the GTX 260.

The HD4870 spoils the game for the GTX 260, badly at times. I was expecting this, as I have had the HD4870 for quite a while now. Sure, you need a fan hack with the current beta drivers to keep the HD4870's temperatures under control, and the GTX 260 does run quiet and cool. But where it counts the most, it's lagging behind the ATI HD4870.

Even when overclocked, this situation does not change much; the HD4870 still manages the overall lead.

So what would have been a fantastic, unmatched performer offering a great deal of performance improvement over the last generation, and probably THE card to get, finds itself playing an unsuccessful game of catch-up with the HD4870.

Mind you, I quite like the card. If it had not been for the HD4870, you would have seen our graphics card and show-off sections full of discussion about this card by now. But the picture is different this time around.

One more thing that counts against this card is the price. I do not have official recommended pricing from the distributors for this card, but when I contacted local dealers, it turned out to be considerably more expensive than the HD4870 in India. I will post the final price here when I get an update from the distributors.

So, the all-important question: is it recommended?

I am sorry to say, but the answer is no. Not at the price point at which it is available in India. This card needs to go below the price of the HD4870 for people to consider it; performance and price are both in the HD4870's favour.

I did try to be objective and made a pros-and-cons list. When you see the pros and cons below, you will get the idea. When the only two pros of a card are its bundle and low noise levels, it has failed to do what an enthusiast-segment card is meant to do: offer better gameplay at a more competitive price than the competition.

Some serious marketing strategy needs to be adopted in India for this card to sell here; otherwise it simply won't. It's amazing to see how things have changed since I reviewed the 9800GTX and GX2 a while back.

Pros


*Bundle
*Quiet

Cons

*HDTV scaling not as good as the rival's; needs manual adjustment each time.
*Gets outperformed by the cheaper card.
*Cost.

Our special thanks to Rashi Peripherals and XFX India for supplying this card to us.

Shripad signing out until the next time.

Please Digg this Review here : Digg - XFX Geforce GTX 260 Review
 
Lemme be the 1st 2 thank U ;)

Nice 1 funky :eek:hyeah: not 2 shot, nor 2 long, right up to the mark and says all I wanted 2 hear :hap2:

Dug it :p
 
$heet!!! :O

Amazing results from Crysis!!! :O

That was truly unexpected!!! :O

Nice work dada... :thumb:

BTW what was the gripe of removing the stock cooler of the GTX??
 
It's very hard. I am 100% sure the card will sustain some sort of damage to the cooler body while removing the assembly. There are too many locks at odd places. The only way to unlock them is to get a fine flat screwdriver into the gap and twist it open. This causes serious dents in the card's cooler, and if the screwdriver slips, it will definitely take some SMD component off the PCB. And I need to forward this card to someone else in good condition :p

If Rashi gives me the go-ahead to do whatever I want to the card, I will take it apart by any means possible :p
 
amazing effort. repped
whats the official dealer quote for the same? and of HD4870 as well.

What did you use to cool the 7200?
 
funky, let me tell u honestly, u rock!!
That was the most awesome review I have ever seen here. I really liked the IQ screenshots and the multimedia & cinema test, which I was eagerly waiting for someone to put in a review..:hap2:
 
amazing amazing review their funky, damn professional review....you ought to join some of the bigger magazines as a reviewer/editor man......repped...
Edit : have to spread some reps before repping u again....
 
Wow, look at the effort there :O.

Awesome review.

Minor correction (typo), the 5th line on 14th page reads "In crisis he card". But you are bound to have typos when you type like >1000 words :O.

Awesome review once again.
 
Yup, there were indeed a lot of small typos. I have fixed many of them and made some minor changes to sentences.

Anyone putting their hand up for editing work for the next review? :p

mr47 said:
amazing effort. repped
whats the official dealer quote for the same? and of HD4870 as well.

What did you use to cool the 7200?
I'm using a Xigmatek Achilles HDT S1284 to cool the E7200 at the moment.
 
Hey a very nice review! Good numbers going, and crysis results were interesting, however I had a question did you test them using the island gpu benchmark, or a custom time demo?

I ask cause HardOCP, did have some interesting results using a custom timedemo!

And dude these reviews keep getting better! Keep it up! repped!
 