ATI Radeon HD 2600 XT @ [H]

Good article. Shame there isn't much competition this generation. DX10 performance also makes me a little wary since I'll be getting an 8800 GTX and Vista soon.

Typo fixed, thanks - Brent
 
149.99 MSRP = 101.99 for the GDDR3 version at ewiz.com


At 101.99 it is the better value.
 
Good review [H]! Thanks for testing both 32bit and 64bit, as well as DX9 and DX10. I know it probably took longer but the information gleaned was worth it.

:)
 
The video playback application must support hardware acceleration of UVD in order for it to work. You must use Vista x32.

Thanks for the info.

Bummer. Well I am not upgrading to Vista, so I guess the last possible use for these cards is exhausted.
 
Reviews like these make me feel like I'm doing the right thing, holding out for next-gen DX10 cards (or a REAL CDN sale on an 8800).

Great review as always; so much for any ideas of a price war waged at the midrange level.
 
149.99 MSRP = 101.99 for the GDDR3 version at ewiz.com


At 101.99 it is the better value.

I submit that if the GDDR4 version can't even stand up to the 8600 GT, the GDDR3 version hasn't a snowball's chance of holding its own. You can definitely get much cheaper 8600 GTs. We used the one we did because it was closest in price and factory overclocked.
 
I'll make 3 points, and say now that I'm not concerned with the HD playback features.

Why? Because I already have an XP 2500+ w/ 2GB box that plays 720p on my 22" CRT just fine. (This is my "TV".)

1. Most LCD monitors sold today have a dot pitch > .25. Yes, I know many pro models can best that, but few actually buy them. And for everyone who posts "But I have XXX monitor and it does XXX..." there are a hundred CS players out there not paying attention to us dweebs.

2. You cannot get as good a CRT monitor these days as you could a few years ago; they simply don't make them. This is what drove me to buy an LCD a couple of years back, just to see if the future was as crappy as I feared.

3. The single most effective way to improve IQ at a low resolution is the use of AA. This can make the paltry 1280x1024 of my 19" LCD (at this point, seriously, who doesn't have one lying around? They're everywhere) livable. Yes, I consider 1280x1024 low. 1600x1200 was conquered years ago, and I know several people who play at 1920x1200 and higher.

Now considering all this I put to you one question...

What on earth were they (ATI) thinking when it was decided that AA was going to be handled by a chunk of the base chip core rather than the dedicated hardware/software hybrid employed by every other solution on the market today (including last-gen ATI)?

"Ok guys, so when they kick it in to high gear and start cranking the AA up, we'll go ahead and just use a few of those Stream Procs, ok? I mean, we have 320 man, way more then the GTX. What? Damnit Bob, will you stop going on on about how it doesn't have enough horsepower in the first place and is geared too heavy for raw shader power. In fact, you're fired, and that quiet guy behind you gets your parking space."

Sadly I feel I must add the now common disclaimer: I own ATI hardware. I have an X1800 in my wife's utility box, and I have a Rage Fury MAXX sitting in a box here somewhere. Yup, that's it. After the MAXX debacle it took the 9xxx series to convince me ATI was up to snuff again. Then I picked up an X1800 Pro... early even, not later when they were dirt cheap. Go figure.

I would like nothing more than for the graphics market to just get on with it already.

*Waaambulance returning to base*
 
Good review [H]! Thanks for testing both 32bit and 64bit, as well as DX9 and DX10. I know it probably took longer but the information gleaned was worth it.

:)

You are quite welcome. As you can imagine, quite a lot of work went into this; it turned out to be a very informative evaluation.
 
So UVD doesn't work under WinXP? Figures...

I think that is a misunderstanding. According to many posts over at AVS, it works fine on XP. I think what was meant here was that the drivers do not currently work for the 64-bit version of Vista, not that it doesn't work on XP.
 
Looks like all those people posting in the last few ATI article threads saying how much better the ATI stuff would be if only [H] had tested under Vista and DX10 were spewing loads of nonsense. Even using vista 64 bit didn't help ATI.

This despite the fact that the ATI product in this review had an almost 30% higher core clock, an almost 40% higher memory clock, GDDR4 memory vs. NVIDIA's GDDR3, and more stream processors (although it depends on how you count them).

ATI choking on AA doesn't help matters either. AA is a pretty important feature nowadays, especially on the top end cards.

I still see no reason for the average gamer to upgrade to Vista yet (still using XP myself). The big hyped DX10 vs. DX9 differences are just not there yet in current games. And no, I'm not going to put any stock into Crysis DX10 vs. DX9 screenshots before the game is released and tested by a third party. I still remember the stunt they and NVIDIA pulled with the screenshots that supposedly showed Far Cry Shader Model 3.0 differences from a patch; it turned out one screenshot had NO shaders and the other had full 3.0 shaders, so it was a VERY deceptive comparison basically made to scare people away from ATI's non-SM3.0 products.
Try testing the newer games...
Since you are testing under Vista, how could you not include Halo 2, which is Vista-only?
...
Are you freaking kidding me? WTF is the point of testing a 3+ year old Xbox 1 game! The only reason MS released it as Vista-only was to try to increase sales of Vista. The only technical reason they did so was the OMG new installer, which installs the game in the background as you play... which, hmm, Steam has had in Valve games for YEARS now. So even that isn't really something you need Vista for.
MS did NOTHING to it to release it for Vista; heck, they even nerfed it compared to the Xbox version (just like they did with Halo 1 vs. the Xbox version).
It is NOT, I repeat NOT, a DX10 Vista game. In fact, I've heard there's even a crack floating around that lets you run the game in XP.
And since when is a 2004 game designed for an obsolete console a "newer game"?
 
Since the only purpose of this series is video decoding, can we get more info on that?

A: Does it work for downloaded video files (trailers, downloaded x264 MKV files, etc.)?
B: Does it work under Windows XP?
C: Does it work in generic players like Media Player Classic?

Because if all this does is offer Blu-ray/HD DVD playback in proprietary players, it is near useless to me.

If it has more general applicability I might be interested in adding one of these to extend the life of an AGP box by improving the video capabilities. It would still be faster than my 9700 Pro, hopefully?
Here are some general observations from my experience with HD stuff.
C) Using MPC you can select which decoder to use, and if you pick the proper PowerDVD decoder, you can get hardware decoding assistance when playing files in MPC. However, often you get just as good performance using a software decoder like CoreAVC. Supposedly CoreAVC will be adding hardware GPU support in the future, so that should be the best thing to use if that version is ever released. As a side note, NVIDIA sells a special DVD decoder that uses their PureVideo hardware acceleration, and I believe you can choose that decoder in MPC if you buy it, but I'm not sure; I haven't been willing to shell out extra money for that. I have no idea if ATI has something similar or not.

B) Yeah, usually the hardware decoding works in XP. But you have to check the specific driver and specific video card; for instance, the NVIDIA XP drivers right now for the GeForce 8 series are crap, they don't support PureVideo for most GeForce 8 series cards (I think they added 8800 PureVideo XP support in the latest beta, but I'm not sure, it's ambiguous in the driver notes). It's INEXCUSABLE for NVIDIA's drivers to be in this state on XP! I mean, the cards have been out for a while now.
If anyone can tell me for sure, 100%, that the 8800 series driver supports PureVideo HD acceleration, I'd be very grateful.
http://www.nvidia.com/object/winxp_2k_158.22.html
"PureVideo™ HD support is currently only available on Microsoft Windows Vista for GeForce 8600, 8500 and 8400 GPUs. PureVideo HD support for Windows XP will be available an upcoming driver."
I can read that either way: that anything not on that list is accelerated, or that PureVideo HD just isn't supported on XP for the 8 series at all.

A) As for downloaded files, it just depends. If the MKV file (or whatever file) can use the proper decoder (the particular PowerDVD one) then yeah, it's hardware decoded. If it's a QuickTime file (which most trailers unfortunately are) I haven't seen any signs of hardware decoding for that.
 
Looks like all those people posting in the last few ATI article threads saying how much better the ATI stuff would be if only [H] had tested under Vista and DX10 were spewing loads of nonsense. Even using vista 64 bit didn't help ATI.

Naw...now they'll all bitch'n'moan about how [H] should have used 32-bit Vista! :p
 
We will probably do more GTS's. We'll take your request into consideration, but typically we do focus on the video card being evaluated, since the one we are comparing it to will usually already have its own evaluation where we OC'd it. This one didn't because we haven't evaluated that XFX card yet.

Thanks for the response, Brent. I'll take this opportunity to say that I have eagerly awaited the [H] real-world approach to both Vista gaming and DX10 gaming, and I was not disappointed by this debut article. Keep the good stuff coming--I'll try to do my part by clicking away on those banner ads!:)
 
All I have to say to ATI is: "better luck next year".

Will there be a next year? I doubt we'll see an R800, maybe not even an R700. That idiotic acquisition has very likely doomed the future to an Nvidia monopoly on mid and high end graphics.
 
Will there be a next year? I doubt we'll see an R800, maybe not even an R700. That idiotic acquisition has very likely doomed the future to an Nvidia monopoly on mid and high end graphics.
I doubt that will remain the case at the mid end. It might be the case at the high end.

However, it's quite possible that in the future NVIDIA will capture the high-end market and ATI will capture the OEM and midrange markets, either one of which far outnumbers the high end. So if that happens, ATI will end up making bucketloads more money than NVIDIA. If ATI can pull off the GPU/CPU integration, this scenario is far more likely to occur.


Despite the fact I prefer NVIDIA cards, I hope that ATI comes back and kicks NVIDIA's behind next generation. Why? Because without competition, NVIDIA won't feel the pressure to improve nearly as much between generations, and prices will shoot up at the high end. Ideally you'd have each manufacturer be extremely close in price/performance at the high end, or have the companies alternate in kicking each other around every other generation.
 
Will there be a next year? I doubt we'll see an R800, maybe not even an R700. That idiotic acquisition has very likely doomed the future to an Nvidia monopoly on mid and high end graphics.

I would wait and see what you might get THIS year....
 
Very nice review, good to see some of the mid-range offerings. Not so good to see the HD 2000 lineup being a complete waste of everyone's money.

All the new games were nice additions too; good to see some newer games up for review. Although when you talked about Lost Planet and how the controls have been optimized for PC, did they change the Xbox controller pictures and "Press the red B button to continue" prompts from the demo? That was really ridiculous... a straight console port. At least they did well with the graphics.

Also, I'm guessing that BF2142 is a popular game, but it doesn't seem like it's very stressful on newer video cards; even these mid-range cards handled it rather easily. Maybe try a newer, more graphics-intensive game. I mean, the 8800s handle it w/ 16xAA supersampling, right? Not stressful in the least....

Other than that, awesome review, I'm always amazed to see how much work goes into your reviews.

Also, not so surprising but quite amazing: there is ALWAYS someone with the same response on every freaking video card review, "What about this game? You biased assholes are paid by NVIDIA!!!"

:rolleyes:
 
Also, I'm guessing that BF2142 is a popular game, but it doesn't seem like it's very stressful on newer video cards; even these mid-range cards handled it rather easily. Maybe try a newer, more graphics-intensive game.
I imagine that Quake Wars may be in the cards for a later review, depending upon its popularity. It's not mindbendingly more intensive than 2142, but it's always good to get more OpenGL titles in evaluations, and it's in the same "class" of game as the Battlefield series.
 
ooh.. you know something we don't??? =)

I think he is betting on a new midrange card from ATI.

I would say it is a race to see who delivers a decent midrange card with a 256-bit interface first, and whether NV still kicks ATI's ass in the midrange.

Odds are NV will be first (since they have had a lot more time).

It also seems likely NV will be fastest as well, as a simple respin/downsize won't do it for ATI.
 
Didn't the same type of thing kind of happen last generation, at least for ATI? The X1600 Pro basically sucked, the X1650 Pro came out (which was basically a better X1600 XT), and the X1650 XT competed nicely with the 7600 GT.
 
I wonder. The suggestion has been that the original intentions for the R600 included much higher clock speeds, which would have made it competitive with G80. If R650 at 65nm is able to achieve those speeds, it may have a big impact. Big enough to stave off G9x? That's a different story.

To add some spice to it, consider that, IIRC, R600 clocks the shaders in sync with the rest of the GPU. That sounds just dumb compared with G80, where they run at double the core clock or more, an approach NVIDIA laid the groundwork for with the separate clock domains of the G70. Why would ATI not crib the idea and gain the same advantages, especially when they were planning to run AA on the shaders? It doesn't sound so short-sighted if their idea was to run the rest of the core so fast that it and the shaders could both scream along at close to G80's shader speed. But instead of both being wicked fast, when the process and circuit design didn't work out, they were both woefully slow (compared to the target).

That doesn't mean it wasn't an unwise and ultimately disastrous gamble, whereas NV's uncoupled clock domains allow it to tweak throughput for whatever works best and whatever the real-world silicon allows. But if a revision could allow the ATI core to rev up the clock, and thus the shaders, the game might change. They might even be able to introduce separate clock domains of their own.

Again, potentially rendered (no pun intended) irrelevant if G9x moves the goal posts farther back in a big way.

Re: The battle between 7600s and 1600s, I think (and hope) that the Fall round will be more like 7900GS vs. 1950Pro.
 
I understand that many think we are being rather hard on ATI lately. In fact it does seem almost silly having to keep repeating this doom and gloom in regards to the HD 2000 series. Honestly, we feel the same way, we wish we could say the Radeon HD 2000 series is better at gaming than the GeForce 8 series; we gladly would, if it was. The facts are though, that when you compare the gaming performance you get for the price, the GeForce 8 series are simply a better value.

Sadly, that pretty much sums things up this round. Barcelona will also be a flop. Things are not good for AMD, nope, not good at all...


Ply
 
I understand that many think we are being rather hard on ATI lately. In fact it does seem almost silly having to keep repeating this doom and gloom in regards to the HD 2000 series. Honestly, we feel the same way, we wish we could say the Radeon HD 2000 series is better at gaming than the GeForce 8 series; we gladly would, if it was. The facts are though, that when you compare the gaming performance you get for the price, the GeForce 8 series are simply a better value.

Sadly, that pretty much sums things up this round. Barcelona will also be a flop. Things are not good for AMD, nope, not good at all...


Ply

The first part of your post is probably the highlight of the entire thread, but why the need to brand a product that hasn't even been released as a flop? That is like saying the street is wet because of the rain. With Barcelona, not you, not me, not even the reviews at [H] are going to determine its success. AMD's ability to get Dell, HP, Lenovo, IBM and Toshiba to use it is going to be the deciding factor. People who read sites like this are the minority in the grand scheme of things.
 
The first part of your post is probably the highlight of the entire thread, but why the need to brand a product that hasn't even been released as a flop? That is like saying the street is wet because of the rain. With Barcelona, not you, not me, not even the reviews at [H] are going to determine its success. AMD's ability to get Dell, HP, Lenovo, IBM and Toshiba to use it is going to be the deciding factor. People who read sites like this are the minority in the grand scheme of things.
Just to clarify, he's quoting from the review (which is probably why he bolded and underlined it)
 
Hmm, my bad, didn't read anything other than the UVD section; I already knew basically what it was going to say.
 
Listen, AMD will survive. They've gotten by without making money for years. They used to be a second-rate chip manufacturer building cheap knock-off Pentium-class CPUs, and later on they built a comparable processor on a shitty platform (Super 7, early Slot A, Socket A), and they still managed to come up with a superior and very competitive processor line (Athlon 64, Opteron, X2, etc.). Given time the cycle will repeat itself. All this doom and gloom stuff isn't helping and it's probably dead wrong. AMD and ATI aren't going anywhere. If there is one thing AMD does well, it's stay afloat.

ATI dropped the ball this generation. Any of you who have seen NVIDIA in action back during the early days should NOT be surprised by this. NVIDIA normally dominated their competition, and only since R300 have they had to really compete with anyone. For those of you who remember ATI and NVIDIA back in the day, did you really think ATI could keep up forever? The balance of power inevitably shifts from time to time.

ATI has dropped the ball, but I prefer not to think of the ATI HD 2000 series as something horrible; it surely isn't. What has happened here is that NVIDIA released a product that is simply "that good" and very hard for anyone to counter. Given time the balance of power will shift again, then shift back. It always has and always will.
 
In the review section for LOTRO, I am sure you guys did it but it wasn't mentioned and probably should be: with ATI cards you need to go into the UserPreferences config file and change "AllowFakeFullScreen=True" to "AllowFakeFullScreen=False", or ATI cards will not switch to 3D speeds and will run the game at the standard 2D core and memory speeds.

EDIT: Here is the Knowledge Base link; it is for the X1950 but works the same for the HD 2000 series.
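
EDIT 2: For reference, here's a rough sketch of the edit (going from memory, so treat the section name and the file location as assumptions rather than gospel; on my install UserPreferences.ini lives under My Documents\The Lord of the Rings Online, but it may differ):

[Display]
; force a true fullscreen mode so ATI cards clock up to their 3D speeds
AllowFakeFullScreen=False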
 
In the review section for LOTRO, I am sure you guys did it but it wasn't mentioned and probably should be: with ATI cards you need to go into the UserPreferences config file and change "AllowFakeFullScreen=True" to "AllowFakeFullScreen=False", or ATI cards will not switch to 3D speeds and will run the game at the standard 2D core and memory speeds.

That's pretty wild--wonder why they have a "fake full screen" mode?
 
In the review section for LOTRO, I am sure you guys did it but it wasn't mentioned and probably should be: with ATI cards you need to go into the UserPreferences config file and change "AllowFakeFullScreen=True" to "AllowFakeFullScreen=False", or ATI cards will not switch to 3D speeds and will run the game at the standard 2D core and memory speeds.

EDIT: Here is the Knowledge Base link; it is for the X1950 but works the same for the HD 2000 series.

Good info, but remember from the review they said that the card lacks separate 2D and 3D speeds, which is why it ran so hot.
 
In the review section for LOTRO, I am sure you guys did it but it wasn't mentioned and probably should be: with ATI cards you need to go into the UserPreferences config file and change "AllowFakeFullScreen=True" to "AllowFakeFullScreen=False", or ATI cards will not switch to 3D speeds and will run the game at the standard 2D core and memory speeds.

EDIT: Here is the Knowledge Base link; it is for the X1950 but works the same for the HD 2000 series.
I don't see where it applies to the HD 2000 series, but even if it did, that's only a single game. Doesn't explain the rest :)
 
The 2600 XT doesn't have separate 2D/3D speeds to begin with, so it is moot. It is always running at 800 MHz/2.2 GHz.
 
Given time the balance of power will shift again, then shift back. It always has and always will.

I don't think so. History repeats itself until it doesn't, so past pattern has no guarantee of repetition, especially in a field as volatile as technology.

Further to that, there is no real history of power shifting back and forth. The real pattern is perpetual second fiddle making good, essentially once.

The real pattern:

ATI was the perpetual second fiddle in 3D graphics until they won the bidding for ArtX. The ArtX people built the R300, which saved ATI's bacon and got them back in the game. They have essentially been riding variations on R300 ever since; a few more units, more bandwidth, but essentially the same architecture. R600 is the first real new architecture since then, and it is a bust. I would bet more on ATI returning to the second fiddle role than on them producing another R300-like shift in power in their favor. There is no ArtX to acquire and re-invigorate the company. NV has continued its aggressive in-house R&D and is now firmly in the driver's seat again.

AMD was similarly the perpetual second fiddle in CPUs, starting as a second source for the 8086; they didn't have a real competitor until the Athlon. The Athlon was the result of collaboration between the NexGen acquisition team and the DEC Alpha engineers who joined later. The Athlon was the first competitive design; they enhanced it into the Athlon 64 and soundly triumphed as NetBurst hit the wall. Intel went back to the drawing board, refocused R&D, and is firmly in the driver's seat again.

So neither of the now-AMD branches has a history of repeated success; they have a history of perpetual backseat with one architecture win that they rode for a while. That win pushed the traditional leaders to redouble their efforts. I doubt there will be a shift back in the foreseeable future (~5 years).

AMD-ATI is less likely to produce a killer CPU or killer GPU than they were alone; they now have limited resources split three ways: standalone CPUs, standalone graphics, and Fusion, which will likely take a while to become a competitive real-world solution.

Meanwhile Intel has huge resources and a renewed commitment to R&D on CPU architecture, with very short design cycles (tick-tock). NV has more resources than AMD and it is all directed to being number one in graphics.

AMD is essentially betting the future on Fusion. I don't expect them to lead in standalone cards again. Ever. The question will be does Fusion ever beat standalone cards.
 
I don't think so. History repeats itself until it doesn't, so past pattern has no guarantee of repetition, especially in a field as volatile as technology.

Further to that, there is no real history of power shifting back and forth. The real pattern is perpetual second fiddle making good, essentially once.

The real pattern:

ATI was the perpetual second fiddle in 3D graphics until they won the bidding for ArtX. The ArtX people built the R300, which saved ATI's bacon and got them back in the game. They have essentially been riding variations on R300 ever since; a few more units, more bandwidth, but essentially the same architecture. R600 is the first real new architecture since then, and it is a bust. I would bet more on ATI returning to the second fiddle role than on them producing another R300-like shift in power in their favor. There is no ArtX to acquire and re-invigorate the company. NV has continued its aggressive in-house R&D and is now firmly in the driver's seat again.

AMD was similarly the perpetual second fiddle in CPUs, starting as a second source for the 8086; they didn't have a real competitor until the Athlon. The Athlon was the result of collaboration between the NexGen acquisition team and the DEC Alpha engineers who joined later. The Athlon was the first competitive design; they enhanced it into the Athlon 64 and soundly triumphed as NetBurst hit the wall. Intel went back to the drawing board, refocused R&D, and is firmly in the driver's seat again.

So neither of the now-AMD branches has a history of repeated success; they have a history of perpetual backseat with one architecture win that they rode for a while. That win pushed the traditional leaders to redouble their efforts. I doubt there will be a shift back in the foreseeable future (~5 years).

AMD-ATI is less likely to produce a killer CPU or killer GPU than they were alone; they now have limited resources split three ways: standalone CPUs, standalone graphics, and Fusion, which will likely take a while to become a competitive real-world solution.

Meanwhile Intel has huge resources and a renewed commitment to R&D on CPU architecture, with very short design cycles (tick-tock). NV has more resources than AMD and it is all directed to being number one in graphics.

AMD is essentially betting the future on Fusion. I don't expect them to lead in standalone cards again. Ever. The question will be does Fusion ever beat standalone cards.

You bring up some good points, and I am totally aware of all the things you recapped regarding each company's history and purchases.

I do think that eventually AMD/ATI will find some other startup or fledgling company with something they can use. It will happen, it's just a question of when or if they can do it before Intel grabs them. It may take years but some company will surpass Intel again even if only for a short time.
 
You bring up some good points, and I am totally aware of all the things you recapped regarding each company's history and purchases.

I do think that eventually AMD/ATI will find some other startup or fledgling company with something they can use. It will happen, it's just a question of when or if they can do it before Intel grabs them. It may take years but some company will surpass Intel again even if only for a short time.

I just wanted to comment somewhere because I always see people say it is a constant back and forth seesaw battle, but in reality it isn't like that at all, and tech companies that were top competitors in the past stumble and fail all the time.

I have been following the CPU/Graphics tech wars since I bought my Amiga 20+ years ago.

RISC was supposed to kill CISC (and x86); there was x86 vs. 68K for the longest time, then 68K fell hopelessly behind and never recovered. Motorola/IBM then brought out PPC, which had some success, but eventually Apple abandoned even that for Intel as well. Intel has dominated from the IBM PC onwards; the only serious competitor was the Athlon, and the only superior product was the Athlon 64. This lit a fire under Intel that will probably have them steaming for the next decade.

Graphics is littered with even more corpses. Matrox, the 2D king, never really made the transition to 3D, and while they have amazingly avoided bankruptcy, they don't really exist in the graphics card market in any significant manner. S3 similarly fell, and 3dfx, the king of 3D, fell apart. There are no guarantees that even a second-place competitor will survive.

AMD essentially bet the company on Fusion. If it pays off, they could have another Athlon 64 moment that might force an NV/Intel merger; if it fails, it could destroy AMD. This, in my opinion, is the interesting bit to watch unfold, because Intel will dominate CPUs again, while NV dominates graphics again, until Fusion is (or isn't) something to be reckoned with.
 
AMD essentially bet the company on Fusion. If it pays off, they could have another Athlon 64 moment that might force an NV/Intel merger; if it fails, it could destroy AMD. This, in my opinion, is the interesting bit to watch unfold, because Intel will dominate CPUs again, while NV dominates graphics again, until Fusion is (or isn't) something to be reckoned with.

But until then, if AMD/ATI can't make something halfway competitive, there won't be any more price wars, and we'll have to start paying more than we should for our computer parts again.

Hopefully not, I really hope not...
 
But until then, if AMD/ATI can't make something halfway competitive, there won't be any more price wars, and we'll have to start paying more than we should for our computer parts again.

Hopefully not, I really hope not...

Even in the absence of competition, Intel does have reason to keep processor pricing in check to a degree. If they make them too expensive, then people won't buy them, or at best they'll only buy the lowest-end and cheapest.
 
Actually, ATI only respun the R300 with the R420 and R480; doubling the pipelines and adding SM2.0b support were the only real differences. Either way, those cards kicked ass and they didn't have to do a lot of development for them. The R520/R580 was completely different, even down to the memory controller; it wasn't relying on their old R300 technology at all, and it kicked some serious ass. So calling out ATI on having one subpar-performing product as an indication of their future really doesn't have much ground to stand on.
 