[H] ATI X1000 (R520) Series Eval.

Very informative review. In response to the apples-to-apples comparisons, it appears that every test was A-to-A since the cards perform virtually the same. I don't believe omitting that comparison detracted from the review at all.

I understand the [H] is fed up with being ATI's PR arm for paper-launched products, but it's still important to get review info about launched cards out to the consumer (hard or paper launch) so that they can make future buying decisions. Lots of people have been waiting to see R520XT performance before upgrading video cards, and those people (myself included) will be looking forward to a proper [H] review as well. Few other sites offer the same quality as you guys.

That said, the reasoning behind the card choice makes sense. ATI has a lot of nice features this round and made some significant strides from their last generation of cards while keeping good performance.
 
My 2 cents.

As a long time reader (many years), I fully support the concern about playable settings and I understand the [H] approach. We all know that neither Nvidia nor ATI can be trusted. Big money is at stake.

However, I have come to HATE the playable settings graphs. I want comparisons. I want a videocard cage match! I want a knock down drag out!! I want to see someone get their ass handed to them!!! And do I get any of that from those graphs? NO!!!!

Alright, I am being a little facetious :) , but I'm serious. By mixing the settings with which each card is being run, we lose visibility into the performance deltas.

My recommendation? Do the playable settings assessment for the faster cards on a game by game basis. Then, try to run ALL the cards with those settings for each game. The low end cards get spanked around, but that is life. Then you would know what performance improvement you are getting for that extra $100 or so.

The high end card gets 50 FPS at 1600x1200, 8xAA, 16xAF and the low end card only gets 22 FPS? Good comparison, now I can decide if I want to spend the extra cash.

Keep the playable settings summary table. That is a good summary of your playable settings recommendations and lets readers get a "one stop shopping" summary. Good information.
 
I think that this is a decent showing on ATI's part. I just wish that it hadn't taken them so long to get it to this point. There may even be some more performance in these cards with different drivers. Now, just get the damned things on the shelves!!
 
Teufel said:
After reading around 10 reviews, it becomes clear that the new ATI cards got _by far_ their worst benchmark results in this review.
I read in all the other reviews that the R520 performs ~20% faster in BF2, but only in this review do I read that the R520 performs ~17% worse at the same settings!

I can't help it; I definitely get the impression that HardOCP's reports are influenced by unprofessional emotional feelings, after hearing that they called ATI "lying canucks" and that they compare overclocked Nvidia cards with ATI cards clocked according to the GPU manufacturer's specification.


Let me point out that we DO NOT USE BENCHMARKS because we believe they do not show you what you can expect in real-world gameplay. Your post above further confirms my belief that we are very correct in the way we are doing things.

As for your impressions, the author of the article and the person posting about Canucks are two different folks. As for comparisons, we used the best selling NVIDIA cards in the retail North American channel for the comparison.

Go try to start trouble elsewhere, you're boring me.
 
pbj75 said:
My 2 cents.

As a long time reader (many years), I fully support the concern about playable settings and I understand the [H] approach. We all know that neither Nvidia nor ATI can be trusted. Big money is at stake.

However, I have come to HATE the playable settings graphs. I want comparisons. I want a videocard cage match! I want a knock down drag out!! I want to see someone get their ass handed to them!!! And do I get any of that from those graphs? NO!!!!

Alright, I am being a little facetious :) , but I'm serious. By mixing the settings with which each card is being run, we lose visibility into the performance deltas.

My recommendation? Do the playable settings assessment for the faster cards on a game by game basis. Then, try to run ALL the cards with those settings for each game. The low end cards get spanked around, but that is life. Then you would know what performance improvement you are getting for that extra $100 or so.

The high end card gets 50 FPS at 1600x1200, 8xAA, 16xAF and the low end card only gets 22 FPS? Good comparison, now I can decide if I want to spend the extra cash.

Keep the playable settings summary table. That is a good summary of your playable settings recommendations and lets readers get a "one stop shopping" summary. Good information.

Other reviews do apples to apples. Only [H] does it this way. If you want to see all those graphs go to toms or anand or the tech report.

I thought the review was good. In light of your comments on ATI paper launching the XT, I thought it was good of you to leave off the card that won't be available until November at the earliest.
 
pbj75 said:
However, I have come to HATE the playable settings graphs. I want comparisons. I want a videocard cage match! I want a knock down drag out!! I want to see someone get their ass handed to them!!! And do I get any of that from those graphs? NO!!!!

If you paid attention, one company DID get its ass handed to it:

http://www.pricewatch.com/m-37.htm
 
Jonsey said:
Other reviews do apples to apples. Only [H] does it this way. If you want to see all those graphs go to toms or anand or the tech report.

Exactly my point. I don't want to go to other sites.

I know [H] won't give me the standard PowerPoint PR numbers and they definitely won't pander to the mfgs. I'll get a true assessment. I just think there needs to be more of a comparison.
 
Teufel said:
After reading around 10 reviews, it becomes clear that the new ATI cards got _by far_ their worst benchmark results in this review.
I read in all the other reviews that the R520 performs ~20% faster in BF2, but only in this review do I read that the R520 performs ~17% worse at the same settings!

I can't help it; I definitely get the impression that HardOCP's reports are influenced by unprofessional emotional feelings, after hearing that they called ATI "lying canucks" and that they compare overclocked Nvidia cards with ATI cards clocked according to the GPU manufacturer's specification.

I see you're new here. Let me just say I've been here since before the days of the ATI 9700. With that being said, I want to let you know that HardOCP was once called ATI's PR bitch; HardOCP was viewed as an ATI !!!!!!. In short, MANY thought HardOCP was a little biased. Funny how things are reversed now.

It may be a little emotional, but let me tell you this fact: people that enjoy ATI's products are still bitter over the whole price-gouging, no-availability X800XT series mess. I paid $650 for the X800XTPE. What a mistake.

ATI dropped the ball big time, and from this review they really didn't make much of a comeback.

Personally I feel we waited all this time for a product that sucks ass. If you're this fucking behind, you'd better present a product that kicks ass and takes names. Here, they delivered a product that licks ass, not kicks it.

The people that got on the boat for Nvidia's SLI a year ago definitely got on the right boat. I regret I didn't do the same. What a goddamn waste of time, and I'm embarrassed that I am an ATI !!!!!!.
 
pbj75 said:
Exactly my point. I don't want to go to other sites.

I know [H] won't give me the standard PowerPoint PR numbers and they definitely won't pander to the mfgs. I'll get a true assessment. I just think there needs to be more of a comparison.

There are only so many review resources. You can't have everything. Would you rather have [H] do what everyone else does, or give you a different piece of the puzzle?

Really, this has been debated dozens of times.
 
Personally, I'm looking forward to what can be done with driver optimizations. It seems like R5xx is designed with further driver tweaking in mind, so let's hope we see this gen grow some long legs.
 
Teufel said:
After reading around 10 reviews, it becomes clear that the new ATI cards got _by far_ the worst benchmark results in this review here.
I read in all other reviews that the R520 perform ~20% faster in BF2, but just in this review I have to read that the R520 performs ~17% worse at the same parameters!

I can't help, but I definitely get the impression that HardOCP reports are influenced by unprofessional emotional feelings after hearing that they called ATI "lying canucks" and that they compare overclocked Nvidia cards with ATI cards clocked according the GPU manufactor specification.

That's because all the other sites out there use plain-jane apples to apples benchmarks. They don't mess with SSAA, HQ AF, nothing, they just run the same benchmark at the same resolution across all cards. They are, quite frankly, lazy reviews. HardOCP tries out multiple combinations of quality filtering, trying to find the happy medium for each individual card, instead of just inputting the same basic settings and letting it fly. Their methodology adds a huge amount of additional time, both per card and per benchmark, but it creates a true real-world experience.

I don't know about you, but this is exactly what I do at home. I don't just run regular 4xAA/8xAF on every single game I have, I vary between all kinds of different AA and AF modes, THEN mess around with MSAA/SSAA, Gamma-Correct AA, in addition to all the available in-game settings that can be fucked around with.

After reading [H] reviews for about a year now, it's pretty blatantly obvious how shitty (and as I said, lazy) the methodology used by other sites really is.
 
SgtSweatySac said:
That's because all the other sites out there use plain-jane apples to apples benchmarks. They don't mess with SSAA, HQ AF, nothing, they just run the same benchmark at the same resolution across all cards. They are, quite frankly, lazy reviews.

It makes you wonder what you're supposed to glean from those reviews when they're so "cookie cutter." I guess the [H] crew should go back to running 3DMark again as well since it tells us so much.

Honestly, apples to apples doesn't tell me anything other than who I should cheer for. :rolleyes:
 
And the other thing to consider is this: as ATI has proven with the whole CrossFire debacle, the possibility exists that these are very specifically cherry-picked samples sent from ATI. [H] gets them, and ATI fails to deliver on the retail end. With the 7800GT and GTX, everyone had the exact same cards in their hands on the same day.

I've said it before and I'll say it again: if they weren't still having production problems, we'd have X1800XTs coming out our assholes like we did with the 7800GTX. I'll still throw down good money on a repeat of the "Phantom Edition".
 
CrimandEvil said:
I guess the [H] crew should go back to running 3DMark again as well since it tells us so much.

Someone a few days ago posted a link showing how a 350MHz Pentium II box running a 9700 would knock around a 2.4GHz P4 running a 9600 in 3DMark 03. I haven't trusted that POS ever since 03, and I sure as hell don't have any confidence in 05.
 
Kyle, you failed me for the last time. It has become so abundantly clear reading your reviews over the last year who pays the bills for [H]ardOCP. Although the X1800XT won't be available for public purchase, your readers want benchmarks so they can decide if the XT is worth the wait. You stated the XT is nothing to get excited about? Reading the more objective reviews out there, it's clear ATI has a real winner on its hands, outperforming the GTX with fewer pipelines!!! You and Brent are full of $#%$!
 
I enjoyed the preview very much.

I like the fact that I have some time to build the capital I need to purchase the CrossFire motherboard and the X1800 XT.
I can't wait to see what two X1800 XTs (or even XT-PEs) will do for me!
 
DandyBear said:
Kyle, you failed me for the last time. It has become so abundantly clear reading your reviews over the last year who pays the bills for [H]ardOCP. Although the X1800XT won't be available for public purchase, your readers want benchmarks so they can decide if the XT is worth the wait. You stated the XT is nothing to get excited about? Reading the more objective reviews out there, it's clear ATI has a real winner on its hands, outperforming the GTX with fewer pipelines!!! You and Brent are full of $#%$!
Makes me laugh, really.
DandyBear said:
Kyle, you failed me for the last time.
For some reason, this reminds me of "Jabba, this is your last chance. Free us, or die!"

I don't know. I like ATi as much as anybody, but I'm looking at the other reviews and I'm not really seeing the GTX getting stomped, except in certain situations. I would say, from where I sit today, the XT is slightly faster than the GTX, but not by an overwhelming amount.
 
Perhaps I missed it, but what sort of core temperatures are the two reviewed cards seeing?
 
jebo_4jc said:
Makes me laugh, really.
For some reason, this reminds me of "Jabba, this is your last chance. Free us, or die!"

Lol, yeah, me too, although I wasn't referring to Return of the Jedi. I'm just fed up with the reviews provided here.
 
DandyBear said:
Kyle, you failed me for the last time. It has become so abundantly clear reading your reviews over the last year who pays the bills for [H]ardOCP. Although the X1800XT won't be available for public purchase, your readers want benchmarks so they can decide if the XT is worth the wait. You stated the XT is nothing to get excited about? Reading the more objective reviews out there, it's clear ATI has a real winner on its hands, outperforming the GTX with fewer pipelines!!! You and Brent are full of $#%$!
How did Kyle fail? I don't see it.

HardOCP reports real-world gaming performance results, not shitty synthetic benchmark scores. ;) And if anyone honestly thinks that a 16 pipe card will outperform a 24 pipe card by extreme leaps and bounds once you crank up resolution, HDR, AA, AF, etc., then you are out of your mind! I'm not saying the XT won't be faster than the GTX, I'm just saying no one should be getting their hopes up that it will be faster by a huge margin in real-world gaming performance. Of course, this might change with future driver optimizations (as NV has proven time and time again; they are the masters of doing just that).
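
Some quick napkin math on the pipes-vs-clocks point (assuming the reference clocks, roughly 625MHz core for the X1800XT and 430MHz for the 7800GTX):

16 pipes x 625MHz = ~10.0 billion pipe-cycles/sec
24 pipes x 430MHz = ~10.3 billion pipe-cycles/sec

Raw throughput comes out nearly identical; the XT makes up its pipe deficit almost entirely with clock speed, which is exactly why nobody should expect either card to run away from the other.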

I applaud ATI for finally bringing a product that has comparable features to the 6000/7000 series NV parts, but I flip them the bird (once again) for not having product ready to ship in quantity to consumers. :mad:

I'm staying on the green side of the fence this time around.:D
 
All that extra time (and still not available at launch), plus 625/1500 clocks, and it's basically only better than the 7800GTX sometimes, in DirectX games. It looks like it has potential with later games and possibly later driver revisions optimized for the new hardware.

NO AVAILABILITY.... not even pre-orders!
[image: liars.jpg]


But overall kinda disappointing.

Now, let's see if Nvidia counters with a souped-up 7800GTX... i.e. a 7850 Ultra... 512MB, 500/1500 clocks, etc.
 
chrisf6969 said:
Now, let's see if Nvidia counters with a souped-up 7800GTX... i.e. a 7850 Ultra... 512MB, 500/1500 clocks, etc.
Soon, I hope!
 
chrisf6969 said:
It looks like it has potential with later games and possibly later driver revisions optimized for the new hardware.
500/1500 clocks, etc.
I'm still waiting for non-beta drivers before judging performance...

not that I'm gonna do a lot of searching, but what was the difference between the beta drivers for the 7800GT and the newest ones? :p

and it seemed kinda weird how they put the games the ATI cards sucked at first, then in the middle the ones they were good at, then at the end the ones they sucked at again...

I can't take this emotional roller coaster :( *runs away crying*

To stay on topic, thanks for the review. I'm not gonna be upgrading... my X800 GTO2 will be fine for the next 2 years ;)

soulsaver
 
jebo_4jc said:
Does Brent really believe this:
If I had to choose between a 6800U and an X850XT, I would choose the XT... I realize the 68xx has SM3 and all, but that isn't that big of an advantage.

Technically I do, yes. SM 3.0, FP16 blending, FP32 precision in the pixel pipeline; it's a whole generation ahead of R420.
 
DandyBear said:
Kyle, you failed me for the last time. It has become so abundantly clear reading your reviews over the last year who pays the bills for [H]ardOCP. Although the X1800XT won't be available for public purchase, your readers want benchmarks so they can decide if the XT is worth the wait. You stated the XT is nothing to get excited about? Reading the more objective reviews out there, it's clear ATI has a real winner on its hands, outperforming the GTX with fewer pipelines!!! You and Brent are full of $#%$!

I would like some XT benchmarks as well, but asking for them politely beats insulting Kyle any day.

Also he never "stated the XT is nothing to be excited about".
He presented facts about the XT's availability though. If facts upset you, go play in your own reality.
And as far as being objective, I don't see any use of the word "I" or any clear-cut opinions in the preview.
It is clear that he is upset with ATI's empty promises, but he has not deemed their products to be bad; he has, however, condemned their business practices.
 
Arialis said:
Kyle, what about 2D video quality? Any noticeable differences between the two? And did you happen to check out video quality while playing any videos you may have lying around? I was looking at the VIVO that ATI has. What does Nvidia have to compare to it?

Honestly, I didn't try any video playback; time was a factor. I did, however, keep my eyes peeled to see if the new 10-bit DACs showed any improvement in 2D or 3D on my display. I honestly did not notice any differences in quality. It's cool they have it, and it could be a real good thing on other displays. I'm using an NEC MultiSync FE950+ CRT monitor for gaming, so there may be better displays out there that do show differences. There may be some LCDs that benefit as well with the better digital link, but most of the time things like that are subjective; one person simply may not notice any differences while another person does. I do like that they support the better dual-link and 10-bit analog interfaces.
 
jebo_4jc said:
I think you are really on to something with the way you boil the architecture discussion down to what it means to gamers. Sometimes I can get lost in all the babble about floating point precision and z compression and whatever else, and I would consider myself an above average [H] reader. I think a concentration on giving us the bottom line when it comes to this stuff is a really great idea, and deserves further exploring.

I can see in your articles how your brains are constantly working on how to improve your articles, and I think your efforts show in the quality of the work you have been putting out lately. Thanks for not sitting on your heels.

wow, thank you very much

I know how a lot of the real technical stuff can get boring; gamers usually just glance over it or skip it altogether, so my goal was to bring it back around and explain it in an easy-to-understand way, aimed squarely at what those features mean to the gamer. I'll make sure to expand on this.
 
Good review.

Months later, ATI has finally produced a competing card. Oh wait, I can't even buy it yet. Good job ATI :rolleyes: . I'm still using a 6800GT, but I think I will just go ahead and get a 7800 anyway, on principle. It really meant a lot to me that the day I read the review of the 7800 it was available on Newegg, and I'm not going to forget that. Let's just hope that the days of ATI screwing up are over and we finally have two good product lines to choose from again, but for this generation I'm staying Nvidia.
 
R1ckCa1n said:
I like how you guys stood strong on the X1800XT part. Was it more that you couldn't get one, or did you just stand by your morals?

Oh I have one alright, it's sitting on a table right here beside my desk.

We chose not to evaluate it in this review which we explained in the review.
 
DandyBear said:
Kyle, you failed me for the last time. It has become so abundantly clear reading your reviews over the last year who pays the bills for [H]ardOCP. Although the X1800XT won't be available for public purchase, your readers want benchmarks so they can decide if the XT is worth the wait. You stated the XT is nothing to get excited about? Reading the more objective reviews out there, it's clear ATI has a real winner on its hands, outperforming the GTX with fewer pipelines!!! You and Brent are full of $#%$!

Thanks, your opinion is noted. Please let me know where I can pick up my check as I seem to be missing it.
 
[Disclaimer: Now that I look back, I have only bought ATI cards, and there have only been two of them. One was a card from 1998 and the other is my X850XTPE AGP. But I went with what's best in my price bracket, and will continue to do so.]

What I liked: in the AA vs. AA screenshots it looks, to me at least, like the ATI card does better with 4x AA. And the 4x TAA, or whatever they call it, also looks better. Not as good as either card at 6x (where the screenies seem identical), but better in 4x vs. 4x.
That is what I like to see. When these cards are in our computers, there will only be a few months where we can max everything out and still run at 1680x1050, so we will have to make cutbacks. I usually do this by dropping AA down to the next level. Seeing how ATI looks better doing this (again, only my opinion) is nice. With future games this will be a must, or FPS will take a huge hit.

There comes a point where FPS (frames per second, not first person shooter) doesn't matter. I think this is around 50-60 FPS (again, one man's opinion). Then it gets down to image quality. People who just look at the fps-vs-fps numbers in games are probably the people who look only at 3DMark scores as well. I am not one of those people. I look at the fps to make sure the card can run at a decent level. Then it comes down to the screenies.

With ATI looking tops in the 4x screenies, I like it, and matching the 6x AA and all the other add-ons is nice. Doesn't mean I am getting either card (mine will hold me over for a good while), but I think that will be the future.

I think FPS will go the way of the MHz race. Other things matter here, like overall image quality in association with fps. Just worrying about FPS and 3DMark scores is foolish in my opinion.
 
jebo_4jc said:
Lol pwn3d

The dude did bring up a good point, though. Since the X1800 beats the 7800 with fewer pipes, that means ATi is getting more work done per pipe. Seems like the architecture has some real potential once they iron out the manufacturing process and enhance their drivers.


And you know what, we will sing their praises loud and clear when they EXECUTE on that. But until then, it is nothing but another idea in a PDF presentation IMO.
 
Really nice review there, Kyle. It's very informative, giving readers not only the pretty details, but the hard core facts behind all the pretty details. That's something a lot of reviewers miss altogether. As I read your review, I felt that there were two things that need to be worked on.
One was comparing similar resolutions between cards instead of choosing 1280x1024 for one and 1600x1200 for another. If you want to compare both resolutions, then do it with both cards.
Secondly, use a true competitor in your reviews. The X1800XL vs. the 7800GT-OC was not exactly a good match-up. I realize you have a finite amount of cards on hand to review with, but directly competing hardware is really where this review should have been... X1800XL vs. 7800GT. I also realize the OC version doesn't really OC a lot, but every little bit skews the results one way or the other.
I'm not a huge !!!!!! of ATi or Nvidia but I, like many others, was quite astonished that the new breed of ATi cards is doing nothing more than catching up instead of taking the next step up. It will be interesting to see where the official ATi drivers will take this new series of cards, as official drivers usually have added tweaks that help provide better performance over beta drivers.
I'm looking to get a new PCI-E video card in the next two weeks and I have been eyeing the 7800GT, but the X1800XL might sway my way of thinking once actual pricing, availability and final drivers are all worked out.
 
I'll be brutally honest - this preview is confusing to the point of being of no use to me, and it leaves a lot of questions in my head.

Why would you need to defend your decision on how to review a product? Why compare the X1800 XL to the 7800GT OC? Where's the vanilla card? Why no overclocking results of the XL?

Sure, the graphs are informative and indicative of real world performance, but where are some X1800 XT numbers so we can decide to wait or not? There are plenty of people who would like to know if it's even worth waiting for.

Also, where's the list of links to other reviews that you usually provide?
 
autobot said:
Really nice review there, Kyle. It's very informative, giving readers not only the pretty details, but the hard core facts behind all the pretty details. That's something a lot of reviewers miss altogether. As I read your review, I felt that there were two things that need to be worked on.
One was comparing similar resolutions between cards instead of choosing 1280x1024 for one and 1600x1200 for another. If you want to compare both resolutions, then do it with both cards.
Secondly, use a true competitor in your reviews. The X1800XL vs. the 7800GT-OC was not exactly a good match-up. I realize you have a finite amount of cards on hand to review with, but directly competing hardware is really where this review should have been... X1800XL vs. 7800GT. I also realize the OC version doesn't really OC a lot, but every little bit skews the results one way or the other.
I'm not a huge !!!!!! of ATi or Nvidia but I, like many others, was quite astonished that the new breed of ATi cards is doing nothing more than catching up instead of taking the next step up. It will be interesting to see where the official ATi drivers will take this new series of cards, as official drivers usually have added tweaks that help provide better performance over beta drivers.
I'm looking to get a new PCI-E video card in the next two weeks and I have been eyeing the 7800GT, but the X1800XL might sway my way of thinking once actual pricing, availability and final drivers are all worked out.

This has been addressed hundreds of times here over the years. If you want apples to apples benchmarks, I suggest you visit www.anandtech.com. We find their value to be lacking.

As for the use of the BFGTech 7800 GT OC, it is the best selling retail card in North America. IMO there is not a better card to compare to.
 
anakin357 said:
I'll be brutally honest - this preview is confusing to the point of being of no use to me, and it leaves a lot of questions in my head.

Why would you need to defend your decision on how to review a product? Why compare the X1800 XL to the 7800GT OC? Where's the vanilla card? Why no overclocking results of the XL?

Sure, the graphs are informative and indicative of real world performance, but where are some X1800 XT numbers so we can decide to wait or not? There are plenty of people who would like to know if it's even worth waiting for.

Also, where's the list of links to other reviews that you usually provide?

7800 GT OC = Best selling retail card in North America.

No OC numbers, as the samples we have are NOT retail samples and therefore may very well not reflect real-world OC results; therefore, we do not share them.

All of your other questions have been answered either on the front page at www.hardocp.com or in the article, should you have time to read it.
 
SgtSweatySac said:
I don't know about you, but this is exactly what I do at home. I don't just run regular 4xAA/8xAF on every single game I have, I vary between all kinds of different AA and AF modes, THEN mess around with MSAA/SSAA, Gamma-Correct AA, in addition to all the available in-game settings that can be fucked around with.

Actually, that's EXACTLY what you do - run 4xAA/8xAF (or whatever) in every game you have - ATI doesn't give you a choice.

No game profiles in the drivers!

Now, when I had nVidia cards, I did do exactly as you state - messed around with the FSAA and Aniso settings for each and every game, and each game ran them differently, etc.

ATI doesn't support that, though. You've got one setting for D3D, one for OpenGL, and that's it - every game gets the same settings. And since I'm not going to spend 15 minutes opening the glacial CCC to change the FSAA/Aniso settings between each game, that's what it's stuck at. Using those settings on an ATI card is all about finding the "lowest common denominator" - the best setting you can use that is playable on every single game. (Which makes 6xAA and 16xAniso pretty useless - what card can run FEAR at 1600x1200 with those settings?)
 