misrepresentation of the Fermi offerings

The problem with sites that simply bunch a lot of data into one review is that they don't actually rebench all those cards. As was mentioned, a lot of reviews were using outdated drivers for the 5870 (9.12... some even using 8.66, wtf??). There was a major revision going from 9 -> 10.

Also, is it really useful to see the 200 fps graphs in techpowerup? It's amusing, sure... but ultimately not useful at all.

This is true, and sometimes the data were even taken using the cutscenes.
 
The problem with sites that simply bunch a lot of data into one review is that they don't actually rebench all those cards. As was mentioned, a lot of reviews were using outdated drivers for the 5870 (9.12... some even using 8.66, wtf??). Now why would you ever do that... unless of course you didn't actually rebench the 5870 with newer drivers and simply reused old numbers. There was a major revision going from 9 -> 10.

Also, is it really useful to see the 200 fps graphs in techpowerup? It's amusing, sure... but ultimately not useful at all.

This is why I'm such a fan of the Hardware Canucks reviews: they re-bench everything (10.3 drivers for ATI) *and* provide the mass of data you want when deciding on a purchase like this.
 
This is why I'm such a fan of the Hardware Canucks reviews: they re-bench everything (10.3 drivers for ATI) *and* provide the mass of data you want when deciding on a purchase like this.

Do they use real gameplay data or just some benchmarking tool?
 
Do they use real gameplay data or just some benchmarking tool?

A combo of both: some manual run-throughs as well as *custom* timedemos (they do not use in-game canned benchmark tools; as they have stated, Far Cry 2, for example, is heavily optimized for the standard "Ranch" bench included with the game). This pretty much cuts out any driver silliness of optimizations for specific game segments.
 
If you have an issue with our format, I can appreciate that; it is NOT for all hardware enthusiasts. That said, you would likely have a better experience elsewhere, praising other reviewers. I am not changing the way we do things, because I truly believe in our methodology. You can get what you seek elsewhere. I am not going cookie-cutter anytime soon; you can get that anywhere. I sleep well at night with my decisions, which are weighed by millions of readers every month. I think we give great purchasing advice and I am proud of it.
Posted via [H] Mobile Device
 
Except it really isn't scientific and objective when you tell the companies making the two products the exact test you will be doing months before you review the product, which is essentially what canned benchmarks do. Hell, anandtech even benchmarked a damn cutscene in a review. How does a cutscene reflect the performance of playing the game?

The problem isn't whether or not it is subjective or objective, the problem is whether or not it is *accurate*. And canned benchmarks simply aren't accurate in what they test. Crysis's canned benchmarks, for example, have almost no relevance to playing the actual game. In the game you don't lazily fly around in the air, so why does it benchmark that? If a card gets a good score flying through the air in Crysis, does that mean it can handle explosions, shooting, up close textures/bump mapping, etc...? No, it doesn't, it just means it can fly through the air - which tells me nothing since that isn't what I do in the game.

So really, your graph would be more accurate if the "typical top class review" were well below the usefulness line - because they don't tell you how the cards will play games at all. They only tell you how well a card benchmarks, which isn't useful at all. Remember, many of those "typical top class reviews" showed the 2900XT as being a clear winner - the card benchmarked faster than the 8800GTX after all. [H] even got bashed for saying that the 2900XT wasn't any good, and people ragged on [H]'s testing methodology, saying that it was flawed and wrong - just like you are now. Guess who was right? [H] was. We don't even think about it anymore; everyone knows the 2900XT was a flop.

Anandtech called the 2900xt a piece of crap:
http://www.anandtech.com/video/showdoc.aspx?i=2988&p=31

So stop making up stories. All I'm saying is you're trying to discredit other reviewers (who are prominent) that have been accurate and impartial (Kyle has been as well) to further a point you do not have.
 
yeah see I don't mind that sort of mixing in reviews.

I am going to get a 480 because now I know that its utility isn't undermined by underwhelming game performance. The cards are going to be used for more and more things, and people are wanting to learn.

The nVidia stuff is quite accessible for people enthusiastic enough. Yes, I understand the heat issue and everything; that's why I'm going to watercool it. I thought watercooling 2 GTX 280s on a PA120.3 was going to be overkill, but that 40 degrees at load was an absolute life saver.

I am happy I bought a big radiator as overkill, since it will be a good reason to stay with WC for Fermi. I was actually going to go back to air, but once I realized that it's just new blocks, I figured keeping heat/noise to a minimum would be the best of all worlds. Agree?

What exactly are you intending to use it for? I can't think of any real practical applications for myself right now for using Fermi for CUDA/GPGPU, unless you are big on folding, etc.
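(For anyone wondering what "Fermi for CUDA/GPGPU" actually looks like in code, here is a minimal illustrative sketch, assuming PyCUDA and a CUDA-capable GPU are available; the kernel and names here are generic examples, not folding or any specific application discussed in this thread.)

```python
# Minimal, illustrative PyCUDA sketch: scale an array on the GPU.
# Assumes PyCUDA is installed and a CUDA-capable card is present.
import numpy as np
import pycuda.autoinit              # creates a CUDA context on the default GPU
import pycuda.driver as drv
from pycuda.compiler import SourceModule

# A trivial data-parallel kernel: each GPU thread scales one element.
mod = SourceModule("""
__global__ void scale(float *a, float s, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        a[i] *= s;
}
""")
scale = mod.get_function("scale")

a = np.random.randn(1024).astype(np.float32)
expected = a * 2.0

# drv.InOut copies the array to the GPU, runs the kernel, and copies it back.
scale(drv.InOut(a), np.float32(2.0), np.int32(a.size),
      block=(256, 1, 1), grid=(4, 1))

assert np.allclose(a, expected)
```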
 
Yes, at a resolution (2560x1600) that nVidia says currently has outstanding pre-release driver issues... ;) You complain about some sites using 9.12 for the ATI cards, but magically forget about it when it's a loss for nVidia due to drivers, eh? Nice double standard!

Your short term memory is atrocious. I never said ANYTHING of the sort.

In any case, the reviews I linked are apples-to-apples comparisons, not various settings that drastically affect performance pitted against each other at a resolution nVidia confirms is bugged since the cards aren't out yet. If you don't like the HWC one, why not simply read the FiringSquad one, which does max the settings? Oh, right, because it doesn't match your desire to twist the facts.

Sure, sure, drivers are going to make everything better. We've all heard that claim before. It rarely ever becomes reality. And I'm not "twisting the facts"; I've provided proof of my claims. If you disagree with the results, take it up with Brent - they're his results.
 
Sometimes I love all these threads when new cards come out. Remember, fanboys are a good thing fellas - we can slug it out ATI vs. Nvidia or PC vs. Mac, etc. over the internet instead of going to war and killing each other over ideology which is what people used to do based on the same primal instincts.
 
Anandtech called the 2900xt a piece of crap:
http://www.anandtech.com/video/showdoc.aspx?i=2988&p=31

So stop making up stories. All I'm saying is you're trying to discredit other reviewers (who are prominent) that have been accurate and impartial (Kyle has been as well) to further a point you do not have.

rofl, what?

Despite the delays, despite the quirks, and despite the lack of performance leadership, AMD has built a good part.

How on earth did you get from that to "piece of crap"?

Did you even bother reading what you just linked?
 
In the end the user decides. I am still stuck on Nvidia, but I am waiting till next gen for better cooling and more performance. Seems like a rush job to me. I am going to get a better CPU now.
 
The card had a 512-bit bus, you goof. It showed significant amounts of promise. The architecture was theoretically sound.
Posted via [H] Mobile Device

And yet, as both you and I have said, it was a piece of crap. And yet, Anandtech called it GOOD in their REVIEW. HardOCP called it SHIT. Who was right? HardOCP. Who was wrong? Anandtech.

EDIT: But thanks for digging up the proof for me, that was real kind of you to help prove my point.
 
uh not really. Apples and oranges. ATi scrapped that design and reworked it. I'm not going to argue with a child anymore.

Go take a processor design course before you try and lecture me on how things work. If that was an nVidia offering with UDA then it wouldn't be useless since that theoretical bandwidth could be exploited via caching.

UDA simply means they put all the drivers into a single installer. It has *nothing* to do with performance or caching.

And of course it's "apples and oranges"; you just completely changed the subject.
 
uh not really. Apples and oranges. ATi scrapped that design and reworked it. I'm not going to argue with a child anymore.

Go take a processor design course before you try and lecture me on how things work. If that was an nVidia offering with UDA then it wouldn't be useless since that theoretical bandwidth could be exploited via caching.


Ever heard the saying "better to shut your mouth and be thought a fool than to open it and remove all doubt"? Yeah, you just went with the latter.

http://www.nvidia.com/object/feature_uda.html

Go get a clue now, then feel free to come back and talk with the grownups.
 
uh not really. Apples and oranges. ATi scrapped that design and reworked it. I'm not going to argue with a child anymore.

Go take a processor design course before you try and lecture me on how things work. If that was an nVidia offering with UDA then it wouldn't be useless since that theoretical bandwidth could be exploited via caching.

WTF does that have to do with anything? WTF does the UDA have to do with theoretical bandwidth?
 
My head is going to blow up. Are you so stupid as to think you can just compile code optimized for a certain architecture on another and have it work just as well? Get a fucking clue before you really start pissing me off. You think you have some sort of education; go back to the drawing board.
It's only hardware. Buy what you want before you pop an aneurysm.
 
I hope everyone who is defending the Fermi reviews from other sites has their card on April 12th. No excuses about it being sold out, price gouging, lack of funds, "canceled my order," etc. When you have yours within a week of the 12th, make sure you all post pics. I'm sure some of you still won't pull the trigger. When it's time to put the money up, most of you will say you're waiting for some other future card yet to be announced by Nvidia to avoid embarrassment, i.e. the 512-core version. Waiting for that version is an admission that this current model doesn't have the goods.
 
Right, and I take advantage of what you've established. I mentioned Anandtech, but you can replace that with any of your competitors. My disagreement is with your review methodology; that is different from saying you are not a smart guy, or that you run a poor website, etc. I think we all appreciate what you've provided for us, and even those who look to other sites for the kind of info you do not provide come to Hard for its unique editorials, for the personal feel of the articles. I think it's good that you're attempting something different. Thanks for the GTX 480 review.

I think how they review is that they push to the limits of the actual card, since anything below that results in gameplay that is more than good enough that you cannot tell the difference between cards. One thing about [H]ard is that they provide links to other reviews, so you have multiple sources.
 
On what basis? May I ask you where you've been educated? Obviously you are not up to speed, you just think you are.

You can't speak of what you don't know, yet you act like you can come in here and ridicule me? I highly suggest you go get a modern processor design book and read it.

You cannot write a "unified driver architecture" without hardware compliance. I do not understand where in your fat-loyalist head this fails to register.

Do you see this or are you still pretending you know what it is?

OK. Throughout your entire thread you have claimed that Nvidia is completely dominant, yet you did not provide any links to back up your statement. API scaling doesn't mean shit to me except for SLI, which will cost you a pretty penny, and most games do not use proprietary standards anyway. PhysX and 3D Vision are the only applications where I can see usefulness in gaming aside from surround gaming, but the fact that ATI is supporting open standards for the exact same things leads me to believe open standards will win out. And before you start the whole fanboy thing, please know I am running both Nvidia and ATI in the same rig. This statement is straight from the [H]ard review:

Starting from the bottom up, we would say the least relevant video card is the GeForce GTX 470. In all our gameplay testing today, not once did the GeForce GTX 470 provide a superior gameplay experience compared to the Radeon HD 5850, even in Metro 2033. In fact, performance was very close between both video cards, and in some cases the Radeon HD 5850 proved to provide faster framerates. Looking strictly at performance, these video cards are equal.

And the 5850 is $100 less.
 
Isn't McCartney a self-declared nvidia fanboy? I recall he was temp-banned on one occasion.

Just ignore him and move on.

edit: what I really, really love about new product reviews on [H] is how easy it is to spot fanboys of either side in the discussion that follows. It's amusing to see quite a few members here going from "Yea I'll wait for the [H] review, as always", to questioning the review methodology and cherry-picking benchmarks from other reviews.
 
edit: what I really, really love about new product reviews on [H] is how easy it is to spot fanboys of either side in the discussion that follows. It's amusing to see quite a few members here going from "Yea I'll wait for the [H] review, as always", to questioning the review methodology and cherry-picking benchmarks from other reviews.

Reviews always bring out the best in us all! ;):D
 
I am not a fan of HardOCP's subjective reviews either. TechPowerUp has an amazing amount of data in their review of Fermi. It's good that we have both: a larger dataset allows you to extrapolate where your system would fit in (along with other useful charts like performance per watt, etc.), while I read [H] really for the editorial content, because I can't take anything useful from the "maximum playability" graphs. It's probably a manpower thing. If they had the time to apply those to every resolution, it would be a much more useful review.

Again, it's good to have comprehensive sites with regular reviewing methods and also good to have [H] because I enjoy the editorials.

TechPowerUP has some of the least informed reviews I have ever read, lol. Pages of canned benchmarks with no analysis are useless these days and don't lead to informed decisions.
 
TechPowerUP has some of the least informed reviews I have ever read, lol. Pages of canned benchmarks with no analysis are useless these days and don't lead to informed decisions.

While I have a great deal of respect for W1zzard's work on his tools/utilities and his technical knowledge, and I applaud the presentation of so much data, I do agree that without context such as minimum FPS numbers and at least a little analysis, it becomes far less useful. I pretty much glossed over the TPU review this round, unfortunately.
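(As an aside, here is a minimal sketch of the kind of frame-time analysis being asked for: turning a log of per-frame times into average FPS, minimum FPS, and 1% lows. The numbers are made up for illustration; this is not data from any review mentioned in this thread.)

```python
# Illustrative only: summarize a frame-time log (milliseconds per frame)
# into the numbers reviewers argue about. The data below is made up.
frame_times_ms = [16.7, 15.9, 17.2, 41.0, 16.5, 18.3, 16.9, 35.2, 16.4, 17.0]

def summarize(times_ms):
    avg_fps = len(times_ms) * 1000.0 / sum(times_ms)   # overall average FPS
    min_fps = 1000.0 / max(times_ms)                   # worst single frame
    # "1% low": average FPS over the slowest 1% of frames
    # (with this tiny sample that is just the single worst frame).
    worst = sorted(times_ms, reverse=True)
    n = max(1, len(worst) // 100)
    one_pct_low = n * 1000.0 / sum(worst[:n])
    return avg_fps, min_fps, one_pct_low

avg_fps, min_fps, one_pct_low = summarize(frame_times_ms)
print(f"avg {avg_fps:.1f} fps, min {min_fps:.1f} fps, 1% low {one_pct_low:.1f} fps")
```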
 
To wrap things up, let’s start with the obvious: NVIDIA has reclaimed their crown – they have the fastest single-GPU card. The GTX 480 is between 10 and 15% faster than the Radeon 5870 depending on the resolution, giving it a comfortable lead over AMD’s best single-GPU card.

I stopped reading here. Whoever wrote that is imposing an importance on having the fastest single-GPU card, and there is none; all that matters is price/speed/power/heat/noise/reliability. How the card works under the hood is completely irrelevant; it's like saying that AMD has the performance crown for cards with only 1GB of vRAM... why the hell does that matter? We don't buy video cards based on the amount of memory, nor do we care how many GPUs they have.

AMD still has the single fastest card, no matter what butthurt, brand-loyal nutcases argue.
 
To wrap things up, let’s start with the obvious: NVIDIA has reclaimed their crown – they have the fastest single-GPU card. The GTX 480 is between 10 and 15% faster than the Radeon 5870 depending on the resolution, giving it a comfortable lead over AMD’s best single-GPU card.

I stopped reading here..

I felt the same way. Reading that opening statement immediately threw any sense of objectivity out the window for me.

I thought [H] acted very respectably by giving GTX 480 SLI a silver award. They didn't take cheap shots at Nvidia; instead the article presented the positives without sugar-coating. I felt like the review did a very good job of showing just how incomplete the Fermi story is. Other sites are posting information that will surely change a dozen times in the next 30 days and are merely feeding the frenzy Nvidia is attempting to whip up.

I know the next article I look forward to reading will be from Kyle and Brent. [H]ardOCP did a very good job of presenting an initial reaction without condemning or idolizing Nvidia. There are still two weeks before cards are available, and I believe the information that [H]ardOCP provides between now and then will be of considerable value.

Keep it up, it is appreciated.
 