SemiAccurate at it again LULZ

GTX 480 should easily be 30% faster than the 5870.

20% clock speed miss? Oh wait, what are the final GTX 480 clocks?
Only 448 shaders? Oh wait, how many shaders does the GTX 480 have?
1.7% yields? Oh wait, you just can't translate Japanese, Charlie. Don't worry, Charlie, no one is going to hear you raging in your mom's basement.

Seriously why do you guys believe this guy again?

He has not been right about anything... even the TDP, which is 250W for the GTX 480. Oh wait, he has been right about some things, like G2xx cards going EOL. But as if that requires any effort; I'm sure his pipeline into TSMC will tell him that much. As for actual hard info on Fermi that only nVidia personnel know? Nada, nothing, null, zip.

Well, at least Charlie got the temperature sort of right. But wait, he said Fermi would hit 70C at idle, right? Never mind, he screwed up on that one too.
 
4 FOR ME!

Honestly, CD just doesn't impress me. Yeah, he's got a lot of facts, but they are all for the purpose of trashing nVidia. He's great at using the truth to tell a lie. I REFUSE to read his stuff anymore.
 

Adjustments were being made on this card for months. It's impossible to predict exact numbers. I'm sure changes were being made on this card even in February. So if you had inside info in January, those same figures can change by February, March and just before launch. Even the pro-Nvidia sites didn't know what was going on and started to doubt Fermi themselves.
 
I will be extremely surprised if Fermi ends up as good as it was supposed to be, considering how late it is and how much negative hype it has attracted over the last 6 months.
 
He's usually right. NV fanbois just hate him because he's mostly right, especially about the Fermi fiasco. And it is a fiasco. rofl :p I love that guy...

nVidia fans will always want him to be wrong,
and ATI fans always want him to be right so they can continue bashing Fermi in every single Fermi thread. It's almost like some of you guys enjoy seeing Fermi fail.

He's the guy you'll either hate or love :p

Either way, he's not some spy inside nVidia or anything like that. He just makes predictions and fuels them with hatred.


Don't even compare him to Kyle. Kyle never presents his predictions as facts, and if Kyle makes a guess, he makes it clear that he's just predicting something.
 
So we are supposed to roll everything back to DX9 to make Fermi look better now? Sad, nVidia, sad. Too bad SemiAccurate is spot on. Irony is a funny thing.

These are DX11 parts; if we have to cripple everything to give nVidia a fighting chance, then what is the point of Fermi?

It's one fucking demo, troll. Jesus, put it into perspective and stop whining. And IT'S AN APPLICATION BUG... it doesn't run in DX11 on Fermi because of some clearly broken detection programmed into the demo.

It's simple logic: if you want a direct comparison, use the retail game where this issue doesn't exist, or just roll back to DX9 as said. Or you can feel free to compare DX9 to DX11 and look like a total idiot. I guess you think that's a viable option?
 
Going to be interesting to see the outcome.

lol, very!!

No matter which camp you are in, you have to hope that Fermi is a really good card. For the people in the ATI camp, it means prices should be reduced and you can get some good deals. For the nVidia camp, you can cheer about being on top again and then purchase a reduced-price GTX 280 or 260. :)
 
Regardless, 87C is ridiculous.

Yeah, that's shocking: 3 degrees more than a 4870 and less than an 8800GT :D

 
And...

Temperatures don't really change. They just push new designs to the same thresholds.
 
BlueStorm, the issue is that a benchmark was published which now appears to be biased, and the people who performed this benchmark are supposed to be experts. Do you seriously expect them not to know?

So, for anyone calling Charlie biased or a "liar" and so on: I do not hear any comment on the practices nVidia is employing. Are they "honest" practices? If people are ignoring this, then the question stands: who is truly not biased?

Let's just say Fermi is a good polarizer.
 
Oh yeah, real fiasco...

Read the last part of this thread: http://hardforum.com/showthread.php?t=1505709 Apparently it sucked so bad they had to raise the MSRP ;).


They raised the MSRP because so many of them are burning up at the factory that they have to compensate for the losses. :p

I know I'm totally on the red team, but these Fermi cards... Even if you've been jonesing for a Fermi, I'd seriously stay away from them. Huge die size, rumors of the chips burning up, high heat and power draw. nVidia has to release something to save some face, even if it's at the expense of a few thousand suckers who get in on this early. I foresee nothing but problems with these cards. If you're gonna take a chance on one, best have your receipt ready for the RMA. ;)
 
He is a moron. Not sure why you guys even pay attention; then again, I guess I do know why...

Funny how you put him down so fast. When he is right, will you be back here to eat your words and apologize? Since he, like many others, has been right about most of Fermi so far.
 
I don't give a $#)@ about ATi, but you Nvidiots are even worse than CraApple idiots.
 
How can he be right when Fermi has not launched - unless you've made up your mind to believe propaganda?
:rolleyes:
 
Eeh, my ol' X1900XT was hitting 90C and it still managed to survive.
What's worrying is that these Fermis are on the new (for nVidia, anyway) 40nm process, yet run at those temperatures; that cannot be a good sign.

Hopefully the performance compensates for it.

It's harder to manage temperatures at smaller process nodes. You hope that you have enough reduction in power to offset the slower dissipation of heat. Imagine the increased challenge of displacing 180W of thermal energy from a chip the size of a nickel or quarter compared to displacing 180W from a chip the size of a silver dollar. With Fermi, you're roughly doubling the thermal density while reducing surface area. Unless you suddenly invent a material with far higher thermal conductivity, you're going to get a rise in temperature at maximum operation. Of course, if your environment is cool enough, and you have a material with a high enough specific heat and adequate conductivity, you'll probably be able to manage. But it looks like conductivity was important enough for nVidia to use direct-contact heatpipes.
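A quick back-of-the-envelope sketch of the coin analogy above. The 180W figure is the one used in the paragraph; the coin diameters are standard US coin sizes, and the whole thing is only illustrative (real dies are rectangles, and heat flux isn't uniform across them):

```python
import math

# Heat flux for the same 180W load spread over discs of different sizes.
# Diameters are standard US coin figures in millimetres.

def heat_flux(watts, diameter_mm):
    """Return heat flux in W/mm^2 for power spread over a disc."""
    area_mm2 = math.pi * (diameter_mm / 2) ** 2
    return watts / area_mm2

coins = {"nickel": 21.21, "quarter": 24.26, "silver dollar": 38.1}

for name, diameter in coins.items():
    print(f"180W over a {name}-sized die: {heat_flux(180, diameter):.3f} W/mm^2")
```

The same wattage over a nickel-sized area works out to roughly 3x the heat flux of a silver-dollar-sized area, which is the point: shrink the die without shrinking the power budget and the cooler has to pull a lot more heat through a lot less surface.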
 
Here is what Charlie had to say about the G80
http://www.theinquirer.net/inquirer/news/1044075/nvidia-g80-mystery-starts-thickening

In any case, the G80 is shaping up to be a patchy part. In some things, it may absolutely scream, but fall flat in others. The architecture is looking to be quite different from anything else out there, but again, that may not be a compliment. In either case, before you spend $1000 on two of these beasts, it may be prudent to wait for R600 numbers.

Anyone who waited for the R600 probably barfed up a lung. :D

Here is the list of many, many claims he has made about (and changed about) the GTX4xx
http://forums.anandtech.com/showthread.php?t=2061780

This one is my favorite:
Nvidia's Fermi GTX480 is broken and unfixable (February 17, 2010)

What a dbag.
 
Still, he was right on the heat, power consumption, and reduction in CUDA cores, so at least give him credit for that. IMO, publications that follow nVidia propaganda and just complacently print what they have been told without question are the dbags.
 

Hasn't he been right on yield problems, power issues, die size, delayed launches, and so on?

So you're saying Kyle is wrong as well, then, since he also predicted all this stuff happening?
 

What did Kyle predict, and why is his opinion more or less valid than anyone else's? I'll say that Kyle is a lot more careful, but then again, he doesn't really make any predictions. Kyle was also the one who got all gooey-eyed-excited at the first GF100 unveil on January 17, posting on Jan 16th around midnight something like "It changes the game, it is really quite exciting." The article goes on to mention out-of-order triangle setup & execution and higher tessellation performance, with a very short and uninspiring discussion of its impact on games, and concludes by saying that he was disconcerted that nVidia didn't reveal any performance specifications for the part, but was impressed that the demos he saw were "running quickly". He sensationalizes, but with greater restraint.
 

I think we expect biased benchmarks from ATI and nVidia. That's why we wait for official reviews and benchmarks.

Just consider the benchmarks to be true, from a certain point of view.
 

My question, which I can't find the answer to, is: was the Dirt 2 test done on water cooling (like in the first video we saw from nVidia, where they walk us around their offices for a bit), or was it on air?
 
If it had been water cooling and you still got 87 degrees, the card would burn up using the stock cooler under FurMark. I still remember the overtemp issue with the GTX 280 (the initial 65nm) when people were running FurMark.
 

It's a big TDP difference between ATI and nVidia this gen. I guess people will ignore all of the signs just to say they have one of these cards.
Both companies are using the same 40nm process, but the cards are 50-60W of TDP apart, six months late, and more expensive.
IMO, nVidia is hurting the GPU market by overpricing these cards. ATI will not cut prices because of this.
 
Someone needs to make a checklist of all the things Charlie said about the GF100 and see how many things he was wrong about and how many he was right about. I have a feeling the correct column will have a lot more checks.

Why does everyone love to rip him? Do I agree with everything he says? No, but he has a good point most of the time.

Also, I'm not a hardcore ATI guy; the last ATI card I had before my 4870 was a 9800 Pro. There have been a lot of nVidia cards in my systems (6600, 7800, 8800).

But for the most part it seems like nVidia really screwed up the GF100, and he has called them out on it. If they didn't really screw up, wouldn't the card have been released in Sept or Oct?

Usually when things get delayed that long, it isn't good. They have to release something to the market, so we are getting the 480/470. It isn't like they can rebrand the GT200 with DX11 (I bet they thought about it). How long before the GF100 gets replaced with the GF200? These cards will be EOL by Sept of this year.


No, they rebranded the GT 210/220/240 as the 310/320/340, with DX9/10/10.1 support instead of the DX9/10-only support in the 2xx series.
 

Good point... but if you compare the change from 55 to 40nm for ATI, and the temp difference while running slightly faster clocks, you would expect the same from nVidia. In this case it increased because they tried to stuff as much friggin' crap into the GPU as possible. Makes you wonder how ATI was able to get the performance they got while dropping the temps and the cost of production, while nVidia, on the other hand, is stuffing all this crap into a GPU to get what, maybe 10% more performance than an HD5970 at best?
 

Have you read Anand's article on the history of the ATI chips? It explains what they did right and hints at things that nVidia didn't do.

http://www.anandtech.com/video/showdoc.aspx?i=3740

And this article showed parts that Charlie had mentioned A LONG time ago.
 

I know what has been tested at my workbench all week :p


. . . and I just got my HD 5870 PowerColor PCS+ .. for Professional Cooling Solution.
It is AMD's way to minimize the new GTX launch's impact.

- I am eager to see how far this hyper-overclocker goes
--- I will have new results next week (part 2 of my testing) and also CrossFire benches
 
Maybe that just shows that Charlie == Anandtech.

It's really difficult to detect tone on the boards lately, well, at least for me. That said, I think AnandTech has been a pretty solid site for quite a few years with regard to their theories and their testing methods. Obviously, if your comment was meant in jest, I apologize and now look like an idiot. Either way, carry on. ;)
 

I had to RMA three 8800GTs that seemed to die due to heat. In case you don't remember, the cooler on the 8800GT was later upgraded due to thermal problems.
 