ATI : Return of the king : Nvidia in trouble

"The GT200b will be out in late summer or early fall, instantly obsoleting the GT200. Anyone buying the 65nm version will end up with a lemon, a slow, hot and expensive lemon. Kind of like the 5800. It would suck for NV if word of this got out. Ooops, sorry."

OH yes! My dream has come true. NV Fails.

Never mind that the rumors of a 55nm variant are just that, and that nVidia's CEO himself has already made a couple of comments indicating that they considered a 55nm GTX 280 but that it actually proved to not be the optimum solution and that die shrinks do not necessarily reduce heat/improve performance. What I glean from nVidia's comments is that they're probably going to wait until they can shrink the die below 55nm before they refresh the GTX 260 and GTX 280, in much the same manner they skipped GDDR4. Sure, they could just be trying to cover their asses to ensure people buy the GTX 260 and GTX 280 instead of waiting, but a 55nm refresh would target the exact same audience, so...

And to the person who was commenting that "nVidia cards = too big, too power hungry, too hot, SO I'll just go for two RV770 XTs in Crossfire..." Uh... yeah... wtf? The HD 4870's TDP is not going to be half that of the GTX 280's, nor will its power consumption be half that of the GTX 280's, and the way things are shaping up right now it very much sounds like GTX 280 > 2x HD 4870 in Crossfire, and 2x HD 4870 1024MB = potentially $200 more than a single GTX 280. So pardon me if I don't see the advantage in buying two cards that, individually, consume less power and output less heat than another individual card but that, combined, consume more power, output more heat, perform worse, are more expensive, and make me deal with Crossfire crap (not that SLI is any different there).
 
Ah here we go, another piece of responsible journalism from our good friend Charlie:
As bad as any example you will find on theinq is, it's nothing compared to the stuff he trolls on Aces and other web boards. He is so full of himself and even somehow believes the propaganda he invents. He's a grade A moron.
 
It's time for AMD and/or nVidia to leak some benchmarks. By now they should know who is in the driver's seat, and the winner should be more than happy to pound their chest.

This is probably the reality of it:

1) The GTX 280 will be the single fastest single core GPU by a good bit.
2) The 4870 will be good, but not as fast.
3) The 4870 X2 will be faster than the GTX 280 in situations where games scale well with multiple GPUs, and probably slower where they don't.

It's that simple. Charlie is using the truth to tell a lie. His facts probably aren't all that far off, but when it comes down to game time, people are going to want to pay the money for the GTX 280 because it will be the fastest single card. He is raising issues about heat and quality, and that may prove to be a problem; I don't see two 4870 X2s cooling the house either. If nVidia has a supply problem, that could prove to be bad, very, very bad. Coming into the next gen without enough supply while your competitor can pump them out has been an Achilles heel for ATI/AMD and hurt them badly. nVidia had BETTER have the chips or it will kill them this go-around, because I don't think AMD will have that problem.
 
Dunno if I would say "always been right," but I at least always try to be ambiguous. ;) But no, our track record ain't that bad. :D
 
If you ask me, that article was just poorly written. Get your facts right.

I'm not taking sides or whatever, but I think it's too early to judge. We haven't seen the new GT200 and RV770 in action yet, so why jump to conclusions this early?
 
If what Charlie said about NV's yields is true, it's not going to matter how much NV mops the floor with AMD; AMD's market share is going to grow as long as the 4870 performs marginally better than a 9800.

And I think that's what AMD should be focusing on right now. Gain back market share and shake this negative public perception they've developed over the last year or two between R600 and Phenom.
 
I for one am going the sane route and getting 2x RV770 XTs when they're out (unless, of course, they turn out to be junk). I like being an early adopter and all, and have some money to spare, but I honestly can't see spending $600+ on a card that games don't take advantage of, and probably won't for some time.

2 RV770 XTs will cost you more than a single GTX 280 (1 RV770 XT is rumored to cost $349, while 1 GTX 280 is rumored to cost $649 at most) and, if the rumors are true, will perform worse and consume more power. So I don't see much sense in saying what you did. You are better off waiting for real reviews before pulling the trigger, or you may be in for a big disappointment.
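
For what it's worth, the arithmetic on those rumored prices (which could of course change by launch) works out like this:

2 x $349 = $698 for the pair of RV770 XTs
1 x $649 = $649 for a single GTX 280, at the top end of the rumors

So the Crossfire pair comes out roughly $50 more expensive before you even get to the power and driver question.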
 
Ah here we go, another piece of responsible journalism from our good friend Charlie:

Couldn't resist. Moral of the story: do not put too much stock in what this douche bag has to say. He clearly has a chip on his shoulder; maybe nVidia ran over his childhood pet and this is his way of getting revenge. :D

Not to jump to the INQ's defense, but you have to go back in time and understand the context of what he wrote. Back then you had a massive disinformation campaign by NV that lulled us all into believing that the G80 would be just a so-so product. I mean, a few months before this you had David Kirk saying that a unified architecture is not the way to go. Then out comes the G80, all unified and kicking ass. That, and nobody expected the R600 to lay as big of a goose egg as it did. Had ATI upped the number of texture units and had better AA, things could have turned out differently. But that was the past.

I think Charlie posts everything he hears. Some of it is close to being spot on, some is not. And it's still kind of fun to read as we wait for the products to launch :)
 
"actually proved to not be the optimum solution and that die shrinks do not necessarily reduce heat/improve performance"

Since when did Nvidia start using AMD's excuses???


AMD has already said that their new 4XXX cards are mainstream (meaning not so hot),

so why is Nvidia making stupid statements now, even before the new cards are out?
 
Fanboyism destroys credibility, especially when it's based on the Inquirer.
 
Charlie is off his meds again. Coming from him and the Inq and based on his past articles... he might just as well have said that aliens visited him last night and gave him a ride in their spaceship.
 
Not to jump to the INQ's defense, but you have to go back in time and understand the context of what he wrote. Back then you had a massive disinformation campaign by NV that lulled us all into believing that the G80 would be just a so-so product. I mean, a few months before this you had David Kirk saying that a unified architecture is not the way to go. Then out comes the G80, all unified and kicking ass. That, and nobody expected the R600 to lay as big of a goose egg as it did. Had ATI upped the number of texture units and had better AA, things could have turned out differently. But that was the past.

I think Charlie posts everything he hears. Some of it is close to being spot on, some is not. And it's still kind of fun to read as we wait for the products to launch :)

That's pretty much it. Everyone in the forums was like "wtf?" about the 384-bit bus / 768MB memory rumor shortly before G80 came out, and his sentiments about it were no different.

When looking at the history of things, any kind of history, you gotta put things in perspective. If you only look at the surface, you can take things way out of context, whether you mean to or not.

EDIT: His comment about R600 annihilating in DX10 had no basis in that particular post. Then again, I may be taking that out of context, since I'm not aware of the information that was circulating around at that timeframe.
 
It makes sense, what ATI is doing: developing a great price-to-performance chip, then putting two of them together for the high end. This way, they can develop their tech and new processes faster; instead of taking a year to build a whole new "GPU of all GPUs", they spend six months shrinking their process and improving it for a new, leaner GPU. The HD3 series was released about six months after the 2900, and the HD4 series will be released about six months after the HD3 series. I wouldn't be surprised if we get an improved, 45nm HD5 series in December/January.

And since ATI is making an awesome price-performance chip that has low power, two of them should be able to take on Nvidia's high-end chip. And if it beats it, Nvidia can't do anything, because they're using a process that's too big and hot for a "two-GPU" card. So by the time Nvidia shrinks down their GT200 enough to put two on the same board (six or so months), ATI will have a new refresh of cards.

Anyway, that's the way I think (and hope) it will pan out.
 
The fact that the die is huge is neither an Inq nor a Fudzilla exclusive. We've known it from other sources for a while now. They are just not as exuberant as the Inq is in how they say GT200 is a big chip...

Big chips mean big prices... at some point, ATI having the smaller, cooler process tech will bite Nvidia in the ass... is this the time?
 
ATI needs to learn how to write lean and mean drivers, not bloated 40+MB crapware that fubars .NET installs.
 
It makes sense, what ATI is doing: developing a great price-to-performance chip, then putting two of them together for the high end. This way, they can develop their tech and new processes faster; instead of taking a year to build a whole new "GPU of all GPUs", they spend six months shrinking their process and improving it for a new, leaner GPU. The HD3 series was released about six months after the 2900, and the HD4 series will be released about six months after the HD3 series. I wouldn't be surprised if we get an improved, 45nm HD5 series in December/January.

And since ATI is making an awesome price-performance chip that has low power, two of them should be able to take on Nvidia's high-end chip. And if it beats it, Nvidia can't do anything, because they're using a process that's too big and hot for a "two-GPU" card. So by the time Nvidia shrinks down their GT200 enough to put two on the same board (six or so months), ATI will have a new refresh of cards.

Anyway, that's the way I think (and hope) it will pan out.

Well, the rumored TDPs put two HD 4870s in Crossfire beyond the TDP of a single GTX 280, and the GTX (based on rumored specs) should be faster than the two HDs.

Actually, the rumored TDP of the GTX 260 is not that far off that of a single HD 4870, and the GTX 260 (again by rumored specs) should roughly match a 9800 GX2, while the HD 4870 seems to match only an HD 3870 X2. We'll just have to wait and see.

By price points alone, the HD 4870 will be competing with a 9800 GTX / 8800 Ultra while being a bit faster, and the HD 4850 will probably be a bit faster than an 8800 GT and slightly slower than or equal to an 8800 GTS 512.
 
Big chips mean big prices... at some point, ATI having the smaller, cooler process tech will bite Nvidia in the ass... is this the time?

How so? NVIDIA's 9600 GT @ 65 nm consumed less power than ATI's 55 nm HD 38xx under load and only slightly more at idle, while performing the same or better. It's not all about a smaller fab process; it's also about an efficient architecture.
 
How so? NVIDIA's 9600 GT @ 65 nm consumed less power than ATI's 55 nm HD 38xx under load and only slightly more at idle, while performing the same or better. It's not all about a smaller fab process; it's also about an efficient architecture.

Then why would Nvidia ever switch to a smaller process at this point, since 65nm is fine? We will see how fine 65nm is with the GT200.
 
Inq seems to be wrong more than they're right. They make up bogus news all the time.
 
Inq seems to be wrong more than they're right. They make up bogus news all the time.

BUT GUESS WHAT... tons of people still read... they are right at times... they do get the early line on things at times... you just gotta sort through the fluff and not take any of it as brass tacks.
 
BUT GUESS WHAT... tons of people still read... they are right at times... they do get the early line on things at times... you just gotta sort through the fluff and not take any of it as brass tacks.

replace fluff with poop and you have a winner.
 
Then why would Nvidia ever switch to a smaller process at this point, since 65nm is fine? We will see how fine 65nm is with the GT200.

I didn't say 65 nm is fine for GT200. The same thing happened with G80 @ 90nm. I'm just saying that using a smaller fab process is not the only factor that matters here. Architecture efficiency is extremely important too, and the 9600 GT vs HD 3870 is a good example of that (I used those two cards only because they perform more or less the same). We'll have the chance to see how this pans out when the cards are released.
 
ATI drivers PWN NVidia drivers IMHO. The only complaint I ever hear about the current G80 and G92 cards is that the drivers are balls... and that NV needs to hurry up and release a non-rehashed card besides the 9600GT. :p

Jbirney, I hear you on the context of that INQ article re: G80. When you look at some of the other rumors that were flying around regarding G80 (and also R600) at the time, you see that Charlie is actually pretty on-point in comparison.

It sounds like GT2X0 at 65nm will consume way more power and put out more heat than the previous record holder, good ol' R600 at 80nm.
 
How are ATI drivers any better than Nvidia's? You get ATI drivers once per month, and sometimes the official drivers leave out some of the improvements from the hotfix drivers (PowerPlay bug, anyone?). With Nvidia you get constant beta driver updates and only seldom an official driver update, with games sometimes breaking in between the betas. Nobody wins all the time.
 
I know I will probably be called a madman for saying this but I feel that the ATI Catalyst Control Center is much easier to work with than NVidia ForceWare Control Panel. My friend has 9600GT SLi and constantly has to disable/re-enable SLi, sometimes involving reboots. My X1900 Crossfire setup never disables Crossfire by itself or requires a reboot to engage it. I admit that when it comes to multi-GPU, NVidia has a one-up on ATI by allowing selection of the multi-GPU rendering mode, but overall I think CCC is a more polished piece of software. Now, when it was first launched is another story, but I think it's overtaken the NVidia software at this point.

Edit: Don't get me wrong, guys, there is huge room for control panel improvements from both parties here.
 
I know I will probably be called a madman for saying this but I feel that the ATI Catalyst Control Center is much easier to work with than NVidia ForceWare Control Panel.
Neither is great, but CCC probably is better. NV really destroyed the CP and really hasn't done much to fix it since. It still has critical bugs in some functionality, and I just wonder what they're thinking. Don't get me wrong, it is useful and has a couple of very nice features, but NV pointlessly changed it for the worse overall.
 
I know I will probably be called a madman for saying this but I feel that the ATI Catalyst Control Center is much easier to work with than NVidia ForceWare Control Panel. My friend has 9600GT SLi and constantly has to disable/re-enable SLi, sometimes involving reboots. My X1900 Crossfire setup never disables Crossfire by itself or requires a reboot to engage it. I admit that when it comes to multi-GPU, NVidia has a one-up on ATI by allowing selection of the multi-GPU rendering mode, but overall I think CCC is a more polished piece of software. Now, when it was first launched is another story, but I think it's overtaken the NVidia software at this point.

Edit: Don't get me wrong, guys, there is huge room for control panel improvements from both parties here.

The Catalyst A.I. is the part that changes the rendering method. The default is SFR.
 
I know I will probably be called a madman for saying this but I feel that the ATI Catalyst Control Center is much easier to work with than NVidia ForceWare Control Panel. My friend has 9600GT SLi and constantly has to disable/re-enable SLi, sometimes involving reboots. My X1900 Crossfire setup never disables Crossfire by itself or requires a reboot to engage it. I admit that when it comes to multi-GPU, NVidia has a one-up on ATI by allowing selection of the multi-GPU rendering mode, but overall I think CCC is a more polished piece of software. Now, when it was first launched is another story, but I think it's overtaken the NVidia software at this point.

Edit: Don't get me wrong, guys, there is huge room for control panel improvements from both parties here.

Naw, I agree with this too. I think the nVidia CORE drivers are better than ATI's, but I seriously think ATI's GUI pwns nVidia's.

Christ, I hate that stupid forceware control panel :mad:
 
The fact that the die is huge is neither an Inq nor a Fudzilla exclusive. We've known it from other sources for a while now. They are just not as exuberant as the Inq is in how they say GT200 is a big chip...


That was first verified by spy satellites.
 
The Catalyst A.I. is the part that changes the rendering method. The default is SFR.

Having used Crossfire for a while, I am indeed aware of this; I just dislike how it's obfuscated behind a bunch of marketing terms and BS by ATI, whereas NVidia tells you what the heck is going on technically.
 
Having used Crossfire for a while, I am indeed aware of this; I just dislike how it's obfuscated behind a bunch of marketing terms and BS by ATI, whereas NVidia tells you what the heck is going on technically.

I, too, have been using CF for a while, but only learned what Catalyst A.I. does recently.
 
I like ATI's marketing, where you have to wait a while for new drivers to get the functionality you think you're getting when you first purchase the card!
 
I like ATI's marketing, where you have to wait a while for new drivers to get the functionality you think you're getting when you first purchase the card!

Is that ATI's marketing or just ATI fanboi wet dreams? Oh wait... it's both.
 
I like how NV doesn't update drivers for months.

They don't officially update their drivers for months, but new betas are leaked and spread nearly every week (and I'm not joking). A couple of times you'd even see two new beta drivers within the same week. Yes, there are improvements in these driver "releases" more often than not.

"WHQL" doesn't mean squat FYI; it's just four letters.
 