ATI: Return of the king: Nvidia in trouble

POPEGOLD

http://www.theinquirer.net/gb/inquirer/news/2008/05/29/nvidia-gt200-sucessor-tapes

The Inq is touching on what I was suspecting:

While ATI is getting its LEAN and MEAN 4800 line ready, Nvidia is getting its HOT and EXPENSIVE G200 line ready.

If the early numbers hold true... who in their right mind would spend MORE for Nvidia... for at best par performance, more heat, and more power drain?

ATI had its hot, heavy 2900 lead weight... now it's Nvidia's turn to put out a boat anchor/space heater.
 
For the last 3 gens ATI has been the favorite to win, and each time nVidia has posted better numbers...
 
Well, in some instances the Inq can be informative, but I always take what they say with a grain of salt. To me they are not nearly as bad as Fud, but they can be way out there. This summer will be very interesting, and I would love for ATI to get back into the game. It would be interesting to see nVIDIA eat some crow for a change.
 
For the last 3 gens ATI has been the favorite to win, and each time nVidia has posted better numbers...

QFT. nVidia's not stupid... though ATI is due for a killer bang-for-the-buck card. It's been way too long since the 9800 Pro. :p
 
Yes, nVidia will rest on their laurels and "the king" will return :rolleyes:

Why is it that every time someone throws out words like those, it's laughable? :eek:
 
Hoping for, and getting, a good product from ATI is good for us all. I am very excited to see what they've got.
 
Remember, the Inq touted that they had found the specs for the 2800XTX2 and the 2800XT at 750 MHz. I guess that "secret" is still a secret.
 
"The GT200b will be out in late summer or early fall, instantly obsoleting the GT200. Anyone buying the 65nm version will end up with a lemon, a slow, hot and expensive lemon. Kind of like the 5800. It would suck for NV if word of this got out. Ooops, sorry."

OH yes! My dream has come true. NV Fails.
 
Although sensationalized, I think this will pan out as true in most respects. It is likely that:

- nV will continue their performance lead with the GTX 280/260
- nV will continue their tradition of large-die processors
- The GTX 280/260 will be hot and use lots of power... and will be followed several months later by a cooler, less power-hungry die-shrunk GPU and the mid-range line (these will be the real competition for the 770XT/Pro)
- The GTX 280/260 will be very expensive, so their price/performance ratio compared to the 770XT will be extremely bad. Prices will normalize a bit once R700 is released.

I for one am going the sane route and getting 2x RV770XTs when they're out (unless, of course, they turn out to be junk). I like being an early adopter and all, and have some money to spare, but I honestly can't see spending $600+ on a card that games don't take advantage of, and probably won't for some time.
 
I figured it was the Inq or Fudo from the title. Clicking the thread didn't disappoint.

Either place could open up an axe-grinding shop.
 
The problem with Charlie's rant is that it seems to be at odds with the reviewers who have actually seen the card. And even Charlie said that the GTX 280 was going to be the fastest single-GPU card, though supposedly the 4870X2 will stomp it.

If you read between the lines, he's not really saying anything we haven't heard from others: yes, it's hot; yes, a 55nm part is coming soon after. And even on performance, I don't see where the 280 won't be the fastest single card.

All he seems to have done is put a negative spin on nVidia having the fastest single-GPU card.
 
Looks like Charlie's stopped taking his meds again. In between the ranting and raving he's not saying anything new or original, just making more and more outlandish claims that NV will fail this time, etc., etc.
And of course fanboys from both sides come out of their closets to flame the crap out of each other, to the amusement of the general population.
Of course, this is the Inq we are talking about, and any claims they make really need a dump truck load of salt before being considered seriously.

Why not simply wait for the benchmarks and hardware reviews to be finished before commenting? Speculation is useless at this point.

As for Charlie, he sounds like a broken record. The only time I've seen anyone this passionate about bashing a company was when they were a paid stock basher on a company's stock board. I wonder how many NV shares he's shorted and is hoping to buy back cheap.
 
Quite right; time will tell. It's not fanboyism for most; most gamers simply want to know the truth. The thing about the Inq is that they have the uncanny ability to take the truth and turn it into a lie. As we all know, the card will be hot, and the GTX 280, being top of the line, will be expensive. But apparently it will be the fastest single GPU around, yet it will suck. That's what Charlie is saying.

The only thing that's new here, at least for me, is the supply. He says it's going to be short. Which may not be all that bad for nVidia in a way, if the margins are low, but short supply overall probably wouldn't be a good thing.

We shall see.
 
I am a big fan of Charlie's and have offered him a job every time I have spoken to him in person. That said, anyone who has actually used the cards might have some more important points to make about the hardware.
 
Wouldn't mind this being true; we sure need the competition between ATI and Nvidia.

Anyway, we will all know in three weeks or so. :)
 
I hope ATI one-ups Nvidia this time; we need some competition. If only ATI's Linux drivers would improve....
 
That is one hell of a big rant... lol

The GTX 2x0 will bury anything ATI throws at it, I'm pretty sure.
 
ATI drivers are as good as, or sometimes better than, nVidia's :)
and also ATI should get back on track in Q1 '09 with the R800.

Now it's the R800 in Q1 2009 that's supposed to help them? I doubt that's going to happen. Ever since the R600, people have been saying ATI will take down nVidia. Besides, nVidia could probably just put GDDR5 on the GT200, and the resulting jump in memory clocks would be enough to answer the R800 anyway.
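Back-of-the-envelope on that GDDR5 point (a rough sketch; the 512-bit bus and the transfer rates below are my assumptions for illustration, not confirmed specs):

# Peak bandwidth = (bus width in bytes) x (effective transfer rate).
# Assumed: 512-bit bus; GDDR3 at ~2200 MT/s effective vs. a
# hypothetical GDDR5 fit at ~3600 MT/s (GDDR5 is quad-pumped).
def bandwidth_gb_s(bus_bits, mega_transfers):
    return (bus_bits / 8) * mega_transfers / 1000

gddr3 = bandwidth_gb_s(512, 2200)  # ~140.8 GB/s
gddr5 = bandwidth_gb_s(512, 3600)  # ~230.4 GB/s
print(f"GDDR3 {gddr3:.1f} GB/s vs GDDR5 {gddr5:.1f} GB/s ({gddr5 / gddr3:.2f}x)")

If the chip is bandwidth-limited at all, that kind of headroom is the whole argument.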
 
Nothing but "X will now save the day!" flag-waving. There needs to be a mandatory test on BS sites before you can post on these forums.
 
ATI drivers are as good as, or sometimes better than, nVidia's :)
and also ATI should get back on track in Q1 '09 with the R800.

Not in my experience, either with my old PC at home or with the machines I work on at the computer shop where I work.
 
Well, it would be interesting for ATI/AMD to come out on top once in a while, as competition is a good thing for us. Better products, more often.
 
Well, it would be interesting for ATI/AMD to come out on top once in a while, as competition is a good thing for us. Better products, more often.

It's not even an issue of coming out on top; just being competitive at the same price point would be a nice change of pace.
 
Title should read: Return of the BS: Nvidia gets toilet paper.

Jesus Christ almighty already. These rumor sites are ridiculous, and people eat that shite up. :rolleyes:
 
Well, if the prices I have seen put the 260 at $449 and the 280 at $599...
and the 4870 at $299...

If the performance is close... Nvidia loses. With gas at 4 bucks a gallon, I just don't see people paying double for more heat, more power drain, and maybe slightly more performance.

The G200 line has FX 5800/2900 XT written all over it.
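To put rough numbers on that (the prices are the rumored ones above; the relative-performance figures are made-up placeholders, not benchmarks):

# Price/performance at the rumored prices; performance values are
# illustrative placeholders only, chosen to mean "close".
rumored = {
    "GTX 280": (599, 1.00),  # (price in USD, relative performance)
    "GTX 260": (449, 0.85),
    "HD 4870": (299, 0.90),
}
for card, (price, perf) in rumored.items():
    print(f"{card}: {perf / price * 1000:.2f} performance per $1000")

Even if the 280 ends up somewhat faster, the 4870 wins the ratio by a wide margin at those prices.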
 
There was tons more complaining about heat and power usage with the 8800 GTX. You'd have thought the world was ending. And everyone was talking about how ATi would beat it anyway. The 8800 GTX was doomed to fail and be replaced by something better from NV within six months.

Sounds familiar. Who knows, maybe this is the time the bombastic, rampant speculation will be accurate. Simple chance dictates that eventually the Inq will get something right.

Articles like that are just masturbatory without any actual hard knowledge/evidence of performance between the cards.
 
Can someone pull out those Inq articles where Charlie was swearing black and blue that he had seen the R600 in action and that it was going to dominate the 8800s? And also the article where his laptop, with all his notes and data, was conveniently stolen from his car? I need a good laugh :D
 
I personally think the Inq are a bunch of ATi fanboys. EVERY time new GPUs come out, they hype ATi whether they are better or not.
 
I don't think a 40% yield is considered all that bad, especially for such a large die. As the Inq pointed out, Nvidia will still make money at 40% yields, and even if the part is too hot, they have a die shrink on the way, so I don't really get the point of the article.
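Rough sketch of what 40% means per die (the die size, wafer size, and wafer cost here are my own assumptions for illustration, not reported figures):

import math

# Assumed: a ~576 mm^2 die (roughly 24 x 24 mm) on a 300 mm wafer
# at an assumed $5,000 per wafer; 40% of candidate dies are good.
WAFER_DIAMETER_MM = 300
WAFER_COST_USD = 5000
DIE_AREA_MM2 = 24 * 24
YIELD = 0.40

wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
# Crude candidate count; real counts are lower due to edge loss.
candidates = int(wafer_area / DIE_AREA_MM2)
good = candidates * YIELD
print(f"~{candidates} candidates, ~{good:.0f} good dies, "
      f"~${WAFER_COST_USD / good:.0f} per good die")

Call it on the order of $100 per good die before packaging, board, and memory; painful, but survivable at $449-$599 card prices.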
 
I don't think a 40% yield is considered all that bad, especially for such a large die. As the Inq pointed out, Nvidia will still make money at 40% yields, and even if the part is too hot, they have a die shrink on the way, so I don't really get the point of the article.
You're putting too much faith in a guy who bats 0.100.
 
Charlie's job, like that of most IT "journalists", isn't to be accurate; it's to generate web traffic, and that's done the same way as in conventional journalism: make shit up and then speculate as though you have real information. As Kyle implied earlier, he's not seen anything. Since he's not under an NDA, he doesn't have first-hand knowledge and has probably been about as close to a GTX 280 as I have.

I wish I had his job, though! Making shit up and then getting people to read it and discuss it! Amazing! :p

But for all we know he might be right on. Even a drunk monkey knows where the booze is.
 
Let's be honest. Getting absolutely no early information, even the not-entirely-accurate kind, would make for some boring web reading. :)
 
You mean they'd do something like re-hash the same product for years at a time?

As if anyone does differently, year in and year out? :p

No, they stick with what works and only go radical on products after a few years; it's been this way forever with anything computer-related. :rolleyes:
 
Ah, here we go, another piece of responsible journalism from our good friend Charlie:

YESTERDAY I SAID I was chasing more info on the Nvidia G80 chip. The more I dug, the more my head hurt; nothing concrete, so take this with a pretty big grain of salt. In any case, there is some interesting info here.

The first thing is I was a little off on the dates, the tech day now seems more likely in late October with a launch in early November. What's a few weeks between friends? If they have not gotten back the latest silicon though, it is going to be really tight to make that schedule, chips take time to fab.

What they will talk about is the odd part. First is the arrangement of the chip: physically, we are hearing that it is 2 x 2 cm, or about a 400mm² die. Ouch. One of the reasons it is so big is the whole dual-core rumor that has been floating around. G80 is not going to be a converged shader unit like the ATI R500/Xbox 360 or R600; it will do things the 'old' way.

Some people are saying that it will have 96 pipes, split as 48 DX9 pipes and 48 DX10. While this may sound like a huge number, we are told that if this is the case, it is unlikely you can use both at the same time. Think a 48-or-48 architecture. That said, I kind of doubt this rumor; it makes little sense.

In any case, NV is heavily downplaying the DX10 performance, instead shouting about DX9 to anyone who will listen. If the G80 is more or less two 7900s like we hear, it should do pretty well at DX9, but how it stacks up to R600 in DX9 is not known. R600 should annihilate it in DX10 benches though. We hear G80 has about 1/3 of its die dedicated to DX10 functionality.

One of the other interesting points that surfaced is much more plausible but still odd, and that is memory width. As far as I know, GDDR3 is only available in 32-bit-wide chips, same with the upcoming GDDR4. Early G80 boards are said to have 12 memory chips on them, a number that is not computer-friendly.

Doing a little math, 12 * 32 gives you a 384-bit-wide bus. On the odder side, if you take either of the common card memory capacities, 256 and 512MB, and divide them by 12, you end up with roughly 21 or 42MB chips, sizes not commonly found in commodity DRAMs. If 12 chips are destined for production, you will end up with a marketer's nightmare in DRAM capacities; 384/768 is not a consumer-friendly number.

While it could just be something supported by the GPU and not destined for production, you have to ask why they may need this. The thing that makes the most sense to me is that with all the added pipes, they needed more memory bandwidth. To get that, you can either go faster or go wider.

If you look at the cost of high-end GDDR parts, faster is ugly. If you look at the yields of high-end GDDR parts, faster is ugly. Drawing more traces on a PCB to go wider is far less ugly. If NVidia needed the bandwidth that you can only get from a 384-bit-wide bus, then the odd memory size may simply be a side effect of that.

In any case, the G80 is shaping up to be a patchy part. In some things, it may absolutely scream, but fall flat in others. The architecture is looking to be quite different from anything else out there, but again, that may not be a compliment. In either case, before you spend $1000 on two of these beasts, it may be prudent to wait for R600 numbers. µ

http://www.theinquirer.net/en/inquirer/news/2006/09/13/nvidia-g80-mystery-starts-thickening

Couldn't resist. Moral of the story: do not put too much stock in what this douchebag has to say. He clearly has a chip on his shoulder; maybe nvidia ran over his childhood pet and this is his way of getting revenge. :D
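To be fair, the bus-width arithmetic in that old piece does check out (a minimal sketch of the same math, using the 32-bit GDDR3 chip width he describes):

# 12 chips x 32 bits each = 384-bit bus; capacity = chips x density,
# which is why 384MB/768MB fall out instead of 256MB/512MB.
CHIP_WIDTH_BITS = 32
CHIPS = 12

bus_width = CHIPS * CHIP_WIDTH_BITS  # 384 bits
for density_mb in (32, 64):          # common GDDR3 chip densities
    print(f"{CHIPS} x {density_mb}MB chips -> "
          f"{CHIPS * density_mb}MB on a {bus_width}-bit bus")

And 768MB on a 384-bit bus is exactly what the 8800 GTX shipped with, so that particular bit held up, whatever you think of the rest.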
 