Anand on Engadget about NVIDIA

Very interesting read. I really doubt nvidia will be exiting the PC GPU market anytime soon. And if they do, I fear what would happen to video card prices.
 
Nice read; Anand always puts together nice editorials. That being said, I know a lot of ATI fanboys have come out of the woodwork here lately, but seriously, I hope you all understand what would happen to all of us if ATI were the only real GPU game in town.

Right?

Let's hope that doesn't happen; we need at least two major players in the market.
 
I'd say just look at the current prices on the 5800 series. Then imagine worse.
You don't have to imagine anything. Nvidia launched cards at $600+ MSRP while there was competition from 3Dfx, ATI and S3.:rolleyes:

Selective memory is a wonderful thing; it allows uninformed people to spew self-righteous drivel. You should be thanking AMD's change of design philosophy for the lower prices (smaller chips, cheaper to manufacture and develop).
Nvidia is still developing huge chips which cost a lot more and take longer to develop, and the cost is passed down to you.

Edit: For the sake of accuracy, competition can also be irrelevant WRT consumer prices. ATI and Nvidia were investigated by the FTC for price fixing a few years ago, and a civil lawsuit fizzled out due to lack of regulation enforcement in the Bush era. You know, that same lack of regulation that brought the housing bubble. The lawsuit ended in a relatively small $1.7 million settlement.
 
You don't have to imagine anything. Nvidia launched cards at $600+ MSRP while there was competition from 3Dfx, ATI and S3.:rolleyes:

Selective memory is a wonderful thing; it allows uninformed people to spew self-righteous drivel. Like yourself. Maybe with ATI, but not when 3dfx and S3 were around - it wasn't $600+, your memory does not serve you right. And I think the drivel has gone down ya chin into your lap.
 
The last time NVIDIA was this late to a major DirectX transition was seven years ago, and the company just quietly confirmed we won't see its next-generation GPU, Fermi, until Q1 2010. If AMD's manufacturing partner TSMC weren't having such a terrible time making 40nm chips I'd say that AMD would be gobbling up marketshare like a fat kid.

I thought Nvidia used TSMC and AMD/ATI used GlobalFoundries? Wasn't GlobalFoundries a part of AMD until just recently?
 
Nice read; Anand always puts together nice editorials. That being said, I know a lot of ATI fanboys have come out of the woodwork here lately, but seriously, I hope you all understand what would happen to all of us if ATI were the only real GPU game in town.

But we are nowhere near that. Nvidia has the dominant market share in discrete GPUs by a rather large margin. I like seeing ATI kick Nvidia while they are down (launching midrange cards like the 5750 and 5770 instead of waiting until months later like Nvidia does - how long did it take for the 8800GT to come out after the 8800GTX?) just because I want to see the market share slide towards a more 50/50 split (so that Nvidia will stop its bullshit of crippling their hardware if you have a competitor's card, calling standard AA implementations IP, and ignoring the results of expensive calls on competitors' cards).
 
I don't mean to be a jerk, but why the hell should I care about DirectX 11? I mean, when is a game going to actually support it? It just seems silly, everyone drinking the Kool-Aid over something my kids might be old enough to take advantage of when it comes out.
 
But we are nowhere near that. Nvidia has the dominant market share in discrete GPUs by a rather large margin. I like seeing ATI kick Nvidia while they are down (launching midrange cards like the 5750 and 5770 instead of waiting until months later like Nvidia does - how long did it take for the 8800GT to come out after the 8800GTX?) just because I want to see the market share slide towards a more 50/50 split (so that Nvidia will stop its bullshit of crippling their hardware if you have a competitor's card, calling standard AA implementations IP, and ignoring the results of expensive calls on competitors' cards).

You are comparing G80 -> G92 to a normal mainstream GPU release now?
Holy crap, when will you stop?! :eek:

The 8800GT was an anomaly; you would have to go back to the GeForce4 to find a similar performance delta, but that still doesn't cut it... no die shrink, no new PCB, different memory bus, etc.
Here, do read up:
http://www.anandtech.com/video/showdoc.aspx?i=3140&p=2
Stay off the Kool-Aid :rolleyes:
 
You are comparing G80 -> G92 to a normal mainstream GPU release now?
Holy crap, when will you stop?! :eek:

The 8800GT was an anomaly; you would have to go back to the GeForce4 to find a similar performance delta, but that still doesn't cut it... no die shrink, no new PCB, different memory bus, etc.
Here, do read up:
http://www.anandtech.com/video/showdoc.aspx?i=3140&p=2
Stay off the Kool-Aid :rolleyes:

When will you stop drinking Nvidia's Kool-Aid? The 8800GT wasn't an anomaly. The 8600GTS sucked balls, and there was a *HUGE* gap between the 8600GTS and the 8800GTS - a gap Nvidia left unfilled until the 8800GT, 8800GS, and 9600GT, a gap that took them 11 months to fill.

But let's look at this gen. Where is Nvidia's current midrange card? Oops, they don't have one. Unless you consider the GTS 250 (or should I say the 8800 GTS 512/9800GTX/9800GTX+), but that isn't the same generation as the 260 and 280.

Let's look at ATI for a second. For the 4xxx series they had the 4850, which could be found for $150 the same week it launched (and it launched quickly after the 4870). Later on they added the 4830 and 4770. Of course, this generation there are the 5750 and 5770, both launched very quickly after the high-end cards.

We can even go farther back into Nvidia's past, as they used to release cards like the 7600GT, 7800GT, 7900GT, and 7900GS - there were four 7800 series cards and five 7900 series cards, all released at different price points. The 6 series was similar. Nvidia used to have plenty to choose from in the price/performance category. When the 8 series showed up, they completely abandoned that category.

So when will you stop your crusade to prevent anyone from speaking bad about your beloved Nvidia?

I don't mean to be a jerk, but why the hell should I care about DirectX 11? I mean, when is a game going to actually support it? It just seems silly, everyone drinking the Kool-Aid over something my kids might be old enough to take advantage of when it comes out.

Games already support it... Dirt 2 is the big one, but there are others.
 
When will you stop drinking Nvidia's Kool-Aid? The 8800GT wasn't an anomaly. The 8600GTS sucked balls, and there was a *HUGE* gap between the 8600GTS and the 8800GTS - a gap Nvidia left unfilled until the 8800GT, 8800GS, and 9600GT, a gap that took them 11 months to fill.

Umm, I think he was talking about performance delta between generations. As in between the 7900 generation and the 8800 generation.
 
Umm, I think he was talking about performance delta between generations. As in between the 7900 generation and the 8800 generation.

If so, why would he attack me? My point was that it took Nvidia for freaking ever to release the mainstream cards (*if* they ever release them), whereas ATI releases them right out of the gate.
 
Also, remember fellas, both ATI and Nvidia released refreshes of their DX9 cards. There was the Radeon X1800 XT, then later the X1900 XT/XTX. There was also the GeForce 7800 GT/GTX, then later the 7900 GT/GTX. The X19xx and 79xx replaced the X18xx and 78xx cards, respectively.

The 8800s were the fluke. From a company standpoint, although infuriating for end users who are "in the know" like us, rebranding the 8800s was a smart business decision on Nvidia's part. The chips were so good, and so ahead of their time, that Nvidia didn't need to pour resources into redesigning them. The unfortunate catch is that now it's come back to haunt them. :)
 
I do agree with most of it - Tegra has the potential to make lots of money through huge volume (the Zune and the next Nintendo DS are a pretty big thumbs up for it). Tesla has the potential to make lots of money through huge profit margins - they can sell them at several times the cost to make and still be 10 times cheaper than the rack of multi-CPU boxes they are replacing.
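
To put rough numbers on that margin claim, here is a minimal sketch - every figure below is a hypothetical assumption for illustration, not actual NVIDIA pricing:

```python
# Illustrative sketch of "sell at several times cost, still ~10x cheaper
# than the CPU rack it replaces". All figures here are hypothetical.

tesla_build_cost = 300.0     # assumed manufacturing cost of one Tesla board
tesla_sale_price = 1_500.0   # assumed sale price, i.e. ~5x cost
cpu_rack_price = 15_000.0    # assumed price of the multi-CPU rack it would replace

margin_multiple = tesla_sale_price / tesla_build_cost
buyer_savings = cpu_rack_price / tesla_sale_price

print(f"~{margin_multiple:.0f}x cost to build, yet ~{buyer_savings:.0f}x cheaper for the buyer")
# -> ~5x margin for NVIDIA and ~10x savings for the customer, under these assumptions
```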

However, 'twas a bit too doom and gloom at the start. Nvidia isn't exactly out of the high-end GPU market - the 260 downwards are selling just fine. TSMC is screwing both ATI and Nvidia - Nvidia completely, since they can't release the monster Fermi at all (although they do have 40nm cards selling now). ATI isn't much better off - they can't produce enough 58xx chips to make a difference (and the cost is probably still too high, which will hit any profit), and the 57xx cards still cost too much (perhaps due to TSMC's bad yields), so most are still buying 48xx and GTX 2xx cards instead.

His conclusion that Nvidia might be leaving the PC GPU market also seems a bit unlikely. I mean, Tesla is based on the same cards, and the Tegra GPU leverages the same GPU tech that goes into other Nvidia cards (currently a 6 series core, but Tegra 2 will have an 8 series core). Hence the PC GPU market is still pretty key.

There are other things that might have a big impact - the Microsoft + x86 pact might be shaken by Google + ARM. x86 is far from efficient these days with its huge legacy of instructions, and the trend towards smaller, lower-power computers is forcing Intel into ARM's back yard, where Intel is the underdog and no one cares about x86 compatibility. Google's Android and Chrome OS are also being groomed to support these smaller machines, where the heavyweight Microsoft Windows is going to suffer.
 
If Nvidia is planning on leaving the PC gaming market, then why are they coming up with new hardware such as 3D glasses and PhysX?
 
I'd say just look at the current prices on the 5800 series. Then imagine worse.

10 years ago, top-of-the-line video cards launched at the exact same price point. Factoring in inflation, cards are actually cheaper today.
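
A quick sketch of that math, assuming roughly 2.5% average annual inflation and a $600 launch price - both are illustrative assumptions, not exact CPI data:

```python
# Rough sketch: compound a late-90s GPU launch price forward by an assumed
# average inflation rate. The 2.5%/year rate and the $600 price are
# illustrative assumptions, not exact CPI figures.

def adjust_for_inflation(price, years, annual_rate=0.025):
    """Compound a price forward by an assumed average annual inflation rate."""
    return price * (1 + annual_rate) ** years

launch_price_then = 600.0   # assumed launch price of a high-end card circa 1999
in_2009_dollars = adjust_for_inflation(launch_price_then, years=10)

print(f"${launch_price_then:.0f} then is roughly ${in_2009_dollars:.0f} in 2009 dollars")
# -> roughly $768, noticeably more than a Radeon HD 5870's ~$380 launch price
```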
 
Fanboy wars are such good comedy relief!

I just love it when people defend a piece of electronics and the corporation that made it like it was their favorite puppy.

Continue the comedy folks!
 
Nvidia is still developing huge chips which cost a lot more and take longer to develop, and the cost is passed down to you.
Yeah, and AMD just eats R&D costs. Just eats 'em right up. AMD is your friend! They'll walk your dog for you if you ask! They'll say nice things about your mother!

Damn, you're terrible at forums :)
 
Fanboy wars are such good comedy relief!

I just love it when people defend a piece of electronics and the corporation that made it like it was their favorite puppy.

Continue the comedy folks!

Indeed! This comedy is relevant to my interests.
 
I do agree with most of it - Tegra has the potential to make lots of money through huge volume (the Zune and the next Nintendo DS are a pretty big thumbs up for it). Tesla has the potential to make lots of money through huge profit margins - they can sell them at several times the cost to make and still be 10 times cheaper than the rack of multi-CPU boxes they are replacing.

However, 'twas a bit too doom and gloom at the start. Nvidia isn't exactly out of the high-end GPU market - the 260 downwards are selling just fine. TSMC is screwing both ATI and Nvidia - Nvidia completely, since they can't release the monster Fermi at all (although they do have 40nm cards selling now). ATI isn't much better off - they can't produce enough 58xx chips to make a difference (and the cost is probably still too high, which will hit any profit), and the 57xx cards still cost too much (perhaps due to TSMC's bad yields), so most are still buying 48xx and GTX 2xx cards instead.

His conclusion that Nvidia might be leaving the PC GPU market also seems a bit unlikely. I mean, Tesla is based on the same cards, and the Tegra GPU leverages the same GPU tech that goes into other Nvidia cards (currently a 6 series core, but Tegra 2 will have an 8 series core). Hence the PC GPU market is still pretty key.

There are other things that might have a big impact - the Microsoft + x86 pact might be shaken by Google + ARM. x86 is far from efficient these days with its huge legacy of instructions, and the trend towards smaller, lower-power computers is forcing Intel into ARM's back yard, where Intel is the underdog and no one cares about x86 compatibility. Google's Android and Chrome OS are also being groomed to support these smaller machines, where the heavyweight Microsoft Windows is going to suffer.

Quoted because I wanted to congratulate you on such a great post :) Very insightful.
 
Marvell told me that the market for ARM-based SoCs is expected to grow to around five billion chips per year -- if NVIDIA can capture a sizable portion of that market, we're easily talking a couple of billion dollars per year. Tegra could be just as big as NVIDIA's GPU business today.


Yep, that "if" sums it all up for me. I bet there are many people that can use an IF statement and attach a lofty goal :).
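
To spell out the arithmetic hiding behind that "if": only the five-billion-chips figure comes from the quoted Marvell estimate; the capture share and per-chip revenue below are made-up inputs.

```python
# Back-of-envelope version of the "couple of billion dollars" claim.
# Only the ~5B chips/year figure is from the quote; the rest is hypothetical.

arm_soc_chips_per_year = 5_000_000_000   # ~five billion ARM SoCs per year (from the quote)
nvidia_capture_share = 0.05              # assumed: NVIDIA wins 5% of that market
revenue_per_chip = 10.0                  # assumed average revenue per Tegra SoC, USD

tegra_revenue = arm_soc_chips_per_year * nvidia_capture_share * revenue_per_chip
print(f"Hypothetical Tegra revenue: ${tegra_revenue / 1e9:.1f}B per year")
# -> $2.5B/year under these assumptions; the whole argument hangs on that "if"
```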
 
Nice read; Anand always puts together nice editorials. That being said, I know a lot of ATI fanboys have come out of the woodwork here lately, but seriously, I hope you all understand what would happen to all of us if ATI were the only real GPU game in town.

Right?

Let's hope that doesn't happen; we need at least two major players in the market.

+1,000
 