What is nVidia's short term plan?

Found out what their plans are:
http://vr-zone.com/articles/nvidia-directx-11-is-not-important/7674.html?doc=7674
“DirectX 11 by itself is not going be the defining reason to buy a new GPU. It will be one of the reasons. This is why Microsoft is in work with the industry to allow more freedom and more creativity in how you build content, which is always good, and the new features in DirectX 11 are going to allow people to do that. But that no longer is the only reason, we believe, consumers would want to invest in a GPU,” explains Mr. Hara.
http://www.tweaktown.com/articles/2...onnaire_a_response_to_amd_s_views/index2.html
We are confident that when the Fermi-based GeForce products ship that we will maintain our performance crown. Even today our GeForce GTX 285 significantly outperforms the HD 5870 in next gen games like Batman Arkham Asylum with physics. In fact, even our GeForce GTS 250 outperforms it. That's fundamentally the difference between us and AMD. We focus on graphics plus cool effects like physics and 3D stereo, whereas AMD just makes incremental changes to their graphics cards.
http://physxinfo.com/news/330/official-nvidia-position-on-hybrid-ati-nv-physx-configurations/
Nvidia: It is true that running an NVIDIA GPU for PhysX with an AMD GPU for graphics is not a supported configuration. This mode was inadvertently turned on in a beta version of our Win7 drivers and on some older XP drivers. AMD does not support PhysX for their customers, and we don’t QA this configuration. With no QA, it is risky to run this configuration so we removed this capability in a recent driver release.
 
We are confident that when the Fermi-based GeForce products ship that we will maintain our performance crown. Even today our GeForce GTX 285 significantly outperforms the HD 5870 in next gen games like Batman Arkham Asylum with physics. In fact, even our GeForce GTS 250 outperforms it. That's fundamentally the difference between us and AMD. We focus on graphics plus cool effects like physics and 3D stereo, whereas AMD just makes incremental changes to their graphics cards.

Oooh, nVidia's PR damage control has come out swinging, lol. So DX11 isn't important until nVidia has a DX11 card, and Eyefinity isn't a cool new feature? Riiiiiiight.

And I find it funny how he says nVidia will "maintain their performance crown." One game that the 285 performs better in is hardly what I'd call having the performance crown. Hell, if AMD released a 5000-series X2 today, nVidia wouldn't even have that, and even their 295 would be obsolete.

I don't think it will be a case of nVidia maintaining their performance crown; I think it will be a case of trying to pry it from the hands of the 5800 series.
 
Considering that a GTX 285 is nothing more than a year-and-a-half-old card on a die shrink with a speed bump, at a price about the same as the GTX 280 over a year ago, I think they could afford a decent price cut. No one knows their manufacturing costs; it's just speculation, mostly from Charlie. Does that make you feel better? :rolleyes:

The GT200 is a big-ass chip, and it has a 512-bit memory bus, which means a more expensive PCB with more layers. The 5800 has a smaller die size, a 256-bit bus, AND better performance.

So no, it's not JUST speculation. It's logical that the GTX285 is expensive to make and that it's difficult to scale down the cost. In order for it to be competitive with the 5850, Nvidia would have to drop the price by about $100. Let me ask you: do you honestly think they could cut prices that much and still make a profit?
 
If I were Nvidia, I would lower the prices on GTX 285s and GTX 275s and sell them practically at cost, with no margins, to keep mind share...

The GTX 260 seems like it's too expensive to produce, and with 55 nm now mature, disabling perfectly good GT200b chips to produce them doesn't make sense. Maybe they'll release a card with the 260's slower memory (compared to the GTX 275) but the full 240 shaders.

They have been doing that; think of the GTX 275 for not disabling shaders (and saving face). As for selling at cost, they haven't been doing much more than that, except with the GTX 285.

I would love to see a simple die shrink of the GTX 285 with GDDR5, though. I'm not an engineer, but I would think they could compete.
 
http://www.tweaktown.com/articles/29...ws/index2.html
We are confident that when the Fermi-based GeForce products ship that we will maintain our performance crown. Even today our GeForce GTX 285 significantly outperforms the HD 5870 in next gen games like Batman Arkham Asylum with physics. In fact, even our GeForce GTS 250 outperforms it. That's fundamentally the difference between us and AMD. We focus on graphics plus cool effects like physics and 3D stereo, whereas AMD just makes incremental changes to their graphics cards.

Sorry, but I have to call BS here. Any advantage they have is purely artificial; this is basically outright dishonesty, in fact.

http://forum.beyond3d.com/showthread.php?p=1332461

http://www.youtube.com/watch?v=AUOr4cFWY-s
 
Nice [backpedaling], I might add. You've been exposed.
Your apparent obsession with me is flattering, but I can't say I'm interested.

Considering that a GTX 285 is nothing more than a year-and-a-half-old card on a die shrink with a speed bump, at a price about the same as the GTX 280 over a year ago, I think they could afford a decent price cut.
The GTX 280 debuted at around $650 in June of '08. Today, you can snag a GTX 285 from BFG for around $300 or for as low as $250 at Best Buy. That is not what I would call "about the same".

That's a pretty damn good price at Best Buy, by the way.

No one knows their manufacturing costs; it's just speculation, mostly from Charlie.
No, we can engage in our own speculation, which is exactly what I did:
At this point, I think it's between not earning much revenue off the GTX 2xx series and losing money by doing price cuts.

Does that make you feel better? :rolleyes:
Yeah, it makes me feel better when people don't purposely misconstrue what I've said, as strange as that may seem.
 
We are confident that when the Fermi-based GeForce products ship that we will maintain our performance crown. Even today our GeForce GTX 285 significantly outperforms the HD 5870 in next gen games like Batman Arkham Asylum with physics. In fact, even our GeForce GTS 250 outperforms it. That's fundamentally the difference between us and AMD. We focus on graphics plus cool effects like physics and 3D stereo, whereas AMD just makes incremental changes to their graphics cards.

Man, can I buy drugs from these guys?
 
The GTX 280 debuted at around $650 in June of '08. Today, you can snag a GTX 285 from BFG for around $300 or for as low as $250 at Best Buy. That is not what I would call "about the same".

That's a pretty damn good price at Best Buy, by the way.

The video card you linked to on Amazon is priced at $310 and is really from TigerDirect. And as for Best Buy 285, that's because Best Buy Outlet sells refurb items. The same way that Dell Outlet sells refurb items. I've got nothing against refurbs, but comparing the price of a brand-new video card to the price of a used/refurb video card is hardly apples to apples. New 285s still mostly run $320+. Just check Newegg.

http://www.newegg.com/Product/Produ...5&bop=And&ActiveSearchResult=True&Order=PRICE

Let's see: $259 for a brand-new 5850 that outperforms the 285, or pay $50-100 MORE for a card that is SLOWER and lacks DX11 and Eyefinity. I wonder which is the better VALUE? Hmmmmmm...
 
Let's see: $259 for a brand-new 5850 that outperforms the 285, or pay $50-100 MORE for a card that is SLOWER and lacks DX11 and Eyefinity. I wonder which is the better VALUE? Hmmmmmm...

But GTX 285 has PHYSX!!!!1111!!!!!!!!!!!
 
The video card you linked to on Amazon is priced at $310
It was about $303 when I posted the link, for whatever that's worth.

And as for Best Buy 285, that's because Best Buy Outlet sells refurb items.
I see. Best Buy certainly doesn't make that clear, which I imagine isn't surprising.

Let's see: $259 for a brand-new 5850 that outperforms the 285, or pay $50-100 MORE for a card that is SLOWER and lacks DX11 and Eyefinity. I wonder which is the better VALUE? Hmmmmmm...
That's a separate debate. I wouldn't bother trying to claim that the GTX 285 is a better value, even at $250. Our discussion pertained only to the prices of the GTX 280 when it launched versus the current pricing of GTX 285s.
 
Your apparent obsession with me is flattering, but I can't say I'm interested.


The GTX 280 debuted at around $650 in June of '08. Today, you can snag a GTX 285 from BFG for around $300 or for as low as $250 at Best Buy. That is not what I would call "about the same".

I was referring to the price cuts after the 4870 hit the market. Of course Nvidia was price gouging after their new top card hit the market, which wasn't much more than $350. On top of the die shrink, the GTX 285 is also using cheaper VRMs and GDDR3. I would imagine that they can afford a price cut for the short term. I'm not saying that it would be a good long-term strategy, but even at $300 they aren't losing their shirts.

No, we can engage in our own speculation, which is exactly what I did:

I can't do the same?

Yeah, it makes me feel better when people don't purposely misconstrue what I've said, as strange as that may seem.

The feeling is mutual.

Let's see: $259 for a brand-new 5850 that outperforms the 285, or pay $50-100 MORE for a card that is SLOWER and lacks DX11 and Eyefinity. I wonder which is the better VALUE? Hmmmmmm...

With that logic I could argue that the Phenom II is faster than the i7. They do a pretty good job trading blows in games, and I would also venture to say that the GTX 285 has more features that I can use today (I sound like that PR guy). Eyefinity means nothing to me, much like that stereoscopic crap, and I don't own any DX11 games and probably won't for a while.
 
I've spent thousands of dollars on Nvidia cards since 3dfx died and have only used a 9800 non-Pro and a TV tuner from ATI... but I'm done with Nvidia.
The 5870 dominates; this card cannot be matched by Nvidia. I think it's dead silent, and anything quieter than the loudest fan in my case (the CPU fan, which is pretty quiet itself) with this kind of horsepower is amazing. Toss in triple-output support and DX11, and fan favorites pHySx/3d ViSiOnz are more laughable than they already were.
DX11 means the end of PhysX (as if half of us paid any attention to it anyway, and I've always had the latest Nvidia cards at launch), and 3D has been around forever as a gimmick pushed by so many, all failing miserably.
The way Nvidia acts, it's almost like they've gotten so egotistical they can't stand the thought that they could ever be trumped.
I don't remember their press statements being so obtuse in the past.
 
I was referring to the price cuts after the 4870 hit the market.
I see. That wasn't clear from your post.

I would imagine that they can afford a price cut for the short term.
I'd think they could likely afford a minor, or possibly even a moderate, price drop, but not a substantial enough reduction to compete well with AMD's current offering. NVIDIA's cards have some attributes in their favor, such as CUDA support, 3D Vision support, and F@H performance (which is abysmal on AMD's cards), but AMD also has its own set of advantages (Eyefinity and DX11).

I can't do the same?
I have no issue with you speculating, no. I took issue with your misconstruing my post.

The feeling is mutual.
What comment(s) of yours did I misconstrue?
 
My thoughts/opinion, which are not based on facts, would be that nVidia's short-term plan is to spin misinformation and release dates suggesting that the GT300 will be out much earlier than its actual release date. The more people they can convince not to buy the 5870 or other 58xx/57xx models in favor of waiting for Fermi, the more potential customers nVidia will have for their Fermi line.

Step A: Denounce DX11, Eyefinity, or any feature your competition has as not mattering, not being useful, and not going to drive video card sales, to confuse consumers and reassure shareholders. (Check, already done in that nVidia press announcement.)

Step B: Claim that whatever your current cards are good at, and presumably what Fermi will be good at, is important and will drive future video card sales. I.e., promote Folding@home, promote PhysX, promote your stereo 3D. (Check)

Step C: Release rumors that your video card will be out in six months, three months, or less to encourage people to wait for its release. Don't officially announce a release date, though, as failing to hit an official release date would hurt your company's credibility in the eyes of both gamers and shareholders. There are, however, many websites willing to accept, post, and spread any rumors given by nVidia with seemingly zealous blind faith. Some websites are even willing to expand on your rumors or extrapolate non-existent truths.

Step D: Rinse and repeat until the cycle no longer appears to be working.

Step E: Release die-shrunk GT200 models or GT200 models with DX10.1 support to buy time until the GT300 comes, and to increase sales in November or December before Christmas.
Step F: Start spreading rumors that there's no real difference between DX10.1 and DX11, so you might as well buy a GT200 with DX10.1 instead of a 58xx/57xx. Promote PhysX and stereo 3D at the same time.

Step G: Release a new series of rumors that, since the rehashed GT200s are being released, the GT300 must be JUSTTT around the corner and will no doubt be out in time for Christmas presents. Don't buy now; wait for Christmas or Boxing Day!

Step H: Start steps A-C again.

Step I: Release the GT300 after Christmas and hope you've successfully ruined the 58xx's potentially good Christmas sales with misinformation campaigns.
 
Well, many are running last-gen top-end red or green cards and don't feel a compelling need to upgrade yet. Most DX11 games arrive in Q1 2010 (see the slide of DX11 games).
The English version of STALKER: Call of Pripyat is only coming out in Q1 2010.

Many more are hoping an nVidia release will lower the current prices on AMD's offerings.
Many are waiting for AMD to add to Eyefinity's features.
Many will wait for thinner bezels.

It's not so cut and dry.
 
If they really wanted to steal ATI's thunder they would do a paper launch...but they haven't even done that yet.
 
I'd say it's pretty cut and dry.

This is as cut and dry as I've ever seen this market. The 5870 is a bigger success than the 9700 and the 8800, though many can't see it now. Frankly, I've never seen such an all-around excellent product with such tight execution (releasing the lower ranges very soon).
This is probably the ugliest period Nvidia has gone through, and will go through. Not only has further injury from Intel already occurred, but Intel will keep pushing to squeeze Nvidia out of business altogether once AMD and Intel are both someday selling x86 GPUs/CPUs. Intel isn't playing around with the small-timers at Nvidia; they recognize a thorough spanking is necessary.

Nvidia can be successful through it all, but they'll have to accept Intel's rule and essentially bend over for them as everyone in the computing space must. That isn't likely.

Intel NEEDS AMD, or Intel goes poof under antitrust litigation like Ma Bell. If need be, Intel would ensure funding for future AMD operations, and obviously ATI in that case.

Consider AMD to be on a "cannot fail" protection plan like GM. No one's going to let AMD go under, regardless of the propaganda/fear tactics suggesting there's a glimmer of a chance they'd be allowed to go down. Intel has too much riding on their existence. I'd rather purchase AMD products than Nvidia's, knowing that their future is secure.
 
No one is saying the HD5800 isn't a great release.
I listed reasons not everyone dropped what they were doing and purchased one.
 
No one is saying the HD5800 isn't a great release.
I listed reasons not everyone dropped what they were doing and purchased one.

Very true; it takes a long time to gain market share for this very reason. AMD/ATI is on a roll, though.
 
If they really wanted to steal ATI's thunder they would do a paper launch...but they haven't even done that yet.

You can't really get away with that anymore. You have to have at least token quantities available for purchase on launch day, or within a few days, and good availability within a month, or you risk having tech sites and forum posters far and wide slamming you. Even many of their own fans would flame them if they launched and had nothing on the shelves a month later, let alone the 3-6 months it seems like the 300 series will take before it hits the shelves in even token quantities.
 
Yes, very quaint, isn't it?
Believe it or not, most computer users walk into the nearest B&M when they need an upgrade, if they don't order a new OEM machine or grab what's on the shelf. We all know their salesmen aren't always up to date on tech releases.

Have you ever been in a B&M, listened to a customer, and tried not to laugh at the salesman's answer/sales pitch?

I'd believe that if Circuit City and CompUSA weren't both gone. Best Buy doesn't exactly stock that many components. Most people today wouldn't have the courage/skill to upgrade their computers, or they have laptops. This is the [H]ard Forum; we are not the majority. :cool:
 
Insurmountable is a good word: "not capable of being overcome." All products are always designed, from the producer's point of view, to make money. To the point: Sony is a company, and companies make money. Like everything else, being the market leader has effects on your bottom line. If, as you say, the PS3's R&D cost was insurmountable at $300, then selling said product at $300 would make the product non-profitable.

Your point would be valid if not for historical evidence that selling at a loss has actually produced long-term success. The original Xbox cost MS $850 at the time to put together, and they sold the machine for $300. That's a TITANIC loss; however, that investment built a user base that has served them well to this day. Another example would be the Rock Band/Guitar Hero drum sets. Reportedly they cost $200 to build and are sold for significantly less. Why? They don't care about losing money in the short run, because when a customer gets your product in their hands and it works well for them, they stick with you, and with all the money Activision Blizzard and EA are raking in with DLC, you should be able to see how taking it on the chin for a little while can eventually bring financial success.

When I started my computer business I was in the red for a long time, mainly because I was charging a quarter of what my competitors charged and even giving away a TON of services at no charge. But eventually I built a loyal base and now have a nice, constant flow of business. Trust me, I was bleeding money for a few years. But you get back what you put out, and believe it or not, mindshare is more important than any profit you may make, and in the long run it will make up for any losses you initially incur. I wish most of these companies that turned HUGE would just humble up and take it on the chin for the consumer. It's likely pigheadedness (Nvidia keeping their prices at these outrageous levels is a good example). Imagine how many people would say "Forget the 5770/5750" if the GTX 260 were priced at $109. Or how many people would think twice about a 5870 if the GTX 295 were $240. Or say "screw off" to the 5850 if you could pick up a GTX 285 for $170. Nvidia would not only screw over ATI, they would gain mindshare... enough to earn a pass even if Fermi is a complete failure (like the NV30). They would maintain their market leadership and maybe even force ATI to drop prices. It's business 101, and it amazes me how giant corporations forget this.

Maybe I'm the luckiest guy in the world, or Karma has been exceptionally kind to me, but as a successful business owner, I firmly believe in taking it on the chin for the little guy. For example, I'm thinking about doing FREE Windows 7 installs for the week of the release of Windows 7... of course the customer has to have their legitimately purchased copy of Windows 7 (I don't sell software), but don't you think that's going to cost me money? Of course it is! But in the long run those customers will come back to me for other things... it's just how business works.
 
Maybe I'm the luckiest guy in the world, or Karma has been exceptionally kind to me, but as a successful business owner, I firmly believe in taking it on the chin for the little guy.
Except that nVidia is (still) a dominant player, not a little guy like your business. If nVidia sells their products at a loss to drive out a (relatively) minor player, such action would qualify as an anti-competitive practice (more specifically, dumping), which is illegal.

If some day nVidia becomes a minor player in the field, then it would have much more flexibility in its tactics. But not until then.
 