NVIDIA GPU Conference Press Conference Followup

FrgMstr

Now attending a press conference following up on the keynote speech by Jensen Huang, CEO of NVIDIA, at the NVIDIA GPU Conference. Attending are Jensen Huang; Drew Henry, General Manager, GPU; and Tony Tamasi, Senior VP of Content and Technology.

Jensen is opening by stressing the importance of heterogeneous computing and that the GPU is a big part of that. The GPU will be utilized far beyond just graphics, but graphics will still be a huge part of NVIDIA's GPU business. New tools have been developed that allow for heterogeneous programming and debugging.
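For anyone new to the term: heterogeneous computing here means the CPU and GPU cooperating inside one program, with serial control flow staying on the CPU and the data-parallel loops offloaded to the GPU. A minimal CUDA sketch of that split (my own illustration, not anything NVIDIA showed at the conference):

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Data-parallel part: runs on the GPU, one thread per array element.
__global__ void square(float *data, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] = data[i] * data[i];
}

int main() {
    const int n = 256;
    float host[n];

    // Serial part: setup, control flow, and I/O stay on the CPU.
    for (int i = 0; i < n; ++i) host[i] = (float)i;

    float *dev;
    cudaMalloc(&dev, n * sizeof(float));
    cudaMemcpy(dev, host, n * sizeof(float), cudaMemcpyHostToDevice);

    // Offload the parallel loop to the GPU: one block of 256 threads.
    square<<<1, n>>>(dev, n);

    cudaMemcpy(host, dev, n * sizeof(float), cudaMemcpyDeviceToHost);
    printf("host[3] = %.1f (expect 9.0)\n", host[3]);

    cudaFree(dev);
    return 0;
}
```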

Jensen added that Fermi silicon is "in house" and "we are bringing it up." Time to market is likely "a few short months." Jensen is stating that Fermi is much more than a video card for gaming and that, "DX is just a feature and it is not enough. We need something big and new!" "We need Fermi to bring excitement and sex appeal back." I am not sure what exactly that means, but it sounded good.

On the topic of 5800 shipping and market competition, Jensen stated, "Nobody likes it when the competition has a product. I don't like keeping our enthusiasts waiting on our next-generation processors. I would rather be shipping today, but we will ship when the product is ready to ship."

On the size of Fermi, Jensen stated, "It is only big right now, because it is the biggest chip ever built." "Power consumption will be the same as today's GPUs. It will fit into today's PCs and 1U servers."
 
Thanks for the coverage, Kyle. The card looks pretty tiny for such a beastly performer. Still dodgy statements on the actual shipping of the card, though.
 
Meh, the card is not gonna play Crysis at Very High, 19x12, no AA/AF, at 60fps (even the 5870 only gets 35fps), so keep dreaming of a single GPU with no scaling problems blazing through Crysis. I'd be impressed with 40fps.
 
Interesting choice, NV. I like the approach, but I think they might take themselves right out of the gaming market. If the card doesn't perform up to the competition, the market that will buy these GPUs for their computing power is far, far smaller. I expect it should perform quite well in games, but pricing isn't going to be close to AMD's, I would think, with all that extra stuff.

The holidays are going to be ugly for them, though.
 
ATI has the upper hand. All they need is user-customizable CrossFire profiles and there would be no looking back. Sucks that NVIDIA won't have anything for quite some time. Would love to see the competition brew up again.
 
This, and fix the 1080i-instead-of-1080p issue in some games.
 
NVIDIA has good technology and good products, but they keep building these huge monster GPUs, which are late, power-hungry, and expensive.

Fermi (according to my friend who works at NVIDIA, it's not called GT300 internally) may be a great GPU, but it doesn't mean a damn if they can't get it to market at the right price and in quantity. AMD/ATI have launched 3 new GPU series (RV670, R700 series, Cypress) in 2 years; in the same time NVIDIA has launched GT200 and a bunch of G92 rebadges.
 
The card they showed was a mock-up, apparently.


Honestly, I don't think there is much to go on. Sounds like they released information just to show what they're working on for now. Not only do we have to wait some months for the cards to come out; it also looks like we'll be waiting some time before we actually see something tangible running on actual Fermi hardware. (The current demos were apparently running on 200-series GPUs.)

Nice, now I can get a 5870 and enjoy it for some months until NV brings out their new toy.
 
That, and they keep having these grandiose dreams of capturing markets that aren't game-based. It's great to see companies innovate and evolve, but it's something else to keep seeing NV trying to persuade people that their GPUs are good for plenty of other things. If that were the case, Tesla wouldn't have such a pathetic market share right now.

Bah, but competition is good, and we need NV to give us fanboys to make fun of LOL
 
They want people to get excited about something they named Fermi? Sometimes I think Nvidia gives their marketing monkeys too much influence, but they could've used one there. :p
 
I am sure that NVIDIA will have the fastest single GPU when they launch, but at a cost, and most probably at a nonlinear price/performance ratio. Judging by the engineering decisions at both NVIDIA and ATI, I do believe ATI is becoming the best performer in the $400-and-under market.

I'll be getting an ATI 5800 in the next 3 months; I am still planning an upgrade and considering Eyefinity, which is obviously a solution NV doesn't have, and it affects the gaming experience big time. This is innovation; this is better than a DX11 logo at the moment.
 
That, and they keep having these grandiose dreams of capturing markets that aren't game-based. It's great to see companies innovate and evolve, but it's something else to keep seeing NV trying to persuade people that their GPUs are good for plenty of other things. If that were the case, Tesla wouldn't have such a pathetic market share right now.

Bah, but competition is good, and we need NV to give us fanboys to make fun of LOL

They're doing pretty well with Tegra; I don't see why Tesla can't take off...
 
I am sure that NVIDIA will have the fastest single GPU when they launch, but at a cost, and most probably at a nonlinear price/performance ratio. Judging by the engineering decisions at both NVIDIA and ATI, I do believe ATI is becoming the best performer in the $400-and-under market.

Intel has been the better-performing and more expensive CPU manufacturer for how long now? No one seems to have a problem with that, so why is this any different?

I'll be getting an ATI 5800 in the next 3 months; I am still planning an upgrade and considering Eyefinity, which is obviously a solution NV doesn't have, and it affects the gaming experience big time. This is innovation; this is better than a DX11 logo at the moment.

Ya, 'cause Eyefinity is something totally unique, inexpensive (not like you have to buy multiple monitors or anything), and not at all gimmicky in any way, amirite? Fanboys, my god, get serious for a second. Ya, and all that resolution goodness doesn't require 2 cards in xfire at all. Oh, and ya, those bezels sure look good as they cut up your image, heh. The only game that looked nice in Kyle's Eyefinity review was Shift; everything else looked very meh. In fact those bezels detracted from the immersion, IMO, but that's just me I guess. Everyone seems to think this shit is the greatest thing since sliced bread.
 
That, and they keep having these grandiose dreams of capturing markets that aren't game-based. It's great to see companies innovate and evolve, but it's something else to keep seeing NV trying to persuade people that their GPUs are good for plenty of other things. If that were the case, Tesla wouldn't have such a pathetic market share right now.


If you follow the HPC side of computing, you might notice more than a few centers installing a considerable amount of NVIDIA hardware for GPGPU. I'd say they have a decent market share. And really, who would their competition be in GPU-based HPC?

"The cluster will also sport 128 NVIDIA Quadroplex 2200 S4 units each with four Quadro
FX 5800 GPUs. To accommodate large data the cluster will have 13.5 TB of RAM on the
CPUs (2 TB on the GPUs)."

http://insidehpc.com/2009/09/29/tacc-and-nsf-add-7m-visualization-service-to-open-science-community/
 
Ya, 'cause Eyefinity is something totally unique, inexpensive (not like you have to buy multiple monitors or anything), and not at all gimmicky in any way, amirite? Fanboys, my god, get serious for a second. Ya, and all that resolution goodness doesn't require 2 cards in xfire at all. Oh, and ya, those bezels sure look good as they cut up your image, heh. The only game that looked nice in Kyle's Eyefinity review was Shift; everything else looked very meh. In fact those bezels detracted from the immersion, IMO, but that's just me I guess. Everyone seems to think this shit is the greatest thing since sliced bread.

If enough people get cards capable of Eyefinity, don't you think monitor manufacturers are going to respond with some high-end models that won't have a bezel issue? I am sure they are capable of building such monitors right now.
 
Samsung's been making monitors with very small bezels, intended for use in arrayed displays, for several years now.

Once ATI adds bezel management to CCC, the effect will be much less disconcerting.

Frankly, all the trash talk against Eyefinity just sounds like sour grapes ("I won't use it, so it's stupid! Wah!").

At the end of the day, for me, I see NV spending its R&D budget on trying to enter (and even create) other market segments, and I don't feel I need to spend my gaming dollars on their long-term business plan. Maybe I'd feel differently if I had stock in NVDA, but I don't.

Meanwhile, I see AMD/ATI trying to bring new technologies to the PC gaming market, however niche they might be, while keeping prices within reason, and I say to myself, "Hey, it looks like the red team is earnestly committed to PC gaming." I can't really say that for the green team, not this time around anyway.
 
I hate to break it to you guys, but nVidia is very focused on making GPU computing advances. They're going to make a lot of money if they can get GPUs into supercomputers. If you check out the Wednesday 9/30, 4:30 PM general session, "Breakthroughs in High Performance Computing," on nVidia's conference technology page, and listen to the Cray CEO talk about supercomputing, you'll begin to see why nVidia is so focused on this.


It turns out [ ;-) ] GPUs are vastly more efficient at many massively parallel tasks, and with their new architecture they are accurate enough (double-precision floating point) to be used in mission-critical simulations. Oh, and then there's the issue of power per computation. But I'll let you watch the video to learn more.
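To make the double-precision point concrete, here is a minimal CUDA sketch (my own hypothetical example, not from the session) of the kind of massively parallel double-precision kernel this class of hardware is built to speed up, a simple DAXPY:

```cuda
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

// One lightweight thread per element: y[i] = a * x[i] + y[i].
// All math is in double precision, the case Fermi's architecture targets.
__global__ void daxpy(int n, double a, const double *x, double *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;  // about a million elements
    std::vector<double> hx(n, 1.0), hy(n, 2.0);

    double *dx, *dy;
    cudaMalloc(&dx, n * sizeof(double));
    cudaMalloc(&dy, n * sizeof(double));
    cudaMemcpy(dx, hx.data(), n * sizeof(double), cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy.data(), n * sizeof(double), cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover every element.
    daxpy<<<(n + 255) / 256, 256>>>(n, 2.0, dx, dy);

    cudaMemcpy(hy.data(), dy, n * sizeof(double), cudaMemcpyDeviceToHost);
    printf("y[0] = %.1f (expect 4.0)\n", hy[0]);

    cudaFree(dx);
    cudaFree(dy);
    return 0;
}
```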

Seriously, this isn't great news for gamers, but it's pretty impressive tech, and certainly a smart move for nVidia. We'll see if their offering to gamers is as impressive, but I have a feeling they're willing to lose some of the gaming market.

Just sayin... :cool:
 
I am all for getting GPUs into other areas; they are powerful beasts at what they do. I just hope it doesn't compromise the gaming experience, price, and power envelope.
 
Really interesting stuff, somewhat like a counterattack by nvidia to keep people informed that they're actually working on something. Although it seems it will be quite some time until we see real performance numbers, and even longer until availability and products on shelves.
 
Kyle quoted Brent, but I think he might be referring to Batman's AA and the removal of PhysX support on mixed ATi+nVidia setups?
 
It is exciting for their company to get involved in other aspects of GPU computing besides focusing all their R&D on making Crysis run over 200fps :) All that technology only for games? C'mon, guys. Besides, the market for high-priced PC gaming hardware is dwindling with the popularity of consoles and very few DX11 games in the pipeline. Nvidia is changing/growing up/expanding. There is a potentially huge market for everything else outside of gaming. CUDA is going to pick up significantly.
 
That, and they keep having these grandiose dreams of capturing markets that aren't game-based. It's great to see companies innovate and evolve, but it's something else to keep seeing NV trying to persuade people that their GPUs are good for plenty of other things. If that were the case, Tesla wouldn't have such a pathetic market share right now.

Bah, but competition is good, and we need NV to give us fanboys to make fun of LOL

Market share? You're implying ATI's Stream cards have more share than Tesla? lol, no...

I think what you mean to say is that they represent very little revenue to the company... in which case you're also wrong, because the whole point is that for compute purposes, from G80 through the GT200 boards, NVIDIA GPUs have always been a GPU with some compute stuff tacked on/hacked in. Now feedback has been gathered, they've felt their way through the market's demands and requirements, and as a result compute is a full-featured, equal citizen of the chip alongside graphics.

So the point is, it's very likely (and obviously NVIDIA believes) that their revenue from this market hasn't been much because they haven't had the features they just unveiled! Obviously no sane person would design a chip completely around a market they thought could only ever be a maximum of 1% of their total revenue... it may be 1% now, but they did what they did because they believe it could be 10%... or 25%.

That's how smart business people think... anticipating these things, entering new markets and revolutionizing them... not spitting out commodity components that sell for less and less each year.
 
I hope nVidia at least announces something before the year ends. I'll be in the market for a graphics card, and I'd rather have choices from both companies before making a purchase. So far it looks like nVidia won't really say anything about their next-generation card.

Is it technical problems with their samples, or do they simply think DirectX 11 won't have an impact?
I don't know the answer. But I do think they are sort of trying to make it look like they have taken a different path with their GPUs, keeping people busy with that while they try to get their samples ready for final production. Gaming is still nVidia's primary market, and it would be foolish to make a 180-degree turn in one generation. I don't think it is wrong for nVidia to evolve their GPU design, but it'll take more time than they think.
 