Why is there so much hype around the latest Fermi vid?

Alright, let's get back on-topic please. ;)
 
Alright, let's get back on-topic please. ;)

OK, let's.

Why is there so much hype over a card that achieves performance similar to a card that has been out for months?

Why would anyone be interested in this failure of a card? Why can't you Nvidia fans admit defeat?

Why do people act as though the 5970 doesn't exist, and that it's not faster than Nvidia's latest offering, which hasn't even shipped?

If the answer to the previous question is “we are not defeated; even 5% is still faster than the 5870,” then don't you get that the old 5970 is still faster than Fermi? (I hope no one will remember that Fermi will also have a dual-GPU card.)

When Fermi does come, what's to say ATI cards don't drop in price, thus not only taking the top-performance VGA crown but also the best bang for the buck?

And if the dual-GPU Fermi is mentioned, don't you see that ATI cards (which already took the VGA crown) will still have better value?
 
Because multi-GPU technology (on both sides) has inherent issues, and many enthusiasts who know better stick with single GPUs. They (and I) want the fastest single GPU available to avoid said issues/shortcomings of SLI/Crossfire.

Not to mention that if you have the fastest single GPU, you can typically double it up and also have the fastest dual-GPU card... the single chip is the foundation for that. So even if you want the fastest multi-GPU board, you'll be able to get one, I'm sure.

Way to contradict yourself lol.
 
I'm confused; what are you referring to?

The extras: CUDA, PhysX, stereo 3D. I myself actually hesitated when I realized I'd have to give up working stereo 3D in my upgrade path if I went with the 5850.

EDIT: I had stereo 3D back when nVidia still supported legacy hardware through the VGA port, and I wanted to get into stereo gaming again. If nVidia had continued supporting legacy 3D hardware, I'd have probably waited for Fermi =P
 
The extras: CUDA, PhysX, stereo 3D. I myself actually hesitated when I realized I'd have to give up working stereo 3D in my upgrade path if I went with the 5850.

Sure, but it was stated in a manner that implied ATI doesn't have anything that Nvidia doesn't offer, and that was the part I was "confused" about.
 
I think it's true that the Nvidia card only comes close to a 5870. If it were faster, they would tell us. They're trying to "prepare us" for their slower card with all this talk of "oh, our card is technologically superior..."
 
OP, how long have you been around?
There is always hype about new video cards.
Also, for the record, regarding what you say about Nvidia people ignoring the 5970: don't forget that when ATI launched its new single card, it was still beaten by the GTX 295 in most games, so I don't see your point.
In any case, Nvidia offers things that ATI does not, so that is one VERY good reason in my book.

Niche things like PhysX and "3D" don't count; they're both garbage IMO. Nothing noteworthy.
 
Niche things like PhysX and "3D" don't count; they're both garbage IMO. Nothing noteworthy.

If niche technologies like PhysX and 3D don't count, then by default Eyefinity shouldn't count either. Yet in most of the threads dealing with AMD vs Nvidia, Eyefinity is almost always brought up as the holy paladin of AMD.

And don't get me wrong, I'll admit I'm actually more of an AMD/ATI guy these days, especially with the Evergreens (which reminds me that I need to update my sig; I'm running a 5770 now). And I don't care for the 3D gimmick, but PhysX I find neat in the few games where it's actually been implemented in interesting ways.

They may not be noteworthy enough for you or me to take into consideration, but in my opinion that doesn't mean they should immediately be disregarded. For some people, one or both of those technologies may be more than enough reason to stick with a company.
 
I think it's interesting because of what it didn't show, you know, like real games. Also, the two examples of 3D surround that I have seen don't show the game in real action, just "sitting still". Edit: Never mind that; 3D Vision Surround seems to be alive and kicking :)
 
I think it's interesting because of what it didn't show, you know, like real games. Also, the two examples of 3D surround that I have seen don't show the game in real action, just "sitting still".

That's what I was thinking as well. We've seen some impressive demos, but we want to see real world performance.

The graph they show after the Unigine demonstration leaves a lot to be desired, especially if the street price is as high as what is being speculated.
 
That's what I was thinking as well. We've seen some impressive demos, but we want to see real world performance.

The graph they show after the Unigine demonstration leaves a lot to be desired, especially if the street price is as high as what is being speculated.

Also what I was thinking. I mean, is a steady 50 fps in a tech demo built for your card a good or a bad number? Was it really that visually impressive? A 5970 made that Unigine tech demo beg for mercy.
 
Here I was thinking Nvidia was going to release their answer to ATI's HD 5xxx series. The GTX 480 is comparable to a 5870... so what about the 5970? Why is there so much hype over a card that achieves performance similar to a card that has been out for months?

I want to say I'm not flaming Nvidia or starting an ATI vs Nvidia thread, but I want to know why people act as though the 5970 doesn't exist and that it's not faster than Nvidia's latest offering, which hasn't even shipped.

Brings me to another point: when Fermi does come, what's to say ATI cards don't drop in price, thus not only taking the top-performance VGA crown but also the best bang for the buck?

I'll state this as a complete non-gamer (ATM, at least) who doesn't have a dog in the fight but is always interested in new architectures and the like. Okay--so CUDA is interesting to me, but I can probably get by with a 9600 GSO for my interests. I'd love to see more and better implementations on ATI cards (OpenCL) for more architecture-independent goodness. AKA--I don't have a bias one way or the other. I do get my kicks out of watching flame wars, though.

I'll ask you one question: if you are truly asking what the subject of your post says, why did you bring such bias into the body? Don't try to hide behind the statement that you weren't starting an NVidia vs ATI thread--that's pure and unadulterated bullshit that even a blind person can see. The logical dissonance is staggering. Hint: the answer to your original question on hype lies in your own, highly biased opinion. NVidia folk are just as onerous.

I'll give you the best answer to your question as originally stated, though. NVidia has been pretty quiet for a long while, and for some strange reason shut down their GT200 architecture instead of going for another die shrink to pick up some more clocks, lower power, and perhaps spin in some more low-level functionality (well, that and getting good at 40nm on a less ambitious platform than Fermi). It's been lopsided towards ATI for a while now, and there's a lot of excitement within the computer-game enthusiast crowd about something new. Fermi is a shakeup as far as its design goes, so there's a lot of speculation on the implementation and its efficacy in the milieu of DX11. It's also well behind schedule, for a gamut of reasons that may annoy enthusiasts but far more annoy the company itself.

Couple that with the cranial-rectal problem some/lots of people have and here's the spectacle you see. Hope that helps.
 
Niche things like PhysX and "3D" don't count; they're both garbage IMO. Nothing noteworthy.

And that is indeed your opinion, a very flawed one.
Haven't you noticed the world is finally going 3D in full force? ;)
Also, maybe you haven't noticed that pretty much all the gaming we do is in awesome 3D worlds that most users are still experiencing in a flat 2D way?
Does that make sense to you? So it's a gimmick to actually experience those worlds in 3D... the way they were created for us to enjoy? :rolleyes:
 
ATI will implement 3D when it becomes more than a novelty... we had the 3D craze before, and it faded once people realized that the technology sucked ass. IMHO it still sucks ass.
 
ATI will implement 3D when it becomes more than a novelty... we had the 3D craze before, and it faded once people realized that the technology sucked ass. IMHO it still sucks ass.

And the only one that's actively made progress on it is nVidia. If it weren't for them, stereo would still be limited to laboratories. nVidia took the first step toward making 3D commercially viable, and it actually works.

Ever tried playing WarCraft or C&C in 3D stereo? Awesome! It actually looks like there's a moving diorama on the other side of your desk!
 
Ever tried playing WarCraft or C&C in 3D stereo? Awesome! It actually looks like there's a moving diorama on the other side of your desk!

And that's just it: everything looks like a diorama...

After playing around with a few demos, the experience is really underwhelming; it's more like looking at an interactive pop-up book than anything else. And considering the cost of a new monitor plus the Nvidia kit, it's disgustingly expensive for what it offers.

That's my opinion anyway.
 
And that's just it: everything looks like a diorama...

After playing around with a few demos, the experience is really underwhelming; it's more like looking at an interactive pop-up book than anything else. And considering the cost of a new monitor plus the Nvidia kit, it's disgustingly expensive for what it offers.

That's my opinion anyway.

My 'kit' cost me a grand total of... $10.00

Seriously! I don't know why they got expensive; all it really is is a simple circuit and some primitive LCDs. The dongle just connects to the monitor's V-sync pin and switches the LCDs every time it reads a pulse. An engineering student could build one from spare parts in an afternoon!

That's what pissed me off when they stopped supporting legacy hardware: the price jumped from $10.00 to $200.00. WTF??? :mad: I still don't get why they did that! :mad:

If they had kept supporting legacy hardware, I'm pretty sure you wouldn't mind dropping what amounts to pocket change either ;) It certainly gives you a different perspective in FPS games, particularly when you're running through mist or foliage, or dodging spears ;)
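
For the curious, the logic really is that trivial. Here's a hypothetical Arduino-style sketch of what such a dongle does; the pin numbers and the simple on/off shutter drive are purely illustrative (this is not any vendor's actual design, and real shutter LCDs want an AC drive so they don't degrade):

// Hypothetical DIY shutter-glasses dongle, as described above:
// watch the VGA vertical-sync line and alternate which eye's LCD
// shutter is open on every pulse, so each eye sees every other frame.
const int VSYNC_PIN = 2;  // VGA pin 14 (vertical sync), level-shifted
const int LEFT_EYE  = 8;  // drives the left shutter
const int RIGHT_EYE = 9;  // drives the right shutter

volatile bool leftOpen = true;

void onVsync() {
  // Each v-sync pulse marks a new frame: swap which shutter is open.
  leftOpen = !leftOpen;
  digitalWrite(LEFT_EYE,  leftOpen ? HIGH : LOW);
  digitalWrite(RIGHT_EYE, leftOpen ? LOW  : HIGH);
}

void setup() {
  pinMode(VSYNC_PIN, INPUT);
  pinMode(LEFT_EYE,  OUTPUT);
  pinMode(RIGHT_EYE, OUTPUT);
  attachInterrupt(digitalPinToInterrupt(VSYNC_PIN), onVsync, FALLING);
}

void loop() {
  // Nothing to do; the interrupt does all the work.
}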
 
The Fermi hype is because nVidia advertises so much. Jeez, three of the last four games I played show an "nVidia - the way it's meant to be played" logo on startup... Crysis, Batman: AA, and BioShock 2, I believe.
 
Way to contradict yourself lol.

How broken do your reading comprehension skills need to be to come to that conclusion?

I said there are issues with multi-GPU technology. Hence I want the fastest single GPU. But IF you want the fastest solution, period, regardless of those issues, the fastest single GPU usually lays the foundation for creating the fastest multi-GPU part.

Someone asked why people care about Fermi performance when a possibly faster 5970 multi-GPU card is already out... my answer is that some people prefer a single GPU anyway, and also that this Fermi matters because if it's the fastest single GPU, they can probably make a multi-GPU board out of it that IS faster than a 5970.

Get it yet???
 
Well, no, we don't and can't get it.

There is nothing to buy. Picking programs, settings, etc. to make your card look its best has been done for years.
Until there are cards available to purchase and unbiased reviews (what a concept), this card does not exist.
We are talking about a BS test setup made to make it look great, and until these cards get to market, it's all BS.
 
While it's late to market, the GTX 480 will presumably be the fastest single GPU, and then at some point they'll release an alternative dual-GPU card which would land somewhere in line with an HD 5970. The additional features mirror Eyefinity, but with 3D Vision.

Yes, it's late; shit happens. Why all this hate over it? I suppose it can be attributed to the hype pissing people off.
 
Someone asked why people care about Fermi performance when a possibly faster 5970 multi-GPU card is already out... my answer is that some people prefer a single GPU anyway, and also that this Fermi matters because if it's the fastest single GPU, they can probably make a multi-GPU board out of it that IS faster than a 5970.

No. When you have a single GPU that is king of the hill versus other single GPUs, you can't just go and slap two of them on a card and suddenly have a performance leader. Take the GTX 295 versus the HD 4870 X2, for example:

The 4870 X2 came together quickly, because the maximum consumption of the 4870 was well under 150 W. This meant you could sandwich them together and not break the 300 W PCIe power limit.

The GTX 295 was another matter. The GTX 280 used far too much power to fit a dual solution into 300 W. It was possible to wedge a dual-GTX 260 setup into the power envelope, but that configuration would not have ensured that the GTX 295 beat the 4870 X2.

The solution was a compromise: they had to wait six months for the 55nm GT200 chips, and they decided to create a balanced solution that used less power while still providing significantly better performance than the 4870 X2. But the first revision was expensive to build, so sales did not take off until about six months later with Revision 2.0. Sure, it was the fastest GPU on the planet, but wide availability was so long in coming that it had a very short lifetime; the 5870 made it less desirable, and then the 5970 made it cry and run home to mommy.

The point is, thermal envelope is everything. And if Fermi has anywhere near the power consumption that sites have been parroting on about, then it will have the highest single-GPU power consumption ever, beating out such favorites as the HD 2900 XT and the GTX 280. And I have no doubt that, that being the case, a dual-GPU version is far off in the distance.
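
To put rough numbers on that argument, here's a quick back-of-the-envelope check (just a sketch: the 75 W slot / 75 W six-pin / 150 W eight-pin split is the PCIe spec, and the TDP figures are the commonly published ones, so treat them as approximate):

// Sanity check of the 300 W argument above.
// Board limit: 75 W (slot) + 75 W (6-pin) + 150 W (8-pin) = 300 W.
#include <iostream>

int main() {
    const int boardLimitW = 300;
    const int hd4870W     = 150;  // "well under 150 W" per the post above
    const int gtx280W     = 236;  // commonly published GTX 280 TDP

    auto check = [&](const char* name, int perGpuW) {
        int total = 2 * perGpuW;
        std::cout << name << ": " << total << " W -> "
                  << (total <= boardLimitW ? "fits" : "over budget") << "\n";
    };
    check("Dual HD 4870", hd4870W);  // 300 W -> fits (just barely)
    check("Dual GTX 280", gtx280W);  // 472 W -> over budget
    return 0;
}

Dual 4870s just squeak under the board limit; dual GTX 280s overshoot it by more than half, which is exactly why the 295 had to wait for the 55nm respin.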
 
The point is, thermal envelope is everything. And if Fermi has anywhere near the power consumption that sites have been parroting on about, then it will have the highest single-GPU power consumption ever, beating out such favorites as the HD 2900 XT and the GTX 280. And I have no doubt that, that being the case, a dual-GPU version is far off in the distance.

^^ This!

EDIT: Also, FWIW, I experienced micro-stuttering with my GTX 295 in Vista 64. Once I went to Win7 64, I no longer see it. I saw it in GTA IV before; now in Win7 I do not, and the performance is night and day in that particular game. I also play FPS games, etc.
 