Are nVidia complete idiots?
http://www.techreport.com/articles.x/18332/5
nVidia are coming to market six months late with a huge, hot chip that will be difficult to sell at a profit: at best it's only 20% faster than the 5870, while carrying a die roughly 50% larger and a wider memory system to cover costs on. And yet they have the utter gall to start artificially disabling features on a GPU that will cost north of £320!
I realise DP makes zero difference in games, but the kind of person who is willing to drop £350 on a GPU is more than just a gamer: they're a tech enthusiast, interested to the point of geekiness in new technology. That person wants to know they are buying the most awesome, capable and flexible tech available, because otherwise they wouldn't be buying it in the first place.
Even if that person doesn't fold, they are buying that processing magnificence on the idea that in future their multitude of GPGPU apps will crunch numbers at a truly awesome rate.
Tell them that, on a product which is already a marginal gaming proposition on a price/performance metric, nVidia is also going to cripple the very added functionality that made its price/performance so marginal in the first place, and they'll reply: "Oh, so I'm buying bog-standard consumer junk then? Well, sure, I'm interested: I'll pay £175 and stick one in my HTPC."
I also realise that nVidia want to prop up their high-margin Tesla sales, but frankly they should rely on ECC memory to make that premium worthwhile, because right now they need every advantage they can get!
So explain this idiocy to me please, because it makes zero sense.
I'm planning a new PC in late April precisely because Fermi will be here by then, along with any AMD refresh. I'm more interested in Fermi because it sounds like a more advanced, forward-looking architecture, and should in theory be more flexible, which would make up for its reduced efficiency as a pure gaming product. But now I hear they are going to cripple the product, and my first instinct is: "Well, /@ck you then!"
Given their precarious position in the computer market as a non-provider of full platforms, I am amazed at nVidia's ability to repeatedly shoot itself in the feet. With this Fermi launch, nVidia has both barrels pointed firmly at its own wedding tackle and an itchy trigger finger to boot.
They beggar belief sometimes...
[/rant]
http://www.techreport.com/articles.x/18332/5
                GT200          GF100          RV870
SP FMA rate     0.708 Tflops   1.49 Tflops    2.72 Tflops
DP FMA rate     88.5 Gflops    186 Gflops*    544 Gflops
I should pause to explain the asterisk next to the unexpectedly low estimate for the GF100's double-precision performance. By all rights, in this architecture, double-precision math should happen at half the speed of single-precision, clean and simple. However, Nvidia has made the decision to limit DP performance in the GeForce versions of the GF100 to 64 FMA ops per clock—one fourth of what the chip can do. This is presumably a product positioning decision intended to encourage serious compute customers to purchase a Tesla version of the GPU instead. Double-precision support doesn't appear to be of any use for real-time graphics, and I doubt many serious GPU-computing customers will want the peak DP rates without the ECC memory that the Tesla cards will provide. But a few poor hackers in Eastern Europe are going to be seriously bummed, and this does mean the Radeon HD 5870 will be substantially faster than any GeForce card at double-precision math, at least in terms of peak rates.
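As a sanity check on the asterisked figure: each FMA counts as two floating-point ops, so 64 DP FMA ops per clock at a shader clock of roughly 1.45 GHz (my assumption, back-solved from the table above, not an official spec) gives the quoted 186 Gflops, and the uncapped 256 ops/clock would give about half the SP rate. A minimal sketch:

```python
# Back-of-envelope check of the peak DP rates quoted above.
# The 1.45 GHz shader clock is an assumption inferred from the table,
# not an official GF100 specification.

def peak_fma_flops(fma_ops_per_clock, clock_hz):
    """Peak throughput: each fused multiply-add counts as two flops."""
    return fma_ops_per_clock * 2 * clock_hz

SHADER_CLOCK_HZ = 1.45e9  # assumed GF100 shader clock

# GeForce GF100 is capped at 64 DP FMA ops/clock,
# one quarter of the 256 the chip can actually do.
geforce_dp = peak_fma_flops(64, SHADER_CLOCK_HZ)
uncapped_dp = peak_fma_flops(256, SHADER_CLOCK_HZ)

print(f"Capped GeForce DP: {geforce_dp / 1e9:.0f} Gflops")   # ~186 Gflops
print(f"Uncapped DP:       {uncapped_dp / 1e9:.0f} Gflops")  # ~742 Gflops
```

The uncapped figure of ~742 Gflops is consistent with "half the speed of single-precision" against the 1.49 Tflops SP rate in the table, which is what makes the quarter-rate cap look like pure product segmentation.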