Anyone else think Nvidia will...

up their bus width to 512-bit for their next-gen card, and be done with it?

With the obvious shader/CUDA core increase and the move to 28nm, the 680 can't be stopped even if they stick with 384-bit, imo. So who thinks they'll go for 512-bit?

Does Nvidia even need a strategy lol?
 
up their bus width to 512-bit for their next-gen card, and be done with it?

With the obvious shader/CUDA core increase and the move to 28nm, the 680 can't be stopped even if they stick with 384-bit, imo. So who thinks they'll go for 512-bit?

Does Nvidia even need a strategy lol?


no

no

no

and NO! They will not go to 512-bit, and they will not exceed 3GB of RAM. Odds are they won't even increase the shader count that much, given how power hungry the Fermi architecture got as they increased the shaders. I don't have much hope for them finding the magical power-usage fix with Kepler either. Quit living the pipe dream; Kepler is not going to turn out like everyone thinks it will. At best I don't expect more than a 20-30% increase in gaming performance over the GTX 580 when it comes to the GTX 780 (yes, it's called the 780 and not the 680; the 600 series is for the mobile market). As far as GPGPU goes, I expect to see huge gains, since that's all Nvidia really cares about at this point.
 
It's common knowledge that Kepler has a 768-bit memory bus and a built-in 1200-watt power supply so it can outperform 3x GTX 580s.
 
I've read rumors of XDR2 with just a 256-bit bus too.

And who cares? No one's using it.

What, you actually thought Rambus was telling the truth about XDR2, let alone that they could ever produce it? Anyone who actually thought they could is an idiot. Anyone remember RDRAM? The so-called future of memory, and what a joke it turned out to be.

Either way, you can compensate for a narrower bus by increasing the memory clocks, since bandwidth is just bus width times data rate. That's why AMD's 256-bit GDDR5 ran at a much higher clock speed than Nvidia's 384-bit GDDR5 yet gave almost the same performance.
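If you want to sanity-check that, the arithmetic is simple: peak bandwidth is bus width times effective data rate. A minimal Python sketch using the commonly quoted memory specs for the HD 6970 (256-bit, 5.5 GT/s effective GDDR5) and the GTX 580 (384-bit, ~4.0 GT/s effective); the figures are illustrative, not official:

```python
def bandwidth_gb_s(bus_width_bits: int, effective_rate_gt_s: float) -> float:
    """Peak theoretical memory bandwidth in GB/s."""
    # bits per transfer -> bytes per transfer, times transfers per second
    return bus_width_bits / 8 * effective_rate_gt_s

# HD 6970-class: narrow 256-bit bus, fast 5.5 GT/s GDDR5
print(bandwidth_gb_s(256, 5.5))  # ~176 GB/s

# GTX 580-class: wide 384-bit bus, slower ~4.0 GT/s GDDR5
print(bandwidth_gb_s(384, 4.0))  # ~192 GB/s
```

The narrow-but-fast configuration lands within about 10% of the wide-but-slow one, which is why the benchmark results were so close.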
 
no

no

no

and NO! They will not go to 512-bit, and they will not exceed 3GB of RAM. Odds are they won't even increase the shader count that much, given how power hungry the Fermi architecture got as they increased the shaders. I don't have much hope for them finding the magical power-usage fix with Kepler either. Quit living the pipe dream; Kepler is not going to turn out like everyone thinks it will. At best I don't expect more than a 20-30% increase in gaming performance over the GTX 580 when it comes to the GTX 780 (yes, it's called the 780 and not the 680; the 600 series is for the mobile market). As far as GPGPU goes, I expect to see huge gains, since that's all Nvidia really cares about at this point.


You really think the GTX 780 will not be faster than the 7970? I would be really surprised if that were the case.
 
It will be faster, but it will cost 300 dollars more and require a 2kW PSU.

You are exaggerating. When did Nvidia price their fastest single GPU $300 higher than AMD's? Besides, their best GPU is faster than AMD's anyway. Since the 7970 is priced at $550, the GTX 780 will cost around $50 more, because it will be faster (based on their track record).
 
They could release a 585 and match the 7970 pretty easily.

Uhh, when accounting for both cards actually being overclocked, no they can't.

20% better performance than an overclocked 580, while not being overclocked itself, isn't something you can match "easily".

EDIT TO ADD:
http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_580/27.html

Taking into account base performance plus overclocking potential with conservative estimates (15% and 15%), it's the same as the jump from the 5870 to the 580 performance-wise at 1080p/1600p (1.15 × 1.15 × the fps of the 5870 ≈ the fps of the 580), so again, not easy at all.
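To make the multiplicative point concrete, here is a toy Python sketch of how those two 15% factors stack (purely illustrative numbers):

```python
# Performance deltas compound multiplicatively, not additively.
base_fps = 100.0     # normalized 5870-class baseline
stock_gain = 1.15    # ~15% faster at stock
oc_headroom = 1.15   # ~15% more from overclocking

total_fps = base_fps * stock_gain * oc_headroom
print(total_fps)     # 132.25 -> a ~32% overall gap, not 30%
```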
 
512-bit? Small time. I heard they will be going to 1024-bit and pairing it with 16GB of VRAM; the video card will also be clocked at 2GHz, and AMD will go bankrupt because of it.
 
Pfff, all wrong. 2048-bit with 2 terabytes of GDDR8. Approx 5 trillion trAnsistorz. Using of course an 8nm process for power. It'll be powered by the PCIe slot only. Guaranteed 5000% performance gain. It's actually a joint venture with AMD, so both companies will profit. They'll also be adding a 4D ultra vision plug that you can stuff into the back of your head Matrix-style so you can travel through time in your miiiinnnddd brah.
 
I doubt it. GDDR5, from all accounts, is hard as hell to route.

Unless PCB size grew larger, or nVidia clocked the GDDR5 at low speeds.

I don't see the return of the 512-bit bus in GDDR5 form.
 
512-bit? Small time. I heard they will be going to 1024-bit and pairing it with 16GB of VRAM; the video card will also be clocked at 2GHz, and AMD will go bankrupt because of it.

You forgot the bit about them using XDR3 chips with the 1024-bit bus :D
 
With all the above features I'm in for one, if it's under 548 bucks, not one dollar more.
 
They could release a 585 and match the 7970 pretty easily.

Lol? Where did that smoke come from?

Anyway, based on my insider info, you won't see Kepler anytime soon...
And no, no XDR usage....
 
It's common knowledge that Kepler has a 768-bit memory bus and a built-in 1200-watt power supply so it can outperform 3x GTX 580s.

I call BS, you have your facts WRONG. It's going to have 2x 1200-watt PSUs. Get your facts straight before posting.
 
I call BS, you have your facts WRONG. It's going to have 2x 1200-watt PSUs. Get your facts straight before posting.

Speaking of checking your facts, you do realize they will be powered by nuclear fusion?
 
Speaking of checking your facts, you do realize they will be powered by nuclear fusion?

Yeah, they are also giving them away for free to people who promise to "enjoy" using them in their machines.
 
Should I wait for these cards or is it worth it to invest in a time machine so I can go get some 880's in SLI?
 
Should I wait for these cards or is it worth it to invest in a time machine so I can go get some 880's in SLI?

Stupid question, stop wasting our time with nonsensical questions. Of course you should invest in a time machine!
 
Nvidia's next card will beat the 7970, I'm sure; that has been the case for the past few generations. But it will cost more and use a lot more power. That is all. I've been fine with the ATI cards for the past three generations bang-for-the-buck-wise (although I don't agree with the new $550 price tag of the 7970).
 
I find it hilarious how people don't like the 7970's price tag.
The 6970 comes out at $370-400 and it's a great price against a 580 that was going for $600, and when you ask people why they bought a 580: "I wanted the fastest single GPU."
The 7970 comes out at $550 and it's a horrible price, even though a 580 is still $450+ for a 1.5GB version and $500+ for a 3GB version: "Only 20% faster than a one-year-old 580, and that's when it's OC'd? I'LL PASS."

Oh well, more 7970s for me.
 
Nvidia's next card will beat the 7970, I'm sure; that has been the case for the past few generations. But it will cost more and use a lot more power. That is all. I've been fine with the ATI cards for the past three generations bang-for-the-buck-wise (although I don't agree with the new $550 price tag of the 7970).

Nvidia doesn't have much headroom left with Fermi. Remember the issues with the 590. Meanwhile, the 7970 has some nice OC range that will allow the 7990 to be a monster and continue AMD's reign over the top end of the market.

As for the price, it's fair enough. That's what Nvidia charges for the 3GB GTX 580s, and the HD 7970 is faster while using less power. Also, the 7800 cards will be well worth their price, being 6900 cards with a die shrink.
 
I find it hilarious how people don't like the 7970's price tag.
The 6970 comes out at $370-400 and it's a great price against a 580 that was going for $600, and when you ask people why they bought a 580: "I wanted the fastest single GPU."
The 7970 comes out at $550 and it's a horrible price, even though a 580 is still $450+ for a 1.5GB version and $500+ for a 3GB version: "Only 20% faster than a one-year-old 580, and that's when it's OC'd? I'LL PASS."

Oh well, more 7970s for me.

It's not an Nvidia, is why. People will happily pay $200 for the Nvidia badge alone; paying $550 for a card without one is pretty poor value for money. People are used to being fleeced for the GeForce name, so they don't expect it from their competitors. Frankly, AMD have every right to charge what they do for the HD7970 because it's worth that much, but that's not going to be taken lying down by green advocates.
 
They could release a 585 and match the 7970 pretty easily.

You really think they could do that?
I'm curious: what would they do to the 585 to boost its performance by 20%?
I don't mean to object needlessly, but I'm of the belief that if Nvidia could have put out a card with 20% more speed within the same family of GPUs, they would have.

Faster clocks could maybe up the performance by 7% across the board.
A 512-bit bus would maybe add 5% more performance, but none of those things would be cost-effective or easy for them to just tack on.
My numbers are pure speculation, but I am very doubtful of Nvidia's ability to just push out a matching card whenever they get usurped. The reality is that they can't.
 
And don't forget, that is 20% faster than something that is already overclocked, and it must also still have at least 15% headroom of its own for extra overclocking (these factors are multiplicative, not additive, so it isn't just 35% better; it must be 1.2 × 1.15 = 1.38, or 38% better, within the same GPU family).
 
The use of the MDT card in place of a normal 580 in [H]'s test is a bit disappointing, really: firstly because it makes people think the differences between it and a 7970 are those between a normal 580 and the 7970, and secondly because you can't actually buy one anywhere, so they might as well be fictitious. At least in a couple of weeks the 7970s should show up in stores.
 
The use of the MDT card in place of a normal 580 in [H]'s test is a bit disappointing, really: firstly because it makes people think the differences between it and a 7970 are those between a normal 580 and the 7970, and secondly because you can't actually buy one anywhere, so they might as well be fictitious. At least in a couple of weeks the 7970s should show up in stores.

The review should have "sponsored by GALAXY" written all over it, given who provided that card.
 
I'm not going to pursue this line of conversation, as I'll be suspended for it, but suffice to say I was disappointed with HardOCP's choice of cards, and I uncharacteristically rate their review as one of the poorer tests of this new card.
 
snip

Frankly, AMD have every right to charge what they do for the HD7970 because it's worth that much, but that's not going to be taken lying down by green advocates.

When Kepler comes out and beats down the 7970 by 20-30%, will you still feel this way if Kepler costs $600 per card? Assuming, of course, Kepler beats down the 7970.
 
If it beats the HD7970 by a handy margin, then either it will cost more than $600, or the HD7970 will drop in price once it's released. Personally speaking, I don't think Kepler will manage the 20-30% gain, simply because of past occurrences. It's been a very long time since Nvidia have had that much of a lead, like for like. The last time it happened was the 8800GTX vs the HD2900XT, the latter of which was a known failed architecture, whereas the HD7970 is not.
After that, HD4870 vs GTX280 was about 12%, HD5870 vs GTX480 was about 5%, and HD6970 vs GTX580 was about 10% (at high res). 20-30% just hasn't happened in more than 5 years. Why should it this time? It's not as if the HD7970 has underperformed.
 
Only if they copy the HD2900XT with the 512-bit token-ring bus. :D
 