Videocardz says: ASUS STRIX GTX 970 and ASUS GTX 980 listed

8GB might be overkill, but it'll at least prevent you from running into that VRAM wall for the next 2 years. Whether it's worth the money is entirely up to the individual, of course.

Fair enough.
 
Hmmm, a lil faster than a GTX 780, 4GB of memory with only a 256-bit bus, for $500.

Ugh, almost sounds as bad as the 285 release.

Other than having 1GB more memory, and maybe a lil less on the power usage, it's more of a meh card to me.

Looks like neither AMD nor Nvidia can bring out anything good until 20nm comes out.

AMD releases the 285, which is supposed to replace the 280X.

Nvidia releases the GTX 980, which is supposed to replace the GTX 780 (not the 780 Ti).

Looks like 20nm is the one to keep an eye on.
 
Haha, that would be funny if they went to a 1080 scheme after ;).

Who are we kidding? Every fiber of my being tells me that they will call it GeForce X. There isn't a single shitty sub-human marketing person on this earth who wouldn't call it that.
 
The X is for eXtreme.
 
I'll be happy if this card supports HDMI 2.0 and is close to or slightly faster than the GTX 780; then it's all good for me.
 
So you want to sidegrade your GPU? You are better off buying a 780 Ti for $550 than a GTX 980 for $500.
 
For what, though? Running on 4K monitors/TVs? Nvidia already has a workaround for that.

And one GTX 780 isn't enough to run games at 4K.
Exactly: it's a workaround. Running at 4:2:0 chroma subsampling isn't a solution. With so many people on this forum being sticklers for color quality on their displays, why is it so surprising to want full 4:4:4 at 60Hz?
 
Totally agree, but one GTX 780 isn't going to put out 60Hz without lowering a lot of IQ, which won't help color quality at all if games look like ass.

One GTX 980 isn't going to be that much faster. Even retailers are saying it's faster than a GTX 780 but slower than a 780 Ti. That really isn't much performance to spend another $500 on for HDMI 2.0.

But hey, different strokes for different folks.

If someone wants HDMI 2.0 that bad and wants to run with max IQ and top-notch color quality, they would need two 980s.

This is all speculation, of course, because we have no way of knowing if Nvidia will support HDMI 2.0.
 
Not even going to comment on 900-series performance until it's in the hands of [H]. I'm not going to take the word of leakers who just want to get their names out there.

Anyway, the poster you were responding to has a 55" LG 4K television. If I had spent $2,000 on a screen like that I'd want to have the proper hardware for it, too. What if he isn't just gaming? What if he wants to watch 4K video content through his PC? If he had trouble running games on that TV I don't think he would have said he wouldn't care if the performance was close to the 780.
 
HDMI 2.0 isn't just about gaming... in fact, that's probably the least important thing about it. It's mainly about the combination of resolution, refresh rate, and audio. The workarounds are kludgy and temporary.
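For anyone who wants the rough math behind that, here's a quick sketch (active pixel data only, blanking overhead ignored; the link-rate figures are the published HDMI 1.4/2.0 numbers):

```python
# Back-of-the-envelope check on why 4K60 at full 4:4:4 needs HDMI 2.0.
# Rough numbers: active pixels only, blanking overhead ignored.

def video_gbps(width, height, hz, bits_per_pixel):
    """Raw video data rate in Gbit/s."""
    return width * height * hz * bits_per_pixel / 1e9

# 8-bit RGB / YCbCr 4:4:4 is 24 bits per pixel; 4:2:0 averages 12.
full_444 = video_gbps(3840, 2160, 60, 24)   # ~11.9 Gbit/s
sub_420  = video_gbps(3840, 2160, 60, 12)   # ~6.0 Gbit/s

# Usable video bandwidth after 8b/10b encoding:
hdmi_1_4 = 10.2 * 8 / 10   # ~8.16 Gbit/s
hdmi_2_0 = 18.0 * 8 / 10   # ~14.4 Gbit/s

print(f"4K60 4:4:4 needs ~{full_444:.1f} Gbit/s -> over HDMI 1.4's ~{hdmi_1_4:.1f}, within HDMI 2.0's ~{hdmi_2_0:.1f}")
print(f"4K60 4:2:0 needs ~{sub_420:.1f} Gbit/s -> fits HDMI 1.4, which is exactly the workaround")
```

Which is why the 4:2:0 trick "works" on current cards but throws away chroma resolution to do it.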
 
Well, that ASUS GTX 980 is the one I want, so my money is ready to be taken...
 
Not even going to comment on 900-series performance until it's in the hands of [H]. I'm not going to take the word of leakers who just want to get their names out there.

Anyway, the poster you were responding to has a 55" LG 4K television. If I had spent $2,000 on a screen like that I'd want to have the proper hardware for it, too. What if he isn't just gaming? What if he wants to watch 4K video content through his PC? If he had trouble running games on that TV I don't think he would have said he wouldn't care if the performance was close to the 780.

Then why buy a 4K TV for your PC knowing there are no HDMI 2.0 video cards out?

If he wanted to stream proper 4K content, then he would have done the research to find out you can't do it properly with a PC yet.
 
So you want to sidegrade your GPU? You are better off buying a 780 Ti for $550 than a GTX 980 for $500.
For what, though? Running on 4K monitors/TVs? Nvidia already has a workaround for that.

And one GTX 780 isn't enough to run games at 4K.
I would like to run both of my bandwidth hog monitors (U3014 and PG278Q) over DP, and all signs point to 2x DP on the new generation of cards. The only way to get that now is a 295X2 or some FirePro/Quadro cards. Also, I'd really like to be able to run the Oculus Rift CV1, and it will probably require HDMI 2.0 or DP, since DK2 is already pushing current HDMI limits.

That's not even considering future displays in the 4K to 5K+ range that will need even more bandwidth. http://www.anandtech.com/show/8496/dell-previews-27inch-5k-ultrasharp-monitor-5120x2880 The linked model coming out this year from Dell requires 2x DP1.2 ports. By itself. :eek:
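Rough math on that Dell panel, if anyone's curious (a sketch; active pixel data only, and DP 1.2's usable rate is the standard HBR2 figure):

```python
# Back-of-the-envelope: why one DP 1.2 link can't feed a 5120x2880 @ 60Hz panel.
# Active pixel data only; blanking overhead would only make it worse.

def video_gbps(width, height, hz, bpp=24):
    return width * height * hz * bpp / 1e9

need = video_gbps(5120, 2880, 60)   # ~21.2 Gbit/s of pixel data
dp12 = 4 * 5.4 * 8 / 10             # HBR2: 4 lanes x 5.4 Gbit/s, 8b/10b -> ~17.3 Gbit/s usable

print(f"5K60 needs ~{need:.1f} Gbit/s; one DP 1.2 link gives ~{dp12:.1f} Gbit/s")
```

So presumably the panel has to be driven as two tiles, one per link, which is why it eats both ports by itself.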
 
Will have fewer shader cores than the 780 Ti, it seems...

This has been discussed already... one cluster of 128 Maxwell cores performs over 90% as well as one cluster of 192 Kepler cores. Additionally, they do so at twice the power efficiency. The core count was always expected to be lower as a result.
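If you want to see how that rule of thumb plays out, here's a quick sketch. The 0.9 per-cluster factor is just the claim above, the 2560 figure is only the number from that Geizhals listing (not a confirmed spec), and clock speeds are ignored entirely:

```python
# Sanity check of the "fewer cores isn't a regression" argument.
# Assumption: one 128-core Maxwell SMM delivers ~90% of a 192-core Kepler SMX.

MAXWELL_PER_SMM = 128
KEPLER_PER_SMX  = 192
SMM_VS_SMX      = 0.90   # per-cluster throughput ratio claimed above

def relative_throughput(maxwell_cores, kepler_cores):
    smm_clusters = maxwell_cores / MAXWELL_PER_SMM
    smx_clusters = kepler_cores / KEPLER_PER_SMX
    return (smm_clusters * SMM_VS_SMX) / smx_clusters

# e.g. the rumored 2560-core listing vs the 780 Ti's 2880 cores
print(f"~{relative_throughput(2560, 2880):.2f}x the shader throughput at equal clocks")  # ~1.20x
```

Purely illustrative, but it shows why a lower core count on paper doesn't automatically mean a slower card.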
 
Then why buy a 4K TV for your PC knowing there are no HDMI 2.0 video cards out?

If he wanted to stream proper 4K content, then he would have done the research to find out you can't do it properly with a PC yet.

Huh? You think people buy 4K TVs for their living room just for their PC? No, they buy a nice 4K TV with HDMI 2.0 because they want a 4K TV. And then when a video card comes out with HDMI 2.0, they buy that too so they can connect their PC to the TV. If everyone waited until everything in their house supported everything they bought, no new features would ever come out.
 
Who are we kidding? Every fiber of my being tells me that they will call it GeForce X. There isn't a single shitty sub-human marketing person on this earth who wouldn't call it that.

XFX is just sad they're no longer in the NVIDIA game, so they won't be able to release the XFX GeForce X Xtreme XXX EdiXion.
 
Gigabyte reference 980 and G1 Gaming 970 now listed at Synnex..

http://videocardz.com/52140/gigabyte-gtx-980-gtx-970-g1-gaming-also-listed-256-bit-bus


Also there's a weird listing at Geizhals for the 980..

https://geizhals.at/eu/diverse-geforce-gtx-980-a1156553.html

Chip: GM104 "Maxwell" • Memory: 4GB GDDR5, 256-bit • Shader units/TMUs/ROPs: 2560/160/64 • Process: 28nm • Shader Model: 5.0 • Interface: PCIe 3.0 x16 • Form factor: dual-slot • External power: 2x 6-pin PCIe
 
They probably meant GM204. The shader count seems to fall in line with another rumor.
 
It's not 20nm.

 
They probably meant GM204. The shader count seems to fall in line with another rumor.

It's not 20nm...

Could just be a bad listing (or whatever it actually is).. if they wrote GM104 for a 980, then it's possible the 28nm info might be wrong as well?

..or conversely, the 970 and 980 could actually be GM1xx at 28nm, and all this GM2xx at 20nm talk is for the 3rd-gen Maxwell stuff early next year? :eek:
 
This has been discussed already... one cluster of 128 Maxwell cores performs over 90% as well as one cluster of 192 Kepler cores. Additionally, they do so at twice the power efficiency. The core count was always expected to be lower as a result.
Yeah, it's weird that people on a hardware forum would be so hung up on the numbers. Pentium 4, anyone? What matters is performance and efficiency. NVIDIA's design philosophy since Kepler has been about more performance per watt.
 
Which is an excellent and welcome reversal from their old philosophy of "Just let the EDA tool take care of everything, we'll slap a 300W heatsink on it and give it mad extra power connectors."
 
Yeah, it's weird that people on a hardware forum would be so hung up on the numbers. Pentium 4, anyone? What matters is performance and efficiency. NVIDIA's design philosophy since Kepler has been about more performance per watt.


I think it's a mixture of things. You know people always bitch and whine about things they think they want, but really don't. For as many years as I can remember we had insane improvements every 6-9 months, and then the 12-month cycles, which still brought big changes. Now we're slowly moving from 12-month cycles to 15 and possibly 24. Depending on how you look at it, we're already at 24-month cycles with refreshes.

For so long people wanted lower power, less heat, and decent performance. We knew for years that performance would suffer in that scenario, but now that we're seeing it first-hand it's hard for people to accept 2-year cycles before you see the card as originally designed. We want it all, and that just isn't going to happen. The days of 50-100% improvements are over, and that really does suck. A few technologies in the wings offer hope of seeing that for a generation or two, but 4K is going to be a slow process. Just think: it took about 8 years of 1080p being mainstream before we got to this point where we can yawn at just about every 1080p game out there.
 
256bit bus kids...
Bus width is not a tell-all. People keep forgetting that this isn't the first time we've seen a reduction in memory bus width and a simultaneous increase in performance.

- Increased memory clock speed can compensate for a smaller bus.
- Increased L2 cache makes the entire card less dependent upon a large memory bus.
- Data compression can pull the same amount of data across a smaller bus at the expense of some GPU cycles.

Go back and look at what Nvidia did when they launched the GeForce 9 series. They reduced the bus width on high-end cards from 384-bit to 256-bit, but increased the clock speed of the RAM and introduced a modest data compression scheme.

Result? A 9800 GTX (256-bit bus) is cheaper to produce, lower-power, cooler-running, and pretty much just as fast as an 8800 GTX (384-bit bus).
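To put some numbers on the clock-speed point (a sketch only; the 7 GHz GDDR5 figure is just what the rumors say, and the 25% compression gain is purely an illustrative guess, not a spec):

```python
# Peak memory bandwidth is just bus width x effective memory data rate.

def peak_bandwidth_gbs(bus_bits, effective_clock_gbps):
    return bus_bits / 8 * effective_clock_gbps   # GB/s

gtx_780 = peak_bandwidth_gbs(384, 6.0)   # ~288 GB/s (384-bit, 6 GHz effective)
gtx_980 = peak_bandwidth_gbs(256, 7.0)   # ~224 GB/s raw, if the rumored 7 GHz clock is right

# If compression recovered, say, 25% effective bandwidth (illustrative number only):
gtx_980_eff = gtx_980 * 1.25             # ~280 GB/s "effective"

print(gtx_780, gtx_980, gtx_980_eff)
```

So a narrower bus with faster RAM, plus compression and a bigger L2, can land in the same ballpark as the old wide bus.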

It's not 20nm.
Neither is the GTX 750 Ti, but it's still a VERY desirable card.

Nvidia managed to seriously reduce power consumption without reducing performance and without a die shrink. I say hats off to them for some seriously amazing R&D.
 
Huh...worth upgrading from a 680? Guess we'll see.

So far it seems like I'd be better off getting another 680, but I'm a little worried about the 2 GB VRAM considering how these next-gen console ports have been.
 
Dead Rising 3 just proved that it isn't always going to be about VRAM. Or GPU power, for that matter :rolleyes:.
 
Dead Rising 3 is really not that poorly optimized. The game looks very good and I average 45-60 fps at 1080p max settings, no AA. At 1440p, however, my framerate is frequently in the 30s. No VRAM issues with 2GB. I think that's alright. At least it's not like AC4, where the game is ugly and runs like complete shit unless you turn most of the settings down.

I was going to go for a CPU upgrade to raise my minimum fps in BF3/4, but now I don't know what to do anymore :(
 
Are you really implying DR3 looks better than AC4?
 
Nvidia managed to seriously reduce power consumption

Personally I don't give a damn about power consumption, since I don't game 24/7.
As long as it doesn't kill my kilowatt PSU, that is.
So I would prefer higher power consumption if that meant higher performance.

Based on those specs I seriously doubt the 980 will be substantially faster than the 780 Ti. Might be even, a few % more, but only 50 bucks cheaper, at least until prices drop?
 
Personally I don't give a damn about power consumption, since I don't game 24/7.
The assumption was similar performance between the two parts.

With that being the case, it would be pretty daft to pick the card with higher power consumption. You'd be burning through extra power and throwing off more heat and generating more noise... for no benefit.
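To put rough numbers on the "extra power for no benefit" point (all three inputs here are made-up illustrative assumptions, not measurements):

```python
# Toy numbers: the 80 W gap, 20 h/week of gaming and $0.12/kWh are assumptions.

extra_watts  = 80          # e.g. a ~250 W card vs a ~170 W card at load
hours_per_wk = 20
usd_per_kwh  = 0.12

kwh_per_year  = extra_watts * hours_per_wk * 52 / 1000   # ~83 kWh
cost_per_year = kwh_per_year * usd_per_kwh                # ~$10

print(f"~{kwh_per_year:.0f} kWh/yr, ~${cost_per_year:.0f}/yr extra on the power bill")
```

Not huge money, but every one of those watts ends up as heat the cooler has to dump into your room, which is where the noise comes from.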

Exactly: it's a workaround. Running at 4:2:0 chroma subsampling isn't a solution. With so many people on this forum being sticklers for color quality on their displays, why is it so surprising to want full 4:4:4 at 60Hz?
Then use DisplayPort instead of HDMI :p
 
People confuse artwork and engines a lot. Don't be one of them.

Yep, it's like someone confusing a power tool with a piece of furniture... I don't get how they can't tell the difference.

Regarding the whole "980 GIBBO POST!!!!!111" stuff, Gibbo has already clarified that his performance/pricing guesses were complete speculation on his part and not based on any internal knowledge or leaks:

From the ocUK thread http://forums.overclockers.co.uk/showpost.php?p=26850427&postcount=2692 :
vsg28 said:
Soooooo: http://www.redgamingtech.com/nvidias...-coming-later/
"Nvidia’s GTX 980 Maxwell Slower Than GTX 780 Ti & 8GB Model Coming Later" and Gibbo's the source. Ah the wonderful world of tech "journalism".

His reply?
Gibbo said:
I posted speculating and then all these journalist take it as factual, lol.

 
Are you really implying DR3 looks better than AC4?

Absolutely. AC4's environment looks worse than Crysis 1. It's flat, dull, and low quality. DR3 has been nothing but high-resolution textures and good-looking lighting so far. Funny, considering that by the nature of DR3's setting it should be more unpleasant to look at than AC4, but it's not.
 
I'll just leave this here..

Geforce GTX 980 Alleged Benchmark and TDP Surfaces – Just 170W for ~110% 780 Ti Performance


Dunno. I'm thinking this info could just be one of the guys over at Chiphell making calculations based off of the leaked 870/970 benchmarks. Someone over there also said they were looking forward to an upcoming "MINI" card, so I'm happy with that small glimmer of hope :)
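For what it's worth, if those rumored figures held up, the perf-per-watt math would look something like this (the only non-rumored number here is the 780 Ti's 250W reference TDP):

```python
# Quick ratio using the rumored figures from that link; treat with the usual salt.

perf_vs_780ti = 1.10   # "~110% 780 Ti performance" (rumor)
tdp_980_w     = 170    # rumored TDP (rumor)
tdp_780ti_w   = 250    # GTX 780 Ti reference TDP

ratio = (perf_vs_780ti / tdp_980_w) / (1.0 / tdp_780ti_w)
print(f"~{ratio:.2f}x the performance per watt of a 780 Ti")   # ~1.62x
```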
 
I wonder what the O/C potential on that thing would be....
 