Videocardz says: ASUS STRIX GTX 970 and ASUS GTX 980 listed

GoldenTiger

The Synnex Thailand listing can be viewed here: http://www.synnex.co.th/th/Products/ProductDetail.aspx?ProductID=U1RSSVgtR1RYOTcwLURDMk9DLTQ=

http://videocardz.com/52125/asus-strix-gtx-970-asus-gtx-980-listed

http://cdn.videocardz.com/1/2014/09/ASUS-STRIX-GTX-970-4GB-OC.png

Apparently a custom-cooled OC STRIX model is even hitting right at launch. 4GB and 9xx nomenclature confirmed.

My favorite new meme:

[image: 2EzhH3r.gif]


We should expect to see listings filter into the e-tail sites that syndicate distributors within the next few days... usually when cards show up like this they hit retail within about a week, which doesn't line up with NVIDIA holding their Game24 event on the 18th. Maybe the launch is sooner?
 
I'm not very excited about this release.

Let's see...

Maxwell: low power consumption, great overclocking potential (judging by the 750 Ti and its commonplace 1350-1450 MHz overclocks), low heat output/high efficiency, much lower pricing than the 780 Ti (current rumors place the GTX 980 at $499 vs. the 780 Ti at $699-739)... and finally, this performance rumor for the 980:

[image: uVM70Zg.jpg (leaked benchmark screenshot)]


which outpaces a typical 780 Ti GPU subscore of ~5100-5200 (not comparing overall score, since the CPU impacts that) by clocking in at ~6600 here, which, assuming the rumors are true, is roughly a 28% performance boost with all of the above factored in.
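For anyone checking the math, here's roughly how that ~28% figure falls out of those subscores (purely an illustrative sketch; the 5150 midpoint and the 6600 number are just the rumored figures quoted above):

Code:
# Illustrative only: uplift implied by the rumored GPU subscores.
gtx_780_ti_subscore = 5150   # midpoint of the ~5100-5200 range
gtx_980_subscore = 6600      # rumored figure from the leaked screenshot

uplift = (gtx_980_subscore / gtx_780_ti_subscore - 1) * 100
print(f"Implied uplift: ~{uplift:.0f}%")   # ~28%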

And you aren't interested even a little? :)

P.S. Yes, the image has the "880" name blurred out, but Catzilla sent out an email mentioning 880 support recently, so we can assume the 880 got renamed to 980 only very recently.
 
Still waiting for specs. I would definitely be interested in the STRIX GTX 970 if it ran a lot cooler and quieter than my 780s, even if it were just a side-grade. I know I've said previously on this forum that I would wait for Pascal to make my next upgrade, but I've got the itch :eek:. Maybe I'll take the 780s and build my dad the computer he has always deserved ;).
 
Let's see...
finally, this performance rumor for the 980

which outpaces a typical 780 Ti GPU subscore of ~5100-5200 (not comparing overall score, since the CPU impacts that) by clocking in at ~6600 here, which, assuming the rumors are true, is roughly a 28% performance boost with all of the above factored in.

And you aren't interested even a little? :)

Have you seen this?

http://videocardz.com/52122/overclockersuk-reference-geforce-gtx-980970-4gb-8gb-models-also-planned
Quote:
Thankfully, Gibbo from OverclockersUK spilled some beans about the GTX 980 and GTX 970. According to him, NVIDIA is planning 4GB reference models and also 8GB models with custom cooling solutions 'at a later date'. Of course, we also heard about 8GB models. In fact, flagship cards with doubled memory are nothing surprising. According to OCUK's and our information, GTX 980 and GTX 970 8GB models are expected somewhere between November and December.

Gibbo did not comment on GM204 performance, but he did mention that GTX 980 replaces GTX 780, not GTX 780 Ti.


Quote:
Gibbo:

980 / 970 4GB reference cards with 8GB coming at a later date with custom coolers. Can’t say any more but don’t expect a huge leap in performance over current single GPU stuff.

Quote:
Gibbo:

They replace 780, not 780Ti.

Performance wise, my lips are sealed on how it compares to 780Ti, but of course 980 is quicker than 780 for sure!
 
Have you seen this?

http://videocardz.com/52122/overclockersuk-reference-geforce-gtx-980970-4gb-8gb-models-also-planned
Quote:
Thankfully, Gibbo from OverclockersUK spilled some beans about the GTX 980 and GTX 970. According to him, NVIDIA is planning 4GB reference models and also 8GB models with custom cooling solutions 'at a later date'. Of course, we also heard about 8GB models. In fact, flagship cards with doubled memory are nothing surprising. According to OCUK's and our information, GTX 980 and GTX 970 8GB models are expected somewhere between November and December.

Gibbo did not comment on GM204 performance, but he did mention that GTX 980 replaces GTX 780, not GTX 780 Ti.


Quote:
Gibbo:

980 / 970 4GB reference cards with 8GB coming at a later date with custom coolers. Can’t say any more but don’t expect a huge leap in performance over current single GPU stuff.

Quote:
Gibbo:

They replace 780, not 780Ti.

Performance wise, my lips are sealed on how it compares to 780Ti, but of course 980 is quicker than 780 for sure!


Yes, I posted those on OCN ;). And if you read the actual quotes, the only two things he actually says are:

"GTX 980 will be faster than 780."

and in response to a pricing question
"They replace 780, not 780Ti".

Nothing more, nothing less.
 
Yes, I posted those on OCN ;). And if you read the actual quotes, the only two things he actually says are:

"GTX 980 will be faster than 780."

and in response to a pricing question
"They replace 780, not 780Ti".

Nothing more, nothing less.

Uhm, you forgot this:

Can’t say any more but don’t expect a huge leap in performance over current single GPU stuff.
 
20% to 30% could make a big difference if you're trying to hit 120 FPS steadily, though. I'm waiting this one out and not replacing my 3 GTX 780s just yet.
 
20% to 30% could make a big difference if you're trying to hit 120 FPS steadily, though. I'm waiting this one out and not replacing my 3 GTX 780s just yet.

20-30% can make a huge difference in upping the details, AA, etc. when you're just trying to hold 60 fps/vsync in some games. IMO, that's nothing to sneeze at.
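To put rough numbers on that (purely illustrative, not a benchmark; plug in whatever uplift the rumors end up being):

Code:
# Illustrative only: baseline fps that a 20-30% uplift would carry to a vsync target.
for target in (60, 120):
    for uplift in (0.20, 0.30):
        baseline = target / (1 + uplift)
        print(f"+{uplift:.0%}: ~{baseline:.0f} fps today becomes a steady {target} fps")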
 
256-bit bus, kids...

Improved scheduler, 8x the on-chip L2 cache, and a brand-new architecture, kid. :) Bus width isn't the only factor in a GPU's performance.

Even first-gen Maxwell in the 750 Ti delivers ~90% of GTX 660 performance, despite the 660 having 50% more cores, a 50% wider bus, etc., while chomping down far more power and putting out far more heat. ;)
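To illustrate why the raw bus number alone doesn't tell you much, here's the back-of-the-envelope bandwidth math (a sketch only; the 7 Gbps GDDR5 data rate is an assumption for comparison, not a confirmed GTX 980 spec):

Code:
# Illustrative only: raw memory bandwidth from bus width and effective data rate.
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gb_s(384, 7.0))  # 336.0 GB/s, a 780 Ti-style 384-bit bus
print(bandwidth_gb_s(256, 7.0))  # 224.0 GB/s, a 256-bit bus at the same data rate
# A larger L2 and smarter scheduling/compression are how the narrower bus can still keep the GPU fed.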
 
Uhm, you forgot this:

Can’t say any more but don’t expect a huge leap in performance over current single GPU stuff.

I didn't forget it; you simply didn't quote it ;), and it was irrelevant to my post anyway. A "huge leap" is subjective; 15, 20, or 30% would be a nice jump for most people compared to a 780 Ti while not being a "huge leap," particularly at this kind of price point.

Your point?
 
P.S. Yes, the image has the "880" name blurred out, but Catzilla sent out an email mentioning 880 support recently, so we can assume the 880 got renamed to 980 only very recently.

I mean it can't have been that recently. GTX "X"80 is on the box!
 
I didn't forget it; you simply didn't quote it ;), and it was irrelevant to my post anyway. A "huge leap" is subjective; 15, 20, or 30% would be a nice jump for most people compared to a 780 Ti while not being a "huge leap," particularly at this kind of price point.

Your point?

My point is that it makes me wonder whether the rumors that the 980 will be faster than the 780 but slower than the 780 Ti are true.

edit - BTW, I did quote it the first time; go re-read my post.
 
My point is that it makes me wonder whether the rumors that the 980 will be faster than the 780 but slower than the 780 Ti are true.

edit - BTW, I did quote it the first time; go re-read my post.


That's what it sounds like to me as well. When I hear "no big leap in performance" over the 780, I compare it directly to the 780 Ti, which mysteriously stays out of the conversation. The performance gap between those two cards is NOT so large that a comparison can't be made.
 
That's what it sounds like to me as well. When I hear "no big leap in performance" over the 780, I compare it directly to the 780 Ti, which mysteriously stays out of the conversation. The performance gap between those two cards is NOT so large that a comparison can't be made.

My point is that it makes me wonder whether the rumors that the 980 will be faster than the 780 but slower than the 780 Ti are true.

edit - BTW, I did quote it the first time; go re-read my post.

Sorry, I missed it in your post then :).

Re: your point, Rinaldo: when's the last time NVIDIA released a new flagship that was slower than the previous one they EOL'd to make room for it in their lineup?

Re: Liger88, saying that doesn't actually say much on Gibbo's part at all; that's the whole point of the exercise. He got OCUK some publicity without violating the NDA in a way that would earn him more than a slap on the wrist, or even a stray comment, from NVIDIA ;). The GTX 780 vs. Ti gap is at best a 20% margin, so saying it's "not a big leap" over the "existing cards" doesn't mean much, particularly when he didn't word it clearly enough to say which existing cards he was talking about.


P.S. Can we stop the new-age "GHz myth," a.k.a. the bandwidth lollygagging? I feel like I got time-warped back to the Athlon 64 X2 days, having to explain to people that just because the MHz number was lower, it didn't mean the CPU was slower than a Pentium 4 :p. I'll quote myself again, since people keep parroting the bus width as though it's definitely an issue:

Improved scheduler, 8x the on-chip L2 cache, and a brand-new architecture :). Bus width isn't the only factor in a GPU's performance.

Even first-gen Maxwell in the 750 Ti delivers ~90% of GTX 660 performance, despite the 660 having 50% more cores, a 50% wider bus, etc., while chomping down far more power and putting out far more heat. ;)

From Tsumi @ OCN as a summary of pricing history:
GTX 280 launched at $650, then later had a price reduction to $500 after AMD's HD4870.
GTX 285 launched at $340 with the dual-GPU card launching at $500.
GTX 480 launched at $500.
GTX 580 launched at $500.
GTX 680 launched at $500, despite being equal to or faster than, and significantly more efficient than, the HD 7970 at $550.
GTX 780 launched at $650, with the Ti following months later at $700.

Each card was faster than the last ;). Bus widths respectively were:
GTX 280 512-bit
GTX 285 512-bit
GTX 480 384-bit
GTX 580 384-bit
GTX 680 256-bit
GTX 780 384-bit

and now we have GTX 980 99% confirmed as 256-bit.
 
IA "huge leap" is subjective; 15, 20, 30% would be a nice jump for most people compared to a 780 ti while not being a "huge leap", particularly at this kind of price point.

Your point?

Exactly!! Who the hell with a 780/Ti is really looking for or needs an upgrade in this next gen??? The cost-effectiveness just isn't there for 780 owners, and it makes more sense for them to wait a little longer for the true Maxwell part. It's really not relevant to them. Now, the huge bump in performance would be for people like me who are looking at moving from 2x 660 Tis to a $399 970 or a $499 regular 980. Now that's going to be a performance jump and power reduction! ;)
 
Exactly!! Who the hell with a 780/Ti is really looking for or needs an upgrade in this next gen??? The cost-effectiveness just isn't there for 780 owners, and it makes more sense for them to wait a little longer for the true Maxwell part. It's really not relevant to them. Now, the huge bump in performance would be for people like me who are looking at moving from 2x 660 Tis to a $399 970 or a $499 regular 980. Now that's going to be a performance jump and power reduction! ;)

Definitely. The only reason I sold my 780, which was previously keeping me happy at 2560x1440 @ 110 Hz, is because I went 4K. Now I plan on two GTX 970s for SLI. But otherwise? I wouldn't even be in the market...
 
Re: your point, Rinaldo: when's the last time NVIDIA released a new flagship that was slower than the previous one they EOL'd to make room for it in their lineup?

Actually, that was discussed:

Quote:
Unknown-One:

Yup, the 8800 GTS 512MB, 9800GTX, 9800GTX Rev2, and GTS 250 are all fairly similar, but not identical... That said, an 8800 Ultra keeps up with all of them, usually beating them at high resolutions. They released piles of cards after the 8800 Ultra and nothing managed to beat it until the GTX 260 launched.

Newer cards with higher numbers aren't always faster. If the GTX 980 turns out to be slightly slower than a GTX 780 (with lower power consumption, quieter operation, and a lower price tag), I won't be surprised in the least.
 
The GTX 980 will be faster than the 780, nothing shocking there. Whether it will be faster than the 780 Ti (NVIDIA's single-GPU flagship) is the big mystery.
 
The GTX 980 will be faster than the 780, nothing shocking there. Whether it will be faster than the 780 Ti (NVIDIA's single-GPU flagship) is the big mystery.

Exactly. Even better, the Maxwell drivers will mature over time, which could mean it ends up as fast or faster in the long run, barring situations that are bandwidth-limited but not memory-capacity-limited (3 GB on a 384-bit bus vs. 4 GB on a 256-bit bus).

Now if people would stop saying the 980 isn't "true" Maxwell, that'd be great...
 
I mean it can't have been that recently. GTX "X"80 is on the box!

They had all their AIB partners scrambling around with Sharpies, scribbling out the 8 and writing in a 9 between "GTX" and "80" on the cards and boxes before shipping them out :p

[image: cCA8F3Y.jpg]
 
They had all their AIB partners scrambling around with Sharpies, scribbling out the 8 and writing in a 9 between "GTX" and "80" on the cards and boxes before shipping them out :p

[image: cCA8F3Y.jpg]


Sharpie Mania Strikes Wal-Mart Stores Nationwide foxnews.com
http://www.foxnews.com/2014-09-04/us/sharpiemania
Across the nation, a shortage of Sharpie branded markers has been reported for as yet unknown reasons. Manufacturer 3M has their annual market forecast report slated for September 15th, but there has been no comment as to analyst remarks regarding a positive seasonal adjustment of forecasts due to the sudden sales improvement. Read more...

:D
(/sarcasm and /satire before some legal eagle decides this needs removal, as it is made up as a joke :) ).
 
Exactly!! Who the hell with a 780/Ti is really looking for or needs an upgrade in this next gen??? The cost-effectiveness just isn't there for 780 owners, and it makes more sense for them to wait a little longer for the true Maxwell part. It's really not relevant to them. Now, the huge bump in performance would be for people like me who are looking at moving from 2x 660 Tis to a $399 970 or a $499 regular 980. Now that's going to be a performance jump and power reduction! ;)

I guess the only reason I can think of, and it's mine personally, is VRAM.

I'm hitting 2.5 GB easily at 2560x1440 in BioShock Infinite, which as we know is pretty old now.

I need dat 8 GB
 
I guess the only reason I can think of, and it's mine personally, is VRAM.

I'm hitting 2.5 GB easily at 2560x1440 in BioShock Infinite, which as we know is pretty old now.

I need dat 8 GB

With what level of AA? On a 4K monitor MSAA is unneeded, and FXAA provides better overall image quality (far more so than at lower resolutions, since the algorithm has more pixels to work with) while chewing up minimal VRAM in comparison and carrying a very low performance impact.

And even if you do use MSAA at 4K, you shouldn't be hitting anywhere near 4 GB of usage, even in more demanding games, for now at least:
http://hexus.net/tech/reviews/displays/57849-asus-pq321q-4k-gaming-tried-tested/?page=7

I will admit I am very strongly tempted to go EVGA so I can step up to any possible 8GB model that may be released, since I run a 4K monitor. At 1440p I wouldn't be concerned at all for the next couple of years.
 
With what level of AA? On a 4K monitor MSAA is unneeded, and FXAA provides better overall image quality (far more so than at lower resolutions, since the algorithm has more pixels to work with) while chewing up minimal VRAM in comparison and carrying a very low performance impact.

And even if you do use MSAA at 4K, you shouldn't be hitting anywhere near 4 GB of usage, even in more demanding games, for now at least:
http://hexus.net/tech/reviews/displays/57849-asus-pq321q-4k-gaming-tried-tested/?page=7

I will admit I am very strongly tempted to go EVGA so I can step up to any possible 8GB model that may be released, since I run a 4K monitor. At 1440p I wouldn't be concerned at all for the next couple of years.

You confused me.

I am running 2560x1440 as stated, and I am hitting 2.5 GB of VRAM usage as stated; I'm not running 4K and hitting nearly 4 GB of VRAM usage as you mentioned.

And I am just using "AA on" from the BioShock options, not forcing anything from the NVIDIA control panel.
 
You confused me.

I am running 2560x1440 as stated, and I am hitting 2.5 GB of VRAM usage as stated; I'm not running 4K and hitting nearly 4 GB of VRAM usage as you mentioned.

And I am just using "AA on" from the BioShock options, not forcing anything from the NVIDIA control panel.

Re-read my post, please, then respond to what I actually wrote, which demonstrates that a more demanding scenario doesn't even need 4GB, let alone the 8GB you are saying you "need".
 
Re-read my post, please, then respond to what I actually wrote, which demonstrates that a more demanding scenario doesn't even need 4GB, let alone the 8GB you are saying you "need".

How can hitting 2.5 GB of VRAM on a 3GB flagship card, playing a game over a year old, not be grounds to want more VRAM in the upcoming GPUs?

Please explain that.

*waits for some bullcrap story about it being allocated VRAM
 
How can hitting 2.5 GB of VRAM on a 3GB flagship card, playing a game over a year old, not be grounds to want more VRAM in the upcoming GPUs?

Please explain that.

*waits for some bullcrap story about it being allocated VRAM

A) It is allocated, which isn't a "bullcrap story", but B) that's not what I'm even talking about anyway... why you believe you need 8GB when you're barely managing to allocate 2.5 GB is beyond me, particularly when higher resolutions don't hit the 4GB that a GTX 970/980 will provide. Your opinion is based on faulty logic with no factual basis. That, friend, is my point.

P.S. Interesting that you added the allocated-VRAM line over 3.5 hours after your original post :p.
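If you want to watch that allocated number yourself rather than argue about it, here's a minimal sketch using NVIDIA's NVML Python bindings (assumes the pynvml package is installed; note the "used" figure is what the driver has allocated for the process(es), not what a game actively touches each frame, which is the whole allocated-vs-used distinction here):

Code:
# Minimal sketch: poll VRAM via NVML (pip install pynvml).
from pynvml import (nvmlInit, nvmlShutdown,
                    nvmlDeviceGetHandleByIndex, nvmlDeviceGetMemoryInfo)

nvmlInit()
try:
    handle = nvmlDeviceGetHandleByIndex(0)   # first GPU
    mem = nvmlDeviceGetMemoryInfo(handle)    # .total / .used / .free, in bytes
    print(f"VRAM allocated: {mem.used / 2**30:.2f} GiB of {mem.total / 2**30:.2f} GiB")
finally:
    nvmlShutdown()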
 
A) It is allocated, which isn't a "bullcrap story", but B) that's not what I'm even talking about anyway... why you believe you need 8GB when you're barely managing to allocate 2.5 GB is beyond me, particularly when higher resolutions don't hit the 4GB that a GTX 970/980 will provide. Your opinion is based on faulty logic with no factual basis. That, friend, is my point.

P.S. Interesting that you added the allocated-VRAM line over 3.5 hours after your original post :p.

Yeah, I was just thinking that would be your reply, so I thought I'd add it. I was right.

So, you're telling me future games will not require more VRAM than a game made a year ago? What about 2 years from now? What about modded games? Modded Skyrim? Modded Minecraft? ENBs? Resolution scaling?
 
To be fair, you might be forced to "need more vRAM" due to how badly PC games are optimized these days. I mean, Watch Dogs on Ultra at just 1080p WITH NO AA chews up 3+ GB of vRAM (I've seen it chew up 3.12 GB on my 780M with 4GB of vRAM).

Sure, this may be an extreme example, but the point is that having 8GB of vRAM means you'll be prepared for even the most terribly optimized games for the next 2 years, at least at 1440p.
 
Yeah, I was just thinking that would be your reply, so I thought I'd add it. I was right.

So, you're telling me future games will not require more VRAM than a game made a year ago? What about 2 years from now? What about modded games? Modded Skyrim? Modded Minecraft? ENBs? Resolution scaling?

My reply actually had nothing to do with allocation or what you added; I changed the keyword because you're correct that calling it "usage" is improper ;), even though you called it "bullcrap" while pointing it out. You persist in not bothering to read what was written and instead zeroed in on a brief aside made about your silly addition :p.

What I actually said is that you're making a faulty leap of logic, going from believing you need/are using 2.5-3 GB to suddenly claiming you need 8 GB at a relatively small 1440p resolution. Needing 8 GB there is extremely unlikely to become normal within the next few years, considering 3-4 GB is currently more than plenty for 4K, a far larger resolution than yours ;).

Given that, your 1440p display will likely not benefit from more than 4 GB outside of a tiny handful of unoptimized ports in the next few years, by which time you'll be shopping for a replacement card anyway, since it won't be strong enough to push games asking for 8 GB of video memory by then :p.

The rest of your random questions, which aren't related to anything that was actually stated, written, or implied, don't warrant a glance.
 
To be fair, you might be forced to "need more vRAM" due to how badly PC games are optimized these days. I mean, Watch Dogs on Ultra at just 1080p WITH NO AA chews up 3+ GB of vRAM (I've seen it chew up 3.12 GB on my 780M with 4GB of vRAM).

Sure, this may be an extreme example, but the point is that having 8GB of vRAM means you'll be prepared for even the most terribly optimized games for the next 2 years, at least at 1440p.

Sure, but if he only needs 2.5-3 GB now, why he would expect to suddenly need 8 GB boggles me :). One or two ridiculously unoptimized titles aren't particularly indicative of an industry trend, especially considering that VRAM usage pretty much plateaued for several years because of the consoles. That is likely to repeat itself rather quickly, especially when you consider the current consoles only have about 5 GB available between system and video memory combined, with most titles likely using in the neighborhood of 1.5-2 GB for the video side of things.
 
Because shitty optimization knows no bounds lol.

No, but seriously, would you really want to run into another Watch Dogs situation, where your 780 Ti stutters because of some stupid vRAM wall that was purely due to terrible optimization? At this rate I fear the next AAA titles will start gobbling up 4GB of vRAM even at 1080p.

8GB might be overkill, but it'll at least keep you from running into that vRAM wall for the next 2 years. Whether it's worth the money is entirely up to the individual, of course.
 