NVIDIA Unveils Pascal GPU: 16GB of memory, 1TB/s Bandwidth

I'll quote the one and only comment on that page because it definitely sums it up.

"Unless I'm dreaming, we've known this information for months... how is this news?"
 
Hopefully we'll see the consumer version of these cards with at least 8GB standard now. Can't imagine they'd ship a GTX 1080 (or whatever) with 16GB.
 
And somehow game developers will find a way to use all that RAM with their poorly optimized piece-of-shit games.
 
How do you draw a line between 'Poorly optimized' and 'demanding'?

I always find this term to be very vaguely, if ever, defined.
 
Soooo...I guess I will wait for the Titan X successor for the new build then.
 
How do you draw a line between 'Poorly optimized' and 'demanding'?

I always find this term to be very vaguely, if ever, defined.

Well for me personally if it doesn't look like Crysis 3, yet needs more hardware and runs worse, then it's poorly optimized.
 
That's exactly it: if Game A and Game B look equally advanced, yet Game A runs at 1080p 60 FPS on a GTX 970 while Game B runs at 1080p 40 FPS on the same hardware, you're looking at a difference in optimisation quality.
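The comparison above can be boiled down to a toy metric: on identical hardware, with visuals judged equivalent, the frame-rate ratio is the "optimization gap". The function name and numbers here are illustrative, not from any real benchmark.

```python
def optimization_gap(fps_a: float, fps_b: float) -> float:
    """Return how much faster game A runs than game B, as a ratio,
    assuming both are tested on the same hardware at the same settings."""
    return fps_a / fps_b

# Game A: 60 FPS at 1080p on a GTX 970; Game B: 40 FPS on the same card.
gap = optimization_gap(60, 40)
print(f"Game A delivers {gap:.1f}x the frame rate for equivalent visuals")
```

Of course, this only works if "look equally advanced" is agreed on first, which is exactly where the argument usually breaks down.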
 
HL3? Bwahaha! You'll see the dodo bird in the sky before you see HL3. Silly kid, Valve delivers games now; they don't make them (unless you count those stupid zombie Left 4 Dead 'games' as games). And they won't sell the rights to someone who can and will.
 
Astro, aren't they the makers of Dota? Most popular game in the world right now.

Anything that involves the words "online", "multiplayer", "arena", "p2p", "pvp", or "p2w" I don't consider a game.
 
This card should be a monster. Hopefully they will keep the RAMDAC for CRT use. And it best be out at least a few months before AMD's new cards, if they want to really punish AMD.
 
How do you draw a line between 'Poorly optimized' and 'demanding'?

I always find this term to be very vaguely, if ever, defined.

The line is very clear; it's people who use these terms loosely, and often incorrectly.
 
pffft you still need Titan X SLI for the best possible 4K experience, and 5K is almost 2x that of 4K. So while GP100 might be the first GPU that could handle 4K all on its own, throw 5K at it and watch it get reduced to tears. :D (5K to GP100 would be what 4K is to Titan X right now basically)

Anyway, what I'm trying to get at is that it's really not that difficult to shift the bottleneck from the CPU to the GPU, if you really have "too much" GPU power.
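A quick check of the resolution claim above: 5K really does carry nearly twice the pixels of 4K, so per-frame GPU work scales up accordingly.

```python
# Pixel counts for the resolutions mentioned in the thread.
resolutions = {
    "1080p": (1920, 1080),
    "4K":    (3840, 2160),
    "5K":    (5120, 2880),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}
ratio = pixels["5K"] / pixels["4K"]
print(f"5K/4K pixel ratio: {ratio:.2f}")   # ~1.78x, i.e. "almost 2x"
```

So raising the resolution is the easiest way to move the bottleneck back onto the GPU, exactly as the comment says.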
 
Well for me personally if it doesn't look like Crysis 3, yet needs more hardware and runs worse, then it's poorly optimized.

Lol, pretty much this. If a game requires or uses more hardware and the results are equivalent to, barely better than, or sometimes worse than other AAA titles, it's poorly optimized.
 
No CPU will be able to keep up with this though.

5960X @ 5GHz or 6GHz should have no problem. Those clocks would need quite a TEC or phase change for cooling, but even at 4.5, which is possible on water, there should not be bottlenecks at all with one card and only minor issues in SLI if at lower resolutions and desiring high framerates (i.e: 1280x960 @ 240, 1600x1200 @ 200).
 
5960X @ 5GHz or 6GHz should have no problem. Those clocks would need quite a TEC or phase change for cooling, but even at 4.5, which is possible on water, there should not be bottlenecks at all with one card and only minor issues in SLI if at lower resolutions and desiring high framerates (i.e: 1280x960 @ 240, 1600x1200 @ 200).

Running 4.6 WC and still waiting for a game that utilizes that. With all the shit unoptimized POS4 and XBONE ports, I don't see much need for a high OC unless you're benchmarking for ePeen.

Please let me know if there's a game right now where even 4.5 is needed. The only reason I'm even running that is for non-gaming purposes.
 
Well for me personally if it doesn't look like Crysis 3, yet needs more hardware and runs worse, then it's poorly optimized.

Thank you... I can't believe how little we have progressed in graphics since this game. I just reloaded it last week, and it still looks better and runs on older hardware than most everything out.
 
That's still a rather superficial and open (poorly defined) metric.


It is, however, still a 100% valid and truth-telling metric for anyone who considers themselves a "gamer"... Crysis 3 is an ANCIENT game in the gaming world, still looks better than most everything put out so far this year, and can do it using less compute than the newest titles.

Honestly, if I were in game development I would literally ask, "Does our game look better than Crysis 3 on the same hardware?" If yes, then we did our job; if no, then we rewrite the whole dang engine and do it again.

That being said, some games are not meant to look great graphically... Fallout 4, for example, is EXTREMELY underwhelming for graphics, but that franchise has never been about cutting-edge graphics, so... meh.
 
It is, however, still a 100% valid and truth-telling metric for anyone who considers themselves a "gamer"... Crysis 3 is an ANCIENT game in the gaming world, still looks better than most everything put out so far this year, and can do it using less compute than the newest titles.

It came out in 2013. That's "ancient"?? Also, it's more or less a corridor shooter interspersed with a few open-ish areas. All the big AAAs this year have been huge open-world sandbox games (GTA 5, Witcher 3, Fallout 4). Not exactly an apples-to-apples comparison.

In any case, CryEngine has always been in a league of its own, and I hope we'll see another big AAA built upon it sometime soon.
 
25% faster or more than a 980ti and I'm buying on launch.

Can't wait to get rid of SLI.

16GB seems excessive considering nothing comes close to my current 6GB cards, but this is possibly the figure for the Tesla cards with consumer models having more reasonable numbers.
 
As long as the antique consoles make up the largest segment of gamers, we won't see huge leaps in graphics quality. The XBone and PS4 can't even push 1080p in most games. Why push the envelope when only PC gamers can take advantage of it?

Biggest disappointment in gaming for the next 8 years, when they might release a console that's equal to 2015 PCs.
 
As long as the antique consoles make up the largest segment of gamers, we won't see huge leaps in graphics quality. The XBone and PS4 can't even push 1080p in most games. Why push the envelope when only PC gamers can take advantage of it?

Biggest disappointment in gaming for the next 8 years, when they might release a console that's equal to 2015 PCs.

I remain more optimistic than you. Unlike previous generations of consoles, the current batch is based on multi-core x86 CPUs, running low-level APIs that share a lot with Vulkan/DX12 thanks to their underlying GPUs. There is no reason 2016 console/PC games should be held back. It will take time as the industry moves into this new era, but it will happen.
 
The more VRAM we have, the more games will use it, just to cache/preload stuff. It can mean longer loading screens but far fewer of them, which I'm all for.

Who knows, you might only have to preload the assets once at launch, with 16GB VRAM. Which is nice.
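The preload-once idea above comes down to whether the working set fits in VRAM. Here's a minimal sketch of that trade-off; the `AssetCache` class, asset names, and sizes are all made up for illustration, not taken from any real engine.

```python
class AssetCache:
    """Toy model of a VRAM asset cache: preload what fits, stream the rest."""

    def __init__(self, capacity_mb: int):
        self.capacity_mb = capacity_mb
        self.resident = {}          # asset name -> size in MB
        self.used_mb = 0

    def preload(self, assets: dict) -> list:
        """Load every asset that fits; return the names left to stream."""
        overflow = []
        for name, size in assets.items():
            if self.used_mb + size <= self.capacity_mb:
                self.resident[name] = size
                self.used_mb += size
            else:
                overflow.append(name)
        return overflow

# Hypothetical level set: with 16 GB of VRAM it all fits at launch;
# with 4 GB, most of it must still be streamed mid-game.
assets = {"level1": 3000, "level2": 3500, "textures": 6000, "audio": 1500}
big, small = AssetCache(16384), AssetCache(4096)
big_leftover = big.preload(assets)
small_leftover = small.preload(assets)
print("16GB VRAM, still to stream:", big_leftover)
print("4GB VRAM, still to stream:", small_leftover)
```

Real engines stream with far more sophistication (residency priorities, mip levels, eviction), but the point stands: a big enough cache turns many short stalls into one long load.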
 
Of course it is nice, but not on your wallet. Going by these pre-launch hype specs, I hope you thought the Titan X was affordable. LOL!
 
Holy Hell!

Glad I bought 32GB of DDR4 with my 6700K last month! I also have the GTX 980 reference design. Sigh, my son will be thankful, I guess, for the unexpected upgrade from his 670.

I feel sorry for AMD, but I've been an Nvidia fan since the Riva 128. I had ATI/AMD cards twice, but drivers led me back to Nvidia.
 
It is, however, still a 100% valid and truth-telling metric for anyone who considers themselves a "gamer"... Crysis 3 is an ANCIENT game in the gaming world, still looks better than most everything put out so far this year, and can do it using less compute than the newest titles.

Honestly, if I were in game development I would literally ask, "Does our game look better than Crysis 3 on the same hardware?" If yes, then we did our job; if no, then we rewrite the whole dang engine and do it again.

That being said, some games are not meant to look great graphically... Fallout 4, for example, is EXTREMELY underwhelming for graphics, but that franchise has never been about cutting-edge graphics, so... meh.

Or maybe that "gamers" are not actually as well informed, knowledgeable, and analytical as they may think they are?

Crysis 3's performance varies extremely between its own segments, yet if you compared screenshots of those segments, no one would inherently say the slower areas look "better" (how are we even defining better-looking?).

Take Crysis 3, expand the environment sizes, increase the number of AI actors, and increase how much simulation they require. This will add to the performance demands of the game, yet visually it will be basically identical. Did the game suddenly become less optimized?

This is a small sample of why such a superficial and vague criterion is not very useful.
 
I'm waiting for benchmarks.
The numbers sound good, but memory bandwidth and VRAM size alone aren't enough to sway me.
Benchmarks sway me.
 
Or maybe that "gamers" are not actually as well informed, knowledgeable, and analytical as they may think they are?

Crysis 3's performance varies extremely between its own segments, yet if you compared screenshots of those segments, no one would inherently say the slower areas look "better" (how are we even defining better-looking?).

Take Crysis 3, expand the environment sizes, increase the number of AI actors, and increase how much simulation they require. This will add to the performance demands of the game, yet visually it will be basically identical. Did the game suddenly become less optimized?

This is a small sample of why such a superficial and vague criterion is not very useful.

I think I understand the point you're trying to make, but how would you go about comparing game optimization then, especially if the games run on different engines?

Or is your point that terms such as "poorly optimized" are inherently meaningless because each game is different?
 
[image: Joseph Ducreux meme — "HEY THEE, FINE SIR, YOUR PERSONAL DEVICE, CAN IT RUN CRYSIS?"]
 