NVIDIA Unveils Pascal GPU: 16GB of Memory, 1TB/s Bandwidth

I think there will be third-party frameworks to help with SLI being in developers' hands.
 

Not a valid comparison. The fact is that for the screen sizes most anyone uses these days, be it TVs or a monitor, 4K is in fact enough. Anything more is just a waste. I could see a greater-than-4K resolution being needed at the larger end of the scale, TV screens 60+ inches. Anything smaller, and the pixel density is already enough.
 
Not a valid comparison. The fact is that for the screen sizes most anyone uses these days, be it TVs or a monitor, 4K is in fact enough. Anything more is just a waste. I could see a greater-than-4K resolution being needed at the larger end of the scale, TV screens 60+ inches. Anything smaller, and the pixel density is already enough.

I remember pretty much the same being said about HD 10-15 years ago.
 
Zarathustra[H];1042004481 said:
SLI

The Pros:
  • Higher frame rate
  • Bragging rights?

The Cons:
  • Poor scaling. Theoretically it should scale 100% (double the performance), but this rarely if ever happens. More typical scaling is 20% or so: add another video card, get only +20% performance. Thus it's usually cheaper to get a single higher-end GPU (if one exists)
  • Increased input lag due to AFR (at 60fps: ~+25ms with two GPUs, ~+42ms with 3 GPUs, ~+58ms with 4 GPUs)
  • Stutter
  • Inconsistent frame times
  • Launch-day game bugs are way worse with SLI. Some bugs never get fixed for SLI users.
  • Higher CPU load (usually not an issue, except in already CPU-heavy titles, especially those that are poorly threaded)
  • Stutter, frame time and CPU load problems get worse the higher the resolution (in other words, when you need SLI the most)
  • Heat & noise produced
  • Power, PCIe lanes and PCIe slots used
  • Having to wait longer to run new titles, until they get a profile. Some titles never get profiles.
  • Constantly having to mess with, tweak and optimize SLI profiles, as the defaults only ever seem to work well in a select few AAA titles that get a lot of attention.
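Those AFR input-lag figures are internally consistent, by the way: the gaps between them are about one frame time (16.7ms at 60fps) per extra GPU, plus roughly half a frame of base overhead. A quick sketch of that model (the half-frame constant is my own assumption, fitted to the numbers above, not an official NVIDIA figure):

```python
def afr_added_lag_ms(num_gpus: int, fps: float = 60.0) -> float:
    """Rough model of extra input lag from alternate-frame rendering.

    Assumes ~half a frame of base buffering plus one extra frame of
    queueing per additional GPU -- an assumption fitted to the figures
    quoted above, not an official number.
    """
    frame_time_ms = 1000.0 / fps
    return (num_gpus - 0.5) * frame_time_ms

for n in (2, 3, 4):
    print(f"{n} GPUs: ~+{afr_added_lag_ms(n):.0f} ms")  # ~+25, ~+42, ~+58
```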

The same applies to Crossfire, just with worse bugs, profiles taking 10x longer to arrive, worse stutter, etc. etc.

Full disclosure: I'm currently on dual 980 Ti's in SLI.

My Crossfire experience is now a few years old (used to have dual Radeon 6970's in 2011)

I understand Crossfire has improved a bit since then, but even so I'd rank it something like this:

Single fast GPU (AMD or Nvidia) >> SLI >> Crossfire


If you can, just don't. Don't go multi-GPU, whether it's on a single video card or on separate video cards. They suffer from the same issues, as dual-GPU video cards just run SLI/Crossfire on a single board, and now you can't even split them up.

If you have the option - for the same performance - to choose one faster GPU over two slower GPUs, always do so, even if it costs more.

The only reason I have SLI is because there is currently no single GPU solution fast enough for my needs at 4k resolution.

The only reason I had the dual 6970's back in the day was because at that time there was no single GPU solution fast enough for my needs at 2560x1600.


In 2012, as soon as the 7970 launched, I got one and replaced my two 6970's. I got a SLIGHTLY lower raw framerate, but everything else improved: it felt more responsive, the stutter was gone, all games just worked, etc. etc. I was much happier.

As long as I can get 25%+ improvement with a big Pascal over a single 980ti, I plan on doing the same.

If we ever go 8k, I might have to do SLI again, but 4k is plenty of resolution, and I'd rather just not.

Thanks for taking your time to write that up, very informative!
 
I remember pretty much the same being said about HD 10-15 years ago.

True, but those of us on 1440p, 1600p and 4K now are generally on larger screens.

For a 24" screen, 1920x1200 was a great resolution. No need to go higher at 24".

For a 30" screen, 2560x1600 is a great resolution. No need to go higher on a 30" screen.

How much larger than a 42-44" 4k screen do you actually go, before it becomes silly?

On the desktop, at a viewing distance of 2-2.5 ft, going much above 100-110ppi is just a waste. Move further away, like you would sit from a larger TV, and that PPI figure goes way down.

For an 8K screen to make sense at desktop viewing distances, we are talking about using ~85" screens at ~2ft distance. That is just silly, as the screen would extend WAY outside your peripheral vision.

Now, I am sure - because I know how product marketing works - there will be a "bigger numbers are better" mentality and screens will come in higher and higher resolutions at the same sizes (like those silly 28" 4K monitors on Monoprice), but the truth is, for desktop use, the diminishing returns curve falls off VERY quickly above 100-110ppi. Between 110 and 200ppi you really have to focus on still images to pick out the differences. Once you hit about 200ppi, the human eye can no longer make out any difference at all at 2-2.5ft.

Sure, you can lean in and look closer, but that's not how you use a desktop screen. These aren't smartphones. We aren't going to be holding them a couple of inches from our faces.
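For anyone who wants to check these PPI claims against their own setup, here's a quick sketch. The 1-arcminute threshold is the common 20/20-acuity rule of thumb, not a hard physiological limit:

```python
import math

def ppi(diagonal_in: float, width_px: int, height_px: int) -> float:
    """Pixels per inch from screen diagonal and native resolution."""
    return math.hypot(width_px, height_px) / diagonal_in

def one_arcmin_distance_ft(ppi_value: float) -> float:
    """Distance (feet) at which one pixel subtends 1 arcminute --
    a common 20/20-acuity rule of thumb, not a hard limit."""
    pixel_pitch_in = 1.0 / ppi_value
    distance_in = pixel_pitch_in / math.tan(math.radians(1.0 / 60.0))
    return distance_in / 12.0

for name, diag, w, h in [('24" 1920x1200', 24, 1920, 1200),
                         ('30" 2560x1600', 30, 2560, 1600),
                         ('43" 3840x2160', 43, 3840, 2160)]:
    p = ppi(diag, w, h)
    print(f'{name}: {p:.0f} ppi, 1 px/arcmin at ~{one_arcmin_distance_ft(p):.1f} ft')
```

All three land near 100ppi with a threshold distance of roughly 3 ft, which lines up with the 100-110ppi-at-desktop-distance argument above.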
 
I don't agree with the scaling assessment. I found that on average my 780's scaled much higher than a mere 20% on each game... but I agree with most everything else. I swapped over to one 980Ti and have been much happier with the gameplay experience.
 
I don't agree with the scaling assessment. I found that on average my 780's scaled much higher than a mere 20% on each game... but I agree with most everything else. I swapped over to one 980Ti and have been much happier with the gameplay experience.

Yeah, it is very title dependent. I can only speak to the titles I play.
 
Zarathustra[H];1042006533 said:
True, but those of us on 1440p, 1600p and 4k now, are generally on larger screens.

For a 24" screen 1920x1200 was a great resolution. No need to go higher at 24"

While I certainly think that's an acceptable resolution (it's what I've got), higher would be better. I just measured the distance from my screen: it's 2 feet away when I recline, and I can see lines at 2 feet; if I sit up I'm 1'8" from the screen and I see pixels. It's not bad, but it's obvious that more PPI would look better, even if all you're looking at is the [H] Forum.

I've seen a 5k 27" Mac and it looks amazing.
 
While I certainly think that's an acceptable resolution (it's what I've got), higher would be better. I just measured the distance from my screen: it's 2 feet away when I recline, and I can see lines at 2 feet; if I sit up I'm 1'8" from the screen and I see pixels. It's not bad, but it's obvious that more PPI would look better, even if all you're looking at is the [H] Forum.

I've seen a 5k 27" Mac and it looks amazing.

The ability to see a line is not the test as to whether or not your resolution is sufficient.

If you use the inability to see a single row of pixels as your determination of sufficient resolution then you are going to err incredibly on the side of more resolution than necessary.

What is the point of drawing a line that you can't see?

The true test is to take a more complex image, a high-resolution photograph for instance, and do a blinded side-by-side test to see if you can pick out more detail or clarity in one or the other.

This is going to be a much trickier proposition. I'd argue that for most people it would take some serious studying of the image to tell any difference above 110ppi at normal viewing distances. Above 200ppi, I'd think that no one can see a difference at all.
 
Zarathustra[H];1042004481 said:
*snip*

Too many wurdz hurt my head. I only know me gamez looks vury much smoother to me with SLI. Right now Fallout 4 and ARK Survival (With help from NVidia inspector) look QUITE NOTICABLEY more smoother. Plus me e-penis too. ;)
 
I don't agree with the scaling assessment. I found that on average my 780's scaled much higher than a mere 20% on each game... but I agree with most everything else. I swapped over to one 980Ti and have been much happier with the gameplay experience.

That was my initial reaction but then I thought about it. 60FPS on SLI feels like 40 on a single card. So my "effective" scaling as perceived by me was probably around 20-30%.

When I went from Titan X SLI to a single Titan X, in a lot of games I didn't even have to change my playable settings. In BF4 I went from 130% scaling to 115%, or roughly a 30% drop in megapixels.
 
Zarathustra[H];1042007318 said:
I've seen a 5k 27" Mac and it looks amazing.

The ability to see a line is not the test as to whether or not your resolution is sufficient.

If you use the inability to see a single row of pixels as your determination of sufficient resolution then you are going to err incredibly on the side of more resolution than necessary.

What is the point of drawing a line that you can't see?

The true test is to take a more complex image, a high-resolution photograph for instance, and do a blinded side-by-side test to see if you can pick out more detail or clarity in one or the other.

This is going to be a much trickier proposition. I'd argue that for most people it would take some serious studying of the image to tell any difference above 110ppi at normal viewing distances. Above 200ppi, I'd think that no one can see a difference at all.

Maybe not, but a 27" 5k monitor is about 217 PPI and a 24" 4K monitor is about 184 PPI, which means they roughly match the minimum PPI you'd want to use to print most normal-size photos --I assume pros might like as much as 300ppi-- and also roughly match what you've listed as the most a person can see. So why not 4K on a 24" monitor? Now if you're gaming, then maybe lower DPI makes sense, since most TVs have far lower pixel density than a 24" 2k monitor.
 
*snip*

I am not sure you know how to properly use the quote feature on the forum... :p
 
Thanks for taking your time to write that up, very informative!

I can pretty much echo everything Zarathustra[H] said about Crossfire / SLI / single faster GPU, having gone a very similar route in terms of GPU upgrades with just a few changes (namely, 8800 GTX -> SLI -> Tri-SLI -> Radeon 4890 Crossfire -> Radeon 6950 unlocked Crossfire -> 980 Ti SLI). 4K also pushed me to 980 Ti SLI, but then ironically I ended up at 3440x1440, since I couldn't find a 4K TV I liked enough to use as a monitor. Still, 5MP is a big step up from 2.3MP and I can be very demanding when it comes to performance / graphics settings, so it's not a total waste.

I don't notice stutter in most titles, but when it's there it's miserable. I don't mind messing with profiles, but it's always frustrating when you can't reuse something and have to wait on AMD/Nvidia to get a new profile out, and yes, the wait time from AMD is much, much longer in most cases. Neither setup scales how it should, though I regularly see more than 20% -- if it were that low in most titles, I wouldn't bother. SLI tends to work better overall than Crossfire, but I realize this may change with Fury X and DX12 titles set up to address it properly. Granted, we're probably a ways off from that yet.
 
I can pretty much echo everything Zarathustra[H] said about Crossfire / SLI / single faster GPU, having gone a very similar route in terms of GPU upgrades with just a few changes (namely, 8800 GTX -> SLI -> Tri-SLI -> Radeon 4890 Crossfire -> Radeon 6950 unlocked Crossfire -> 980 Ti SLI). 4K also pushed me to 980 Ti SLI, but then ironically I ended up at 3440x1440, since I couldn't find a 4K TV I liked enough to use as a monitor. Still, 5MP is a big step up from 2.3MP and I can be very demanding when it comes to performance / graphics settings, so it's not a total waste.

I don't notice stutter in most titles, but when it's there it's miserable. I don't mind messing with profiles, but it's always frustrating when you can't reuse something and have to wait on AMD/Nvidia to get a new profile out, and yes, the wait time from AMD is much, much longer in most cases. Neither setup scales how it should, though I regularly see more than 20% -- if it were that low in most titles, I wouldn't bother. SLI tends to work better overall than Crossfire, but I realize this may change with Fury X and DX12 titles set up to address it properly. Granted, we're probably a ways off from that yet.

One of my big annoyances is that every time I upgrade the driver, I have to re-do the SLI profile changes I made to make Red Orchestra 2 work right.

Nvidia has failed to notice that Red Orchestra 2 and Rising Storm use the same executable, so it is in their profiles twice. One of the profiles is correct, the other is broken, so every time I upgrade the Nvidia driver (which seems to be more and more often lately) I have to open Nvidia Inspector and recreate the changes that make it work right.

It's been this way for a long time, and despite many complaints about it, release after release comes out without a fix.

I've gotten pretty good at it, with backed up profiles now, but it's still a major pain IMHO.
 
GTA V looks amazing and runs at 60+ FPS streaming on my Nvidia Shield console to my 64" F8500 plasma. With all the eye candy on, it destroys all console versions.

Rocket League looks fucking great too (non-SLI game).

My first two cards in SLI were 980s, and I plan on using SLI for every upgrade.
 
Thread derail....

I was an early adopter of SLI.

My first SLI config was 2 x 8800 Ultras. That was in 2007. Yes, those things were literal space heaters. And 2 x of them in even a giant Coolermaster Cosmos still allowed me to cook turkeys inside my PC.

This was back in the days before "apps" like GeForce experience made everything nearly brainless. There was no such thing as a game "profile". SLI was a lot more trouble back then. I gave it up and went to single cards until the 980 came out.

When the 980s were released, I picked one up around launch date. I was not getting a full 60 fps at 3440 x 1440 (was using the flat-screen LG 34" 3440 x 1440 back then) in some heavy titles with the details cranked up. Decided to give SLI another shot and added a 2nd 980. Things for the most part worked fine.

I'm running 2 x Titan X's now, and it's actually needed to get 80-100 FPS in heavy titles at 3440 x 1440 (running the 100Hz Acer X34 now) with the details cranked up. Yeah, like some posters have said, sometimes you have to wait for a profile or use Nvidia Inspector and forums to do a workaround. And for the life of me, I can't understand why every time Nvidia releases a new driver update, SLI defaults to OFF and you have to remember to go into your Nvidia control panel and re-enable SLI. Doing this for each driver update is mildly annoying and I can't understand why they can't fix this.

SLI has come a long way since 2007, but everyone has budget as well as case size concerns. I run a Thermaltake V51 with a custom 3 x radiator setup, and I actually like how 2 cards look vs a single card.
 
Strange how nothing was mentioned about Pascal at CES2016. I guess the release date is still a secret or Nvidia is not sure when they will be ready.
 
Strange how nothing was mentioned about Pascal at CES2016. I guess the release date is still a secret or Nvidia is not sure when they will be ready.

Nvidia should concentrate on fixing what they have, not on what they will have.
G-Sync is a mess, SLI is a marketing gimmick, and the current drivers are a failure.
 
Nvidia should concentrate on fixing what they have, not on what they will have.
G-Sync is a mess, SLI is a marketing gimmick, and the current drivers are a failure.

Strange bro.


I just played a ton of:

Fallout 4
Far Cry 4
GTA V
WarThunder


At 80-100 fps (Have an X34 at 100hz), g-sync and SLI made the experience incredibly smooth.

Not sure what hardware you are using, but Nvidia works fine for me. And SLI is certainly NOT a marketing gimmick at 3440 x 1440 at 100Hz. No single card will go much above 60 fps in AAA titles at high settings. I have 2 x Titan X's OC'ed under water and they still don't hit 100 FPS in all titles at 3440 x 1440.
 
Not sure what hardware you are using but nvidia works fine for me.

And there's the crux of the issue. Works well for some, and not so much for others.

Depends on the game. Depends on the computer hardware. Depends on the drivers. Depends on the number of monitors and resolution.

Lots, and lots, and lots of variables.

My laptop with 780M SLI is gimped by new drivers, as anything past 345.2 causes both cards to downclock to idle/low-power states in certain graphics-heavy situations (at stock, or even underclocked speeds). 345.2 and earlier work perfectly, even if I heavily overclock. I opened a ticket with Nvidia last month and did some testing for them, but I still haven't heard of a fix or whether they are going to work on it.
 
And there's the crux of the issue. Works well for some, and not so much for others.

Depends on the game. Depends on the computer hardware. Depends on the drivers. Depends on the number of monitors and resolution.

Lots, and lots, and lots of variables.

The thing is, this has been true since, well...

the mid 1980s when "the clones" first started appearing. I'm old enough to remember the phrase: "Is it PC compatible?"

It's gotten better, but the issue is never going to go away. You don't want to deal with it anymore? Buy an XBOX.
 
nvidia should concentrate in fixing what they have not in what they will have.
G-SYNC is a mess, SLI is a marketing gimmick, current drivers are a failure.

Don't have a G-Sync monitor, so don't care. Don't use SLI (and hope I never do), so don't care.

I do care about drivers, but I wouldn't be surprised if the drivers for a new architecture are done by a different team than the one doing releases for current h/w. Either way, the faster they put out the new architecture, the sooner it can be discounted :D
 
It's been too long so I could be horribly wrong, but it seemed like when SLI was scan-line interleave and each card did half the work, performance scaled better than two processors trying to pretend they are one big one.
 
It's been too long so I could be horribly wrong, but it seemed like when SLI was scan-line interleave and each card did half the work, performance scaled better than two processors trying to pretend they are one big one.

You can choose Alternate Frame Rendering (AFR 1 or 2) so 1 gpu controls top/left of screen and other controls bottom/right of screen
 
Strange how nothing was mentioned about Pascal at CES2016. I guess the release date is still a secret or Nvidia is not sure when they will be ready.

I disagree. CES is a strange place to talk about an upcoming GPU line. I'd expect something at GDC.
 
You can choose Alternate Frame Rendering (AFR 1 or 2) so 1 gpu controls top/left of screen and other controls bottom/right of screen

Uhh, you are thinking of SPLIT frame rendering there, chief.

AFR, as the name implies, has each chip render a whole frame, and the frames are displayed alternately.
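To make the distinction concrete, here's a toy sketch of how the two classic modes divide work (scheduling only; the function names are mine, no real driver API involved):

```python
def afr_assignments(num_frames: int, num_gpus: int):
    """Alternate Frame Rendering: GPUs take turns rendering whole frames.
    Returns (frame, gpu) pairs."""
    return [(frame, frame % num_gpus) for frame in range(num_frames)]

def sfr_assignments(num_frames: int, num_gpus: int):
    """Split Frame Rendering: every frame is cut into slices,
    one slice per GPU. Returns (frame, slice) pairs."""
    return [(frame, slice_idx) for frame in range(num_frames)
            for slice_idx in range(num_gpus)]

# With 2 GPUs: AFR alternates whole frames between them...
print(afr_assignments(4, 2))   # [(0, 0), (1, 1), (2, 0), (3, 1)]
# ...while SFR has both GPUs contribute a slice to every frame.
print(sfr_assignments(2, 2))   # [(0, 0), (0, 1), (1, 0), (1, 1)]
```

This is also why AFR brings the latency and frame-pacing complaints from earlier in the thread (each finished frame waits its turn in a queue), while SFR avoids the queue but tends to scale worse, since the slices of a frame rarely cost the same to render.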
 
And there's the crux of the issue. Works well for some, and not so much for others.

Depends on the game. Depends on the computer hardware. Depends on the drivers. Depends on the number of monitors and resolution.

Lots, and lots, and lots of variables.

My laptop with 780M SLI is gimped by new drivers, as anything past 345.2 causes both cards to downclock to idle/low-power states in certain graphics-heavy situations (at stock, or even underclocked speeds). 345.2 and earlier work perfectly, even if I heavily overclock. I opened a ticket with Nvidia last month and did some testing for them, but I still haven't heard of a fix or whether they are going to work on it.

Welcome to PC gaming. This is true for every iteration of hardware, each and every release.
 
Welcome to PC gaming. This is true for every iteration of hardware, each and every release.

Been gaming since DOOM on a 286 I cobbled together from my Dad's old work computers (sometime in the mid 90s). Even had a 3dfx Voodoo 5 5500 a few years after that! Real SLI :p
 
Can't wait to see the consumer cards. 6-8 gigs standard? Stable 60 fps at 4K? One can only hope. I can't wait to trade in the 980.
 
They spoke about Pascal at the last GDC (and launched the Titan X), so it'd be kinda weird if there wasn't an update.
 
SLI was old in 2007! I had Voodoo 2s in SLI back in 1998. It worked pretty well back then: only a 50% fps gain over a single card, but I never had SLI-related issues/bugs.

 