Is buying any high-end video card with < 6 GB of RAM a bad idea with 4K on the horizon?

Just hoping to get some discussion going.
My view:
I will most likely be looking at a 4K monitor in the next 1-2 years.
The cards one step down from the very highest end (3 GB of RAM) are already expensive, and I'd like to think of a new video card as a 3+ year investment. Since these cards have only 3 GB of RAM, I will most likely be memory constrained at 4K, and my understanding is that SLI/Crossfire does not add memory for the frame buffer, just processing power (plus SLI/Crossfire has its own issues). So I'm thinking I really should not buy a current-generation video card, but wait for 6 GB cards to enter the mainstream, if I plan on 4K in the next 1-2 years.
Unless I buy a $1,000 Titan now, which is crazy for something I might not need for 1-2 years, since it will cost a lot less by the time I actually need it.
Bottom line: if you plan on 4K (and who doesn't?), I think you should limp along with your current video card for now.
Thoughts?
 
Whatever video card you buy today will be woefully underpowered in 2-3 years, especially at 4K resolution. Halving the RAM on the 780 is a sucker's game to get you to shell out a grand for a Titan. Get the card which delivers the performance you need today and forget future-proofing.
 
Notice they didn't test 780s. I'd bet the 6 GB means almost nothing, and that tri 780s would flow just as well. Maybe 5-10% slower, just like at every other resolution.
 
Since these cards have only 3 GB of RAM, I will most likely be memory constrained at 4K

No, you won't. Your problem will be not having enough GPU performance to drive the resolution. Current video cards will bog down to unplayable framerates long before they hit 3GB of VRAM usage.

The main attraction of 4K gaming is not having to use high levels of anti-aliasing anymore, and maybe getting away with not using any anti-aliasing at all.
 
Said GPU won't be able to run 4K games anyway; that's way too many pixels to shade.
 
In 1-2 years there will be more powerful graphics cards. It would be ridiculous to purchase a graphics card now to drive a 4K display 1-2 years from now.
 
Other threads on [H] indicate that 3 GB VRAM is sufficient and that a 780 is fine if you drop the quality settings.
 
Whatever video card you buy today will be woefully underpowered in 2-3 years, especially at 4K resolution. Halving the RAM on the 780 is a sucker's game to get you to shell out a grand for a Titan. Get the card which delivers the performance you need today and forget future-proofing.

This...

No, you won't. Your problem will be not having enough GPU performance to drive the resolution. Current video cards will bog down to unplayable framerates long before they hit 3GB of VRAM usage.

The main attraction of 4K gaming is not having to use high levels of anti-aliasing anymore, and maybe getting away with not using any anti-aliasing at all.

...and this. Mostly because no GPU made today was designed with 4K displays in mind, so whatever you buy today will be inadequate for such high resolutions.
 
...and this. Mostly because no GPU made today was designed with 4K displays in mind, so whatever you buy today will be inadequate for such high resolutions.

Many GPUs in the past couple of years were made to support 4K displays. This doesn't mean you will be able to game at high settings, though. Heck, my 9600 GT had a DisplayPort on it and hardly anything was out to use it back then.
 
Many GPUs in the past couple of years were made to support 4K displays. This doesn't mean you will be able to game at high settings, though. Heck, my 9600 GT had a DisplayPort on it and hardly anything was out to use it back then.

I don't think putting DP on a card has much to do with 4K; it's more about wanting to use a better standard. Most cards aren't particularly designed to handle large resolutions. The Titan was created for that very reason, but you don't see many cards like that. The 4 GB cards and the 6 GB 7970 were also made to respond to larger displays and multi-display setups, but the majority of cards are designed with 1080p, and perhaps 1440p at the high end, in mind, not 4K.

The main limiting factor is going to be the VRAM going forward. They are going to have to come up with a better way of handling it, or adding more VRAM to the cards. Part of that solution may be combining the abilities of the GPU with the CPU and the main bus to try and make use of overall system RAM rather than just VRAM. But currently, there aren't many solutions for those of us with very high resolutions.
 
I've never heard of this "4K" nonsense - is that the non-Apple term for the Retina technology where you run a crazy DPI-scaled resolution? I imagine it would be similar to running a game with FXAA/MLAA on. With my TITAN I run things with FXAA, 32xCSAA, 8xSSAA all set to "enhance" and then setting the in-game to the highest it can be on games where I'm concerned with it looking nice (like Skyrim). I estimate that it roughly ends up with around 50xAA quality if you look at the effect and ignore the difference in techniques, pretty damn nice and not a jagged pixel in sight. :3

No, you won't. Your problem will be not having enough GPU performance to drive the resolution. Current video cards will bog down to unplayable framerates long before they hit 3GB of VRAM usage.

Unless you psychotically texture mod the crap out of already-VRAM-hungry games like Fallout: NV or Skyrim with 4096^2 or 8192^2 textures - I managed to eat 2 GB cards for breakfast when most people were rocking 512 MB, and I'm sure I can do it again with the TITAN I have, lol. If the 780 was out when I got my TITAN, I probably still would have gotten the TITAN instead, for that very reason.
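To put rough numbers on that (my own back-of-envelope sketch, assuming uncompressed 32-bit RGBA with full mip chains as the worst case; DXT/BC compression would cut these figures by roughly 4-8x):

```python
# Rough, worst-case estimate of VRAM per texture at modded resolutions.
# Assumes uncompressed 32-bit RGBA; the mip chain adds roughly one third.
def texture_mb(side, bytes_per_texel=4, mipmaps=True):
    base = side * side * bytes_per_texel
    total = base * 4 / 3 if mipmaps else base
    return total / 1024**2

for side in (2048, 4096, 8192):
    print(f"{side}x{side}: ~{texture_mb(side):.0f} MB per texture")
# 2048^2 ~ 21 MB, 4096^2 ~ 85 MB, 8192^2 ~ 341 MB -- a few dozen of those
# resident at once is enough to blow past a 2-3 GB card.
```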
 
The main limiting factor is going to be the VRAM going forward. They are going to have to come up with a better way of handling it, or adding more VRAM to the cards. Part of that solution may be combining the abilities of the GPU with the CPU and the main bus to try and make use of overall system RAM rather than just VRAM. But currently, there aren't many solutions for those of us with very high resolutions.

Nope. You can get a 4 GB mainstream card like the GTX 760 any time you want, and an 8 GB SKU would probably also be possible. But there's no point without the GPU muscle to push those resolutions.
 
I've never heard of this "4K" nonsense - is that the non-Apple term for the Retina technology where you run a crazy DPI-scaled resolution?

No, it's where you have a 32" or 39" or bigger display that has a resolution of 3840x2160.

See Wiki and here and here and here etc.
 
Nope. You can get a 4 GB mainstream card like the GTX 760 any time you want, and an 8 GB SKU would probably also be possible. But there's no point without the GPU muscle to push those resolutions.

I guess you missed the entire section where I talked about them producing 4 GB cards to address this? Reading comprehension, my man... reading comprehension. The point being that they haven't been focusing on or designing cards for 4K. Why should they? It's not even close to being mainstream yet. Sure, the newest cards have been a little more focused on higher resolutions, but it's hardly gone mainstream.

Anyway, none of this makes a difference; there is no point buying a card to future-proof if you aren't going to then buy the technology you are future-proofing for. Better to wait until you need the card, then buy what is on the market at that point or something at a reduced price.
 
I think everyone is overdoing the 4K thing a bit. We have been playing with Eyefinity/Surround for years now. 5760x1200 is the closest equivalent, and is doable. We even have people running 7680x1440 or 7680x1600, but they do have multiple video cards.
 
I think everyone is overdoing the 4K thing a bit. We have been playing with Eyefinity/Surround for years now. 5760x1200 is the closest equivalent, and is doable. We even have people running 7680x1440 or 7680x1600, but they do have multiple video cards.

Many people prefer to have only one screen with a very high resolution. I personally don't care for gaps in my display.
 
Many people prefer to have only one screen with a very high resolution. I personally don't care for gaps in my display.

That's not the point.

The point is, people have been playing at high resolutions for a while, so 4K isn't going to be that much different. The only people who are really concerned about it are those who haven't used an Eyefinity/Surround setup.
 
That's not the point.

The point is, people have been playing at high resolutions for a while, so 4K isn't going to be that much different. The only people who are really concerned about it are those who haven't used an Eyefinity/Surround setup.

Sure, there are tri-monitor setups with a lot of pixels, but they typically aren't run on a single GPU. The worry is "what single GPU do I need to drive these things".

HTH.
 
Other threads on [H] indicate that 3 GB VRAM is sufficient and that a 780 is fine if you drop the quality settings.

Hexus did a test with a Titan running Far Cry 3 at 4K Ultra with MSAA; results here: http://hexus.net/tech/reviews/displays/57849-asus-pq321q-4k-gaming-tried-tested/?page=7 It couldn't even hit 3 GB of usage with 4x MSAA, and it was already performing slowly. In short, 3 GB won't limit you at 4K unless you run huge MSAA (which you won't have the performance for even with 2-3 780 cards in SLI).
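For context, here's a back-of-envelope sketch of what the render targets alone cost at 4K (my own illustrative numbers, assuming 32-bit color plus 32-bit depth/stencil, both multiplied by the MSAA sample count; actual driver behavior varies):

```python
# Rough render-target memory at 3840x2160 (illustrative assumptions only).
WIDTH, HEIGHT = 3840, 2160
BYTES_PER_PIXEL = 4  # RGBA8 color; same size assumed for depth/stencil

def render_targets_mb(msaa_samples):
    pixels = WIDTH * HEIGHT
    color = pixels * BYTES_PER_PIXEL * msaa_samples
    depth = pixels * BYTES_PER_PIXEL * msaa_samples
    return (color + depth) / 1024**2

for samples in (1, 4, 8):
    print(f"{samples}x MSAA: ~{render_targets_mb(samples):.0f} MB")
# ~63 MB at 1x, ~253 MB at 4x, ~506 MB at 8x -- even heavy MSAA leaves most
# of a 3 GB card for textures and other buffers.
```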
 
There's quite a large gap between the 5760-wide and 7680-wide setups, and the 4K monitor falls right in the middle of it (because it's like adding a fourth 1920x1080 monitor to 5760x1080); the short sketch after this list double-checks the totals:

Three 1680x1050 monitors = 5040x1050 = 5,292,000 pixels
Three 1920x1080 monitors = 5760x1080 = 6,220,800 pixels
Three 1920x1200 monitors = 5760x1200 = 6,912,000 pixels
One 3840x2160 monitor = 8,294,400 pixels
Three 2560x1440 monitors = 7680x1440 = 11,059,200 pixels
Three 2560x1600 monitors = 7680x1600 = 12,288,000 pixels
Three 3840x2160 monitors = 11520x2160 = 24,883,200 pixels
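A quick sketch to verify those totals and put them against a single 1080p screen (my own illustration):

```python
# Pixel-count comparison for the setups listed above.
setups = {
    "3 x 1680x1050": (3 * 1680, 1050),
    "3 x 1920x1080": (3 * 1920, 1080),
    "3 x 1920x1200": (3 * 1920, 1200),
    "1 x 3840x2160 (4K)": (3840, 2160),
    "3 x 2560x1440": (3 * 2560, 1440),
    "3 x 2560x1600": (3 * 2560, 1600),
    "3 x 3840x2160": (3 * 3840, 2160),
}
baseline = 1920 * 1080  # one 1080p monitor
for name, (w, h) in setups.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / baseline:.1f}x a single 1080p)")
# 4K comes out to exactly 4.0x 1080p, landing between the 3x1200 and 3x1440 rigs.
```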
 
Sure, there are tri-monitor setups with a lot of pixels, but they typically aren't run on a single GPU. The worry is "what single GPU do I need to drive these things".

HTH.

For current tech, you may have just answered that yourself. I think a single 7970 GHz Edition or a 780 could manage, but not at the highest settings. As I mentioned before, anyone who has used an Eyefinity/Surround setup would already know that. 3840x2160 is nothing 'new'; it's just a different high resolution. If you want to know what 4K is going to be like, look at reviews for 5760x1200 and for 7680x1440, then average those two scores. It's going to be a while before anyone can afford a 4K 60 Hz screen anyway, so there are going to be newer and faster cards by then.
 
That's not the point.

The point is, people have been playing at high resolutions for a while, so 4K isn't going to be that much different. The only people who are really concerned about it are those who haven't used an Eyefinity/Surround setup.

Not everyone likes that wide a screen setup, though, or cares for the bezels of the displays. Many people still prefer single-screen setups and plan to stay that way for good.
I never said that people weren't already utilizing this high a resolution. This thread is about one display only, and we all know about the Surround/Eyefinity setups.
 
Not everyone likes that wide a screen setup, though, or cares for the bezels of the displays. Many people still prefer single-screen setups and plan to stay that way for good.
I never said that people weren't already utilizing this high a resolution. This thread is about one display only, and we all know about the Surround/Eyefinity setups.

The thread is about VRAM usage at massive resolutions, so I would say measuring VRAM usage while using Eyefinity for high resolutions is a fair comparison.



To the OP, the best advice is to upgrade when you have something that will benefit from an upgrade. Buying the best card now just so you can maybe play a game with a monitor you'll get years from now is a bad decision all around.

If you need an upgrade now, get an upgrade for what you need now. If you get a $200 card now instead of a $1000 card now, I'm sure any $800 card in 2 years will be better than any $1000 card now.
 
No, it's where you have a 32" or 39" or bigger display that has a resolution of 3840x2160.

See Wiki and here and here and here etc.

Display size doesn't matter, similar to 720p and 1080p. All that matters is the resolution. You can have 4K on a 20" monitor provided it supports the correct resolution, similar to how you could also have 8K on a 10" tablet if it supported the proper res.

EDIT:

The first Wiki article you linked to even lists a 9" professional monitor with 4K res:

http://www.bhphotovideo.com/c/product/438018-REG/Astro_Systems_DM_3011_Astro_Systems_DM3010_8_4.html
 
I've never heard of this "4K" nonsense - is that the non-Apple term for the Retina technology where you run a crazy DPI-scaled resolution?

They're called 4K because the horizontal resolution is almost 4000 pixels. There's no nonsense about it.

It's actually 3840x2160 pixels, because going with a resolution that is exactly double that of 1920x1080 makes so many things so much simpler for backwards-compatibility reasons; if an OS or app or game or media container really doesn't like resolutions greater than 1920x1080, the OS or the user can just pixel-double and operate it as a 1920x1080 display with really superior sub-pixel geometry and everything will look fine.

If the term "4K" were really accurate, the horizontal resolution would be 4096 pixels; it could have been called 2160p instead (and maybe will be for TVs) but 4K is the name that stuck because it's easy to say and remember, accuracy be damned.
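A trivial check of that integer-scaling point (my own illustration):

```python
# Why 3840x2160 pixel-doubles cleanly from 1920x1080, while a true 4096-wide
# panel would not (illustrative only).
full_hd = (1920, 1080)
uhd = (3840, 2160)
dci_4k = (4096, 2160)

print(uhd[0] / full_hd[0], uhd[1] / full_hd[1])  # 2.0 2.0 -> every 1080p pixel maps to an exact 2x2 block
print(dci_4k[0] / full_hd[0])                    # ~2.13  -> no clean integer scale factor
```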
 
I have GTX 680s in SLI @ 2 GB running 7680x1440 and I don't have any issues with the VRAM. All I do is keep AA off, but at this resolution it is not needed anyway. I have a feeling it would also be easier to power a 4K monitor, since there is only one monitor instead of three, no Eyefinity or Surround calculations or screen stretching need to be made (the aspect ratio is somewhere reasonable instead of 16:27 like mine), and the overall pixel count is a bit less than 3x1440p. I would not need to update my video card setup for a 4K screen unless I wanted a continuous 60 fps with everything on.
 