Anyone concerned that 2GB of VRAM will become obsolete once the PS4 and Xbox One launch?

FACT, as I have said and keep saying in other posts like this (and it was 100% true):

The new Xbox and PS4 use tons of RAM for OS functions and that social media sharing crap anyway, and games cannot use it.

Or, in the Xbox One's case: social media crap, TV crap, and virtual OS crap (blocking devs from getting to the hardware directly at a super low level) so it can switch between all that crap, and it is always on.

The PS4 does similar things with its OS and the gameplay recording, which is always on, taking up RAM and CPU cycles. Game devs are told they do not get all 8GB, so roughly 4GB for video functions would be the set target on those consoles; the rest goes to game engine functions and special needs that would live in system RAM on a PC.


But they would do more on the PC versions later, adding even higher-res textures and whatever else, so the Xbox and PS4 versions will look like ass by comparison. So why try to match the console specs?

Oh, and Unreal Engine 3 in DX11 mode can take up 500MB just for lighting functions with 4x MSAA on.
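For a sense of scale on that figure: multisampled render targets get big fast. A back-of-envelope sketch; the target count and format below are illustrative assumptions, not UE3's actual buffer layout:

```python
# Back-of-envelope MSAA render-target memory at 1080p.
# Buffer count and format are hypothetical, not UE3's real layout.
width, height = 1920, 1080
samples = 4            # 4x MSAA
bytes_per_sample = 8   # e.g. a 16-bit-per-channel RGBA target

# A DX11 deferred-lighting path keeps several full-screen targets alive:
# scene color, normals, depth/stencil, light accumulation, and so on.
num_targets = 5

per_target_mb = width * height * samples * bytes_per_sample / 1024**2
total_mb = per_target_mb * num_targets
print(f"~{per_target_mb:.0f} MB per target, ~{total_mb:.0f} MB total")
# ~63 MB per target, ~316 MB in targets alone; shadow maps and other
# intermediate buffers push that toward the 500 MB ballpark.
```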
 
Honestly, I do think that 2GB of VRAM will go a long way from here. As some people have said, a lot of the extra shared RAM on the consoles is being used for everything but the game, such as the social media features and the new DVR system that lets you record your gameplay. I think that's where the majority of the shared RAM is going: background stuff, not the actual game itself. With that said, especially at 1080p, I don't think there will be many problems with 2GB of VRAM in the future.
 
It's pretty funny how often VRAM is discussed and treated as a concern.

If you are at 1080p, there is no reason to be concerned at all.

I am at 1440p and I'm not concerned, because I don't play modded games.
 
I just started playing Metro LL last night. Settings at their highest, PhysX on, tessellation at its highest, 2x SSAA (which is a lot heavier than MSAA), and I was using under 1GB of VRAM at 1200p.

Obviously things will change over the next couple of years, but considering I'm using just half of 2GB on one of the most demanding PC games currently available, it doesn't seem like an immediate or near-future concern.

Ultimately, the goal is to have my current system hold up until 4K monitors become more mainstream/affordable and run at at least 60Hz, and then upgrade to whatever's out there.
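As a side note on why SSAA is so much heavier: SSAA runs the pixel shader for every sample, while MSAA shades each pixel once and only multiplies the depth/coverage samples. A rough comparison at 1200p:

```python
# SSAA shades every sample; MSAA shades once per pixel and only
# multiplies the depth/coverage samples.
width, height = 1920, 1200
pixels = width * height        # 2,304,000 pixels at 1200p

ssaa_2x_shaded = pixels * 2    # 2x SSAA: double the pixel-shader work
msaa_4x_shaded = pixels        # 4x MSAA: same shading cost; the 4x only
                               # applies to depth/coverage storage

print(f"2x SSAA shades {ssaa_2x_shaded:,} samples, "
      f"4x MSAA shades {msaa_4x_shaded:,}")
```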
 
I just started playing Metro LL last night. Settings at their highest, PhysX on, tessellation at its highest, 2x SSAA (which is a lot heavier than MSAA), and I was using under 1GB of VRAM at 1200p.

Obviously things will change over the next couple of years, but considering I'm using just half of 2GB on one of the most demanding PC games currently available, it doesn't seem like an immediate or near-future concern.

Ultimately, the goal is to have my current system hold up until 4K monitors become more mainstream/affordable and run at at least 60Hz, and then upgrade to whatever's out there.


You forgot to add, and it looks freakin great! :)
 
Skyrim can already use over 2GB of VRAM at 1080p. I went from a 2GB 670 to a 3GB 7970 to a 4GB 670 because I was getting stuttering in areas with a lot of textures.
 
Skyrim is really more of an example of an engine not making 'efficient' use of available resources and not an indication that 2GB is particularly constraining. The approach the platform uses with respect to resource management could be described as 'lazy'. Lazy isn't necessarily bad, but it means it's not something you can point to as an example of why 2GB is constraining: it holds on to a lot of data because it's designed to, not because it has to.
 
Interesting. Most of last month was spent sending defective GTX 460 2Wins back to EVGA (they finally got me a working 660 Ti), but I was still able to game at reasonably high settings at 1920x1200 with my trusty *gasp* GTX 280 (2008).

I still do my primary gaming on a GTX 280 >.>
 
Some of you guys are just trying to reassure yourselves about your purchase. 2GB will be obsolete as soon as the new consoles hit the shelves. VRAM isn't all about screen resolution. It can be used for caching, physics, higher anti-aliasing, anisotropic filtering, and other storage for the game. Games right now are made for low VRAM. When the new games come out later this year and into next year, they will require a lot more memory. This isn't going to be a walk in the park; we're not talking about a Call of Duty 4-era game engine here. The newer games will take advantage of that extra memory, for example by loading areas behind doors before you enter them: loading ahead much further than usual, and loading more textures, not just bigger ones.
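The "loading areas behind doors" idea is plain prefetching: spare memory lets the engine stream in assets for places the player might go next. A minimal sketch of the idea; the area map and loader below are hypothetical stand-ins, not any engine's real interface:

```python
# Minimal prefetch sketch: with spare memory, load neighbouring areas
# before the player opens the door. All names here are illustrative.
adjacency = {
    "lobby": ["corridor", "stairwell"],
    "corridor": ["lobby", "server_room"],
}
loaded: set[str] = set()

def load_assets(area: str) -> None:
    print(f"streaming assets for {area}")  # stand-in for real disk I/O

def enter_area(area: str, budget_areas: int) -> None:
    """Load the current area, then prefetch neighbours while budget allows."""
    for candidate in [area] + adjacency.get(area, []):
        if candidate not in loaded and len(loaded) < budget_areas:
            load_assets(candidate)
            loaded.add(candidate)

enter_area("lobby", budget_areas=3)  # a bigger memory budget means more
                                     # areas resident and fewer visible loads
```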

The OS won't take up that much memory; at most 512-800MB. Windows XP is around 625MB on CD, and stripped down it's around 300MB. I don't see an OS taking over 1GB of memory. 2GB of VRAM is nothing right now. If you pay for 4-6GB of VRAM now, the card will last a long time. Most games are going to be based around the consoles, so most games will play on your card for years, just like games do now on the older-gen consoles.

People don't get it: when the 360 and PS3 came out, the games really pushed the limits of those systems. Console games often struggled to run on the PC, even with the highest-end hardware, until the 8800 GTX came out. I expect we will see a big leap in memory usage this next generation.

It's there, they will use it. You're not going to see 4GB of free VRAM. There's no denying it will be used, just like on previous consoles, which fully maxed out their memory. And it's not going to take two years to see peak usage. These new consoles use standard x86-64 processors, so there is no big learning curve; this is pretty much like programming on a PC. It isn't a Cell processor anymore. They will max it out faster than any console before. I also think AMD video cards will have an edge, because most games will be developed on their architecture. Buy what you want, but I wouldn't waste money on 2GB this far into the game. 3GB minimum, and even that's pushing it.
 
The part of me that bought two 2GB 680s hopes you're wrong.
The part of me that loves games pushing hardware to new boundaries (and not through poor optimization) hopes you're right.

The objective side of me knows it will happen, but doesn't think it will happen as quickly as some suggest.

Either way, it's exciting and we'll have to wait and see what happens. PC gamers have been complaining for years about consoles holding us back, so surely we can't complain if we're being limited by our own hardware, again, provided it's justified and not just "hey, we have all this RAM to play with, let's throw optimization to the wind" development.
 
It's there, they will use it. You're not going to see 4GB of free VRAM.
In the real world, you still must contend with load times. We're still going to see the same resource streaming strategies employed in next-generation titles that we do today, because loading (and, if necessary, conditioning) resources is still an impressively slow process. They just aren't going to be aggressively purging resources.

It's a matter of policy, not of architecture. This only marginally affects PC users with 2GB of video memory or less.
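To make the "policy, not architecture" distinction concrete, here is a toy resource cache where the memory budget is fixed and only the purge policy differs. Everything below is illustrative, not any real engine's code:

```python
# Toy resource cache: same architecture, different purge policy.
from collections import OrderedDict

class ResourceCache:
    def __init__(self, capacity_mb: int, eager_purge: bool):
        self.capacity_mb = capacity_mb
        self.eager_purge = eager_purge
        self.resident: OrderedDict[str, int] = OrderedDict()  # name -> MB

    def used_mb(self) -> int:
        return sum(self.resident.values())

    def request(self, name: str, size_mb: int) -> None:
        if name in self.resident:
            self.resident.move_to_end(name)  # mark as recently used
            return
        # Evict least-recently-used entries only when we actually must.
        while self.resident and self.used_mb() + size_mb > self.capacity_mb:
            self.resident.popitem(last=False)
        self.resident[name] = size_mb

    def leave_area(self) -> None:
        # An eager policy frees everything on an area change; a lazy one
        # keeps data resident in case it is needed again. With a huge
        # memory pool, lazy just means high occupancy, not high need.
        if self.eager_purge:
            self.resident.clear()
```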
 
The part of me that bought two 2GB 680s hopes you're wrong.
The part of me that loves games pushing hardware to new boundaries (and not through poor optimization) hopes you're right.

The objective side of me knows it will happen, but doesn't think it will happen as quickly as some suggest.

Either way, it's exciting and we'll have to wait and see what happens. PC gamers have been complaining for years about consoles holding us back, so surely we can't complain if we're being limited by our own hardware, again, provided it's justified and not just "hey, we have all this RAM to play with, let's throw optimization to the wind" development.

Great post & good point.
 
Trust me, next gen won't use anything but FXAA or the like. I mean, AF will be new to them, as 95% of current titles use trilinear filtering.
 
Skyrim is really more of an example of an engine not making 'efficient' use of available resources and not an indication that 2GB is particularly constraining. The approach the platform uses with respect to resource management could be described as 'lazy'. Lazy isn't necessarily bad, but it means it's not something you can point to as an example of why 2GB is constraining: it holds on to a lot of data because it's designed to, not because it has to.

It's not just that. 4K textures + AA in any modern game is going to use over 2GB of VRAM.
 
Even if Skyrim is an example of not making efficient use of resources, it doesn't matter. Do you think the lazy devs of console ports are going to make efficient use of resources when it's easier to just use a ton of VRAM?

In fact, it's going to go both ways: some devs will be lazy and underuse the console RAM so their games work on laptops and older GPUs, and others will screw it up the other way.

The whole point of buying a beast of a video card is that you can't control devs; you can only build something powerful enough to combat their laziness.
 
Nothing new or demanding is coming out for PC in the next year anyway. Isn't that how often a lot of us upgrade?
 
Nothing new or demanding is coming out for PC in the next year anyway. Isn't that how often a lot of us upgrade?

There are a lot of people with a lot of different purchasing motives. I personally like to be able to pass my older video cards down to family members. Some GPUs I have are almost six years old and still serving people well. So if VRAM is going to be an issue in two years or so, I care about it; for others, it affects resale value.
 
Even if Skyrim is an example of not making efficient use of resources, it doesn't matter. Do you think the lazy devs of console ports are going to make efficient use of resources when it's easier to just use a ton of VRAM?

In fact, it's going to go both ways: some devs will be lazy and underuse the console RAM so their games work on laptops and older GPUs, and others will screw it up the other way.

The whole point of buying a beast of a video card is that you can't control devs; you can only build something powerful enough to combat their laziness.
I'm saying the policy is lazy, not the developers. These are two different things.
 
Besides cranking up the AA, many titles run fine under 1.5GB. Most GPUs run out of processing power before you can max out the AA.
Your card may use up to 3GB maxing out AA and textures, but your FPS will be below 60 on a single card. I'll go for a more powerful GPU before more VRAM.

I'd rather have a powerful GPU with 2GB that can run 2x MSAA at 60fps than a slower card with 3GB running 8x MSAA at 35fps.
Also look at the memory bus. A 192-bit bus with 3GB is a waste, but a 384-bit bus with 3GB is perfect. Doubling VRAM without a larger bus to match will not do anything for performance.
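To put numbers on the bus-width point: bandwidth scales with bus width, and extra capacity does nothing for it. A quick sketch; the effective memory clock is an illustrative Kepler-era GDDR5 value, not any specific card's spec:

```python
# Bandwidth (GB/s) = bus width in bytes * effective memory clock (GT/s).
def bandwidth_gb_s(bus_bits: int, effective_clock_gt_s: float) -> float:
    return bus_bits / 8 * effective_clock_gt_s

print(bandwidth_gb_s(192, 6.0))  # 144.0 GB/s: 3GB on this bus starves the GPU
print(bandwidth_gb_s(384, 6.0))  # 288.0 GB/s: enough throughput to feed 3GB
```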
 
Doubling VRAM without a larger bus to match will not do anything for performance.

That's true, but only on PC.

You can't compare a console's closed programming environment and fixed hardware to a PC with a million different configurations.

And did you ever ask yourself whether the mixture of DDR3 and GDDR5 on the PC could be causing this problem?
 
I would like to congratulate the [H]ard community for the impressive ability to fill six pages of mumbo-jumbo without numbers or real-life tests.

On 2GB vs 4GB: the first resolution where 4GB starts benefiting gamers on NVIDIA cards is 3x1600p Surround. That's about six times the pixels the next-gen consoles will drive at 1080p, using Crossfired integrated graphics that share memory with the OS. And yet people state, as pure and simple truth, that 2GB video cards will become obsolete the moment the consoles hit the shelf?

The next consoles will have games adequate for a 7850 at 1080p (being generous and hyperbolic about AMD's success in Crossfiring the Jaguar APU's graphics for perfect scaling). And no, giving 8GB to a 7850 won't make it play games better than a 2GB GeForce 680.
PERIOD

All of this reminds me a lot of the old days of "1GB DDR2 cards are better for future games than 256MB DDR3 cards".
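For the record, the pixel arithmetic behind that comparison:

```python
# Pixel counts: triple 2560x1600 Surround vs the consoles' 1080p target.
surround_1600p = 3 * 2560 * 1600      # 12,288,000 pixels
console_1080p = 1920 * 1080           #  2,073,600 pixels
print(f"{surround_1600p / console_1080p:.1f}x")  # ~5.9x the pixels to fill
```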
 
And did you ever ask yourself whether the mixture of DDR3 and GDDR5 on the PC could be causing this problem?
The mixture of GDDR and DDR is exactly what you want. The consoles make concessions in this regard.
 
AMD is saying that games will struggle with next-gen console ports on Intel CPUs.

Does that mean Nvidia GPUs could also struggle?

No. The article is talking about GPUs in the context of integrated graphics. I'd suspect they'd both struggle on that front.
 
AMD is saying that games will struggle with next-gen console ports on Intel CPUs.

Does that mean Nvidia GPUs could also struggle?

Here is the article: http://www.pcgamer.com/2013/06/07/a...e-on-intel-tech/?utm_source=fb&utm_medium=emp

He is referring to the graphics component, comparing AMD's APUs to Intel's, not the CPU component. His reasoning is that developers are not likely to make use of the specific extensions Intel is touting, so AMD APUs will have the advantage due to raw performance.

As to your other question, there is no definite answer. AMD's GPU architecture has advantages over Nvidia's in certain workloads, and vice versa (synthetic tests expose these differences more clearly). With that in mind, games might be designed around catering to that type of architecture.

It also depends on what you mean by struggle. The highest-class GPUs on both sides already greatly outperform what is going into the PS4 (the stronger of the two). Say the 7970 takes more performance leads over the GTX 770, but both still run the games; would you consider the GTX 770 to be struggling?

This is one aspect of the excitement over the new consoles that I'm somewhat confused about. Other than arguably memory capacity (not counting the software-based features and inputs), the new consoles seem behind, relative to their time (a lot in some areas), compared to the last generation.
 
I seriously doubt VRAM limitations at 1080p will be an issue in the immediate future of PC gaming. Even if there were an enormous spike in software using more than 2GB after the next-gen consoles release, optimization takes months to streamline, so by the time VRAM becomes relevant you would need an upgrade anyway. I say stick with what you have and see what happens before splurging on an upgrade.
 
For those saying that 2GB of VRAM is more than enough at 1080p or higher... you obviously haven't modded Skyrim.

[attached screenshot: 90io.jpg]


Also, the next-gen consoles will have 8GB of RAM. The PS4 will allow games to work with 5.5GB of RAM. Furthermore, the operating system will have a dedicated 3.5GB of RAM. Yes, that would total 9GB. The breakdown is that the OS has 3.5GB of dedicated RAM and games have 4.5GB dedicated, but games will be able to take 1GB from the OS when necessary, raising their share to that 5.5GB figure.

Now keep in mind PC versions of games are always more demanding. 2GB of VRAM won't make it past November, sorry.
 
Hate to break it to you, but 3.5GB plus 4.5GB is only 8GB. If you're taking 1GB from the OS's 3.5GB to give games 5.5GB, then it's 2.5GB plus 5.5GB; you can't have the same gigabyte in two places at once.
 
Try reading it again, this time more carefully. Especially the part where I say "the next-gen consoles will have 8GB of RAM".
 
OK... "The PS4 will allow games to work with 5.5GB of RAM." That's 5.5GB.
"Furthermore, the operating system will have a dedicated 3.5GB of RAM. Yes, that would total 9GB." 5.5GB + 3.5GB = 9GB, BUT
"The breakdown is that the OS has 3.5GB of dedicated RAM" — that's 3.5GB.
"and games have 4.5GB dedicated" — 3.5GB + 4.5GB = 8GB.
"but games will be able to take 1GB from the OS when necessary, raising their share to that 5.5GB figure."

So in other words, it would only total 9GB if that 1GB weren't taken out of the 3.5GB dedicated to the OS. So it's 8GB, which is what I said at the start: "the next-gen consoles will have 8GB of RAM".
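If it helps, here is the same breakdown as a small ledger, using only the figures quoted in this thread:

```python
# PS4 memory ledger, per the figures quoted above.
total = 8.0          # GB of unified memory
os_reserved = 3.5    # GB held by the OS
game_fixed = total - os_reserved      # 4.5 GB guaranteed to games
flexible = 1.0       # GB a game may borrow back from the OS pool
game_peak = game_fixed + flexible     # 5.5 GB peak for a game
os_floor = os_reserved - flexible     # 2.5 GB the OS keeps at that moment
assert game_peak + os_floor == total  # 5.5 + 2.5 = 8, never 9
```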
 
That does not make sense. If it has 3.5GB of RAM "dedicated" to the OS, then the video card can't borrow from that dedicated RAM, can it?
 
OK... "The PS4 will allow games to work with 5.5GB of RAM." That's 5.5GB.
"Furthermore, the operating system will have a dedicated 3.5GB of RAM. Yes, that would total 9GB." 5.5GB + 3.5GB = 9GB, BUT
"The breakdown is that the OS has 3.5GB of dedicated RAM" — that's 3.5GB.
"and games have 4.5GB dedicated" — 3.5GB + 4.5GB = 8GB.
"but games will be able to take 1GB from the OS when necessary, raising their share to that 5.5GB figure."

So in other words, it would only total 9GB if that 1GB weren't taken out of the 3.5GB dedicated to the OS. So it's 8GB, which is what I said at the start: "the next-gen consoles will have 8GB of RAM".

You are an idiot.


You can NOT get 3.5GB and 5.5GB out of 8GB. Period.

I don't care what you "clarified" at the beginning of your nonsense post.

The fact is:

The console has 8GB of shared RAM.
 