Is 8GB going to be the new standard graphics memory next year?

tybert7

2[H]4U
Joined
Aug 23, 2007
Messages
2,763
I ask because there were reports that the 390X is going to come equipped with a different type of memory configuration that allows for far more bandwidth, and games are already suggesting more than 4GB of video memory for certain texture levels at 1080p!


If that's the case, I think the base level should be 8GB, with variants going as high as 16GB for single cards. It sounds insane, but so are the video memory requirements of that latest LotR game.
 
Is 8GB going to be the new standard graphics memory next year?

Yes, for two reasons:


  1. 2160p (I refuse to call it 4K; why don't we start calling 1920 x 1080 2K?)
  2. Lazy console ports.
 
We'll probably see 6GB cards before that.

I think the $300 cards should come with 6GB, the ~$200 cards with 4GB (that's roughly where the 290 sits now), and the $400+ cards with 8GB or more.


For Nvidia, subtract 1-2GB at each price tier, then add it back by paying $200-300 more for the Ti version.
 
I ask because there were reports that the 390X is going to come equipped with a different type of memory configuration that allows for far more bandwidth, and games are already suggesting more than 4GB of video memory for certain texture levels at 1080p!


If that's the case, I think the base level should be 8GB, with variants going as high as 16GB for single cards. It sounds insane, but so are the video memory requirements of that latest LotR game.

I'm sort of with you on this... my next card will have 8GB of VRAM minimum, for two reasons: it will be needed soon, if it isn't already, depending on the game; and my current card will hold me over until I decide on my next card (I actually got very tempted by that 8GB Vapor-X, but I never pay full retail).
 
The move from 2GB to 4GB already seemed to happen overnight, and unlike the days of old it had little effect on most people's performance unless you're a [H]ardcore gamer. However, with the consoles supporting 8GB of RAM, it's only logical that GPUs from Nvidia and AMD make it standard across the board within the next two years.

Ironically, RAM is relatively cheap, but it's an easy way to recoup a lot of R&D by charging people $100 for double. I expect 8GB will become the standard soon enough, and then we'll be back to the same game of upselling people to 16GB+. With stacked memory it's inevitable that RAM capacity is going to skyrocket. HBM 2.0 supports 4GB stacks as the minimum (four stacks will most likely be the most common), so yeah. We'll probably reach the VRAM needs for 8K+ long before the GPU power exists.
 
Almost every game released thus far that has required 4+GB has used it for caching and not for rendering... so in theory any game coming out in 2015 will use it the same way, meaning even if you have 8GB or 16GB of VRAM the game will still come close to maxing it out... so no, 8GB of VRAM is not really necessary except for 4K and multi-monitor gaming.
 
I don't know, at least not at 1080p. I do remember that it was just a couple of months ago that we were saying 2GB was plenty for 1080p, and then Wolfenstein: The New Order came out and used 1.9GB of VRAM on my 670! 4GB is certainly a must for 1080p going forward, I think, and while I don't think 8GB will be the norm anytime soon for 1080p, I wouldn't be surprised if I were wrong. If I were gaming at 2560x1440, though, I would be a little more certain 8GB is what you'd want in the future.
 
I don't know, at least not at 1080p. I do remember that it was just a couple of months ago that we were saying 2GB was plenty for 1080p, and then Wolfenstein: The New Order came out and used 1.9GB of VRAM on my 670! 4GB is certainly a must for 1080p going forward, I think, and while I don't think 8GB will be the norm anytime soon for 1080p, I wouldn't be surprised if I were wrong. If I were gaming at 2560x1440, though, I would be a little more certain 8GB is what you'd want in the future.


That's a very big statement that can easily lead to this thread getting hijacked. I would say around 2GB @ 1080p is the point where VRAM stopped being the #1 concern. The tests back then showed it, and only in very rare instances was 4GB used effectively in the rendering process. People had to load up on Skyrim mods, and engines like idTech 5 are notoriously bad with texture memory issues. The waters get really blurry once we start saying more than 2GB is necessary at 1080p.

Either way, if you take 2GB and 4GB as the min/max baselines for 1080p, then 4K will only need 4x that to reach roughly the same place, VRAM-wise, that we're at with 1080p. I guarantee that will come much faster because of stacked memory. HBM 2.0 will have 4GB per stack alone; you'd have to go outside the specification to force a stack to use less than that. AMD's R9 390 is rumored to use HBM 1.0, which is limited to 1GB per stack, and to use four stacks. Do the math and you may find that in the near future the premium for extra VRAM gets replaced by the need for higher bandwidth, where the VRAM just comes along with it.
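For anyone who wants to sanity-check that, here's a rough back-of-envelope sketch in Python (the stack counts and per-stack capacities are just the rumored figures from this thread, not confirmed specs):

Code:
# Resolution scaling: if VRAM use scales roughly with pixel count (a
# simplifying assumption), 4K wants about 4x the 1080p baseline.
pixels_1080p = 1920 * 1080
pixels_4k = 3840 * 2160
scale = pixels_4k / pixels_1080p          # = 4.0
print([gb * scale for gb in (2, 4)])      # 2GB/4GB baselines -> [8.0, 16.0] GB at 4K

# HBM totals, per the rumored four-stack layout.
stacks = 4
print(stacks * 1)                         # HBM 1.0 at 1GB per stack -> 4 GB
print(stacks * 4)                         # HBM 2.0 at 4GB per stack -> 16 GB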
 
I can attest to the need for 6GB personally, in one specific title at least (Shadow of Mordor).

I was gaming on the system in my sig @ 3440 x 1440, first with an EVGA Titan Signature Overclocked (6GB, of course) and now with a bog-standard EVGA 980, which I naturally overclocked by about +150MHz (left the memory stock).

Anywho, Shadow of Mordor gameplay was noticeably smoother with the Titan than it was with the GTX 980 (again, at 3440 x 1440) using the stock textures on High (maybe some on Ultra, can't quite remember). I never did bother to download the big, high-end texture pack that required 6GB to run, as I had already decided to sell the Titan and keep the 980. The in-game benchmark supported the theory that the Titan provided smoother gameplay as well. The average frame rates of the two cards were about the same (greater than 60 fps), and the max frame rate was always a few FPS higher on the overclocked 980 than on the Titan. HOWEVER, the MIN frame rate was ALWAYS noticeably higher (10 fps or so) with the Titan than with the 980, and I believe those minimum framerate drops are what you really feel sometimes as the gamer.

Side rant: screw 1080p, do people on [H] actually game at that anymore? I've been at 2560 x 1440 for what seems like years, before upping to 3440 x 1440 widescreen. /rant over

But yeah, I too want 6GB/8GB cards. I'd like to try 4K in 2015.
 
We'll probably see 6GB cards before that.

Depends on the memory bus configuration. 256-bit, 512-bit would be 2GB/4GB/8GB etc... 384-bit would be 3GB/6GB/9GB

If HBM uses 4096-bit bus on 390X, then the configuration would be the 2/4/8 config.

If there is to be a Titan Z 2, it would make sense for there to be an 8GB version as well as a 4GB version.
 
Depends on the memory bus configuration. 256-bit, 512-bit would be 2GB/4GB/8GB etc... 384-bit would be 3GB/6GB/9GB

If HBM uses 4096-bit bus on 390X, then the configuration would be the 2/4/8 config.
It would have to be 3GB/6GB/12GB on a 384-bit bus, given the way they do memory on the high-end cards.
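For anyone wondering where those tiers come from, here's a minimal sketch of the math, assuming one 32-bit GDDR5 chip per 32 bits of bus, with each capacity doubling coming from denser chips (clamshell mode, which doubles the chip count instead, is ignored):

Code:
# Bus width fixes the chip count; chip density then sets the capacity tiers.
def capacity_tiers(bus_width_bits, chip_densities_gbit=(2, 4, 8)):
    chips = bus_width_bits // 32              # one 32-bit chip per 32 bits of bus
    return [chips * d / 8 for d in chip_densities_gbit]   # Gb per chip -> GB total

print(capacity_tiers(256))   # [2.0, 4.0, 8.0]  GB
print(capacity_tiers(384))   # [3.0, 6.0, 12.0] GB -- no natural 9GB config
print(capacity_tiers(512))   # [4.0, 8.0, 16.0] GB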
 
Ah, yes.

Since the 980, 970, and most likely the 960 and a Titan Z 2 are 256-bit, those could only be the 2/4/8/16 config.

The 390X is the wild card; if HBM is true, it's probably still the 2/4/8/16 config.

I think 384-bit cards are not in our future, so 3/6/12 cards are probably not something we have to worry about.
 
Side rant: screw 1080p, do people on [H] actually game at that anymore? I've been at 2560 x 1440 for what seems like years, before upping to 3440 x 1440 widescreen. /rant over

This will quite obviously come as a shock to you, but not everyone has money falling out of their a****.
 
Side rant: screw 1080p, do people on [H] actually game at that anymore? I've been at 2560 x 1440 for what seems like years, before upping to 3440 x 1440 widescreen. /rant over

Yeah, I still game at 1080p. Sorry if that pisses you off. I've got a great 25" monitor with excellent picture quality, and I can't see the logic in spending $600+ on a screen only 2" larger that would need a $600 video card upgrade just to get the framerates I'm getting now. $1200+ for a minor upgrade isn't worth it by a long shot to me. Also, I sit less than arm's length from my monitor, so anything much over this 25" is borderline too big.
 
it is very close to 6-8gb being the norm imo

No, it is not. The only reason people are freaking out about 4+ GB is that the latest generation of shitty console ports eats up tons of VRAM. Before games like Shadow of Mordor and Ryse, no one thought you needed 4+ GB of VRAM at 1080p.

Also, +1 to 1080p gaming. With my current setup I see no reason to go for a higher res unless pixel density gets a lot better. I'd rather go for a 23-24" at 2560x1440 than a 27"+, which is what they all basically are now.
 
8GB should have been the standard this year. I wish my 5850s had 4GB of RAM; they would still provide great performance in new games.
 
Even a dinky 750 ti blows away a 5850.

The 750 Ti 2GB GDDR5 is a great little card... it doesn't even need an external power connector, and they boost and OC like beasts. It's crazy how much performance (at a $100-110 or so price tag) Nvidia brought to the table for the majority of machines with the GM107 launch a while back... almost any desktop can just drop one in and deliver a better gaming setup than a $350ish Xbox One or a $400ish PS4. Add in that they run cool and quiet and it's unbelievable.

They then scaled that upward with the GTX 970 and 980 (GM204) and turned the market on its head.

That said, I would be eyeing no less than 4GB for heavy-duty gaming on any rig above 1920x1080. It's pretty clear that games are going to become more demanding very quickly, so while 2GB will probably cut it at reduced settings for 1080p and below, if you run 1440p, a surround setup, 4K, or another high-end configuration, you'll want more VRAM around to leverage the added horsepower your GPUs will have available.

No, it is not. The only reason people are freaking out about 4+ GB is that the latest generation of shitty console ports eats up tons of VRAM. Before games like Shadow of Mordor and Ryse, no one thought you needed 4+ GB of VRAM at 1080p.

Also, +1 to 1080p gaming. With my current setup I see no reason to go for a higher res unless pixel density gets a lot better. I'd rather go for a 23-24" at 2560x1440 than a 27"+, which is what they all basically are now.

Dell P2715Q says hello... 4K resolution in a 27" display ;). Even 27" 1440p is an unbelievably huge step up from 24" 1080p, anyways... it's not all about the PPI alone.
 
Yeah well, from personal experience I have gotten better quality gaming out of ATI cards over the years, so that's what I prefer, even if it's a little slower than the "competition". But you're off topic, because we're talking about the benefits of extra VRAM.
 
Yeah well, from personal experience I have gotten better quality gaming out of ATI cards over the years, so that's what I prefer, even if it's a little slower than the "competition".


I switch back and forth depending on what's better, though I lean towards nvidia some because I have had better experiences with them over the last 15 years.
 
16:10 is for gaming ;) ...I'm satisfied with my 1920 x 1200 screen...is there a big difference between 1080p and 1200p?...or 1200p and 1440p?
 
16:10 is for gaming ;) ...I'm satisfied with my 1920 x 1200 screen...is there a big difference between 1080p and 1200p?...or 1200p and 1440p?

Wrong. 16:10 is at best decent for desktop work, but even there 4K is better. For gaming, 16:9 provides a better FOV, and 2560x1440 has about 78% more pixels than a 1920x1080 setup ;), so yes, there is an ENORMOUS difference.
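The quick math behind that percentage:

Code:
pixels_1440p = 2560 * 1440            # 3,686,400
pixels_1080p = 1920 * 1080            # 2,073,600
print(pixels_1440p / pixels_1080p)    # ~1.78, i.e. roughly 78% more pixels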
 
I simply cannot stand 60Hz after having used 120Hz for almost 3 years now. The only 120Hz 1440p panel is the ROG Swift PG278Q, and it is retailing for $800. :eek: Yeah, I'm not throwing that kind of money down for a single panel, period. So 1080p it is for now.

No, it is not. The only reason people are freaking out about 4+ GB is that the latest generation of shitty console ports eats up tons of VRAM. Before games like Shadow of Mordor and Ryse, no one thought you needed 4+ GB of VRAM at 1080p.

Also, +1 to 1080p gaming. With my current setup I see no reason to go for a higher res unless pixel density gets a lot better. I'd rather go for a 23-24" at 2560x1440 than a 27"+, which is what they all basically are now.

While I agree on the shitty optimization part, I'm not optimistic enough to think that things are going to improve in the future. Devs these days seem to think that PCs also come with unified memory, so I'd rather dish out more for extra VRAM than be stuck with a card that has the GPU power but is held back by insufficient memory (think 780 Ti and Watch Dogs, although Watch Dogs was an appallingly bad case of shit from Ubi$hit).
 
I sure hope so. My next card will need at least 6GB, but I would prefer 8GB.

I don't know why people care about HBM at 4GB. We don't need more speed at 4GB, and speed doesn't make up for a lack of capacity. I hope AMD "surprises" us with 8GB+, where the bandwidth might make sense.
 
Side rant: screw 1080p, do people on [H] actually game at that anymore? I've been at 2560 x 1440 for what seems like years, before upping to 3440 x 1440 widescreen. /rant over
This will quite obviously come as a shock to you, but not everyone has money falling out of their a****.

Also, not everyone games at 1080p because they can't afford better. My 27" 1080p monitor does 120Hz, and I wouldn't trade it for a higher-res 60Hz panel even if it was offered to me for free.
 
Also, not everyone games at 1080p because they can't afford better. My 27" 1080p monitor does 120Hz, and I wouldn't trade it for a higher-res 60Hz panel even if it was offered to me for free.

(hugs 1440p PLS panel @ 100Hz with DSR up to 3620x2036)

:D It can do 115Hz, but the drivers won't allow it in conjunction with DSR just yet...
 
Wrong. 16:10 is at best decent for desktop work, but even there 4K is better. For gaming, 16:9 provides a better FOV, and 2560x1440 has about 78% more pixels than a 1920x1080 setup ;), so yes, there is an ENORMOUS difference.

There is an enormous difference in the number of pixels, the load on the GPU, and the cost. The actual real-world difference in what you see is not that much (even with the slower frame rates), and not enough for me, anyway, to justify the grand I'd need to drop in order to upgrade. Don't get me wrong, I'm not saying 1440p gaming ain't cool; I'm just of the opinion that it's more of a luxury if you've got the money to blow, and only marginally better than my 25" screen playing Crysis 3 maxed out (FXAA) with my 2.5-year-old GTX 670. ;)
 
While I agree on the shitty optimization part, I'm not optimistic enough to think that things are going to improve in the future. Devs these days seem to think that PCs also come with unified memory, so I'd rather dish out more for extra VRAM than be stuck with a card that has the GPU power but is held back by insufficient memory (think 780 Ti and Watch Dogs, although Watch Dogs was an appallingly bad case of shit from Ubi$hit).

Personally, I would rather not support developers who do that kind of lazy programming, but to each his own. My guess is that if ports continue the way they are, those developers will find their PC sales dwindling quite quickly, since 95% of users are not going to run out and buy an 8GB VRAM video card just for some poorly-optimized ports.
 
Dell P2715Q says hello... 4K resolution in a 27" display ;). Even 27" 1440p is an unbelievably huge step up from 24" 1080p, anyways... it's not all about the PPI alone.

If it's not about PPI, what is it about? I had a 26" 1080p display, and given how close I sit to the screen, I found it to be excessively large.
 
Probably not, because the rumor is that Fiji will be using stacked memory, and first-gen HBM technology only allows 4-high stacks of 2Gb dies, so I believe the theoretical max will be 4GB (1GB per stack across the four stacks Fiji is expected to use). Second-gen HBM will allow more.

This may not be a big deal, though, because they are going to have outrageous memory bandwidth, and I assume the Tonga delta color compression technology will be in Fiji as well.

[Attached image: HBM.jpg]
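A quick sketch of where that 4GB ceiling comes from, assuming the four-stack layout mentioned earlier in the thread:

Code:
# First-gen HBM: stacks are 4 dies high, 2Gb per die.
dies_per_stack = 4
gbit_per_die = 2
gb_per_stack = dies_per_stack * gbit_per_die / 8   # -> 1.0 GB per stack
stacks_on_card = 4                                  # rumored Fiji layout
print(stacks_on_card * gb_per_stack)                # -> 4.0 GB theoretical max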
 
Personally, I would rather not support developers who do that kind of lazy programming, but to each his own. My guess is that if ports continue the way they are, those developers will find their PC sales dwindling quite quickly, since 95% of users are not going to run out and buy an 8GB VRAM video card just for some poorly-optimized ports.

I respect that, but it just seems in general that shitty optimization is par for the course these days, and whenever we get a properly optimized game we jump up and down for joy. If PC sales dwindle enough they may stop doing PC ports altogether, or do what UbiSoft does and release twice as many shitty ports to make up for lost revenue.
 
My guess is that if ports continue the way they are, those developers will find their PC sales dwindling quite quickly, since 95% of users are not going to run out and buy an 8GB VRAM video card just for some poorly-optimized ports.

enthusiasts do go out and buy what they don't need...4GB+ cards are marketed towards enthusiasts and not mainstream consumers...look how many [H] members have upgraded to a 970/980 in the past 6 weeks or so (myself included)...although I'm not going to go out and buy an 8GB card for the reasons I outlined in my previous post...

http://hardforum.com/showpost.php?p=1041248985&postcount=8
 
If it's not about PPI, what is it about? I had a 26" 1080p display, and given how close I sit to the screen, I found it to be excessively large.

PPI is just one extra factor, as I said. Gonna just quote myself from a year ago...


Just the same way Blu-ray was the next big thing for home theater. ;) 4K will eventually go mainstream; saying it's the next big thing is what that implies. :)

While the options for 4K monitors are limited right now, and most are very expensive, the prices will come tumbling down just as 1920x1200/1080 panels and 2560x1440/1600 panels did before them. You don't need native game support in most cases to run at 4K res, and you benefit a lot, as you get more texel space (pixels for a given area of the screen to show a texture) and dot-pitch density, not to mention more area for geometry to be plotted, and smoother edges. You'll still want a low level of AA to combat moiré patterning and shimmering edges, but in most games the texture resolutions are currently plenty high to show big gains in quality, unless you crouch by a wall and stare at it an inch away.

Just ask most anyone who's gone from 1080p to 1440p how much better the resolution is. It's another giant jump to 4K.
 
*Patiently waits for dual-core GPUs with 16GB of usable VRAM*

Joking aside, I just upgraded to a 1440p 144Hz monitor with a single GTX 970, and I must say I am more impressed with it than I thought I would be.

That being said, I am not sure I'd jump to a 4K monitor before my Swift breaks. DSR'ed 4K doesn't look jagged or blurry at all on that monitor, so I may very well just 'fake' the 4K until I actually need to buy another monitor. Mostly because 4K is still tied to 60Hz while 1440p can do 120+, so even if future GPUs can handle 4K at 100+ fps, I'll probably stick with my Swift.

Probably.
 