3080 10GB and 4K

Actually you can if there are options to do so. The 320 MB 8800 GTS was obsolete long before the 640 MB version was. The 256 MB 8800 GT was essentially DOA compared to the 512 MB version. The 6 GB 1060 can run higher settings with much smoother performance than the 3 GB version. The 8 GB 570/580 allows much higher textures and such than the 4 GB version.

My knowledge may be a little rusty, but aren't some of those examples different in more ways than just memory? The 1060, for example, also had fewer CUDA cores, which is probably more responsible for the performance gap.

Likewise, my understanding was generally that the additional VRAM was meant to handle an increase in resolution. For example, with the 580, one would probably be fine with the 4GB model at 1080p but better served by the 8GB model at 1440p, no? Right now, the prominent resolutions we're gaming at are 1440p and 4K (or the ultrawide in-betweens). If 10GB is enough for the 3080 to handle 4K (and according to Nvidia, it is), then why add more VRAM? Nobody is gaming at 8K, and I really don't think it's on anyone's radar right now. Even if it were, I don't really think the memory is the difference between this being a 4K GPU and an 8K GPU... even Nvidia's claims that the 3090 is an 8K GPU are a bit dubious IMO. These cards aren't powerful enough for that (and they really don't need to be).

There might be one or two one-off examples of games that would be served well by more than 10GB. I expect that, for the functional life of the 3080, they will be very few. Few enough to not justify the cost... which is the whole point. If you want to go balls-to-the-wall, the 3090 is waiting for you. For the rest of us minding our wallets, better to trust that Nvidia knows better than we do and matched the appropriate amount of VRAM to their GPU.
 
My knowledge may be a little rusty, but aren't some of those examples different in more ways than just memory? The 1060, for example, also had fewer CUDA cores, which is probably more responsible for the performance gap.

Likewise, my understanding was generally that the additional VRAM was meant to handle an increase in resolution. For example, with the 580, one would probably be fine with the 4GB model at 1080p but better served by the 8GB model at 1440p, no? Right now, the prominent resolutions we're gaming at are 1440p and 4K (or the ultrawide in-betweens). If 10GB is enough for the 3080 to handle 4K (and according to Nvidia, it is), then why add more VRAM? Nobody is gaming at 8K, and I really don't think it's on anyone's radar right now. Even if it were, I don't really think the memory is the difference between this being a 4K GPU and an 8K GPU... even Nvidia's claims that the 3090 is an 8K GPU are a bit dubious IMO. These cards aren't powerful enough for that (and they really don't need to be).

There might be one or two one-off examples of games that would be served well by more than 10GB. I expect that, for the functional life of the 3080, they will be very few. Few enough to not justify the cost... which is the whole point. If you want to go balls-to-the-wall, the 3090 is waiting for you. For the rest of us minding our wallets, better to trust that Nvidia knows better than we do and matched the appropriate amount of VRAM to their GPU.
Only the 1060 has more CUDA cores, but it's not enough to matter and is completely irrelevant to the point I was making. I'm only referring to the VRAM part of the equation here, which is being able to run higher textures and settings that use more VRAM. And 1440p to 1080p makes very little difference in VRAM usage, so that doesn't save you at all if you want to run higher-resolution textures on a 580. The 8GB to 4GB difference is real: the smaller buffer keeps you from running higher-resolution textures, and the larger one keeps the frame rate much smoother at settings that get close to full VRAM utilization.

So the point is, when given two options, it has historically always been better to go ahead and get the higher-VRAM option. As it stands now there is no higher-VRAM option for the 3080, so we will have to see what the future holds for it. As I pointed out earlier in this thread, though, we already have a game right now where 8GB of VRAM is a limitation at only 1440p and requires a couple of settings to be turned down. I sure as hell don't want to spend $700 or even $500 (3070) on a brand-new video card and have to turn down settings that I could otherwise run if the card had enough VRAM.
 
If we take the 1060, for example, the 3GB model would compare to what would be a 3080 with 5GB (half its normal 10), not the other way around (i.e., comparing the 1060 6GB to a doubled-VRAM 3080).

The point is, it's always best to go with the VRAM capacity the card was designed with rather than the stripped-down versions. That's about the only conclusion to draw from the 1060/580's 3/6 or 4/8GB variants.

Pretty sure a 1060 wouldn't be any faster or hold its own longer if it had 12GB. Only wasted money on useless VRAM.
 
If we take the 1060, for example, the 3GB model would compare to what would be a 3080 with 5GB (half its normal 10), not the other way around (i.e., comparing the 1060 6GB to a doubled-VRAM 3080).

The point is, it's always best to go with the VRAM capacity the card was designed with rather than the stripped-down versions. That's about the only conclusion to draw from the 1060/580's 3/6 or 4/8GB variants.

Pretty sure a 1060 wouldn't be any faster or hold its own longer if it had 12GB. Only wasted money on useless VRAM.
I see your point, but you're sort of just making your own rules up as you go along. I am pretty sure that the 480 and 580 standard VRAM was 4GB. When the 480 came out, people said how stupid it was to spend the extra money on the 8GB version. Well, that turned out to be a bit wrong, as you can absolutely run higher settings and use vastly improved textures in most modern games compared to the 4GB version.
 
Only the 1060 has more CUDA cores, but it's not enough to matter and is completely irrelevant to the point I was making. I'm only referring to the VRAM part of the equation here, which is being able to run higher textures and settings that use more VRAM. And 1440p to 1080p makes very little difference in VRAM usage, so that doesn't save you at all if you want to run higher-resolution textures on a 580. The 8GB to 4GB difference is real: the smaller buffer keeps you from running higher-resolution textures, and the larger one keeps the frame rate much smoother at settings that get close to full VRAM utilization.

So the point is, when given two options, it has historically always been better to go ahead and get the higher-VRAM option. As it stands now there is no higher-VRAM option for the 3080, so we will have to see what the future holds for it. As I pointed out earlier in this thread, though, we already have a game right now where 8GB of VRAM is a limitation at only 1440p and requires a couple of settings to be turned down. I sure as hell don't want to spend $700 or even $500 (3070) on a brand-new video card and have to turn down settings that I could otherwise run if the card had enough VRAM.

Right, but there is a theoretical maximum here. Like the guy after you said, a 12GB 1060 would be a waste. Hell, 11GB on the 1080 Ti was unnecessary. You can load all the VRAM you want on a 3080 and it doesn't change the fact that there is a theoretical maximum the GPU can make use of. Nvidia pegged that number at 10GB. Do you know something Nvidia doesn't?
 
Sometimes doubling VRAM helps. When the ATI 5870 came out, it had a whopping 1GB of VRAM, which most people thought was more than sufficient. Then the GTX 480 came out with 1.5GB of VRAM, which most people thought was just right if they could stand the heat of the card. ATI responded with a 2GB variant of the 5870 that allowed the card to handle multi-monitor setups. The increased VRAM on the same 5870 chip really did improve performance at higher resolutions or in games with more textures.

It was also pretty clear that the Fury X and a few other AMD cards were choking on having half the VRAM the cards needed to run properly.
 
Right, but there is a theoretical maximum here. Like the guy after you said, a 12GB 1060 would be a waste. Hell, 11GB on the 1080 Ti was unnecessary. You can load all the VRAM you want on a 3080 and it doesn't change the fact that there is a theoretical maximum the GPU can make use of. Nvidia pegged that number at 10GB. Do you know something Nvidia doesn't?
So you actually think Nvidia gave it 10GB of VRAM because they magically determined that's exactly how much it needs and that it will never need more for future games? Really? The only reason it has 10GB of VRAM is that it has a 320-bit bus and the next option would be 20GB. I'm pretty sure it's just basic common sense that GDDR6X is not exactly cheap and they built this to a price point. I mean, it doesn't take a genius to look back and see that the much slower 1080 Ti has 11GB of VRAM, so by your logic Nvidia thought that was just the right amount for it but thinks the much faster 3080 needs less?

Anyway, all that said, I think 10GB will be just fine in most normal cases for the next couple of years. When spending that kind of money on a brand-new card, though, I would just like a little flexibility, as we are already getting games that can use nearly that much VRAM right now. The 3090 is not really an option for me, as it's only going to be about 15% faster than the 3080 in gaming yet commands well over twice the price.
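For anyone who wants to see why the bus width pins the capacity to those two numbers, here's a rough sketch in Python. The 32-bit-per-chip interface and the 1 GB / 2 GB module sizes are assumptions based on commonly listed GDDR6X densities, not anything Nvidia has spelled out:

[CODE]
# Rough sketch: why a 320-bit bus ties the 3080 to 10 GB or 20 GB.
# Assumes 32-bit-wide GDDR6X chips in 1 GB or 2 GB densities; these
# figures are assumptions for illustration, not official Nvidia specs.

BUS_WIDTH_BITS = 320
BITS_PER_CHIP = 32          # each GDDR6X package presents a 32-bit interface

chips = BUS_WIDTH_BITS // BITS_PER_CHIP   # 10 memory chips on the board

for chip_capacity_gb in (1, 2):
    total = chips * chip_capacity_gb
    print(f"{chips} chips x {chip_capacity_gb} GB = {total} GB")

# Output:
# 10 chips x 1 GB = 10 GB
# 10 chips x 2 GB = 20 GB
[/CODE]

So the jump really is all-or-nothing on this bus: 10GB or 20GB, nothing in between.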
 
So you actually think Nvidia gave it 10GB of VRAM because they magically determined that's exactly how much it needs and that it will never need more for future games? Really? The only reason it has 10GB of VRAM is that it has a 320-bit bus and the next option would be 20GB. I'm pretty sure it's just basic common sense that GDDR6X is not exactly cheap and they built this to a price point. I mean, it doesn't take a genius to look back and see that the much slower 1080 Ti has 11GB of VRAM, so by your logic Nvidia thought that was just the right amount for it but thinks the much faster 3080 needs less?

Anyway, all that said, I think 10GB will be just fine in most normal cases for the next couple of years. When spending that kind of money on a brand-new card, though, I would just like a little flexibility, as we are already getting games that can use nearly that much VRAM right now. The 3090 is not really an option for me, as it's only going to be about 15% faster than the 3080 in gaming yet commands well over twice the price.

No, exactly like you've said, I think 10GB is the logical choice because they are trying to be competitive. If it's good enough for 90% of what will be thrown at this card, it wouldn't make sense to make it several hundred dollars more expensive for that last 10% (if that) that MIGHT need more VRAM. I trust that Nvidia knows better than you about what the 3080 does and does not require to remain competitive long term.

I believe the 1080 Ti has 11GB of VRAM as a flex, nothing more. Nvidia put it on there because they could, and it sounded impressive at the time. My guess is that if it had shipped with less VRAM, no performance would have been lost.

Nvidia isn't trying to flex this time. They are trying to hit an extremely attractive price-to-performance ratio before AMD comes to market. And they have.

My guess is very few people will be limited by (or even care about) the amount of VRAM. Why should Nvidia unnecessarily inflate their cost to appease a minority?
 
Sigh. Things like textures that need a lot of VRAM have almost zero impact on performance, so I get tired of this nonsense claiming that a GPU has to be a certain speed to take advantage of more VRAM. Let's discuss some actual facts: Wolfenstein: Youngblood, right now, today, requires more than 8GB of video RAM to run max settings with ray tracing even at just 1440p. If you max all the settings like that, the game will tell you that you have exceeded the VRAM, and if you run it anyway it will hitch and eventually lock up. You have to turn down texture streaming one notch and also lower the DLSS setting to at least Balanced. With more VRAM that is not an issue, and having more GPU power would be irrelevant, as again it's VRAM that is the limitation there. I have tested this on a 2080 Super and a 2060 Super, both of which have 8GB of VRAM. And this other nonsense people are saying about plenty of system RAM offsetting the VRAM does not apply to plenty of games: if they run out of VRAM they will hitch, stutter, and in the case of Wolfenstein: Youngblood eventually just lock up.

Now this is interesting. The 2080 Super is the fastest 8GB card currently available, and a game already has VRAM issues with it at 1440p? This doesn't bode well for the 3070. At least the 3080 has 10GB, which is a little more breathing room.

As someone who used a Fermi chip well into the next gen, I was able to easily use more than 1GB of VRAM at playable settings in 2013 games (Tomb Raider, Battlefield 3, BioShock Infinite). It was a mobile chip (580M), which was an underclocked 560 Ti 2GB, with similar performance to a desktop GTX 460 2GB. Yet few bought a 2GB 460, 560, or even 560 Ti. If anyone kept a 1GB Fermi into 2013 (only 2-3 years after the Fermi launch), they would have been in a VRAM-limited situation. This may happen again, more so with the 8GB 3070, but with the 10GB 3080 to a smaller extent.

A lot of people here only ever run maxed-out settings, or damn near maxed out, maybe toggling down one or two settings. Keep a GPU for a long time and you get real used to tweaking settings. With a 2GB Fermi, I got to enjoy Ultra textures on 2013+ games at 1080p. Naturally, other settings such as lighting, shadows, draw distance, etc. took a hit to Medium or High. But textures are often a single-digit % difference from Medium to Ultra, almost free as long as you have the VRAM. And personally I notice textures more than shadows.

If anyone plans to skip the post-Ampere gen, then they need to be careful here. I think 2023 will not be kind to the 3070 at all, and the 3080 might have some issues of its own. I expect the 2022 gen to launch with 16GB as standard for the RTX 4070, and 12GB for the 4060. Developers will soon begin utilizing this. And this train could start sooner if 16-20GB versions of the 3070-3080 are released, perhaps as Super variants next year, not to mention if AMD gains some market share with 12-16GB cards.
 
While many more users report never experiencing utilization that high, I've read around a dozen posts on Reddit from users who encountered games using close to, or the full, 11GB on their Tis (the one I recall specifically was the HZD port on a user's 1080 Ti). So it does occur, but it appears to be limited to much more recent (or unoptimized) games. Nevertheless, I do wonder how the 10GB will manage in years to come.

Also, I know that for various neural network/machine learning uses, higher-capacity VRAM helps (something I mentioned in a topic I posted here recently), but this thread is gaming-specific, obviously.
 
Turning on RT will increase VRAM requirements; the BVH can be around 1GB by itself. Reflections require the whole scene, or much more of it, to be in memory just in case a reflective surface sees it. When [H]ardOCP did the initial BF5 review, the 8GB cards had issues running out of VRAM, hitching and stuttering, besides the rather low FPS; I think that was cleared up later. HDR requires higher-bit-depth textures and so on, adding to total VRAM requirements. VR, where you are rendering two views, will take more VRAM with the upcoming higher-resolution headsets. Newer games are more complex, with hundreds if not thousands more objects, denser objects, more shaders, more textures, and longer draw distances, and draw calls are dramatically more numerous than before, pushing PCIe bandwidth, which will limit how effective system RAM can be at mitigating low VRAM when that bandwidth is already being pushed.

Think more in terms of tomorrow's games. DLSS with lower-resolution rendering can help, and better compression to and from the GPU should help tremendously with Ampere GPUs. Microsoft was able to take low-res textures and convert them to high resolution using ML without any noticeable degradation or differences. So there are solutions to VRAM issues as well. Will 10GB be enough? I do not know; I do believe it could fall short if not managed right and over time, though how much time I do not know.
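Just to put rough numbers on how those pieces stack up, here's a back-of-the-envelope sketch in Python. The ~1GB BVH figure is taken from the post above; every other line item is a hypothetical placeholder purely for illustration, not a measured value:

[CODE]
# Back-of-the-envelope VRAM budget. Only the ~1 GB BVH figure comes from the
# discussion above; the other line items are hypothetical placeholders.

def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    """Raw cost of the final framebuffers, e.g. triple-buffered RGBA8."""
    return width * height * bytes_per_pixel * buffers / 1024**2

budget_gb = {
    "final framebuffers (4K, triple buffered)": framebuffer_mb(3840, 2160) / 1024,
    "textures (hypothetical ultra pack)":       6.0,   # placeholder
    "geometry / misc buffers":                  1.5,   # placeholder
    "RT BVH":                                   1.0,   # ~1 GB, per the post above
}

for item, gb in budget_gb.items():
    print(f"{item:42s} {gb:5.2f} GB")
print(f"{'total':42s} {sum(budget_gb.values()):5.2f} GB")
[/CODE]

The raw framebuffers are tiny next to textures; a real engine adds G-buffers and other render targets on top, so treat this only as an illustration of how the items add up, not as a measurement.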
 
As long as the 3080 doesn't turn into the next 970 I'm good. I'm just excited to get a decent card for $699; my 2070 isn't getting me eNuff frames!
 
Only the 1060 has more CUDA cores, but it's not enough to matter and is completely irrelevant to the point I was making. I'm only referring to the VRAM part of the equation here, which is being able to run higher textures and settings that use more VRAM. And 1440p to 1080p makes very little difference in VRAM usage, so that doesn't save you at all if you want to run higher-resolution textures on a 580. The 8GB to 4GB difference is real: the smaller buffer keeps you from running higher-resolution textures, and the larger one keeps the frame rate much smoother at settings that get close to full VRAM utilization.

So the point is, when given two options, it has historically always been better to go ahead and get the higher-VRAM option. As it stands now there is no higher-VRAM option for the 3080, so we will have to see what the future holds for it. As I pointed out earlier in this thread, though, we already have a game right now where 8GB of VRAM is a limitation at only 1440p and requires a couple of settings to be turned down. I sure as hell don't want to spend $700 or even $500 (3070) on a brand-new video card and have to turn down settings that I could otherwise run if the card had enough VRAM.

Correct on the matter of textures eating more VRAM than resolution, even 4K. 8K will be a different story, but we have a ways to go there.

My point is that games have sort of stagnated in VRAM usage over the last couple of years. Most still only need 4 to 6 GB max, while a few will eat up 8 GB. Anything beyond that is very rare (the id Tech engine) and will only require a 10-second settings fix with little to no visual impact.

I kind of went nutso tracking VRAM requirements, but here's what some of the more popular games use:
https://hardforum.com/threads/the-slowing-growth-of-vram-in-games.1971558/page-3
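If anyone wants to do their own tracking, here's a minimal logging sketch in Python. It assumes an Nvidia card with the driver's nvidia-smi tool on the PATH, and the 5-second polling interval is an arbitrary choice:

[CODE]
# Minimal VRAM-usage logger sketch. Assumes nvidia-smi is installed and on
# the PATH (it ships with the Nvidia driver); the polling interval is arbitrary.
import subprocess
import time

def vram_used_mb():
    """Return (used_MB, total_MB) for GPU 0 as reported by nvidia-smi."""
    out = subprocess.check_output([
        "nvidia-smi",
        "--query-gpu=memory.used,memory.total",
        "--format=csv,noheader,nounits",
        "--id=0",
    ], text=True)
    used, total = (int(x) for x in out.strip().split(","))
    return used, total

if __name__ == "__main__":
    while True:
        used, total = vram_used_mb()
        print(f"{time.strftime('%H:%M:%S')}  {used} / {total} MB")
        time.sleep(5)  # sample every 5 seconds while the game runs
[/CODE]

Keep in mind this reports allocated VRAM, which isn't necessarily the same as what a game strictly needs before it starts hitching.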
 
Memory bandwidth matters. I’m interested in independent reviews to see how the 3080 and 3070 actually perform.
 
Memory bandwidth matters. I’m interested in independent reviews to see how the 3080 and 3070 actually perform.

^This right here. Also, I'm interested to see where the 3070 really lands, as it's being advertised as 2080 Ti performance, but as discussed here it only has 8GB of VRAM. To me this means the 3070 will be a go-to GPU for 1080p high-refresh gaming, especially FPS/comp titles, as those players dump settings to maximize FPS. The 8GB of VRAM won't be an issue for that use case, and it will sell like mad regardless of whether the reviews back up the hype.
 
As long as the 3080 doesn't turn into the next 970 I'm good. I'm just excited to get a decent card for $699; my 2070 isn't getting me eNuff frames!
I hope the 3070 IS the next 970. The "RAM issues" never really affected it in actual games.
 
^This right here. Also, I'm interested to see where the 3070 really lands, as it's being advertised as 2080 Ti performance, but as discussed here it only has 8GB of VRAM. To me this means the 3070 will be a go-to GPU for 1080p high-refresh gaming, especially FPS/comp titles, as those players dump settings to maximize FPS. The 8GB of VRAM won't be an issue for that use case, and it will sell like mad regardless of whether the reviews back up the hype.

The 3070 seems the most questionable. I am not worried about the drop in memory capacity, but about the big drop in memory bandwidth vs. the 2080 Ti, which it supposedly equals.

Every game is different. I suspect there will be some that are starved for memory bandwidth on the 3070.
 
I hope the 3070 IS the next 970. The "RAM issues" never really affected it in actual games.
GPU performance-wise it didn't; the bigger issue wasn't that the 970 didn't have enough RAM, but that if a game decided to use that last 512MB, whether it needed it or not, your 1% lows could tank.

And with SLI like GoldenTiger is using, there was enough performance on tap to make using all of the addressable VRAM more likely to occur. Pretty awful experience when that kicks off!

And it doesn't really look like that will be an issue with the RTX 3070. All VRAM is running at full speed here.

<I still have a GTX 970 in an older machine, and it cranks through older games, or newer games with settings set to low at 1440p, quite nicely today>
 
The 3070 seems the most questionable. I am not worried about the drop in memory capacity, but about the big drop in memory bandwidth vs. the 2080 Ti, which it supposedly equals.

Every game is different. I suspect there will be some that are starved for memory bandwidth on the 3070.


Could very well turn out that way for sure.
 
Could very well turn out that way for sure.

It's looking even worse for the 3070. Multiple sources are confirming it has 14 Gbps GDDR6 with 448 GB/s of bandwidth.
https://www.thefpsreview.com/2020/09/05/nvidia-geforce-rtx-3070-only-uses-14-gbps-gddr6-memory/

448 GB/s vs. 616 GB/s on the 2080 Ti.

616/448 = 38% more memory bandwidth on the 2080 Ti. That is a LOT of memory bandwidth to make up for with memory-handling efficiency tweaks...

Now look at the 3080 vs. the 2080. It's delivering 70-80% gains in raster games, and it has 760/448 = 70% more memory bandwidth than the 2080. That seems reasonable.

The 3070, with a 38% bandwidth deficit, is somehow beating the 2080 Ti. That doesn't seem so reasonable.
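For anyone who wants to sanity-check those figures, peak bandwidth is just per-pin data rate times bus width. A quick sketch in Python, using the commonly cited specs; the 3070 and 3080 numbers were still unconfirmed at the time of writing, so treat those as assumptions:

[CODE]
# Peak memory bandwidth = data rate (Gbps per pin) * bus width (bits) / 8.
# Turing figures are the published specs; the Ampere entries are the
# commonly reported (but then-unconfirmed) numbers, i.e. assumptions.

cards = {
    "RTX 2080":    (14, 256),   # 14 Gbps GDDR6, 256-bit bus
    "RTX 2080 Ti": (14, 352),   # 14 Gbps GDDR6, 352-bit bus
    "RTX 3070":    (14, 256),   # 14 Gbps GDDR6, 256-bit bus (reported)
    "RTX 3080":    (19, 320),   # 19 Gbps GDDR6X, 320-bit bus
}

for name, (gbps, bus_bits) in cards.items():
    bandwidth = gbps * bus_bits / 8    # GB/s
    print(f"{name}: {bandwidth:.0f} GB/s")

# Output:
# RTX 2080: 448 GB/s
# RTX 2080 Ti: 616 GB/s
# RTX 3070: 448 GB/s
# RTX 3080: 760 GB/s
[/CODE]

Which lines up with the 448 / 616 / 760 GB/s figures above.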
 
Let's not forget what the cards were designed for, though. The 2080 Ti was touted as a 4K card; the 3070 isn't (at least to my knowledge).
 
Let's not forget what the cards were designed for, though. The 2080 Ti was touted as a 4K card; the 3070 isn't (at least to my knowledge).
'Designed for' and 'marketed for' are very much two different things. Whether a card is a '4K card' is going to depend far more on the game and settings it's being used with, and much less on what it was marketed as.
 
Let's not forget what the cards were designed for, though. The 2080 Ti was touted as a 4K card; the 3070 isn't (at least to my knowledge).

Yes, but the recent Ampere reveal said the 3070 beats a 2080 Ti. Not that it beats a 2080 Ti until you increase the resolution. I just remain very skeptical of this one.
 
It's looking even worse for the 3070. Multiple sources are confirming it has 14 Gbps GDDR6 with 448 GB/s of bandwidth.
https://www.thefpsreview.com/2020/09/05/nvidia-geforce-rtx-3070-only-uses-14-gbps-gddr6-memory/

448 GB/s vs. 616 GB/s on the 2080 Ti.

616/448 = 38% more memory bandwidth on the 2080 Ti. That is a LOT of memory bandwidth to make up for with memory-handling efficiency tweaks...

Now look at the 3080 vs. the 2080. It's delivering 70-80% gains in raster games, and it has 760/448 = 70% more memory bandwidth than the 2080. That seems reasonable.

The 3070, with a 38% bandwidth deficit, is somehow beating the 2080 Ti. That doesn't seem so reasonable.

It's possible the Founders Edition will be 16 Gbps while third parties have a mix of 14 and 16 Gbps speeds.

Let's hope Nvidia learned a lesson from the 5600 XT fiasco.

Speculating on the 3070 Ti (please, God, do not call it a 3070 Super): it will be 256-bit GDDR6X and maybe 16 GB.
 
So with that memory bandwidth deficit, and at $499+, the people dumping their 2080 Ti cards at under 500 bucks may end up regretting it. There's lots of speculation that the 3070 will beat the 2080 Ti in RT only and maybe come close in non-RT gaming. Faster, and with more VRAM, a used 2080 Ti for less money sounds pretty good. I get that people selling a 2080 Ti are looking at the 3080 and beyond, but the 2080 Ti may end up being the best used GPU buy ever. Until people come to their senses and the price goes back up...
 
With VRR I find that I use up to 7.6GB in some titles on my 1080 Ti @ 4K.

I think that:
3070 is 1440p
3080 is 4K/60Hz
3090 is 4K/120+Hz
 
Those thinking 8GB is sufficient for next-gen games at 4K, I would question even 10GB. Look at Doom Eternal on max settings at 4K: it does not even have RT yet and it took 9GB of VRAM. Add at least another 1GB+ to the VRAM requirements with RT. Look how well the 2080 did at 1080p and 1440p compared to the 1080 Ti, then look at 4K:

 
Those thinking 8GB is sufficient for next-gen games at 4K, I would question even 10GB. Look at Doom Eternal on max settings at 4K: it does not even have RT yet and it took 9GB of VRAM. Add at least another 1GB+ to the VRAM requirements with RT. Look how well the 2080 did at 1080p and 1440p compared to the 1080 Ti, then look at 4K:

ONE game, at 4K on Nightmare settings (beyond Ultra). Settings that were practically designed for, and recommend, a 2080 Ti.

This again reminds me of when I watched Digital Foundry talking about The Witcher 2 on modern HW.

Crushing GPUs with expensive settings? The Witcher 2, from 2011, says hold my beer:


You can adjust the settings on a game that is almost a decade old (2011) to crush a 2080 Ti. That doesn't mean the 2080 Ti was insufficient for next-gen games starting in 2012. ;)

As in all these discussions, I hope they offer double-VRAM cards for those who need/want them and are willing to enrich the GPU companies with fat margins.
 
Those thinking 8GB is sufficient for next-gen games at 4K, I would question even 10GB. Look at Doom Eternal on max settings at 4K: it does not even have RT yet and it took 9GB of VRAM. Add at least another 1GB+ to the VRAM requirements with RT. Look how well the 2080 did at 1080p and 1440p compared to the 1080 Ti, then look at 4K:



Yes, we got it. Doom/id Tech at 4K uber-nightmare uses more than 8 GB. It is one of a few scenarios out of many. It still doesn't justify the cost of jumping up to 16 GB of VRAM.
 
ONE game, at 4K on Nightmare settings (beyond Ultra). Settings that were practically designed for, and recommend, a 2080 Ti.

This again reminds me of when I watched Digital Foundry talking about The Witcher 2 on modern HW.

The Witcher 2, from 2011, says hold my beer:


You can adjust the settings on a game that is almost a decade old (2011) to crush a 2080 Ti. That doesn't mean the 2080 Ti was insufficient for next-gen games starting in 2012. ;)

As in all these discussions, I hope they offer double-VRAM cards for those who need/want them and are willing to enrich the GPU companies with fat margins.


115 FPS at 4K with AA on. Ubersampling is basically 8K resolution scaling; no wonder the 2080 Ti is getting crushed.
 
115 FPS at 4K with AA on. Ubersampling is basically 8K resolution scaling; no wonder the 2080 Ti is getting crushed.

At the timestamp (19:26 if the link isn't working), Ubersampling is off. It's cinematic DOF that crushes performance down into the 30-40 FPS range.
 
Those thinking 8GB is sufficient for next-gen games at 4K, I would question even 10GB. Look at Doom Eternal on max settings at 4K: it does not even have RT yet and it took 9GB of VRAM. Add at least another 1GB+ to the VRAM requirements with RT. Look how well the 2080 did at 1080p and 1440p compared to the 1080 Ti, then look at 4K:



I wouldn't read too much into this. My 2070 Super with 8GB of VRAM can do 4K maxed, no problem. The only thing holding it back is processing power.

EDIT: Just did another test in Doom Eternal @ 4K maxed settings, and it actually ran quite well. It does max out the VRAM, but no stutters. Very smooth and very enjoyable, even @ 4K.

Doom Eternal probably saturates the VRAM as much as possible; it's part of the way the game is designed.
 
Doom Eternal at 4K is playable on maxed settings on my 1080 Ti. Not sure what your point is... Control, on the other hand...
 
Yes, we got it. Doom/id Tech at 4K uber-nightmare uses more than 8 GB. It is one of a few scenarios out of many. It still doesn't justify the cost of jumping up to 16 GB of VRAM.
????, tomorrow's games will most likely use more and more VRAM. RT has an overhead due to BVH memory, plus more of the scene geometry has to be in memory if reflections are used, which will be incorporated into Doom. Virtually all of today's games are fine, but there's not much buffer for the future. 8GB, I say, is not a long-term solution for max quality and higher resolutions.

I wouldn't read too much into this. My 2070 Super with 8GB of VRAM can do 4K maxed, no problem. The only thing holding it back is processing power.

EDIT: Just did another test in Doom Eternal @ 4K maxed settings, and it actually ran quite well. It does max out the VRAM, but no stutters. Very smooth and very enjoyable, even @ 4K.

Doom Eternal probably saturates the VRAM as much as possible; it's part of the way the game is designed.
Hardware Unboxed definitely ran into reduced performance in their testing, where the larger RAM size of the 1080 Ti let it beat the 2080 handily at 4K while losing at lower resolutions. Those going to use 4K OLED TVs (aka 4K), I suspect, will have to reduce settings due to RAM amounts for some games. 10GB would give some room with the 3080. For those who upgrade every year or a little longer, I can see 8GB maybe being OK, and 10GB is better. Some of us have a different perspective on long-term usage and/or resale value later.
 
115 FPS at 4K with AA on. Ubersampling is basically 8K resolution scaling; no wonder the 2080 Ti is getting crushed.
For one, it is the 2080 and not the 2080 Ti, and the video clearly shows Doom Eternal was set to Ultra with RS OFF (no scaling). Doom has one setting above Ultra, and that is Ultra Nightmare, so the game was not even at its max quality level and the 2080's 8GB ran out of memory. Unless they flubbed the test. Did you watch the video?
 
ONE game, at 4K on Nightmare settings (beyond Ultra). Settings that were practically designed for, and recommend, a 2080 Ti.

This again reminds me of when I watched Digital Foundry talking about The Witcher 2 on modern HW.

Crushing GPUs with expensive settings? The Witcher 2, from 2011, says hold my beer:


You can adjust the settings on a game that is almost a decade old (2011) to crush a 2080 Ti. That doesn't mean the 2080 Ti was insufficient for next-gen games starting in 2012. ;)

As in all these discussions, I hope they offer double-VRAM cards for those who need/want them and are willing to enrich the GPU companies with fat margins.

The game was not at its max settings; it was on Nightmare, the max is Ultra Nightmare, and that 8GB of VRAM on the 2080 puttered out. It is also relevant for those wanting the extra IQ options, textures, RT, HDR, etc., who may consider getting something with more than 8GB or 10GB. Yes, reducing settings on newer RT games and maybe not using RT will get the job done.

Edit: Correction, Doom has Ultra, Nightmare, and Ultra Nightmare, so if Hardware Unboxed used Ultra it was three settings from the top. Makes me wonder if they didn't mean Ultra Nightmare with the presets.
 
The game was not at its max settings; it was on Nightmare, the max is Ultra Nightmare, and that 8GB of VRAM on the 2080 puttered out. It is also relevant for those wanting the extra IQ options, textures, RT, HDR, etc., who may consider getting something with more than 8GB or 10GB. Yes, reducing settings on newer RT games and maybe not using RT will get the job done.

Edit: Correction, Doom has Ultra, Nightmare, and Ultra Nightmare, so if Hardware Unboxed used Ultra it was three settings from the top. Makes me wonder if they didn't mean Ultra Nightmare with the presets.

They said "Nightmare", I believe they have Low, Medium, High, Ultra, Nightmare, and Ultra Nightmare.

So Nightmare is still above Ultra, and is likely just a stupid, crush GPU setting that shows no useful visual benefit.

Also it isn't like the game performing badly. Nightmare 4K was averaging 88 FPS, 1% lows were 71 FPS.

Other points I remain are valid as I pointed out you can use setting from 2011 Witcher 2, to crush a 2080 Ti.

And out the whole test, you found ONE lonely game that showed some hint of a slowdown from 8GB of memory. But 1% lows were still above 71 FPS on nightmare 4K.

That isn't evidence that suddenly 8GB or 10GB cards will suddenly get crushed by next gen games.

Because ultimately what do you think Next Gen games are going to run on? They are going to sell every 8GB RTX 3070 they can make for months to come if it really lives up to performance, and NVidias flagship 3080 is a beast of a card with 10GB. Devs certainly won't aim higher than that.
 

I don't know... as someone who has played at 4K with a 980 Ti (6GB), 980 Ti SLI, 1080 (8GB), 1080 Ti (11GB), 1080 Ti SLI, and a 2080 Super (with only, gasp, 8GB)... I'm going to cut my own hair into a wild mop and try and muddy everything. (That was sarcasm, for everyone not in tune with the sarcasm circuits.)
 