RTX 3080 Founders Edition Review by Gamers Nexus.

People vastly overestimate the amount of VRAM needed in most cases. Hint: the numbers you see games report are allocation, not what is actually needed.

You're not as smart as you think you are. It's already been objectively proven that 10GB isn't enough for 4k even in last gen games. You're parroting Nvidia marketing propaganda.
 
Actually it makes them look like shills.

Already objectively proven that last-generation games need more than 11GB of VRAM at 4K. The 3080, with only 10GB, is even worse.

https://www.dsogaming.com/pc-performance-analyses/marvels-avengers-pc-performance-analysis/

Have a nice day.
Sounds like a garbage game. Poor token example.

All in all, Marvel’s Avengers performs similarly to its Open Beta build. In other words, the game still suffers from some optimization issues. The biggest issue here is the game’s limited scalability. Even though it provides lots of graphics settings, the performance difference between Medium and Ultra settings is not that big. This is something that Nixxes will have to look into. Moreover, the game lacks some effects that were present in previous titles that were using this engine. For instance, it lacks the amazing snow effects of Shadow of the Tomb Raider. And while at times the game looks like a next-gen game, at other times it looks like an early current-gen title.
 
You are wrong. Every single freaking review outlet has said that you are wrong. Stop perpetuating nonsense. Makes you look like a crybaby.
Nah, he is right when it comes to future games within the next year. Consoles launch this year, and new games are going to kill the VRAM, 100%.
 
Nah, he is right when it comes to future games within the next year. Consoles launch this year, and new games are going to kill the VRAM, 100%.

We'll see. There are a lot of what-ifs when it comes to future games and what effect next gen will have on resources. No multiplatform game releasing this year was designed solely with next gen in mind, and that won't change for a while.
 
Show me in ANY of the current 4K benchmarks where the VRAM was a limitation. Please.

He can't. All he has is some comment from id/Bethesda about games coming out in a few years from now, and a game that has total garbage optimization.
 
He can't. All he has is some comment from id/Bethesda about games coming out in a few years from now, and a game that has total garbage optimization.

A few years from now I can just buy a new GPU which will be faster in all other regards anyway. The VRAM argument around all of this seems way overblown to me. Buy a 3090 if you feel like you need an obscene amount of VRAM.
 
Psst, send me the next Powerball numbers via PM.
Consoles push game software and engines forward. They are designed around those limitations now, not PC hardware. I expect a jump in both CPU core requirements and VRAM very soon; it's not that hard to predict.
 
A few years from now I can just buy a new GPU which will be faster in all other regards anyway. The VRAM argument around all of this seems way overblown to me. Buy a 3090 if you feel like you need an obscene amount of VRAM.

We're less than a month away from AMD's announcements. They might have a part with similar performance to the 3080 that also has 16GB of VRAM. The bottom line is that the 3080 looks janky as shit for next generation games, and it's stupid to not wait to see what the situation looks like in a month or two.
 
Consoles push game software and engines forward. They are designed around those limitations now, not PC hardware. I expect a jump in both CPU core requirements and VRAM very soon; it's not that hard to predict.

If my memory serves me right, it took a bit for requirements to jump across the board this generation. The next generation seems to be even more focused on multi-plat multi-generational titles than this one was, so I don't know if I'd expect things to change "very soon".
 
Next generation games will need significantly more VRAM. The 3080 is DOA. Already obsolete. More snake oil from the green team.

They knew that it was a crippling problem with the part, so they told all the shills to talk up how much VRAM doesn't matter. There's a reason why every single fucking review had a little section blocked off for TRUST US THIS CARD HAS ENOUGH VRAM REALLY YOU CAN DEFINITELY TRUST US.


Ignoring your hyperbole, it is a worry that 10 GB will not be enough when these next-gen games and consoles show up. Maybe not in 2020 or 2021, but I worry that at 4K in 2022 this card may start hitting its 10 GB wall, which is not a worry for enthusiasts who regularly upgrade but is a problem for those guys that hang on to cards for 3-4 years at a time. It's definitely a firm "NO" from me and I'll be skipping it in favor of something better down the road: either a refresh of Ampere from NVIDIA, or I'll see how Big Navi fares, although for me to consider AMD they really need solid drivers this time around.

P.S. Has anyone found a review where they tested Call of Duty BR w/this card at 1080p 240 Hz? It seems most of these reviews are virtual clones of each other with the same old boring SP games.
 
NVIDIA GeForce RTX 3080 Performance In Blender, Octane, V-Ray, & More

This is the kind of upgrade us 3D artists have been waiting for. What I find the most impressive is how well the 3080 does over the Titan RTX:

When we wrote our RTX 3080 launch article, we mentioned that it was a “2080 Ti Killer for $700”, but the truth is, this new GPU even has the TITAN RTX in its sights. Versus that, the RTX 3080 is 47% faster, but in comparison to the previous-gen $699 2080 SUPER, it’s 103% faster. It’s 3x faster than the 1080 Ti, released also at $699 a few years ago.

I can't wait to see the performance benchmarks on the 3090, because as useless as that card is for gaming (let's be honest, nobody is going to tap into 24GB of VRAM for gaming), when it comes to machine learning, protein folding, simulations, and 3D rendering this is one of the biggest jumps in performance-for-price ratio I've ever seen. It's crazy how NVIDIA leapfrogged the Titan, which is still currently at $2,400, with a $700 card, not to mention it's blowing away the RTX 6000 by a factor of 2 when it comes to V-Ray mpaths performance.
 
I guess this is kind of off-topic, but every video I've watched of that Gamers Nexus guy, he always seems irritable and a little snarky. It's like he hates the whole process of reviewing hardware and is poorly trying to suppress his feelings.
 
I guess this is kind of off-topic, but every video I've watched of that Gamers Nexus guy, he always seems irritable and a little snarky. It's like he hates the whole process of reviewing hardware and is poorly trying to suppress his feelings.

I think it's just his composure and presentation. I'm pretty sure he loves what he does, or else he'd not have gotten into it as a career. To run a website dedicated to PC gaming technology, you have to love it, because it's a high-risk business and you can end up wasting years on it and finish with very little. I'm actually surprised a site like GN exists in this day and age, where most people want all their information in a condensed 5-10 minute YT video.
 
Already linked it several times.

https://www.dsogaming.com/pc-performance-analyses/marvels-avengers-pc-performance-analysis/

There are a lot of games that use more than 10GB of VRAM at 4k. I think it's gone under the radar because so few people actually run at 4k.

Read it. It did not establish beyond a shadow of a doubt that the game is VRAM limited, since that would have required a back-to-back test with another card with more VRAM showing good performance, for example a Titan card. At best, the article can only hypothesize that a VRAM limitation *may* be the issue, and that is precisely how they worded it.
 
I think it's just his composure and presentation. I'm pretty sure he loves what he does, or else he'd not have gotten into it as a career. To run a website dedicated to PC gaming technology, you have to love it, because it's a high-risk business and you can end up wasting years on it and finish with very little. I'm actually surprised a site like GN exists in this day and age, where most people want all their information in a condensed 5-10 minute YT video.
Don't get me wrong, I overall enjoy his videos—they're insanely informative—and I agree that he must have a true passion for it. I just cannot overlook his joyless and somewhat passive-aggressive attitude. I already hate how more and more we're being driven to video reviews and articles. If that's the future, at least don't make it such a dour one.

Though at the same time, I hate fake chipperness. Maybe I'm just impossible to please.
 
Don't get me wrong, I overall enjoy his videos—they're insanely informative—and I agree that he must have a true passion for it. I just cannot overlook his joyless and somewhat passive-aggressive attitude. I already hate how more and more we're being driven to video reviews and articles. If that's the future, at least don't make it such a dour one.

Though at the same time, I hate fake chipperness. Maybe I'm just impossible to please.

Well, he could be exhausted from doing the 3080 review in a short time as well. It's not exactly easy doing all those boring tests over and over and then having to put the graphs, media, and videos together in time for a day 1 review.
 
Read it. It did not establish beyond a shadow of a doubt that the game is VRAM limited, since that would have required a back-to-back test with another card with more VRAM showing good performance, for example a Titan card. At best, the article can only hypothesize that a VRAM limitation *may* be the issue, and that is precisely how they worded it.


Right, the only way to show you're actually VRAM-limited is if you take a card with twice the VRAM and see a significant performance improvement.

You can also show relative performance differences as you increase the resolution for a maxed-out game (but that is not always as accurate, as other aspects of the card could be limiting higher resolutions).

Luckily, we should expect to see 20GB 3080s in the near future.

In most game tests, the 1660 Ti 6GB card maintains its expected relative performance against the 2060 Super 8GB (so we're just starting to hit that limit in REQUIRED VRAM).

There are no stock game tests where the 8GB Turing cards run out of VRAM (so we have some life left in 10GB).

AMD also sees similar performance - the RX 5600 XT 6GB keeps up pretty well with the RX 5700 8GB, even at 4K (if you run the RAM at the same speeds). 8GB is plenty for the next couple years (and 10GB is good for the next 4).

Are you still going to be running that 3080 as your primary gaming card 4 years from now? Most likely it will be too slow to run newer games with high enough effects settings to exceed that VRAM (or you will have to drop down from 4K to 1440p, reducing your overall VRAM needs).
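If anyone wants to actually watch this on their own system instead of arguing about it, here's a rough sketch that logs VRAM usage while a game or benchmark runs. It assumes an NVIDIA card and the pynvml Python bindings, and keep in mind it only reports allocated memory, which is an upper bound on what the game truly needs, so treat the numbers accordingly.

```python
# Rough sketch: poll VRAM usage while a game/benchmark runs in another window.
# Assumes an NVIDIA GPU and the pynvml bindings (pip install nvidia-ml-py3).
# Reports *allocated* memory, an upper bound on what the game actually needs.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

peak_used = 0.0
try:
    while True:
        info = pynvml.nvmlDeviceGetMemoryInfo(handle)
        used_gb = info.used / 1024**3
        peak_used = max(peak_used, used_gb)
        print(f"VRAM used: {used_gb:.2f} GB (peak {peak_used:.2f} GB)")
        time.sleep(1.0)  # sample once per second
except KeyboardInterrupt:
    print(f"Peak VRAM observed: {peak_used:.2f} GB")
finally:
    pynvml.nvmlShutdown()
```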
 
Don't get me wrong, I overall enjoy his videos—they're insanely informative—and I agree that he must have a true passion for it. I just cannot overlook his joyless and somewhat passive-aggressive attitude. I already hate how more and more we're being driven to video reviews and articles. If that's the future, at least don't make it such a dour one.

Though at the same time, I hate fake chipperness. Maybe I'm just impossible to please.

He's just a reserved person. Even when he's laughing and having fun with fellow YouTubers and stuff, he's more reserved about it than most.
 
He's just a reserved person. Even when he's laughing and having fun with fellow YouTubers and stuff, he's more reserved about it than most.
Would we rather have a guy like this talking for 30 minutes in the video or a smoking hot model that is just reading a script and "acting"?

On second thought, don't answer that!
 
So is it good? I don't have the patience to listen to this guy drone on for 30min.
 
Read it. It did not establish beyond a shadow of a doubt that the game is VRAM limited, since that would have required a back-to-back test with another card with more VRAM showing good performance, for example a Titan card. At best, the article can only hypothesize that a VRAM limitation *may* be the issue, and that is precisely how they worded it.

Yup let's just keep deflecting to make sure that as many suckers order a 3080 as possible. Have to keep parroting that line for as long as possible.
 
Yup let's just keep deflecting to make sure that as many suckers order a 3080 as possible. Have to keep parroting that line for as long as possible.

Provide more proof than one, likely poorly optimized, game. If you can't do that, then all you have is baseless speculation.
 
Everyone bitching about VRAM, remember when Nvidia snookered everyone with the 3.5GB 970? Now we have 10GB of VRAM and way faster memory, $500 cheaper than a 2080 Ti with more performance, and people still complain... What a buncha whiners!
 
Ignoring your hyperbole, it is a worry that 10 GB will not be enough when these next-gen games and consoles show up. Maybe not in 2020 or 2021, but I worry that at 4K in 2022 this card may start hitting its 10 GB wall, which is not a worry for enthusiasts who regularly upgrade but is a problem for those guys that hang on to cards for 3-4 years at a time. It's definitely a firm "NO" from me and I'll be skipping it in favor of something better down the road: either a refresh of Ampere from NVIDIA, or I'll see how Big Navi fares, although for me to consider AMD they really need solid drivers this time around.

P.S. Has anyone found a review where they tested Call of Duty BR w/this card at 1080p 240 Hz? It seems most of these reviews are virtual clones of each other with the same old boring SP games.

I think that this is often overlooked by people that upgrade pretty much every generation. I tend to keep my cards for at least a couple gens these days, and all three times that I've bought the higher-VRAM model I was glad I had that extra memory well before I replaced the card, though it was mostly useless early on outside of high-res texture mods.

Right, the only way to show you're actually VRAM-limited is if you take a card with twice the VRAM and see a significant performance improvement.

You can also show relative performance differences as you increase the resolution for a maxed-out game (but that is not always as accurate, as other aspects of the card could be limiting higher resolutions).

The best way I've found to test VRAM limitations is lowering texture detail. It's nearly free performance-wise when you have enough VRAM, but performance tanks if you run out. I do think that some modern game engines probably do a better job of adjusting things on the fly to keep VRAM usage from maxing out, but that means they have to be lowering details or evicting cached assets, which can also lead to lower performance or pop-in.
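To put numbers on that texture-detail test instead of eyeballing it, here's a minimal sketch for comparing frame-time captures from two presets. It assumes you've logged frame times to CSV with a tool like PresentMon and that the column is named MsBetweenPresents; the file names are just placeholders. If the average barely moves between presets but the 1% lows crater at the higher one, that's the usual signature of running out of VRAM.

```python
# Minimal sketch: compare frame-time logs from two texture presets to spot a VRAM cliff.
# Assumes CSV captures (e.g. from PresentMon) with an "MsBetweenPresents" column;
# adjust the column name if your capture tool labels it differently.
import csv

def summarize(path):
    """Return (average FPS, 1% low FPS) for one frame-time log."""
    with open(path, newline="") as f:
        frame_ms = [float(row["MsBetweenPresents"]) for row in csv.DictReader(f)]
    frame_ms.sort()  # ascending: slowest (longest) frames end up at the tail
    avg_fps = 1000.0 / (sum(frame_ms) / len(frame_ms))
    worst_1pct = frame_ms[int(len(frame_ms) * 0.99):]  # slowest 1% of frames
    low_1pct_fps = 1000.0 / (sum(worst_1pct) / len(worst_1pct))
    return avg_fps, low_1pct_fps

# Hypothetical capture files for two texture presets on the same test run:
for label, path in [("High textures", "high_textures.csv"),
                    ("Ultra textures", "ultra_textures.csv")]:
    avg, low = summarize(path)
    print(f"{label}: {avg:.1f} FPS avg, {low:.1f} FPS 1% low")
# If Ultra's average barely moves but its 1% lows crater, you're likely VRAM-limited.
```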
 
My take is that the 3080 is a no-brainer buy unless:

1) You have a 2080 Ti.
2) You want to wait for the 3090.
3) You want to wait for AMD's part.

Eh, #1 is debatable...especially at 4K. But if you're ok with shelling out that kind of dosh on a video card, you'll probably be in category 2.
 
thefpsreview has the best take on the VRAM issue

And it will be interesting to see if Nvidia can get their tensor core VRAM compression working in most games. It could be meaningful, but only if they do a better job of getting the feature used. DLSS is big on the marketing, but so far it barely exists in the wild.
 
thefpsreview has the best take on the VRAM issue

And it will be interesting to see if Nvidia can get their tensor core VRAM compression working in most games. It could be meaningful, but only if they do a better job of getting the feature used. DLSS is big on the marketing, but so far it barely exists in the wild.


I'm sorry dude, but, along with DLSS 3.0, Tensor lossless compression is just one of many fake rumors. If NVIDIA had a way to accelerate lossless compression using Tensor units, then they would have included it in the launch specs.

Remember when Pascal launched as DisplayPort 1.2 certified and DP 1.3/1.4 Ready (but it had to be enabled through a BIOS update a year later)? They still announced this feature at launch.

Same goes for Pascal's 4K video playback with HDCP 2.2 (enabled 6 months after launch with a driver update)... but it WAS in the launch specs!

So far, the only missing feature anybody is whining about is the magical-mystical DLSS 3.0 (and even those sites realize what a tremendous undertaking it would have been, assuming it's even real).
 
Whelp, I had one in my cart and was putting in my billing info before the site errored me out, and now it's out of stock.

Guess my wife's PC has to wait a bit.
 
Considering my local Microcenter has 13, I think that number is a bit off.
Wanna sell one of those 13 as a post-party intermediary vendor/middleman? How much to part with it? Does money talk? Everything has its price, right? What are your fees?
 