> That sounds like someone thought about how not to tank the servers entirely on launch day.... Lol what a thought!
BUT I WANT IT NOW!!! REEEEEEEEE
> 6650XT (8GB) getting 6/32
> RTX 3080 (10GB) getting 6/25
> 3080 has 25% more VRAM and it's a more powerful GPU, but getting lower performance
Nvidia driver overhead?
> It's probably not 100% vram, but that is definitely a big factor. Sorry to 3070 users, but everyone should have seen that 8gb was going to be very limiting going forwards.
The 10GB VRAM was the reason I didn't want to "downgrade" to a 3080, I knew it would be a problem just didn't imagine it would be a problem this soon.
> The 10GB VRAM was the reason I didn't want to "downgrade" to a 3080, I knew it would be a problem just didn't imagine it would be a problem this soon.
Yup. Never go down on an upgrade. Idc how fast the bus is. I'm not going to be caught with the next Fury X.
> Nvidia driver overhead?
Doubt it. That driver overhead doesn't seem to exist on the 3080Ti even though it's the same driver.
> Doubt it. That driver overhead doesn't seem to exist on the 3080Ti even though it's the same driver.
The 3080Ti has more VRAM so it fits in the picture, pretty sure the game just wants all the VRAM for RT.
There’s a software issue at play here. It could be drivers or the game. Perhaps both. Clear to me it isn’t a hardware issue.
> I dunno - my 3080 with 10 gigs of vram has barely gone over 9 gigs of usage yet.
Okay, I think some people are not realizing what is going on in some newer games. In the past we could look at VRAM usage and think that's just allocation, not what the game really needs, but things have absolutely changed in some newer games. Some games will now start swapping out to system RAM without even allocating all of the VRAM: if you have a 10 gig card, a game may stop allocating at just over 9 gigs and start using system RAM once it's running out of VRAM. At least a couple of other games have been doing this, because you can see system RAM usage go way up compared to a card that has more VRAM. For example, if you took a 3080 12 gig and a 3080 10 gig, you would see more system RAM usage on the 10 gig model if it's VRAM limited.
Seems very sporadic how all of our experiences are.
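One way to take some of the guesswork out of these comparisons is to log VRAM and system RAM side by side while the game runs and see whether allocation stops short of the card's capacity while system RAM keeps climbing, as described above. A minimal monitoring sketch (my own illustration, not from the thread), assuming an NVIDIA card with nvidia-smi on the PATH and the psutil package installed:

```python
# Editorial sketch: poll dedicated VRAM usage (via nvidia-smi) and overall
# system RAM usage (via psutil) once per second while a game is running.
import subprocess
import time

import psutil


def vram_used_mib(gpu_index: int = 0) -> int:
    """Return dedicated VRAM currently in use, in MiB, for the given GPU."""
    out = subprocess.check_output(
        ["nvidia-smi", f"--id={gpu_index}",
         "--query-gpu=memory.used", "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.strip())


if __name__ == "__main__":
    while True:
        ram = psutil.virtual_memory()
        print(f"VRAM used: {vram_used_mib()} MiB | "
              f"system RAM used: {ram.used / 2**30:.1f} GiB")
        time.sleep(1.0)
```

Run it in a second terminal while playing; on a card that is spilling over, you would expect VRAM to plateau just under its capacity while system RAM keeps rising.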
> Far Cry 6 and Watch Dogs Legion at max settings and HD texture options have frame rate drop issues on an RTX 3070. The HD textures don't look that much better, but still, it is a setting that might have to be turned down. 8GB is becoming a bit limiting as of now.
Don't even know why it's a surprise. Did people really think 8gb was going to last forever? Yeah, sorry people overpaid $500-600+ for that paltry amount, but the writing was on the wall already.
> The 3080Ti has more VRAM so it fits in the picture, pretty sure the game just wants all the VRAM for RT.
I was responding to driver overhead. Also, why does the RX 6650 with less vram and a weaker GPU perform better than a 3080? The numbers both of these are producing are nowhere near being cpu limited so it's not an overhead issue. And the 8gb card performing better than the 10gb card suggests it's not a vram issue either.
> Don't even know why it's a surprise. Did people really think 8gb was going to last forever? Yeah, sorry people overpaid $500-600+ for that paltry amount, but the writing was on the wall already.
I had a 3070 and a 2080 Ti, and even 2 years ago there were games where I had to lower the texture settings on the 3070, and in a couple of cases even at 1440p with ray tracing, whereas the 2080 Ti had no such issues. Really, even the 2070 and 2080 should have had 10 or 12 gigs, as games then were already pushing close to 8 gigs at playable settings.
> Okay, I think some people are not realizing what is going on in some newer games. In the past we could look at VRAM usage and think that's just allocation, not what the game really needs, but things have absolutely changed in some newer games. ...
I've seen this happen with my 2060 in Shadow of the Tomb Raider, I ran out of VRAM and the game started pulling around 16GB and ran fine.
> Okay, I think some people are not realizing what is going on in some newer games. In the past we could look at VRAM usage and think that's just allocation, not what the game really needs, but things have absolutely changed in some newer games. ...
Well to be fair - I've cached at least 64 gigs from my enterprise NVMe to be used in game or wherever Windows needs it too. Maybe that's helping as well.
> I doubt this is a vram issue. Driver and/or engine issue.
Based on what? Frame rate only seems to tank with <16GB cards.
> 10GB VRAM is still fine (especially if you're playing at 1440p like I am)...this game is an outlier and what looks to be poor optimization...give it a few patches and performance will be better
I don't play 1440p, plus what's an outlier today will be the norm tomorrow. People are too quick to jump on the "it's just badly optimized" bandwagon. The RT effects do look awesome in this game, probably the biggest visual difference between RT off and max of any game I've seen.
> I don't play 1440p, plus what's an outlier today will be the norm tomorrow. People are too quick to jump on the "it's just badly optimized" bandwagon. The RT effects do look awesome in this game, probably the biggest visual difference between RT off and max of any game I've seen.
Except the evidence here suggests that it IS a badly optimized game. Again, unless you can explain why a weaker GPU with less VRAM that isn't CPU limited is performing better than one with more VRAM and a more powerful GPU, it's not only silly but downright stupid to ignore optimization and jump straight to "VRAM! herp derp".
> Based on what? Frame rate only seems to tank with <16GB cards.
I'll ask you personally for the 3rd time... Why is an RX 6650 which is weaker than a 3080 and has 2GB less VRAM, performing better than said 3080 in a non-CPU limited scenario?
Hardwareunboxed said that 8 GB is not enough even at 1080p and that for 1440p 12 GB is the minimum you really need.
> The game doesn't look that good to warrant that VRAM usage. But 8GB at 1440 and especially ultra wide is becoming a problem.
I largely agree with some prior comments on here - expecting 8GB to be enough for a 2023 AAA title is just absurd. Although Nvidia is somewhat to blame here due to being stingy with their VRAM amounts. The 3080 should have been 12GB to begin with. The 3070 should have been 10GB.
It could very well be both - the game is unoptimized in some way but still may require more than 8GB of VRAM for 1440.
> And just think the 4060 is only going to be an 8 gig card. That is essentially DOA in the vram department for some games that it could otherwise handle.
I think some of this is just many years of stagnation where people largely didn't need to worry about VRAM unless gaming at 4K. For most of GPU history, VRAM was always the #1 upgrade factor whenever new games came out, and GPUs were very short-lived, mostly due to the rapid expansion of VRAM requirements.
> I'll ask you personally for the 3rd time... Why is an RX 6650 which is weaker than a 3080 and has 2GB less VRAM, performing better than said 3080 in a non-CPU limited scenario?
Probably because AMD's implementation of RT uses slightly less VRAM? Who knows, you can't compare apples to oranges. The fact remains that the game needs more VRAM than any other for RT. If it wasn't a VRAM issue why would a 3060 12GB perform infinitely better than a 3080 10GB? Same architecture, same driver.
> did they fix the RT bug?...there was a hack on Reddit which got it working properly but I hear RT AO and shadows were basically broken during the Early Access period
IDK what bug that was, I tried the game with AO and shadows off, only with reflections, and it was still unplayable on a 2080 Ti.
> IDK what bug that was, ...
idk if it was a bug or people can't come to terms that their card can't handle it and it's still in its infancy, imo.
> And just think the 4060 is only going to be an 8 gig card. That is essentially DOA in the vram department for some games that it could otherwise handle. Really even the next gen 50 class cards need to be 10 or 12 gigs at this point.
Not really... It will need to run lower settings than max due to the gpu anyway, reducing VRAM requirements.
> Probably because AMD's implementation of RT uses slightly less VRAM? Who knows, you can't compare apples to oranges. The fact remains that the game needs more VRAM than any other for RT. If it wasn't a VRAM issue why would a 3060 12GB perform infinitely better than a 3080 10GB? Same architecture, same driver.
What do you think is more likely: AMD managed to be better at RT in this one scenario despite being WELL behind nvidia in everything RT related…. Or…. There's a glaring optimization issue? Do you have a source for AMD's efficient use of RT? Or was that made up?
> What do you think is more likely: AMD managed to be better at RT in this one scenario despite being WELL behind nvidia in everything RT related…. Or…. There's a glaring optimization issue?
No, unless we're looking at different tests. The benchmarks on the prior page still show the Nvidia hardware being ahead in raytracing.
Have you ever run out of vram? Can you point me to a scenario where going from 10 to 12GB yields a 1200% performance increase? Bet you cannot.
It seems like you're willing to put in all the baseless guesswork necessary to convince yourself this is nothing more than a vram issue.
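Worth spelling out the mechanism both sides are arguing about: running out of VRAM doesn't slow a card down uniformly, it forces some texture fetches over PCIe instead of the card's own memory, so performance either barely moves or falls off a cliff depending on how much the game streams per frame. A rough back-of-the-envelope comparison using approximate public specs (my numbers, not anything posted in the thread):

```python
# Approximate public specs, for scale only: on-card memory bandwidth of an
# RTX 3080 10GB versus the PCIe 4.0 x16 link a game falls back to once
# textures no longer fit in VRAM.
VRAM_BANDWIDTH_GBPS = 760.0   # GDDR6X on the 3080 10GB, roughly
PCIE4_X16_GBPS = 32.0         # PCIe 4.0 x16, theoretical, one direction

print(f"On-card memory is ~{VRAM_BANDWIDTH_GBPS / PCIE4_X16_GBPS:.0f}x "
      f"faster than the PCIe path")   # prints ~24x
```

Whether this particular game is actually hitting that cliff, or simply mismanaging memory, is exactly what's in dispute above.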
> No, unless we're looking at different tests. The benchmarks on the prior page still show the Nvidia hardware being ahead in raytracing.
That's kind of my point. M76 was saying it's probable that AMD uses less vram with RT on, hence better performance on a 6650 vs 3080 even though it's a weaker GPU with 2GB less VRAM. There's a 1440p chart with RT on where the 8GB 6650 was matching a 10GB 3080 in 1% lows and beating it in average fps.
> The game doesn't look that good to warrant that VRAM usage. But 8GB at 1440 and especially ultra wide is becoming a problem.
The dev studio on this game is new. It's likely they aren't the best at telling DX12 what to do and how to do it.