Like I said, the difference in VRAM usage between 4K and 1440p is not as big as you might think, and definitely not the same as the jump from 1080p to 1440p, for example. 1440p is still heavy on VRAM.
The 3070 uses 8 GB of GDDR6, which by today's standards is DOA from day one.
12 GB on the 3080 would have been perfectly fine IMO.
OK, like I said, some people want to, or can afford to, grab a high-end GPU every 2 years; some simply don't want to, or cannot. Even though I can afford it, I still won't do it. My GPU has to last 4 years at a minimum, up to 5 years.
I don't like the idea of swapping a GPU every couple of years, deal with...
Believe it or not, I am not 100% decided, but I'm leaning towards the 3080ti / 6800xt. Right now I would put the 3080 at 40% in my purchase decision, and the 3080ti / 6800xt at 80%.
RT and 4K are known to require more VRAM; it's not a question of maybe, it's already a fact, based on many games, cards, and game engines.
Here are some VRAM usage stats:
https://www.guru3d.com/articles_pages/watch_dogs_legion_pc_graphics_performance_benchmark_review,9.html...
Like I said, I am not planning to keep a card for only 2 years. Why buy a powerful card like the 3080 or 6800xt just to get rid of it in 2 years? Well, if that were the case, then yes, most cards sold today would be fine to keep for 2 years, even the 8 GB 3070, which is a joke of a card, but...
First of all, the next gen of GPUs from Nvidia is going to be 2 years from now. Waiting ±3 months for a 3080ti when it releases and becomes available won't matter, because the 4080ti will still be about 2 years out from whatever that date is. That's the cycle.
Buying a $700 GPU every 2 years is a waste of...
Well, they trade blows depending on the game. In some, the 6800xt does better (even in the lows); in other games the 3080 does better.
I don't think people who buy the 6800xt would trade it for a 3080, for many reasons. Worst case, though, you can sell the 3080 and grab the 6800xt. I don't think 10 GB is...
DLSS was meant
Faster VRAM is no substitute for VRAM capacity. Why do you think they are putting 20 GB on the 3080ti if 10 is enough? Just for giggles? Wasted memory? Marketing?
Thanks. First reply that looks like what I was looking for.
Good to know, though I was told or read somewhere that the difference between 4K and 1440p is not as big as people think, maybe 1 GB less VRAM.
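Some quick back-of-envelope math backs that up: render targets scale with resolution, but they are small next to the texture pool, which is the same at either resolution. A rough sketch in Python (raw RGBA8 buffers only; real engines keep several, plus depth and intermediate targets, but even ten of them only add a few hundred MB at 4K):

```python
# Rough framebuffer math: raw 32-bit RGBA render targets only.
# Textures, not render targets, are what actually eat VRAM.
def framebuffer_mb(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / 1024**2

for name, w, h in [("1440p", 2560, 1440), ("4K", 3840, 2160)]:
    print(f"{name}: {framebuffer_mb(w, h):.1f} MB per buffer")
# 1440p: 14.1 MB per buffer
# 4K:    31.6 MB per buffer
```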
Outperforming in raw GPU power is one thing; not having enough VRAM a year or more down the line is a totally different thing.
10 GB is enough for 4K? Sure, whatever makes you feel better, I guess. Maybe for the games you play; definitely not for all games out there.
This thread sounds like a bunch of RTX fanboys not admitting that their brand-new GPUs shipping with 8/10 GB aren't enough. Time to close the thread, indeed.
And yet, all those numbers seem to help ONLY at 4K, not 1440p. On average, the 3080 is either tied with, or a bit slower than, the 6800xt, so all things considered, that super-fast memory is only good for 4K. Ironically, though, at 4K you need MORE VRAM, not less. Guys playing 4K with a 3080...
Actually, if you did some research, you would know that Nvidia limited their own memory speed and bandwidth; from memory, the per-pin difference versus AMD is only about 3 Gbps. Having faster memory won't buy you more VRAM space.
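For reference, raw bandwidth is just the per-pin data rate times the bus width. A quick sketch using the commonly cited specs (treat these numbers as my assumptions): GDDR6X at 19 Gbps on a 320-bit bus for the 3080, GDDR6 at 16 Gbps on a 256-bit bus for the 6800xt, which AMD pads out with its 128 MB Infinity Cache:

```python
# Bandwidth = per-pin data rate (Gbps) * bus width (bits) / 8 bits per byte.
# Specs below are the commonly cited ones, used here as assumptions.
def bandwidth_gb_s(data_rate_gbps, bus_width_bits):
    return data_rate_gbps * bus_width_bits / 8

print(f"RTX 3080:   {bandwidth_gb_s(19, 320):.0f} GB/s")  # 760 GB/s
print(f"RX 6800 XT: {bandwidth_gb_s(16, 256):.0f} GB/s")  # 512 GB/s
```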
Turning down settings on brand-new hardware released 2 months ago? Hardware that costs $700-$1000, depending on which card we are talking about? Because said cards did not ship with enough VRAM for 2020, let alone 2022 and beyond?
This only proves that RTX is useless today, and that more VRAM...
Like I said, many games today use more than 8 GB of VRAM (use, not just allocate), and some hit the 10 GB limit. Cyberpunk and Doom max out 8 GB at 1440p.
It's barely enough today. Games will always allocate and use more VRAM over time, not less. When the next round of cards is released, 16 GB+ will be the norm, so 10 GB cards won't have much resale value.
Putting 10 GB on a high-end card really was a mistake by Nvidia. It just does not make sense.
The 3080ti would cost about $1000. That's a good $300 over the 6800xt. Anything past 16 GB of VRAM is overkill; even 16 is overkill. I think 12-14 GB is the sweet spot. The 3080ti will have a bit more power than the 6800xt, yes, but will I need it for 1440p? Maybe, though probably not. So the way...
I am favoring the 6800xt, but I would also get the 3080ti if it were available today. I had my mind made up, but the more I read, the more divided people seem over whether 10 GB is enough.
Today, not many games max out 10 GB of VRAM (actual usage), but the question is: what about in 2 years?
I...
Simply because I can get my hands on a 3080 easily, but not the 6800xt.
The 3080ti looks very promising. If I can wait 2-3 months, I can probably grab one, but that's 2-3 months of waiting, and I don't have any GPU at the moment.
Decision to be made: 3080 vs 6800xt
I don't care about ray tracing, streaming, or NVENC.
Some games, like Doom Eternal, and many others are already maxing out 8 GB. 8 GB is clearly not sufficient today (I am talking about VRAM usage, not just allocation), let alone in 2 years.
Back to the...
Well, Intel got away with putting all 8 cores on one single die; think of it as one single CCX. We know that AMD can disable a core on each CCX. So that means if they did like Intel and put 8 cores under a single CCX, they could still disable 2, 3, or more cores as they wish. Cache-wise, well...
What I meant to say is that both CCXs are fully enabled, like in the Ryzen 7 family. Yep, I read that somewhere as well: they just disable or "kill" the physical cores they don't want to use. In fact, they start with the same die and choose the best ones for the 1700X and 1800X family. Maybe...
From what I understand so far... Infinity Fabric is used within the SAME single CCX (in other words, connecting the 4 physical cores together) as well as to connect the two CCXs together (in the case of the Ryzen 7). That said, the difference in latency when pinging cores within the SAME...
OK, went through the article; here is some important data:
- Latency of pings WITHIN the same physical core: Intel 14 ns, AMD 26 ns
- Latency of pings to ANOTHER physical core: Intel 76 ns, AMD 42 ns (if within the same CCX)
- But here is the kicker: since AMD uses 4 cores per CCX, the latency...
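For anyone wondering how those numbers are measured: reviewers typically run a ping-pong test, with two threads pinned to specific cores bouncing a flag back and forth through shared memory. A minimal sketch of the idea in Python (Linux-only, since it uses sched_setaffinity; interpreter overhead pushes the absolute numbers way above the article's nanosecond figures, so treat it as an illustration of the method only, and the same-CCX / cross-CCX core pairs below are assumptions to check against your own topology):

```python
# Ping-pong core-to-core latency sketch. Python overhead dominates, so
# the output illustrates the method, not the article's ns-level numbers.
import os
import time
from multiprocessing import Process, Value

ITERS = 100_000

def pong(flag, core, iters):
    os.sched_setaffinity(0, {core})   # pin the responder to one core
    for _ in range(iters):
        while flag.value != 1:        # spin until the ping arrives
            pass
        flag.value = 0                # send the pong back

def ping(flag, core, iters):
    os.sched_setaffinity(0, {core})   # pin the sender to another core
    start = time.perf_counter()
    for _ in range(iters):
        flag.value = 1                # ping
        while flag.value != 0:        # spin until the pong comes back
            pass
    elapsed = time.perf_counter() - start
    return elapsed / iters / 2 * 1e9  # one-way latency in ns

if __name__ == "__main__":
    # (0, 1) same-CCX and (0, 4) cross-CCX are assumed; verify with lscpu.
    for a, b in [(0, 1), (0, 4)]:
        flag = Value("i", 0, lock=False)
        responder = Process(target=pong, args=(flag, b, ITERS))
        responder.start()
        print(f"cores {a}<->{b}: {ping(flag, a, ITERS):.0f} ns one-way")
        responder.join()
```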
Check this site for info; read the conclusion if you don't want to go through the entire thing. It does seem that to Windows, and I am quoting, "the CCX design of 8-core Ryzen CPUs appears to more closely emulate a 2-socket system". So if I understand that correctly, all cores in one CCX will be...
Not sure if core affinity can be manually assigned in Windows for individual cores. I know it can be done in Linux and other OSes for various applications and usages, but I have no idea if Windows goes as far as allowing that. But I don't know to what extent Ryzen itself manages its CCXs and core...
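Turns out Windows does allow it: Task Manager has a per-process "Set affinity" option, cmd has start /affinity, and the Win32 API exposes SetProcessAffinityMask. From Python, psutil wraps these cross-platform; a minimal sketch (the core IDs are just an assumed one-CCX group, check your actual layout):

```python
# Pin the current process to an assumed one-CCX group of logical cores.
# Requires: pip install psutil (wraps SetProcessAffinityMask on Windows).
import psutil

proc = psutil.Process()              # the current process
print("before:", proc.cpu_affinity())
proc.cpu_affinity([0, 1, 2, 3])      # restrict to the first 4 logical cores
print("after: ", proc.cpu_affinity())
```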
Been keeping an eye on this thread; if anyone can get 32 GB (2x16 GB) working at 3200 with CL14 or CL15, please post the RAM, settings, and motherboard you are using.
Thanks for the link; there is actually a TON of info in there. It would take someone weeks to go through it all :)
What I would like to know is... well, let me explain, and I am throwing out some theoretical numbers here. If going from DDR4-2400 to DDR4-3200 gets us (let's say) a 20% performance boost... The...
Thanks for the results, gonna look them over later today, have not had time yet!
I was hoping that someone with a real 2400 kit would do the tests for us, exactly like in the video. You see, right now we are downclocking from 3200 to 2400 and loosening the timings (which is proven to be possible of...
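While we wait on someone with a real 2400 kit, the first-word latency math at least bounds what that downgrade trades away: true latency in ns is 2000 × CL / data rate in MT/s. A quick sketch (the CL14/CL15-at-3200 figures come from this thread; CL15 at 2400 is my assumption for a stock kit):

```python
# True (first-word) latency in ns = 2000 * CAS latency / data rate in MT/s.
def true_latency_ns(cl, mt_s):
    return 2000 * cl / mt_s

for cl, mt_s in [(15, 2400), (15, 3200), (14, 3200)]:
    print(f"DDR4-{mt_s} CL{cl}: {true_latency_ns(cl, mt_s):.2f} ns")
# DDR4-2400 CL15: 12.50 ns
# DDR4-3200 CL15: 9.38 ns
# DDR4-3200 CL14: 8.75 ns
```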
Wait, how can you be a serious OCer without having RealBench... come on! Are you going to charge us for posting on here too?? lol
I want an original kit running at 2400 with stock timings (whatever it comes with) --> benchmark it with Cinebench, RealBench, or Geekbench.
Then on that same system...
I read that one of the reviewers on Newegg got two of the 2x8 GB kits and they did not work for 32 GB. G.Skill replied to get the 4x8 GB kit... I thought they were exactly the same...