The reviews are live and it's looking promising?

12GB of VRAM is going to be an issue before too long. That's why I'm looking to replace my 3080; 10GB isn't far off from that.
If you buy a $700 card with 12gb on it... your stupid.
Whew, thank God I'm buying the $600 FE then, that almost applied to me
Edit: It's you're* BTW!
16gb is not the target for new console. 4gb~ is reserved for OS use.
As Hardware Unboxed showed the other day... 8GB 3000-series Nvidia cards are getting handily beaten in newer titles by older AMD cards with 16GB of VRAM in RAY TRACING performance. Even without RT enabled, some of the old 8gb cards are essentially unplayable at 1080p ultra and in some cases HIGH settings.
12gb right now is setting yourself up for tears. Game developers have hit a point where they have all said fuck it... 16gb is the target now; the consoles can handle it, and we are not spending a year of development trying to squeeze decent-looking texture packs into 8gb, or finding invisible load points to swap textures in and out.
If you have a Best Buy credit card, you should be able to use code mar24emob25 at checkout for 10% off. And you should also have an offer for 10% back on any purchase made on your Best Buy credit card that you can activate. So the FE and the other $600 models will be $540 + tax, plus $50 in Best Buy rewards. That's effectively $490.
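For anyone checking the math on that stack, here's a quick sketch (tax left out; the rewards payout is rounded to the $50 the post quotes):

```python
# Best Buy stacking described above, using the post's own figures.
list_price = 600                     # FE and the other $600 models
after_coupon = list_price * 0.90     # 10% off with code mar24emob25 -> $540
rewards = 50                         # ~10% back in rewards, as quoted above
effective_cost = after_coupon - rewards

print(after_coupon, effective_cost)  # 540.0 490.0 -> "effectively $490" before tax
```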
I'd rather have a 6950XT for the same money pretty much.
How does the 6950XT compare to the 4070 in ray tracing? Anyone done any comparisons of that?
Seems like they are roughly equal.
16gb is not the target for new console. 4gb~ is reserved for OS use.
It's 2gb PS5, 2.5gb Xbox. They both have RAM for background tasks now.
I notice no one reviewing these today is using any newer titles.
Watch Hardware Unboxed's 8gb vs 16gb last-gen card comparison. In more recent games, AMD's older 16gb cards are using almost 15gb of textures even at 1080p ultra. I notice today no one is using any of the current crop of games that are forcing previously high-end 8gb cards to run at medium settings to remain playable. Only a few years on... and AMD's last-gen crap-RT cards are outperforming Nvidia's last-gen great-RT cards with RT enabled.
Anyway, my only point is... 8gb has clearly become a no-go for ultra settings at this point, and even high settings is an issue with some of the latest games. Nvidia has hobbled these 4070 cards with 12gb. It probably is fine right now... I don't believe 12gb is going to age very well.
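For what it's worth, here's the rough arithmetic behind the console-budget argument above; the 3 GB set aside for CPU-side game data is an illustrative assumption, not a figure from the thread:

```python
# Console memory budget implied by the posts above (thread's OS-reservation numbers;
# the CPU-side allowance is a made-up illustration).
total_unified = 16.0                        # GB of unified memory on PS5 / Xbox Series X
os_reserved = {"PS5": 2.0, "Xbox": 2.5}     # per the post above

for console, reserved in os_reserved.items():
    available = total_unified - reserved    # left for the game itself
    gpu_budget = available - 3.0            # assume ~3 GB goes to game logic / CPU data
    print(f"{console}: {available:.1f} GB for the game, roughly {gpu_budget:.1f} GB of that for graphics")
```

Whether a PC port at ultra settings stays inside or blows past that number is exactly what the thread is arguing about.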
They typically don't review new games though. They want stable games to compare against, where they can use historical data rather than re-testing, and they also don't want to use games like TLOU because of all their technical issues.
At the resolution and settings I play at, 12GB is more than enough for my needs. 4070 it is.
Up here in Canuckistan the 6950XT is still $1000 and usually within $100 of the 4070 Ti, which sits around $1100.
The 4070 just needs to come in here in the $900s and it becomes the clear choice.
Ouch.
$1000 Canadian is just under $750 US.
Ah, sometimes I forget to do the conversion lol.
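(For reference, the conversion being done there; the exchange rate is an approximation for spring 2023, not a figure from the thread:)

```python
cad = 1000
usd_per_cad = 0.74        # approximate CAD->USD rate at the time (assumption)
print(cad * usd_per_cad)  # ~740, i.e. "just under $750 US"
```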
If you play at minimum graphics settings on 1080p and never use any user-made mods that increase graphics, this card will probably work OK for a normal period of time. Plan on going above 1080p at any point with it? Planning on turning up graphics at any point? Planning on adding user-made mods that include graphics uplifts? Anything less than a 16GB card is a useless waste of money.
That's pretty ridiculous. You're basically saying that if you go nvidia it's 4080/4090 or don't bother.
I hope that holds true... newer games are using 15gb of vram at 1440, and the same at 1080 with RT enabled.
I don't play anything released in the last two years - do you mind listing off those more recent titles that are using 15GB of VRAM at 1440p? Not feeling ambitious enough to look them up myself.
Well tbf, Nvidia kind of went out of their way to make the 4090 the only card that really stands out this generation.
Yep, exactly. Nvidia is a waste of money for anything less than a 4080 if you're buying a card in 2023 at new prices, unless you plan to play under the conditions I mentioned. AMD (6800 or higher, last gen or this gen) is the better option if you can't afford a 4080. And the 7900 XTX is a better option than the 4080 if you are not using any of the features Nvidia dominates (VR, RT, DLSS, Reflex, G-Sync module). I don't like the prices on either side. If my current 2070 Super wasn't having problems, I would have waited yet another generation to see if the VRAM and price situations got any better. Instead I had to get a "discounted" RX 6800 from Micro Center - the very last one they had.
Good thing I don't play games then. I don't even own a PC. I don't even have electricity. 4070 should last me no problem until the RTX 6k series comes out then!
Congrats on your inferior video card with the inferior pricing.
Hogwarts Legacy, The Callisto Protocol, The Last of Us Part I, the new Resident Evil (which Hardware Unboxed just showed an 8gb 3070 crashing on at 1080p), the new Plague Tale game. No doubt there are more... and many more on the way. Every one of those will use well over 12gb of ram at 1080p ultra and 1440 high.
I would not go as far as to call shenanigans on all the major reviewers today for not including games like Hogwarts, which is one of the most popular games going at the moment... but I do have to wonder why no one has covered any of this year's big eye candy titles. I seem to remember a time when reviewers didn't ONLY look at 2+ year old games in a review for a brand new GPU. Sure, perhaps they add an asterisk: "hey, we used patch X.X"... but to ignore all games released in the last two years seems odd to me.
Yet you can play Hogwarts Legacy fine on all lows on a GTX 750 Ti and still keep 30fps, and that only has 1GB of VRAM... 20GB of consumed system RAM though.
I have a 750 Ti. I believe it has 2gb. I should try it though... lol.
The 2060 6GB does fine too, keeping to the high 50s/low 60s on ultra, again with like 20+ GB of consumed system RAM when playing Hogwarts.
I see a lot more need for system RAM there than GPU RAM.
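If anyone wants to watch that split for themselves while a game is running, here's a minimal sketch; it assumes an NVIDIA GPU plus the psutil and nvidia-ml-py packages, and the poll interval is arbitrary:

```python
# Minimal system-RAM vs GPU-VRAM monitor; run alongside the game.
# Requires: pip install psutil nvidia-ml-py   (NVIDIA GPU + driver assumed)
import time
import psutil
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)           # first GPU

try:
    while True:
        ram = psutil.virtual_memory()                 # system RAM, whole machine
        vram = pynvml.nvmlDeviceGetMemoryInfo(gpu)    # VRAM allocated on the GPU
        print(f"RAM used: {ram.used / 2**30:5.1f} GiB   "
              f"VRAM used: {vram.used / 2**30:5.1f} GiB")
        time.sleep(5)                                 # arbitrary poll interval
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```

Note that NVML reports VRAM allocated across all processes, so treat it as a rough proxy rather than a per-game number.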