Should I just get a 4070Ti?

What does 'cannot play maxed out' even mean? I think you hit the nail on the head regarding prices, and I blame those people. I don't want to get attacked or have the mods after me, but that is the reality: lots of people paid crazy prices, especially for 'crypto cards,' and now BOTH NVIDIA AND AMD know that people will pay big bucks for these cards. The new gen is expensive even though it's more readily available, and the 3080-and-higher-tier Ampere cards are either out of circulation or priced way above MSRP, almost at crypto-period prices. Also, many are third-party sellers hoping that buyers have no clue about all that and have the cash to spend at those insane prices.
If only people had just waited and not bought, prices probably would have come down.
I bought a 3060 new (and later sold it) and bought a 3080 used; my only concern is the VRAM. It can do some 4K gaming, and I hit 60 fps most of the time.
Many people seem to think they should be able to play at 4K with maxed-out rasterization settings (RT not included). For older games, the 4070 Ti is usually more than capable of 4K at max detail. With newer releases you eventually have to make detail concessions to maintain playable framerates, as with any other card, until you end up like me with a GTX 460 1GB that can't hit playable framerates at 1080p even at minimum detail. Nvidia also has DLSS in numerous games (not all) that can help prolong the lifespan of the 4070 Ti.

I can't afford these expensive cards, so I'll wait as long as I can and lower detail settings as needed; I got used to the lowest detail settings with my GTX 460.

Edit: I read an article indicating that many Gen Z adults are living at home with their parents and spending like there is no tomorrow, which keeps inflation going up while people like me, with a mortgage, car payment, and utilities, get hammered by higher costs. I am still wondering who has so much money to throw at the latest-gen cards to play games, because I sure don't!
 
Well, tbf, that's Reddit. Every subreddit there is full of people who can't figure out why item A, which costs 20% of what item B does, can't do everything item B can.

*Edit* I will say that yes, I'd be bummed about paying $800 for 12GB of VRAM. The difference is that I know what my expectations are with 12GB. Can't say the same for them :D
The 4070 Ti has 12GB of VRAM like the 3080 12GB and 3080 Ti 12GB, uses less power, costs less than crypto-boom prices, and at 1080p and 1440p generally exceeds the power-hungry 3090 Ti in performance. From that perspective the 4070 Ti's price seems reasonable if we look at raw performance and where it falls relative to last-gen parts and their prices. I'm not going to argue over MSRP because that is practically meaningless anymore. I don't like the situation, but the people with money burning a hole in their pockets are keeping prices higher than I feel they should be.
 
The 4070 Ti has 12GB of VRAM like the 3080 12GB and 3080 Ti 12GB, uses less power, costs less than crypto-boom prices, and at 1080p and 1440p generally exceeds the power-hungry 3090 Ti in performance. From that perspective the 4070 Ti's price seems reasonable if we look at raw performance and where it falls relative to last-gen parts and their prices. I'm not going to argue over MSRP because that is practically meaningless anymore. I don't like the situation, but the people with money burning a hole in their pockets are keeping prices higher than I feel they should be.
Considering the 4080 and 4070 Ti aren't selling well, it's a minority that is buying the newer cards.

I hate to even say this, but right now the 6950 XT and 7900 XT are much better buys (with the 7900 XT at $800) over a 4070 Ti, just because of VRAM alone.

IMO the 4070 Ti and 6950 XT should be around $650 and the 7900 XT should be around $700.
 
Considering the 4080 and 4070 Ti aren't selling well, it's a minority that is buying the newer cards.

I hate to even say this, but right now the 6950 XT and 7900 XT are much better buys (with the 7900 XT at $800) over a 4070 Ti, just because of VRAM alone.

IMO the 4070 Ti and 6950 XT should be around $650 and the 7900 XT should be around $700.
Yeah, I think if AMD actually wanted to move product they'd drop the price on the 7900 XT to $700. That would make it a proper successor to what it truly is, the 7800 XT. Anything below 16GB of VRAM seems like a no-go now.
 
Yeah, I think if AMD actually wanted to move product they'd drop the price on the 7900 XT to $700. That would make it a proper successor to what it truly is, the 7800 XT. Anything below 16GB of VRAM seems like a no-go now.
Heck, if The Last of Us is any indication of the future of PC gaming, it seems 8GB might struggle with 1080p ultra quality settings. It isn't that the GPU isn't powerful enough; it just runs out of memory.

[attached screenshot: The Last of Us VRAM benchmark results]


I would not get a GPU under 16GB of memory right now imo.
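To give a rough sense of why memory, not compute, becomes the wall, here's a back-of-the-envelope texture-memory sketch. The texture sizes, counts, and compression note are illustrative assumptions, not measurements from any particular game:

```python
# Back-of-the-envelope texture memory math (illustrative, not a measurement of
# any specific game): uncompressed RGBA8 textures with full mip chains.

BYTES_PER_PIXEL = 4       # RGBA8
MIP_OVERHEAD = 4 / 3      # a full mip chain adds roughly 33%

def texture_mb(size: int) -> float:
    """Memory footprint in MiB for one square texture of the given edge size."""
    return size * size * BYTES_PER_PIXEL * MIP_OVERHEAD / (1024 ** 2)

for size, count in ((2048, 500), (4096, 500)):
    total_gb = texture_mb(size) * count / 1024
    print(f"{count} x {size}x{size} textures ~ {total_gb:.1f} GB before buffers and render targets")

# Block compression (BC1/BC7) cuts this by roughly 4-8x, but modern games ship
# many more and larger textures, so an 8GB card still fills up fast at ultra settings.
```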
 
Heck, if The Last of Us is any indication of the future of PC gaming, it seems 8GB might struggle with 1080p ultra quality settings. It isn't that the GPU isn't powerful enough; it just runs out of memory.

[attached screenshot: The Last of Us VRAM benchmark results]

I would not get a GPU under 16GB of memory right now imo.
That's the second time I've seen that screenshot posted on a forum yet I don't see anything on their YouTube channel or their website. Can you link me to where that's actually coming from?
 
Considering the 4080 and 4070 Ti aren't selling well, it's a minority that is buying the newer cards.

I hate to even say this, but right now the 6950 XT and 7900 XT are much better buys (with the 7900 XT at $800) over a 4070 Ti, just because of VRAM alone.

IMO the 4070 Ti and 6950 XT should be around $650 and the 7900 XT should be around $700.
Nvidia should have bumped VRAM to 12GB with the 3070 and possibly kept it at 12GB for the 4070, then bumped it to 16GB for the 5070. They held onto 8GB for the mid-tier card about one generation too long, since they've had 8GB there since the 1070. The 3080 at 10GB was a joke; it should have been 12GB out of the gate. They also bunched the performance of the top-end cards so there was hardly any difference between the 3080 12GB, 3080 Ti 12GB, 3090, and 3090 Ti. Too many top-level cards with less than 10% performance difference when stepping up to the next one.

The xx70 series is really best as a 1440p card, and for 4K with older games. I have a 3060 Ti that does well at 1440p, and at 4K in older games, maybe dropping some details in certain titles. I played Control at 1440p with RT, using DLSS to get roughly 60 FPS; I can't complain. Eventually that 8GB is going to become a handicap at 1440p, and I'll have to resort to scaling down my resolution to keep the higher details, or make some detail concessions, as one has to with any GPU that is getting ever older.
 
Heck, if The Last of Us is any indication of the future of PC gaming, it seems 8GB might struggle with 1080p ultra quality settings. It isn't that the GPU isn't powerful enough; it just runs out of memory.

[attached screenshot: The Last of Us VRAM benchmark results]

I would not get a GPU under 16GB of memory right now imo.
Remember, AMD has Infinity Cache, which does really well at 1080p and 1440p but seems to get overrun at 4K in some games. That significantly higher-bandwidth cache really helps boost framerates at lower resolutions, so the 8GB may not be limiting it just yet. Before I call a card VRAM-limited, I look for benchmarks where the 4GB and 6GB cards start bunching together and falling short of cards with more VRAM.
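As a rough sketch of why a big on-die cache helps more at lower resolutions: effective bandwidth is roughly a hit-rate-weighted blend of cache and VRAM bandwidth. The bandwidth figures and hit rates below are illustrative assumptions, not AMD's published numbers:

```python
# Rough effective-bandwidth estimate for a cache-assisted GPU memory system.
# All numbers below are illustrative assumptions, not measured values.

GDDR6_BANDWIDTH_GBPS = 512      # assumed VRAM bandwidth (GB/s)
CACHE_BANDWIDTH_GBPS = 1800     # assumed on-die Infinity-Cache-style bandwidth (GB/s)

# Assumed cache hit rates per resolution: higher resolutions thrash the cache more.
hit_rates = {"1080p": 0.75, "1440p": 0.65, "4K": 0.45}

for res, hit in hit_rates.items():
    # Weighted average of fast cache traffic and slower VRAM traffic.
    effective = hit * CACHE_BANDWIDTH_GBPS + (1 - hit) * GDDR6_BANDWIDTH_GBPS
    print(f"{res}: ~{effective:.0f} GB/s effective (assumed hit rate {hit:.0%})")
```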
 
Remember, AMD has Infinity Cache, which does really well at 1080p and 1440p but seems to get overrun at 4K in some games. That significantly higher-bandwidth cache really helps boost framerates at lower resolutions, so the 8GB may not be limiting it just yet. Before I call a card VRAM-limited, I look for benchmarks where the 4GB and 6GB cards start bunching together and falling short of cards with more VRAM.
Interesting, considering the 7900 XTX is 1% slower than the 4080 at 1440p and 1% faster at 4K in a 50+ game benchmark. Looks like the Infinity Cache isn't helping at all at 1440p against the 4080?

 
Interesting, considering the 7900 XTX is 1% slower than the 4080 at 1440p and 1% faster at 4K in a 50+ game benchmark. Looks like the Infinity Cache isn't helping at all at 1440p against the 4080?


Could be game optimizations or other variables causing those specific results. I try to look at reviews from several websites to gauge an average. It might be the specific CPU they used, who knows. Generally, I've seen the Infinity Cache cards do better at 1080p and 1440p in many benchmarks.
 
Could be game optimizations or other variables causing those specific results. I try to look at reviews from several websites to gauge an average. It might be the specific CPU they used, who knows. Generally, I've seen the Infinity Cache cards do better at 1080p and 1440p in many benchmarks.
I tend to agree, but this is a 50+ game benchmark. I don't think any other website even comes close to benchmarking that many games.
 
I tend to agree, but this is a 50+ game benchmark. I don't think any other website even comes close to benchmarking that many games.
There are also game-engine optimizations, driver optimizations, etc. Game devs seem to be getting lazy these days, or they're just sick of working on the game and want to get it out the door and fix issues later. Optimizations could fix the stuttering issues, but only time will tell, and we'll probably have forgotten this discussion by then.
 
Edit:
Ended up buying 7900 XTX.

I'm a casual gamer, but I game at 4K/120Hz. I found that my PS5 gets more gaming use than my PC, but there are games I prefer to play on PC, like the upcoming Diablo 4 and many MMORPGs. And it just so happens that those games are usually not very demanding.

I could probably try to find a 3090 for about the same price, but I want a white card, and that automatically makes it a lot rarer second hand (I'm in the EU).
I'm not anti-DLSS (3), and I think ray tracing performance is important, so that's the reason I'm considering a 4070 Ti instead of a 3090/Ti or AMD. I think AMD will just fall behind in performance as more and more games start implementing RT. AMD really messed up this gen.
Just to bring this all the way back, even though the OP has already bought a card... I still think the move is to wait. Unless you need a card now for something in particular, the entire market is screwed. I'd simply wait for either the mid-gen refreshes or the next-gen parts and the price drops that come with them.

Nvidia and AMD need to know these prices aren't acceptable, and voting with your wallet is the best way to tell them. If selling the lowest number of cards in any quarter ever is a reflection of that, then eventually they will get the message. However, FWIW, I would've done the same as Nebell if I "had to" buy current gen and "had to" buy now.

EDIT: If used/last gen were on the table, I would personally be looking at a 3090 (non-Ti) or a 6900 XT/6950 XT. The AMD parts here offer tremendous value if you're not playing a lot of RT games. If you are, or you need CUDA + VRAM, then the 3090 seems obvious.
 
Heck, if The Last of Us is any indication of the future of PC gaming, it seems 8GB might struggle with 1080p ultra quality settings. It isn't that the GPU isn't powerful enough; it just runs out of memory.

[attached screenshot: The Last of Us VRAM benchmark results]

I would not get a GPU under 16GB of memory right now imo.
I wouldn't use the LOUP1 as an example of anything. The game is a steaming pile of shit from a technical perspective. Just one of the worst PC ports of all time.
 
I wouldn't use the LOUP1 as an example of anything. The game is a steaming pile of shit from a technical perspective. Just one of the worst PC ports of all time.
70% of the negative reviews are from people who (a) don't wait for the shaders to compile or (b) run out of VRAM while trying to run it on Ultra.

Now, DLSS is a mess in the game, no doubt about that. But there are plenty of people playing the game with no issues. Sounds like LOUP1 is the new "Can it play Crysis?"
 
70% of the negative reviews are from people who (a) don't wait for the shaders to compile or (b) run out of VRAM while trying to run it on Ultra.

Now, DLSS is a mess in the game, no doubt about that. But there are plenty of people playing the game with no issues. Sounds like LOUP1 is the new "Can it play Crysis?"
I think we should wait for game and GPU driver optimizations before making that assessment. Crysis likely did challenge the VRAM of GPUs of its day. I heard that LOUP1 had memory leak issues, which would certainly be a problem for 8GB cards.
 
No, but what I am saying is stop expecting a 4K rendering monster from a mid-range card. Everyone acts like the 4070 Ti should render like a 4090 and complains that it doesn't. Tiered products, tiered prices, tiered performance; this isn't rocket science. Who doesn't want a 4K monster GPU for $200-300? The reality is that companies have BOMs, overhead, melting 4090 power-connector issues, etc. to cover financially. Nvidia will charge what people seem willing to pay. Don't like the price? Buy last gen or wait until next gen and hope for better prices. I keep hoping AMD will become a stronger competitor and give Nvidia real competition, which would help lower prices a bit, but many prefer Nvidia to AMD, so here we are.

But we’re not talking about a $200-$300 GPU, we’re talking about an $800 GPU, which was considered high end pricing just two years ago. The 4070Ti’s bus is limiting what that card can do, no two ways about it, and people have a right to feel that’s insufficient at $800, because it is. In what world is $800 “mid-range”?

And remember, this was originally supposed to be Nvidia’s “high end” 4080 12GB for $900, so both their engineering team and their marketing team clearly did think this was suitable for 4K high end gaming before everyone called them on their BS and they had to relaunch the product as something lower tier for a lower price.

On the last point, sure, they’re a corporation, not a charity, and price discovery/price anchoring is exactly their objective (don’t worry, they make plenty of margin on GPUs to cover their overhead). I am taking your advice and not buying this clearly overpriced nonsense. Based on sales numbers and stock levels, so are a lot of other consumers.
 
But we’re not talking about a $200-$300 GPU, we’re talking about an $800 GPU, which was considered high end pricing just two years ago. The 4070Ti’s bus is limiting what that card can do, no two ways about it, and people have a right to feel that’s insufficient at $800, because it is. In what world is $800 “mid-range”?

And remember, this was originally supposed to be Nvidia’s “high end” 4080 12GB for $900, so both their engineering team and their marketing team clearly did think this was suitable for 4K high end gaming before everyone called them on their BS and they had to relaunch the product as something lower tier for a lower price.

On the last point, sure, they’re a corporation, not a charity, and price discovery/price anchoring is exactly their objective (don’t worry, they make plenty of margin on GPUs to cover their overhead). I am taking your advice and not buying this clearly overpriced nonsense. Based on sales numbers and stock levels, so are a lot of other consumers.
I think they may have been banking on DLSS 3 being implemented as quickly as possible to insert frames between rendered frames, so it would behave more like a high-end 4K card in terms of framerates through DLSS trickery. That is more scammy and ridiculous than an $800 mid-range card, IMHO, especially if what became the 4070 Ti really was originally a low-end 4080 "high end" card that relied on DLSS 3 to reach high-end performance once titles eventually rolled out with DLSS 3 support.
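As a rough illustration of what frame generation does and doesn't buy you (the 2x multiplier and 10% overhead are assumptions for the sake of the example, not measured DLSS 3 behavior):

```python
# Illustrative arithmetic for frame generation: displayed FPS roughly doubles,
# but input responsiveness still tracks the natively rendered framerate.
# The 2x multiplier and 10% overhead are assumptions, not measured values.

def frame_generation_estimate(rendered_fps: float, overhead: float = 0.10):
    displayed_fps = rendered_fps * 2 * (1 - overhead)   # one generated frame per rendered frame
    frame_latency_ms = 1000 / rendered_fps               # latency still tied to real frames
    return displayed_fps, frame_latency_ms

for fps in (30, 45, 60):
    shown, latency = frame_generation_estimate(fps)
    print(f"rendered {fps} fps -> ~{shown:.0f} fps displayed, ~{latency:.0f} ms per real frame")
```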

The 4070 Ti is in that awkward spot where it can do 4K but is better suited as a high-end 1440p card for long-term use. That makes it a hard pill to swallow: at this price tag we expect 4K performance, but for now we only get fabulous 1440p performance and decent 4K performance. It is too close to not being good at 4K to buy as a 4K card, especially as 12GB of VRAM starts to become a limitation at 4K and eventually at 1440p.

Really, 4K might be nice to look at, but it requires high-end hardware to render, and there isn't a cheap shortcut to rendering 4K long-term with current-gen hardware. I finally made the jump from 1080p to 1440p and I'm happy with it; even though I could spend a lot more to get 4K, it isn't worth it to me. When the 5090 or 6090 can do 8K gaming at reasonable framerates, I might make the jump to 4K. I buy what I can afford and don't look back with regret.
 
Nvidia should have bumped VRAM to 12GB with the 3070 and possibly kept it at 12GB for the 4070, then bumped it to 16GB for the 5070. They held onto 8GB for the mid-tier card about one generation too long, since they've had 8GB there since the 1070. The 3080 at 10GB was a joke; it should have been 12GB out of the gate. They also bunched the performance of the top-end cards so there was hardly any difference between the 3080 12GB, 3080 Ti 12GB, 3090, and 3090 Ti. Too many top-level cards with less than 10% performance difference when stepping up to the next one.

The xx70 series is really best as a 1440p card, and for 4K with older games. I have a 3060 Ti that does well at 1440p, and at 4K in older games, maybe dropping some details in certain titles. I played Control at 1440p with RT, using DLSS to get roughly 60 FPS; I can't complain. Eventually that 8GB is going to become a handicap at 1440p, and I'll have to resort to scaling down my resolution to keep the higher details, or make some detail concessions, as one has to with any GPU that is getting ever older.
12GB wasn't really a viable option for GA104 because it would have meant leaving two memory channels empty and hurting performance. GA lacks the extra L2 cache that helps AD get by with a narrower memory bus, so realistically the GA104 3070 and 3070 Ti had to be 256-bit configs.
It seems pretty clear from all the various leaks and rumors in 2020/2021/2022 that a 16GB 3070 Ti config was strongly considered, that the 3080 was really close to being a 20GB card, and that the 3080 Ti was almost 20GB or 22GB, based on actual prototype cards, BIOS configs, and leaked info passed between NV and AIBs.
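For anyone wondering why capacity and bus width are tied together, here's a quick sketch assuming the standard 32-bit GDDR6/6X channel and the 1GB/2GB module densities that actually shipped; it's an illustration, not a statement of what Nvidia considered internally:

```python
# Sketch: which VRAM capacities a given bus width allows with uniform memory chips.
# Assumes 32-bit channels and 1GB or 2GB GDDR6/GDDR6X modules (the common densities).

CHANNEL_WIDTH_BITS = 32
CHIP_CAPACITIES_GB = (1, 2)

def possible_configs(bus_width_bits: int):
    channels = bus_width_bits // CHANNEL_WIDTH_BITS   # one chip per channel (no clamshell)
    return {f"{channels} x {cap}GB": channels * cap for cap in CHIP_CAPACITIES_GB}

for bus in (192, 256, 320, 384):
    print(f"{bus}-bit bus -> {possible_configs(bus)}")

# A 256-bit GA104 gets 8GB or 16GB with uniform chips; hitting 12GB means dropping
# to a 192-bit configuration, i.e. leaving two channels empty.
```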

All of those configs would have been way "better" from a longevity perspective, but they would have demanded a different course for the 4000 series: going from a 16GB 3070 Ti to a 12GB 4070 Ti, or a 20GB 3080 to a 16GB 4080, would have been an awkward sell, and Nvidia is concerned with how a card performs on launch day, not later.

GG Nvidia, I guess, for min-maxing the Ada Lovelace architecture to such a degree that the entire RTX 4000 lineup is more difficult to directly compare to previous generations.
 
12GB wasn't really a viable option for GA104 because it would have meant leaving two memory channels empty and hurting performance. GA lacks the extra L2 cache that helps AD get by with a narrower memory bus, so realistically the GA104 3070 and 3070 Ti had to be 256-bit configs.
It seems pretty clear from all the various leaks and rumors in 2020/2021/2022 that a 16GB 3070 Ti config was strongly considered, that the 3080 was really close to being a 20GB card, and that the 3080 Ti was almost 20GB or 22GB, based on actual prototype cards, BIOS configs, and leaked info passed between NV and AIBs.

All of those configs would have been way "better" from a longevity perspective, but they would have demanded a different course for the 4000 series: going from a 16GB 3070 Ti to a 12GB 4070 Ti, or a 20GB 3080 to a 16GB 4080, would have been an awkward sell, and Nvidia is concerned with how a card performs on launch day, not later.

GG Nvidia, I guess, for min-maxing the Ada Lovelace architecture to such a degree that the entire RTX 4000 lineup is more difficult to directly compare to previous generations.
Let's not forget that the GTX 285 had a massive 512-bit bus while the GTX 480 only had a paltry 384-bit bus. How quickly people forget... They have to aim for a performance target, and they design bus width and memory speed so that total memory bandwidth matches the rendering power of the GPU. Can't just slap numbers together and hope it all works to make numbers people happy!
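To put rough numbers on that bus-width point (the specs below are approximate reference figures, so treat the results as ballpark):

```python
# Peak memory bandwidth = (bus width in bytes) x (effective data rate).
# Specs below are approximate reference figures, used for illustration only.

cards = {
    # name: (bus width in bits, effective data rate in GT/s)
    "GTX 285 (GDDR3)":      (512, 2.48),
    "GTX 480 (GDDR5)":      (384, 3.70),
    "RX 6700 XT (GDDR6)":   (192, 16.0),
    "RTX 4070 Ti (GDDR6X)": (192, 21.0),
}

for name, (bus_bits, rate_gtps) in cards.items():
    bandwidth_gbps = (bus_bits / 8) * rate_gtps   # GB/s
    print(f"{name:22s} {bus_bits:3d}-bit @ {rate_gtps:5.2f} GT/s -> ~{bandwidth_gbps:.0f} GB/s")

# A narrower bus with much faster memory (plus large on-die caches on newer parts)
# can still deliver far more bandwidth than an old 512-bit design.
```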

I guess gamers are worse than engineers in the I-know-it-all department!
 
Can't just slap numbers together and hope it all works to make numbers people happy!
Yeah, it's just fun to poke at sometimes, because some people were making a big stink about the 6700 XT having a 192-bit bus. Then the 4070 Ti comes out and, oh... would you look at that.
 
12GB wasn't really a viable option for GA104 because it would have meant leaving two memory channels empty and hurting performance. GA lacks the extra L2 cache that helps AD get by with a narrower memory bus, so realistically the GA104 3070 and 3070 Ti had to be 256-bit configs.
It seems pretty clear from all the various leaks and rumors in 2020/2021/2022 that a 16GB 3070 Ti config was strongly considered, that the 3080 was really close to being a 20GB card, and that the 3080 Ti was almost 20GB or 22GB, based on actual prototype cards, BIOS configs, and leaked info passed between NV and AIBs.

All of those configs would have been way "better" from a longevity perspective, but they would have demanded a different course for the 4000 series: going from a 16GB 3070 Ti to a 12GB 4070 Ti, or a 20GB 3080 to a 16GB 4080, would have been an awkward sell, and Nvidia is concerned with how a card performs on launch day, not later.

GG Nvidia, I guess, for min-maxing the Ada Lovelace architecture to such a degree that the entire RTX 4000 lineup is more difficult to directly compare to previous generations.

I'm not surprised they skimped on VRAM with the 3000 series. If they'd added 20GB to the 3080, that card would've ruined 3090 sales.
What they did with the 4000 series is give the 4080 more (enough) VRAM but also make it significantly slower than the 4090. And I'm looking forward to the next-gen rumors, because Nvidia will have to make something exceptional to beat the 4090.

I'm pretty happy with my 7900 XTX Red Devil. It's no 4090, but it also cost me €500 less and it's not far behind. It's significantly faster than the 4080 in raster.
 
6750 XT 12GB = $399
6800 XT 16GB = $560

Those prices are pretty good.
 
I think they may have been banking on DLSS 3 being implemented as quickly as possible to insert frames between rendered frames, so it would behave more like a high-end 4K card in terms of framerates through DLSS trickery. That is more scammy and ridiculous than an $800 mid-range card, IMHO, especially if what became the 4070 Ti really was originally a low-end 4080 "high end" card that relied on DLSS 3 to reach high-end performance once titles eventually rolled out with DLSS 3 support.

The 4070 Ti is in that awkward spot where it can do 4K but is better suited as a high-end 1440p card for long-term use. That makes it a hard pill to swallow: at this price tag we expect 4K performance, but for now we only get fabulous 1440p performance and decent 4K performance. It is too close to not being good at 4K to buy as a 4K card, especially as 12GB of VRAM starts to become a limitation at 4K and eventually at 1440p.

Really, 4K might be nice to look at, but it requires high-end hardware to render, and there isn't a cheap shortcut to rendering 4K long-term with current-gen hardware. I finally made the jump from 1080p to 1440p and I'm happy with it; even though I could spend a lot more to get 4K, it isn't worth it to me. When the 5090 or 6090 can do 8K gaming at reasonable framerates, I might make the jump to 4K. I buy what I can afford and don't look back with regret.

I agree with you on the DLSS 3 part. I think Nvidia was relying on that to fill in some performance gaps this generation, but it's not something you can bank on because we don't know how it's going to be implemented and it's still not perfect. Raw power on the front end will always be better than trying to use AI as an approximation to fill in the gaps.

In any case, the fact of the matter remains that the initial intention for the 4070Ti was that it was actually a $900 4080 with "less VRAM" and, oh yeah we totally forgot to mention a castrated 192-bit wide memory bus that limits its performance at 4K with key features Nvidia is telling you that you MUST make use of. The argument that Nvidia engineers factored this in and dismissed it because the card was intended to be used as a 1440P solution is a fallacy, and no amount of gaslighting from Nvidia's marketing team will make that untrue.

What's going on here is that Nvidia is trying to now shift the price anchors up the stack, which is fine from their viewpoint if the consumer decides they're willing to pay for it, but everyone was right to call them out on their BS and they were forced to do the recall specifically because no one was buying what they were selling with regard to product placement and 4K performance. It's not unreasonable for a consumer to expect good 4K performance on features Nvidia is marketing as key (ray tracing, etc) when they're shelling out $800 (initially $900 pre-recall) for the card regardless of what you or I think about 4K as a practical gaming resolution.
 
Wise move.

The 7900 XTX competes with at least a 4080 and up at all resolutions, especially 2K/4K+. At ~$800 there's nothing close. (A bit surprised the media doesn't make more noise about this...)

I would have also considered the 3090 since it's around that price range, but it just gets absolutely smoked by the XTX. DLSS/RT is good and all, but when your card is down ~20-30% or more in general at your playing resolution... it ain't that good.
lol, a 7900 XTX doesn't compete with a 4090; it's not really close, a 4090 is almost 50% faster.
I would also like to add what TPU found when testing VRAM in the new Hogwarts game (which is now the benchmark for a lot of future games):

[attached chart: Hogwarts Legacy VRAM usage from the TPU review]

This is directly taken from the review:

The ray tracing performance hit can be reduced by lowering the ray tracing quality setting from the "Ultra" default, which helps a lot with performance, especially on AMD. RX 7900 XTX can now reach 60 FPS with ray tracing "low," at 1080p. Not impressive, but still a huge improvement over 28 FPS. In terms of VRAM usage, Hogwarts Legacy set a new record, we measured 15 GB VRAM allocated at 4K with RT, 12 GB at 4K with RT disabled. While these numbers seem shockingly high, you have to put them in perspective. Due to the high performance requirements you'll definitely not be gaming with a sub-16 GB card at 4K. 8 GB+ at 1080p is certainly a lot, but here, too, owners of weaker GPUs will have to dial down their settings anyway. What's always an option is to use the various upscaling methods like DLSS, FSR and XeSS, which lower VRAM usage, too, by running at a lower internal rendering resolution.
Shouldn't we be using well-optimized games for a benchmark, not just "demanding," poorly optimized games? What do I know, maybe I'm wrong; it just seems backwards.
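On TPU's last point about upscaling and VRAM, here's a quick sketch of how much the internal render resolution drops per mode. The scale factors are the commonly published per-axis ratios for DLSS/FSR quality modes and should be treated as approximate:

```python
# Upscalers render internally at a fraction of the output resolution, which is
# why they also reduce VRAM pressure. Scale factors below are the commonly
# published per-axis ratios for DLSS/FSR quality modes (approximate).

modes = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50, "Ultra Performance": 0.333}

def internal_resolution(out_w: int, out_h: int, scale: float):
    return round(out_w * scale), round(out_h * scale)

out_w, out_h = 3840, 2160  # 4K output
for mode, scale in modes.items():
    w, h = internal_resolution(out_w, out_h, scale)
    pixel_ratio = (w * h) / (out_w * out_h)
    print(f"{mode:17s}: renders {w}x{h} (~{pixel_ratio:.0%} of the output pixels)")
```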
 
lol, a 7900 XTX doesn't compete with a 4090; it's not really close, a 4090 is almost 50% faster.
Unless you are talking about ray tracing, a 4090 sure as hell is not anywhere near 50% faster.

EDIT: The average on TechPowerUp at 4K was 23% for the Founders Edition 4090 over the reference 7900 XTX.
 
Unless you are talking about ray tracing, a 4090 sure as hell is not anywhere near 50% faster.

EDIT: The average on TechPowerUp at 4K was 23% for the Founders Edition 4090 over the reference 7900 XTX.
More than twice as fast once you factor in DLSS 3. Also much faster in games that don't have FSR but do have DLSS 2. No reasonable person buys a high-end card only to not use ray tracing :rolleyes:.
 
More than twice as fast once you factor in DLSS 3. Also much faster in games that don't have FSR but do have DLSS 2. No reasonable person buys a high-end card only to not use ray tracing :rolleyes:.
Lol, what the hell are you rolling your eyes about now? Some people who buy high-end cards don't care much about ray tracing, and he made no mention of it. All I was saying is that it's nowhere near 50% faster if he's referring to raster.
 
Lol, what the hell are you rolling your eyes about now? Some people who buy high-end cards don't care much about ray tracing, and he made no mention of it. All I was saying is that it's nowhere near 50% faster if he's referring to raster.
You mentioned it. Also, :ROFLMAO: at anyone spending this much on these cards who doesn't care about high-end graphics features. They clearly don't need such a powerful card then. :rolleyes: PS: DLSS FTW. Same with DLDSR and DLAA. Faster and a full gen ahead of AMD. :D
 
You mentioned it. Also, :ROFLMAO: at anyone spending this much on these cards who doesn't care about high-end graphics features. They clearly don't need such a powerful card then. :rolleyes:
Some people turn on all the ray tracing just to check it out and then turn it off. Some people seriously just don't give a shit about it and just want the highest FPS in their games. Rolling your eyes does not change anything.
 
Some people turn on all the ray tracing just to check it out and then turn it off. Some people seriously just don't give a shit about it and just want the highest FPS in their games. Rolling your eyes does not change anything.
:rolleyes: I notice that you dodged the part about dlss ;).
 
lol a 7900 XTX doesn't compete with a 4090, not really close a 4090 is almost 50% faster.

Shouldn't we be using well-optimized games for a benchmark, not just "demanding," poorly optimized games? What do I know, maybe I'm wrong; it just seems backwards.
Actually, the 7900 XTX does compete with the 4090 in certain games, even destroying it, but those are limited cases. It seems that if games were better optimized for AMD hardware, their cards would perform a bit better than they currently do. Also, someone did some OC experimenting and found that the 7900 XTX, given enough voltage, can compete with a 4090, but power draw becomes an issue. I don't care if Nvidia has the fastest card; I cannot afford it. I buy what gives me the most bang for my buck, which has typically been AMD.
 
Actually, the 7900 XTX does compete with the 4090 in certain games, even destroying it, but those are limited cases. It seems that if games were better optimized for AMD hardware, their cards would perform a bit better than they currently do. Also, someone did some OC experimenting and found that the 7900 XTX, given enough voltage, can compete with a 4090, but power draw becomes an issue. I don't care if Nvidia has the fastest card; I cannot afford it. I buy what gives me the most bang for my buck, which has typically been AMD.
We should go by the rule, not the exceptions, and the fact that you can push the card closer in performance while also making it unstable, with a chance of bricking components, doesn't mean all that much to me. Nvidia commands a premium for a reason, and it has a stranglehold on the PC GPU market, even if I don't necessarily like that, because their pricing is getting a little out of hand.
 