4060 Ti 16GB due to launch next week (July 18th)

Dev problem or Nvidia problem? A $400 video card should be able to max out games at 1080p.
Why shouldn't devs include settings that only the more expensive cards can run?

What is lost by a game having more settings? A $400 video card should be able to run a game perfectly fine at 1080p, yes, but does that mean a game shouldn't let the player use 4 bounces instead of 2 when path tracing?

That's the reason they hide the settings while making it well known how to enable the hidden ones right away.

If that mode were called "impossible" and enabled from a command-line argument, no one would say that. That could be why they called it "cinematic": so no one is surprised if it is not meant to be playable on current hardware.

I'm saying the AMD card will have 12+ GB VRAM.

A 7900 XTX does not reach 20 fps when maxing out the game at 1080p, and many would say even a 4080 can't play that game maxed out at 1080p. So the 4060 Ti having only 8 GB changes nothing relevant about running that game maxed out; give it 24 GB and it still has no chance of doing so:
[attached chart: path tracing performance, 1920×1080]
 
Dev problem or Nvidia problem? A $400 video card should be able to max out games at 1080p.
Guess I'm crazy, though. Maybe the monkey game is the new Crysis; haven't played it.
It's the evolution of games. Textures in new games are huge compared to stuff from say 5 years ago.

The cheaper cards don't have the budget for the RAM needed to play new games at max texture quality. I think this is simply how it is going to be from now on, and games will just keep getting more realistic.

Older games re-used textures 10,000 times. Gamers today expect more variation and more unique textures; NPCs have more unique faces with less re-use. Half-Life 1 had what, 2 scientist head models, 1 policeman model, 1 soldier model, 1 alien model for each species? Now games like that have all-unique NPCs. It costs game install size and the amount of VRAM needed to store them all. The tradeoff now is that the cheaper cards have to use lower-resolution textures to fit it all into memory. Gamers still get games that look more realistic, with more unique textures and more unique NPCs, even if they play at High texture quality instead of Ultra or Cinematic. The requirements are shifting. In 10 years, all GPUs will probably come with 24 GB of VRAM as the minimum.
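To put rough numbers on that, here is a minimal back-of-envelope sketch. The figures are my own illustrative assumptions (BC1-style compression at ~0.5 bytes per texel, a full mip chain adding ~1/3, a 3-texture 2K material set per head, and the 4 vs. 200 head counts), not data from any specific game or engine:

```python
# Rough texture-memory estimate: a few shared head models vs. hundreds of unique ones.

def texture_bytes(width, height, bytes_per_texel=0.5, mip_factor=4/3):
    """Approximate VRAM footprint of one block-compressed texture with mips."""
    return width * height * bytes_per_texel * mip_factor

per_head_set = 3 * texture_bytes(2048, 2048)   # one 2K albedo/normal/roughness set per NPC head

shared_heads = 4 * per_head_set      # Half-Life-era re-use: a handful of shared models
unique_heads = 200 * per_head_set    # modern game: hundreds of unique faces

gib = 1024 ** 3
print(f"4 shared head sets:   {shared_heads / gib:.2f} GiB")
print(f"200 unique head sets: {unique_heads / gib:.2f} GiB")
```

Even with compression, the unique-asset approach lands in the gigabyte range for character heads alone, before any environment textures are counted.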
 
It's the evolution of games. Textures in new games are huge compared to stuff from say 5 years ago.

The cheaper cards don't have the budget for the RAM needed to play new games at max texture quality. I think this is simply how it is going to be from now on, and games will just keep getting more realistic.
I do not think that's what's going on with Black Myth; at Cinematic 1440p it fits well under 8 GB of VRAM, and at 4K upscaled there is no difference between 8 and 16 GB:
[attached charts: VRAM usage and 3840×2160 minimum fps]


Running full path tracing on top of it is what pushes it over the top.
 
I do not think that's what's going on with Black Myth; at Cinematic 1440p it fits well under 8 GB of VRAM, and at 4K upscaled there is no difference between 8 and 16 GB:

Running full path tracing on top of it is what pushes it over the top.
There seems to be a bit of a reversal in VRAM usage in AAA games this last year. The 8 GB 4060 Ti looks to be able to match the min fps of the 16 GB version, at least up to 1440p.
 
Yeah, Frame Gen consumes about 1.5 GB of VRAM. But that graph shows exactly what I am talking about: low-quality textures = 5 GB of VRAM, Cinematic textures = 7 GB. It ranges from 41% to 53% more RAM needed going from low-quality textures to Cinematic.

The higher the resolution, the more VRAM needed as well. More pixels = more VRAM.
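A quick sketch of that resolution scaling: full-resolution render targets grow linearly with pixel count. The buffer count and bytes per pixel below are my own assumptions for illustration, not measured engine values:

```python
# "More pixels = more VRAM": assume ~10 full-resolution buffers (G-buffer + post) at ~8 bytes/pixel.

def render_targets_gb(width, height, buffers=10, bytes_per_pixel=8):
    return width * height * buffers * bytes_per_pixel / 1024 ** 3

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K":    (3840, 2160)}.items():
    print(f"{name}: ~{render_targets_gb(w, h):.2f} GB in render targets")
```

Under those assumptions, 4K carries roughly 4× the render-target cost of 1080p, and that sits on top of the texture pool rather than replacing it.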

Not saying that a game can't be well coded and fit within the memory constraints of lower-end cards... but most game companies don't spend much effort to achieve that. Games are an order of magnitude more complex to make than a decade ago. A lot of the under-the-hood optimization just isn't feasible any more. They use something like Unreal Engine 5 and work within what the engine can do. If it isn't super efficient in its use of VRAM, we just get to deal with it.
 
There seems to be a bit of a reversal in VRAM usage in AAA games this last year. The 8 GB 4060 Ti looks to be able to match the min fps of the 16 GB version, at least up to 1440p.
UE5 does some insane things with VRAM; when paired with DLSS and the other bells and whistles you can stay within an 8 GB budget, unless you are trying 4K, where you need at least 12.
 
Yeah, Frame Gen consumes about 1.5 GB of VRAM. But that graph shows exactly what I am talking about: low-quality textures = 5 GB of VRAM, Cinematic textures = 7 GB. It ranges from 41% to 53% more RAM needed going from low-quality textures to Cinematic.

The higher the resolution, the more VRAM needed as well. More pixels = more VRAM.

Not saying that a game can't be well coded and fit within the memory constraints of lower-end cards... but most game companies don't spend much effort to achieve that. Games are an order of magnitude more complex to make than a decade ago. A lot of the under-the-hood optimization just isn't feasible any more. They use something like Unreal Engine 5 and work within what the engine can do. If it isn't super efficient in its use of VRAM, we just get to deal with it.
Something else is playing into that big of a jump in VRAM usage, maybe other higher settings being applied? I just tested this on my system and I see a 400 MB difference between Low and Cinematic textures, in the benchmark anyway... or maybe it's my scaling, I don't know.
 
It depends on the game. Plus it all kind of starts multiplying. So if you add in ray tracing, it's like another 20% VRAM needed, i.e., multiplying by 1.2. So 5 GB × 1.2 = 6 GB for low-quality textures + RT, or 7 GB × 1.2 = 8.4 GB for Cinematic textures ray traced. That's more than 8 GB, so not something an 8 GB card will be good at. So turn off RT, or choose High instead of Ultra, and then it fits and all is well.
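A minimal sketch of that multiplier logic, treating the ~1.2× ray-tracing factor and the ~1.5 GB Frame Generation overhead from the earlier post as rough assumptions rather than measured constants:

```python
# Fit check against an 8 GB budget using the rough factors discussed above.

VRAM_BUDGET_GB = 8.0
RT_FACTOR = 1.2          # ~20% extra for ray tracing (assumption from the post)
FRAME_GEN_GB = 1.5       # rough Frame Generation overhead (assumption)

def vram_needed(texture_gb, rt=False, frame_gen=False):
    need = texture_gb * (RT_FACTOR if rt else 1.0)
    if frame_gen:
        need += FRAME_GEN_GB
    return need

for label, tex in [("Low", 5.0), ("Cinematic", 7.0)]:
    for rt in (False, True):
        need = vram_needed(tex, rt=rt)
        verdict = "fits" if need <= VRAM_BUDGET_GB else "over budget"
        print(f"{label:9s} RT={rt!s:5s}: {need:4.1f} GB -> {verdict}")
```

With these numbers, everything fits except Cinematic textures plus RT (8.4 GB), which is exactly the case where an 8 GB card falls over.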
 
It depends on the game. Plus it all kind of starts multiplying. So if you add in ray tracing, it's like another 20% VRAM needed, i.e., multiplying by 1.2. So 5 GB × 1.2 = 6 GB for low-quality textures + RT, or 7 GB × 1.2 = 8.4 GB for Cinematic textures ray traced. That's more than 8 GB, so not something an 8 GB card will be good at. So turn off RT, or choose High instead of Ultra, and then it fits and all is well.
AKA not something you should be doing on an 8 GB card in the first place...
yup
 
Can't talk about the dev/SW side of things, but with pricing from Nvidia and AMD, it's like everything else in a capitalist world:
the maker/seller determines what specs a product has and what its (MSRP) price is.

People need to buy what their wallet allows, and stop fixating on getting a certain tier (e.g., xx70/80), or expecting what a certain tier should cost and/or be able to do (fps-wise) at a certain resolution, while never applying the same thinking to other things.

How many people go to Porsche and say you should be able to get a 300+ HP, 150+ mph four-door car for the same price as what any Asian brand charges, or that a mansion sitting on 20 acres they are interested in shouldn't cost more than the average single-family home in the area?
right.

You see a product and a price; you buy, or you don't, as simple as that.

And since some people tend to assume things I never said: this does not imply I like the cost of parts or am happy with the tiers offered.
 
People need to buy what their wallet allows, and stop fixating on getting a certain tier (e.g., xx70/80), or expecting what a certain tier should cost and/or be able to do (fps-wise) at a certain resolution, while never applying the same thinking to other things.

And yet I can get a mid-range CPU for about the same price since 2007 with Core 2 Duos, namely $200-250, especially since the first- and second-gen i5 chips. Ryzen then brought $200 back after Intel had inched up to $250 by 2017. We've been stuck at 1080p since about 2010-2013, when it became mainstream and then the entry-level resolution. Monitors have finally come down in price for 1440p-4K, and yet a $400 (8 GB VRAM) GPU won't really guarantee 2K or 4K at 60 fps. You can buy $450 4K TVs at Costco these days. The only thing really making it work is "upscaling," which means we're rendering the game at a lower resolution and then scaling up. Thus it's a $400 card for maxed-out 1080p that is really 720p. Oh, and since most of those cards still have 8 GB of VRAM, the new frame gen, ray tracing, and upscaling techs don't work so great with low RAM. So much for advanced tech: advanced tech you end up paying to beta test, as we're seeing with the Ryzen 9000 series, or tech that over-volts itself to an early death, as with Intel's Raptor Lake.

So yes, I see the products and don't buy. Then I don't bother to buy games that push any limits until they are dirt cheap, especially because many of them are incomplete or a buggy mess at launch no matter what hardware power you have. Even with your $500 PS5 plus a $120 Star Wars Outlaws game, you get told to start over again because of a game patch and save corruption.

The only real counter-argument to be had is the economies-of-scale one, but then the pricing and supply/demand curves show GPUs get pumped out for crypto or AI just fine, all while helping to keep prices, and thus profit margins, higher afterwards.
 
And yet I can get a mid-range CPU for about the same price since 2007 with Core 2 Duos, namely $200-250, especially since the first- and second-gen i5 chips.

GPUs exploded significantly more than CPUs over that time.

A Core 2 Duo E8400 was a 104 mm² die with 410 million transistors and an MSRP of $183.
A 7600X is a 71 mm² logic die + a 122 mm² IO die, for 6,570 million logic transistors + 3,400 million IO transistors, with an MSRP of $300.


An 8600 GTS was a 169 mm² die with 289 million transistors, launched at $200 USD.
A 4070 has around 35,800 million transistors and launched at $600 USD. That's about 125 times more transistors; FP32 went from 92.8 GFLOPS to 29.15 TFLOPS, roughly a 300× jump; and you get 48 times more VRAM at 15 times the bandwidth.

Price per million transistors, 2007 -> 2022:
CPU: $0.45 -> $0.03
GPU: $0.69 -> $0.017

Performance:
PassMark ST: 1237 -> 4200 (~3.5× faster)
PassMark MT: ? -> 36,000 (say 20 times faster)

In a way, GPUs went from more expensive to much cheaper than CPUs for what you buy. And the GPU comes with memory, a very fancy cooling solution, and power delivery that has gotten quite big; it is like a little motherboard and computer on the side, while the CPU increasingly comes all alone, without even a cooler.
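For reference, the price-per-million-transistor figures above fall straight out of the launch MSRPs and transistor counts quoted in this post (all approximate):

```python
# Price per million transistors, using the MSRPs and transistor counts quoted above.

parts = {
    # name: (million transistors, launch MSRP in USD)
    "Core 2 Duo E8400":  (410, 183),
    "Ryzen 5 7600X":     (6_570 + 3_400, 300),   # logic die + IO die
    "GeForce 8600 GTS":  (289, 200),
    "GeForce RTX 4070":  (35_800, 600),
}

for name, (mtransistors, msrp) in parts.items():
    print(f"{name}: ${msrp / mtransistors:.3f} per million transistors")
```

That reproduces the roughly $0.45 -> $0.03 (CPU) and $0.69 -> $0.017 (GPU) drop, before even counting the memory, VRM, and cooler a GPU also ships with.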

GPU won't really guarantee 2K or 4K at 60 fps.
For good reason (the same way console game makers do not choose that target), it is purely a game-dev choice here. They could, if they wanted, make games that run at 4K 120 fps on a $200 RX 6600; many fast-twitch competitive games do. No amount of GPU could guarantee anything: a game can always be harder to run, and game devs will make them bigger and better looking no matter what.
 
Ranulfo
There is no connection between the cost of one HW part and the cost of another, no matter how often you compare them.
Every "computer" needs a CPU to work, while that's not true for a dGPU.

You're ignoring that most companies exist to make money, not to please buyers with the latest and biggest hardware at the lowest cost.
If you had a bakery selling, say, artisan donuts for $2-5 apiece, would you lower the price to 50 cents because supermarkets and other donut shops sell them for less per piece?
right.
 
You're ignoring that most companies exist to make money, not to please buyers with the latest and biggest hardware at the lowest cost.
If you had a bakery selling, say, artisan donuts for $2-5 apiece, would you lower the price to 50 cents because supermarkets and other donut shops sell them for less per piece?
right.

Well, if you want to get and/or keep customers, you do have to please those buyers/customers/consumers. Artisan donuts are gone in my local area for the most part. Fewer people buying pastries, I guess, and the local grocery stores beat them on price. Nvidia has a captive market for the moment, so they can get away with their pricing and design choices.

Transistor count is just another part of a bill-of-materials, aka cost-to-manufacture, argument. So at least you're not making the inflation one.
 
Well, it comes down to the same thing: if you have a product that sells at a certain price, why would you want to (ruin your sales and) lower prices, from a business point of view?
And as one of 2 (3) dGPU chip makers, they don't worry about "keeping" customers.
 
Few notes from a review a few months ago:

  1. Ghost of Tsushima
    1. 0:00 Ghost of Tsushima 4K Very High - stuttery
    2. 4:35 Ghost of Tsushima 4K Very High DLSS P - 69 vs 39 avg & 62 vs 34 1%
    3. 5:58 Frame Generation uses more VRAM - Ghost of Tsushima 4K High DLSS Q FG - perf decreases for 8gb. 1% lows are 252% faster on 16gb
    4. 7:19 Ghost of Tsushima 1440p Very High DLSS Q FG - 16gb better
  2. Horizon Forbidden West
    1. 8:14 Horizon Forbidden West 4K Very High - 21% faster avg, 58% faster 1% lows. Spikes in 8gb
    2. 8:41 Horizon Forbidden West 4K Very High DLSS P - 16gb doubles perf from above, 8gb can't
    3. 9:10 Horizon Forbidden West 4K High - 9% faster avg, 39% faster 1%
    4. 9:36 Horizon Forbidden West 4K High DLSS P - 16gb reaches 63fps but 8gb struggles
    5. 10:03 Horizon Forbidden West 4K Medium DLSS P - 13% faster in 1% lows
    6. 10:30 Horizon Forbidden West 1440p Very High - 13% faster avg, 42% faster 1% lows
    7. 10:57 Horizon Forbidden West 1440p Very High DLSS Q - 80 vs 67 avg & 68 vs 42 1% low
    8. 12:08 Horizon Forbidden West 1080p Very High - difference in performance
  3. Hellblade 2 (UE5 non-nvidia branch)
    1. 12:43 Hellblade 2 4K High - 8gb crippled
  4. Avatar 4K (engine streams diff texture quality for diff vram)
    1. 16:17 Avatar 4K High - 24 vs 19 1%
  5. Alan Wake 2
    1. 17:47 Alan Wake 2 4K High DLSS P - 8gb crushed
    2. 18:06 Alan Wake 2 4K Low DLSS P - 61 vs 49 avg, 45 vs 37 1%
    3. 18:29 Alan Wake 2 1440p RT High DLSS B - 17% avg, 35% 1%
    4. 18:56 Alan Wake 2 1440p RT High DLSS B FG - 57 vs 42 avg, 48 vs 33 1%
    5. 19:27 Alan Wake 2 1440p High - 10% better
    6. 19:47 Alan Wake 2 1440p High DLSS Q - 20% better
    7. 19:58 Alan Wake 2 1080p RT High DLSS Q - 17% faster 1%
    8. 20:13 Alan Wake 2 1080p RT High DLSS Q FG - 10% faster avg, 27% faster 1%
    9. 20:44 Alan Wake 2 1080p High - 10% better
  6. Resident Evil 4
    1. 21:02 Resident Evil 4 Remake 4K Max - 76% faster 1%, 17% faster avg, massive stutters
    2. 21:40 Resident Evil 4 1440p Max - 1% lows of 5 fps, massive stutters
    3. 22:23 Resident Evil 4 1080p Max - massive frametime spike in 1% lows
  7. Starfield
    1. 23:05 Starfield 4K Ultra DLSS P FG - 60% faster avg, 180% faster 1%
  8. Cyberpunk
    1. 24:26 Cyberpunk 2077 1440p RT Overdrive DLSS P - 6% faster avg, 14% faster 1% low
    2. 24:57 Cyberpunk 2077 1440p RT Overdrive DLSS P FG - 56 vs 26 1% low. Massive spikes/stutters
    3. 25:23 Cyberpunk 2077 1080p RT Overdrive DLSS Q - 52 vs 42 avg, 44 vs 34 1% -
    4. 25:50 Cyberpunk 2077 1080p RT Overdrive DLSS Q FG - 58% better 1% lows

View: https://youtube.com/watch?v=nrpzzMcaE5k
 
Few notes from a review a few months ago: [benchmark list quoted in full above]

View: https://youtube.com/watch?v=nrpzzMcaE5k


Why are people expecting Ultra quality/performance in graphically demanding games (or AMD-sponsored games) with a low-tier 8 GB graphics card (even if it is new)? Alan Wake 2 and CP2077?? The best-looking, most demanding ray tracing/path tracing games to date. RE4? An AMD-sponsored title, and those are notorious for having shitty RT visual quality/performance.
Still applies:
You can buy a new or old $100, $300, $500, $700, or $1700 GPU, and if the game you are playing isn't reaching the FPS you want, you turn down a setting or two, check performance, and repeat until you get the performance you want.
... There are thousands of games that do play at max settings even on 4-year-old cards, or new budget cards. It has never been the default expectation for demanding games to run at Ultra quality, in 4K, on low-tier graphics cards.
 
Why are people expecting Ultra quality/performance in graphically demanding games (or AMD-sponsored games) with a low-tier 8 GB graphics card (even if it is new)? Alan Wake 2 and CP2077?? The best-looking, most demanding ray tracing/path tracing games to date. RE4? An AMD-sponsored title, and those are notorious for having shitty RT visual quality/performance.
Because when people bought GTX 970s, GTX 1060s, R9 290s/390s, and RX 480s/580s, they fully expected to play current games at 1080p with Ultra settings. Whereas now they can expect to pay $400 and maybe play at High settings with FSR or DLSS. It's not like CP2077 is a new game either, as it was released 4 years ago.
 
Why are people expecting Ultra Quality/performance in graphically demanding games

The $400 price tag and the size of the GPU will raise expectations quite a bit. Some of the tests here are purely academic; 30 fps vs 15 fps does not matter at all.

62 fps on the 16 GB vs 30 on the 8 GB model, that is a valid "would have been better with 12-16 GB" scenario.

they fully expected to play current games at 1080p with Ultra settings.
Not really, not 2 years after launch with a 1060, especially not the 3 GB model. When I bought a 1060 6GB, around the time it was the same age the 4060 Ti is now, I could not play all games at Ultra settings; it was sometimes under 70 fps average.

When people bought GPUs in an era of games made for the very weak PS3 and PS4 GPUs, in the second half of the 2010s, yes, games were quite a bit easier to run relative to PC hardware.

It's not like CP2077 is a new game either, as it was released 4 years ago.
It is not like a 4060 Ti does not play CP2077 at 1080p perfectly fine; this is a purely academic tech-curiosity experiment of trying to run it with RT Overdrive on.

At Ultra settings, 1080p, a 4060 Ti 8GB will play CP2077 like a 1060 played games back in the day:
[attached chart: Cyberpunk 2077, 1920×1080]
 
It is not like a 4060 Ti does not play CP2077 at 1080p perfectly fine; this is a purely academic tech-curiosity experiment of trying to run it with RT Overdrive on.

At Ultra settings, 1080p, a 4060 Ti 8GB will play CP2077 like a 1060 played games back in the day:
FG is the kryptonite for 8 GB cards.

25:50 Cyberpunk 2077 1080p RT Overdrive DLSS Q FG - 58% better 1% lows
 
Why are people expecting Ultra quality/performance in graphically demanding games (or AMD-sponsored games) with a low-tier 8 GB graphics card (even if it is new)? Alan Wake 2 and CP2077?? The best-looking, most demanding ray tracing/path tracing games to date. RE4? An AMD-sponsored title, and those are notorious for having shitty RT visual quality/performance.
Still applies:

... There are thousands of games that do play at max settings even on 4-year-old cards, or new budget cards. It has never been the default expectation for demanding games to run at Ultra quality, in 4K, on low-tier graphics cards.
The better question is why are we still releasing 8 GB cards on anything that isn't low-end garbage in 2024? It's 2024, FFS, and the GTX 1070 with 8 GB released for $379 way back in 2016. While I certainly didn't expect VRAM to continually double at every segment as we reach the limits of chip density, etc., it can certainly be argued we have stagnated. For as much as we like to talk about how much Nvidia is advancing RT, DLSS, frame gen, and other technologies, it sure isn't advancing much in price segments that aren't the high to enthusiast end, which has also only ballooned in price.

VRAM ain't all that expensive. AD103 with a 256-bit setup and AD104 with a 192-bit setup ain't all that big as chips either: 379 mm² and 294 mm², respectively. There's really no reason why 60-class cards can't advance like everything else, except as a "maximize profits" move by an Nvidia lacking any real competition that impacts their bottom line in any meaningful way. And there's really no excuse for 8 GB cards in $300-$400 products in 2024, considering $400 used to buy you the 3rd product down the stack.
 
The better question is why are we still releasing 8 GB cards on anything that isn't low-end garbage in 2024? It's 2024, FFS, and the GTX 1070 with 8 GB released for $379 way back in 2016. While I certainly didn't expect VRAM to continually double at every segment as we reach the limits of chip density, etc., it can certainly be argued we have stagnated. For as much as we like to talk about how much Nvidia is advancing RT, DLSS, frame gen, and other technologies, it sure isn't advancing much in price segments that aren't the high to enthusiast end, which has also only ballooned in price.

VRAM ain't all that expensive. AD103 with a 256-bit setup and AD104 with a 192-bit setup ain't all that big as chips either: 379 mm² and 294 mm², respectively. There's really no reason why 60-class cards can't advance like everything else, except as a "maximize profits" move by an Nvidia lacking any real competition that impacts their bottom line in any meaningful way. And there's really no excuse for 8 GB cards in $300-$400 products in 2024, considering $400 used to buy you the 3rd product down the stack.
Buy AMD.
 
Because when people bought GTX 970s, GTX 1060s, R9 290s/390s, and RX 480s/580s, they fully expected to play current games at 1080p with Ultra settings.
Those days are long gone and I wouldn't bet they will ever return.
Whereas now they can expect to pay $400 and maybe play at High settings with FSR or DLSS. It's not like CP2077 is a new game either, as it was released 4 years ago.
It's still ranked as one of the best-looking games ever made, even being 4 years old. This game isn't going to run on potato-powered anything.
...

62 fps on the 16 GB vs 30 on the 8 GB model, that is a valid "would have been better with 12-16 GB" scenario.
Buy the more expensive card with a faster GPU and/or more VRAM. Kind of makes the point about expectations.
The better question is why are we still releasing 8GB cards on anything that isn't low-end garbage in 2024?
I think what used to be thought of as low-end garbage has gotten more expensive along with the rest of the stack.
VRAM ain't all that expensive. AD103 with a 256-bit setup and AD104 with a 192-bit setup ain't all that big as chips either: 379 mm² and 294 mm², respectively. There's really no reason why 60-class cards can't advance like everything else, except as a "maximize profits" move by an Nvidia lacking any real competition that impacts their bottom line in any meaningful way.
Well, it does add to the cost of the card. What's gotten much, much more expensive is the GPU silicon, so cuts have to be found somewhere to keep the card priced in range for where it lands in the stack.
And there's really no excuse for 8 GB cards in $300-$400 products in 2024, considering $400 used to buy you the 3rd product down the stack.
See above. Those days are long gone. Prices have shifted upwards for the entire stack.
 
Nah. I complain, but Nvidia has the superior tech. AMD doesn't just get to have a pity buy because they are playing "catch up". But I likewise don't feel like paying Nvidia's "monopoly tax". So for now I am just not going to buy.
 
Nah. I complain, but Nvidia has the superior tech. AMD doesn't just get to have a pity buy because they are playing "catch up". But I likewise don't feel like paying Nvidia's "monopoly tax". So for now I am just not going to buy.
It's the only acceptable choice to make.
 
Nah. I complain, but Nvidia has the superior tech. AMD doesn't just get to have a pity buy because they are playing "catch up". But I likewise don't feel like paying Nvidia's "monopoly tax". So for now I am just not going to buy.
It's not a pity buy; it's a strategic decision to try to keep Nvidia from turning into a monopoly. Just think how expensive Nvidia's cards will be when there's literally no competition.

And setting aside non-gaming uses for their cards, things like driver foibles, etc., for the sake of discussion: if you're not playing ray-traced games, their cards are actually pretty good. I used an RX 6800 for a couple of years, and I only replaced it with a 4070 because I wanted RT in Diablo IV. In the rest of the games I play regularly, the Radeon was plenty good enough.
 