NVIDIA GeForce RTX 4060 Ti Reviews

I swear neither AMD nor Nvidia look like they're trying this gen. They're putting stuff out because they have to, but it's like they don't want to.
It's like they're both waiting for something else to happen so they can actually start again. In my opinion they're both just kicking the can down the road.
Possibly. It's also possible that, much as with CPUs, year-to-year gains are becoming much smaller.

A "bad equation" for them to solve. Nvidia is "sort of" solving this by focusing on "extra things" outside of the norm. They may be the "sly fox" of the two big players. Not saying AMD isn't dabbling, but maybe their dabbling was because they (late to the ballgame) have begun to realize this problem (?) Even so, it's a band-aid at best (IMHO).

If they can successfully deprecate the usefulness of old products, that helps ease the eventual misery here (for them, I mean). However, nobody likes getting kicked in the you-know-what, so while it sounds good, it probably does more damage than they can afford.

Usually when this happens we end up with "temporal renting of services" (yep, I said it). Imagine a GPU that is worthless unless you are paying month to month. And then, much like your phone, it can be made "obsolete" or given a fixed leasing term (with no buy-out option) that forces people to upgrade after a period of time.

And no, I don't work for Nvidia or AMD, though they are most certainly discussing the above (they have to be).
 
much as with CPUs, year-to-year gains are becoming much smaller.
Despite basically the same memory system/speed, the 608 mm² Ada Lovelace 4090 seems to beat the 628 mm² Ampere 3090 by about 65% in pure raster. Imagine if GDDR7 had been ready and it had launched with 50-70% higher memory bandwidth (or cheaper, with a smaller bus and 30% more bandwidth).

The 379 mm² 4080 was 47% faster than the 392 mm² 3080 in the latest 7600 review. The giant jump in official sticker price can make us lose track of the fact that the performance jump was quite good, and that was with less memory bandwidth rather than a massive GDDR7 jump; with one, it would probably have been closer to +55-60%.

At the MSRPs they initially tried, would that 45-65% gain not have been possible across the whole stack had they wanted it? Technology-wise I imagine maybe: say a $900 4080 and $1,200 4080 Ti, with an $800 4070 Ti, $700 4070, $600 4060 Ti, $500 4060 and so on (I am not sure what the exact plan was, but it was high prices).

Why did that gen-on-gen gain stop once we got further down the stack?

Was there some decision that a Samsung 8nm-to-TSMC N4 type of gain will not happen again anytime soon, and that making this jump all around the stack would:
- Continue to give us a pointless xx90 card with the significantly cheaper xx80 too close to it?
- Tie the 5xxx series' hands: the maximum gain would be very limited and uncertain, forcing them to rely on a design improvement, reduced profit margins, or both.
- Better, then, to use this pile of "free" new perf-per-watt to set the high prices of the last few years in stone, giving a bigger incentive than ever before to move up a tier?

It would not surprise me if the 5000 series sees giant gains across the board instead of just at the top, if they need it to. I am not sure the 4060-4070 versus last gen was mostly tech stagnation.
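As a rough sanity check on that raster claim, here is the back-of-the-envelope math (a sketch in Python using only the approximate die sizes and the ~65% figure quoted above, not official measurements):

```python
# Perf-per-area comparison using the rough numbers quoted in this thread.
ada_4090    = {"die_mm2": 608, "relative_raster": 1.65}  # ~65% faster than the 3090
ampere_3090 = {"die_mm2": 628, "relative_raster": 1.00}

gain = (ada_4090["relative_raster"] / ada_4090["die_mm2"]) \
     / (ampere_3090["relative_raster"] / ampere_3090["die_mm2"])
print(f"Perf per mm^2, 4090 vs 3090: {gain:.2f}x")
# -> ~1.70x, i.e. nearly all of the jump came from the node/architecture,
#    not from the (basically unchanged) memory system.
```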
 
Despite basically the same memory system/speed, the 608 mm² Ada Lovelace 4090 seems to beat the 628 mm² Ampere 3090 by about 65% in pure raster... It would not surprise me if the 5000 series sees giant gains across the board instead of just at the top, if they need it to.
It might be better put as "gen-to-gen gains that the customer cares about". Maybe if the whole world were at 4K+ we'd be singing a different tune, but the assumption there is that games completely suck (are unplayable) at 1080p or even 1440p. IMHO, a lot has to change to push that bar up high enough to be a "requirement". The best argument today has almost nothing to do with R&D, and that's available VRAM. Even so, it's niche right now, since the primary offenders might only show up at 4K resolutions, which may limit the issue to the very top GPUs; again meaning, why all the lower SKUs?
 
For this generation it seems like the top-end cards from each manufacturer are a decent enough value, but the value just keeps getting worse the farther down the product stack you go. Maybe some of the yet-to-be-released middle-of-the-road cards will be better, but I doubt it.

I'm sure it's a combination of having old stock to clear and being addicted to crypto/COVID-influenced margins, but I'm not sure what the split is. This is the first time I can recall prices dropping by a significant amount early in the product cycle without being a direct reaction to a competitor's product, so they have to realize that they're pushing it.
 
You guys are funny.

If you own a 3xxx anything, this card isn't for you.

People who buy on the low end, and this is 'low end', do not buy a card every generation. So it is better than the previous gen, but not by much in raster performance; that performance delta goes up with DLSS 3, though only one review I saw showed that. For someone coming from a 2070 or 10xx anything, it's a definite step up. And it can actually do ray tracing at decent frame rates.

I think it fits the market where it is; it's just that all tiers' pricing has seen increases.

Reviewers expecting 2x the perf of the previous gen, high-end performance... the mind boggles. I think it's just popular to jump on the anti-Nvidia bandwagon right now, and those YouTubers all just parrot what the others are saying. HU makes a stink about 3070 performance on a shitty console port, blames it on 8GB of VRAM, then ALL of the techtubers follow suit making the same flawed argument. This just feels like more of the same.

You're not going to get a blowjob from a 4060 Ti. Also not what they are made for. lol
 
You're not going to get a blowjob from a 4060 Ti. Also not what they are made for. lol
Forget the blowjob; they're not even offering lube.
 
If someone is on a budget, it's used cards, or the fire-sale 6000-series AMD cards if you must have new. Once those are sold through, maybe some price adjustments will happen with the 4060/7600 stuff.
 
People who buy on the low end, and this is 'low end', do not buy a card every generation. So it is better than the previous gen, but not by much in raster performance... I think it fits the market where it is; it's just that all tiers' pricing has seen increases.
You should be embarrassed for posting such a load of BS. The 3060 Ti is actually faster than the new 4060 Ti in some cases, yet you are reaching so hard to make this card sound like anything but the absolute piece of crap that it actually is for 400 bucks. And even going by your own logic that someone who buys this level of card is not going to upgrade for a couple more generations, that means they're going to be compromised on VRAM from day one; what do you think future games are going to do over the next 2 to 3 years? And I bet you didn't even consider that many, if not most, people looking at a 60-class card probably only have 16 gigs of system RAM, which means the VRAM issues are going to be even more pronounced. Also, a lot of these people will still be on PCIe Gen 3, which may affect performance a little in some cases since the card only runs x8. Please use some common sense and get it through your head that this card is absolute garbage for the price no matter how you look at it.
 
VRAM issues are going to be even more pronounced
Agree and I’m pretty much ignoring 8GB cards unless they are $150 preferably less. So used deals most likely although Microcenter here has 2 6600 models at 179 right now. The 3060 12g or the 6700xt is what I’ve been recommending people start with. Both can be had used for $250ish.


Yeah it’s insane.
"Low end"

"$400"
 
People who buy on the low end, and this is 'low end', do not buy a card every generation.

The **60s were never low end, but middle of the range. The **60 Ti was sometimes close to the **70s, like the 3060 Ti. Low end would be the **50 Ti, maybe the **60. And this is $400. If it were priced like a low-end part, I don't think anyone would mind.
 
Low end was the GeForce2 MX, not the 420.

**50 was the lowest part of the midrange; some generations (Maxwell, Turing, Ampere) had no low end at launch, that market being cornered by the previous gen's midrange, which was by then more than good enough for it.

If a card costs almost as much as a complete Xbox Series S system, let alone the same or more, it is hard to call it low end.
 
People who buy on the low end, and this is 'low end', do not buy a card every generation. So it is better than the previous gen, but not by much in raster performance; that performance delta goes up with DLSS 3, though only one review I saw showed that. For someone coming from a 2070 or 10xx anything, it's a definite step up. And it can actually do ray tracing at decent frame rates.

Uh, it is more or less the same price as a 2070/2060 Super, and Nvidia's own slides were promoting that ~50% perf increase. Four years later... with frame generation being the main selling point, for a limited number of games now and later. Never mind that their own numbers showed several games still hitting 60fps at 1080p high on a 2060 Super. This card will quite likely be replaced in 1-2 years like the GTX 960 was. At least that card was only $200-250 (4GB version).

Second, if one is on the low end, why spend $400 on this when an RX 6600 is half the price, $200 or less? I saw a Gigabyte 6600 model down to $180 earlier at Newegg. Both cards are limited to 1080p, and neither is great at ray tracing. If you need an upgrade now, buy on the low end and see what shows up next gen or later.

Third, if one is on the low end, they probably only have a motherboard with PCIe 3.0. That is a performance hit with the new 4.0 cards that use only 8 PCIe lanes. So why invest much of anything in that? Unless it is one of the RX 6700 cards or an RTX 3060, which use the full 16 lanes.
 
You guys are funny. [...]
Not sure who you are talking to here?
 
Digital Foundry review conclusion:


Even with the 8GB of memory, this would have been a commendable RTX 4060 non-Ti if the price was right. Owing to the memory interface configuration, 12GB isn't really possible, but a 16GB 4060 Ti at $399 would have also made a lot more sense: at least then the most pressing limitations facing the 3060 Ti and 3070 going forward would have been comprehensively addressed, and it would also have resolved the problem of VRAM-limited GTX 1070 and RTX 2060 Super owners looking for an upgrade that delivers the exact same issue seen in so many big games.

As things stand though, this is a disappointment and I can only assume that Nvidia's GeForce Experience telemetry tells them that 60-class users only game at 1080p and aren't interested in the latest triple-A games.



https://www.eurogamer.net/digitalfoundry-2023-nvidia-geforce-rtx-4060-ti-review?page=7



I would go a step further and say that this would make an excellent $300 4050 Ti, and the proposed 4060 a good $250 4050.
The hypothetical 6GB 4050 should not sell above $200, and maybe should be named the 4030!
 
Cost per wafer for 28nm back in 2015 was around US$2,500. For TSMC 4N, it's pushing US$17,000. Maxwell was so cheap across the board because the world was moving on to 16/12nm, but that node wasn't ready by the time NVIDIA started production. TSMC's 5/4nm process is currently the bleeding edge, with 3nm still a year out. Both nodes are heavily crowded, so fab allocation comes at a premium.
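To put those wafer prices in perspective, here is a rough cost-per-die estimate (a sketch: the ~190 mm² AD106 die size is an approximate public figure, the die-count formula is the standard textbook approximation, and yield is ignored):

```python
import math

def gross_dies(die_area_mm2, wafer_diameter_mm=300):
    # Standard approximation for gross dies per wafer (ignores scribe lines and yield).
    d = wafer_diameter_mm
    return int(math.pi * d**2 / (4 * die_area_mm2)
               - math.pi * d / math.sqrt(2 * die_area_mm2))

dies = gross_dies(190)  # AD106 (4060 Ti) is roughly 190 mm^2
for node, wafer_cost in [("28nm circa 2015", 2_500), ("TSMC 4N", 17_000)]:
    print(f"{node}: ~{dies} dies/wafer, ~${wafer_cost / dies:.0f} per die")
# Same-size die: roughly $8 per die then vs. $50+ per die now.
```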
 
Even with the 8GB of memory, this would have been a commendable RTX 4060 non-Ti if the price was right.

No it wouldn't, because that would mean the RX 7600 is a great deal, and we know that it really isn't. Nobody in 2023 should be buying an 8GB graphics card for 1080p gaming.
Owing to the memory interface configuration, 12GB isn't really possible, but a 16GB 4060 Ti at $399 would have also made a lot more sense:
It would have made more sense to make the 4060s 192-bit cards so they could have 12GB, but that would make Nvidia's lineup even more stupid.
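For anyone wondering why the bus width dictates the capacity options: GDDR6 chips use a 32-bit interface and ship in 1GB or 2GB densities, so each bus width only allows fixed capacity steps (clamshell mode doubles them by putting two chips on each channel). A quick illustration:

```python
# Capacity options per bus width with 1GB or 2GB GDDR6 chips (32-bit interface each).
for bus_bits in (128, 192, 256):
    channels = bus_bits // 32
    normal = [channels * density for density in (1, 2)]  # one chip per channel
    clamshell = [2 * c for c in normal]                  # two chips per channel
    print(f"{bus_bits}-bit: {normal[0]} or {normal[1]} GB "
          f"(clamshell: {clamshell[0]} or {clamshell[1]} GB)")
# 128-bit -> 4 or 8 GB (8 or 16 GB clamshell): 12 GB really does require 192-bit.
```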
and it would also have resolved the problem of VRAM-limited GTX 1070 and RTX 2060 Super owners looking for an upgrade that delivers the exact same issue seen in so many big games.
The GTX 1070 has 8GB of VRAM, while the RTX 2060 has 6GB. What exactly is Digital Foundry smoking?
I would go a step further and say that this would make an excellent $300 4050 Ti, and the proposed 4060 a good $250 4050.
The hypothetical 6GB 4050 should not sell above $200, and maybe should be named the 4030!
If this were a 4050 Ti, it would have to be around $200, with the 4060 renamed to 4050 being less than $200. At least AMD had the decency to make their RX 7600 $270, even though nobody would realistically pay that much for that graphics card.
 
The GTX 1070 has 8GB of VRAM, while the RTX 2060 has 6GB. What exactly is Digital Foundry smoking?
Not sure I see the issue with the statement that a 16GB 4060 Ti at $399 would also have resolved the problem for VRAM-limited GTX 1070 and RTX 2060 owners?
 
It gets worse at higher resolutions. So it's about the same as the 3070 at 1080p but significantly slower at 4K, meaning it's better to have a 3070 both for new games at lower resolutions and for older games at higher resolutions: same amount of VRAM but way better bandwidth.

Nvidia's cache narrative was mostly BS.
Nothing is cut down on a 3070 other than VRAM size: 256-bit bus and a full x16 card.
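The raw numbers bear that out; peak GDDR6 bandwidth is just bus width times data rate (memory speeds below are the commonly published ones for these cards):

```python
# Peak memory bandwidth: (bus width in bits / 8) * data rate in Gbps -> GB/s.
cards = {
    "RTX 3070":    {"bus_bits": 256, "gbps": 14},
    "RTX 3060 Ti": {"bus_bits": 256, "gbps": 14},
    "RTX 4060 Ti": {"bus_bits": 128, "gbps": 18},
}
for name, c in cards.items():
    print(f"{name}: {c['bus_bits'] / 8 * c['gbps']:.0f} GB/s")
# 448 GB/s for the Ampere cards vs 288 GB/s for the 4060 Ti; the bigger L2
# cache hides some of that gap at 1080p, much less so at 4K.
```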
 
Der8auer, in his review, takes a look at performance with the PCIe link limited to 4.0 x4 to simulate 3.0 boards.

 
Not sure I see the issue with the statement that a 16GB 4060 Ti at $399 would also have resolved the problem for VRAM-limited GTX 1070 and RTX 2060 owners?
Probably not, since those aren't fast enough to make good use of it. It isn't enough just to add 16GB when the bandwidth of the card is also limited. The 3060 Ti has over 50% more bandwidth than the 4060 Ti (448 vs. 288 GB/s), the kind of bandwidth that would make 16GB worthwhile, even though in benchmarks they are nearly equal in performance. It just makes sense to call this a 4050 Ti and price it at $200, because right now a 3060 Ti is the better buy.
Fuck DLSS. I’m so sick of this being used as a crutch for shitty hardware performance and/or crappy coding.

Give me real, unadulterated frames. Not this voodoo magic, which acts as a band-aid.
Nobody should care about DLSS benchmarks, just like nobody should care about FSR benchmarks. These technologies are not meant to be used unless your graphics card sucks.
 
Nobody should care about DLSS benchmarks, just like nobody should care about FSR benchmarks. These technologies are not meant to be used unless your graphics card sucks.

Respectfully, that's a bit short-sighted, don't you think?

For example, people who bought a 1440p monitor with their RTX 2060 (which could play most titles on that monitor at launch) can now render at 1080p and still get a screen-native picture with DLSS.

With a 5600 XT or an older AMD card of that calibre, not so much.
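For reference, here is what that trade-off looks like in numbers (a sketch using the commonly cited DLSS 2 per-axis scale factors; exact factors can vary by title):

```python
# Commonly cited DLSS 2 render-scale factors (per axis).
modes = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}
out_w, out_h = 2560, 1440  # 1440p output
for mode, s in modes.items():
    print(f"{mode}: renders {round(out_w * s)}x{round(out_h * s)}, outputs {out_w}x{out_h}")
# Quality mode at 1440p renders ~1707x960 -- even cheaper than the native-1080p
# case described above, while still outputting at the monitor's native resolution.
```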
 
Probably not, since those aren't fast enough to make good use of it. It isn't enough just to add 16GB when the bandwidth of the card is also limited
Again, not sure what the 2060 being a 6GB card has to do with the 4060 Ti 16GB's bandwidth; you may just have misread something.

Nobody should care about DLSS benchmarks, just like nobody should care about FSR benchmarks. These technologies are not meant to be used unless your graphics card sucks.
Almost no card can play a heavy game at 4K 120Hz with all the details maxed out (none comes close in all cases, not even a 4090), and a lot of people like to play on a 4K TV. So what then? Why would the other compromises automatically be better? Is dropping a lot of fps, or turning settings way down, better than upscaling? A lot of the time, no.

Consoles evaluate everything and care about the end result, and they all choose, all the time, to upscale, and to upscale a lot, because under many tests it happens to be one of the best compromises.
 
Uh, it is more or less the same price as a 2070/2060 Super... This card will quite likely be replaced in 1-2 years like the GTX 960 was. At least that card was only $200-250 (4GB version).
The GTX 980 launched at $549. The GTX 960 launched at $199; that's 36% of the flagship's price. The 4060 Ti (not the 4060, which would actually be the equivalent spot in the product stack) is launching at $399, and that is 33% of this generation's $1,199 flagship 4080. Relative to the stack, that's a lower price than the GTX 960 was...

This is not agreeing or disagreeing with the price in itself, just pointing out that the price fits historically, across the stack.

We all wish the prices were less, across the stack.
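The arithmetic behind that comparison, in case anyone wants to plug in other cards (MSRPs as quoted above):

```python
# x60-class launch price as a fraction of that generation's flagship MSRP.
pairs = {
    "GTX 960 / GTX 980":      (199, 549),
    "RTX 4060 Ti / RTX 4080": (399, 1199),
}
for label, (card, flagship) in pairs.items():
    print(f"{label}: {card / flagship:.0%} of flagship price")
# -> 36% vs 33%: relatively lower in the stack, even though the
#    absolute price has doubled.
```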
Second, if one is on the low end, why spend $400 on this when an RX 6600 is half the price, $200 or less? I saw a Gigabyte 6600 model down to $180 earlier at Newegg. Both cards are limited to 1080p, and neither is great at ray tracing. If you need an upgrade now, buy on the low end and see what shows up next gen or later.
Choices, that's what everyone has. Not a bad thing. Remember, no one is making anyone buy a 4060Ti.
Third, if one is on the low end, they probably only have a motherboard with PCIe 3.0. That is a performance hit with the new 4.0 cards that use only 8 PCIe lanes. So why invest much of anything in that? Unless it is one of the RX 6700 cards or an RTX 3060, which use the full 16 lanes.
That has already been tested: Der8auer ran the 4060 Ti with only 4 PCIe lanes. Not much difference.
The biggest difference he saw was 9% in the 1% lows; the average FPS difference was 1.5%. Those were the largest gaps seen; in the rest of the games the gap was smaller, and that was at 1440p. PCIe Gen 4 x4 equals PCIe Gen 3 x8 in bandwidth.
The observed differences are not going to be perceptible.
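A quick check of why Gen 4 x4 is a fair stand-in for Gen 3 x8 (per-lane rates below are the published PCIe spec figures, after encoding overhead):

```python
# Approximate usable PCIe bandwidth per lane, in GB/s.
per_lane = {"Gen3": 0.985, "Gen4": 1.969}
for gen, lanes in [("Gen3", 8), ("Gen4", 4), ("Gen4", 8), ("Gen4", 16)]:
    print(f"{gen} x{lanes}: ~{per_lane[gen] * lanes:.1f} GB/s")
# Gen3 x8 and Gen4 x4 both land at ~7.9 GB/s, which is why Der8auer's
# Gen4 x4 test approximates an x8 card in a Gen3 board.
```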

Even with all of that, I think the prices will come down on these in 6 months.
 
Respectfully, that's a bit short-sighted, don't you think?

For example, people who bought a 1440p monitor with their RTX 2060 (which could play most titles on that monitor at launch) can now render at 1080p and still get a screen-native picture with DLSS.
An RTX 2060 in 2023 is gonna benefit from DLSS. We aren't talking about an RTX 2060 released 4 years ago, we're talking about the 4060 Ti.
With a 5600 XT or an older AMD card of that calibre, not so much.
I don't know about the 5600 XT but a quick look on YouTube shows you're wrong. In most of these benchmarks the 5600 XT matches or beats the RTX 2060. Unless you think DLSS is going to make it 10x faster than a 5600 XT with FSR, which I really doubt is the case.
 
Again not sure what the 2060 being a 6gb card has to do with the 4060ti 16gb bandwith, you can just have misread something.
The 4060 Ti is so slow and bandwidth limited that the extra VRAM might not help. The whole point of the extra VRAM is to avoid going to system memory, which is slow.
Almost no card can play a heavy game at 4K 120Hz with all the details maxed out (none comes close in all cases, not even a 4090), and a lot of people like to play on a 4K TV. So what then? Why would the other compromises automatically be better? Is dropping a lot of fps, or turning settings way down, better than upscaling? A lot of the time, no.
DLSS and FSR are fine as long as you don't use them as a way to evaluate a graphics card's performance.
 
An RTX 2060 in 2023 is gonna benefit from DLSS. We aren't talking about an RTX 2060 released 4 years ago, we're talking about the 4060 Ti.

I don't know about the 5600 XT but a quick look on YouTube shows you're wrong. In most of these benchmarks the 5600 XT matches or beats the RTX 2060. Unless you think DLSS is going to make it 10x faster than a 5600 XT with FSR, which I really doubt is the case.

I'm not looking through those Fisher-Price benchmarks, so I will take your word for it. Still, DLSS is essential if you want to sprinkle on some RT with the older cards.
 
I'm not looking through those Fisher-Price benchmarks, so I will take your word for it. Still, DLSS is essential if you want to sprinkle on some RT with the older cards.
I'm seeing a problem with ray tracing as more games get released. New games are so demanding that ray tracing might as well not be a factor without DLSS and FSR. In many circumstances DLSS and FSR are synonymous with ray tracing, because in most new titles you won't be able to use ray tracing without them. So at what point do we admit that ray tracing, as it stands right now, isn't a viable option to turn on in games? In many new games you can barely achieve 60fps at 1080p with max settings minus ray tracing. The Last of Us Part 1 just would not happen on the 4060 Ti with ray tracing, if the game had it. It's so bad on the 4060 Ti that the RTX 3060 has much better lows because it has more VRAM. You can mitigate this with DLSS, but by no means is that a win when the RTX 3060 is handling it better. A Plague Tale: Requiem is just unplayable on the 4060 Ti with ray tracing at 1080p Ultra. So either ray tracing doesn't work as a graphics technology, or nobody wants to admit that it requires DLSS or FSR because we're being sold shit graphics cards.
 
The more I stop and think about things, the more I appreciate technologies like FSR and DLSS. Ten years ago, if your shiny mid-range GPU couldn't run the latest AAA games at an acceptable frame rate with good graphics, the only options were to step down the graphics quality or resolution, or upgrade your GPU. There was no magic bullet to fix poor frame rates without degrading image quality or spending hundreds on an upgrade. DLSS came out, followed by FSR, and alleviated a lot of the need to upgrade GPUs no longer capable of running AAA games at high/ultra settings. I'm not saying it's a cure-all, because even upscaling has its limits as to how much improvement you'll see, but it's still awesome to see a company like Nvidia, a company everyone dubs greedy, bring out a feature that can keep even a five-year-old GPU relevant. Say what you will, and I know not everyone will agree with me, but the results speak for themselves. The new Zelda game running with FSR, consoles getting support for it, and more intensive graphics finally being achievable even on older hardware show that a company like Nvidia is very forward thinking. And while we the consumers usually pay the price, which I'm not a fan of, I see it as "someone has to foot the bill" so companies like Nvidia or AMD can see that investing in and creating new technologies like FSR or DLSS is worthwhile, and not just a fool's errand.

Just imagine if the 20 and 30 series cards had never sold; chances are Nvidia would have put less effort into DLSS, to the point where it would have gone the way of the dodo. With that, I can safely assume that if DLSS were to go, AMD wouldn't feel so compelled to keep upgrading and supporting FSR, and we'd be back to square one: the same continuous upgrade loop we were on before the 20 series released. I say this because both Nvidia and AMD are companies after your hard-earned dollar; they're not out to give you stuff because they're feeling generous. Given the current state of AAA gaming on the PC, that would force people to upgrade their 10 or 20 series card, or Vega/5700 XT, if they wanted respectable frame rates with good image quality, instead of just being able to toggle on FSR or DLSS to get another good year of use. While I always loved upgrading in the past, going from an 8800 GT to a 9800 GTX, then to a GTX 470, and so on and so forth, at this point I honestly want to upgrade less and less. I think I'm finally at the point where the hardware I have now can give me a respectable lifespan without having to consider upgrading for four to five years, and that's thanks to DLSS and Frame Generation.

Honestly though, I hate how developers now seem to program their games around DLSS/FSR rather than treating it as a feature to bridge the gap between older and newer hardware. It really gives tech like FSR and DLSS a bad rap.
 
An RTX 2060 in 2023 is gonna benefit from DLSS. We aren't talking about an RTX 2060 released 4 years ago, we're talking about the 4060 Ti.
So would a 4090 today, if someone has a 4K or higher monitor. This is without ray tracing on and with no 2023 games included:
[chart: average fps at 3840x2160 across the current GPU lineup]


You cannot enable ray tracing, obviously, and you will still play games that dip under 60fps from time to time without RT on; but if you like a solid 90fps (an average of 110 or so to get to that level), let alone want a locked 120, you have to reduce settings, or reduce resolution and upscale.

Is playing Control at an average of 68fps a better experience than at 117fps upscaled with DLSS? Or than lowering the settings enough to get there? How does one know without trying each individual title?

Considering how much time and thought console devs put into this, upscaling seems to be almost always the better way to go; very few games there are native 4K 100% of the time.

A Plague Tale: Requiem is just unplayable on the 4060 Ti with ray tracing at 1080p Ultra. So either ray tracing doesn't work as a graphics technology, or nobody wants to admit that it requires DLSS or FSR because we're being sold shit graphics cards.
Or it could just be one of many things at ultra that do not work on a 4060 Ti; RT does not have to be that special among them (it is just newer). Why would a game be limited to running well on a xx60 card? If putting everything at max didn't push even a 3080 a little, what would be the point?


I hate how developers now seem to program their games around DLSS/FSR

I feel that upscaling has been a mainstay of console game dev for so long that games are often programmed around dynamic upscaling and shading, and those mechanics are simply cut from the PC version. If a game is designed to run at 50-60fps at 900p-1200p upscaled on a PS5, then yes, it will take a lot of hardware to run it with no upscaling at the 90+fps PC gamers tend to like, especially at the resolutions some want, like 1440p.
 


DF also confirming the 4060 is garbage.


This is humorous. We had members here accusing DF of being an Nvidia shill. I'm sure we will hear that line again the next time Nvidia releases a product that is beastly.

But both companies' low-tier products are a joke this generation, so this video is no surprise. And before someone says it, the xx60 series has always been low tier. It seems some people confuse low tier with entry level, as if they didn't know any better.
 