Given how much Nvidia's stock has jumped (thanks to 14% year-on-year growth in Data Centres), Nvidia could use that to acquire AMD!!
Besides they need to cash that in to buy up all the ARM stock so they can take it by force.
Serious?
Multiple reasons this will never happen. Number one - it would never be approved.
> I swear neither AMD nor NVidia look like they are trying this gen. Like they are putting stuff out because they have to, but it's like they don't want to. Like they are both waiting for something else to happen so they can actually start again. In my opinion they are both just kicking the can down the road.

Possibly. It is also possible, much as with CPUs, that the year-to-year gains are becoming much smaller.
> It is also possible, much as with CPUs, that the year-to-year gains are becoming much smaller.

Despite basically the same memory system/speed, the 608 mm² Lovelace 4090 seems to beat the 628 mm² Ampere 3090 by about 65% in pure raster. Imagine if GDDR7 had been ready and it had launched with 50-70% higher memory bandwidth (or, cheaper, with a smaller bus and 30% more bandwidth).
> Despite basically the same memory system/speed, the 608 mm² Lovelace 4090 seems to beat the 628 mm² Ampere 3090 by about 65% in pure raster…

It might be better put as "gen-to-gen gains that the customer cares about". Maybe if the whole world were 4K+ we'd be singing a different tune, but the assumption would be that games completely suck (are unplayable) at 1080p or even 1440p. IMHO, a lot has to change to push that bar up to a "requirement". The best argument today has almost nothing to do with R&D, and that's available VRAM. Even so, it's niche right now, as the primary offenders might only show up at 4K, which may limit things to the very top GPUs; again, meaning, why all the lower SKUs?
The (379 mm²) 4080 was 47% faster than the (392 mm²) 3080 in the last 7600 review. The giant jump in official sticker price can make us lose track of the fact that the performance jump was quite good, and that was with less memory bandwidth instead of a massive GDDR7 jump; with one, it would probably have been closer to +55-60%.

At the MSRPs they initially tried, would that 45-65% gain not have been possible all the way down the stack had they wanted it, technology-wise? I imagine maybe: say a $900 4080 and $1,200 4080 Ti, with an $800 4070 Ti, $700 4070, $600 4060 Ti, $500 4060 and so on (I am not sure what the exact plan was, but it was high-priced).

Why did that gen-on-gen gain stop once we got further down the stack?

Was there some decision that a Samsung 8 to TSMC N4 type of gain will not happen again anytime soon, and that passing this jump along across the whole stack would:

- Continue to give us a useless xx90 card with the significantly cheaper xx80 too close to it?
- Leave the 5xxx series tied down, with a very limited and uncertain maximum gain that would have to rely on a design improvement, reduced profit margins, or both?
- So, better to use this lot of "free" new perf per watt to try to set the high prices of the last few years in stone, giving a bigger incentive than ever before to move up a tier.

It would not surprise me if the 5000 series sees giant gains across the board, instead of just at the top, if they need it to; I am not sure the 4060-4070 gains over last gen were mostly tech stagnation.
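(A quick back-of-the-envelope check of the gen-on-gen figures above; the die sizes and the ~65% raster gain are the numbers quoted in this thread, not measured results, so treat this minimal Python sketch as illustrative only.)

# Rough perf-per-area comparison using the figures quoted in this thread.
dies = {
    "RTX 3090 (Ampere, GA102)":   {"area_mm2": 628, "rel_perf": 1.00},
    "RTX 4090 (Lovelace, AD102)": {"area_mm2": 608, "rel_perf": 1.65},
}
base = dies["RTX 3090 (Ampere, GA102)"]
for name, d in dies.items():
    per_mm2 = d["rel_perf"] / d["area_mm2"] * base["area_mm2"]
    print(f"{name}: {per_mm2:.2f}x raster per mm^2 vs the 3090")
# -> ~1.70x: on these numbers, essentially all of the 4090's gain came from
#    process/architecture rather than extra silicon.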
> You're not going to get a blowjob from a 4060 Ti. Also not what they are made for. lol

Forget the blowjob, they're not even offering lube.
> You guys are funny. If you own a 3xxx anything, this card isn't for you. People who buy on the low end, and this is 'low end', do not buy a card every generation. …

You should be embarrassed for posting such a load of BS. The 3060 Ti is actually faster than the new 4060 Ti in some cases, yet you are reaching so hard to make this card not sound like the absolute piece of crap that it actually is for 400 bucks. Even going by your own thought process, if someone who buys this level of card is not going to upgrade for a couple more generations, then they're going to be compromised on VRAM from day one; what do you think future games are going to do over the next 2 to 3 years? And I bet you didn't even consider that many, if not most, people looking at a 60-class card probably only have 16 GB of system RAM, which means the VRAM issues are going to be even more pronounced. A lot of these people will also still be on PCIe Gen 3, which may affect performance a little in some cases since the card only runs at x8. Please use some common sense and get it through your head that this card is absolute garbage for the price no matter how you look at it.
You guys are funny.
If you own a 3xxx anything, this card isn't for you.
People who buy on the low end, and this is 'low end', do not buy a card every generation. So it is better than the previous gen, though not by much in raster performance; that performance delta goes up with DLSS 3, but only one review I saw showed that. For someone coming from a 2070 or 10xx anything, it's a definite step up. And it can actually do ray tracing at decent frame rates.
I think it fits the market where it is; it is just that every tier's pricing has seen increases.
Reviewers expecting 2x perf over the previous gen... high-end performance... the mind boggles. I think it's just popular to jump on the anti-Nvidia bandwagon right now, and those YouTubers all just parrot what the others are saying. HU makes a stink about 3070 performance on a shitty console port, blames it on 8 GB of VRAM, then ALL of the techtubers follow suit making the same flawed argument. This just feels like more of the same.
You're not going to get a blowjob from a 4060 Ti. Also not what they are made for. lol
> the VRAM issues are going to be even more pronounced

Agreed, and I'm pretty much ignoring 8GB cards unless they are $150, preferably less. So used deals most likely, although Microcenter here has two 6600 models at $179 right now. The 3060 12GB or the 6700 XT is what I've been recommending people start with. Both can be had used for $250-ish.
"Low end"
"$400"
People who buy on the low end, and this is, 'low end', do not buy a card every generation.
People who buy on the low end, and this is, 'low end', do not buy a card every generation. So it is better than the previous gen, but not by much in raster performance, but that performance delta goes up with DLSS3, only 1 review I saw showed that. For someone coming from 2070 or 10xx anything, it's a definite step up. And it can actually do raytracing at decent frames.
> You guys are funny. If you own a 3xxx anything, this card isn't for you. …

Not sure who you are talking to here?
Cost per wafer for 28nm back in 2015 was around US$2,500. For TSMC 4N, it's pushing US$17,000. Maxwell was so cheap back then across the board because the world was moving on to 16/12nm, but it wasn't ready by the time NVIDIA started production. TSMC's 5/4nm process is currently on the bleeding edge, with 3nm still a year out. Both process nodes are heavily crowded, so fab allocation comes at a premium.
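(To put those wafer prices in perspective, a minimal sketch of per-die cost using the classic dies-per-wafer approximation. The ~190 mm² die size is an assumption, roughly 4060 Ti class; yield, scribe lines and packaging are ignored.)

import math

def dies_per_wafer(die_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Classic approximation: wafer area / die area, minus an edge-loss term."""
    area = math.pi * (wafer_diameter_mm / 2) ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_mm2)
    return int(area / die_mm2 - edge_loss)

DIE_MM2 = 190  # assumption: roughly a 4060 Ti-class die
for node, wafer_cost in [("28nm (2015)", 2_500), ("TSMC 4N", 17_000)]:
    n = dies_per_wafer(DIE_MM2)
    print(f"{node}: ~{n} dies/wafer -> ~${wafer_cost / n:,.0f} per die before yield")
# Even before yield losses, the raw silicon cost of a mid-range die comes out
# roughly 6-7x what it was on 28nm.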
Even with the 8GB of memory, this would have been a commendable RTX 4060 non-Ti if the price was right.
> Owing to the memory interface configuration, 12GB isn't really possible, but a 16GB 4060 Ti at $399 would have also made a lot more sense.

It would have made more sense to make the 4060s 192-bit cards so they could have 12GB, but that would make Nvidia's lineup even more stupid.
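(For anyone wondering why capacity is glued to bus width: with 2 GB GDDR6 chips, each 32-bit memory controller hosts one chip, or two in clamshell mode. A minimal sketch of the resulting options:)

# Why VRAM capacity is tied to bus width with 2 GB GDDR6 chips.
GDDR6_CHIP_GB = 2  # largest GDDR6 density shipping in volume

for bus_bits in (128, 192, 256):
    chips = bus_bits // 32          # one chip per 32-bit controller
    normal = chips * GDDR6_CHIP_GB
    clamshell = 2 * normal          # two chips share each controller
    print(f"{bus_bits}-bit: {normal} GB normal, {clamshell} GB clamshell")
# 128-bit -> 8 GB or 16 GB (the two 4060 Ti configs); 12 GB needs a 192-bit bus.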
> and it would also have resolved the problem of VRAM-limited GTX 1070 and RTX 2060 Super owners looking for an upgrade that delivers the exact same issue seen in so many big games.

The GTX 1070 has 8GB of VRAM, while the RTX 2060 has 6GB. What exactly is Digital Foundry smoking?
> I would go a step further & say that this would make an excellent $300 4050 Ti & the proposed 4060 a good $250 4050. The hypothetical 6GB 4050 should not sell above $200 & should maybe be named a 4030!!

If this were a 4050 Ti then it would have to be around $200, with the 4060 (renamed 4050) at less than $200. At least AMD had the decency to make their RX 7600 $270, even though nobody would realistically pay that much for that graphics card.
> The GTX 1070 has 8GB of VRAM, while the RTX 2060 has 6GB. What exactly is Digital Foundry smoking?

Not sure I see the issue with the statement that a 16GB 4060 Ti at $399 would also have resolved the problem of VRAM-limited GTX 1070 and RTX 2060 owners?
> It gets worse at higher resolutions. So about the same as the 3070 at 1080p but significantly slower at 4K, meaning it's better to have a 3070 for both new games at lower resolution and older games at higher resolution - same amount of VRAM but way better bandwidth.

Nothing is cut down on a 3070 other than VRAM size: it has a 256-bit bus and is a full x16 card. Nvidia's cache narrative was mostly BS.
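(The raw numbers behind that bandwidth argument, as a minimal sketch. Bus widths and memory speeds are the public spec-sheet figures; the 4060 Ti's larger L2 cache is deliberately ignored here.)

# Raw memory bandwidth = bus width (bits) / 8 * per-pin data rate (Gbps).
def bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

cards = {
    "RTX 3060 Ti (GDDR6)": (256, 14.0),
    "RTX 3070":            (256, 14.0),
    "RTX 4060 Ti":         (128, 18.0),
}
for name, (bus, speed) in cards.items():
    print(f"{name}: {bandwidth_gb_s(bus, speed):.0f} GB/s")
# 3060 Ti / 3070: 448 GB/s vs 4060 Ti: 288 GB/s -- about 36% less raw bandwidth,
# which the bigger L2 only partially hides at higher resolutions.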
> but that performance delta goes up with DLSS 3

Fuck DLSS. I'm so sick of this being used as a crutch for shitty hardware performance and/or crappy coding.
> Not sure I see the issue with the statement that a 16GB 4060 Ti at $399 would also have resolved the problem of VRAM-limited GTX 1070 and RTX 2060 owners?

Probably not, since those aren't fast enough to make good use of it. It isn't enough just to put 16GB on it; the bandwidth on the card is also limited. The 3060 Ti has roughly 1.5x the bandwidth of the 4060 Ti, which would have made giving it 16GB make more sense, even though in benchmarks they are nearly equal in performance. It just makes sense to call this a 4050 Ti and price it at $200, because right now a 3060 Ti is the better buy.
> Fuck DLSS. I'm so sick of this being used as a crutch for shitty hardware performance and/or crappy coding.

Nobody should care about DLSS benchmarks, just like nobody should care about FSR benchmarks. These technologies are not meant to be used unless your graphics card sucks.

Give me real, unadulterated frames, not this voodoo magic that acts as a band-aid.
> Probably not, since those aren't fast enough to make good use of it. It isn't enough just to put 16GB on it; the bandwidth on the card is also limited.

Again, not sure what the 2060 being a 6GB card has to do with the 4060 Ti 16GB's bandwidth; you may just have misread something.
> Nobody should care about DLSS benchmarks, just like nobody should care about FSR benchmarks. These technologies are not meant to be used unless your graphics card sucks.

Almost no card can play a heavy game at 4K 120 Hz with every detail maxed out (none comes close in all cases, not even a 4090), and a lot of people like to play on a 4K TV. So what then? Why is the other compromise automatically better? Is dropping the fps a lot, or turning settings way down, really better than upscaling? A lot of the time, no.
> Uh, it is more or less the same price as a 2070/2060 Super, and Nvidia's own slides were promoting that 50% or so perf increase. Four years later... with frame generation being the main selling point for a limited number of games now and later. Never mind that their own numbers showed several games still hitting 60fps at 1080p high with a 2060 Super. This card will quite likely be replaced in 1-2 years like the GTX 960 was. At least that card was only $200-250 (4GB version).

The GTX 980 launched at $549. The GTX 960 launched at $199; that's 36% of the flagship's price. The 4060 Ti (not the 4060, which would actually be the equivalent product-stack comparison) is launching at $399, and that is 33% of the price of this generation's flagship 4080 at $1199. It sits lower in the stack's price range than the GTX 960 did...
> Second, if one is on the low end, why spend $400 on this when an RX 6600 is half the price... $200 or less. I saw a Gigabyte 6600 model down to $180 earlier at Newegg. Both cards are limited to 1080p, and neither is great at ray tracing. If you need an upgrade now, buy on the low end and see what shows up next gen or later.

Choices, that's what everyone has. Not a bad thing. Remember, no one is making anyone buy a 4060 Ti.
> Third, if one is on the low end then they probably only have a motherboard with PCIe 3.0. That is a performance hit with the new 4.0 cards using only 8 PCIe lanes. So why invest much of anything in that? Unless it is one of the RX 6700 cards or an RTX 3060 that use the full 16 lanes.

That has already been tested: der8auer tested the 4060 Ti with only 4 PCIe lanes, and there was not much difference.
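(A one-liner to sanity-check the stack percentages cited a couple of posts up, using the launch MSRPs as quoted there:)

# Where each xx60 card sits relative to the flagship it launched against.
for pair, (card, flagship) in {
    "GTX 960 vs GTX 980":      (199, 549),
    "RTX 4060 Ti vs RTX 4080": (399, 1199),
}.items():
    print(f"{pair}: {card / flagship:.0%} of the flagship's MSRP")
# GTX 960: 36%; 4060 Ti: 33% -- the quoted figures check out.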
> Respectfully, that's a bit short-sighted, don't you think? For example, people that bought a 1440p monitor with their RTX 2060 (which could play most titles at launch on that monitor) can now render at 1080p and still have a screen-native picture with DLSS.

An RTX 2060 in 2023 is gonna benefit from DLSS, sure. But we aren't talking about an RTX 2060 released 4 years ago, we're talking about the 4060 Ti.
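(For reference, the internal render resolutions DLSS 2 uses at a 2560x1440 output, per the published per-axis scale factors; actual games can override these. A small sketch:)

# DLSS 2 internal render resolution for a 2560x1440 output target.
MODES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5, "Ultra Performance": 1 / 3}

out_w, out_h = 2560, 1440
for mode, scale in MODES.items():
    print(f"{mode}: {round(out_w * scale)}x{round(out_h * scale)}")
# Quality mode renders at ~1707x960 -- in the ballpark of the "render at 1080p,
# output 1440p" case the poster describes.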
> With a 5600 XT or an older AMD card of that calibre, not so much.

I don't know about the 5600 XT, but a quick look on YouTube shows you're wrong. In most of these benchmarks the 5600 XT matches or beats the RTX 2060. Unless you think DLSS is going to make it 10x faster than a 5600 XT with FSR, which I really doubt is the case.
> Again, not sure what the 2060 being a 6GB card has to do with the 4060 Ti 16GB's bandwidth; you may just have misread something.

The 4060 Ti is so slow and bandwidth-limited that the extra VRAM might not help. The whole point of the extra VRAM is to avoid going to system memory, which is slow.
> Almost no card can play a heavy game at 4K 120 Hz with every detail maxed out (none comes close in all cases, not even a 4090), and a lot of people like to play on a 4K TV. So what then? Why is the other compromise automatically better?

DLSS and FSR are fine as long as you don't use them as a way to evaluate a graphics card's performance.
> I'm not looking through those Fisher Price benchmarks so I will take your word for it. Still, DLSS is essential if you want to sprinkle on some RT with the older cards.

I'm seeing a problem with ray tracing as more games get released. New games are so demanding that ray tracing might as well not be a factor without DLSS and FSR. In many circumstances DLSS and FSR are synonymous with ray tracing, because in most new titles you won't be able to use ray tracing without them. So at what point do we admit that ray tracing, as it stands right now, isn't a viable option to turn on? In many new games you can barely achieve 60fps at 1080p with max settings minus ray tracing. The Last of Us Part 1 is just not gonna happen on the 4060 Ti with ray tracing, if the game had it. It's so bad on the 4060 Ti that the RTX 3060 has much better lows because it has more VRAM. You can mitigate this with DLSS, but by no means is that a win when the RTX 3060 handles it better. A Plague Tale: Requiem is just unplayable on the 4060 Ti with ray tracing at 1080p Ultra. So either ray tracing doesn't work as a graphics technology, or nobody wants to admit that it requires DLSS or FSR because we're being sold shit graphics cards.
> An RTX 2060 in 2023 is gonna benefit from DLSS, sure. But we aren't talking about an RTX 2060 released 4 years ago, we're talking about the 4060 Ti.

So would a 4090 today, if someone has a 4K or higher monitor; and that's without ray tracing on and with no 2023 games included.
> A Plague Tale: Requiem is just unplayable on the 4060 Ti with ray tracing at 1080p Ultra. So either ray tracing doesn't work as a graphics technology, or nobody wants to admit that it requires DLSS or FSR because we're being sold shit graphics cards.

Or it could just be one of those ultra settings that does not work on a 4060 Ti class card; RT does not have to be special among them (it is just newer). Why would a game be limited to running well on an xx60 card? If you put everything at max, why would it not push even a 3080 a little? What would be the point?
I hate how developers now seem to program their games around DLSS/FSR.
Epic review.
DF also confirming the 4060 is garbage.