RTX 4xxx / RX 7xxx speculation

Amazing. They really have to dangle DLSS 3 over your head like jingling keys, because why else would you buy this over any other card that can be had for $400 right now?

It's really not all that much better than a 3060, and you can get something like a 6800 for that price. (Edit: not quite; the 6800 is still in the high $400s.)

Yeah, the slides Nvidia put out are rather telling. The proof will be in the pudding, aka test results, but we have to wait until July for the 4060 Ti 16 GB vs 8 GB vs 4070 12 GB comparisons. $500 for a 16 GB 4060 Ti is just silly. But hey, it only uses 160 W of power! Efficiency!

GN has a decent video on this news. Starting at 7 minutes in, Steve talks about memory bus, cache, and bandwidth. Around 11:30 he covers Nvidia's "effective bandwidth" claims. This NV slide, via GN's video, is them showing us why they think the 128-bit bus isn't a big deal: the 32 MB L2 cache.

[Image: Nvidia slide on the 32 MB L2 cache and "effective bandwidth," via Gamers Nexus]

This slide (via Videocardz) is pure marketing to make you ignore that, without frame generation tech, this ain't much of an upgrade over a 3060 or 2060 Super, certainly not over any 8 GB version, for the same price 2-4 years later.

[Image: Nvidia RTX 4060 Ti DLSS 3 performance slide, via Videocardz]
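
On that "effective bandwidth" claim from the GN segment: one way to read it is that a bigger L2 serves more memory requests on-chip, so the narrow bus only sees the misses. A minimal sketch of that arithmetic; the hit rates below are made-up for illustration, not Nvidia's figures:

```python
# Hypothetical reading of the "effective bandwidth" claim: if a fraction of
# memory requests hits the 32 MB L2, only the misses consume DRAM bandwidth,
# so the card behaves as if its 128-bit bus were wider.

def effective_bandwidth(raw_gb_s: float, l2_hit_rate: float) -> float:
    """Raw DRAM bandwidth in GB/s; hit rate is an assumed 0-1 fraction."""
    return raw_gb_s / (1.0 - l2_hit_rate)

raw = 128 / 8 * 18  # 4060 Ti: 128-bit bus * 18 Gbps GDDR6 = 288 GB/s
for hit_rate in (0.30, 0.40, 0.50):  # illustrative hit rates, not Nvidia's
    print(f"hit rate {hit_rate:.0%}: ~{effective_bandwidth(raw, hit_rate):.0f} GB/s effective")
```

Whether real games actually see hit rates that high is exactly what independent testing will have to show.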
 
We used to give AMD crap for rebadging, but honestly, Nvidia's lineup would have been better as mostly a rebadge. Aside from the 4090 and the 4080, the 4070 Ti could have just been a 3090, the 4070 could have been a 20 GB 3080, the 4060 Ti could have been a 16 GB 3070 Ti, and the 4060 could have been 8 GB and 16 GB versions of the 3060 Ti, all with improved clocks and reduced power.
More like they just shifted the entire product stack up two spots. The 4080 is priced like a 4080 Ti but should be a 4070 Ti at best.
 
This slide (via Videocardz) is pure marketing to make you ignore that, without frame generation tech, this ain't much of an upgrade over a 3060 or 2060 Super, certainly not over any 8 GB version, for the same price 2-4 years later.

[Image: Nvidia RTX 4060 Ti DLSS 3 performance slide, via Videocardz]
The fact that they are comparing a 4060ti to a 2060 Super is laughable at best. It would have been like comparing an RTX 2070 to a GTX 970.
 
Launching the 4060 at $300, when the 3060 launched at $370 in today's dollars, is a testament that things did not go at all as planned back when the 4080 was slated to launch at $1,200.

That much of a price cut shows they know very well this is not powerful enough and that sales numbers have not been good.

Nvidia's cherry-picked 4060 Ti numbers would not have been especially spectacular even as benchmark figures for the $300 4060, but considering that no new 3060 Ti seems to sell under $390, it would have been a game changer to have something sometimes a bit better, sometimes vastly better (Metro Exodus), at $300.
 
What's so confusing about this? The 4070 is a 192-bit card. It can only be 12 GB or 24 GB. Since 24 GB would be overkill, 12 GB was used. The 4060 Ti is a 128-bit card. It can only be 8 GB or 16 GB. Since 8 GB is too little (especially for the ultra-settings whores), 16 GB was given as an option.

I expect this question to pop up in YouTube comments, but I am always surprised to see it on a tech forum.

We didn't see this happening much prior to GDDR6, since manufacturers typically gave more than enough VRAM; i.e., they could have released a 4 GB GTX 1070 and it would have done just fine in 95% of gaming scenarios, but the cost savings likely weren't worth it.
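
To make the density arithmetic above concrete, here is a quick sketch. The assumption (true for the GDDR6/6X parts these cards use) is one chip per 32-bit channel, 1 GB or 2 GB per chip, optionally doubled in clamshell mode:

```python
# Why a 128-bit card is "8 GB or 16 GB" and a 192-bit card "12 GB or 24 GB":
# each 32-bit channel carries one GDDR6/6X chip (two in clamshell mode), and
# current chips come in 1 GB or 2 GB densities. Those densities are the
# assumption here; they match what shipped on these cards.

def capacity_options(bus_width_bits: int, chip_gb=(1, 2), sides=(1, 2)):
    channels = bus_width_bits // 32  # one chip per 32-bit channel
    return sorted({channels * density * s for density in chip_gb for s in sides})

for bus in (128, 192, 256):
    print(f"{bus}-bit bus -> {capacity_options(bus)} GB")
# 128-bit -> [4, 8, 16] GB (4 GB would need 1 GB chips nobody specs anymore)
# 192-bit -> [6, 12, 24] GB
# 256-bit -> [8, 16, 32] GB
```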
I'll be honest, I don't really know about the technicalities behind why these things are done. I just know from an end user point of view it just looks silly. Maybe they should have used a different bus.
 
I just know from an end user point of view it just looks silly.
Yes, it must look strange. To many, 12 GB of GDDR6X going over 500 GB/s will not feel superior to 16 GB going at just 288 GB/s on a 4060 Ti, even though it should: a 760 GB/s 10 GB 3080 would eat a hypothetical 360 GB/s 12 GB 3080 at 4K in all relevant scenarios, if both existed. The main number, the one attached to the card's name and printed big on the box, being bigger is what wins people over.

It certainly makes the lineup less clean to go up and down like that. There is a niche ("I want a lot of VRAM, I want CUDA or OptiX, but not necessarily much compute or bandwidth, or I don't have the budget for both") that I imagine is happy that a lower-end card with more VRAM than the mid-range exists, but it must be quite rare.

Maybe they should have used a different bus.
I wonder if GDDR7 would offer some fix. Would different density options become possible, or something of the sort? Or would the bandwidth gains be high enough to make a SKU lineup that always just goes up, even as the bus gets narrower and narrower?
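
For reference, the raw bandwidth numbers being thrown around fall straight out of bus width times data rate. A quick sketch, with the per-pin data rates taken from each card's published memory spec:

```python
# Raw DRAM bandwidth = (bus width in bits / 8) * per-pin data rate in Gbps.
# Data rates below are each card's published GDDR6/6X memory speed.

def bandwidth_gb_s(bus_bits: int, data_rate_gbps: float) -> float:
    return bus_bits / 8 * data_rate_gbps

cards = {
    "RTX 3080 10GB": (320, 19.0),  # GDDR6X -> 760 GB/s
    "RTX 3060 Ti":   (256, 14.0),  # GDDR6  -> 448 GB/s
    "RTX 4060 Ti":   (128, 18.0),  # GDDR6  -> 288 GB/s
}
for name, (bus, rate) in cards.items():
    print(f"{name}: {bandwidth_gb_s(bus, rate):.0f} GB/s")
```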
 
I'll be honest, I don't really know about the technicalities behind why these things are done. I just know from an end user point of view it just looks silly. Maybe they should have used a different bus.
Short answer: cost. Bus lanes are pricey. Good explanation here:
https://www.nvidia.com/en-us/geforce/news/rtx-40-series-vram-video-memory-explained/

They say hybrid memory configurations are impossible (e.g., 12 GB on a 128-bit bus by making half the chips 2 GB and half 4 GB), but that's simply not true, as current consoles do this and past GPUs have done it (the 560 Ti, if I remember correctly). It may be tougher to do on this type of architecture, though.
 
Yes, it must look strange. To many, 12 GB of GDDR6X going over 500 GB/s will not feel superior to 16 GB going at just 288 GB/s on a 4060 Ti, even though it should: a 760 GB/s 10 GB 3080 would eat a hypothetical 360 GB/s 12 GB 3080 at 4K in all relevant scenarios, if both existed. The main number, the one attached to the card's name and printed big on the box, being bigger is what wins people over.

It certainly makes the lineup less clean to go up and down like that. There is a niche ("I want a lot of VRAM, I want CUDA or OptiX, but not necessarily much compute or bandwidth, or I don't have the budget for both") that I imagine is happy that a lower-end card with more VRAM than the mid-range exists, but it must be quite rare.


I wonder if GDDR7 would offer some fix. Would different density options become possible, or something of the sort? Or would the bandwidth gains be high enough to make a SKU lineup that always just goes up, even as the bus gets narrower and narrower?

Likely yes, just as it worked out neatly for GDDR5: the xx60 series likely 128-bit with 16 GB, the xx70 series 192-bit with 24 GB, and the xx80 series 256-bit with 32 GB. Then again, perhaps they would have an xx70 Ti with 256-bit and 16 GB (see the sketch below).

I think there is just a much wider gamut of VRAM demands among users when you factor in things like RTX and various resolutions/settings. For a similar price, some get much better use out of a 12 GB 3060 with settings maxed out at 1080p, while others would rather run 4K medium on their TV, where the 8 GB of a 3060 Ti is sufficient.

Years back, there just wasn't as much overlap. The hierarchy was closer to 1080p high, 1080p ultra/1440p high, and 1440p ultra, which worked out neatly for the 6 GB, 8 GB, and 11 GB of the GTX 1060, 1070, and 1080 Ti.
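
On the GDDR7 density question: if GDDR7 does ship a 3 GB (24 Gbit) chip alongside 2 GB, as has been rumored, the same channel arithmetic opens up in-between capacities without clamshell. A speculative sketch; the 3 GB density is an assumption about future parts:

```python
# Speculative: if GDDR7 adds a 3 GB (24 Gbit) density alongside 2 GB, narrow
# buses get in-between capacities without clamshell. Pure assumption here.

def capacities(bus_bits: int, densities_gb=(2, 3)):
    channels = bus_bits // 32  # one chip per 32-bit channel, single-sided
    return sorted({channels * d for d in densities_gb})

for bus in (128, 192, 256):
    print(f"{bus}-bit GDDR7 -> {capacities(bus)} GB")
# 128-bit -> [8, 12] GB, 192-bit -> [12, 18] GB, 256-bit -> [16, 24] GB
```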
 
They say hybrid memory configurations are impossible (e.g., 12 GB on a 128-bit bus by making half the chips 2 GB and half 4 GB), but that's simply not true, as current consoles do this and past GPUs have done it (the 560 Ti, if I remember correctly). It may be tougher to do on this type of architecture, though.
According to ChatGPT (and thus Reddit posts), if you mix densities, only the first 1 GB of each 2 GB chip gets full bandwidth; the rest drops to half bandwidth, which could be too slow:
https://en.wikipedia.org/wiki/Data_striping

The GeForce GTX 970 did this, and it created a situation where the first 3.5 GB and the remaining 0.5 GB of the 4 GB were used quite differently:
https://techreport.com/review/nvidia-the-geforce-gtx-970-works-exactly-as-intended/

One would assume that half bandwidth up close is still better than going out to system RAM or disk, but half the bandwidth of a 4060 would start to be really slow.
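
A toy model of that striping behavior: reads stripe across all chips only up to the smallest chip's capacity, and addresses above that are served by fewer chips. The chip mix below is a hypothetical 128-bit card, not any shipping product:

```python
# Toy model of mixed-density striping: reads stripe across every chip up to
# the smallest chip's capacity; addresses above that live only on the larger
# chips, so fewer 32-bit channels serve them. Hypothetical 128-bit card.

def bandwidth_by_region(chips_gb, per_chip_gb_s=72.0):
    """chips_gb: per-chip capacities on the bus (one chip per channel).
    72 GB/s per chip = 32-bit channel * 18 Gbps / 8."""
    small = min(chips_gb)
    big = [c for c in chips_gb if c > small]  # chips with leftover space
    return [
        (f"first {small * len(chips_gb)} GB", per_chip_gb_s * len(chips_gb)),
        (f"last {sum(c - small for c in big)} GB", per_chip_gb_s * len(big)),
    ]

# e.g. two 2 GB and two 4 GB chips on a 128-bit bus (12 GB total):
for region, bw in bandwidth_by_region([2, 2, 4, 4]):
    print(f"{region}: ~{bw:.0f} GB/s")
# first 8 GB: ~288 GB/s (all four chips), last 4 GB: ~144 GB/s (two chips)
```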
 
All said, I think the 4060 is a solid deal at $300. At $400, the 4060ti seems a bit steep for a minor bump in performance and no extra vram. Paying another $100 on top of that for the 16 GB version is even worse.
 
All said, I think the 4060 is a solid deal at $300. At $400, the 4060ti seems a bit steep for a minor bump in performance and no extra vram. Paying another $100 on top of that for the 16 GB version is even worse.
With +18% performance over the 3060, it's slower than the 3060 Ti.
2.5 years later and it doesn't even move up a full tier. And they cut the VRAM.
 
it's slower than the 3060 Ti.
2.5 years later and it doesn't even move up a full tier. And they cut the VRAM.
The 3060 Ti was an 8 GB card as well, if you mean the amount. If you mean the VRAM bandwidth, then yes, it dropped by a good amount: 288 GB/s, down from 448.

Considering there seems to be no 3060 that sells under $310 (even the Zotac special; the Inno3D is $330 on Newegg and the cheapest Asus is $350), an actual $300 price tag would automatically be an improvement in performance per dollar for Nvidia, regardless of the performance jump, as it would be genuinely cheaper.

If it performs like a 3060 Ti, it would even beat the rare 6650 XT and 6700 offered under the $300 mark. Not that raising performance per dollar for a new xx60 card shouldn't obviously be what happens, but for many previous launches that was not the case; they simply offered a new product at a different price point with similar, not better, perf per dollar.
 
The 3060 Ti was an 8 GB card as well, if you mean the amount. If you mean the VRAM bandwidth, then yes, it dropped by a good amount: 288 GB/s, down from 448.

Considering there seems to be no 3060 that sells under $310 (even the Zotac special; the Inno3D is $330 on Newegg and the cheapest Asus is $350), an actual $300 price tag would automatically be an improvement in performance per dollar for Nvidia, regardless of the performance jump, as it would be genuinely cheaper.

If it performs like a 3060 Ti, it would even beat the rare 6650 XT and 6700 offered under the $300 mark. Not that raising performance per dollar for a new xx60 card shouldn't obviously be what happens, but for many previous launches that was not the case; they simply offered a new product at a different price point with similar, not better, perf per dollar.
To put things in perspective, 8 GB is still double that of the 1650 Super, which still managed in most cases, and the same as the 3070 Ti and 2080 Super, both of which are faster cards.

While a handful of recent titles have gone berserk on VRAM when cranked, you need to zoom out the graph to see the trend. This card should be solid in the mainstream for years to come.
 
The 4060 is not too far off from the low-profile 75 W mark. It would be nice to finally have a 1050 Ti/1650 replacement.

Both the RX 7600 and the 4060 are 128-bit, so either would work.
 
Videocardz has leaked synthetic benchmark scores for the 4060 Ti 8 GB and the 7600.

The 7600 performs more like a 6650 XT but is still estimated to be better than the 4060 non-Ti in raster. Nvidia still has the feature advantage over AMD, so AMD should not be able to price it at more than $300.

But seeing that the 6700 10 GB goes for $270-$280 and the 7600 would be roughly in the same class, the market price should fall rapidly to the same level if AMD launches at $300.

And if AMD still has lots of 66xx stock left, then a realistic price for the 7600 would be $250-$260.

https://m-3dcenter-org.translate.go...2023?_x_tr_sl=auto&_x_tr_tl=en&_x_tr_hl=en-GB
 
Videocardz has leaked synthetic benchmark scores for the 4060 Ti 8 GB and the 7600.

The 7600 performs more like a 6650 XT but is still estimated to be better than the 4060 non-Ti in raster. Nvidia still has the feature advantage over AMD, so AMD should not be able to price it at more than $300.

But seeing that the 6700 10 GB goes for $270-$280 and the 7600 would be roughly in the same class, the market price should fall rapidly to the same level if AMD launches at $300.

And if AMD still has lots of 66xx stock left, then a realistic price for the 7600 would be $250-$260.

https://m-3dcenter-org.translate.go...2023?_x_tr_sl=auto&_x_tr_tl=en&_x_tr_hl=en-GB
The problem with those 8 GB cards is that they will be obsolete pretty fast. I got a 12 GB 6750 XT a few months ago for probably just a bit more than I would have paid for a 4060 non-Ti, and rendering at 4K it uses close to 10 GB of VRAM in quite a few games, and I don't even play really GPU-intensive games on that computer (mostly platformers and driving games). Even my 10 GB 3080 will most likely start struggling at 1440p in the near future due to being VRAM-limited. Getting an 8 GB card will generally mean lowering textures and hoping it runs OK at anything above 1080p, and it might even require lower-res textures at 1080p within a year or two. This will start getting pretty bad once PS4 and Xbox One support gets fully dropped and titles start targeting the newer consoles, which have fast storage and more than 8 GB of VRAM available.

Basically, 8 GB cards will be similar to the 3 GB version of the 1060 or the 1.5 GB GTX 580, struggling pretty fast due to not enough RAM. I had the most expensive GTX 580 but had to upgrade it after just 1.5 years due to constantly running out of VRAM. The card I upgraded to was top of the line, with more VRAM but only 25% or so faster, since my 580 could run a 20% OC permanently. They should put 12 GB on the 7600 and 4060 cards, with 16 GB being the standard for the 7700 XT/4070 class, but they want higher profit margins and to give consumers less performance so they have to upgrade sooner. We have had 8 GB of VRAM as the standard for six or so years, with AMD moving past it last gen and Nvidia this gen, but price/performance hasn't moved since last gen; the cards just got more expensive and the names moved up several tiers.
 
The problem with those 8 GB cards is that they will be obsolete pretty fast. I got a 12 GB 6750 XT a few months ago for probably just a bit more than I would have paid for a 4060 non-Ti, and rendering at 4K it uses close to 10 GB of VRAM in quite a few games, and I don't even play really GPU-intensive games on that computer (mostly platformers and driving games). Even my 10 GB 3080 will most likely start struggling at 1440p in the near future due to being VRAM-limited. Getting an 8 GB card will generally mean lowering textures and hoping it runs OK at anything above 1080p, and it might even require lower-res textures at 1080p within a year or two. This will start getting pretty bad once PS4 and Xbox One support gets fully dropped and titles start targeting the newer consoles, which have fast storage and more than 8 GB of VRAM available.

Basically, 8 GB cards will be similar to the 3 GB version of the 1060 or the 1.5 GB GTX 580, struggling pretty fast due to not enough RAM. I had the most expensive GTX 580 but had to upgrade it after just 1.5 years due to constantly running out of VRAM. The card I upgraded to was top of the line, with more VRAM but only 25% or so faster, since my 580 could run a 20% OC permanently. They should put 12 GB on the 7600 and 4060 cards, with 16 GB being the standard for the 7700 XT/4070 class, but they want higher profit margins and to give consumers less performance so they have to upgrade sooner. We have had 8 GB of VRAM as the standard for six or so years, with AMD moving past it last gen and Nvidia this gen, but price/performance hasn't moved since last gen; the cards just got more expensive and the names moved up several tiers.

Well, the Series S has about 8 GB of usable VRAM for graphics, so developers will likely set that as the target for 1080p medium and mainstream cards.

The Series X has about 10 GB of usable VRAM, so closer to 1440p high / 1080p ultra. Perfect for the 6700 and other midrange cards for a while.
The extra 2 GB made a HUGE difference between the 3080 and the 3070 Ti in the latest few titles. We will see if the cache on the 8 GB 4060 Ti helps any...

12 GB will be 1440p ultra, and 16 GB for 4K ultra, though you will need some serious horsepower to back that up in some of these newer titles.
 
Ignoring the exaggerated absolute numbers, this chart shows that settings trump resolution when it comes to VRAM usage, which has typically been the case:
[Image: chart of VRAM usage across resolutions and settings]

So, if you have a 'high horsepower' 8 GB card, you could still play in the 1440p high to 4K medium range. Some just see it as obsolete when they're already unable to play their PC-precious 1080p ultra.

Either make some small tweaks to the settings (many of which are placebo anyway) or take a bath on that $700+ 8 GB card; it's really that simple.
 
Well, the Series S has about 8 GB of usable VRAM for graphics, so developers will likely set that as the target for 1080p medium and mainstream cards.

The Series X has about 10 GB of usable VRAM, so closer to 1440p high / 1080p ultra. Perfect for the 6700 and other midrange cards for a while.
The extra 2 GB made a HUGE difference between the 3080 and the 3070 Ti in the latest few titles. We will see if the cache on the 8 GB 4060 Ti helps any...

12 GB will be 1440p ultra, and 16 GB for 4K ultra, though you will need some serious horsepower to back that up in some of these newer titles.
The Series S is a mistake IMO, but it is a fixed target, so the devs can do special tricks for it and optimize. The Series X is also a fixed target with a fast drive, so they can optimize texture streaming, etc., which helps it a bit. PCs are not a fixed target and see less optimization from the devs.

Having a card that must run medium at 80 fps due to VRAM, when the GPU itself is capable of running 60 fps at high or 45 fps at ultra, is when your card starts to become obsolete IMO.
 
Obsolete because it 'can't run 45 fps ultra' since VRAM is holding it back? Are you an AI?
If you are fine with running medium on a card whose GPU is capable of running high or ultra but can't because a lack of VRAM causes stuttering, then feel free to get an 8 GB card. I will stay away from those 8 GB cards.
 
If you are fine with running medium on a card whose GPU is capable of running high or ultra but can't because a lack of VRAM causes stuttering, then feel free to get an 8 GB card. I will stay away from those 8 GB cards.

While ultra has been a problem for 8 GB, show me ONE example where high has been a problem for ANY VRAM amount at PLAYABLE settings; i.e., not 20 fps lows / 25 fps average with 16 GB versus only 5 fps lows / 25 fps average with 8 GB. Both are unplayable.

Even the premium consoles use 1440p-and-up high settings instead of VRAM-heavy 1080p ultra, despite having enough VRAM and horsepower for both.
 
While ultra has been a problem for 8 GB, show me ONE example where high has been a problem for ANY VRAM amount at PLAYABLE settings; i.e., not 20 fps lows / 25 fps average with 16 GB versus only 5 fps lows / 25 fps average with 8 GB. Both are unplayable.

Even the premium consoles use 1440p-and-up high settings instead of VRAM-heavy 1080p ultra, despite having enough VRAM and horsepower for both.
Timestamped -

I suppose you could turn off RT on your brand-new 4060 Ti that costs $400... even though you're still playing at 1080p.
 
When is 4090 Ti launching?

Technically it's been out for six months, just as a pro SKU. Nvidia will launch it whenever they feel like it. (Which, given the current lack of demand in gaming and the ample demand in AI research and ML, is probably a while.)
 
I've updated a chart I made a while back to include the 4060 and 4060 Ti. It's pretty obvious visually when compared to past generations; I'm surprised more people haven't caught on. The Y-axis is the percentage of CUDA cores relative to the full die of that architecture.
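
The gist of that chart can be reproduced from published core counts. A small sketch; the counts are the commonly cited figures, and using each generation's biggest die as the baseline is my reading of the chart's method, not necessarily the original's:

```python
# Percent of CUDA cores each mainstream card has relative to the biggest die
# of its generation. Core counts are the commonly cited figures; using the
# flagship die as the baseline is an assumption about the chart's method.

full_die = {"GP102 (Pascal)": 3840, "TU102 (Turing)": 4608,
            "GA102 (Ampere)": 10752, "AD102 (Ada)": 18432}

cards = [("GTX 1060 6GB", 1280, "GP102 (Pascal)"),
         ("RTX 2060",     1920, "TU102 (Turing)"),
         ("RTX 3060 Ti",  4864, "GA102 (Ampere)"),
         ("RTX 4060 Ti",  4352, "AD102 (Ada)")]

for name, cores, die in cards:
    print(f"{name}: {100 * cores / full_die[die]:.0f}% of {die}")
# 1060: 33%, 2060: 42%, 3060 Ti: 45%, 4060 Ti: 24% -- the slide down is stark
```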
Thanks for this, puts things in the proper perspective. They're gouging harder than I'd realized.
 
Timestamped -

I suppose you could turn off RT on your brand-new 4060 Ti that costs $400... even though you're still playing at 1080p.

They're showing problems running at settings that are not appropriate for the card, so of course there are going to be issues. If you're having texture issues, then you obviously need to turn the texture detail down. I don't recall anybody ever bitching about this so much in the past.
 
They're showing problems running at settings that are not appropriate for the card, so of course there are going to be issues. If you're having texture issues, then you obviously need to turn the texture detail down. I don't recall anybody ever bitching about this so much in the past.

Expectations on performance tend to be set by the price demanded for the product. It also doesn't help when the new card looks weak against the previous generation card. Overall I think most people are frustrated with the prices being demanded vs the performance given.
 
They're showing problems running at settings that are not appropriate for the card, so of course there are going to be issues. If you're having texture issues, then you obviously need to turn the texture detail down. I don't recall anybody ever bitching about this so much in the past.
It's a $400 GPU that can struggle at 1080p and sometimes loses to the 3060 Ti.
 
It's a $400 GPU that can struggle at 1080p and sometimes loses to the 3060 Ti.
Exactly. The 60-class cards over the last few generations were touted as 1440p cards too; now here we are with a 60-class Ti "1080p" card where in some games you need to turn down settings even at that low res. The card is a goddamn failure at that price, and these clowns need to stop making excuses for it.
 
They're showing problems running at settings that are not appropriate for the card, so of course there are going to be issues. If you're having texture issues, then you obviously need to turn the texture detail down. I don't recall anybody ever bitching about this so much in the past.
The timestamp shows 1080p high settings (not ultra) with RT, which the card can run just fine at 70 fps, until it hits a VRAM issue and a massive FPS drop or "stutter."

Justify too little VRAM all you want. I won't.
 