4060 Ti 16GB due to launch next week (July 18th)

Marees

The source article describes Nvidia's partners as "hesitant" to participate in reviews of the board, as it will likely be pummeled as an even worse value than its 8GB sibling. That's because although it has double the memory, it's still kneecapped by its 128-bit memory bus and x8 PCIe 4.0 interface instead of the full 16 lanes.
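For a rough sense of how much those two cuts matter, here's a quick back-of-envelope comparison; the per-pin data rates and per-lane PCIe figures below are my own assumptions pulled from public spec sheets, not from the article:

```python
# Back-of-envelope bandwidth comparison (figures assumed from public specs, not the article)

def mem_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s = (bus width / 8 bits per byte) * per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

print(mem_bandwidth_gbs(128, 18.0))  # RTX 4060 Ti (8GB or 16GB): 128-bit, ~18 Gbps GDDR6 -> ~288 GB/s
print(mem_bandwidth_gbs(256, 14.0))  # RTX 3060 Ti: 256-bit, ~14 Gbps GDDR6               -> ~448 GB/s

# PCIe 4.0 is roughly 2 GB/s per lane per direction after encoding overhead
print(8 * 2.0)   # x8  -> ~16 GB/s
print(16 * 2.0)  # x16 -> ~32 GB/s
```

The extra 8GB of VRAM changes neither number, which is exactly the reviewers' complaint.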

Hardware Unboxed compares the situation to the silent launch of the RTX 3080 12GB, which occurred at the peak of the pandemic-related GPU shortage, and was seen as a mere cash grab at the time. No review samples were sent to press for that card either, and it never even had an official MSRP, as that didn't matter at the time.

https://www.extremetech.com/gaming/...egedly-wont-be-sending-16gb-rtx-4060-ti-cards

Word about Nvidia's alleged plans, or lack thereof, comes from the popular YouTube channel Hardware Unboxed, a huge player in the hardware review space. The channel wrote on Twitter that there is no review plan for the 16GB RTX 4060 Ti at all. The company isn't making Founders Edition cards as it did with the 8GB version, so it's left up to its partners, which include companies like Asus, MSI, etc. However, when Hardware Unboxed contacted those reps, not a single one said they were sending a card for review. None of them even seemed to know when they were getting cards.

Also, according to Videocardz, add-in board (AIB) partners have to submit the names of sites and reviewers to Nvidia for approval, an interesting ripple demonstrating Nvidia's level of control over its partners. Though Nvidia isn't officially involved in the review process, it controls when embargoes lift and, apparently, who is allowed to review its GPUs. This sounds like the same situation that caused EVGA to quit selling Nvidia GPUs last year.
 
Can we expect AMD to announce the 48 CU 12GB & 60 CU 16GB Navi 32 cards to coincide with this release by Nvidia!? Or at least release a 16GB 7600 🤔

Or better still, bring down the price of the 6800/6800 XT to $400/$450??
 
Can we expect AMD to announce the 48 CU 12GB & 60 CU 16GB Navi 32 cards to coincide with this release by Nvidia!? Or at least release a 16GB 7600 🤔

Or better still, bring down the price of the 6800/6800 XT to $400/$450??
I think AMD is still playing the same game with Nvidia, in that they want to see what Nvidia will do first. The RX 7600 is a trash GPU; even AMD admitted they had to lower its price before release. The 6650 XT performs better while being around $250, much like how the RTX 3060 Ti performs better than the 4060 Ti and is cheaper too. I think it's clear that both AMD and Nvidia don't want to move the bar higher than last gen, so consumers don't gain much value from the current situation. You can buy a 6800 XT for $480 right now on Amazon, so it would be worth it for AMD to further lower prices just to make it official.
 
No, but the problem is the 128-bit bus and 16GB of VRAM - there's no real reason that such a neutered GPU would need 16GB, since it doesn't have the horsepower to push anything that would eat up that much frame buffer.
Would be interesting to see a comparison of the 8GB & 16GB 4060 Ti cards at 1080p ultra in VRAM-heavy games.
 
The specs are so close to the 8GB 4060 Ti (are they otherwise exactly the same?) that a review wouldn't have much value, and I don't expect anyone will wait in line for a 4060 Ti; they could wait a couple of days after launch for the reviews telling us it's a terrible price point.
 
And it probably costs too much for a lot of pre-PCIe 4.0 builds; maybe some people who upgraded a weaker CPU to a 9900K-10900K when those got cheaper could have been in the market.
 
No, but the problem is the 128-bit bus and 16GB of VRAM - there's no real reason that such a neutered GPU would need 16GB, since it doesn't have the horsepower to push anything that would eat up that much frame buffer.
Yes, and I agree, which is why I was asking only about the PCIe interface size.
 
IMHO, the best thing Nvidia can do right now brand-wise is "stop releasing consumer stuff".

People can only stand so much rotting garbage at any given time. Eventually, they put back on their clothes and climb out of bed.

If I were Nvidia, I'd give it a break.
 
If I were Nvidia, I would finish out the full stack with the 4050, then create a Pascal moment with the 5000 series, launched almost as soon as GDDR7 supply can be assured and the TSMC node is ready.


In the latest TPU test suite with updated drivers, the 4080 destroyed the 3080 at 1080p and 1440p:

[chart: TPU relative performance at 2560x1440]


That's a 50% performance increase despite going from a 628 mm², 320-bit 3080 to a 380 mm², 256-bit card. I suspect they already have the tech to make an impressive offer across the whole stack at interesting prices; that kind of 50% generational jump was more than enough. I mean, the 4080 is 2.4x the 2080 here, and the architecture choices must have made better pricing possible.

GDDR7 gives you what, about a 50% bandwidth increase even if you keep those cheaper bus widths, which is perfectly fine (even great) for a gen-on-gen memory bandwidth jump. Memory prices are low, so they could go with 16GB (like the 4080) as standard and 20-24GB for the higher models, if Blackwell doesn't make them a better choice for AI; at some point the pro offering will need to compete with the 4080-4090 anyway.
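Rough math on that claim, taking the per-pin data rates as assumptions (roughly 21 Gbps for today's GDDR6X, ~32 Gbps for early GDDR7):

```python
# Gen-on-gen bandwidth estimate at the same bus width (per-pin data rates are assumptions)

def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

gddr6x = bandwidth_gbs(256, 21.0)  # 256-bit bus, ~21 Gbps GDDR6X -> ~672 GB/s
gddr7  = bandwidth_gbs(256, 32.0)  # same bus,    ~32 Gbps GDDR7  -> ~1024 GB/s

print(f"{gddr7 / gddr6x - 1:.0%}")  # ~52% more bandwidth without widening the bus
```

So the ballpark 50% figure holds at those assumed data rates, even without touching the bus width.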

With that kind of performance boost they could keep the high prices alive. Until the PS6 launches, a 16GB 5060 that is +90% over a 4060 (somewhere between a 4070 and a 4070 Ti) at $330 would put enough distance between it and the consoles to be worth it for many people on a 2070 Super or below.

I imagine that during 2024 they should enter an era where they can no longer sell everything they make as AI GPUs, so it wouldn't be bad timing for a Pascal-type launch.
 
I seriously hope this card gets a boring AF name.
RTX4060TI 16GB edition

This would run circles around the A2000 with a power envelope I could work with, and it has enough VRAM to meet my requirements. It's just a matter of whether the creative drivers work with the software, and that's an easy enough email to send.

Would I buy this for gaming, oh hell no, but as a cheap alternative to an A4000, Omnissiah willing, you bet!
 
I seriously hope this card gets a boring AF name.
RTX4060TI 16GB edition

This would run circles around the A2000 with a power envelope I could work with, and it has enough VRAM to meet my requirements. It's just a matter of whether the creative drivers work with the software, and that's an easy enough email to send.

Would I buy this for gaming, oh hell no, but as a cheap alternative to an A4000, Omnissiah willing, you bet!
Nope, it's going to be called RTX 4060 Ti 16GB XXXtreme Gamerz l33t 3D1T10N HAXX0R.
This way when accounting sees it they will tell you no. :D
 
I think they will have a clean-looking ProArt edition now, with "Pro" in the name:

[image: ProArt RTX 4060 Ti card]


A simple, perfectly bland-looking one called:
  • DUAL-RTX4060TI-16G
[image: DUAL RTX 4060 Ti 16G card]


That screams MSRP at launch as well.

A lot of AI models need over 12GB of VRAM, with 6-7 billion weights being quite common, so it could be a nice alternative to a used 3090.
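For scale, a quick weights-only estimate; the bytes-per-weight figures are assumptions about precision, and real use adds activation/KV-cache overhead on top:

```python
# Weights-only VRAM estimate for a language model (ignores activation/KV-cache overhead)

def weights_vram_gb(params_billion: float, bytes_per_weight: float) -> float:
    return params_billion * 1e9 * bytes_per_weight / 1024**3

print(weights_vram_gb(7, 2))   # 7B at fp16:  ~13.0 GB -> too big for 12GB, fits in 16GB
print(weights_vram_gb(7, 1))   # 7B at int8:  ~6.5 GB
print(weights_vram_gb(13, 2))  # 13B at fp16: ~24.2 GB -> used 3090/4090 territory
```

That 12-16GB gap is exactly where a 7B-class model at fp16 lands, which is what makes the 16GB card interesting for this despite the narrow bus.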
 
Would I buy this for gaming, oh hell no, but as a cheap alternative to an A4000, Omnissiah willing, you bet!
Man, you need to have a meeting with the auditors and sit their asses down in front of a browser with Newegg open to the GPU page. "Find me something that doesn't scream gamer or cost 10x the price!"
 
No, but the problem is the 128-bit bus and 16GB of VRAM - there's no real reason that such a neutered GPU would need 16GB, since it doesn't have the horsepower to push anything that would eat up that much frame buffer.
There are actually a few games where RT on a 4060 fails terribly... like 8 fps lows, while the 3060 12GB was still able to hit the 40 fps range. That is 100% running out of video RAM.

Not that I am suggesting a 4060 16GB is a good purchase. lol It's going to help out in a couple of cases where the real answer is don't buy a 4060 to do ray tracing to begin with.

Considering the state of the market, I don't think we are getting anything decent GPU-wise for a good while now. What we have is probably more or less what we are going to see for the next year, year and a half.
 
There are actually a few games where RT on a 4060 fails terribly... like 8 fps lows, while the 3060 12GB was still able to hit the 40 fps range. That is 100% running out of video RAM.

Not that I am suggesting a 4060 16GB is a good purchase. lol It's going to help out in a couple of cases where the real answer is don't buy a 4060 to do ray tracing to begin with.

Considering the state of the market, I don't think we are getting anything decent GPU-wise for a good while now. What we have is probably more or less what we are going to see for the next year, year and a half.

I agree...4060/Ti feels like a stopgap filler between last gen and next gen.
"It's there if you really need it and we're happy to take your money, but wait if you can."
 
Man, you need to have a meeting with the auditors and sit their asses down in front of a browser with Newegg open to the GPU page. "Find me something that doesn't scream gamer or cost 10x the price!"
Government auditors don't care what I sit them down and tell them. The whole problem stems from past incidents where employees may have misappropriated some funds and bought themselves some swanky gaming PCs on the government's dime, so their rule is "flag it all and bury the bastards in paperwork so they learn not to try it again".
 
Outside the 4090, that's my perspective on this entire generation.
I hate to like this comment because it is sad and depressing but terribly accurate.
I blame TSMC for their nearly 100% price increase since 2019, and Samsung for not keeping up and providing the competition TSMC needed, which allowed those massive price hikes.
 
The 4090 and 7900 XTX give good performance for the price; everything else has been pretty awful. The 6700 XT was a better purchase a year ago than anything near it now. The fact that, with a specific (awful) sale on Newegg, a 7900 XTX was only $800 today makes the 4060 Ti 16GB at $500 worse than garbage value.
 
I hate to like this comment because it is sad and depressing but terribly accurate.
I blame TSMC for their nearly 100% price increase since 2019, and Samsung for not keeping up and providing the competition TSMC needed, which allowed those massive price hikes.
Nah, you can't blame the fabs. They are turning out tons of silicon with, by all reports, very low defect rates.
The bottom line is... the 4090 silicon is already cut down. It's not a 100% working chip; those go to the AI SKUs. It was funny, out of the gate with the 4090 there were all the rumors of Nvidia hoarding the golden samples for a 4090 Ti. Nah, they've just been selling them into their AI market at 3x the price.
Every chip down from there is a cut-down chip with very, very nice, profitable yields.

Unless AMD or Intel really push Nvidia at some point, cast-offs and cut-down 128-bit memory interface chips (40% more chips per wafer) are all we will get. (And Nvidia will be charging top dollar for them anyway.)
 
I think AMD is still playing the same game with Nvidia, in that they want to see what Nvidia will do first. The RX 7600 is a trash GPU; even AMD admitted they had to lower its price before release. The 6650 XT performs better while being around $250, much like how the RTX 3060 Ti performs better than the 4060 Ti and is cheaper too. I think it's clear that both AMD and Nvidia don't want to move the bar higher than last gen, so consumers don't gain much value from the current situation. You can buy a 6800 XT for $480 right now on Amazon, so it would be worth it for AMD to further lower prices just to make it official.
Strategy!

The next true next gen should blow the last gen away, and people will buy far more of them that way. It's like, why would I buy cheap highs from down the block when I could get the blue, 99%-pure, Walter White-grade stuff next month? (OK, bad analogy) haha
 
If people would just learn to code this wouldn't be an issue...


/S
 
Nah, you can't blame the fabs. They are turning out tons of silicon with, by all reports, very low defect rates.
The bottom line is... the 4090 silicon is already cut down. It's not a 100% working chip; those go to the AI SKUs. It was funny, out of the gate with the 4090 there were all the rumors of Nvidia hoarding the golden samples for a 4090 Ti. Nah, they've just been selling them into their AI market at 3x the price.
Every chip down from there is a cut-down chip with very, very nice, profitable yields.

Unless AMD or Intel really push Nvidia at some point, cast-offs and cut-down 128-bit memory interface chips (40% more chips per wafer) are all we will get. (And Nvidia will be charging top dollar for them anyway.)
Yeah, the 4090 Ti's were going to be made from the H100s that weren't selling. But then OpenAI hit it big out of nowhere, suddenly everybody wanted them, and now Nvidia can't make them fast enough.

But defect rates aren't the problem; it's that nobody has a node competitive with TSMC N5 and N4. Samsung is hoping to have theirs online next year, and Intel is looking about the same.
 
IMHO, the best thing Nvidia can do right now brand-wise is "stop releasing consumer stuff".

People can only stand so much rotting garbage at any given time. Eventually, they put back on their clothes and climb out of bed.

If I were Nvidia, I'd give it a break.

They have like 85% market share in a market worth billions of dollars, with limited competition, because most gamers still won't touch an AMD card, still mad about the drivers on their ATI Rage 128. Why would they walk away from that? They're still selling quite a bit of that garbage, and that's precisely the problem.
 
They have like 85% market share in a market worth billions of dollars, with limited competition, because most gamers still won't touch an AMD card, still mad about the drivers on their ATI Rage 128. Why would they walk away from that? They're still selling quite a bit of that garbage, and that's precisely the problem.
You're probably right. I did use the "bed" analogy on purpose.
 
Yeah, the 4090 Ti's were going to be made from the H100s that weren't selling. But then OpenAI hit it big out of nowhere, suddenly everybody wanted them, and now Nvidia can't make them fast enough.

But defect rates aren't the problem; it's that nobody has a node competitive with TSMC N5 and N4. Samsung is hoping to have theirs online next year, and Intel is looking about the same.
That is fair... for us gamers, though, it's also the case of Nvidia only having one master silicon: one design to service two very different markets.
AMD was wise in spinning RDNA and CDNA into their own designs. At least for gamers we can maybe get an RDNA part made on less bottlenecked fabs. (To be fair, so far we are getting the same scaled-down, lower-midrange new bits from them as well.)

I think all indications and rumors are that Nvidia's next gen is going to be one design to rule them all as well... unfortunately. Hopefully Nvidia decides after that to split their designs into GeForce and AI. The gaming chips don't need the tensor counts they are getting... nor the very latest fabs, for that matter.

Anyway, yeah, it's going to be a painful couple of years in gaming GPU hardware, I think. The demand for the good stuff Nvidia has been working on isn't going down anytime soon... I can almost see Nvidia pulling a Tesla this next gen and just skipping it for gaming releases, especially if the 4090's performance crown isn't really threatened. Sort of heading into a 1080 situation... where they will just skip a generation. It worked out for them once before.
 
They have like 85% market share in a market worth billions of dollars, with limited competition, because most gamers still won't touch an AMD card, still mad about the drivers on their ATI Rage 128. Why would they walk away from that? They're still selling quite a bit of that garbage, and that's precisely the problem.
Yeah, let's pretend that AMD didn't have months of black screen issues between 2021 and 2022. :facepalm:
 