One Weird Trick Relaunching RTX 4080 12GB as 4070 Ti

If this is true, I'm expecting to see the same outrage over the 7900XT vs the 7900XTX. Right?... right?...
Yeah sure, if the 7900 XTX is what the 7900 XT was going to be and the 7900 XT is what the 7900 was, that's quite a dickish move by team red. That said, I don't pay much attention to their video cards, other than noting they have been the ones actually lowering prices on old stock.
 
Yeah sure, if the 7900 XTX is what the 7900 XT was going to be and the 7900 XT is what the 7900 was, that's quite a dickish move by team red. That said, I don't pay much attention to their video cards, other than noting they have been the ones actually lowering prices on old stock.
Why would Nvidia lower prices when they still sell every card regardless of its price? The market has given them no reason to lower their prices. Nvidia themselves are surprised they are selling as well as they are right now.
 
Yeah sure, if the 7900 XTX is what the 7900 XT was going to be and the 7900 XT is what the 7900 was, that's quite a dickish move by team red. That said, I don't pay much attention to their video cards, other than noting they have been the ones actually lowering prices on old stock.
More like the 7900xtx is what the 7800xt should be, but yes dickish by team red indeed.
 
Based on...? It's funny that to this day people still misrepresent what the stink about the 4080 12GB was. Nvidia literally tried to say it was the same chip as the 16GB.

Exactly... historically, Nvidia has actually used the same GPU die for the XX70 series, under different model names. This time it wasn't the same die, yet they tried to keep the same 4080 model name.

The 2070 Super used the same TU104 die as the 2080 and 2080 Super; they were different models.
The 1070, 1070 Ti, and 1080 all used GP104 dies.
The 970 and 980 used GM204 dies.

I'm sure I could keep going if I looked. Even the 1060 6GB and 3GB, which had totally different specs, used the same GP106 for most of the production run.
 
If there's no associated price drop, this is DOA. They can't even sell the "normal" 4080s at MSRP.

Good grief, folks... NVIDIA messed up, but it's not some unforgivable crime. If the RTX 4070 Ti is basically a 4080 12GB at a lower price, the company will have learned its lesson.

I haven't seen anything related to price yet unless I missed something. If the price ends up being ~$900 like they started with and all they did was change the name, it's DOA. I'm sure people will still buy it (see also the absurdly priced 4080), but I won't.
 
I haven't seen anything related to price yet unless I missed something. If the price ends up being ~$900 like they started with and all they did was change the name, it's DOA. I'm sure people will still buy it (see also the absurdly priced 4080), but I won't.
I suspect there will be a lower price, design tweaks or both. If it was just a name change, it wouldn't take much more to relaunch the GPU than reprinting boxes and changing device IDs.
 
Why would Nvidia lower prices when they still sell every card regardless of its price? The market has given them no reason to lower their prices. Nvidia themselves are surprised they are selling as well as they are right now.
Is the 4080 selling well? I've only heard the opposite but I have no data points. As for the team red stuff, I agree the 7900 XT is bullshit and won't sell well.
 
So funny thing in Canada... you can actually put down a deposit at some retailers for MSRP and they will notify you when your order arrives. 4080s are in stock for most models. We couldn't get a PS5 if our lives depended on it, nor would GameStop allow a deposit for one.
 
So funny thing in Canada... you can actually put down a deposit at some retailers for MSRP and they will notify you when your order arrives. 4080s are in stock for most models. We couldn't get a PS5 if our lives depended on it, nor would GameStop allow a deposit for one.
That's so wild they're still in such high demand, it's been TWO years. Are people having a hard time getting a Series X? I'm assuming no.
 
That's so wild they're still in such high demand, it's been TWO years. Are people having a hard time getting a Series X? I'm assuming no.
Numbers aren't even close. There are WAY more Series X consoles sold and produced than the high-end 3080+ cards that have been impossible for normal people to get. It was reported that 20 million Turing/Ampere cards were sold, but that's every single Turing/Ampere card, not just 2080 Tis, 3080s, etc.

Also, there has been a lot of pent-up demand for these higher-end GPUs due to what happened over the past few years: the shortages, how lackluster Turing was, 4K becoming more common, etc.

TL;DR: it's easier to keep up with console demand than with high-end GPU demand.
 
I wasn't seriously paying attention. I'm not in the market for a new GPU this gen at all. What was the controversy with the 4080 vs 4090?
 
I wasn't seriously paying attention. I'm not in the market for a new GPU this gen at all. What was the controversy with the 4080 vs 4090?
There wasn't one, besides them pricing the 4080 so close to the 4090 to drive people to either buy that or old 3xxx stock.

The main controversy was Nvidia blatantly renaming what should have been the 4070 Ti to "4080" to justify a $900+ sticker, and then claiming it was the same chip as the 16GB model.

It was described as exactly the same difference as the 3080 10gb/12gb. This was false.
 
That's so wild they're still in such high demand, it's been TWO years. Are people having a hard time getting a Series X? I'm assuming no.
Series X consoles have been easy to get in windows recently, but now with the holiday season it seems you must work to find them again.

I wasn't seriously paying attention. I'm not in the market for a new GPU this gen at all. What was the controversy with the 4080 vs 4090?
It was multiple things, from price to grossly misleading marketing, egregious even for NVIDIA's marketing team:
[Attached: NVIDIA RTX 4080 performance slide]


Calling the 4080 12GB 2-4x faster than a 3080 Ti is pushing it.

In particular, with the 4080 16GB vs 12GB, it was how far down the stack they went while still using the xx80 moniker and its price on the 4080 12GB:

4090.........: AD102, 76.3 billion transistors, 16,384 cores, 384-bit bus, 1,008 GB/s bandwidth
4080 16GB: AD103, 45.9 billion transistors, 9,728 cores, 256-bit bus, 716.8 GB/s bandwidth
4080 12GB: AD104, 35.8 billion transistors, 7,680 cores, 192-bit bus, 504 GB/s bandwidth
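Those bandwidth numbers fall straight out of bus width times effective memory data rate. A quick sanity check, using the published GDDR6X data rates for these cards (21 Gbps for the 4090 and 4080 12GB, 22.4 Gbps for the 4080 16GB):

```python
# Sanity-check the memory bandwidth figures:
# bandwidth (GB/s) = effective data rate (Gbps) * bus width (bits) / 8
cards = {
    # name: (effective data rate in Gbps, bus width in bits)
    "RTX 4090":      (21.0, 384),
    "RTX 4080 16GB": (22.4, 256),
    "RTX 4080 12GB": (21.0, 192),
}

for name, (rate_gbps, bus_bits) in cards.items():
    bandwidth = rate_gbps * bus_bits / 8
    print(f"{name}: {bandwidth:.1f} GB/s")
# RTX 4090: 1008.0 GB/s
# RTX 4080 16GB: 716.8 GB/s
# RTX 4080 12GB: 504.0 GB/s
```

The 192-bit bus alone cuts the 12GB card to half the 4090's bandwidth, which is part of why the shared "4080" name grated.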
 
Series X consoles have been easy to get in windows recently, but now with the holiday season it seems you must work to find them again.


It was multiple things, from price to grossly misleading marketing, egregious even for NVIDIA's marketing team:
[Attached: NVIDIA RTX 4080 performance slide]

Calling the 4080 12GB 2-4x faster than a 3080 Ti is pushing it.

In particular, with the 4080 16GB vs 12GB, it was how far down the stack they went while still using the xx80 moniker and its price on the 4080 12GB:

4090.........: AD102, 76.3 billion transistors, 16,384 cores, 384-bit bus, 1,008 GB/s bandwidth
4080 16GB: AD103, 45.9 billion transistors, 9,728 cores, 256-bit bus, 716.8 GB/s bandwidth
4080 12GB: AD104, 35.8 billion transistors, 7,680 cores, 192-bit bus, 504 GB/s bandwidth
You go from "grossly misleading" to "pushing it" - depending on the use case it hits that 2x-4x mark. A quick Google found 2x pretty easily. I don't think we can call NVIDIA out too much for that stuff.
 
I thought I read a rumor (a good while back) that AMD was planning a card with two compute chiplets and a big power draw like the 4090 was rumored to have.
Anyone else remember that, or am I dreaming it?
 
I thought I read a rumor (a good while back) that AMD was planning a card with two compute chiplets and a big power draw like the 4090 was rumored to have.
Anyone else remember that, or am I dreaming it?
The MI250 uses multiple compute chips, and gleefully draws up to 600W. But that's their OAM server chip, and its price tag hurts.
 
You go from "grossly misleading" to "pushing it" - depending on the use case it hits that 2x-4x mark. A quick google found 2x pretty easily. I don't think we can call NVIDIA out too much for that stuff.
Depends if we're talking about the 4080 12GB or 16GB as well. Outside of some DLSS 3.0 frame generation scenarios, I doubt the 4080 12GB was ever close to being twice as fast as the 3080 Ti, let alone 4x.
 
Good grief, folks... NVIDIA messed up, but it's not some unforgivable crime. If the RTX 4070 Ti is basically a 4080 12GB at a lower price, the company will have learned its lesson.
Nah, they didn't just "mess up", they tried to fuck us, and we caught them.

You see, nothing on the box, as far as I'm aware, tells you the CUDA core count of the graphics card. It does, however, tell you the size of the memory.

So, Nvidia tried to sell a 12GB model branded as a 4080. Cool, it's just 4GB less than the 16GB model, right? WRONG!

The 4080 12GB model had significantly less power and a lower CUDA core count than its 16GB version, but you wouldn't know that looking at the retail box; at most you would assume the 12GB and 16GB models differed only in memory size...

And that is very deceiving to the general customer.
 
Nah, they didn't just "mess up", they tried to fuck us, and we caught them.

You see, nothing on the box, as far as I'm aware, tells you the CUDA core count of the graphics card. It does, however, tell you the size of the memory.

So, Nvidia tried to sell a 12GB model branded as a 4080. Cool, it's just 4GB less than the 16GB model, right? WRONG!

The 4080 12GB model had significantly less power and a lower CUDA core count than its 16GB version, but you wouldn't know that looking at the retail box; at most you would assume the 12GB and 16GB models differed only in memory size...

And that is very deceiving to the general customer.
Then they doubled down on social media and claimed it's the same difference as 3080 10/12gb. Extremely scummy.
 
I think the pricing is off on the 7900xt, at least relative to the xtx.

To keep the same bus width and RAM count, your BOM cost has to be nearly the same as the XTX: you would have all 6 MCDs alongside the same core die. This negates one of the primary benefits of chiplets, the ability to combine cheaper cores with flexible I/O costs while still using the standard binning that's in place today.

The XT still uses the same big core chip as the XTX, so the actual cost difference between an XT and an XTX might reasonably be $100 (one 6nm MCD and 4GB of GDDR6).

Compare that to AD104, which is almost half the size of AD103; the difference between the 12GB and 16GB 4080s is much bigger than XT vs XTX, at least in actual cost to produce.
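That "$100 difference" reasoning can be sketched in a few lines; every dollar figure here is an illustrative guess, not a real BOM number:

```python
# Illustrative-only sketch of the XT-vs-XTX cost delta described above.
# All dollar figures are made-up placeholders, not real BOM costs.
mcd_cost = 40           # guessed cost of one extra 6nm memory cache die (MCD)
gddr6_cost_per_gb = 15  # guessed cost per GB of GDDR6
extra_mcds = 1          # XTX carries one more active MCD than the XT
extra_vram_gb = 4       # 24GB on the XTX vs 20GB on the XT

delta = extra_mcds * mcd_cost + extra_vram_gb * gddr6_cost_per_gb
print(f"Sketched XTX-over-XT BOM delta: ${delta}")  # → $100 with these guesses
```

The point isn't the exact numbers, it's that the big graphics die is shared, so only the cheap parts differ, unlike AD104 vs AD103, which are entirely different dies.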
 
I mean...it's true. Just in reverse. The 3080 12GB had more cores than the 3080 10GB.
Since the GeForce4 MX at least (a GeForce 2 sold with the 4 in its name), we have always had strange naming conventions.

It's not like there were many 3080 10GBs around to buy instead, and the performance difference between the two was less than 5%, a simple "GA102 yields got a bit better" type of thing. That was much less than even a 1060 6GB vs 3GB type of difference, and in a different ballpark altogether than the 4080 16GB vs 12GB.
 
AMD is playing a game, yes... with much the same goals. I think everyone can see that. However, they are playing games with 90/80 designations, not 70 designations. When you try to get people to spend over a grand for a 70-class card, you are going to get more hate than when you price an 80-class card at 900 bucks.

Yes, I would have preferred AMD name the 7900 XTX the 7900 XT, and the 7900 XT the 7800 XT. Perhaps they were really pushing to hit that $1k price tag, I don't know. Logically, yes, they should be 7900 and 7800 at $999 and $799 (or $849). Who knows, perhaps when the benchmarks drop it will make more sense; perhaps the XT isn't the massive downgrade we assume based on the specs. I guess if it's within 10% and priced 10% lower, we can't complain.

Nvidia released a 4090 and a 4080... then tried to also call the mid-range 4070 a 4080. They went and called it a Ti, it seems, which makes me doubt there will ever actually be a 4070 non-Ti. IMO it just means they don't want to admit the 4080 12GB was in fact just a 70-class card. They aren't even going to bother with 4060s and lower; they probably have enough 3000-series stock to just skip this generation completely at the real mid-range and down.
 
Who knows, perhaps when the benchmarks drop it will make more sense; perhaps the XT isn't the massive downgrade we assume based on the specs. I guess if it's within 10% and priced 10% lower, we can't complain.
AMD did release some benchmarks after the announcement:
https://www.pcgamesn.com/amd/radeon-rx-7900-xt-benchmarks

And they fall pretty much exactly where we would expect (5/6 of a 7900 XTX spec-wise, or 83% in the worst case):
Resident Evil: 83% of an XTX
COD MW2: 84%
Cyberpunk: 83.3%
Watch Dogs: 85%

Pretty much what a 5/6-enabled XTX yield card would give us.

Maybe the story changes at 1440p.

My guess: if the $1000 XTX actually exists at that price (a big if), AMD will simply not make many XTs and won't need to sell many anyway.
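The "5/6 of an XTX" expectation can be checked against those listed numbers in a couple of lines (game ratios taken from the figures above; 5/6 reflects, e.g., the 20GB vs 24GB memory configuration):

```python
# Compare the quoted 7900 XT vs XTX benchmark ratios against the 5/6 spec ratio
# (e.g. 20GB vs 24GB of memory on the XT vs XTX).
spec_ratio = 5 / 6  # ≈ 83.3%

results = {  # XT performance as a fraction of the XTX, from the post above
    "Resident Evil": 0.83,
    "COD MW2": 0.84,
    "Cyberpunk": 0.833,
    "Watch Dogs": 0.85,
}

for game, ratio in results.items():
    print(f"{game}: {ratio:.1%} of XTX (5/6 spec ratio predicts {spec_ratio:.1%})")

# Every listed result lands within ~2 points of the 5/6 expectation.
assert all(abs(r - spec_ratio) < 0.02 for r in results.values())
```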
 
AMD did release some benchmarks after the announcement:
https://www.pcgamesn.com/amd/radeon-rx-7900-xt-benchmarks

And they fall pretty much exactly where we would expect (5/6 of a 7900 XTX spec-wise, or 83% in the worst case):
Resident Evil: 83% of an XTX
COD MW2: 84%
Cyberpunk: 83.3%
Watch Dogs: 85%

Pretty much what a 5/6-enabled XTX yield card would give us.

Maybe the story changes at 1440p.

My guess: if the $1000 XTX actually exists at that price (a big if), AMD will simply not make many XTs and won't need to sell many anyway.

The XT is just something to use the defective dies on.
 
Which is often the case, but I feel the difference here comes down to how good the yield is expected to be on the XTX.
It's on a mature node and the actual die is not that big; I'm betting on there not being very many XTs and a lot of XTXs. Odds on Nvidia not reducing the price of the "4080 12GB"? I'm saying it stays firm at $799.
 
Odds on Nvidia not reducing the price of the "4080 12GB"? I'm saying it stays firm at $799.
Despite the rumored low volume (not sure how reliable those reports are, but they seem to be the consensus), it still seems possible to order an RTX 4080 online, not at MSRP, but even that availability is unheard of lately.

Depends what we mean by reducing the price. I can see them (and AIBs) cutting it, maybe not officially, but by keeping cards not much higher than $799, which would be a massive price cut versus what I imagine they planned. Not sure what announced MSRP we'd need to have new 4070 Tis selling not too much above $799, $749?

I could also see quite the price cut on the 4080 by the end of spring 2023, with them being sold at only $100 or so above the $1200 USD MSRP.
 
If I were buying anything this gen, I would buy a $1000 7900 XTX 24GB before I would buy a "4070 Ti" 12gb at ANY price. Heck, I would buy a $1500 7950 XTX before I would buy a "current gen" nvidia card with only 12gb. AMD's 6800 (non-XT) released two years ago had 16gb - that was their fourth best card of the entire last-gen. Here is Nvidia's #2 card two years later and it only has 12gb. Pathetic "4070 Ti".
 
According to Nvidia's own (generally cherry picked and BS) benchmarks the 4080 12GB was a weak **70 entry. It will be an even weaker **70Ti.

They're going to need to greatly cut the price or boost the performance, otherwise this will still be underwhelming. Lopping off $200 won't cut it. This may very well end up performing close to **60Ti territory.
 
If I were buying anything this gen, I would buy a $1000 7900 XTX 24GB before I would buy a "4070 Ti" 12gb at ANY price. Heck, I would buy a $1500 7950 XTX before I would buy a "current gen" nvidia card with only 12gb. AMD's 6800 (non-XT) released two years ago had 16gb - that was their fourth best card of the entire last-gen. Here is Nvidia's #2 card two years later and it only has 12gb. Pathetic "4070 Ti".
Meh, for 1440p and below, 12GB at that speed, when paired with fast system RAM and any of the NVMe-to-VRAM access methods, is more than enough. Adding more isn't going to improve things there to any notable degree before the chip itself becomes the bottleneck.
 
According to Nvidia's own (generally cherry picked and BS) benchmarks the 4080 12GB was a weak **70 entry. It will be an even weaker **70Ti.

They're going to need to greatly cut the price or boost the performance, otherwise this will still be underwhelming. Lopping off $200 won't cut it. This may very well end up performing close to **60Ti territory.
I recall the leaked benchmarks showing it landing between the 3080 and the 3080 Ti while using a fair bit less power, 270-ish watts. The biggest issue was that its $900 USD MSRP made it more expensive than the 3080 Tis available on the market at the time.
But if the 7900 XT is looking to beat the 3080 Ti by 30-40% and has an MSRP of $899 USD, then really anything north of $700 is going to make the 4070 Ti look very bad in comparison. Really it should be closer to $600, but that's only if the 7900 XT remains available at that $899 MSRP; it's really going to depend on what the AIBs do with them.
But I agree, the now-named 4070 Ti should probably have been named and marketed as the 4060 Ti, leaving room for something in between, because there is a very large gap between the dies used in them. But who knows what Nvidia is doing right now; their overstock of 3000-series cards has upended their entire release plans, and I'm pretty sure their marketing department spent their crypto fortune on blow and phoned this one in.
 
Despite the rumored low volume (not sure how reliable those reports are, but they seem to be the consensus), it still seems possible to order an RTX 4080 online, not at MSRP, but even that availability is unheard of lately.

Using that logic, the 3080 and 3090 were always in stock if you wanted to overpay for one. The only 4080s that are available readily are the ones that cost as much as a 4090.

Personally, I'm not going to buy a 4080 at $1500+. I'm also not going to buy a 4090 at $2000+. Two years ago, you could make a lot of your upfront costs back by mining in the off time. Now, there is no crypto mining to offset the cost.
 