Nvidia has "Unlaunched" the RTX 4080 12GB

My first thought: more like a marketing stunt to make the 30 series prices still look OK. That nobody at Nvidia realized this would go over like a lead balloon is also kind of interesting, if not funny. If nobody, or only a few people, had complained about the stupidity, Nvidia would be smiling more today than ever.
 
Is it me, or did the 4090 lines seem smaller than in a normal launch-day picture?
I would imagine almost certainly, considering the product that launched. Is this the first new-generation launch where only $1500+ cards are available? Last time, I think the 3080 launched first.
 
Is it me, or did the 4090 lines seem smaller than in a normal launch-day picture?
How many people have that kind of time and money to sit in line to drop $2K in this market, weeks before the competition launches their product? The 4090 is a very high-performing part, and the work Nvidia put into it is to be commended, but it is not a part for the masses. It is for the top 1% of the top 5%, so roughly the 0.05% of gamers; it is a moonshot halo product that currently exists in a vacuum.
 
Is it me, or did the 4090 lines seem smaller than in a normal launch-day picture?
What's normal though?

The 3080, 3090, and 3070 launched within weeks of each other in Sep-Oct 2020, during the covid supply crunch with people stuck at home and stim money to spend, 4-5 months before mining began competing for stock.

The 4090 launched by itself, with overkill power (26.33 Teraframes/second in Pacman) and an overkill price for 99.98% of gamers, during a glut of cheap 3000-series cards.
 
The 4090 launched by itself, with overkill power (26.33 Teraframes/second in Pacman) and an overkill price for 99.98% of gamers, during a glut of cheap 3000-series cards.

"Cheap" is relative. The 3000 series cards are only "cheap" in comparison to how much they cost during the mining craze. They are still plenty expensive, and being older cards, they probably won't last you as long before you need to upgrade again. A card like the 3090 Ti still has some appeal. But, would I rather pay $1000 for an older and slower card that will last me 2 years or would I rather pay $1600 for a newer and faster card that will last me 4 years? I could see myself buying a 3000 series card if prices came down to the point where I could get a 3090 Ti for ~$600 or maybe a regular 3090 for ~$500, but until then I'd rather hold out for a 4090.
 
"Cheap" is relative. The 3000 series cards are only "cheap" in comparison to how much they cost during the mining craze. They are still plenty expensive, and being older cards, they probably won't last you as long before you need to upgrade again. A card like the 3090 Ti still has some appeal. But, would I rather pay $1000 for an older and slower card that will last me 2 years or would I rather pay $1600 for a newer and faster card that will last me 4 years? I could see myself buying a 3000 series card if prices came down to the point where I could get a 3090 Ti for ~$600 or maybe a regular 3090 for ~$500, but until then I'd rather hold out for a 4090.
Same. I play at 3440x1440 and the 4090 looks like a card I can game on for 4+ years. The only thing that could make it obsolete is heavy RT, but Nvidia solved that with better RT cores and DLSS 3.0.
The 4090/Ti will be the card(s) that never go away, this generation's 1080 Ti.
 
Same. I play at 3440x1440 and the 4090 looks like a card I can game on for 4+ years. The only thing that could make it obsolete is heavy RT, but Nvidia solved that with better RT cores and DLSS 3.0.
The 4090/Ti will be the card(s) that never go away, this generation's 1080 Ti.
I wouldn't make that bet. The 1080 Ti kept its place for as long as it did because there was no competition, and the gaming industry at large decided console ports were the best way to control the spiraling costs of AAA game production.

Today we have one solid competitor, one that is going to keep trying because they have to... without AMD, I would guarantee Ada would never have been used in a consumer card. If AMD had nothing to release for another year or two, that is how long you would still be buying 3000s. Ada would have stayed in the datacenter... like Tesla did.
We also have the majority of AAA games now being turned out on a handful of super game engines like Unreal that have reduced development costs... and allowed for scaling, meaning the same game can be developed for consoles and still take advantage of everything a PC setup can muster.

The 4090 will obviously hold up longer than a mid-range card. But it won't have the legs the 1080 Ti did.
 
I wouldn't make that bet. The 1080 Ti kept its place for as long as it did because there was no competition, and the gaming industry at large decided console ports were the best way to control the spiraling costs of AAA game production.

Today we have one solid competitor, one that is going to keep trying because they have to... without AMD, I would guarantee Ada would never have been used in a consumer card. If AMD had nothing to release for another year or two, that is how long you would still be buying 3000s. Ada would have stayed in the datacenter... like Tesla did.
We also have the majority of AAA games now being turned out on a handful of super game engines like Unreal that have reduced development costs... and allowed for scaling, meaning the same game can be developed for consoles and still take advantage of everything a PC setup can muster.

The 4090 will obviously hold up longer than a mid-range card. But it won't have the legs the 1080 Ti did.
Disagree. There's no resolution at 4K or below that the 4090 can't run well on raw performance, and with DLSS 3.0 enabled it gives you next-gen performance.
Just when you think it's dead, it will keep being compared to new cards, and people won't upgrade.

The consensus is that the card has the largest generational leap, yet you say it won't last long.
 
Disagree. There's no resolution at 4K or below that the 4090 can't run well on raw performance, and with DLSS 3.0 enabled it gives you next-gen performance.
Just when you think it's dead, it will keep being compared to new cards, and people won't upgrade.

The consensus is that the card has the largest generational leap, yet you say it won't last long.
The difference is simple. The 1080 was a massive generational leap... however, there was ZERO competition at the time. Nvidia actually developed an entire GPU generation after the 1080 that they never put in a consumer gaming card, because they didn't have to.
AMD will have their answer to Nvidia in a month.
Intel will have a second generation probably next year. As underwhelming as Intel's first cards are... they show that their architecture is actually very innovative; the way they are doing RT is actually superior to Nvidia's. They don't have to sandwich tensor cores onto 1/3 of the die to achieve good RT numbers. Out of the gate they are doing RT better than AMD and Nvidia at an equal raster performance level. If their second gen solves some of the obvious issues with their first gen... Intel is probably going to be real competition. AMD is also going to have a second gen of chiplet-based GPUs (the 8000s?) ready to go in 2 years or less.

We just aren't in the same world the 1080 launched in. Also, for what it's worth, DLSS 3 by all accounts is terrible... it's usable on the 4090 because the FPS is high enough that the generated frames don't stick around long... and lag is less of an issue. On mid-range cards DLSS 3 is going to be annoying... no one wants to see 120 FPS but feel 60 FPS response. (Games will keep progressing... and in 2 years the 4090 will be closer to the mid-range.) Also, have you not watched some of the reviewers showing off how DLSS 3 garbles UIs etc.? It's garbage... unless Nvidia fixes it with a DLSS 3.5/4, if it's even possible to fix. The loss in response time isn't noticeable today, and the garbling of images isn't as noticeable... but when games hit the point where the 4090 drops into the 70-80 fps range natively, DLSS 3 is going to be an annoying solution. (Of course, you can probably just use DLSS 2.)
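For the "see 120 FPS but feel 60 FPS" point, here is a rough back-of-envelope model (my own simplification, not Nvidia's actual pipeline): if the generated frame is interpolated between two real frames, the newest real frame has to be held back one interval, so responsiveness tracks the render rate even though the display rate doubles.

```python
# Crude latency sketch: interpolation waits for the *next* real frame,
# so the shown FPS doubles but the felt latency follows the render rate.
def felt_latency_ms(render_fps: float, interpolated: bool = True) -> float:
    render_interval = 1000.0 / render_fps            # ms between real frames
    hold = render_interval if interpolated else 0.0  # wait for the next real frame
    return render_interval + hold                    # very rough: ignores queues, GPU time, etc.

for fps in (120, 60, 40):
    print(f"rendered {fps:>3} fps -> shown {fps * 2} fps, "
          f"~{felt_latency_ms(fps):.0f} ms felt vs ~{felt_latency_ms(fps, False):.0f} ms native")
# rendered 120 fps -> shown 240 fps, ~17 ms felt vs ~8 ms native
# rendered  60 fps -> shown 120 fps, ~33 ms felt vs ~17 ms native
# rendered  40 fps -> shown 80 fps, ~50 ms felt vs ~25 ms native
```

Which is why it feels fine on a 4090 today and will feel worse once the same card is only rendering 40-60 real fps.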
 
The difference is simple. The 1080 was a massive generational leap... however, there was ZERO competition at the time. Nvidia actually developed an entire GPU generation after the 1080 that they never put in a consumer gaming card, because they didn't have to.
AMD will have their answer to Nvidia in a month.
Intel will have a second generation probably next year. As underwhelming as Intel's first cards are... they show that their architecture is actually very innovative; the way they are doing RT is actually superior to Nvidia's. They don't have to sandwich tensor cores onto 1/3 of the die to achieve good RT numbers. Out of the gate they are doing RT better than AMD and Nvidia at an equal raster performance level. If their second gen solves some of the obvious issues with their first gen... Intel is probably going to be real competition. AMD is also going to have a second gen of chiplet-based GPUs (the 8000s?) ready to go in 2 years or less.

We just aren't in the same world the 1080 launched in. Also, for what it's worth, DLSS 3 by all accounts is terrible... it's usable on the 4090 because the FPS is high enough that the generated frames don't stick around long... and lag is less of an issue. On mid-range cards DLSS 3 is going to be annoying... no one wants to see 120 FPS but feel 60 FPS response. (Games will keep progressing... and in 2 years the 4090 will be closer to the mid-range.) Also, have you not watched some of the reviewers showing off how DLSS 3 garbles UIs etc.? It's garbage... unless Nvidia fixes it with a DLSS 3.5/4, if it's even possible to fix. The loss in response time isn't noticeable today, and the garbling of images isn't as noticeable... but when games hit the point where the 4090 drops into the 70-80 fps range natively, DLSS 3 is going to be an annoying solution. (Of course, you can probably just use DLSS 2.)
I would argue that DLSS becomes less of an issue at 1440p or 1080p. There are plenty of cards there that will push a consistent 120+ FPS on max settings without resorting to tricks like DLSS.
 
I would argue that DLSS becomes less of an issue at 1440p or 1080p. There are plenty of cards there that will push a consistent 120+ FPS on max settings without resorting to tricks like DLSS.
No doubt: if you can hit 120 fps native, why use DLSS? Which is the funny thing with the 4090 launch... it is powerful enough to often hit native 120 fps even at 4K. I'm not sure why you would care about DLSS 3 on that card.
DLSS 3 in general, though, seems like a terrible idea. Frame generation... has no one at Nvidia ever turned on the motion compensation built into their TVs? Even when you're just watching video and not playing a game, frame generation looks odd. I haven't seen DLSS 3 in person yet... but from what I have heard from reviewers, it sounds like it would probably make me physically ill. lol
 
Same. I play at 3440x1440 and the 4090 looks like a card I can game on for 4+ years. The only thing that could make it obsolete is heavy RT, but Nvidia solved that with better RT cores and DLSS 3.0.
The 4090/Ti will be the card(s) that never go away, this generation's 1080 Ti.
Except the 1080ti was a $700 to $800 card

They’ve doubled the consumer price in 5 years (and more than doubled the performance)
 
No doubt: if you can hit 120 fps native, why use DLSS? Which is the funny thing with the 4090 launch... it is powerful enough to often hit native 120 fps even at 4K. I'm not sure why you would care about DLSS 3 on that card.
DLSS 3 in general, though, seems like a terrible idea. Frame generation... has no one at Nvidia ever turned on the motion compensation built into their TVs? Even when you're just watching video and not playing a game, frame generation looks odd. I haven't seen DLSS 3 in person yet... but from what I have heard from reviewers, it sounds like it would probably make me physically ill. lol
I think it has some benefits if you are considering it from a cloud gaming perspective; latency there is already a problem, and DLSS 3 wouldn't add to it. And what if the data center only rendered the odd-numbered frames, with a simple motion engine on the player's side rendering the even ones? Sort of gaming interlacing? (A toy sketch of that idea is below.) But I agree; I do use DLSS 2 on the few games that have it, because it keeps things smooth when I crank up all the RT effects, but I am prone to motion sickness at the best of times.
Ordered a decent 1440p G-Sync certified monitor hoping that helps alleviate it.
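To make the "gaming interlacing" idea concrete, here is a toy sketch. It is purely hypothetical, uses naive per-pixel extrapolation instead of a real motion engine, and is nothing like an actual streaming stack; the point is only that the client guesses the in-between frame from the last two it received, so it never waits on a future frame.

```python
import numpy as np

# Toy "gaming interlacing": the server streams every other frame and the client
# fakes the in-between one by extrapolating half a step from the last two
# received frames (so no waiting on a future frame, unlike interpolation).
def extrapolate_half_step(prev2: np.ndarray, prev1: np.ndarray) -> np.ndarray:
    guess = prev1.astype(np.float32) + 0.5 * (prev1.astype(np.float32) - prev2.astype(np.float32))
    return np.clip(guess, 0, 255).astype(np.uint8)

server_frames = [np.full((4, 4), v, dtype=np.uint8) for v in (10, 20, 30)]  # stand-in frames
shown = [server_frames[0], server_frames[1]]
for i in range(2, len(server_frames)):
    shown.append(extrapolate_half_step(server_frames[i - 2], server_frames[i - 1]))  # client-made
    shown.append(server_frames[i])                                                   # real frame

print([int(f[0, 0]) for f in shown])  # [10, 20, 25, 30] - the 25 is the client's guess
```

Extrapolation guesses wrong whenever motion changes direction, which is exactly where interpolation-style frame generation looks better, so there is a real trade-off between artifacts and latency here.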
 
Honestly, I am surprised, but after the online backlash I am sure somebody in a boardroom was reminded of the 970 3.5GB fiasco and the lawsuits that came from that, and they didn't want the headache when it's easily fixed by changing a single number on a bit of box art and running a find-and-replace on some website CMSs.
Wrong. You think a month before a delayed retail launch (delayed because of 30x0 inventory overstock) they didn't have the 4080 12GB boxes printed, the cards packed, and the pallets stacked by the hundreds of thousands, if not more, worldwide? Now they have to unpack, update the vBIOS firmware, and repack all those hundreds of thousands of cards in all the warehouses.
 
Wrong. You think a month before a delayed retail launch (delayed because of 30x0 inventory overstock) they didn't have the 4080 12GB boxes printed, the cards packed, and the pallets stacked by the hundreds of thousands, if not more, worldwide? Now they have to unpack, update the vBIOS firmware, and repack all those hundreds of thousands of cards in all the warehouses.
A day-one driver update and a few stickers.
 
I'm half f'ing with you, but I really don't expect much from most of the AIBs; part of me thinks that the 4080 12GB was going to be OEM-only.

It's a fair assumption, but reportedly the cards were ready to go for next month, according to Gamer's Nexus' sources.
 
I'm guessing it was a ~$500 piece of crap, and the suits decided "people were willing to spend way more a year ago, so let's charge way more for it."
Given its performance, at $500 it would compete too well with the available 3000-series stock, and Nvidia can't have that right now.
 
How many people have that kind of time and money to sit in line to drop $2K in this market, weeks before the competition launches their product? The 4090 is a very high-performing part, and the work Nvidia put into it is to be commended, but it is not a part for the masses. It is for the top 1% of the top 5%, so roughly the 0.05% of gamers; it is a moonshot halo product that currently exists in a vacuum.
All this is true. But if the hype and demand were there, I'd expect people to have tents and lawn chairs and shelters set up. There will always be demand for the high end, but methinks Nvidia is trying to convince us the demand is huge to trigger our FOMO instincts.
 
They are not gonna move this card much down the stack or in price. My guess is a 4070 Ti at $800.
 
Trick Question: Nvidia is always a dick.

Yeah, the truth is probably that they got wind that some lesser Radeon card will smoke this and they don't want an 80-series card to lose. I doubt this has anything to do with gamers calling it a fake 80-series or that it should be a 70-series.

If Nvidia could sell a 70-, 60-, or even 50-series class card for 80-series class money, they'd do it seven days a week and twice on Sundays.
 
Yeah, the truth is probably that they got wind that some lesser Radeon card will smoke this and they don't want an 80-series card to lose. I doubt this has anything to do with gamers calling it a fake 80-series or that it should be a 70-series.

If Nvidia could sell a 70-, 60-, or even 50-series class card for 80-series class money, they'd do it seven days a week and twice on Sundays.
That is probably closer to the truth. AMD was probably planning to drop a 7700, as they don't have nearly as bad a stock issue as Nvidia. The 7900 might be a bit shy of the 4090... the 7800 might trade blows with the 4080. But the 7700 was probably going to embarrass a 4080 (in name) for less scratch. That is likely all Nvidia is trying to prevent by unlaunching it.
 
I'm half f'ing with you, but I really don't expect much from most of the AIBs; part of me thinks that the 4080 12GB was going to be OEM-only.
The GN video said that AIBs had already been printing retail boxes.
 
That is probably closer to the truth. AMD was probably planning to drop a 7700, as they don't have nearly as bad a stock issue as Nvidia. The 7900 might be a bit shy of the 4090... the 7800 might trade blows with the 4080. But the 7700 was probably going to embarrass a 4080 (in name) for less scratch. That is likely all Nvidia is trying to prevent by unlaunching it.
Just saw one rumor that AMD is launching the 7900, 7800, and 7600. I guess that could be some gamesmanship if they felt confident their third card down was going to own Nvidia. Perhaps they wouldn't match the datacenter-class Ada 4090. However, if they released a 7600 that smoked a "4080", that would be embarrassing.
 
Think about it logically for a minute...

Nvidia themselves were not going to release the 12GB 4080; it was going to be an AIB-only item.

Why do you think that is?

I bet it was one of the AIBs who said "let's make this card here a '4080' in name and jack up the price...", and Nvidia agreed, to help them deal with the 3xxx stock that needs to be moved. So the 4080 12GB gets announced, fools no one, Nvidia finally has to say this was a bad idea, and here we are.

It only makes sense to even do it to help the AIBs keep moving the old 3xxx stock, and that includes the pricing.

****

Rumors are that the renamed card will stay the same and the price is going to drop. From the leaked specs on the 4070, calling it a Ti makes the most sense, as the specs are the same but it has a nice clock boost compared to the 4070.
 
This gives them an opportunity to counter what AMD is bringing out. New price, new name, feel-good stories that make it seem like "they listened," etc.
Even if this card isn't as beastly as the 4090, it's still going to be a powerhouse and a half. The price was the real issue. Plus, they have room to pitch their proprietary tech; that's still Nvidia's strong point. Ray tracing might as well equal RTX at this point. Plenty of cards already on the market can run 4K/60+ without ray tracing as-is; you don't actually even need a new card for that.
I'm not going to be a conspiracy theorist and claim this is intentional (it obviously isn't), but it could actually work out for the best for them.
 