> LOL... I will say this is the right move. Just release it later and call it the 4070 Ti.
And add $100 to the price because "now it's a Ti".
> Reportedly they sent the PSU makers (at least Corsair) a different set of specs than Nvidia ended up using.
ATX 3.0 isn't Nvidia's spec. Any cockup there lies entirely with the PSU makers.
> Is it me or did the 4090 lines seem smaller than a normal launch day picture?
Almost certainly, considering the product that launched. Is this the first new-generation launch where only $1500+ cards are available? Last time, I think the 3080 launched first.
> Is it me or did the 4090 lines seem smaller than a normal launch day picture?
How many people have that kind of time and money to sit in line to drop $2K in this market, weeks before the competition launches their product? The 4090 is a very high-performing part, and the work Nvidia put into it is to be commended, but it is not a part for the masses. It's for the top 1% of the top 5% of gamers, roughly the top 0.05%; a moonshot halo product that currently exists in a vacuum.
> ATX 3.0 isn't Nvidia's spec. Any cockup there lies entirely with the PSU makers.
That's not ATX 3.0, it's PCI-SIG, and Nvidia controls how their card behaves regarding the sensing pins.
> Is it me or did the 4090 lines seem smaller than a normal launch day picture?
What's normal, though?
4090 launched by itself, with overkill power (26.33 Teraframes/second in Pacman) and price for 99.98% of gamers, during a glut of cheap 3000 series cards.
> "Cheap" is relative. The 3000 series cards are only "cheap" in comparison to how much they cost during the mining craze. They are still plenty expensive, and being older cards, they probably won't last you as long before you need to upgrade again. A card like the 3090 Ti still has some appeal. But would I rather pay $1000 for an older and slower card that will last me 2 years, or $1600 for a newer and faster card that will last me 4 years? I could see myself buying a 3000 series card if prices came down to the point where I could get a 3090 Ti for ~$600, or maybe a regular 3090 for ~$500, but until then I'd rather hold out for a 4090.
Same. I play on 3440x1440 and the 4090 looks like a card I can game on for 4+ years. The only thing that could make it obsolete is heavy RT, but Nvidia solved that with better RT cores and DLSS 3.0.
> Same. I play on 3440x1440 and the 4090 looks like a card I can game on for 4+ years. The only thing that could make it obsolete is heavy RT, but Nvidia solved that with better RT cores and DLSS 3.0.
I wouldn't make that bet. The 1080 Ti kept its place for as long as it did because there was no competition, and the gaming industry at large decided console ports were the best way to control the spiraling costs of AAA game production.
The 4090/Ti will be the card(s) that never goes away, this generation's 1080 Ti.
> I wouldn't make that bet. The 1080 Ti kept its place for as long as it did because there was no competition, and the gaming industry at large decided console ports were the best way to control the spiraling costs of AAA game production.
Disagree. There's no resolution at 4K or below that the 4090 can't run well on raw performance, and with DLSS 3.0 enabled it gives you next-gen performance.
> Disagree. There's no resolution at 4K or below that the 4090 can't run well on raw performance, and with DLSS 3.0 enabled it gives you next-gen performance.
The difference is simple. The 1080 was a massive generational leap... however, there was ZERO competition at the time. Nvidia actually developed an entire GPU generation after the 1080 that they never put in a consumer gaming card, because they didn't have to.

Today we have one solid competitor, one that is going to keep trying because they have to... without AMD, I would guarantee Ada would never have been used in a consumer card. If AMD had nothing to release for another year or two, that is how long you would still be buying 3000s. Ada would have been used in the datacenter... like Tesla was.

We also have the majority of AAA games now being turned out on a handful of super game engines like Unreal, which have reduced development costs... and allowed for scaling, meaning the same game can be developed for consoles and still take advantage of everything a PC setup can muster.

The 4090 will obviously hold up longer than a mid-range card. But it won't have the legs the 1080 Ti did.
The consensus is that the card has the largest generational leap, yet you say it won't last long. Just when you think it's dead, it will continue to be compared to new cards, and people won't upgrade.
> The difference is simple. The 1080 was a massive generational leap... however, there was ZERO competition at the time. Nvidia actually developed an entire GPU generation after the 1080 that they never put in a consumer gaming card, because they didn't have to.
I would argue that DLSS becomes less of an issue at 1440p or 1080p. There are plenty of cards there that will push a consistent 120+ FPS on max settings without resorting to tricks like DLSS.
AMD will have their answer to Nvidia in a month.
Intel will have a second generation probably next year. As underwhelming as Intel's first cards are... they show that their architecture is actually very innovative; the way they are doing RT is actually superior to Nvidia's. They don't have to sandwich tensor cores onto 1/3 of the die to achieve good RT numbers. Out of the gate they are doing RT better than AMD and Nvidia at an equal raster-performance level. If their second gen solves some of the obvious issues with their first gen... Intel is probably going to be real competition. AMD is also going to have a second gen of chiplet-based GPUs (the 8000s?) ready to go in 2 years or less.
We just aren't in the same world the 1080 launched in.

Also, for what it's worth, DLSS 3 by all accounts is terrible... it's usable on the 4090 because the FPS is high enough that the generated frames don't stick around long, and lag is less of an issue. On mid-range cards DLSS 3 is going to be annoying... no one wants to see 120 FPS but feel a 60 FPS response. (Games will keep progressing... and in 2 years the 4090 will be closer to the mid-range.)

Also, have you not watched some of the reviewers showing off how DLSS 3 garbles UIs, etc.? It's garbage... unless Nvidia fixes it with a DLSS 3.5/4, if it's even possible to fix. The loss in response time isn't noticeable today, and the garbling of images isn't as noticeable... but when games hit the point where the 4090 drops into the 70-80 FPS range native, DLSS 3 is going to be an annoying solution. (Of course, you can probably just use DLSS 2.)
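The "see 120 FPS but feel 60 FPS" complaint can be put in rough numbers. A toy model (my own simplification, not Nvidia's actual pipeline; the function name and the one-frame-of-added-delay figure are assumptions): interpolation-style frame generation inserts one generated frame between each pair of rendered frames, so the displayed rate doubles, but the game still only samples input on the rendered frames, and the renderer has to hold a real frame back until the next one exists before it can interpolate between them.

```python
def effective_rates(native_fps: float):
    """Toy model of interpolation-style frame generation.

    Assumes one generated frame is inserted between every pair of
    rendered frames, which is how DLSS 3 has been publicly described.
    """
    displayed_fps = native_fps * 2      # the smoothness you *see*
    input_sample_fps = native_fps       # input is still read once per rendered frame
    # An interpolated frame needs the *next* real frame before it can be
    # shown, so output is delayed by roughly one native frame time.
    added_latency_ms = 1000.0 / native_fps
    return displayed_fps, input_sample_fps, added_latency_ms

print(effective_rates(60.0))
```

At 60 FPS native this comes out to 120 FPS displayed, a 60 Hz input cadence, and roughly 17 ms of extra delay, which is consistent with why reviewers find frame generation far more tolerable when the base frame rate is already high.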
> I would argue that DLSS becomes less of an issue at 1440p or 1080p. There are plenty of cards there that will push a consistent 120+ FPS on max settings without resorting to tricks like DLSS.
No doubt, if you can hit 120 FPS native, why use DLSS? Which is the funny thing with the 4090 launch... it is powerful enough to often hit a native 120 FPS even at 4K. I'm not sure why you would care about DLSS 3 on that card.
> Same. I play on 3440x1440 and the 4090 looks like a card I can game on for 4+ years. The only thing that could make it obsolete is heavy RT, but Nvidia solved that with better RT cores and DLSS 3.0.
> The 4090/Ti will be the card(s) that never goes away, this generation's 1080 Ti.
Except the 1080 Ti was a $700 to $800 card.
> No doubt, if you can hit 120 FPS native, why use DLSS? Which is the funny thing with the 4090 launch... it is powerful enough to often hit a native 120 FPS even at 4K. I'm not sure why you would care about DLSS 3 on that card.
I think it has some benefits if you are considering it from a cloud gaming perspective; latency there is already a problem, so DLSS 3 wouldn't add much to it. And what if the data center only rendered the odd-numbered frames, with a simple motion engine on the player's side rendering the even ones? Sort of gaming interlacing? But I agree. I do use DLSS 2 on the few games that have it, because it keeps things smooth when I crank up all the RT effects, but I am prone to motion sickness under the best of times.
DLSS 3 in general, though, seems like a terrible idea. Frame generation... has no one at Nvidia ever turned on the motion compensation built into their TVs? Even when you're just watching video and not playing a game, frame generation looks odd. I haven't seen DLSS 3 in person yet... but from what I have heard from reviewers, it sounds like it would probably make me physically ill. lol
> Honestly, I am surprised, but after the online backlash I am sure somebody in a board room was reminded of the 970 3.5GB fiasco and the lawsuits that came from that, and they didn't want the headache when it is easily fixed with the change of a single number on a bit of box art and a "Find and Replace" command on some website CMSs.
Wrong. You think a month before a delayed retail launch (delayed due to 30x0 inventory overstock) they didn't have the 4080 12GB boxes printed, the cards packed, and the pallets stacked by the hundreds of thousands, if not more, worldwide? Now they have to unpack, update the vBIOS firmware, and repack all those hundreds of thousands of cards in all the warehouses.
> Wrong. You think a month before a delayed retail launch (delayed due to 30x0 inventory overstock) they didn't have the 4080 12GB boxes printed, the cards packed, and the pallets stacked by the hundreds of thousands, if not more, worldwide? Now they have to unpack, update the vBIOS firmware, and repack all those hundreds of thousands of cards in all the warehouses.
Day-one driver update and a few stickers.
> Day-one driver update and a few stickers.
Couldn't be more wrong.
> Couldn't be more wrong.
I'm half f'ing with you, but I really don't expect much from most of the AIBs; part of me thinks the 4080 12GB was only ever going to be OEM-only.
> It's a fair assumption, but reportedly the cards were ready to go for next month, per Gamer's Nexus' sources.
You mean right around the time all the OEMs start blasting their new systems, just in time for the Christmas sales?
> It is still a $900 piece of crap... so who cares?
I'm guessing it was a ~$500 piece of crap that suits decided "people were willing to spend way more a year ago, so let's not sell it at $500".
> I'm guessing it was a ~$500 piece of crap that suits decided "people were willing to spend way more a year ago, so let's not sell it at $500".
Given its performance, it competes too well at $500 with the available 3000 stock; Nvidia can't have that right now.
> How many people have that kind of time and money to sit in line to drop $2K in this market, weeks before the competition launches their product? The 4090 is a very high-performing part, and the work Nvidia put into it is to be commended, but it is not a part for the masses. It's for the top 1% of the top 5% of gamers, roughly the top 0.05%; a moonshot halo product that currently exists in a vacuum.
All this is true. But if the hype and demand were there, I'd expect people to have tents and lawn chairs and shelters set up. There will always be demand for the high end, but methinks Nvidia is trying to convince us the demand is huge to trigger our FOMO instincts.
> rename it the 4080 bastard edition
4080 Suckers Edition
Trick Question: Nvidia is always a dick.
> Yeah, the truth is probably that they got wind that some lesser Radeon card will smoke this, and they don't want an 80-series card to lose. I doubt this has anything to do with gamers calling it a fake 80-series or saying it should be a 70-series.
That is probably closer to the truth. AMD was probably planning to drop a 7700, as they don't have nearly as bad a stock issue as Nvidia. The 7900 might be a bit shy of the 4090... the 7800 might trade blows with the 4080. But the 7700 was probably going to embarrass a "4080" (in name only) for less scratch. That is likely all Nvidia is trying to prevent by unlaunching.
If Nvidia could sell a 70-, 60-, or even 50-series class card for 80-series class money, they'd do it seven days a week and twice on Sundays.
> I'm half f'ing with you, but I really don't expect much from most of the AIBs; part of me thinks the 4080 12GB was only ever going to be OEM-only.
The GN video said that AIBs had already been printing retail boxes.
> That is probably closer to the truth. AMD was probably planning to drop a 7700, as they don't have nearly as bad a stock issue as Nvidia. The 7900 might be a bit shy of the 4090... the 7800 might trade blows with the 4080. But the 7700 was probably going to embarrass a "4080" (in name only) for less scratch. That is likely all Nvidia is trying to prevent by unlaunching.
Just saw one rumor that AMD is launching the 7900, 7800, and 7600. I guess that could be some gamesmanship if they felt confident their third card down was going to own Nvidia. Perhaps they wouldn't match the datacenter-class Ada 90. However, if they released a 7600 that smoked a "4080", that would be embarrassing.