NVIDIA CEO Jensen Huang hints at ‘exciting’ next-generation GPU update on Tuesday, September 20th

It's not really internal SLI as I understand it... I guess we'll get more details soon.

It sounds like, just like Ryzen, the CPU and all the bits are in one hunk of silicon. It's the supporting things that don't need to be on the latest fab that are offloaded. On the CPU side that means memory controllers; on the GPU side we'll see what they offload.

Chiplets get more exciting when you think about how AMD can better tailor products. They can take the same core raster chiplets and use them for consumer gaming, workstation parts, and AI... and I assume they could potentially also have more compute-heavy chiplets they leave out of consumer gaming parts (reducing cost) and double up in data center packages, greatly increasing yields on compute monsters. All speculation right now. I don't think we'll have to worry about SLI-like issues though... we worried about the same stuff with the first Zens.

PS... looking more at what the 4090 is, it's looking more like a data center chip with the insane level of tensor hardware. I believe in past generations Nvidia would never have dropped that chip in a consumer card (at least one that isn't an official Titan). I suspect they believe AMD is going to have some god-level RT bits in chiplet form or something. And yes, I'll come back to this post and laugh at myself in a few weeks if AMD's cards are underwhelming. lol
Nvidia had to put all of those tensor cores into the 4090. In TechPowerUp's review, it got 68 fps at 4K in Control with full RT effects. It takes a 4090 to just now do that.

4K TVs have been mainstream for a while, and PC gamers have started turning to TVs for the image quality. PC graphics hardware has been lagging when it comes to comfortably driving 4K.
 
We'll see. If that is true it means Gen4 RT cores are no better than Gen3.
4080: 304 tensor cores
3090: 328 tensor cores
The number of hardware units is basically identical... if all the 4080 can do is match the 3090 in RT, that means basically zero RT uplift this generation outside the one SKU that super stacks the Tensor bits.
It depends; maybe RT cores are only part of what determines the frame rate of a game with RT on:
https://hardforum.com/threads/rtx-4xxx-rx-7xxx-speculation.2015174/post-1045466632

RT TFLOPS
3090 Ti: 78.1
4080 12GB: 92.7
4080 16GB: 112.7
4090: 191


Purely on paper RT core performance, the 4080 16GB should be roughly 44% faster than a 3090 Ti (112.7 vs 78.1) and around 60% faster than a 3080 Ti. Not sure if it will fully translate (the RT part is just a fraction of the milliseconds involved in a frame, so doubling it shouldn't double the fps, just like twice the GPU will not give you twice the framerate).
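If you want to double-check those ratios, here's a throwaway Python sketch using only the RT TFLOPS figures listed above (paper specs, not measured game performance):

```python
# Relative RT TFLOPS uplift vs the 3090 Ti, using the figures quoted above.
rt_tflops = {
    "3090 Ti": 78.1,
    "4080 12GB": 92.7,
    "4080 16GB": 112.7,
    "4090": 191.0,
}

baseline = rt_tflops["3090 Ti"]
for card, tflops in rt_tflops.items():
    uplift = (tflops / baseline - 1) * 100
    print(f"{card:10s} {tflops:6.1f} RT TFLOPS  ({uplift:+.0f}% vs 3090 Ti)")
# On paper: 4080 12GB ~ +19%, 4080 16GB ~ +44%, 4090 ~ +145%.
```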
 
It sounds like, just like Ryzen, the CPU and all the bits are in one hunk of silicon. It's the supporting things that don't need to be on the latest fab that are offloaded. On the CPU side that means memory controllers; on the GPU side we'll see what they offload.
From the speculation I've seen, it's at least: one chiplet that's the GPU cores, a bunch of memory controller chiplets, and at least one more whose function I can't remember.
 
We'll see. If that is true it means Gen4 RT cores are no better than Gen3.
4080: 304 tensor cores
3090: 328 tensor cores
The number of hardware units is basically identical... if all the 4080 can do is match the 3090 in RT, that means basically zero RT uplift this generation outside the one SKU that super stacks the Tensor bits.

I suspect there is some generational improvement. I think how much of an improvement will be telling for RT. It needs to be substantial imo, or 4060s will be basically as useless for RT as 3060-class hardware. For RT to take off it needs to be usable in the regular human price segments.
The most significant gains are likely to be realized in games that have heavier RT, like Cyberpunk 2077 with Overdrive Mode, and in path-traced games like Quake RTX, Minecraft RTX and Portal RTX. The less hardware-intensive the RT implementation, the smaller the gains we'll see gen on gen. At least, that's what I'm expecting.
 
The most significant gains are likely to be realized in games that have heavier RT, like Cyberpunk 2077 with Overdrive Mode, and in path-traced games like Quake RTX, Minecraft RTX and Portal RTX. The less hardware-intensive the RT implementation, the smaller the gains we'll see gen on gen. At least, that's what I'm expecting.
Possibly. The 4080 will be a good guide as to how much of a generational uplift there really is. The 4090 almost doesn't count; it is a completely different die than the 4080, with roughly 70% more RT/Tensor cores. Of course it is doing well in RT stuff, it's the full-fat data center Lovelace chip turned into a consumer SKU.

The 4080 16GB should be a good test of generational improvement, as it has a basically identical number of RT/Tensor cores to the 3090. If it sees the same % performance hit turning RT on as the 3090 does... that means the generational uplift is pretty minimal, and I would expect whatever they sell as the 4070 and 4060 parts will still be lacking for mainstream ray tracing. On the flip side, of course, if the dip is within reason, perhaps there is hope that 4060s/4070s could finally have hardware capable of using some RT outside of grabbing screenshots.
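To put a number on that "same % performance hit turning RT on" idea, here's a minimal sketch; the fps values in it are invented placeholders purely for illustration, not benchmarks:

```python
# Compare the relative cost of enabling RT across two cards.
# All fps numbers below are hypothetical, purely for illustration.
def rt_hit(fps_raster: float, fps_rt: float) -> float:
    """Percent of frame rate lost when RT is switched on."""
    return (1.0 - fps_rt / fps_raster) * 100.0

# If a 3090 dropped from 120 fps to 70 fps with RT on, and a 4080 16GB
# (same RT/tensor core count) dropped from 140 fps to 100 fps...
print(f"3090 RT hit:      {rt_hit(120, 70):.0f}%")   # ~42% lost
print(f"4080 16GB RT hit: {rt_hit(140, 100):.0f}%")  # ~29% lost -> real per-core uplift
# Near-identical percentages would mean the RT cores barely improved per unit.
```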
 
4080 12 GB is a yikes in Plague Tale.

[attached benchmark chart]
 
And that's directly from Nvidia's own marketing material, so I don't imagine there's a special issue with the methodology:
https://www.nvidia.com/en-us/geforce/news/october-2022-rtx-dlss-game-updates/

None of the examples show the 4080 12GB beating the 3090 Ti, which would have been normal and a non-issue if the latter were still selling near MSRP just before launch, but the gap with the 3080 is worrisome as well. DLSS 3.0 and new RT had better become common before the xx80s launch, or the new/used Ampere prices had better rise back up.

A 3090 on PCPartPicker goes for $1,000 at the moment, a 3080 Ti for $800 if you are ready to go Zotac. You have to deal with bigger power issues, but you get 24 GB of VRAM.
 
It's not really internal SLI as I understand it... I guess we'll get more details soon.

PS... looking more at what the 4090 is, it's looking more like a data center chip with the insane level of tensor hardware. I believe in past generations Nvidia would never have dropped that chip in a consumer card (at least one that isn't an official Titan). I suspect they believe AMD is going to have some god-level RT bits in chiplet form or something. And yes, I'll come back to this post and laugh at myself in a few weeks if AMD's cards are underwhelming. lol
Yep, that's why I put "internal SLI" in quotes, because I meant it figuratively, not literal SLI. The concern is where any potential downsides to the move to a chiplet GPU architecture might express themselves: will we see microstutter, will we see higher frame times? High frame times are a showstopper for applications like VR. I don't actually expect RDNA 3 to ship with any high-visibility problems like those - I'm sure the engineers have mitigated them as best they could - but those would be the questions for benchmarks.
 
PS... looking more at what the 4090 is, it's looking more like a data center chip with the insane level of tensor hardware. I believe in past generations Nvidia would never have dropped that chip in a consumer card (at least one that isn't an official Titan). I suspect they believe AMD is going to have some god-level RT bits in chiplet form or something. And yes, I'll come back to this post and laugh at myself in a few weeks if AMD's cards are underwhelming. lol

I think the reality is the 4090 was not supposed to be a GeForce card at all. It's a Titan in all but name, since architecturally speaking, everything about AD102 seems tailored for Titan-class workflows rather than gaming (compute & data science, video editing and rendering, AI, and most importantly Taylor Swift deepfakes). But then again, what is a GeForce and what is a Titan anymore? I guess they were only ever marketing abstractions.

In the end everyone will just have to ask themselves "will a RTX 4090 at least make my tiny peepee seem bigger". And I am.. personally hoping that the answer to that question is please-god-yes.
 
In the end everyone will just have to ask themselves "will a RTX 4090 at least make my tiny peepee seem bigger". And I am.. personally hoping that the answer to that question.. is yes.
Won't make it bigger, but it will show all the details in 4K at a high refresh rate.
 
I mean, I don't know why that's shocking. It's not like it's a 4080 with less RAM; it's physically a different GPU chip on there. Everything about it says "Hi, I'm the 4070," but for whatever reason Nvidia decided to call it a 4080.
Still (it could be an especially bad benchmark), with the 4090 more than doubling a 3090 Ti, seeing the 4080 12GB doing only 44% more than a 3070 becomes a bit of a letdown.

The 3070 was beating the 2080:
https://www.techspot.com/review/2124-geforce-rtx-3070/

At launch it won by 22% at 1440p and 27% at 4K; here the 4080 (12GB) beats an RTX 3080 by 11%, so it would not even be a particularly good xx70 card.

Obviously it's a yet-to-be-released game, maybe played at unrealistic settings as a worst-case scenario for the VRAM, and maybe in most cases it will do much better (in F1 it beats the 3080 by 24%, in Flight Sim by 22%, and it keeps up with the 3090 Ti like the 3070 did with the 2080 Ti, which would make it a good xx70 card like the 3070 was).

* Had it launched at an xx70 price, obviously, it would have been a good product.

That level of hubris on the xx80 series makes me wonder how little they think of the upcoming RDNA 3 competition.
 
Am I reading the graph wrong?
I mean I look at the 4080 12g and see it roughly 1.5x faster than the 3070. I mean we all know the 4080 12g is what the 4070 was supposed to be but for some reason, Nvidia decided to say F-that! to the past decade or so of their naming conventions because... profit?
 
Am I reading the graph wrong?
I mean I look at the 4080 12g and see it roughly 1.5x faster than the 3070. I mean we all know the 4080 12g is what the 4070 was supposed to be but for some reason, Nvidia decided to say F-that! to the past decade or so of their naming conventions because... profit?

Well there are no 3rd party benchmarks of either 4080 card yet. Anything that comes from Nvidia is cherry-picked numbers from one of the few games that actually support DLSS. I think that anyone considering a 4080 card should probably just get a 3090 or 3090 Ti unless they only play a few games and know for sure that they all support DLSS. The 4080 12gb is slower than the 3090 cards in most cases when DLSS is not used.
 
I am hoping whatever they call the 4070 is essentially 95% of the performance of the 4080.
I’m hoping the 4080 12gb just fades away as a paper launch.

I think it only exists as a placeholder and a partial shot across the bow at AMD, which is rumoured to be well below the 4090 in performance at their top end. But it sits there as a price/performance goalpost that AMD has to dodge around one way or another.

Either way, the name of the 4080 12GB sucks, and unless Nvidia is doing something dramatically different with their product lineup or just fucking with us for the lolz, it really shouldn't be a thing.
 
I think the reality is the 4090 was not supposed to be a GeForce card at all. It's a Titan in all but name, since architecturally speaking, everything about AD102 seems tailored for Titan-class workflows rather than gaming (compute & data science, video editing and rendering, AI, and most importantly Taylor Swift deepfakes). But then again, what is a GeForce and what is a Titan anymore? I guess they were only ever marketing abstractions.

In the end everyone will just have to ask themselves "will a RTX 4090 at least make my tiny peepee seem bigger". And I am.. personally hoping that the answer to that question is please-god-yes.
Quoted for PeePee
 
Looking at this shows you just how much of a joke the 4080 12GB really is. At only 1440p the 4090 has a 71% gap on the 4080 12GB, so just imagine at 4K. https://videocardz.com/newz/nvidia-geforce-rtx-4090-delivers-500-fps-in-overwatch-2-at-1440p-ultra


For some perspective, a 3090 is only 47% faster than even just a 3060 Ti at 1440p, according to TechPowerUp. https://tpucdn.com/review/nvidia-ge...ion/images/relative-performance_2560-1440.png


So even if the 4080 12GB were called a 4070, it would probably be the weakest 70-class card ever released.
The 3090 is still CPU-limited at 2560x1440. The gap is 64% at UHD 4K.
[attached benchmark chart]
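Side note on reading these gaps: "A is 71% faster than B" is not the same as "B is 71% slower than A." A quick sketch with made-up round numbers (not review data):

```python
# Relative performance gaps, expressed both ways.
def faster_pct(fps_a: float, fps_b: float) -> float:
    """How much faster card A is than card B, in percent."""
    return (fps_a / fps_b - 1.0) * 100.0

def slower_pct(fps_a: float, fps_b: float) -> float:
    """How much slower card B is than card A, in percent."""
    return (1.0 - fps_b / fps_a) * 100.0

# Illustrative round numbers only:
a, b = 171.0, 100.0
print(f"A is {faster_pct(a, b):.0f}% faster than B")  # 71% faster
print(f"B is {slower_pct(a, b):.0f}% slower than A")  # ~42% slower
```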
 
I would have canceled both:
New 4080 less cut down
4080 16gb moved to 4070
4080 12gb moved to 4060

I bet they are going to be printing new 4070 boxes/stickers for the 4080 12gb.
 
I would have canceled both:
New 4080 less cut down
4080 16gb moved to 4070
4080 12gb moved to 4060

I bet they are going to be printing new 4070 boxes/stickers for the 4080 12gb.
Hoping they use this as an opportunity to restructure both of the 4080s.
The 12 GB needs to be renamed to 4070, and both cards need a price cut of at least $200, using the 4090 as a baseline.

The 16 GB model would probably do fine at $1200 but is there really a market for the 12 GB at $900? Nvidia must have seen the writing on the wall.
 
Hoping they use this as an opportunity to restructure both of the 4080s.
The 12 GB needs to be renamed to 4070, and both cards need a price cut of at least $200, using the 4090 as a baseline.

The 16 GB model would probably do fine at $1200 but is there really a market for the 12 GB at $900? Nvidia must have seen the writing on the wall.
Pricing is intentionally high, early adopters will pay through the nose, the regular folks will snap up the remaining 3000 series stock, and when the new AMD cards enter the market Nvidia can readjust accordingly.
AMD's investor report indicates they are trying to keep their ~50% margin on their products, so I don't expect them to be cheap, but they are also indicating they expect a drop of $1.1B in consumer sales this year, so I also don't expect them to make a lot of the cards either.
 
I think the 4080 12 gig was a planned scam from the very beginning and was never going to be released. That is why there were no FE cards and there have been no leaks of AIB 4080 12 gigs either. It was only done as a press release to help sell the 30-series cards. And I really think the 4080 16 gig is going to get a price cut either at launch or very soon after, unless the competing AMD cards are complete and utter junk.
 
The big green marketing machine is afraid and tired of mean memes. They hear the cries of the gamers! More LHR cards! Renamed relaunches for more reviews! Er, price reductions!


However, if the 4080 12GB becomes a 4070, is it really a 4070 if they sell it for $699?
 
The big green marketing machine is afraid and tired of mean memes. They hear the cries of the gamers! More LHR cards! Renamed relaunches for more reviews! Er, price reductions!


However, if the 4080 12GB becomes a 4070, is it really a 4070 if they sell it for $699?
$699 is certainly a bit rough, and they'll only be able to get away with that if AMD isn't competitive at all.
 