NVIDIA CEO Jensen Huang hints at ‘exciting’ next-generation GPU update on September 20th Tuesday

"Up to" refers to the maximum value, not the average. Nvidia releases individual graphs you can look at. Don't let MLID bring down your IQ.
Charts here:
https://hardforum.com/threads/nvidi...r-20th-tuesday.2021544/page-9#post-1045455558
Quite a few of his leaks and analyses are pretty damn good compared to 90% of the industry. Not saying he is right all the time, but he is right and spot on a hell of a lot more than most people.

As the boss said....Nvidia needs to be held accountable for the bullshit they spew during their events.
 
Quite a few of his leaks and analyses are pretty damn good compared to 90% of the industry. Not saying he is right all the time, but he is right and spot on a hell of a lot more than most people.

As the boss said....Nvidia needs to be held accountable for the bullshit they spew during their events.
I kinda doubt they will ever be held accountable. Every company likes to talk up their new book and try to make it look as good as they can. And with carefully worded stuff like "up to" they can still be technically right.
 
I kinda doubt they will ever be held accountable. Every company likes to talk up their new book and try to make it look as good as they can. And with carefully worded stuff like "up to" they can still be technically right.
Can you show me any current new gen game that can play at 8k 60fps on a 3090 as Nvidia stated? There is no "up to" in that comment.
 
Quite a few of his leaks and analyses are pretty damn good compared to 90% of the industry. Not saying he is right all the time, but he is right and spot on a hell of a lot more than most people.

As the boss said....Nvidia needs to be held accountable for the bullshit they spew during their events.
And who will hold them accountable? Nvidia can just say hell to the gaming market if they want to. We are at their mercy and they know it.
 
Can you show me any current new gen game that can play at 8k 60fps on a 3090 as Nvidia stated? There is no "up to" in that comment.

I don't have an 8K monitor to test it, but it should be possible on games that have a DLSS "performance" mode. That mode typically drastically reduces image quality, but it can skyrocket your FPS. There are new games that can run at 120FPS at 4K in that mode, so 8K/60 should be feasible.
 
I don't have an 8K monitor to test it, but it should be possible on games that have a DLSS "performance" mode. That mode typically drastically reduces image quality, but it can skyrocket your FPS. There are new games that can run at 120FPS at 4K in that mode, so 8K/60 should be feasible.
So you are saying that people with an 8K monitor who want to run at 60fps should basically lower the resolution to get those 60fps at "8K", even though it really isn't playing at 8K?

Ok Gotcha.
 
Can you show me any current new gen game that can play at 8k 60fps on a 3090 as Nvidia stated? There is no "up to" in that comment.
You could probably find one. I don't really know of many monitors that even support 8k so it's sort of a throwaway statement.
 
So you are saying that people with an 8K monitor who want to run at 60fps should basically lower the resolution to get those 60fps at "8K", even though it really isn't playing at 8K?

Ok Gotcha.
I don't have an 8K monitor to test it, but it should be possible on games that have a DLSS "performance" mode. That mode typically drastically reduces image quality, but it can skyrocket your FPS. There are new games that can run at 120FPS at 4K in that mode, so 8K/60 should be feasible.

Your math appears to be off by a factor of two: 4K120 is only 8K30 unless you further downgrade image quality (remember, 8K has 4x as many pixels as 4K, not 2x).
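The pixel arithmetic behind that correction is easy to check. A quick sketch; the assumption that frame rate scales linearly with pixel count is a simplification, since real workloads rarely scale perfectly:

```python
# Sanity check: 8K has 4x the pixels of 4K (double the width AND double the
# height), so at equal per-pixel cost, 120 fps at 4K maps to ~30 fps at 8K,
# not 60.

def pixels(width: int, height: int) -> int:
    """Total pixel count for a given resolution."""
    return width * height

uhd_4k = pixels(3840, 2160)   # 8,294,400 pixels
uhd_8k = pixels(7680, 4320)   # 33,177,600 pixels

ratio = uhd_8k / uhd_4k       # 4.0, not 2.0
fps_at_8k = 120 / ratio       # ~30 fps, assuming fps scales linearly with pixel count
print(f"8K/4K pixel ratio: {ratio}, projected 8K fps: {fps_at_8k}")
```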
 
Looking at this slide, disregarding the "Next generation" section: the 4090 looks like going from a 1080 Ti to a 3080. The 4080s look like Turing 2.0.

Unless you are into a game that supports DLSS 3.0 or one that is heavy on ray tracing, I just don't see the value in the 4080 based on this slide. The 4090... sure, if you want to spend the cash.
The reviews may tell a different story.

[attached slide: GeForce RTX 4000 series gaming performance]
 
Looking at this slide, disregarding the "Next generation" section: the 4090 looks like going from a 1080 Ti to a 3080. The 4080s look like Turing 2.0.

Unless you are into a game that supports DLSS 3.0 or one that is heavy on ray tracing, I just don't see the value in the 4080 based on this slide. The 4090... sure, if you want to spend the cash.
The reviews may tell a different story.

View attachment 512288
Well, it's a comparison to a 3090 Ti. If you take the 3080 10GB (which most people have) and compare it to the 4080 16GB, and let's assume you get nearly 2x raster performance, how is that not a significant performance increase?
 

Well, it's a comparison to a 3090 Ti. If you take the 3080 10GB (which most people have) and compare it to the 4080 16GB, and let's assume you get nearly 2x raster performance, how is that not a significant performance increase?
Is there anything to base that assumption on?
 
It takes zero brain power to be pessimistic about something.
I am generally overly optimistic. I am pessimistic about this because they are quoting 2x-4x performance improvements, which look to be mainly DLSS-related, and then have a sneaky graph at the end with more sane projections in more rasterized scenarios.
 
Looking at this slide, disregarding the "Next generation" section: the 4090 looks like going from a 1080 Ti to a 3080. The 4080s look like Turing 2.0.

Unless you are into a game that supports DLSS 3.0 or one that is heavy on ray tracing, I just don't see the value in the 4080 based on this slide. The 4090... sure, if you want to spend the cash.
The reviews may tell a different story.

View attachment 512288
lol, my god....read the fine print. Using DLSS performance mode. No one will EVER use that crap performance mode since the IQ is ass. Again terrible graphs from Nvidia.
 
Is there anything to base that assumption on?
I'm just using that chart and extrapolating based on that. It's very rough I'll give you that and I'm not buying anything until we get independent reviews but 1.5-2.0x raster would be my guess going from 3080 10gb -> 4080 16gb.

Again I fully admit that could be wrong/off.
 
Well, it's a comparison to a 3090 Ti. If you take the 3080 10GB (which most people have) and compare it to the 4080 16GB, and let's assume you get nearly 2x raster performance, how is that not a significant performance increase?
Nvidia (and other manufacturers) want to segment you with naming conventions. Consumers segment themselves with price brackets. I think what a lot of folks are getting at is that Nvidia or AMD can call their cards X080 or X800, but if the price clearly aligns with a previous gen's top-tier card, X090 or X900, these miraculous 2x performance figures are meaningless. It would be like rebranding last gen's 3090 Ti a 4060, keeping the price at $1500, and then preaching to the tech press that the 4060 is 4x faster than the 3060. Yeah, technically that is true, but you didn't really deliver any increased value to the customer, because you didn't make the 4060 any better for anybody shopping in the 3060 price bracket.
 
Well, it's a comparison to a 3090 Ti. If you take the 3080 10GB (which most people have) and compare it to the 4080 16GB, and let's assume you get nearly 2x raster performance, how is that not a significant performance increase?
You have a point. I just looked into the 3090 Ti. It is faster than I remembered. I had it in my head from the early 3080/3090 reviews, where there was a ~8%-12% performance difference. I assumed it was just a few percent more than that.
 
I am generally overly optimistic. I am pessimistic about this because they are quoting 2x-4x performance improvements, which look to be mainly DLSS-related, and then have a sneaky graph at the end with more sane projections in more rasterized scenarios.
The rasterized graphs are still over 40% uplift compared to previous gen and 200-400% with RTX. I can't see how someone can see that as concerning.

The only WTF here is the pricing of the 4080 cards. $1200-1500 for an AIB 4080 is insane.
 
I think you're confusing pessimism with skepticism.

Skepticism is connected to value. Pessimism is connected to affordability.
A skeptic will buy if proven wrong. A pessimist will never buy but will complain about it.

Remember when the 3080 was priced at $699 MSRP? These emotions didn't exist.
 
The rasterized graphs are still over 40% uplift compared to previous gen and 200-400% with RTX. I can't see how someone can see that as concerning.

The only WTF here is the pricing of the 4080 cards. $1200-1500 for an AIB 4080 is insane.
The WTF pricing is the problem. Generational uplifts like that generally don’t add $400-$500.

Also, it is showing the 12GB version to be slower than the 3090 Ti in some cases. Which makes me skeptical that the Warhammer and Flight Sim results are DLSS-related and not indicative of rasterization.

Really don't know. I hope it is 2-4x faster. I would let go of my laptop in a heartbeat and get a 4090 if that were the case. Something just feels off with the marketing.

Also I really like dlss. When it works well it’s great, but in most cases I would rather have it off.
 
Skepticism is connected to value. Pessimism is connected to affordability.
A skeptic will buy if proven wrong. A pessimist will never buy but will complain about it.

Remember when the 3080 was priced at $699 MSRP? These emotions didn't exist.
Exactly, if it’s 2x-4x. The pricing is scummy, but I would pay it.

I am just skeptical that they aren’t playing games with the slides.
Hiding the fact that, yeah, you can get huge uplifts (with high latency) in a handful of titles over the life of the card, with more typical uplifts in most titles.
 
The WTF pricing is the problem. Generational uplifts like that generally don’t add $400-$500.

Also, it is showing the 12GB version to be slower than the 3090 Ti in some cases. Which makes me skeptical that the Warhammer and Flight Sim results are DLSS-related and not indicative of rasterization.

Really don't know. I hope it is 2-4x faster. I would let go of my laptop in a heartbeat and get a 4090 if that were the case. Something just feels off with the marketing.

Also I really like dlss. When it works well it’s great, but in most cases I would rather have it off.

If you listen to what Jensen was actually saying... he was clear: 4000 is 25% faster than 3000, while sucking more juice.

All the 2-4x crap was with DLSS 3.0... the card is inventing frames, hence the major uplift. Now, I don't care how the sausage is made; if the AI made a quarter of the frames, fine, as long as it looks fine and doesn't introduce lag or something. I reserve judgement until I hear from a few reviewers, and probably from multiple of them, because [H] reviews aren't a thing anymore.

And even if they say DLSS 3.0 is the shit... unless it's implemented in a lot of games, or at least a good handful I care about personally, so what.
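A toy model of why those generated frames inflate fps without improving responsiveness. The simple doubling and one-frame-buffering assumptions here are illustrative, not measured DLSS 3 behavior:

```python
# Toy model: the GPU renders `rendered_fps` real frames per second and the AI
# inserts one generated frame between each pair, so displayed fps roughly
# doubles. Input latency still tracks the *rendered* frame time (plus a
# buffering penalty, since a rendered frame must be held back to interpolate
# toward), so a 2x fps figure is not a 2x responsiveness figure.
# All numbers are illustrative assumptions, not measurements.

def with_frame_generation(rendered_fps: float) -> dict:
    frame_ms = 1000.0 / rendered_fps
    return {
        "displayed_fps": rendered_fps * 2.0,    # generated frames added in between
        "base_latency_ms": frame_ms,            # unchanged by frame generation
        "buffered_latency_ms": frame_ms * 2.0,  # one extra rendered frame held back
    }

print(with_frame_generation(60.0))
```

So in this sketch, 60 rendered fps displays as 120 fps, while input latency stays at (or above) the 60 fps frame time.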
 
Looking at this slide, disregarding the "Next generation" section: the 4090 looks like going from a 1080 Ti to a 3080. The 4080s look like Turing 2.0.

Unless you are into a game that supports DLSS 3.0 or one that is heavy on ray tracing, I just don't see the value in the 4080 based on this slide. The 4090... sure, if you want to spend the cash.
The reviews may tell a different story.

View attachment 512288

Ooooooo, just noticed something.

The Division 2 has DLSS 2.0. If this slide is being read correctly, the 3090 Ti is being run at 4K w/ DLSS Performance (1080p upscale). The 4080 12GB is able to match it, but only with DLSS 3 and Frame Generation. Marketing slides at their finest.

I'm telling you, the 4080 12GB is going to be a MAJOR disappointment.
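The "4K w/ DLSS Performance = 1080p upscale" reading can be sketched from the commonly cited per-axis scale factors for each DLSS mode. Treat the factors as approximate community-reported values, not an official specification:

```python
# DLSS renders internally at a reduced resolution and upscales to the output
# resolution. Per-axis scale factors below are the commonly cited values for
# each quality mode; treat them as approximations.
DLSS_SCALE = {
    "quality": 2 / 3,
    "balanced": 0.58,
    "performance": 0.5,
    "ultra_performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple:
    """Approximate internal render resolution for a given DLSS mode."""
    s = DLSS_SCALE[mode]
    return (round(out_w * s), round(out_h * s))

# 4K output in Performance mode -> a 1080p internal render, which is why
# the slide's "4K" numbers are really upscaled 1080p.
print(internal_resolution(3840, 2160, "performance"))  # (1920, 1080)
```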
 
Skepticism is connected to value. Pessimism is connected to affordability.
A skeptic will buy if proven wrong. A pessimist will never buy but will complain about it.

Remember when the 3080 was priced at $699 MSRP? These emotions didn't exist.
Yes, and I was ready to buy a 3090, until Nvidia revealed that the MSRP was $1500. Then I realized what a horrible value the 3090 was, and ended up getting the $700 3080 FE instead. The 3080 was a good value considering the performance it provided (20-30% faster than a $1200 2080 Ti). The RTX 4000 series cards, however, are not, based on what we know right now.

This is skepticism, not pessimism.
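That value argument can be made concrete as performance per dollar, using the poster's own figures ($700 3080, 20-30% faster than a $1200 2080 Ti; the 1.25 midpoint is an assumption, not a benchmark):

```python
# Perf-per-dollar comparison. "perf" is normalized to the 2080 Ti = 1.00;
# 1.25 is the midpoint of the quoted 20-30% uplift (an assumption).
cards = {
    "2080 Ti": {"perf": 1.00, "price": 1200},
    "3080":    {"perf": 1.25, "price": 700},
}

for name, c in cards.items():
    ppd = c["perf"] / c["price"]
    print(f"{name}: {ppd * 1000:.2f} perf units per $1000")
# The 3080 comes out at more than double the performance per dollar of the
# 2080 Ti, which is the sense in which it was a clear generational value.
```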
 
Looking at this slide, disregarding the "Next generation" section: the 4090 looks like going from a 1080 Ti to a 3080. The 4080s look like Turing 2.0.

Unless you are into a game that supports DLSS 3.0 or one that is heavy on ray tracing, I just don't see the value in the 4080 based on this slide. The 4090... sure, if you want to spend the cash.
The reviews may tell a different story.

View attachment 512288

Y axis = relative performance, which to be honest is completely meaningless. I want to see numbers, not "relative performance."
 
Ooooooo, just noticed something.

The Division 2 has DLSS 2.0. If this slide is being read correctly, the 3090 Ti is being run at 4K w/ DLSS Performance (1080p upscale). The 4080 12GB is able to match it, but only with DLSS 3 and Frame Generation. Marketing slides at their finest.

I'm telling you, the 4080 12GB is going to be a MAJOR disappointment.
Will it do frame generation in a DLSS 2.0 game? I'd imagine it's more apples to apples than that, with it being DLSS 2.0.

Does make me wonder about the ROP count on the 12GB, though. Probably lower fillrate and memory bandwidth on the 4080 12GB vs. the 3090 Ti; most likely it's just keeping up due to DLSS Performance mode rendering at 1080p and having a newer architecture.
 
The easiest way to find out how these cards are going to perform is just to wait a couple weeks. Unless you're dead set on buying a 4090 at 6AM on 10/12, we'll get exact performance #'s vs. previous cards soon enough. That said, I don't think anyone is excited for or hopeful about the 12GB 4080. At least not at that price. Unless all the other models sell out instantly, that one feels like a day 1 desperation buy only.
 
I'm just sitting here with my 1070 eating popcorn and wondering if this will be the generation where I can get something cheap. Hopefully used prices continue to drop.
got a box with a basic 1080 going strong on the wife's desk (Ryzen 9 3900x), would love a cheap 3070 to pop into that thing. But she's not "complaining" about it yet so I have time, just not a lot of time...
 
I am sure AMD did not send Nvidia an AM5 platform to tune on, like the RX 7000 series will have. It is already known that a 12900K testing platform needs DDR5 to outperform the 5800X3D with DDR4. And being faster on DLSS 3.0 hardware vs. 2.0 hardware doesn't show the baseline performance we want to know; a game like Sniper Elite 5 without DLSS, for example, at native 1080p, tells so much more.
 
You expect that when "up to" is used in marketing it will not be the average case (AMD's latest presentation was so much better about this), but saying that calling these 8K gaming cards was ridiculous, just because a game that didn't exist when the 3090 launched does not play well with ray tracing and everything at ultra, is a bit of the other side of the same coin here.

I have seen a different marketing graph, in a different video, that showed both the best-case scenario and the regular-case scenario, which would have been much more accurate.

i.e., there are many games at launch where the 4080 12GB will do a bit less than a 3090 Ti according to NVIDIA, on old titles without RT on, etc.; we should not expect 2x gains everywhere for the 4090.

This seems a less manipulative marketing bit than just showing the right part (especially until we know the value, in subjective gaming experience, of those AI in-between frames):

[attached marketing graph]
 
You expect that when "up to" is used in marketing it will not be the average case (AMD's latest presentation was so much better about this), but saying that calling these 8K gaming cards was ridiculous, just because a game that didn't exist when the 3090 launched does not play well with ray tracing and everything at ultra, is a bit of the other side of the same coin here.

I have seen a different marketing graph, in a different video, that showed both the best-case scenario and the regular-case scenario, which would have been much more accurate.

i.e., there are many games at launch where the 4080 12GB will do a bit less than a 3090 Ti according to NVIDIA, on old titles without RT on, etc.; we should not expect 2x gains everywhere for the 4090.

This seems a less manipulative marketing bit than just showing the right part (especially until we know the value, in subjective gaming experience, of those AI in-between frames):

View attachment 512369
Missing from that is the settings... DLSS PERFORMANCE mode, which is poopoo in terms of IQ.

[screenshot of the slide's settings fine print]


Full text to be transparent.

[screenshot of the full fine-print text]
 
Can you show me any current new gen game that can play at 8k 60fps on a 3090 as Nvidia stated? There is no "up to" in that comment.
Doom Eternal was their poster boy for a reason; it's easy enough to run.

Battlefield 5, Dirt Rally 2, Death Stranding... I imagine there is a list of titles that were recent in 2020 that you can run at reasonable settings with DLSS, close enough to 60fps for VRR to be OK at "8K". The moment you have HDMI 2.1 and enough VRAM, you can play at 8K; it just becomes a question of what settings make it smooth enough.
 