NVIDIA CEO Jensen Huang hints at ‘exciting’ next-generation GPU update on Tuesday, September 20th

The 4090 isn't priced too badly compared to its projected performance. The pair of 4080s is simply priced too high from what I can tell at this point. Maybe reviews will cool the situation off a bit, but I doubt it.
Performance is 25% more than the previous gen. That is from Jensen himself; that is exactly what he said in between talking about 4x the performance with DLSS. To me that seems like a lot to drop on a card just for gaming.

All the 2-4x marketing spin crap is with DLSS 3.0, a new version which is doing some stuff we would have called hinky a few years back. I'll believe an image with 1/4 or more of the frames being generated is great when I see it myself. (I want to see good reviewers talk about lag.)
I'm not saying DLSS 3.0 won't be fantastic... but I'll wait till real people actually use it and say it's fantastic. I personally thought DLSS 1.0 was complete garbage... they got a handful of people to buy in because AI AI AI. This is going to be great in the future. DLSS 2.0 is for sure improved, but if you're not playing games that support DLSS, who cares.

The main issue with DLSS is simple: it needs developer support. I already went through Nvidia's hilariously bad list of ray-traced titles... there are a handful of good RT games, yes. But 5 years in they still have only 89 ray-traced titles on the market... and as I pointed out earlier, a large number of games on that list are utter garbage indie games made by guys in their spare time that don't even have triple-digit review counts left for them on Steam (and in almost every case have very low scores). AAA titles can be counted on your fingers. DLSS is basically in the exact same boat.
 
Because proving objectively that one manufacturer's drivers are better than the other's is pretty difficult. How are you proving that they are better?
You're not wrong... they are superior IMO. And it is my opinion... but I have touched an Nvidia card before, unlike most people claiming the reverse. I have AMD and Nvidia systems around... I have young adult boys (OK, the oldest is over 30, so I probably can't say that anymore) who have their own machines with various different cards... and the boys' friends all know that you call Mr. D if you're buying parts or if your stupid Battlefield keeps crashing. I have spent more time trying to figure out why the boys' friends' 2080s randomly crash than I should have had to. I have a couple of foster kids right now, and the wife of one of our boys' friends does respite for us... I swear every single weekend I go to pick the kids up at their place he has some new issue for me to diagnose: the 2080 Super card seems fine, isn't overheating, the power supply is solid.

Why do I prefer AMD? (Other than the lack of strange issues in general.) AMD doesn't require me to create an account... already scoring points. AMD has features that just work in every title. Super Resolution may not be as clean as DLSS 2.0... but in quality mode it's pretty darn hard to tell the difference. And I can turn Super Resolution on in ANY game. Anti-Lag/Radeon Chill... I use both: Anti-Lag for newer AAA games, Chill for older games. Chill alone sells me on AMD cards going forward. I still play plenty of older games (as I believe many people do), and I use Chill because I don't need my card spinning its fans up to play a 6-year-old game. I set Chill to my FreeSync range, and in a bunch of the games I play my GPU doesn't even turn the fan on... my case is well vented and it just doesn't need the GPU fans 3/4 of the time. Playing those older titles, Chill spins the game down to 50 FPS when the action is low and the thing runs cool and quiet. Built-in overclocking and metrics... no third-party software required. All in one package... game settings, basic graphics settings, overclocking, metric overlays, per-game settings. Everything in one driver. ALT-R to change settings, ALT-R back to gaming.
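For anyone curious, conceptually Chill is just a dynamic frame cap tied to input activity. A minimal sketch of that idea (this is NOT AMD's actual driver logic, and the 50/144 bounds are hypothetical stand-ins for a FreeSync range):

```python
import time

CHILL_MIN_FPS = 50    # cap while input is idle (bottom of the FreeSync range)
CHILL_MAX_FPS = 144   # cap while input is active (top of the FreeSync range)

def frame_budget(input_active: bool) -> float:
    """Target frame time in seconds for the current frame."""
    return 1.0 / (CHILL_MAX_FPS if input_active else CHILL_MIN_FPS)

def present_frame(render, input_active: bool) -> None:
    """Render one frame, then sleep out the rest of the frame budget."""
    start = time.perf_counter()
    render()
    remaining = frame_budget(input_active) - (time.perf_counter() - start)
    if remaining > 0:
        time.sleep(remaining)  # the GPU idles here, which is why the fans can stay off
```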

I haven't had a hard crash in over 3 years... and perhaps 1 or 2 soft recoveries, which were my own fault from messing with settings.

Long-term support... in the wife's machine I have an old RX 570. Super Resolution, Anti-Lag, Chill, and almost all the goodies are in her settings as well.
 
AD104 is a little baby boy.

[Chart: die sizes of Nvidia gaming GPUs]

Source: https://www.reddit.com/r/nvidia/comments/xlzo4x/die_sizes_of_nvidia_gaming_gpus/
 
You're not wrong... they are superior IMO. [snip]
I've used both and I'd say these days they are about even. I don't have an account for Nvidia (I still download drivers the old-school way), and both will occasionally break compatibility with certain games here and there. The feature sets are pretty similar, so it's mostly just fanboys who would claim one side sucks and the other is great.
 
Oh.... OH..... I see it!!!

Let me put my tinfoil hat on.

This was Nvidia's plan all along. Price consumers out of the market. Start pushing their GeForce NOW solution as an alternative and make people start paying for it as a subscription service. Eventually, they roll out "tiers" of subscriptions where the higher the tier, the higher-level the hardware you get access to. It gives them a continuous stream of revenue, and overall it will cost gamers less because the subscription fee is lower than buying a card.

BUT!!!

You're sharing the hardware with other people. This is the AWS business model.

Tier 1 - Dedicated hardware, accessible anytime, best latency (no time limit)
Tier 2 - High-end shared hardware, accessible as long as it's available (no time limit)
Tier 3 - Mid/low-end shared hardware, accessible as long as it's available (time limit)
Tier 4 (free) - Only accessible when it's available, may get kicked off during peak hours (time limit)

Oh Nvidia.... you sly devil.

/tinfoilhatoff
 
MSI also with an MSRP 4090 - surprising, since they were probably the most aggressive AIB in terms of marking up all their SKUs higher and more immediately than other AIBs during the 30-series run.

[Image: MSI RTX 4090 listing at MSRP]
 
MSI also with an MSRP 4090 - surprising, since they were probably the most aggressive AIB in terms of marking up all their SKUs higher and more immediately than other AIBs during the 30-series run.


One with a slight overclock is probably $200 more, knowing the way they do business.
 
MSI also with an MSRP 4090 - surprising, since they were probably the most aggressive AIB in terms of marking up all their SKUs higher and more immediately than other AIBs during the 30-series run.
I know, this is why I didn't buy an MSI card this last gen. Their cheap Ventus models had some of the worst cooling and a plastic "graphene... ok" backplate.
The Trio was a well-built card, but it was, as you say, priced higher and visually not so great in my opinion. It was also longer, as I recall, and wouldn't fit in my case, but that is a different argument. I guess my point is: maybe they learned their lesson on being the overpriced loss leader?

Edit: I think the one in your post looks like shit!
 
MSI also with an MSRP 4090 - surprising, since they were probably the most aggressive AIB in terms of marking up all their SKUs higher and more immediately than other AIBs during the 30-series run.


Here's hoping they offer a similar model at MSRP for each GPU (80/70/60/60 Ti/50 Ti/50, etc.).
 
OK, you brought up that Nvidia were breaking rules/regulations. I asked you to point to the rules/regulations they were breaking; I meant specifics.
I quite literally gave you links to the legislative statutes from several territories which govern consumer laws; the fact that you cannot even be bothered looking at them (or even doing basic Google searches) is enough to tell me that I am dealing with someone who is completely disingenuous or intellectually lazy.

And you state that there is plenty of case law where this exact point has been decided against marketers/sellers. Give me an example that applies to this situation exactly. I don't think you will find any.

US decisions:
- Mantikas v. Kellogg Company, No. 17-2011
- Locklin v. Strivectin Operating Co.

Aus decisions:
- National Exchange Pty Ltd v ASIC [2004] FCAFC 90
- ACCC v GlaxoSmithKline Consumer Healthcare Australia Pty Ltd [2019] FCA 676

Canada decisions:
- Telus Communications Co v. Rogers Communications Inc., 2009 BCSC 1610
- Commissioner of Competition v. Forzani Group Ltd.

These are but a handful of many, but I don't know why I even bothered, because I doubt you will take the time to read and understand them.

Rebranding/renaming has been part of GPUs since the beginning. We look at the specs and internal SKU name (like GK104, GF100, etc.) and we try to base our predictions about naming/price/performance on that. But the fact is, Nvidia can name the final product anything they like. That's where your whole legal argument falls down.
The legal argument doesn't depend on "rebranding"; you are conflating two completely different issues.

Throughout the years, both AMD and Nvidia have used different SKUs for different products, and sometimes the same products have had different SKUs. The final product name hasn't always matched up with what we, the tech crowd, predicted.

When the GTX 680/670 came out there was a lot of shouting from the tech community because Nvidia used their midrange SKU, the GK104. There was a lot of "outrage" about it. Charging $499 for a midrange SKU was a disgrace, down with Nvidia, etc. Yet no legal action. Hmm, odd that.
Again, nothing to do with rebranding.

Uh huh.

But the most relevant example might be the fine Nvidia got for misleading specifications/advertising. The GTX 970 had different specs than the ones they advertised. Do you really think that Nvidia are going to make a mistake like that again? Especially now when they are struggling because of huge overstocks of GPUs and AMD breathing down their neck?
Yes, companies constantly fuck up and do the wrong thing even when previously sued, and especially if they think they can make a quick buck.

So you see the law works pretty quickly in cases like this. On at least two previous occasions Nvidia used different names than the ones tech forums thought they would. The GTX 680 should have really been the GTX 660, going by the previous release's naming scheme.
The law isn't some sentient being that decides of its own motion to take action; it's dependent on aggrieved persons deciding to pursue causes of action and legal recourse. And the use of different names isn't the issue; you seem incapable of understanding the distinction.

And the 1060 6GB and 1060 3GB had the same name despite a drop in performance that went beyond the memory difference.
And as I said repeatedly, it is context-specific - i.e. what information was contained on the ACTUAL packaging and in marketing material to make it clear that the 1060s were different products apart from memory configurations, because if the specifications were clearly marked, then that of itself COULD be sufficient to avoid any claims of misleading/deceptive naming.

Ah, the crux of the matter, as you say. You say they changed the name. Prove it. This is an entirely new lineup of GPUs. How do you know what they were going to call that GPU?
I didn't say they "changed" the name, and it absolutely makes no difference what their reasoning or thought processes were leading up to that point. All that matters is how the products are labelled and marketed.

Nvidia, love them or hate them, they are extremely good at marketing. And that's all this is, marketing.
Marketing by its very nature is subject to the law.

There will be no legal issues.
Sure, as long as the products are appropriately labelled and marketed so it's clear what all the differences are.

I wasn't indignant until I read this line. What a load of drivel. You talk about the real world and then write a statement like that. I asked you a question in my last post, but you didn't answer. If someone came to you about buying a new GPU, would you tell them to buy based on Nvidia's/AMD's slides and marketing, or would you tell them to wait for reviews? Would you tell them to just go into the store and buy it based on the specs on the box? And that applies to anything, not just graphics cards. It's basic common sense. If you were buying a new TV, would you just rock into the store, look at the specs, and buy it purely on those specs? Or would you go check to see if there were any issues?
Cry harder, that's the way it works. Your hypothetical is as asinine as it is irrelevant, because there are plenty of people who buy things whilst in store based on the representations and specs on the box. Consumers have no obligation to do any due diligence apart from reading and comprehending the information presented to them at the time they buy. Whether or not it is prudent for them to do so is irrelevant, and that is categorically supported by court decisions across jurisdictions.

You know damn well that advertising can be completely honest and still not tell the whole story. And people make a mess buying things all the time because they don't get advice or do any level of research. That's the actual real world. And you are right, nobody is obliged to do any research at all if they don't want to. I never said otherwise. Is that how you go through life? Would that be how you recommend anybody else go through life? No, I don't think so. Unless you are going to lie just because it doesn't suit your narrative.
It doesn't matter what I do or don't do, because legal systems deal with constructs and principles that are applied to fact patterns. And yes, advertising has been held to be misleading/deceptive even when factually true because of the way the information has been presented or some key detail has been omitted.

Happy to keep debating you day and night on this point because you clearly have no idea, although at this rate all you will achieve is getting this thread locked.
 

The reason why I asked you for a specific law was just to waste your time. You see, Nvidia aren't breaking any laws with their naming scheme.

Nvidia released the 1060 the same way they are releasing the 4080: announcement first and release after. All the information that came on the box of the FE editions was the system requirements, warranty information, and what's in the box. And Nvidia, like all companies, change the names of the products they are selling to make them sound better so they sell better. They did it with the Titan: instead of releasing their top-end card as an x80, they called it a Titan card and charged way more.

the fact that they have changed the name to create the impression that the price hike is justified.
I didn't say they "changed" the name,

You are finding it hard to keep your waffle straight.

And, no, I won't be debating with you anymore on this. There is no point; your posts are, well, full of nonsense. And you still didn't answer my question about advising somebody before buying. I knew you wouldn't, because it wouldn't fit your narrative.

As for myself and the 4xxx cards: no, I won't be buying, and I will be advising anyone that asks me not to buy one. The prices are ridiculous. Just like I didn't buy any of the 3xxx cards.
 
When does the ti release???
With the suit. Which is usually 6 months out, but could be a year.
On Topic:

QUESTION: Am I excited about Nvidia's new generation of Ada Lovelace GPUs?

ANSWER: No... no, I am not.
They could've released it as a VPU add-in card with the dual AV1 engines + AI-assisted frame generation. I'd buy that, for the correct price.
 
Also

"*RTX 4080 performance taken from Nvidia's presentation and transformed by scaling RTX 3090 TI result from Techpowerup."

So chances are very high that Nvidia's presentation was cherry-picking which tests they ran to give the improvements they are quoting, so I wouldn't be surprised if this is actually a step backwards: yeah, it is faster, but it's not the same amount of faster as the price you're paying. "The more graphics cards you buy, the more you save!" indeed.
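To spell out what that footnote means, "transformed by scaling" is just this kind of back-of-the-envelope projection (the baseline FPS and uplift below are made-up numbers for illustration, not real benchmark data):

```python
# Projecting an unreleased GPU's performance from a vendor slide: take a
# measured baseline and multiply by the claimed relative uplift.
baseline_3090ti_fps = 120.0   # hypothetical measured result, e.g. from a TechPowerUp review
claimed_uplift = 1.25         # hypothetical "4080 = 1.25x the 3090 Ti" slide claim

projected_4080_fps = baseline_3090ti_fps * claimed_uplift
print(f"Projected RTX 4080: {projected_4080_fps:.0f} FPS")  # -> 150 FPS

# The catch: if the slide's uplift came from cherry-picked titles, the
# projection inherits that bias wholesale.
```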
 
It may technically be superior, but it kills the original style.
I suppose that depends on whether you think the original style is somehow tied to dated graphics. I, for one, don't believe that blurry graphics, dated shaders, or crap shadows are intrinsic to the charm or style of Morrowind. I believe it's entirely possible to update assets and graphics in the spirit of the original, and considering that the assets are literally the originals but upscaled, I'd have to say it's quite a ridiculous critique. But hey, some people feel compelled to be contrarian, and I won't stop you.
 
I suppose that depends on whether you think the original style is somehow tied to dated graphics. I, for one, don't believe that blurry graphics, dated shaders, or crap shadows are intrinsic to the charm or style of Morrowind. I believe it's entirely possible to update assets and graphics in the spirit of the original, and considering that the assets are literally the originals but upscaled, I'd have to say it's quite a ridiculous critique. But hey, some people feel compelled to be contrarian, and I won't stop you.
Dated graphics have nothing to do with the mood and lighting intensity. If you can't see the difference when it's almost literally night and day, I don't know what to tell you.
 
They focused too much on the RTX 4090 and marketing, so I guess they forgot about the new drivers for the RTX 3060 12GB.
 

The performance to pay attention to in this graph is the "Today's games" section. 3090 Ti to 4090 is 1.75x for Resident Evil Village, 1.5x for Assassin's Creed, 1.65x for Division 2, and 2x for Warhammer. These are very nice performance gains, but it's also a marketing presentation, which means these are likely the highest-performing games gen to gen out there, and the rest of the games are seeing smaller gains. 30% to 50% is probably what we will see across the board. Those are still nice gains, but for 2.7x the number of transistors it seems low. Granted, some of those new transistors are DLSS 3.0 hardware, and if a game gets support for DLSS 3, it could turn the tables pretty drastically. It remains to be seen whether DLSS 3.0 1) gets more widespread adoption, and 2) doesn't introduce any input lag. Regarding performance gains over the previous gen, your best bet is to wait for reviews on performance (gen to gen) in games you actually play, as that is all that really matters. If you are on 2xxx or older, 4xxx is going to be a very nice performance uplift for you. Or, with the performance to spare, you can crank up image quality.
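Putting rough numbers on those four titles (remember, these are Nvidia's own marketing figures, so treat the result as a best case):

```python
from math import prod

# Gen-on-gen uplifts quoted from the presentation's "Today's games" section.
uplifts = {
    "Resident Evil Village": 1.75,
    "Assassin's Creed": 1.50,
    "Division 2": 1.65,
    "Warhammer": 2.00,
}

geo_mean = prod(uplifts.values()) ** (1 / len(uplifts))
transistor_ratio = 2.7  # ~2.7x the transistors gen over gen, per the figure above

print(f"Geometric mean uplift: {geo_mean:.2f}x")                     # ~1.72x
print(f"Perf per transistor:   {geo_mean / transistor_ratio:.2f}x")  # ~0.64x
```

So even on the cherry-picked titles, raw performance per transistor goes down, which is exactly why the DLSS 3 hardware argument matters.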
The performance on the right side is with DLSS 3.0, so while impressive, it only applies to games that support DLSS 3.0. That list is published; I was looking at it the other day. 30 games, of which only 5 I had ever heard of and only 3 I play. Hopefully that support gets more widespread. I don't see this being a huge hurdle, as DLSS is now pretty much available in all new games as far as I am aware.

Let's be real... the 4080 12GB is a 4080 in name only. It has a 192-bit memory bus. Guaranteed that's either a cut-down 104 chip or a full-fat 106 chip.

The 4080 12 GB is a joke.

They likely named it a 4080 to appease the AIBs. I suspect the 4080 12GB could be a short-lived SKU that is just here to help the market absorb the remaining 3xxx inventory; hence also the higher price than where they could have placed it.
Comparing it to the rumored specs for the 4070, it has a nice clock speed bump. The 4080 12GB would more closely compare to a 4070 Ti, based on what the 4070 specs are. Or the 4070 could also have less RAM, who knows. Time will tell and all that.

Regarding the naming in general, it represents the performance in the stack relative to its brothers, price, and performance relative to previous generations. The GTX 680 was originally going to be a lower-end part, but the performance was so good they marketed and sold it as a 680. In that regard, I am not sure it's worth getting too upset over.

Regarding the memory bus width, what really matters is the bandwidth, and that really only matters if it is a bottleneck. The 4080 12GB (or 4070 Ti, if you want to think of it as such; I wouldn't blame anyone for that) is basically equaling the 3090 Ti in performance. That's pretty impressive no matter how you slice it. Joke? Nope. Bring DLSS 3 into the picture and it doubles the 3090 Ti... I suspect the only weakness that the 504 GB/sec will impart is at 4K resolution and higher, but even this remains to be seen.
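For context on where that 504 figure comes from: peak memory bandwidth is just bus width times effective data rate. A quick sanity check (this assumes the commonly reported 21 Gbps GDDR6X data rate; treat that as an assumption):

```python
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width in bytes) x (data rate in Gbps)."""
    return (bus_width_bits / 8) * data_rate_gbps

print(bandwidth_gb_s(192, 21.0))  # 4080 12GB: 504.0 GB/s
print(bandwidth_gb_s(384, 21.0))  # 3090 Ti:  1008.0 GB/s, for contrast
```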

Some have complained that DLSS makes IQ take a hit, but you really have to try it for yourself and see. Back to recommending you wait for reviews on both performance AND IQ uplifts, in games you actually play, before you decide.

Correct. I think Nvidia is going to wait until 3090 chips are almost gone, holding its price above $900, before releasing the 4070.

The 4080 12GB (really a 4070) was priced higher, given the 4080 name, and made AIB-only to prevent the bottom from dropping out on 3090 chips.
Imagine a 4070 12GB for $699 trouncing a 3090 when they have a warehouse full of them.

I wouldn't be surprised if he made a price increase on the 4080s at the last minute; that's why the pricing slides didn't show all of the model prices.
It's like the slide presenter was told to stop. lol. We immediately saw they were a bad value compared to the 4090.

Last gen the 3090 was $1500 and the 3080 was $699. How the 4080 16GB got a $500 price increase is beyond me.

I suspect the same. The 4080 price hike is to help the AIBs continue to move the 3xxx inventory. They will still have to cut prices on 3xxx stock, and the higher 4080 pricing helps them balance the books.

...Nvidia needs to be held accountable for the bullshit they spew during their events.

Marketing is marketing. You always have to read between the lines. A company makes a claim, and if it is true in ONE game, it's just that: TRUE. They all have lawyers making sure everything presented cannot be construed as false advertising.

There are other things to get upset about; this is pretty low on the list, and it's par for the course for all of these companies.

"But AMD has been more accurate last few years..." That's because they are the underdog. They cannot afford to be seen in any worse light than their product already places them. Making claims when your product is already the lower performing one, would just get pointed out and hammered home into peoples brains. Pointing out "slightly less performance, for slightly less price" is something AMD has done for decades.

lol, my god... read the fine print. Using DLSS performance mode. No one will EVER use that crap performance mode since the IQ is ass. Again, terrible graphs from Nvidia.

Terrible graphs? They tell you what everything is, specific game performance even. As for IQ, that remains to be seen. I wouldn't say it is "ass", based on what I have seen of DLSS 2.0.

Ok... crystal ball time.
....
Mark my words: At 4K, the 4080 12GB will be inferior to the 3090/Ti.
Why would this matter one way or the other, if it turned out to be true?
There's a possibility that the 4080 16GB will be faster, but if so, it will probably be by 10-20% max, and even then, it won't be a clean sweep victory. The 3090 has a stupid amount of memory bandwidth at its disposal.

....

/crystalball
And, assuming you are saying the 4080 16GB will be faster than the '3090', maybe only by 10-20%: they are the same price, so who cares?
They only had to pay $30 back to GTX 970 customers. They'll do it again.

Regarding the 4080 12GB card's name? It's not the same situation.

A lot of gaslighting going on regarding the name of that card, but it wouldn't hold up in court. And the specs are on the box.
 
Dated graphics have nothing to do with the mood and lighting intensity. If you can't see the difference when it's almost literally night and day, I don't know what to tell you.

And that is why you need to design a game around the technologies at hand. Introducing new technologies makes other things possible, but you just can't slap them onto an older game. I'm sure they could use ray tracing and still have the scene look similar to the original, but that would take more time and effort to redesign everything to capture the dark lighting of the original.
 
For something like mood set by the amount of light, wouldn't a simple cinema-like color grade do a lot of that work? I am not sure the result we see was easier than otherwise, either; they could have wanted to make the change in texture and lighting very visible to the eye.
 