NVIDIA CEO Jensen Huang hints at ‘exciting’ next-generation GPU update on Tuesday, September 20th

I see it as manufactured outrage by the most savvy consumers possible, but opinions are like assholes. I don't think anyone spending $1000 on this card will be fooled by a naming convention only savvy people are even familiar with.

How hard is it to require specs on the box? Just because you don't think it's an issue, or more likely because it doesn't impact you, it's nbd?
 
How hard is it to require specs on the box? Just because you don't think it's an issue, or more likely because it doesn't impact you, it's nbd?

I have no problem with requiring specs on boxes. That's something else altogether. What I don't have an issue with is them naming the card 4080. They can call it whatever they want and sleep fine as far as I'm concerned.
 
Who's dropping $1k on a GPU? Your average buyer is nowhere near that, despite Nvidia trying to normalize it.
^^^This.

The average consumer sees a PS5 or Xbox Series X as "high-end", and those go for $500ish... and that's the complete system, not just a component. Nvidia is asking consumers to spend well over that on their 5-6 highest SKUs, while their SKUs below $500 aren't able to compete with a PS5/XSX. Nvidia is hoping that the consumer will see "Nvidia" and assume that it's a good product.

Oh, just a side-note... we're headed into a recession.

Good luck with that. (y)
 
I’m not attached to any PC component company. In fact I rant about pretty much all of them pretty regularly due to their anti-consumer horseshit.

In this case I just can’t understand why anyone gets mad about the naming. Well, I understand why the techtoobers do it, because engagement and ad revenue, but please, if we’re really “geeks” or proper “enthusiasts” here, we’d be discussing specs and not some mundane marketing and consumer-oriented shit like naming.
It makes me mad because it is deceptive marketing.

Since the 700 series (ignoring Titans and the x90 series) we have had:
Nearly the full die as a “Ti”, a cut-down x80 SKU, and then a 10-20% further cut-down die as an x70 SKU.

Before that, going way back, in the high-performance tier you had a fat die and a cut-down die. They were named appropriately to segment their performance.

What they are doing is disingenuous: marketing a lower-tier SKU as higher-end than it is, and it will part uninformed consumers from their money.
 
It makes me mad because it is deceptive marketing.

Since the 700 series (ignoring Titans and the x90 series) we have had:
Nearly the full die as a “Ti”, a cut-down x80 SKU, and then a 10-20% further cut-down die as an x70 SKU.

Before that, going way back, in the high-performance tier you had a fat die and a cut-down die. They were named appropriately to segment their performance.

What they are doing is disingenuous: marketing a lower-tier SKU as higher-end than it is, and it will part uninformed consumers from their money.
Yeah, this is what I find terrible about this even if it doesn't affect me. I know better, but Nvidia should show a little respect/consideration for all their consumers.

I get that the naming scheme is probably meant to get a higher price for a lower-cut die and inflate margins (i.e. to keep the stock price from taking a bigger hit).

Hopefully there was some internal strife over the naming, because I'd hate to work for a company where everyone has no regard for its customers. I mean, you expect some of that from sales and marketing :)
 
Yeah, this is what I find terrible about this even if it doesn't affect me. I know better, but Nvidia should show a little respect/consideration for all their consumers.

I get that the naming scheme is probably meant to get a higher price for a lower-cut die and inflate margins (i.e. to keep the stock price from taking a bigger hit).

Hopefully there was some internal strife over the naming, because I'd hate to work for a company where everyone has no regard for its customers. I mean, you expect some of that from sales and marketing :)
Nvidia seems to increasingly think of home PC owners as dogs worthy of the scraps. This move really reinforced that.
 
Most computer consumers are not consummate tech enthusiasts. This has been done intentionally to confuse the average consumer into thinking they've bought the same product with just less VRAM, when they are actually getting a much lower-end model that would normally be a different product, especially since, as mentioned by J2C, these differences are typically listed nowhere on the box at the store.

That is why the naming matters more than anything else in this thread: the average consumer is being hoodwinked, and companies should be forced to list the specs on the box.

The average consumer? LOL, the average consumer isn't buying $900 cards. And anybody who is spending that much on a graphics card and doesn't do 5 minutes of research deserves to be hoodwinked.

Inflated pricing aside (which we all knew was coming) - naming has always been completely arbitrary and nonsensical apart from maybe the X2 suffix which meant you’re getting two dies. I don’t understand the outrage here.

I don't get all the fake outrage on tech forums either. Why does the naming matter? Look at the performance and look at the price. If you feel the price/performance/features are good enough over the card you currently have, then buy it. Or wait and see what AMD comes out with, then decide.

It's always been recommended to wait for reviews before buying a card, so why would people start doing anything different just because Nvidia names a card something different from what they were expecting?
 
The average consumer? LOL, the average consumer isn't buying $900 cards. And anybody who is spending that much on a graphics card and doesn't do 5 minutes of research deserves to be hoodwinked.



I don't get all the fake outrage on tech forums either. Why does the naming matter? Look at the performance and look at the price. If you feel the price/performance/features are good enough over the card you currently have, then buy it. Or wait and see what AMD comes out with, then decide.

It's always been recommended to wait for reviews before buying a card, so why would people start doing anything different just because Nvidia names a card something different from what they were expecting?
Why do people defend bad business practices? Because you do.

Me: Put the specs on the box.

You: lol, consumers deserve to be hoodwinked because they're not me.
 
AIBs are in a world of pain against the 4090 FE at $1600 and a potential 4090Ti FE at $2000-$2100. They can't afford to overcharge like they did with Ampere or they'll get double penetrated by those 2.
 
Am I the only one on this forum to be excited about a 3080 to 4090 upgrade? To be excited about much more capable ray tracing? And the tech behind DLSS 3.0? I did not expect this many people to be grumpy! Can't wait to do my first run in Cyberpunk at the highest RT settings on my OLED monitor in 4K resolution.
 
Am I the only one on this forum to be excited about a 3080 to 4090 upgrade? To be excited about much more capable ray tracing? And the tech behind DLSS 3.0? I did not expect this many people to be grumpy! Can't wait to do my first run in Cyberpunk at the highest RT settings on my OLED monitor in 4K resolution.
I’ve been playing Cyberpunk on my 3080 with 4K DLSS Quality maxed. It’s pretty nice. They’ve made some good optimizations. Frankly, for this game, it’s more than enough.

You have every right to be excited if you think the $1600 for a 4090 is justified.
 
Am I the only one on this forum to be excited about a 3080 to 4090 upgrade? To be excited about much more capable ray tracing? And the tech behind DLSS 3.0? I did not expect this many people to be grumpy! Can't wait to do my first run in Cyberpunk at the highest RT settings on my OLED monitor in 4K resolution.
I mean, nobody is telling you not to. No, I'm not buying a $1600 GPU and a nearly $1000 screen to enjoy a single game. Most other people aren't, either. To most, this release signifies that mining might be gone, but the sky-high prices are here to stay. Not much to be excited about, TBH.
 
His presentation that you're an idiot if you don't rush out to buy all the Ampere GPUs?
Yes and no; it has more to do with body language, tone, and pacing.

Saw his Ampere coverage last night since it was at the top of my recommended. Turned it off pretty quick. He didn’t go into specs or details. Would have been better off with Tech Jesus.
 
Am I the only one on this forum to be excited about a 3080 to 4090 upgrade? To be excited about much more capable ray tracing? And the tech behind DLSS 3.0? I did not expect this many people to be grumpy! Can't wait to do my first run in Cyberpunk at the highest RT settings on my OLED monitor in 4K resolution.
Have fun with Cyberpunk showing you 120fps on your fancy OLED, but your mouse responds like it's 40fps. And don't believe your eyes either. Those are not motion artifacts. They're ray-traced.
 
Am I the only one on this forum to be excited about a 3080 to 4090 upgrade? To be excited about much more capable ray tracing? And the tech behind DLSS 3.0? I did not expect this many people to be grumpy! Can't wait to do my first run in Cyberpunk at the highest RT settings on my OLED monitor in 4K resolution.
I wish there were games with raytracing that made me want to spend $1600 on a video card.

The only place it was noticeable in Cyberpunk for me was the bar with the glass floor reflecting neon lights. In most other games it’s basically increased puddle reflection fidelity.

I go through the rtx games list periodically and always leave disappointed.

The tech demos and Minecraft look great though.
 
I’ve been playing Cyberpunk on my 3080 with 4K DLSS Quality maxed. It’s pretty nice. They’ve made some good optimizations. Frankly, for this game, it’s more than enough.

You have every right to be excited if you think the $1600 for a 4090 is justified.
Um, it's more than enough? You must really have some low standards, because at 4K maxed with ray tracing you are lucky to stay above 40 fps in plenty of areas with DLSS on Quality on a 3080. Heck, I know you would be hitting 30 fps at times on those settings at 4K, because even at just 1440p my faster 3080 Ti occasionally dips below 60 fps with ray tracing and DLSS on Quality. I have to use Balanced even at 1440p to stay above 60 fps the whole time.
 
Um, it's more than enough? You must really have some low standards, because at 4K maxed with ray tracing you are lucky to stay above 40 fps in plenty of areas with DLSS on Quality on a 3080. Heck, I know you would be hitting 30 fps at times on those settings at 4K, because even at just 1440p my faster 3080 Ti occasionally dips below 60 fps with ray tracing and DLSS on Quality. I have to use Balanced at that res to stay above 60 fps the whole time.
It's not a low standard. It's called "being happy with what I have".

Whether you want to admit it or not, the 3080 is a fantastic gaming GPU, one which was easily worth the $700 I paid for mine. I just don't see this next generation measuring up to that sort of value. I got it for the 4K 120Hz HDR + G-SYNC capability over HDMI 2.1 so I could power this LG CX OLED. In that regard, it has worked marvelously.
 
You called it "more than enough" which I doubt anyone would agree with when talking about 40 fps and below.
It's not a competitive first person shooter. 40 FPS for Cyberpunk is fine. This game is the Crysis of Ray Tracing.
 
It's not a competitive first person shooter. 40 FPS for Cyberpunk is fine.
Well, I fully disagree and think many if not most people would too. Again, it is beyond silly to say the 3080 is "more than enough" for that game on those settings when it is not even remotely close to being able to hit 60 fps or even 50 fps. Sub-50 fps and 40 fps with minimums in the low 30s is abysmal compared to holding at least 60 fps.
 
It’s been quite some time, but I remember playing Cyberpunk at launch on my OLED and 3080, and I think I was running it on DLSS Balanced.

Heavy scenes were in the 40s, with a big part of the game being 55-60.

Looked great on the CX48. Pretty playable with G-Sync. I think the OLED is a little more forgiving than an LCD as long as it is above 30, but more is always better.
 
It’s been quite some time, but I remember playing Cyberpunk at launch on my OLED and 3080, and I think I was running it on DLSS Balanced.

Heavy scenes were in the 40s, with a big part of the game being 55-60.

Looked great on the CX48. Pretty playable with G-Sync.
You are remembering wrong; the game was always more demanding than that with maxed settings and ray tracing, and it got even more demanding on max settings after some patches that added visual improvements.
 
45 fps is the lowest I can tolerate, and not really sustained, just if it goes down there momentarily or briefly. Below that, with either mouse or controller, it feels like trying to punch in a dream, like trying to run through jello.
 
You are remembering wrong; the game was always more demanding than that with maxed settings and ray tracing, and it got even more demanding on max settings after some patches that added visual improvements.
You're in quite the mood tonight. Everything ok?

We're happy that you're excited about the RTX 4090, but that doesn't mean you have to shit on other people for not sharing your point of view.
 
You're in quite the mood tonight. Everything ok?

We're happy that you're excited about the RTX 4090, but that doesn't mean you have to shit on other people for not sharing your point of view.
Lol what? I am not mad, nor am I looking forward to the 4090 at all. AGAIN, I am only pointing out how silly it is to say a 3080 is "more than enough" in that game on those settings when you are getting performance that most people would not even accept at all. More than enough would indicate you had plenty of headroom left in the game...
 
You are remembering wrong; the game was always more demanding than that with maxed settings and ray tracing, and it got even more demanding on max settings after some patches that added visual improvements.
It’s all good. It’s been a long time; I don’t remember if I tweaked other settings or what. Just don’t remember it being awful.
 
Lol what? I am not mad, nor am I looking forward to the 4090 at all. AGAIN, I am only pointing out how silly it is to say a 3080 is "more than enough" in that game on those settings when you are getting performance that most people would not even accept at all. More than enough would indicate you had plenty of headroom left in the game...
Your definition and my definition of "more than enough" are different. It doesn't mean either of them are wrong or silly or whatever other adjective you want to attach to it.

Agree to disagree and move on.
 
It’s all good. It’s been a long time; I don’t remember if I tweaked other settings or what. Just don’t remember it being awful.
Well, as you said, G-Sync helps quite a bit, and of course thank goodness for DLSS.
 
I have no problem with requiring specs on boxes. That's something else altogether. What I don't have an issue with is them naming the card 4080. They can call it whatever they want and sleep fine as far as I'm concerned.

It actually isn't something else altogether and overlaps with the naming conventions. The naming is itself a representation regarding the nature of what is being sold, and in this case the immediate representation is that these two models are identical save for memory allocations. That's all good and fine, but it ultimately means that in most territories with decent consumer laws Nvidia and AIBs will need to prominently plaster the 4080 12gb with the true specs to avoid being accused of false/misleading advertising. I expect the snooty high and mighty response on [H] will likely be "well the buyer should have done their research beforehand", but in the real world that's simply not the way regulatory and legal systems operate - the onus is on the manufacturer/seller to be truthful with their advertising, not on consumers to work out if they are being misled before agreeing to buy something.

The easiest way to avoid this issue is to distinguish the 4080 12gb by calling it something else - Nvidia has decided not to do that, so it will be interesting to see how they package and market it.
 
The easiest way to avoid this issue is to distinguish the 4080 12gb by calling it something else
You know, there used to be something like that. Like a lower model than the xx80. Huh, makes you wonder.
Nvidia has decided not to do that, so it will be interesting to see how they package and market it.
Well, the official reply so far is "They're both 4080s, just with different vram. It's just like previous examples of this", when we know that's blatantly false.
 
It actually isn't something else altogether and overlaps with the naming conventions. The naming is itself a representation regarding the nature of what is being sold, and in this case the immediate representation is that these two models are identical save for memory allocations. That's all good and fine, but it ultimately means that in most territories with decent consumer laws Nvidia and AIBs will need to prominently plaster the 4080 12gb with the true specs to avoid being accused of false/misleading advertising. I expect the snooty high and mighty response on [H] will likely be "well the buyer should have done their research beforehand", but in the real world that's simply not the way regulatory and legal systems operate - the onus is on the manufacturer/seller to be truthful with their advertising, not on consumers to work out if they are being misled before agreeing to buy something.

The easiest way to avoid this issue is to distinguish the 4080 12gb by calling it something else - Nvidia has decided not to do that, so it will be interesting to see how they package and market it.
They only had to pay $30 back to GTX 970 customers. They'll do it again.
 
I liked his post because my Nvidia stock needs people like him 👍
Exactly! It goes without saying, as posted on their corporate website ;)

It's the shareholders and not the gamers, duh.

The sales and marketing dept will review/adjust the 4090/4080 cards' pricing over time, just like they did with the high-end 3000 series recently, so I would wait a bit before buying one. Maybe the RDNA3 announcement will force them to re-evaluate on Nov 3rd.

 
You know, there used to be something like that. Like a lower model than the xx80. Huh, makes you wonder.

Well, the official reply so far is "They're both 4080s, just with different vram. It's just like previous examples of this", when we know that's blatantly false.

Nvidia is begging to be sued by regulators.

They only had to pay $30 back to GTX 970 customers. They'll do it again.

That was one class action in the US which was not as clear-cut and far less blatant than what is happening here. Regulators love easy low-hanging fruit, so I wouldn't be surprised to see litigation in more than just the US unless Nvidia and AIBs clearly distinguish the 12gb from the 16gb. This could easily be done by just prominently featuring the number of CUDA cores on the packaging so consumers can see that the differences are not just memory related.

Edit: and in case it isn't obvious, the aim of regulatory litigation is not necessarily aligned with that of consumer class actions - i.e. extracting compensation. Rather the aim is to penalise and deter future bad behavior, which invariably means imposing a high penalty that isn't absorbed as an ongoing cost of business.
 
What is kind of scary is that with the 4080 12GB so gimped, just imagine how bad the actual 4070 is going to be when they release it. The 4070 will probably be what the 4060 should have been. And the 4060 will be a rebranded 4050, but probably still $500+. Then they will take the current 1650, re-brand it as a 4030, and double the price.
 