NVIDIA CEO Jensen Huang hints at ‘exciting’ next-generation GPU update on Tuesday, September 20th




Someone should make a diorama of the pyramids to use as a GPU prop in the bottom of their computer case.


Brilliance and Absolute Dark Power lol at 18:09

These cards are so hot that AIO partners didn't know what to do to cool them down. I own a bracket, but it's balanced on a plastic PSU shroud. I'm going to skip the 4090 for sure; I think the 4080 version will be just as heavy. I hope I don't have to wait around till 2023 to find one.

Just watched the keynote last night (skipped some parts). The realistic RC car thing was impressive, more so than the ray-traced Marbles demo from two years ago.
 
Why do people defend bad business practices? Because you do.

Me: Put the specs on the box.

You: lol, consumers deserve to be hoodwinked because they're not me.

Bad business practices? LOL, it's called marketing and every company in the world does it.

What difference would it make to have specs on the box for the "average consumers" you are talking about? If they aren't bothered about doing even 5 minutes of research before spending $900 on a card, they certainly aren't going to take time to learn what all the different specs are.

And show me where I said consumers deserve to be hoodwinked because they are not me. You won't find it, because I didn't say anything remotely like that. It doesn't matter what you are buying: washing machine, TV, phone, whatever. If you go out and don't understand what you are buying, you should always get advice, look up reviews, compare prices, etc. Because if you go out without doing any research and end up paying over the odds, or getting an out-of-date model, or a model with issues, that's on you. And there is no excuse these days; all the information you need is literally at your fingertips.

Why would it be any different with a GPU? Look at the price, the performance and the features. Read some reviews. If the card that you are looking at offers the performance and features at a price you like, what does it matter what the card is called?

If Nvidia called it the 4070, you would still be outraged.

Don't be outraged at AMD/Nvidia for their prices. They are businesses just doing business. Be outraged at all those people who buy at the high prices. Who pay over the odds because of FOMO or want to have the latest and greatest and will pay anything to have it. If people weren't paying those prices, Nvidia and AMD wouldn't be charging them. And I bet you anything it's going to be the same this generation. Despite all the fake outrage about prices and naming schemes, as soon as these new cards are released they will sell out. If people just waited a few months, the prices would drop.
 
‘Moore’s Law’s dead,’ Nvidia CEO Jensen Huang says in justifying gaming-card price hike
https://www.marketwatch.com/story/m...justifying-gaming-card-price-hike-11663798618

“We are very, very specifically selling into the market a lot lower than is what’s selling out of the market, a significant amount lower than what’s selling out of the market,” Huang said. “And I’m hoping that by Q4 time frame, sometime in Q4, the channel would have normalized, and it would have made room for a great launch for Ada.”

To critics, Huang said he feels the higher price is justified, especially since the cutting-edge Lovelace architecture is necessary to support Nvidia’s expansion into the so-called metaverse.

“A 12-inch [silicon] wafer is a lot more expensive today than it was yesterday, and it’s not a little bit more expensive, it is a ton more expensive,” Huang said.

“Moore’s Law’s dead,” Huang said, referring to the standard that the number of transistors on a chip doubles every two years. “And the ability for Moore’s Law to deliver twice the performance at the same cost, or at the same performance, half the cost, every year and a half, is over. It’s completely over, and so the idea that a chip is going to go down in cost over time, unfortunately, is a story of the past.”

“Computing is not a chip problem, it’s a software and chip problem,” Huang said.
 
I wish EVGA were still around
They wouldn't make a penny with the material costs required to cool those behemoths. I think their CEO might be the most sane person in the industry.

What else besides GPUs is EVGA good at making?
Power supplies!

What component is every NVIDIA customer going to need for their new shiny 4000 series high end card?
A new power supply!

I predict the water-cooling crowd is about to get a huge boost in sales from enthusiasts. Who wants a video card with a peg leg when you can have a sexy, sleek, and thin card?
 
Bad business practices? LOL, it's called marketing and every company in the world does it.

What difference would it make to have specs on the box for the "average consumers" you are talking about? If they aren't bothered about doing even 5 minutes of research before spending $900 on a card, they certainly aren't going to take time to learn what all the different specs are.

And show me where I said consumers deserve to be hoodwinked because they are not me. You won't find it, because I didn't say anything remotely like that. It doesn't matter what you are buying: washing machine, TV, phone, whatever. If you go out and don't understand what you are buying, you should always get advice, look up reviews, compare prices, etc. Because if you go out without doing any research and end up paying over the odds, or getting an out-of-date model, or a model with issues, that's on you. And there is no excuse these days; all the information you need is literally at your fingertips.

Why would it be any different with a GPU? Look at the price, the performance and the features. Read some reviews. If the card that you are looking at offers the performance and features at a price you like, what does it matter what the card is called?

If Nvidia called it the 4070, you would still be outraged.

Don't be outraged at AMD/Nvidia for their prices. They are businesses just doing business. Be outraged at all those people who buy at the high prices. Who pay over the odds because of FOMO or want to have the latest and greatest and will pay anything to have it. If people weren't paying those prices, Nvidia and AMD wouldn't be charging them. And I bet you anything it's going to be the same this generation. Despite all the fake outrage about prices and naming schemes, as soon as these new cards are released they will sell out. If people just waited a few months, the prices would drop.

Read this...

https://hardforum.com/threads/nvidi...eptember-20th-tuesday.2021544/post-1045457136

Then read it again. It doesn't matter how indignant you get about it: the naming convention is misleading unless it's accompanied by spec details making it clear that the differences between the models go beyond memory configuration.
 
Source.
https://twitter.com/9550pro/status/1572272657760153600?t=9ITqDyc3g_LzajkZf6F8Ow&s=19



 
They wouldn't make a penny with the material costs required to cool those behemoths. I think their CEO might be the most sane person in the industry.

What else besides GPUs is EVGA good at making?
Power supplies!

What component is every NVIDIA customer going to need for their new shiny 4000 series high end card?
A new power supply!

I predict the water-cooling crowd is about to get a huge boost in sales from enthusiasts. Who wants a video card with a peg leg when you can have a sexy, sleek, and thin card?
EVGA doesn't make power supplies. They source them from OEMs, rebrand them, and resell them.
 
Lol what? I am not mad, nor am I looking forward to the 4090 at all. AGAIN, I am only pointing out how silly it is to say a 3080 is "more than enough" in that game at those settings when you are getting performance that most people would not even accept at all. More than enough would indicate you had plenty of headroom left in the game...
The problem is you keep making the same bold claim, "that most people would not even accept at all," for which you have zero proof; it's just a statement to try to make your point more valid. I could argue that the fact that a lot of people are using the 40fps rate limit on the Steam Deck indicates that they are fine accepting that as a minimum frame rate, depending on the game. Sure, I prefer 60fps, but I am not going to claim MOST people want this or that, because I really have no idea, as I have not run any polls to see what "MOST" people feel is acceptable.
 
The problem is you keep making the same bold claim, "that most people would not even accept at all," for which you have zero proof; it's just a statement to try to make your point more valid. I could argue that the fact that a lot of people are using the 40fps rate limit on the Steam Deck indicates that they are fine accepting that as a minimum frame rate, depending on the game. Sure, I prefer 60fps, but I am not going to claim MOST people want this or that, because I really have no idea, as I have not run any polls to see what "MOST" people feel is acceptable.
Perhaps you need to look at the context here: we are talking about high-end GPUs and upgrading for a very demanding game. So when someone claims a current high-end GPU is already more than enough for a game when that performance is in the 30 to 40 frames-per-second range, then you know damn well that is absolutely not what most enthusiasts discussing a high-end GPU upgrade would consider more than enough.
 



Someone should make a diorama of the pyramids to use as a GPU prop in the bottom of their computer case.

Oh man, so many jokes.

The 4090 is the Rings of Power of GPUs. I will be so much more entertained laughing at it than I would be trying to position my support stick if I bought in.
 
I think it's going to be really interesting to see how these cards sell. I feel like the 4090 will probably do fine, since it has a defined audience and quantities will likely be lower than the other ones. The other ones, I dunno. The internet loves to be super negative about everything, but it's been a while since I've seen this much negativity. Not since the days when you could roll up to Microcenter and buy anything you wanted on release day. Then again, people love to trash something and then suddenly post about getting lucky when they manage to buy one. Gonna be really interesting to see what the reviews say. I hope there's more to these than meets the eye.
 
I think it's going to be really interesting to see how these cards sell. I feel like the 4090 will probably do fine, since it has a defined audience and quantities will likely be lower than the other ones. The other ones, I dunno. The internet loves to be super negative about everything, but it's been a while since I've seen this much negativity. Not since the days when you could roll up to Microcenter and buy anything you wanted on release day. Then again, people love to trash something and then suddenly post about getting lucky when they manage to buy one. Gonna be really interesting to see what the reviews say. I hope there's more to these than meets the eye.
Do you even GPU stick, bro? :)
 
I think it's going to be really interesting to see how these cards sell. I feel like the 4090 will probably do fine, since it has a defined audience and quantities will likely be lower than the other ones. The other ones, I dunno. The internet loves to be super negative about everything, but it's been a while since I've seen this much negativity. Not since the days when you could roll up to Microcenter and buy anything you wanted on release day. Then again, people love to trash something and then suddenly post about getting lucky when they manage to buy one. Gonna be really interesting to see what the reviews say. I hope there's more to these than meets the eye.
It's the pricing that's making the initial reaction so negative, IMO. The segmentation (esp. the 4080 12GB) and the clear old-inventory pricing are rubbing lots of people the wrong way. It makes sense when you could get a 3080 10GB for less than $1k.
 
Just a joke, but still: the 4090 GPU has around 2.7 times more transistors per mm² over a two-year interval, which is significantly better than Moore's Law (a doubling of density every two years), I think. That must be a rare occurrence these days.
Moore's law is about doubling SPEED:
the principle that the speed and capability of computers can be expected to double every two years, as a result of increases in the number of transistors a microchip can contain:
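
As a rough sanity check on that ~2.7x density figure, here is a back-of-the-envelope calculation. The transistor counts and die sizes below are the commonly cited public numbers for GA102 (RTX 3090, 2020) and AD102 (RTX 4090, 2022), so treat them as assumptions rather than official figures:

```python
# Density check: GA102 (RTX 3090) vs AD102 (RTX 4090),
# using commonly cited public die figures (assumed, not official).
ga102_transistors = 28.3e9   # transistors
ga102_area_mm2 = 628.4       # die size in mm^2
ad102_transistors = 76.3e9
ad102_area_mm2 = 608.5

ga102_density = ga102_transistors / ga102_area_mm2   # ~45M/mm^2
ad102_density = ad102_transistors / ad102_area_mm2   # ~125M/mm^2

ratio = ad102_density / ga102_density
print(f"Density ratio: {ratio:.2f}x over ~2 years (classic Moore's Law: 2x)")
```

That works out to roughly 2.8x, so the density jump really did beat the classic two-year doubling; the disagreement in this thread is about whether density is even the right way to read the law.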
 
So far what I've seen has convinced me to keep my EVGA 3080 FTW until it dies, no matter how long that takes, even if I have to lower the settings in newer things to keep the frame rates smooth.
 
If I can get the same or very close to the 4090's CUDA core count in a 4080 Ti, and it fits in my case, then I will buy one. But like you, I have 2x 3080s for games and rendering, so I'm fine.
 
Perhaps you need to look at the context here: we are talking about high-end GPUs and upgrading for a very demanding game. So when someone claims a current high-end GPU is already more than enough for a game when that performance is in the 30 to 40 frames-per-second range, then you know damn well that is absolutely not what most enthusiasts discussing a high-end GPU upgrade would consider more than enough.
It is irrelevant. You cannot say most people this or that, because you have no evidence for it. Or have you done a study?
That is the problem with your statement: you are trying to validate your opinion with a made-up "fact".
That is not the right way to argue.
 
So far what I've seen has convinced me to keep my EVGA 3080 FTW until it dies, no matter how long that takes, even if I have to lower the settings in newer things to keep the frame rates smooth.

There shouldn't be a whole lot of games out, or even coming soon, that you'll have to do that with. At least not beyond "high/very high." Maybe by 2024. With COVID pushing so many titles back, the high-end 30-series cards should have a long life if you want them to.
 
They're wide enough that they can put LCD screens on them now.

These 4000-series cards are so freaking big that we're going to go full circle: time to turn your case on its side and plant your monitor on top again, just like your old Packard Bell 386.
 
‘Moore’s Law’s dead,’ Nvidia CEO Jensen Huang says in justifying gaming-card price hike
https://www.marketwatch.com/story/m...justifying-gaming-card-price-hike-11663798618

“We are very, very specifically selling into the market a lot lower than is what’s selling out of the market, a significant amount lower than what’s selling out of the market,” Huang said. “And I’m hoping that by Q4 time frame, sometime in Q4, the channel would have normalized, and it would have made room for a great launch for Ada.”

To critics, Huang said he feels the higher price is justified, especially since the cutting-edge Lovelace architecture is necessary to support Nvidia’s expansion into the so-called metaverse.

“A 12-inch [silicon] wafer is a lot more expensive today than it was yesterday, and it’s not a little bit more expensive, it is a ton more expensive,” Huang said.

“Moore’s Law’s dead,” Huang said, referring to the standard that the number of transistors on a chip doubles every two years. “And the ability for Moore’s Law to deliver twice the performance at the same cost, or at the same performance, half the cost, every year and a half, is over. It’s completely over, and so the idea that a chip is going to go down in cost over time, unfortunately, is a story of the past.”

“Computing is not a chip problem, it’s a software and chip problem,” Huang said.
People are changing what Moore's law is. If you are going to quote the man, get it right.
Moore's Law
NOUN
the principle that the speed and capability of computers can be expected to double every two years, as a result of increases in the number of transistors a microchip can contain:

The speed doubles because an unspecified number of transistors can be added.
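
To put the two readings being argued over side by side, here is a sketch (t in years; the 1.5-year figure comes from Huang's "year and a half" in the quote above):

```latex
% Moore's original observation: transistor count doubles every ~2 years
N(t) = N_0 \cdot 2^{t/2}

% The economic reading Huang declared dead: performance per cost
% doubles roughly every 1.5 years
\frac{P(t)}{C(t)} = \frac{P_0}{C_0} \cdot 2^{t/1.5}
```

The first can keep going for a while yet (see the density math earlier in the thread); it's the second, the cost side, that Huang is saying is over.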
 
These 4000-series cards are so freaking big that we're going to go full circle: time to turn your case on its side and plant your monitor on top again, just like your old Packard Bell 386.
This is why I like my old Cooler Master HAF-XBs: never have to worry about video card sag. Not that I would ever entertain the idea of a 4-slot card. I wouldn't mind one of those lighted card-support stick things, though. It sort of looks like a Minecraft version of a mini lightsaber.

 
I see leather jacket boy and his lackeys getting a depressing dose of reality this launch. Everything is lining up for a whopping good year for expensive gaming-gear peddlers. Consumers finding themselves with all of that disposable income suddenly 🤣 Yep, gonna be a great year, ngreedia.
The problem with these "smartest man in the room" guys is that they are wrong so often. 1000-series GPUs: "wow, sales doubled in one month; at this rate we will sell 16 times that in 4 months!" Then came the mining crash. 3000-series GPUs: "wow, sales doubled in one month; at this rate we will sell 16 times that in 4 months!" Then came the mining crash.
 
This is why I like my old Cooler Master HAF-XBs: never have to worry about video card sag. Not that I would ever entertain the idea of a 4-slot card. I wouldn't mind one of those lighted card-support stick things, though. It sort of looks like a Minecraft version of a mini lightsaber.

Bling-bling! I went the pragmatic route and just used some black Lego to address the little bit of sag from the 3080 Ti.
 
Currently on a 2070 Super from 2019 that I moved from my old computer to my new one at the end of last year. I refused to pay what they were/are asking for the Nvidia 3000 series or AMD 6000 series. That 2070 Super, sold brand new by a reputable/authorized retailer, was one of THE best 2070 Super models you could buy; the other top models only had different pros/cons. $540. The default non-Ti/Super 4070-renamed-4080 a few years later? $900. And it has less memory than AMD's last-gen 3070 Ti equivalent.

Definitely staying on 1080p, at least for gaming monitors, for many years to come. I have no problem at all continuing to lower graphics settings over time if I am not getting at least 60 fps. In certain games even 60 is not enough for me to keep settings higher. And with multiplayer games I already lower graphics as much as needed, even to minimum, if I am getting any less than the max fps of my monitor, which is currently 144, though I will eventually upgrade.

I was planning on getting a G-Sync-module monitor once inflation calmed down. But it looks like that is going to be put on hold indefinitely too. I don't want to get locked into Nvidia if AMD doesn't go down the same pricing path.

Consumers are overpaying more and more, and by that I mean they are not just buying bigger and more powerful GPUs, but buying them at higher and higher profit margins for Nvidia. I couldn't care less if Nvidia were building physically larger and more power-hungry GPUs with incredible performance at absurd prices, as long as they still kept more reasonably powerful GPUs at similar profit margins. But labor/parts/R&D/inflation/etc. are not where the bulk of these price increases originate. They come from Nvidia's skyrocketing profit margins.

But what I find even more ridiculous is that while the more reasonable GPUs are becoming cost-prohibitive, since people are overpaying for all of them at all levels, there are not even good games to make use of them! All the best games (and mods for some of those games) where the difference in computer hardware can actually provide a better and more immersive experience, and not just relatively minor graphical gimmicks, ARE CPU-BOUND! Heck, there are people buying big GPUs and cheaping out on the CPUs. If $1,000+ parts are going to be "mid-range" now, can someone please start making parts that actually enhance gameplay and not just add some extra shadows or lights here and there?

If Nvidia announced that they had developed some revolutionary technology that would allow game devs to offload huge amounts of non-graphical simulation and AI processing to GPUs (without causing latency problems), allowing much more large-scale, detailed, reactive, and interactive worlds and gameplay across all genres (open-world RPG/survival/FPS, MMO, 4X, strategy, grand strategy, tycoon, simulation, etc.), I would be asking where my $1,600 4080 16GB is. Or if Intel announced a CPU the size of a GPU at $2,000 with a triple-digit increase in single-thread performance, I would be on board for that too.
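
(A minimal sketch of what offloading non-graphical simulation to the GPU can already look like today, using CuPy, a real NumPy-like GPU array library. The flocking-style update below is a toy of my own invention, not anything Nvidia has announced:)

```python
# Toy example: step a million simulation "agents" entirely on the GPU
# with CuPy (https://cupy.dev). Hypothetical illustration only.
import cupy as cp

n_agents = 1_000_000
pos = cp.random.uniform(-1000, 1000, size=(n_agents, 2), dtype=cp.float32)
vel = cp.random.uniform(-1, 1, size=(n_agents, 2), dtype=cp.float32)

def step(pos, vel, dt=0.016):
    # Pull every agent toward the swarm's center of mass; all the
    # array arithmetic below runs on the GPU, not the CPU.
    center = pos.mean(axis=0)
    vel = vel + 0.01 * (center - pos) * dt
    return pos + vel * dt, vel

for _ in range(100):
    pos, vel = step(pos, vel)
```

The latency problem mentioned above is real, though: any results the CPU-side game logic needs every frame have to cross the PCIe bus, which is part of why this kind of offload is still rare in shipped games.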

Why are people willing to pay this much money for just some extra nice-looking pixels with the exact same gameplay? And more and more as time goes on. And arguing online defending the prices, too. Insanity.
 
They're putting them on some mobos now. Just more useless bling for the most part.

On a mobo it's fine, IMO, for error-code readout (instead of deciphering beep codes or looking up what a colored light means) or for system/temp/fan status and the like.
 
Currently on a 2070 Super from 2019 that I moved from my old computer to my new one at the end of last year. I refused to pay what they were/are asking for the Nvidia 3000 series or AMD 6000 series. That 2070 Super, sold brand new by a reputable/authorized retailer, was one of THE best 2070 Super models you could buy; the other top models only had different pros/cons. $540. The default non-Ti/Super 4070-renamed-4080 a few years later? $900. And it has less memory than AMD's last-gen 3070 Ti equivalent.

Definitely staying on 1080p, at least for gaming monitors, for many years to come. I have no problem at all continuing to lower graphics settings over time if I am not getting at least 60 fps. In certain games even 60 is not enough for me to keep settings higher. And with multiplayer games I already lower graphics as much as needed, even to minimum, if I am getting any less than the max fps of my monitor, which is currently 144, though I will eventually upgrade.

I was planning on getting a G-Sync-module monitor once inflation calmed down. But it looks like that is going to be put on hold indefinitely too. I don't want to get locked into Nvidia if AMD doesn't go down the same pricing path.

Consumers are overpaying more and more, and by that I mean they are not just buying bigger and more powerful GPUs, but buying them at higher and higher profit margins for Nvidia. I couldn't care less if Nvidia were building physically larger and more power-hungry GPUs with incredible performance at absurd prices, as long as they still kept more reasonably powerful GPUs at similar profit margins. But labor/parts/R&D/inflation/etc. are not where the bulk of these price increases originate. They come from Nvidia's skyrocketing profit margins.

But what I find even more ridiculous is that while the more reasonable GPUs are becoming cost-prohibitive, since people are overpaying for all of them at all levels, there are not even good games to make use of them! All the best games (and mods for some of those games) where the difference in computer hardware can actually provide a better and more immersive experience, and not just relatively minor graphical gimmicks, ARE CPU-BOUND! Heck, there are people buying big GPUs and cheaping out on the CPUs. If $1,000+ parts are going to be "mid-range" now, can someone please start making parts that actually enhance gameplay and not just add some extra shadows or lights here and there?

If Nvidia announced that they had developed some revolutionary technology that would allow game devs to offload huge amounts of non-graphical simulation and AI processing to GPUs (without causing latency problems), allowing much more large-scale, detailed, reactive, and interactive worlds and gameplay across all genres (open-world RPG/survival/FPS, MMO, 4X, strategy, grand strategy, tycoon, simulation, etc.), I would be asking where my $1,600 4080 16GB is. Or if Intel announced a CPU the size of a GPU at $2,000 with a triple-digit increase in single-thread performance, I would be on board for that too.

Why are people willing to pay this much money for just some extra nice-looking pixels with the exact same gameplay? And more and more as time goes on. And arguing online defending the prices, too. Insanity.
Well, TBF, 1080p is pretty low-res at this point in time. You could easily get a 1440p 144Hz monitor and it would be a really nice upgrade. With G-Sync/FreeSync the 2070 would handle it just fine.
 
On a mobo it's fine, IMO, for error-code readout (instead of deciphering beep codes or looking up what a colored light means) or for system/temp/fan status and the like.
Eh, just put the damn codes on there, IMO, which is funny because lots of those boards still have them. Other than a test-bench setup, I just can't see myself trying to look at the inside of my case when I can easily look at my monitor for temp/fan speed, etc.
 