NVIDIA CEO Jensen Huang hints at ‘exciting’ next-generation GPU update on Tuesday, September 20th

I assume it's the same complaints everyone else has been making for the last week.
The "New High in Low Morality" sounded like he would be preaching, and making some pretty crazy claims...

Does nVidia use child labor? (Apple and Nike do, so, maybe?) Does nVidia invade countries? Does nVidia force you to have that baby??

Gonna say probably not.

Do they sell for prices we don't like? Sure. Don't see the moral question here.
So clickbaity video, not gonna click. Guy needs a new job, as that "morality" of his is questionable. Irony right there, or more likely hypocrisy.
 
The "New High in Low Morality" sounded like he would be preaching, and making some pretty crazy claims...

Does nVidia use child labor? (Apple and Nike do, so, maybe?) Does nVidia invade countries? Does nVidia force you to have that baby??

Gonna say probably not.

Do they sell for prices we don't like? Sure. Don't see the moral question here.
So clickbaity video, not gonna click. Guy needs a new job, as that "morality" of his is questionable. Irony right there, or more likely hypocrisy.
Adored is an AMDrone. AMDrones hang on to his every word. You're probably making a smart decision.
 
I always went with AMDummies myself

The "New High in Low Morality" sounded like he would be preaching, and making some pretty crazy claims...

Does nVidia use child labor? (Apple and Nike do, so, maybe?) Does nVidia invade countries? Does nVidia force you to have that baby??

Gonna say probably not.

Do they sell for prices we don't like? Sure. Don't see the moral question here.
So clickbaity video, not gonna click. Guy needs a new job, as that "morality" of his is questionable. Irony right there, or more likely hypocrisy.

Yeah, I got ticked a while back when even Kyle was like nViDiA iS LItErAlLy EvIL because.......... well, people who diddle kids and people who murder outside of self-defense are evil. Nvidia is just a bunch of assholes at worst. And that's fine if you were to think that. But that's it.
 
The "New High in Low Morality" sounded like he would be preaching, and making some pretty crazy claims...

Does nVidia use child labor? (Apple and Nike do, so, maybe?) Does nVidia invade countries? Does nVidia force you to have that baby??

Gonna say probably not.

Do they sell for prices we don't like? Sure. Don't see the moral question here.
So clickbaity video, not gonna click. Guy needs a new job, as that "morality" of his is questionable. Irony right there, or more likely hypocrisy.
You're right, morality is not the right word; ethics would be more appropriate. Everything he says in the video is true, though, and he presents hard-hitting facts/info that PC tech channels like LTT, GN, HWU, and Jay have not comprehensively presented so far. Essentially he details the history of performance, MSRP, cost (to Nvidia), and other important metrics, comparable from generation to generation, and shows that the 4090/4080 series value is the worst in 80-series history, going all the way back 14 years to when the first 80-series card, the 280, was released. Simultaneously, Nvidia will maintain its historically high profit margins, regardless of whatever margins the retailers/AIBs are making. The "increased costs" and "unprecedented performance increases" supposedly justifying the MSRPs are Nvidia BS.
 
GPUs on 150nm nodes were much simpler to design, as was the 65nm node the GTX280 was built on. He utterly ignores the orders of magnitude by which complexity has grown over the same timespan he discusses, either out of stupidity or on purpose. If it's the former, he has no business trying to make any ethical judgement on shit he understands only at the most basic, surface level. If it's the latter, he's a hypocrite. Either way, not getting any views from me.
Complexity = new capabilities + more speed

GTX280 = 1.4 billion transistors - Features: unified shaders, texture mapping units, render output units, 0.933 TFLOPS of processing power
Features added in the GPUs in between: CUDA cores, shader processors, improved power management, GPU Boost (automatic overclocking when temperature and power allow it), parallelized instructions, faster encoding, newer PCIe revisions, the PureVideo feature set, Dynamic Super Resolution, Delta Color Compression, Multi-Pixel Programmable Sampling, Nvidia VXGI (real-time voxel global illumination), VR Direct, Multi-Projection Acceleration, Multi-Frame Sampled Anti-Aliasing (MFAA), async compute, G-Sync, hardware raytracing, Deep Learning Super Sampling, and newer video port technologies such as DisplayPort. All along the way, these technologies have been refined and improved upon. And this is just what I skimmed from Wikipedia; there's probably more that is not listed.
AD102 = 76.3 billion transistors - 1,321 Tensor-TFLOPS, 191 RT-TFLOPS, and 83 Shader-TFLOPS. That total is 1710x the GTX280's; if you leave out the Tensor and RT processing power and only include the shader processing power, it's 89x the GTX280's processing power.
Over 54x the transistors in nearly the same die size.

My 'Ethics' analysis: the RTX 4080 16GB card is $1199, which is $904 in 2008 dollars. The GTX 280 launched at $649. So for 1.39x the price in inflation-adjusted dollars, you get 89x the shader processing power, plus tons of new features, some pretty amazing ones. Oh the Morality of it! Oh the Ethics of it! What will we ever do??!?! Why isn't it 1x the price? Let's make a video!
For all of the billions spent on R&D to make a modern GPU possible, a 1.39x cost increase doesn't feel immoral or unethical.
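For anyone who wants to check that arithmetic, here's a quick Python sketch; the ~1.33 CPI factor for 2008 to 2022 is my assumption and depends on the month you pick:

```python
# Sanity check on the ratios above. The CPI factor is an assumed
# approximation of 2008 -> 2022 inflation; the exact value varies by month.
CPI_2008_TO_2022 = 1.33

GTX280_TFLOPS = 0.933                   # FP32 shader throughput
RTX4090_SHADER_TFLOPS = 83.0
RTX4090_TOTAL_TFLOPS = 1321 + 191 + 83  # Tensor + RT + Shader

GTX280_PRICE_2008 = 649
RTX4080_PRICE_2022 = 1199

price_in_2008_dollars = RTX4080_PRICE_2022 / CPI_2008_TO_2022

print(f"Shader-only speedup: {RTX4090_SHADER_TFLOPS / GTX280_TFLOPS:.0f}x")     # ~89x
print(f"Combined speedup:    {RTX4090_TOTAL_TFLOPS / GTX280_TFLOPS:.0f}x")      # ~1710x
print(f"4080 in 2008 USD:    ${price_in_2008_dollars:.0f}")                     # ~$900
print(f"Real price ratio:    {price_in_2008_dollars / GTX280_PRICE_2008:.2f}x") # ~1.39x
```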

But let's ignore all of that and preach the Heavenlies of AMD and the Evils of nVidia!
(it gets more clicks, and/or I am paid to make these videos by AMD)

Edit: corrected the GTX280 TFLOPS
 
I'm ok with Nvidia making money. In fact, I'm GREAT with Nvidia making money, because that money gives them resources to build newer and better products. The dilemma that we are facing with the RTX 4000 cards is "What is a frame?". Higher framerates and higher image quality have always been the goal, but in the last several years, "lower latency" has also asserted itself as an important factor, especially as esports takes hold in modern culture.

If GPU technology is improving the way it's supposed to, you will get a combination of these items.
  • Higher framerates
  • Higher image quality
  • Lower latency
The problem with the RTX 4000 series is that it has introduced a feature that creates higher framerates, mildly degrades image quality, and keeps latency the same as it was, or even increases it. This trend started with the introduction of the RTX 2000 cards and DLSS, which, at first, degraded image quality in order to increase framerates and lower latency. So... if we're getting higher framerates at the expense of image quality and latency, is it really a step forward? I honestly don't think it is.

It feels like Nvidia has hit a plateau from a design/innovation standpoint and is now trying to trick consumers into thinking they are still making advancements. On top of this, they are tagging us with insane pricing for these new products. I'm all for technology moving forward, but the RTX 4000 cards feel like a solid step to the side, not a super tangible increase. The RTX 4090 looks like it's going to be a solid step up, but everything else is very... meh.

Correct me if I'm wrong on this.
 
I'm ok with Nvidia making money. In fact, I'm GREAT with Nvidia making money, because that money gives them resources to build newer and better products. The moral dilemma that we are facing with the RTX 4000 cards is "What is a frame?". Higher framerates and higher image quality have always been the goal, but in the last several years, "lower latency" has also asserted itself as an important factor, especially as esports takes hold in modern culture.

If GPU technology is improving the way it's supposed to, you will get a combination of these items.
  • Higher framerates
  • Higher image quality
  • Lower latency
The problem with the RTX 4000 series is that it has introduced a feature that creates higher framerates, mildly degrades image quality, and keeps latency the same as it was, or even increases it. This trend started with the introduction of the RTX 2000 cards and DLSS, which, at first, degraded image quality in order to increase framerates and lower latency. So... if we're getting higher framerates at the expense of image quality and latency, is it really a step forward? I honestly don't think it is.

It feels like Nvidia has hit a plateau from a design/innovation standpoint and is now trying to trick consumers into thinking they are still making advancements. On top of this, they are tagging us with insane pricing for these new products. I'm all for technology moving forward, but the RTX 4000 cards feel like a solid step to the side, not a super tangible increase. The RTX 4090 looks like it's going to be a solid step up, but everything else is very... meh.

Correct me if I'm wrong on this.
It's new technology. It has nothing to do with morality. Don't buy the product if you don't like it. Every release it's always something that's EVIL. SMDH.
 
It's new technology. It has nothing to do with morality. Don't buy the product if you don't like it. Every release it's always something that's EVIL. SMDH.
Correct. There is very little "morality" in business. You make a product, you sell it, you make a profit, you build a new product. Morality does not play into any of it.
 
I'm ok with Nvidia making money. In fact, I'm GREAT with Nvidia making money, because that money gives them resources to build newer and better products. The moral dilemma that we are facing with the RTX 4000 cards is "What is a frame?". Higher framerates and higher image quality have always been the goal, but in the last several years, "lower latency" has also asserted itself as an important factor, especially as esports takes hold in modern culture.

If GPU technology is improving the way it's supposed to, you will get a combination of these items.
  • Higher framerates
  • Higher image quality
  • Lower latency
The problem with the RTX 4000 series is that it has introduced a feature that creates higher framerates, mildly degrades image quality, and keeps latency the same as it was, or even increases it. This trend started with the introduction of the RTX 2000 cards and DLSS, which, at first, degraded image quality in order to increase framerates and lower latency. So... if we're getting higher framerates at the expense of image quality and latency, is it really a step forward? I honestly don't think it is.

It feels like Nvidia has hit a plateau from a design/innovation standpoint and is now trying to trick consumers into thinking they are still making advancements. On top of this, they are tagging us with insane pricing for these new products. I'm all for technology moving forward, but the RTX 4000 cards feel like a solid step to the side, not a super tangible increase. The RTX 4090 looks like it's going to be a solid step up, but everything else is very... meh.

Correct me if I'm wrong on this.
We'll see how much latency there really is, but I tend to have a glass-half-full approach; to me, DLSS is more about making a 1080p or 1440p game look better without tanking your frame rate.

Having said this, NVIDIA is definitely hitting a wall in terms of performance and efficiency. It's a bit funny to hear some hardcore PC gamers calling for the death of consoles (and competition, and variety) when you can buy a modern console and a TV for the same price as an RTX 4080. Get back to me when there's an RTX 4060... which hopefully won't cost more than a PS5.
 
GPUs on 150nm nodes were much simpler to design, as was the 65nm node the GTX280 was built on. He utterly ignores the orders of magnitude by which complexity has grown over the same timespan he discusses, either out of stupidity or on purpose. If it's the former, he has no business trying to make any ethical judgement on shit he understands only at the most basic, surface level. If it's the latter, he's a hypocrite. Either way, not getting any views from me.
Complexity = new capabilities + more speed

GTX280 = 1.4 billion transistors - Features: unified shaders, texture mapping units, render output units, 0.933 TFLOPS of processing power
Features added in the GPUs in between: CUDA cores, shader processors, improved power management, GPU Boost (automatic overclocking when temperature and power allow it), parallelized instructions, faster encoding, newer PCIe revisions, the PureVideo feature set, Dynamic Super Resolution, Delta Color Compression, Multi-Pixel Programmable Sampling, Nvidia VXGI (real-time voxel global illumination), VR Direct, Multi-Projection Acceleration, Multi-Frame Sampled Anti-Aliasing (MFAA), async compute, G-Sync, hardware raytracing, Deep Learning Super Sampling, and newer video port technologies such as DisplayPort. All along the way, these technologies have been refined and improved upon. And this is just what I skimmed from Wikipedia; there's probably more that is not listed.
AD102 = 76.3 billion transistors - 1,321 Tensor-TFLOPS, 191 RT-TFLOPS, and 83 Shader-TFLOPS. That total is 1710x the GTX280's; if you leave out the Tensor and RT processing power and only include the shader processing power, it's 89x the GTX280's processing power.
Over 54x the transistors in nearly the same die size.

My 'Ethics' analysis: the RTX 4080 16GB card is $1199, which is $904 in 2008 dollars. The GTX 280 launched at $649. So for 1.39x the price in inflation-adjusted dollars, you get 89x the shader processing power, plus tons of new features, some pretty amazing ones. Oh the Morality of it! Oh the Ethics of it! What will we ever do??!?! Why isn't it 1x the price? Let's make a video!
For all of the billions spent on R&D to make a modern GPU possible, a 1.39x cost increase doesn't feel immoral or unethical.

But let's ignore all of that and preach the Heavenlies of AMD and the Evils of nVidia!
(it gets more clicks, and/or I am paid to make these videos by AMD)

Edit: corrected the GTX280 TFLOPS
I get that a lot of people are butthurt over the prices, and they are meant to be: Nvidia has intentionally made the pricing on the 4000 series unattractive because they need consumers to choose the leftover 3000 series stock over the 4000 series. That high pricing is completely intentional, and Nvidia didn't hide that they were going to do this; they stated it very plainly when they said they were going to shift the market accordingly, and all the forums were up in arms about Nvidia "manipulating the market". This is them manipulating the market as they publicly stated, to clear out the 3000 series overstock.

AIBs were begging Nvidia to delay the launch of the 4000 series, but they couldn't; then the AIBs begged Nvidia to cut back on their TSMC order, but they couldn't do that either.
So Nvidia jacks up the prices on the 4000 series and makes them a complete halo product for big spenders: demand will be low, AIBs will take small orders, users will buy the 3000 series instead, and Nvidia can then shift production to the H-series silicon, which they already have. Orders for their H100 chips have greatly exceeded their expectations, so this whole strategy lets them fill that market while giving their AIB partners exactly what they asked for.
There's nothing unethical here. We are seeing 3000 series parts selling for below MSRP, and they are by no means bad cards. They were bad cards for the price for the past two years, but that was mostly the AIBs jacking up prices because of demand, or scalpers taking advantage of the situation, not because Nvidia was doing anything unethical.
 
We'll know if DLSS 3 is a giant ruse soon enough - just 2 more weeks until the review embargo lifts. I feel pretty confident that if latency is an issue, the major tech reviewers will be all over it.
Then it'll come down to support. Nvidia usually isn't afraid to throw money at things, so the question becomes which titles support it and when.
 
We'll see how much latency there really is, but I tend to have a glass-half-full approach; to me, DLSS is more about making a 1080p or 1440p game look better without tanking your frame rate.

Having said this, NVIDIA is definitely hitting a wall in terms of performance and efficiency. It's a bit funny to hear some hardcore PC gamers calling for the death of consoles (and competition, and variety) when you can buy a modern console and a TV for the same price as an RTX 4080. Get back to me when there's an RTX 4060... which hopefully won't cost more than a PS5.
And AMD is to be commended on those consoles; they punch above their weight class. I would easily take a PS5/Xbox over any $600 gaming PC.
 
And AMD is to be commended on those consoles; they punch above their weight class. I would easily take a PS5/Xbox over any $600 gaming PC.
The PC high end definitely performs head, neck, and shoulders above consoles, but at the low-to-mid range, the PS5 and Xbox Series X stand alone. It would seem PC component manufacturers have ceded the low/mid arena to consoles.
 
We'll know if DLSS 3 is a giant ruse soon enough - just 2 more weeks until the review embargo lifts. I feel pretty confident that if latency is an issue, the major tech reviewers will be all over it.
Then it'll come down to support. Nvidia usually isn't afraid to throw money at things, so the question becomes which titles support it and when.
Yeah, I'm going to guess it won't be. Nvidia seems to be pretty aware that lower latency is important to some (as their work on Reflex shows), but a small increase in latency for better RTX visuals would certainly appeal to a large group as well.
 
Yeah, I'm going to guess it won't be. Nvidia seems to be pretty aware that lower latency is important to some (as their work on Reflex shows), but a small increase in latency for better RTX visuals would certainly appeal to a large group as well.
The big question is: does the latency added by frame generation get countered enough by Reflex that the final result is less than or equal to the latency with neither in use? Because if the two combined are equal to using neither, then it's a zero-sum argument and not an issue.
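As a sketch of what that question boils down to (every number below is a hypothetical placeholder, not a measured value):

```python
# Toy latency accounting for frame generation + Reflex. All figures are
# hypothetical placeholders; the real values have to come from reviews.
def net_latency_ms(base: float, framegen_penalty: float, reflex_savings: float) -> float:
    """End-to-end latency with frame generation and Reflex both enabled."""
    return base + framegen_penalty - reflex_savings

BASE = 50.0              # hypothetical latency with neither feature (ms)
FRAMEGEN_PENALTY = 10.0  # hypothetical cost of generated frames (ms)
REFLEX_SAVINGS = 12.0    # hypothetical savings from Reflex (ms)

combined = net_latency_ms(BASE, FRAMEGEN_PENALTY, REFLEX_SAVINGS)
# If combined <= BASE, the frame-gen cost is fully hidden and it's a wash.
print(f"{combined:.0f} ms combined vs {BASE:.0f} ms baseline")
```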
 
And AMD is to be commended on those consoles, they punch above their weight class, I would take a PS5/XBox over any $600 gaming PC easily.
Hell, I'd take one over more expensive PCs than that! To have a clear edge over a PS5 or XSX, I'd say you need a fairly hefty GPU investment.
 
A $600 gaming PC would probably have to be full of used parts to be worth a damn. Either that, or it's using shaky numbers that don't count expenses like the cost of a Windows license, KB/M/gamepad, fans, case, etc.
The days of low-end PCs besting consoles are over for the moment.
 
I am not sure a similarly priced PC has ever been competitive with a recently released console (especially if you pay for the OS and do not take into account the cheaper games historically available on GOG and Steam versus consoles).

Spending $500-600 more on a gaming PC than on the non-gaming PC that many people virtually need to have anyway (a bit like how we tend not to count the price of the TV for a console, because you already had a TV or would have replaced it regardless) can become competitive again once GPUs come down in price.
 
Hell, I'd take one over more expensive PCs than that! To have a clear edge over a PS5 or XSX, I'd say you need a fairly hefty GPU investment.

A $600 gaming PC would probably have to be full of used parts to be worth a damn. Either that, or it's using shaky numbers that don't count expenses like the cost of a Windows license, KB/M/gamepad, fans, case, etc.
The days of low-end PCs besting consoles are over for the moment.

And that is why neither AMD nor Nvidia is putting much into the mid/low end. AMD in that space is literally competing with themselves, and it's more profitable for them to steer users towards the consoles. Nvidia, for their part, would just be spending time and resources to compete with the excess stock they traditionally always have.
 
I'm ok with Nvidia making money. In fact, I'm GREAT with Nvidia making money, because that money gives them resources to build newer and better products. The dilemma that we are facing with the RTX 4000 cards is "What is a frame?". Higher framerates and higher image quality have always been the goal, but in the last several years, "lower latency" has also asserted itself as an important factor, especially as esports takes hold in modern culture.

If GPU technology is improving the way it's supposed to, you will get a combination of these items.
  • Higher framerates
  • Higher image quality
  • Lower latency
The problem with the RTX 4000 series is that it has introduced a feature that creates higher framerates, mildly degrades image quality, and keeps latency the same as it was, or even increases it. This trend started with the introduction of the RTX 2000 cards and DLSS, which, at first, degraded image quality in order to increase framerates and lower latency. So... if we're getting higher framerates at the expense of image quality and latency, is it really a step forward? I honestly don't think it is.

It feels like Nvidia has hit a plateau from a design/innovation standpoint and is now trying to trick consumers into thinking they are still making advancements. On top of this, they are tagging us with insane pricing for these new products. I'm all for technology moving forward, but the RTX 4000 cards feel like a solid step to the side, not a super tangible increase. The RTX 4090 looks like it's going to be a solid step up, but everything else is very... meh.

Correct me if I'm wrong on this.

I think DLSS 2 is great technology for what it is. In some games the visual downsides aren't noticeable in real gameplay. If you have a high-end PC and get high enough frame rates, turn it off. As for DLSS 3, I'll wait and see what people find. Seems like it may introduce more downsides.

But I agree that while these technologies are good, the end goal should always be increasing image quality and frame rates. DLSS is a good option while ray tracing remains hardware-demanding, and it's great for lower-end GPUs, but it shouldn't be considered a standard feature. Ideally I don't use it in games; it's a cost-benefit issue. Is the minor image quality hit worth it to enable ray tracing? Or to jump from 55 to 75 FPS?

I do like that Nvidia is thinking of ways to increase performance rather than brute-forcing it, but they need to keep image quality in mind.
 
But let's ignore all of that and preach the Heavenlies of AMD and the Evils of nVidia!
(it gets more clicks, and/or I am paid to make these videos by AMD)

Yes, because I see so many preaching the wonderfulness of AMD's prices (sans the somewhat sane 6600 for the past 3 months). Never mind that the market didn't want to pay AMD's prices either, even when their cards were available and $200-400 cheaper at the high end than Ampere cards for over a year now.

My 'Ethics' analysis: the RTX 4080 16GB card is $1199, which is $904 in 2008 dollars. The GTX 280 launched at $649. So for 1.39x the price in inflation-adjusted dollars, you get 89x the shader processing power, plus tons of new features, some pretty amazing ones. Oh the Morality of it! Oh the Ethics of it! What will we ever do??!?! Why isn't it 1x the price? Let's make a video!
For all of the billions spent on R&D to make a modern GPU possible, a 1.39x cost increase doesn't feel immoral or unethical.

I care not for Nvidia's complexities and problems with R&D, just as they care not about my GPU budget. But hey, run me the numbers on the $379 1070 from 2016 (which was really $400-450 for AIB models, more for the Founders Edition) whose slot is now $899. What was the GTX970 in 9/2014? Oh yeah, $330-350.
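Putting rough numbers on that jump as a sketch (the ~1.18 inflation factor for 2016 to 2022 is my assumption):

```python
# Generational price jump using the MSRPs quoted above. The CPI factor
# for 2016 -> 2022 is an assumed approximation.
CPI_2016_TO_2022 = 1.18

GTX1070_MSRP_2016 = 379   # x70-class card in 2016
RTX4080_12GB_MSRP = 899   # the $899 card the post compares it to

nominal = RTX4080_12GB_MSRP / GTX1070_MSRP_2016
real = RTX4080_12GB_MSRP / (GTX1070_MSRP_2016 * CPI_2016_TO_2022)

print(f"Nominal increase: {nominal:.2f}x")  # ~2.37x
print(f"Real increase:    {real:.2f}x")     # ~2.01x
```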

[attached screenshot: AdoredTV.png]

Complexities man. It's all about them complexities...of profit margin.

[attached screenshot: AdoredTVb.png]
 
Yes, because I see so many preaching the wonderfulness of AMD's prices (sans the somewhat sane 6600 for the past 3 months). Never mind that the market didn't want to pay AMD's prices either, even when their cards were available and $200-400 cheaper at the high end than Ampere cards for over a year now.



I care not for Nvidia's complexities and problems with R&D, just as they care not about my GPU budget. But hey, run me the numbers on the $379 1070 from 2016 (which was really $400-450 for AIB models, more for the Founders Edition) whose slot is now $899. What was the GTX970 in 9/2014? Oh yeah, $330-350.


Complexities man. It's all about them complexities...of profit margin.

It's even easier for me.

Rather than trying to justify a purchase, I make it simple. If I don't "feel" like I'm getting what I pay for, I don't buy it. The RTX 4090 feels like you will get what you pay for if you can afford the $1600 price of entry. The RTX 4080 16GB is a big "if", while the RTX 4080 12GB is a resounding "no".

I care not for Nvidia's "Woe is me. Production costs are much higher this generation. We can't do anything about it" attitude. If Nvidia wants to sell me a GPU, they will find a way to create value in their products. Otherwise, I will not buy from them. The RTX 3080 FE @ $700 provided tremendous value compared to the RTX 2000 cards, and I needed a card that could drive a 4K120 display via HDMI 2.1... the RTX 3080 FE was and still is, to this day, a perfect fit. Now, Nvidia is trying to tell me that the RTX 4080 12GB is the next step?

No... not buying it. I'll wait for reviews, but my guess is that it will be a side-grade at best, and a downgrade at worst. I care not for higher framerates via DLSS 3 if it adds latency. It's a cool tech, for sure, but useful? Not at the moment.
 
Correct. There is very little "morality" in business. You make a product, you sell it, you make a profit, you build a new product. Morality does not play into any of it.
This. In the most basic form of capitalism, corporations exist to earn a profit and maximize shareholder wealth. In that regard, corporations are no more evil than animals in nature that savagely kill other animals to eat them. The upside to the consumer is innovation to earn their dollars. Without capitalism, we wouldn't have a grocery store full of foods from all over the world, and we sure wouldn't have graphics cards for playing video games.

Then, we have supply and demand. People will only pay as much as they think a product is worth to them. If something is really valuable to the consumer but scarce, the price will go even higher. There is no such thing as price gouging: prices adjust to maximize profit while still just barely selling each item, maximizing shareholder wealth.

Now, all that said, capitalism isn't perfect: big corps fix prices with each other, the government gets involved by "choosing winners", barriers to entry for new competition are put in place, etc. All of these things hurt the consumer. However, it's the best type of market we have available. Do you really think a full-on Communist or Socialist system with price controls and state-controlled production is going to produce innovative graphics cards?

Back to nVidia: the 4080 12 GB model is a 4070 based upon their history. It's a cut-down chip and meets their past designation of the midrange, but it is priced as high end. They've done it before and they will do it again. It's up to us to be informed and vote with our wallets. We also need good competition from AMD to give us an alternative. I don't particularly care for the man in the leather jacket, but that is beside the point. I hope the new chiplet design from AMD works out and gives us all a second viable option.
 
GPUs on 150nm nodes were much simpler to design, as was the 65nm node the GTX280 was built on. He utterly ignores the orders of magnitude by which complexity has grown over the same timespan he discusses, either out of stupidity or on purpose. If it's the former, he has no business trying to make any ethical judgement on shit he understands only at the most basic, surface level. If it's the latter, he's a hypocrite. Either way, not getting any views from me.
Complexity = new capabilities + more speed

GTX280 = 1.4 billion transistors - Features: unified shaders, texture mapping units, render output units, 0.933 TFLOPS of processing power
Features added in the GPUs in between: CUDA cores, shader processors, improved power management, GPU Boost (automatic overclocking when temperature and power allow it), parallelized instructions, faster encoding, newer PCIe revisions, the PureVideo feature set, Dynamic Super Resolution, Delta Color Compression, Multi-Pixel Programmable Sampling, Nvidia VXGI (real-time voxel global illumination), VR Direct, Multi-Projection Acceleration, Multi-Frame Sampled Anti-Aliasing (MFAA), async compute, G-Sync, hardware raytracing, Deep Learning Super Sampling, and newer video port technologies such as DisplayPort. All along the way, these technologies have been refined and improved upon. And this is just what I skimmed from Wikipedia; there's probably more that is not listed.
AD102 = 76.3 billion transistors - 1,321 Tensor-TFLOPS, 191 RT-TFLOPS, and 83 Shader-TFLOPS. That total is 1710x the GTX280's; if you leave out the Tensor and RT processing power and only include the shader processing power, it's 89x the GTX280's processing power.
Over 54x the transistors in nearly the same die size.

My 'Ethics' analysis: the RTX 4080 16GB card is $1199, which is $904 in 2008 dollars. The GTX 280 launched at $649. So for 1.39x the price in inflation-adjusted dollars, you get 89x the shader processing power, plus tons of new features, some pretty amazing ones. Oh the Morality of it! Oh the Ethics of it! What will we ever do??!?! Why isn't it 1x the price? Let's make a video!
For all of the billions spent on R&D to make a modern GPU possible, a 1.39x cost increase doesn't feel immoral or unethical.

But let's ignore all of that and preach the Heavenlies of AMD and the Evils of nVidia!
(it gets more clicks, and/or I am paid to make these videos by AMD)

Edit: corrected the GTX280 TFLOPS
Let's apply that line of thinking to CPUs. The P4 EE 3.4 GHz was a single-core Hyper-Threading chip that sold for $999 in 2004, so the 32-thread 13900K should cost about $25,000 today. Intel stockholders should be pissed.
 
Let's apply that line of thinking to CPUs. The P4 EE 3.4 GHz was a single-core Hyper-Threading chip that sold for $999 in 2004, so the 32-thread 13900K should cost about $25,000 today. Intel stockholders should be pissed.
A 2003 Pentium EE had around 178 million transistors; a 12900K has around 12,400 million, or 70 times more.
A 2003 Nvidia 5900 Ultra had around 135 million transistors (less than an ultra-high-end CPU of the time); the announced 4090 has around 76,300 million transistors, or 565 times more, and is now 5-6x the big CPU instead of being significantly below it.

Which indicates it is "ok" for the high-end desktop GPU to have become more expensive than the high-end desktop CPU instead of being 40-50% of its price or so. What and how much people do with those GPUs (or the very similar chips that are passed down to us for gaming afterwards) has changed more over time than with CPUs.
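Running that comparison as plain arithmetic, using the approximate transistor counts quoted above:

```python
# CPU vs. GPU transistor growth, 2003 -> 2022, using the rough counts above.
PENTIUM_EE_2003 = 178e6       # transistors
CORE_I9_12900K = 12_400e6

GEFORCE_5900U_2003 = 135e6
RTX_4090 = 76_300e6

print(f"CPU growth since 2003: {CORE_I9_12900K / PENTIUM_EE_2003:.0f}x")     # ~70x
print(f"GPU growth since 2003: {RTX_4090 / GEFORCE_5900U_2003:.0f}x")        # ~565x
print(f"GPU/CPU ratio in 2003: {GEFORCE_5900U_2003 / PENTIUM_EE_2003:.2f}")  # ~0.76
print(f"GPU/CPU ratio now:     {RTX_4090 / CORE_I9_12900K:.1f}")             # ~6.2
```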

The 4080 too....
 
Geez, that's comically large. I think I'll go with an AIO cooler this time.
I know right… nucking futs.

I really think that GPUs are going to have to go the AIO route, and cases are going to need to adapt to accommodate them. GPUs draw more than the rest of the system combined, and things will have to change accordingly.
 
Half-serious question, but how long until the GPU is in some ATX format and our CPU/storage fits in the PCIe slot?

I mean, Intel already has the Compute Card format; others just gotta run with it.
 
I know right… nucking futs.

I really think that GPUs are going to have to go the AIO route, and cases are going to need to adapt to accommodate them. GPUs draw more than the rest of the system combined, and things will have to change accordingly.
Looks like if I'm upgrading to the 4xxx series, I'd be kissing my mini-ATX cases goodbye. The performance of the new cards looks... nice... but for what purpose? So I can get 4K 100+ FPS instead of 4K 95 FPS in Spider-Man Remastered? Back in the day, graphics in software were improving at roughly the same rate as GPU horsepower. But now? Software developers have no use for that much GPU power. SLI has been dead/redundant for quite a long time now (and I remember having to convince people it was dead a few years back). We're entering a new ballgame. I don't blame the developers; they're just trying to target and accommodate the common consumer rather than the enthusiast, and the buy-in on a regular GPU has increased an incredible amount over the last 5 years.
 