• Some users have recently had their accounts hijacked. It seems the now-defunct EVGA forums may have leaked your password there, and many members were reusing the same password here. We suggest you UPDATE YOUR PASSWORD and TURN ON 2FA for your account here to further secure it. None of the compromised accounts had 2FA turned on.
    Once you have enabled 2FA, your account will soon be updated to show a badge, letting other members know that you use 2FA to protect your account. This should be beneficial for everyone who uses FSFT.

Supply of 5090 cards to increase

Only if you get lucky. I'm sure glad I'm not stupid enough to roll the dice when the evidence and examples are right in front of my face. You do you!
Those same examples were there for the 4090. Tell me about all the houses burning down and people dying. Tell me about all the lawsuits.
 
Cables are literally melting, I guess you think that isn't a fire hazard?
I mean, all of that is going on inside a metal case, not out in the open next to some linen drapes... so no, I personally don't think it's a fire hazard.
 
Those same examples were there for the 4090. Tell me about all the houses burning down and people dying. Tell me about all the lawsuits.

I have a 4090 too and haven't had a problem but I didn't buy it knowing there were melting issues. I got lucky and after one reseating of the cable mine has been fine. These 5090s draw far more power and we've already had MULTIPLE examples of cable failure and melting. How many cards are out in the wild that are actually being used? Very few. So go ahead and buy one, just don't bitch about it when you have a failure. Amazing how many people line up to defend this lowlife corporation. They'd laugh in your face after taking your money, blame you for a burned cable, and so would every other fanboy who likes being fleeced.
 
I have a 4090 too and haven't had a problem but I didn't buy it knowing there were melting issues. I got lucky and after one reseating of the cable mine has been fine. These 5090s draw far more power and we've already had MULTIPLE examples of cable failure and melting. How many cards are out in the wild that are actually being used? Very few. So go ahead and buy one, just don't bitch about it when you have a failure. Amazing how many people line up to defend this lowlife corporation. They'd laugh in your face after taking your money, blame you for a burned cable, and so would every other fanboy who likes being fleeced.
I'm not defending a thing. I don't think it's an issue if you plug your shit in properly and don't use aftermarket BS or multiple adapters.

Time will tell which one of us is right. Let's leave it at that. Agree to disagree.

If I'm wrong, I'll be glad to admit it and jump on the hate bandwagon.
 
Maybe the AIBs are scalping their own cards. What's to stop them from making the very bots that are used to clear out retailers?

Why sell a card once when you can sell it twice for twice the profit!
 
The leaker says Nvidia has an oversupply of B200 data centre GPUs due to plummeting demand. It uses a GB100 GPU, from which the RTX 5090's GB202 is derived. Hence, the leftover GB100 dies will be repurposed into RTX 5090s, effectively ending the shortage.

That said, this goes against an earlier report which stated Nvidia's RTX 50 series laptop chips have been pushed back due to supply issues. Now, the exact reason for that delay isn't clear, but some speculate it is due to a shortage of GPUs, while others opine it is due to performance and functionality issues.

Nevertheless, if true, those waiting to upgrade their GPUs might have to hold out for just a month. Plus, there will be more options to pick from. Along with the RTX 5090 and RTX 5080, the RTX 5070 Ti and RTX 5070 will also be out the door, along with AMD's Radeon RX 9070 XT and RX 9070. And if you're willing to wait a bit, AMD's 32 GB Radeon RX 9070 XTX is due to come out later this year.


https://www.notebookcheck.net/RTX-5090-supply-to-greatly-increase-in-the-coming-weeks.962149.0.html
 
It uses a GB100 GPU, from which the RTX 5090's GB202 is derived.
Hmm... what?

I thought those GPUs had no RT cores, for example, and a very different die size as well.

https://www.techpowerup.com/gpu-specs/nvidia-gb100.g1069
Die size: unknown, but probably in the 8xx mm² range like all the previous ones.

The leaker is not saying they will turn GB100s into RTX 5090s, but that Nvidia booked extra TSMC fab capacity, having ordered more than the number of GB100s they ended up making. Going to the tweet directly:
https://x.com/Zed__Wang/status/1890643714009121073
Imagine you are Nvidia and have purchased shit loads of TSMC yields for B200, but now the market doesn’t want that much B200, and RTX40 is retired……The only solution is to make as much RTX50 as possible to cover the unused yield of B200

Looking at the spending facebook, google, microsoft, etc. announced for 2025 in the last few weeks, I doubt the leaker (2-3 weeks ago he was saying the low volume was a Chinese New Year issue...; I doubt GB200 demand went from "unable to spare capacity for desktop GPUs" to "panicking over unused TSMC pre-orders" in a couple of weeks...)
 
Oh ok, I thought it was magic, just turning data center chips into consumer chips. It sounded good though.
 
Oh ok, I thought it was magic, just turning data center chips into consumer chips. It sounded good though.
Maybe they could for the L40/x40 and RTX 6000 types, which use the same die, but the A100-H100-B100 types are quite different: bigger, different types of cores, HBM memory and NVLink controllers and so on. Maybe some PCIe versions of the x100 do not carry those, but I could see them just being disabled as a way to bin chips.


A100 vs a 3090 for example
A100:
https://x.com/Locuza_/status/1506074642976493571/photo/4
vs
https://www.tomshardware.com/news/infrared-photographer-photos-nvidia-ga102-ampere-silicon


The transistors for doing FP64 will also be completely different this time around; desktop GPUs barely feature them, would be my guess. So it's not just RT cores vs. no RT cores, or the memory, or the network/NVLink-type controllers.
 
I’m so poor I can’t even justify spending $2400 on a video card. It’s tough to realize there are people that destitute out there, but I am one of them.
 
I remember reading here and I believe it was posted that Nvidia adjusts manufacturing capacity based on data center sales. So, the chips are not repurposed, but they can increase consumer chips manufactured if the data center demand decreases ensuring all available manufacturing capacity is used and sells through. Since they are Nvidia, it's likely the higher margin chips would be the candidates for any extra capacity. So, it lines up that if data center demand/sales drop, consumer chips may become more plentiful short term. We'll see soon enough.
 
We'll see soon enough.
I doubt we will ever know the supply of 5090s, and even if we did, whether it was the planned supply all along or a reaction to lower-than-expected GB100/200 purchase volume.

That's the nice thing about those statements: they can never be proven true or false, so they can be made willy-nilly.
 
How long does it take to produce something like a modern GPU (or CPU) chip, starting from a blank wafer? That is, what's the lead time on these supposed new cards, as far as the silicon goes?
 
If I made a product that people were camping at stores for, one that was selling out in minutes with that insane amount of markup, you could bet your ass I would be running 3 shifts, adding overtime bonuses, whatever I had to do to make sure that everyone who wanted my product got one. If the product is not on the shelf, you are missing sales. It's ridiculous the way consumers are being treated, and that goes for all GPU makers.
 
How long does it take to produce something like a modern GPU (or CPU) chip, starting from a blank wafer? That is, what's the lead time on these supposed new cards, as far as the silicon goes?
Just getting from wafer to working die can involve over a thousand steps now, before the testing and packaging ones; it can take many months, apparently.

https://semiengineering.com/battling-fab-cycle-times/

Generally, the most common metric for cycle time in the fab is “days per mask layer.” On average, a fab takes 1 to 1.5 days to process a layer. The best fabs are down to 0.8 days, Leachman said.

A 28nm device has 40 to 50 mask layers. In comparison, a 14nm/10nm device has 60 layers, with 7nm expected to jump to 80 to 85. 5nm could have 100 layers. So, using today’s lithographic techniques, the cycle times are increasing from roughly 40 days at 28nm, to 60 days at 14nm/10nm, to 80 to 85 days at 7nm. 5nm may extend to 100 days using today’s techniques, without extreme ultraviolet (EUV) lithography.


Something coming out in 3 weeks would have needed to be decided and started more than 3 months ago (maybe more than 5? The above covers just the layers; there are many other steps, then testing, packaging, assembling the card, testing the card, shipping the card). With all the other steps involved, you cannot really react that quickly when dealing with advanced nodes, it seems.
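Those quoted figures turn into a rough back-of-envelope calculation (the layer counts and days-per-layer rates are the industry averages from the article above, not figures for any specific fab or Nvidia product):

```python
# Rough wafer cycle-time estimate: mask layers x days per mask layer.
# Layer counts and per-layer rates are the industry averages quoted above.
def fab_cycle_days(mask_layers, days_per_layer=1.0):
    """Approximate days from blank wafer to fully processed wafer."""
    return mask_layers * days_per_layer

layer_counts = {"28nm": 45, "14nm/10nm": 60, "7nm": 83, "5nm": 100}
for node, layers in layer_counts.items():
    avg = fab_cycle_days(layers)         # average fab, ~1 day/layer
    best = fab_cycle_days(layers, 0.8)   # best fabs, ~0.8 days/layer
    print(f"{node}: ~{avg:.0f} days (best case ~{best:.0f})")
```

Even at the best-fab rate, a 5nm-class wafer spends on the order of 80 days in processing, before test, packaging, and board assembly, which is why output can't be redirected on a few weeks' notice.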
 
LOL. Please show me all the cases of anyone dying or their house burning down. You know what? You can't.

Get off the clickbait bandwagon. SMDH.

Guess what? My "fire hazard" 4090 has been going strong since release. I'm still in my home and haven't had a whiff of smoke. I expect the same with my future 5090.
Nvidia made the 5090 actually worse than the 4090: there is no proper load balancing, so the cable can overheat even when plugged in properly, since the card can draw uneven current on each pin.
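As a back-of-envelope sketch of why uneven draw matters (the wattage, pin count, and per-pin rating below are commonly cited round numbers assumed for illustration, not measurements from this thread):

```python
# Per-pin current on a 12VHPWR/12V-2x6 connector, assuming a ~575 W
# card on a 12 V rail, 6 current-carrying 12 V pins, and a ~9.5 A
# per-pin rating (all assumed round numbers for illustration).
POWER_W, VOLTS, PINS, PIN_RATING_A = 575.0, 12.0, 6, 9.5

total_a = POWER_W / VOLTS        # ~47.9 A total through the connector
balanced_a = total_a / PINS      # ~8.0 A per pin: inside the rating

# With no per-pin balancing, suppose two low-resistance pins end up
# carrying half the total current between them:
skewed_a = (total_a / 2) / 2     # ~12.0 A on each of those two pins
print(round(balanced_a, 1), round(skewed_a, 1), skewed_a > PIN_RATING_A)
# -> 8.0 12.0 True
```

The point of the sketch: the connector is fine on average, but without balancing, a couple of pins can quietly run well past their rating and heat up.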

Why are some people so desperate to get one anyway? From the mood one would think you are standing in line for cheap eggs.
 
Nvidia made the 5090 actually worse than the 4090: there is no proper load balancing, so the cable can overheat even when plugged in properly, since the card can draw uneven current on each pin.

Why are some people so desperate to get one anyway? From the mood one would think you are standing in line for cheap eggs.
Who's desperate? I'll buy one once I can get one for MSRP. I buy the top end time after time. AMD doesn't compete there and hasn't for a while.
 
Something coming out in 3 weeks would have needed to be decided and started more than 3 months ago
Yeah, I figured it was something like that and couldn't just be "decided to redirect production on the spur of the moment."
 
Hmm... what?

I thought those GPUs had no RT cores, for example, and a very different die size as well.

https://www.techpowerup.com/gpu-specs/nvidia-gb100.g1069
Die size: unknown, but probably in the 8xx mm² range like all the previous ones.

The leaker is not saying they will turn GB100s into RTX 5090s, but that Nvidia booked extra TSMC fab capacity, having ordered more than the number of GB100s they ended up making. Going to the tweet directly:
https://x.com/Zed__Wang/status/1890643714009121073
Imagine you are Nvidia and have purchased shit loads of TSMC yields for B200, but now the market doesn’t want that much B200, and RTX40 is retired……The only solution is to make as much RTX50 as possible to cover the unused yield of B200

Looking at the spending facebook, google, microsoft, etc. announced for 2025 in the last few weeks, I doubt the leaker (2-3 weeks ago he was saying the low volume was a Chinese New Year issue...; I doubt GB200 demand went from "unable to spare capacity for desktop GPUs" to "panicking over unused TSMC pre-orders" in a couple of weeks...)
I am not saying that the article is correct, but that is their claim. Here is a direct quote from the article.
An X leaker says the RTX 5090 shortage could be over soon due to low demand for Nvidia's B200 GPUs. The GB100 GPUs powering it will be repurposed for the RTX 5090's GB202.
Anil Ganti, Published 02/17/2025
 
I am not saying that the article is correct but that is their claim. Here is a direct quote from the article.
Yes, and it is a very strange article. It even says (as you quote, which should be a clue not to trust that writer, because if you go to the message-board post referenced, not only are we talking message-board-post levels of sourcing, but in no way does it say that):
And if you're willing to wait a bit, AMD's 32 GB Radeon RX 9070 XTX is due to come out later this year.

The source only says maybe....

People can simply go to the tweet directly instead. I am not sure that publication has an editor, or that the author is serious; he seems like a troll online.
 
26 minutes..... does he think anyone seeing those combos doesn't understand them after 2 or 3 seconds max?

There is an actual 2-3 minutes of interesting content in there (or a 2-page article: time in stock, price over time, etc.), but in a 26-minute package....

Jay2Cents had a 16-minute video to say one sentence: we had an Asus SKU we were told to review as an MSRP model, yet 2 big retailers showed it at $900, and I have no idea why or anything to say about it....
 
26 minutes..... does he think anyone seeing those combos doesn't understand them after 2 or 3 seconds max?

There is an actual 2-3 minutes of interesting content in there (or a 2-page article: time in stock, price over time, etc.), but in a 26-minute package....

Jay2Cents had a 16-minute video to say one sentence: we had an Asus SKU we were told to review as an MSRP model, yet 2 big retailers showed it at $900, and I have no idea why or anything to say about it....
He thinks his audience is so moronic that they'll fall for the clickbait title and thumbnail. Hell, he doesn't just think it, he knows it. He sees the single-brain-cell "gamers" who can't even plug in a cable all the way.
 
26 minutes

There is an actual 2-3 minutes of interesting content in there
You just defined 99% of YouTube!

I'm glad our work lab of 16 3090s still does what I need it to do in a more than acceptable timeframe. And that my 4090 still does everything I need it to do and more.

Looking forward to the RTX 6090, or even the 7090. Maybe by the time the 7090 comes around, we'll be making some in the USA.
 
Again, I know a lot of folks on here hate LTT (and after I saw some different takes on the matter by other YouTubers, there's probably some merit to the hate), but I think his WAN Show talks are sometimes interesting:

View: https://youtu.be/oav0JrnEnCY

This specifically deals with the current PC consumer space holistically, especially the GPU market, and the note it ends on is sort of haunting. I think he's right. I think today and the near future is as close as PC gaming is getting to just outright dying. This frenzy over 5080 and 5090 purchasing is sort of masking it, but I think a lot of the PC gaming space is just growing apathetic. They're not yelling, screaming and shouting about it because they just no longer give a shit. Not being able to afford anything that's worth a damn for the majority of people is just becoming the status quo.

They also touch on the interesting trend of how the 5090 and the highest-end flagships have slowly been creeping into "what everyone is interested in." Like I remember back in the day, they were right: I got a 760, or 760 SLI, and I was still gaming pretty good. Fast forward to today, 4060s (and doubtless the 5060 that follows) will feel like a shitty deal. They're what most people can still buy (because if you still care, what the fuck else are you going to do?), but that's not where the interest is. Probably intentional from Nvidia and other manufacturers: gradually shift the GPU brand image from low-margin, high-supply products to low-supply products at high margins.

To be clear, it's not just an Nvidia issue. It's the way the corps are fucking people over, the way real estate is fucking people over, etc., etc... all of this is just building upon itself and this is just where we're at. It's not good, but it was probably unavoidable. Throw in AI, which will probably make all of this even worse as it keeps advancing.
 
I think today and the near future is as close as PC gaming is getting to just outright dying.
The notion that the 5080 or 5090 has much to do with PC gaming comes from having been really rich for a long time and being around other very rich (or at least big-spending) YouTubers.

If we look at Steam active accounts, would that notion make any sense?

Kingdom Come sold 2 million copies in a couple of weeks, and a lot of those were on PC.

For one, kids, the future of the industry, never bought those expensive GPUs; and large parts of the Middle East, India, China, etc., are where the industry is booming. I doubt a lack of 5090s is a big variable.

This sounds to me like content creators going for spectacular, ridiculous hyperbole the way the regular news does. The top games people are actually playing, at least until GTA 6 launches, are not that hardware-demanding; they play fine on laptops.

Seeing giant lines for just a chance at buying a $2200-2800 GPU and concluding it is a sign PC gaming is about to outright die is wanting to spin it in a negative way. Or rather, it is seeing it from the point of view of people like themselves: people who make reviews 24 hours before the launch of a product nobody can buy for months anyway, which removes all the fun of it (and they sold their souls for access, for no reason anymore), while being almost irrelevant to the question of whether PC gaming will still exist in 2 years.

Probably intentional
Could be because of the pundit-content industry as well, or the lack of overclocking; people are in some ways still more interested in the lower tiers, but there is nothing to talk about and no content made about them. Could also just be aging. I am sure kids building up PCs with part-time McDonald's money could still be excited by the latest lower-range card and what it does to the price of the used previous high-range, and so on. When he was young and all about that, maybe older rich gamers only cared about the 8800 GTX as well.

Fast forward to today, 4060s (and doubtless the 5060 that follows) will feel like a shitty deal.
https://store.steampowered.com/hwsurvey/videocard/

The 4060 would not have been by far the most popular GPU model of the last 2 years if that were the case. Maybe $300 for a 4060 sounds like a shitty deal, but for a lot of people, $1000 for the best GPU ever would sound like a shitty deal.
 
The 4060 would not have been by far the most popular GPU model of the last 2 years if that were the case. Maybe $300 for a 4060 sounds like a shitty deal, but for a lot of people, $1000 for the best GPU ever would sound like a shitty deal.

Re-read what I said. It's the only option people have at the price level they're willing to spend. That doesn't make it feel like a good deal, and people certainly won't put up with this forever. Now if you're going to point out that not everyone even cares whether what they get is a good deal, and most consumers are ignorant... then sure, fair enough. But eventually word of mouth will trickle down. Everyone knows that one person who's "really into computers".

The rest is speculation so I won't comment, but you do have some fair points. That said, Linus is looking at numbers and statistics based on his own channel views. Though who knows, maybe there are third variables in there, too.
 
The best way to know if someone considered it a shitty deal: did they buy it? If so, you know the answer. Maybe they did not think of it as a good deal, maybe it came in a pre-built gaming PC and the overall package did not look bad, but it did not feel shitty. And for a long time it was not the only option: the 3060 was available, the 7600 for a while, the 7600 XT.

That said, Linus is looking at numbers and statistics based on his own channel views. Though who knows, maybe there are third variables in there, too.
Yes, the PC-gaming review/pundit/not-playing-but-watching-content-about-it industry could be hit hard by this... worldwide PC gaming is a different thing. If the 5060, 5070, 9070, etc. are hard to buy for a very long time, then sure, but the 5090? Virtually every potential buyer of a 5090 has a GPU good enough to play everything fine already.

DIY PC building, for example, could take a hit. But kids will stop playing Fortnite, Roblox, and Minecraft because of this? That sounds a bit hysterical.
 
The best way to know if someone considered it a shitty deal: did they buy it? If so, you know the answer. Maybe they did not think of it as a good deal, maybe it came in a pre-built gaming PC and the overall package did not look bad, but it did not feel shitty. And for a long time it was not the only option: the 3060 was available, the 7600 for a while, the 7600 XT.

You don't see all of the copium from people saying "this is a shitty deal but I need to upgrade"? The notion that people don't buy things they know are a ripoff is silly at best. You're conflating two completely different things: behavior and attitude. Look at some social psychology when you have time.
 
Who's desperate? I'll buy one once I can get one for MSRP. I buy the top end time after time. AMD doesn't compete there and hasn't for a while.
Well, you seem desperate, even if you don't recognize it about yourself. But what I want to know is: why do you want one? In what specific application will you use it that makes it sensible to pay this much for a minor upgrade? As for MSRP, I doubt you'll see it unless you're willing to wait a year or more.
 
Again, I know a lot of folks on here hate LTT (and after I saw some different takes on the matters by other Youtubers, there's probably some merit to the hate), but I think his WAN show talks are sometimes interesting:
Just because LTT is a manipulative, selfish POS emotional bully doesn't mean everything he says on his channel is wrong.
This specifically deals with the current PC consumer space holistically, especially the GPU market, and the note it ends on is sort of haunting. I think he's right. I think today and the near future is as close as PC gaming is getting to just outright dying. This frenzy over 5080 and 5090 purchasing is sort of masking it, but I think a lot of the PC gaming space is just growing apathetic. They're not yelling, screaming and shouting about it because they just no longer give a shit. Not being able to afford anything that's worth a damn for the majority of people is just becoming the status quo. They also touch on the interesting trend of how the 5090 and the highest-end flagships have slowly been creeping into "what everyone is interested in." Like I remember back in the day, they were right: I got a 760, or 760 SLI, and I was still gaming pretty good. Fast forward to today, 4060s (and doubtless the 5060 that follows) will feel like a shitty deal. They're what most people can still buy (because if you still care, what the fuck else are you going to do?), but that's not where the interest is. Probably intentional from Nvidia and other manufacturers: gradually shift the GPU brand image from low-margin, high-supply products to low-supply products at high margins.
Everybody who has ever said PC gaming is dying has been wrong so far. In fact, PC gaming is living through a renaissance right now. Nothing signifies this more than big game publishers abandoning console exclusivity, either entirely or reducing it to much shorter time spans (months instead of years).

There is no frenzy for buying the 5080 or 5090; what you are seeing is a few enthusiasts with more money than sense fighting over a paper launch. But high-end GPUs were never the meat of PC gaming. They get talked about the most, and they make or break a manufacturer's reputation, but it is the 2060s, 3060s and 4060s that matter, because that's where the volume ships. For investors, margins matter less than growth and revenue. AI chips might hinder the supply of high-end gaming cards, but they won't affect the mid and low end, which make up the majority of PCs used for gaming.

Average kids could never afford high-end GPUs, even when they were more reasonably priced. When I was in college, all I could afford was a used 8500LE, yet I still loved gaming. You don't need to be on the bleeding edge to enjoy PC gaming.
 
There is no frenzy for buying the 5080 or 5090; what you are seeing is a few enthusiasts with more money than sense fighting over a paper launch. But high-end GPUs were never the meat of PC gaming. They get talked about the most, and they make or break a manufacturer's reputation, but it is the 2060s, 3060s and 4060s that matter, because that's where the volume ships. For investors, margins matter less than growth and revenue. AI chips might hinder the supply of high-end gaming cards, but they won't affect the mid and low end, which make up the majority of PCs used for gaming.

Average kids could never afford high-end GPUs, even when they were more reasonably priced. When I was in college, all I could afford was a used 8500LE, yet I still loved gaming. You don't need to be on the bleeding edge to enjoy PC gaming.

RTX 5070 Ti prices are listed online; I only see a PNY model (which does not look that bad) at MSRP. The rest are around $900, with some models at $1000. That is more or less an upper-mid-range GPU. Quite a lot of money; hopefully many more models settle at MSRP, but even then $750 is a lot. The regular 5070 is more like a **60 card, at $550 if they can be had for MSRP, which I kind of doubt.

Edit: Now I cannot even find the PNY model. Cheapest I can find is a Gigabyte for $900.

Point being, the mid-range GPUs are going to see a price jump too.
 
Average kids could never afford high-end GPUs, even when they were more reasonably priced.
I've been doing a lot of gaming on APUs lately. For $400, you can get a Ryzen 7735HS with an iGPU that will play stuff like heavily modded Minecraft, indie stuff like Dredge, and most non-AAA games all day long.
 
I've been doing a lot of gaming on APUs lately. For $400, you can get a Ryzen 7735HS with an iGPU that will play stuff like heavily modded Minecraft, indie stuff like Dredge, and most non-AAA games all day long.
Does this iGPU have the performance of a GTX 950 from 2015? You could buy one of those for $40 on eBay, not $400.
 
26 minutes..... does he think anyone seeing those combos doesn't understand them after 2 or 3 seconds max?

There is an actual 2-3 minutes of interesting content in there (or a 2-page article: time in stock, price over time, etc.), but in a 26-minute package....

Jay2Cents had a 16-minute video to say one sentence: we had an Asus SKU we were told to review as an MSRP model, yet 2 big retailers showed it at $900, and I have no idea why or anything to say about it....
20 minutes is the magic number for YouTube's algorithm, which is why 90% of YouTube videos are that length.

YouTube doesn't favor a particular video length, but longer videos do have advantages over shorter ones: they can accumulate more watch-time minutes, leading to more channel growth.

"Why longer videos are becoming more commonplace on YouTube" (Digiday, Jul 15, 2024): the number of 20-plus-minute videos creators around the world upload to YouTube each month has increased from 1.3 million in July 2022 to 8.5 million in June 2024, according to data...
 