GeForce RTX 3080 sees increasing reports of crashes in games

Just read the Igor's Lab article. Indeed it was a great read. If some AIB partners are choosing (or were choosing, for the first couple of manufacturing batches) hardware components that are inferior to what the card needs just to rush them to market (and as nVIDIA has admitted, it and the AIB partners have only been making this card since sometime last month), that's a bad play on the first few batches of cards. One can only hope this is corrected in later batches by the AIBs involved. We'll see. Out!
 

Damn, if true it's best to avoid the early cards. This release is one fiasco after another. Time to grab some popcorn and wait for the dust to settle...
 
The only thing I disagree with is that nVidia doesn't share any blame for this. It shares quite a bit.

a) The reference design allows for either. The only way a manufacturer is going to run into problems is if power draw is higher than originally expected, or yields don't fit comfortably within performance targets. It's really obvious that these cards are at near-maximum performance with very little headroom left. nVidia hasn't done anything like this since before Pascal. So changing power requirements is not the fault of AIBs. The saving grace is that there are SOME cards on the market that perform within expected tolerances, which prevents claims of just straight-up fraud.

b) nVidia rushed this launch. Nvidia gave AIBs a month; for Turing they had two months. Regardless of what anyone says, more and more the problems seem to come from late changes made to eke out performance at the last minute.

c) I don't think anyone would have cared if the performance target had been relaxed a little. If that meant you win some, you lose some, who would really care? This type of stuff causes more damage, not less.
 
What a shit show. The safe bet at this point is to stick to FEs or overbuilt AIB cards like the Strix.
 
Saw this randomly on the facebook. I don’t even follow them.

[attached screenshot]
 

Asus TUF has a better design as well according to Igor.

By the way, you also have to praise a company here that recognized the whole thing from the start and didn't let it affect them at all: the Asus TUF RTX 3080 Gaming consequently did without POSCAPs and only used MLCC groups. My compliments, well done!

https://www.igorslab.de/en/what-rea...tabilities-of-the-force-rtx-3080-andrtx-3090/
 
It's probably because they don't want the subreddit to be littered with 100 threads on the same topic.

https://www.reddit.com/r/nvidia/com...urce=share&utm_medium=ios_app&utm_name=iossmf

This thread is a hot post in the subreddit and hasn’t been deleted. Seems like they’re trying to keep discussion in there.

Yeah, again, a subreddit run by the company makes it easier to keep a lid on things. If you search for driver issues you find a crap ton of reddit posts for AMD (community run) and only a few for Nvidia. It's because they "consolidate" similar complaints into one thread so it doesn't look like a widespread issue, but keep it alive so they can say they acknowledge the issue. It's just another form of control they are able to maintain.

Anyways, Nvidia gave AIBs the reference design and then didn't use it themselves; maybe now we know why ;). The simplest "fix" is more than likely going to be some sort of limit on boost clocks. Guess we'll see how widespread this is, but with the limited quantity of cards available and this many complaints coming in, it's looking like an issue on a pretty good scale.

AMD, please don't botch this launch, lol. This is your chance to be taken seriously, a small opening gifted to you by Nvidia. Don't blow it, we need the competition.
 
Saw this randomly on the facebook. I don’t even follow them.
I like how it goes from one guy speculating to "likely", lol. Also, it has happened on FE cards as well as TUF cards, so it's hard to say it's just down to a capacitor, although better ones may mask the issue. Seems the fix is to lose a bit of frequency... Hopefully this isn't the permanent fix; guess we'll see how Nvidia responds.
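If anyone wants to try the "lose a bit of frequency" workaround themselves while we wait for a real fix, something along these lines should do it. Rough sketch only: it assumes a driver new enough to support nvidia-smi's --lock-gpu-clocks (Volta and newer), that you run it with admin/root rights, and the 1800 MHz cap is just a number I picked for illustration, not anything official.

[CODE]
# Rough sketch: cap the GPU core/boost clock with nvidia-smi as a temporary
# workaround. Assumes a driver new enough for --lock-gpu-clocks (Volta+) and
# admin/root rights. The 1800 MHz ceiling below is an arbitrary example.
import subprocess

def lock_gpu_clocks(max_mhz: int, min_mhz: int = 210, gpu_index: int = 0) -> None:
    """Pin the GPU core clock into [min_mhz, max_mhz] so boost can't overshoot."""
    # 210 MHz is a typical idle clock; adjust if your card reports a different minimum.
    subprocess.run(
        ["nvidia-smi", "-i", str(gpu_index),
         f"--lock-gpu-clocks={min_mhz},{max_mhz}"],
        check=True,
    )

def reset_gpu_clocks(gpu_index: int = 0) -> None:
    """Remove the lock again and go back to default boost behaviour."""
    subprocess.run(
        ["nvidia-smi", "-i", str(gpu_index), "--reset-gpu-clocks"],
        check=True,
    )

if __name__ == "__main__":
    lock_gpu_clocks(1800)   # test your games, then call reset_gpu_clocks() to undo
[/CODE]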
 
Companies love to screw up power delivery. Kind of surprising on a graphics card though, since that's a huge part of it... maybe with the rushed timeline they did too much copy-paste and the higher power draw screwed them. But a bunch of them?

Kind of happy I didn't get a card right now, lol. Wouldn't kill me to run -100 MHz, but I wouldn't be thrilled either.

Wow, hardware issues borking release cards. It's 2018 and "space invaders", all over again.

Yeah seems like both companies are getting proficient at mauling launches.
 
Jay had a good video on this: companies using the cheaper components is what's causing the crashes.

You've got to give it to nVidia marketing. "It's the AIBs!"... the same AIBs who are using our reference design... lol
 
Yup, deja vu all over again... Looks like the safe bet is ASUS or Nvidia FE cards for now.
Supposedly the MSI Gaming X Trios also have the same 4x POSCAP and 2x MLCC layout that the FE has, but I'm not 100% sure.

EDIT: This is for the 3090 and not the 3080.
 
"MSI uses a single MLCC group in the central arrangement, with five SP-CAPs", according to Techpowerup.
Gigabyte and EVGA are the unknowns I'm curious about.

https://www.techpowerup.com/272591/...ly-connected-to-aib-designed-capacitor-choice
You're correct. MSI used 2 MLCCs on the 3090 MSI Gaming X Trio, according to Jay's video.

Apparently the 3080 FTW3 cards use a 6x POSCAP array (according to Newegg/Amazon photos), while the cheaper 3080 XC3 uses a 5x POSCAP + 1x MLCC array (which apparently is the reference design). Why would they put cheaper components on the FTW3 card as opposed to the XC3?
 
Interesting... Looks like it'll be the usual Asus Strix for me unless Gigabyte decides not to cut corners with their 3080 Master.
 
My goal has always been the Gigabyte Gaming OC 3080 (which has a similar layout to the Eagle), and we've had a teardown of that for a while. We can see there that it's using 6x 470 µF POSCAPs, while the cheap Zotac, for example, is using 6x 330 µF.
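Just to put those numbers in perspective, here's the back-of-the-napkin math (total bulk capacitance only; parallel caps simply add, and this says nothing about ESR/ESL or high-frequency behaviour, which is what the POSCAP vs MLCC argument is actually about):

[CODE]
# Back-of-the-envelope bulk capacitance comparison using the per-cap values above.
# Parallel capacitors add, so this is straight multiplication. It deliberately
# ignores ESR/ESL and frequency response, which is where MLCC groups supposedly
# have the edge over POSCAPs.
gigabyte_gaming_oc_uf = 6 * 470   # 6x 470 uF POSCAPs -> 2820 uF total
zotac_uf = 6 * 330                # 6x 330 uF         -> 1980 uF total

print(f"Gigabyte Gaming OC: {gigabyte_gaming_oc_uf} uF")
print(f"Zotac:              {zotac_uf} uF")
print(f"Difference:         {gigabyte_gaming_oc_uf - zotac_uf} uF "
      f"({gigabyte_gaming_oc_uf / zotac_uf:.2f}x)")
[/CODE]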
 
You've got to give it to nVidia marketing. "It's the AIBs!"... the same AIBs who are using our reference design... lol

Keep in mind that the Founders Editions don't use the reference design. Which makes me wonder: did they even test the reference design PCB in-house before they gave it to the AIB partners?

What a fluster cuck!
 
Glad I went with an FE. This impacts a lot of the AIB 3080s, right?
 

Looks like the AIBs who chose to use cheaper components are the ones affected. Specifically Gigabyte, Colorful, and (rumored) Zotac.
 

This may cause a lot of dead stock, or AIBs re-flashing the cards before shipping them.
Stock will be non-existent for a while, especially if everyone runs to FE and ASUS cards.

I wouldn't want to buy a flashed card, because any OC will crash the card.
 

Cheap ass AIBs need to take the cards back and replace them with decent cards.
 
I agree, but you know they are going to try and flash them to an 1850 MHz limit or something.
Nvidia isn't going to take responsibility for a reference design that it didn't follow itself. Either way, the pipeline for cards will be empty for a while.
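For anyone poking at this at home, a quick way to see what frequency the card is actually hitting when a game crashes (or whether an 1850 MHz cap is really being respected) is just to poll nvidia-smi. Rough sketch, assuming nvidia-smi is on your PATH and your driver supports the clocks.sm query field:

[CODE]
# Quick-and-dirty clock logger: polls the SM clock once a second so you can see
# roughly what frequency the card was at when a game crashed, or confirm a clock
# cap is holding. Assumes nvidia-smi is on the PATH; GPU index 0 is an assumption.
import subprocess
import time

def current_sm_clock_mhz(gpu_index: int = 0) -> int:
    out = subprocess.run(
        ["nvidia-smi", "-i", str(gpu_index),
         "--query-gpu=clocks.sm", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    return int(out.stdout.strip())

if __name__ == "__main__":
    while True:
        print(f"{time.strftime('%H:%M:%S')}  SM clock: {current_sm_clock_mhz()} MHz")
        time.sleep(1)
[/CODE]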
 

If they do that, they are signing their death warrants. The AIBs are responsible for this. They could have charged $10 more for each card with the proper hardware and all would be well. I don't understand why they are trying to take such shortcuts. They know NV cards are money makers, so why bring discredit on your company by trying to make an extra couple of dollars?
 

Fancy cooler designs with huge logos and RGB come first, of course. They probably thought OCs would be limited but boost clocks would pass the test.
 
This is pretty easy for someone to test. Just stack a few more capacitors!
 
Well, like Jay said, companies are probably gonna put out firmware updates because it's free. Gonna be a lot of pissed-off gamers, and rightfully so.
 