GeForce RTX 3080 sees increasing reports of crashes in games

kac77

2[H]4U
Joined
Dec 13, 2008
Messages
2,584
Looks like Nvidia holding back suitable drivers for the AIBs to test their configurations also contributed to this problem. Makes me wonder if the original reference spec was based on a certain quality level for the GPU, which Nvidia later could not meet, with the chips sent to the AIBs anyway. Like most failures, it is like a hangman game where many pieces have to come together to cause the failure.
  • AIBs were given reference specs that, in the end, poorly supported the quality of the GPUs delivered
  • Nvidia held back applicable software (drivers), hampering the AIBs' proper testing and verification
  • Nvidia allowed flawed designs to be made and sold, neither overseeing nor working with the AIBs, effectively allowing faulty cards to be purchased
  • The BOM and the MSRP make AIBs more likely to go with the minimum spec that Nvidia provided, while competing against Nvidia's better-constructed FE model at the same price
  • The low supply of Ampere GPUs will make it hard for the AIBs to rapidly correct/replace the bad cards in the immediate future, prolonging the issue for users and the bad publicity for the AIBs with the worst problems
    • As in, Gigabyte and Zotac are more likely to be considered the cheap or low-quality cards (which frankly seems to have been the case previously) even if they correct the problem
    • Instead of replacing bad cards, firmware reducing performance may be implemented to allow continued use at a performance loss
Top it off with Jensen's rather misleading marketing of a 1.8x efficiency improvement and performance gains, and maybe even an MSRP that is completely unrealistic for AIBs to build cards at, and it all adds to the overall dismay. Building hype and not following through usually ends up hurting the company.

It's an all of the above problem for sure.....mostly planning and execution.
 
Last edited:
Reactions: noko

motqalden

[H]ard|DCOTM x3
Joined
Jun 22, 2009
Messages
1,633
I've got a bit of a predicament...

I had an almost identical situation with Amazon: they said stock ran out as I was submitting my order, but they still accepted it and said they were waiting on more stock to fulfill it.
I decided to cancel my order with them, but that was mostly due to the early reports of how the Ventus cooler performs, showing it needing pretty high fan speeds to keep temps around 70C.
I think if the POSCAP situation is your only concern, you should probably be OK, since the Ventus is supposed to be 5+1, which is not showing as many problems as the cards with six. For example, the EVGA XC3 is 5+1, and EVGA said this was fine for those cards. In the end, I decided I would rather wait potentially months longer to get the card model I really wanted, and I didn't want to mess around with reselling it if I wasn't satisfied.
 

thesmokingman

Supreme [H]ardness
Joined
Nov 22, 2008
Messages
6,229
If they do that, they are signing their death warrants. The AIBs are responsible for this. They could have charged $10 more for each card with the proper hardware and all would be well. I don't understand why they are trying to take such shortcuts. They know NV cards are moneymakers, so why bring discredit on your company by trying to make an extra couple of dollars?

Concur. The AIBs are right screwed. The ones cutting corners had better fess up now and take the cards back; there is nowhere for them to hide in grey areas. Flashing the cards to a lower state of performance... yeah, that gets into legal territory that they will lose in. I think Nvidia is the root of this. Their leadership should have prevented idiocy like this from occurring...
 

kac77

2[H]4U
Joined
Dec 13, 2008
Messages
2,584
Looks like Nvidia holding back suitable drivers for the AIBs to test their configurations also contributed to this problem. Makes me wonder if the original reference spec was based on a certain quality level for the GPU, which Nvidia later could not meet, with the chips sent to the AIBs anyway. Like most failures, it is like a hangman game where many pieces have to come together to cause the failure.
  • AIBs were given reference specs that, in the end, poorly supported the quality of the GPUs delivered
  • Nvidia held back applicable software (drivers), hampering the AIBs' proper testing and verification
  • Nvidia allowed flawed designs to be made and sold, neither overseeing nor working with the AIBs, effectively allowing faulty cards to be purchased
  • The BOM and the MSRP make AIBs more likely to go with the minimum spec that Nvidia provided, while competing against Nvidia's better-constructed FE model at the same price
  • The low supply of Ampere GPUs will make it hard for the AIBs to rapidly correct/replace the bad cards in the immediate future, prolonging the issue for users and the bad publicity for the AIBs with the worst problems
    • As in, Gigabyte and Zotac are more likely to be considered the cheap or low-quality cards (which frankly seems to have been the case previously) even if they correct the problem
    • Instead of replacing bad cards, firmware reducing performance may be implemented to allow continued use at a performance loss
Top it off with Jensen's rather misleading marketing of a 1.8x efficiency improvement and performance gains, and maybe even an MSRP that is completely unrealistic for AIBs to build cards at, and it all adds to the overall dismay. Building hype and not following through usually ends up hurting the company.
Looking at it again, there's one thing not on this list: the manufacturing process and the power being used. I'm surprised no one has really touched on this, but these cards use more power than ANY Nvidia card before them. They use more power than ANY Titan card. Actually, these cards use more power than ANY video card of the last few generations. So the AIBs literally can't reuse components like they probably have in the past. The top cards of the 80 series have pretty much stayed around the 250W mark; these cards are 100W more.
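For a rough sense of the jump, here's a quick back-of-the-envelope comparison (Python; the board-power figures are approximate reference TDPs from memory, so treat them as illustrative rather than measured):

```python
# Approximate reference board power (W) for recent flagship-class cards.
# These are ballpark TDP ratings, not measured draw.
tdp_watts = {
    "GTX 980": 165,
    "GTX 1080": 180,
    "RTX 2080": 225,
    "RTX 2080 Ti": 260,
    "Titan RTX": 280,
    "RTX 3080": 320,
    "RTX 3090": 350,
}

baseline = tdp_watts["RTX 2080 Ti"]
for card, watts in tdp_watts.items():
    print(f"{card:<12} {watts:>3} W ({watts - baseline:+d} W vs. 2080 Ti)")
```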
 
Reactions: noko

noko

Supreme [H]ardness
Joined
Apr 14, 2010
Messages
5,600
Looking at it again, there's one thing not on this list: the manufacturing process and the power being used. I'm surprised no one has really touched on this, but these cards use more power than ANY Nvidia card before them. They use more power than ANY Titan card. Actually, these cards use more power than ANY video card of the last few generations. So the AIBs literally can't reuse components like they probably have in the past. The top cards of the 80 series have pretty much stayed around the 250W mark; these cards are 100W more.
Seems somewhat like desperation from Nvidia. How would Ampere perform at 250W versus 320W or 350W? Maybe a tester will do this, comparing Turing to Ampere watt for watt.
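For anyone who wants to try that at home before a reviewer does, a minimal sketch using nvidia-smi (assumes GPU index 0, admin rights, and a vBIOS that actually permits a 250W limit):

```python
import subprocess

# Cap the board power limit at 250 W (a stock 3080 FE is 320 W); needs
# root/admin and only works within the range the vBIOS allows.
subprocess.run(["nvidia-smi", "-i", "0", "-pl", "250"], check=True)

# Then log clocks, draw, and temperature once per second while the
# benchmark runs, to compare against a stock-limit pass.
subprocess.run([
    "nvidia-smi", "-i", "0",
    "--query-gpu=timestamp,clocks.gr,power.draw,temperature.gpu",
    "--format=csv", "-l", "1",
])
```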
 
Reactions: kac77

kalston

[H]ard|Gawd
Joined
Mar 10, 2011
Messages
1,173
Seems somewhat like desperation from Nvidia. How would Ampere perform at 250W versus 320W or 350W? Maybe a tester will do this, comparing Turing to Ampere watt for watt.

I'm guessing at 250W it would be an underwhelming jump over Turing, something like Pascal to Turing (or maybe worse?). But at current pricing I would still have been in the market for it; that's the funny thing (I skipped Turing entirely). The performance jump is bigger than I expected, but it's like buying a super-overclocked Turing card, with all those watts and seemingly barely stable clocks.
 

kac77

2[H]4U
Joined
Dec 13, 2008
Messages
2,584
I'm guessing at 250W it would be an underwhelming jump over Turing, something like Pascal to Turing (or maybe worse?). But at current pricing I would still have been in the market for it; that's the funny thing (I skipped Turing entirely). The performance jump is bigger than I expected, but it's like buying a super-overclocked Turing card, with all those watts and seemingly barely stable clocks.

Gamers Nexus kind of runs past this without addressing the elephant in the room:

"The 2080 Ti pulled 264W stock and 330W overclocked, for comparison, with the 2080 FE stock at 235W. With that 330W OC number, the 3080 was often still 10-15% ahead of the 2080 Ti OC while being a few watts lower, proving its efficiency improvements. "

Wait, hold up. 10-15%? After a node shrink? Basically, at the same power you have memory differences, architectural changes (a doubled core count), and process advancement all crammed into that 15%.
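Run the quoted numbers and the iso-power gain lands nowhere near the marketing figure. A quick back-of-the-envelope check (the exact 3080 wattage is an assumption, since GN only says "a few watts lower"):

```python
# Numbers from the Gamers Nexus quote above; the 3080 figure is assumed.
p_2080ti_oc = 330.0   # W, overclocked 2080 Ti
p_3080 = 325.0        # W, "a few watts lower" (assumed value)
speedup = 1.125       # midpoint of the 10-15% lead

iso_power_gain = speedup * (p_2080ti_oc / p_3080)
print(f"Implied perf/W gain at roughly equal power: {iso_power_gain:.2f}x")
# -> about 1.14x, versus the 1.8x efficiency improvement from the keynote,
#    which was presumably measured at iso-performance, not iso-power.
```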
 
Last edited:
Reactions: noko

noko

Supreme [H]ardness
Joined
Apr 14, 2010
Messages
5,600
Gamers Nexus kind of runs past this without addressing the elephant in the room:

"The 2080 Ti pulled 264W stock and 330W overclocked, for comparison, with the 2080 FE stock at 235W. With that 330W OC number, the 3080 was often still 10-15% ahead of the 2080 Ti OC while being a few watts lower, proving its efficiency improvements. "

Wait, hold up. 10-15%? After a node shrink? Basically, at the same power you have memory differences, architectural changes (a doubled core count), and process advancement all crammed into that 15%.
Nvidia maxed out the clocks on GA102; there's not much headroom. To see the architectural improvements, I think watt-for-watt would be a good starting point, except different loads will probably cause some differences in power usage once configured. If Nvidia put a bigger cooler on a 2080 Ti, maxed out the power, and added faster RAM, the 3080 would be, like you wrote, only 10-15% faster. In that respect, Ampere just doesn't seem as strong. Curious how well the 3070 lives up to the expectation of equaling or beating a 2080 Ti, or whether that is another lie.
 
Last edited:

MissJ84

2[H]4U
Joined
Dec 22, 2009
Messages
2,094
Some new stability findings are timestamped in the video linked below. He found instability in Windows, but none whatsoever in Linux, even with higher boost clocks (2100MHz). He also says that the press drivers weren't crashing, but the mainstream ones were.

Edit - I forgot this was a 3080 thread, but seems applicable nonetheless.

 

Nobu

Supreme [H]ardness
Joined
Jun 7, 2007
Messages
4,759
Some new stability findings are timestamped in the video linked below. He found instability in Windows, but none whatsoever in Linux, even with higher boost clocks (2100MHz). He also says that the press drivers weren't crashing, but the mainstream ones were.

Edit - I forgot this was a 3080 thread, but seems applicable nonetheless.

Sometimes Linux can recover from a GPU hardware fault without crashing, so you'd need to check the logs to be sure it was stable. Good to know, though.
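For anyone wanting to do that check, a minimal sketch (assumes Linux with the proprietary driver, which reports GPU faults as "NVRM: Xid" lines in the kernel log; dmesg may need root on some distros):

```python
import subprocess

# Pull the kernel ring buffer and look for NVRM Xid reports, which the
# Nvidia driver emits on GPU faults even when the desktop recovers.
dmesg = subprocess.run(["dmesg"], capture_output=True, text=True).stdout
xid_lines = [line for line in dmesg.splitlines() if "NVRM: Xid" in line]

if xid_lines:
    print(f"{len(xid_lines)} GPU fault report(s) found:")
    print("\n".join(xid_lines))
else:
    print("No Xid errors logged -- the run looks genuinely stable.")
```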
 

exlink

Supreme [H]ardness
Joined
Dec 16, 2006
Messages
5,117
I wonder what the "fix" looks like in practice. Was it actually a driver issue, or is it doing some sort of clock manipulation?
So far I've seen one report from a known Nvidia leaker on Reddit that his 3080's max boost clock was reduced by approximately 30MHz while the power consumption of the card increased by 10W. He also indicated that despite the lower clock speed, his card managed to score higher in Time Spy. If I can find the post, I'll link it.

EDIT: He actually tweeted about it: https://twitter.com/kopite7kimi/status/1310582120121679872

Keep in mind this is ONE source and is not indicative of anything. We're going to need to see several sets of data before reaching any conclusion.
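If anyone wants to verify the before/after themselves rather than lean on one leaker, a rough sketch: log clocks and power under each driver during the same benchmark run, then compare. The filenames here are placeholders:

```python
import csv

# Each log captured during an identical benchmark run with:
#   nvidia-smi --query-gpu=clocks.gr,power.draw --format=csv,noheader,nounits -l 1 > <file>
def summarize(path):
    clocks, watts = [], []
    with open(path) as f:
        for clk, pwr in csv.reader(f):
            clocks.append(float(clk))
            watts.append(float(pwr))
    return max(clocks), sum(watts) / len(watts)

for label, log in [("old driver", "before.csv"), ("new driver", "after.csv")]:
    peak_mhz, avg_w = summarize(log)
    print(f"{label}: peak boost {peak_mhz:.0f} MHz, average draw {avg_w:.1f} W")
```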
 
Last edited:

kac77

2[H]4U
Joined
Dec 13, 2008
Messages
2,584
Sometimes Linux can recover from a GPU hardware fault without crashing, so you'd need to check the logs to be sure it was stable. Good to know, though.
Correct, Linux is far more resilient.
 

kac77

2[H]4U
Joined
Dec 13, 2008
Messages
2,584
So far I've seen one report from a known Nvidia leaker on Reddit that his 3080's max boost clock was reduced by approximately 30MHz while the power consumption of the card increased by 10W. He also indicated that despite the lower clock speed, his card managed to score higher in Time Spy. If I can find the post, I'll link it.

EDIT: He actually tweeted about it: https://twitter.com/kopite7kimi/status/1310582120121679872

Keep in mind this is ONE source and is not indicative of anything. We're going to need to see several sets of data before reaching any conclusion.
Which means the 3080 is essentially a 350W card. That is above any Titan in recent memory. I would have to say that the 3080/90, as I said before, are the true Titans of the lineup. There is some value in that, but it's no free lunch considering the power draw.
 

vegeta535

Supreme [H]ardness
Joined
Jul 19, 2013
Messages
4,827
So far I've seen one report from a known Nvidia leaker on Reddit that his 3080's max boost clock was reduced by approximately 30MHz while the power consumption of the card increased by 10W. He also indicated that despite the lower clock speed, his card managed to score higher in Time Spy. If I can find the post, I'll link it.

EDIT: He actually tweeted about it: https://twitter.com/kopite7kimi/status/1310582120121679872

Keep in mind this is ONE source and is not indicative of anything. We're going to need to see several sets of data before reaching any conclusion.
The increased power might have allowed higher sustained boost clocks.
 

kirbyrj

Fully [H]
Joined
Feb 1, 2005
Messages
26,823
Fixed by today's driver. Lock the thread up, put it in storage, we're done here.

Lol...like a driver is going to magically solve the cap issue. At best you're going to have reduced performance and/or even higher power draw.
 

vegeta535

Supreme [H]ardness
Joined
Jul 19, 2013
Messages
4,827
Lol...like a driver is going to magically solve the cap issue. At best you're going to have reduced performance and/or even higher power draw.
A 30MHz decrease in clocks is not much of a performance hit. Cards will still boost way above the advertised clocks. The caps used are irrelevant; if you buy a cheap model, don't expect the same clocks as the higher-end models, which is also mostly moot since all the 3080s/3090s have a hard time staying stable above 2000MHz. It's not like you can't overclock for better performance like you used to. Yeah, this release was rushed, and all these bugs could have been worked out if they had given the AIBs time to properly test their cards. Nvidia should know better, but they probably don't care.
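For scale, the arithmetic on a 30MHz cut (the boost clock here is an assumed typical value):

```python
# Rough upper bound on the hit from a 30 MHz reduction, assuming a typical
# ~1950 MHz observed boost clock; the real fps loss is usually smaller
# because games rarely scale 1:1 with core clock.
boost_mhz = 1950.0
loss = 30.0 / boost_mhz
print(f"Worst-case performance loss: {loss:.1%}")  # ~1.5%
```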
 
Last edited:

pek

prairie dog
Joined
Nov 7, 2005
Messages
1,301
Not sure if the link to JayzTwoCents has been posted, but here it is:


Bottom line: get the AIB boards that are over-engineered if you want a large OC. Trying to fix it in drivers by backing off the clock to where it is stable is a no-cost alternative, but not something I would accept. But that's me being me.
 

DejaWiz

Fully [H]
Joined
Apr 15, 2005
Messages
20,398
I've got a bit of a predicament...

Originally, I was shooting for either an Asus TUF or an FE, but was willing to jump on just about any 3080 that was MSRP'd up to $750 before tax/shipping.

Back on the 21st, the MSI Ventus OC popped into stock at Best Buy, and I was able to complete the transaction, but then it disappeared from my order history.

Well, I'll be... I guess the Best Buy back office folks fixed the problem, so it looks like I'm getting the MSI Ventus OC, after all.

My first new GPU in five years. Now to keep fingers crossed that it doesn't get poached while on its way.


[Screenshot of the Best Buy order confirmation: Screenshot_20201001-062010.png]


I'm still a bit leery over the whole POSCAP/MLCC thing despite the new drivers being hailed as the fix.
Guess I'll find out soon.
 

Ricky T

Limp Gawd
Joined
Nov 7, 2019
Messages
397
Well, I'll be... I guess the Best Buy back office folks fixed the problem, so it looks like I'm getting the MSI Ventus OC, after all.

My first new GPU in five years. Now to keep fingers crossed that it doesn't get poached while on its way.


[Screenshot of the Best Buy order confirmation]

I'm still a bit leery over the whole POSCAP/MLCC thing despite the new drivers being hailed as the fix.
Guess I'll find out soon.
You're going to pair a 3080 with a 3770k? And at 1080p?
 

DejaWiz

Fully [H]
Joined
Apr 15, 2005
Messages
20,398
You're going to pair a 3080 with a 3770k? And at 1080p?

Hell yes I am. Gonna be bottleneck bliss.

[reaction GIF: god_damn_right_breaking_bad.gif]

...until I gather the rest of the parts for my upcoming Zen3 build.
Have a 1440p 144Hz IPS sitting new in box since last October.
Just got my 2x16GB DDR4-3600 kit in a few days ago.
Have had a Corsair 450D new in box for pushing two years, since my original plan was to go with a 2700X build, but that got put on hold.
Still going back and forth between getting an X570 mobo soon or just waiting for the X670 when Zen3 drops.
 

Ricky T

Limp Gawd
Joined
Nov 7, 2019
Messages
397
Hell yes I am. Gonna be bottleneck bliss.

[reaction GIF]

...until I gather the rest of the parts for my upcoming Zen3 build.
Have a 1440p 144Hz IPS sitting new in box since last October.
Just got my 2x16GB DDR4-3600 kit in a few days ago.
Have had a Corsair 450D new in box for pushing two years, since my original plan was to go with a 2700X build, but that got put on hold.
Still going back and forth between getting an X570 mobo soon or just waiting for the X670 when Zen3 drops.
Is there even going to be an X670? I haven't seen a single leak or rumor in the last few months from any of the usual reliable sources.
 

DejaWiz

Fully [H]
Joined
Apr 15, 2005
Messages
20,398
Is there even going to be an X670? I haven't seen a single leak or rumor in the last few months from any of the usual reliable sources.

The rumor is that ASMedia will be manufacturing the X670 chipset. Still no bona fide confirmation, to my knowledge.
 

exlink

Supreme [H]ardness
Joined
Dec 16, 2006
Messages
5,117
Is there even going to be an X670? I haven't seen a single leak or rumor in the last few months from any of the usual reliable sources.
There isn't a lot of talk about it because it's likely going to be a minimal upgrade over the X570: probably just reduced chipset power consumption and passive cooling across the board. I'm not really sure what else they could add at this time.
 
Last edited:

vegeta535

Supreme [H]ardness
Joined
Jul 19, 2013
Messages
4,827
I know it's a non-issue and all, but I wonder if some lawyers out there will try to start a class-action lawsuit claiming reduced performance from this fix, lol. It wouldn't surprise me, even though they wouldn't have a case, since the cards still perform above the advertised speeds and there have been reports of no performance loss.
 