RTX 3090 [H] Owner's Official 3DMark Time Spy Leaderboard

Anything else open in the background that might be sucking up GPU resources? Even a bunch of Chrome tabs could affect it, let alone Photoshop, video editing, mining, deep learning, etc.
Nope, just AIDA and HWMonitor. Wonder if it's my riser or CPU overclock. Guess it's time to spend my Sunday on this.
 
First run on the 500W BIOS: 22,010 (SMT off)

Up to you if you want to record it, since 3DMark for some reason doesn't want to recognize my 3090 in the results. It recognizes it in the program itself:

[screenshot attached]


I've already done a clean driver install and reinstalled 3DMark. Don't feel like formatting just for 3DMark.

When I have more time next weekend I'll tinker with the VF curve a bit to see if I can get a couple hundred more points.

EDIT: Latest score is 22,118 after giving the memory a higher clock: +150/+1200.
 
First run on the 500W BIOS: 22,010 (SMT off)

Up to you if you want to record it, since 3DMark for some reason doesn't want to recognize my 3090 in the results. It recognizes it in the program itself:

[screenshot attached]

I've already done a clean driver install and reinstalled 3DMark. Don't feel like formatting just for 3DMark.

When I have more time next weekend I'll tinker with the VF curve a bit to see if I can get a couple hundred more points.

EDIT: Latest score is 22,118 after giving the memory a higher clock: +150/+1200.
"
The result is hidden and will not be shown for example on leaderboards or search."
 
"
The result is hidden and will not be shown for example on leaderboards or search."
Wonderful. It's probably hidden because of the Generic GPU issue, since I don't have it set as hidden myself. I can see it fine, but that's probably because it's on my account.
 
Wonderful. It's probably hidden because of the Generic GPU issue, since I don't have it set as hidden myself. I can see it fine, but that's probably because it's on my account.
My new board is also listed as Generic. Guessing the new revision just isn't in their database yet.
 
My new board is also listed as Generic. Guessing the new revision just isn't in their database yet.
Thanks for the update. Glad to hear it’s not just my card so I can cross that off as a potential issue!
 
Oh yeah, finally got to the top of the air-cooled list, and all of the other lists in this thread :D (and 13th in the world for my hardware combo)

https://www.3dmark.com/3dm/60198840?

Graphics score: 22,679

I've been trying to follow this on the EVGA forums... have they actually made a physical change to the boards now? I've never had an issue with mine, but I have not tried the 500W BIOS yet either, so maybe it would show up if I did.

Also, was that with ReBAR on or off?
 
I've been trying to follow this on the EVGA forums... have they actually made a physical change to the boards now? I've never had an issue with mine, but I have not tried the 500W BIOS yet either, so maybe it would show up if I did.

Also, was that with ReBAR on or off?
The boards have some sort of revision. The most apparent is the NCP81610 voltage controller upgrade. ReBar is fully enabled as far as I can tell. PX1 shows all green check marks.
How did you fix the Generic GPU issue in 3DMark? I haven't run benchmarks since I discovered it, as it felt pointless not being able to share results.
I'm running the KPE 520W ReBAR BIOS.
 
The boards have some sort of revision. The most apparent is the NCP81610 voltage controller upgrade. ReBar is fully enabled as far as I can tell. PX1 shows all green check marks.

I'm running the KPE 520W ReBAR BIOS.
Interesting... Also, are you on water? Or did you somehow keep that thing under 60C on air using that much power? That's quite impressive.
 
Interesting... Also, are you on water? Or did you somehow keep that thing under 60C on air using that much power? That's quite impressive.
I had my window open. It's only air cooled for now. Some time this year my Optimus block should arrive.
 
Well, I moved up a spot! lol. I did not gain much though... Installed the 500W BIOS, but it would appear I do have the issue: it topped out around 475W, but PCIe was pulling 78.3W at times. Probably not going to hurt anything; debating whether I should enter the replacement program or not.

EVGA 3090 FTW3 Ultra with 119% power slider and +150 GPU / +1000 mem. Air cooling.

21570 Graphics Score

http://www.3dmark.com/spy/19442324

I did gain a decent amount in Port Royal though; got 14389 in that!

http://www.3dmark.com/pr/973439
 
I do have the issue... topped out around 475W, but PCIe was pulling 78.3W at times. Probably not going to hurt anything; debating whether I should enter the replacement program or not.
I have yet to see anyone who received a replacement card end up performing worse in any situation. The only downside would be if you didn't have a secondary card for the two weeks it will likely take for shipping. Further, the cost of shipping it back is on you, which also sucks since it indirectly makes the card like $100 more expensive. I did two-day shipping with $2,000 of insurance and it was $117 from the Midwest to California via second-day air. I didn't want it shipped ground in case it was damaged.
 
I have yet to see anyone who received a replacement card end up performing worse in any situation. The only downside would be if you didn't have a secondary card for the two weeks it will likely take for shipping. Further, the cost of shipping it back is on you, which also sucks since it indirectly makes the card like $100 more expensive. I did two-day shipping with $2,000 of insurance and it was $117 from the Midwest to California via second-day air. I didn't want it shipped ground in case it was damaged.
Eh, TBH, not sure it is really worth the hassle then... Technically the card works perfectly up to 450W (which is what EVGA guarantees anyway) and I never pull more than 75W on the PCIe at the 107% power slider (450W). I only pull more when trying to push 500W, hitting 78W and change at 475~480W.

It made some difference in synthetic benchmarks, but when I ran some game benchmarks, I gained maybe 0.5~1.0 FPS tops across SOTTR, CP2077, RDR2, and BFV, all at 4K with RT on (in the games that have it)... not even noticeable, honestly.
 
Eh, TBH, not sure it is really worth the hassle then... Technically the card works perfectly up to 450W (which is what EVGA guarantees anyway) and I never pull more than 75W on the PCIe at the 107% power slider (450W). I only pull more when trying to push 500W, hitting 78W and change at 475~480W.

It made some difference in synthetic benchmarks, but when I ran some game benchmarks, I gained maybe 0.5~1.0 FPS tops across SOTTR, CP2077, RDR2, and BFV, all at 4K with RT on (in the games that have it)... not even noticeable, honestly.
The issue is that you're leaving performance on the table since you are already power limited. I'd look at your clocks instead of power draw. I wasn't able to really sustain 1950MHz before I swapped, and now I'm holding 2100MHz regularly. The other side of it is that the first revision has a serious issue with red lights of death and black screens. The EVGA forums are filled with dead cards, though there's a selection bias inherent in looking at that kind of data. I luckily still had a 1080 Ti hanging around, so I didn't really mind not having the card for a bit.
 
The issue is that you're leaving performance on the table since you are already power limited. I'd look at your clocks instead of power draw. I wasn't able to really sustain 1950MHz before I swapped, and now I'm holding 2100MHz regularly. The other side of it is that the first revision has a serious issue with red lights of death and black screens. The EVGA forums are filled with dead cards, though there's a selection bias inherent in looking at that kind of data. I luckily still had a 1080 Ti hanging around, so I didn't really mind not having the card for a bit.
I mean, I still have my 2080 Ti so I could survive a few weeks just fine, even more so with spring nearly here.

It depends on the game for me as far as clocks go... in BFV I average around 1950MHz or so... in CP2077 and COD:CW it is 2025MHz... in SOTTR I will average between 1995MHz~2040MHz. I just don't know how much that 2100MHz+ would translate to real-world gaming FPS, for all the hassle.

Seems stupid that I would have to pay shipping on a defective product though, and it still seems like EVGA has not admitted shit or even stated what exactly they have changed. I do not think I have a "first" revision card though, as I got it in December and it has the black trim, but part of me really does hate leaving performance on the table.

I suppose if I got the red lights of death, I would have to RMA and would get a new card anyway?
 
I suppose if I got the red lights of death, I would have to RMA and would get a new card anyway?
And you'd still have to pay to ship it to them anyway. The big thing for me was that I didn't want to deal with the thermal pads and all that stuff once I get my waterblock, so I wanted the revised card before I tore mine apart.
 
And you'd still have to pay to ship it to them anyway. The big thing for me was that I didn't want to deal with the thermal pads and all that stuff once I get my waterblock, so I wanted the revised card before I tore mine apart.
From what you can tell, did they ship you a brand new card? Or is it a refurbished card? I suppose it does not matter too much if it works, but something about spending $1800 on a video card makes me want to have something brand new.
 
From what you can tell, did they ship you a brand new card? Or is it a refurbished card? I suppose it does not matter too much if it works, but something about spending $1800 on a video card makes me want to have something brand new.
It's a brand new card in a full retail box. As far as I'm aware, the cards in the retail channel aren't the revised ones yet, though I haven't followed too closely. There were complaints that people had bought cards AFTER the announcement of the swap program and then found out they'd have to swap to fix the issues with their less-than-a-week-old cards.
 
It's a brand new card in a full retail box. As far as I'm aware, the cards in the retail channel aren't the revised ones yet, though I haven't followed too closely. There were complaints that people had bought cards AFTER the announcement of the swap program and then found out they'd have to swap to fix the issues with their less-than-a-week-old cards.
That's good to know and interesting as well. I'm curious how they plan to handle this. I also wonder if it's more like: if you never notice or see the forums, then EVGA is never going to tell you anyway, since your old card works as far as you know (if you're not into benching or monitoring). Seems silly, as I saw something saying all newer cards will be shipping with the 500W XOC BIOS.

You would think EVGA has some liability if these things are going past PCIe spec, although I imagine 75W is not the absolute electrical max.

BTW, can you post (or PM) the link to the thread with instructions on how to apply for this specific card swap program?
 
That's good to know and interesting as well. I'm curious how they plan to handle this. I also wonder if it's more like: if you never notice or see the forums, then EVGA is never going to tell you anyway, since your old card works as far as you know (if you're not into benching or monitoring). Seems silly, as I saw something saying all newer cards will be shipping with the 500W XOC BIOS.

You would think EVGA has some liability if these things are going past PCIe spec, although I imagine 75W is not the absolute electrical max.

BTW, can you post (or PM) the link to the thread with instructions on how to apply for this specific card swap program?
To swap the board you need the 3090 FTW3U (part number 24G-P5-3987-KR). Email [email protected] and say you may have a card which is affected by the power draw issues as detailed in the forum. They should ask you to verify by running a benchmark or game with GPU-Z open to show the issue. I found Time Spy is the best way to demonstrate the issue, or at least it was the most consistent for me. Oh yeah, I believe you need to be running the XOC 500W BIOS as well.

I totally agree with the sentiment that this card SHOULD be recalled. The problem is that it performs juuuust well enough that people aren't noticing the issues. I bet shit really hits the fan when the 3080 Ti comes out and EVGA's own cards outperform the higher-end 3090s affected by this issue. I truly believe the cards dying are a direct result of this faulty board design. I think this is a niche enough product that they won't have to issue a recall like they should.
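If you want to make the evidence easy to summarize, GPU-Z's "Log to file" option writes a CSV of every sensor, and a quick sketch like the one below can pull out the peak slot draw. The column name "PCIe Slot Power (W)" is an assumption on my part and may differ between GPU-Z versions, so check your log's header row.

```python
import csv

def max_slot_power(log_path, column="PCIe Slot Power (W)"):
    """Return the peak PCIe slot power from a GPU-Z sensor log.

    The column name is a guess; check the header row of your log,
    since GPU-Z versions label sensors slightly differently.
    """
    peak = 0.0
    with open(log_path, newline="") as f:
        reader = csv.DictReader(f)
        # GPU-Z pads header names with spaces, so strip them first.
        reader.fieldnames = [name.strip() for name in reader.fieldnames]
        for row in reader:
            try:
                peak = max(peak, float(row[column]))
            except (KeyError, TypeError, ValueError):
                continue  # skip malformed or truncated rows
    return peak
```

Anything sustained above the 75W slot spec in the output is exactly the kind of proof they ask for.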
 
In fact, every swap I've done for this card resulted in a brand new card in a full retail box.
You got lucky. When the problem first arose and I did the RMA with them, I got a used/refurb 3080 FTW3U back where they'd slapped a new serial sticker over the old one, and it came all scratched up.
 
You got lucky. When the problem first arose and I did the RMA with them, I got a used/refurb 3080 FTW3U back where they'd slapped a new serial sticker over the old one, and it came all scratched up.
Yeahhh... these are the types of stories I worry about. I finally found the section on the EVGA forums about this, and a few people have worse cards now than the ones they shipped out!

To swap the board you need the 3090 FTW3U (part number 24G-P5-3987-KR). Email [email protected] and say you may have a card which is affected by the power draw issues as detailed in the forum. They should ask you to verify by running a benchmark or game with GPU-Z open to show the issue. I found Time Spy is the best way to demonstrate the issue, or at least it was the most consistent for me. Oh yeah, I believe you need to be running the XOC 500W BIOS as well.

I totally agree with the sentiment that this card SHOULD be recalled. The problem is that it performs juuuust well enough that people aren't noticing the issues. I bet shit really hits the fan when the 3080 Ti comes out and EVGA's own cards outperform the higher-end 3090s affected by this issue. I truly believe the cards dying are a direct result of this faulty board design. I think this is a niche enough product that they won't have to issue a recall like they should.
Thanks, I'll email them I think just to see where this leads.

And yeah, I am on the 500W BIOS now. I actually do NOT have the problem on the 450W BIOS, and I can hit 450W with PCIe at 75W max. It only draws more on the PCIe after pushing past 450W toward 500W. So my board is probably on the fringe of the issue (some people seemed to have it even on the 450W BIOS, unable to hit 450W).

As for performance vs. a 3080 Ti, it's hard to say until we have true specs... I believe those cards are still going to have fewer RT cores, fewer shaders, fewer tensor cores, and less VRAM... BUT if those cards are allowed to PUSH the power settings, the raw speed could bump them very close to the 3090s, so it will be interesting to see how this plays out.
 
I mean, I still have my 2080 Ti so I could survive a few weeks just fine, even more so with spring nearly here.

It depends on the game for me as far as clocks go... in BFV I average around 1950MHz or so... in CP2077 and COD:CW it is 2025MHz... in SOTTR I will average between 1995MHz~2040MHz. I just don't know how much that 2100MHz+ would translate to real-world gaming FPS, for all the hassle.

Seems stupid that I would have to pay shipping on a defective product though, and it still seems like EVGA has not admitted shit or even stated what exactly they have changed. I do not think I have a "first" revision card though, as I got it in December and it has the black trim, but part of me really does hate leaving performance on the table.

I suppose if I got the red lights of death, I would have to RMA and would get a new card anyway?

Well, I just ran the RDR2 benchmark the other day at 390W and 500W. The average went from 47 to 49 FPS. And that's more than you will gain, obviously, since you're already at 450W or more.
 
Well, I just ran the RDR2 benchmark the other day at 390W and 500W. The average went from 47 to 49 FPS. And that's more than you will gain, obviously, since you're already at 450W or more.

Yeah, seeing as some people are complaining they are getting 3080 Ti dies relabeled as 3090s with worse overclocking, I'm not so sure it's worth exchanging this card and going through the hassle. Add in time and shipping: is that really worth 2~3% more performance, tops (if that)?

On the left (or top) is my card set at 107% (450W)... it can hit that and is usually really close; note the PCIe max is not even at 75W. That is why I originally did not think my card had the power issues so many had been reporting; it works perfectly with the factory 450W BIOS. On the right (or bottom) my card is set at 119% (500W XOC BIOS)... it tops out around 485W, but the PCIe will spike to a freaking 79W+! Not sure if that can hurt anything over time or not; it's not sustained, but it will average about 76~77W at the 500W setting.

Clearly the regulator sucks, but I don't think I have it as bad as others who can't even break 420W tops no matter what BIOS they use.

Both tests were run in Time Spy Extreme (as I game at 4K). By using 500W over 450W, I gained 100 points in my graphics score (which was about 10,930); that works out to about a 1% performance gain, rounding up, so at 4K resolution MAYBE 1 FPS, assuming I am hitting 100 FPS in a game.
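For what it's worth, that back-of-the-envelope math checks out; here's a minimal sketch, where the 100 FPS baseline is just the hypothetical figure from the post and FPS is assumed to scale roughly linearly with graphics score:

```python
# Translate a Time Spy Extreme graphics-score delta into an estimated
# FPS delta, assuming FPS scales roughly linearly with graphics score.
baseline_score = 10930   # graphics score at 450W
score_gain = 100         # extra points from running 500W

gain = score_gain / baseline_score   # fractional gain, ~0.9%
baseline_fps = 100                   # hypothetical game running at 100 FPS
fps_gain = baseline_fps * gain       # ~0.9 FPS

print(f"{gain:.2%} gain -> about {fps_gain:.1f} extra FPS at {baseline_fps} FPS")
```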

[GPU-Z screenshots attached]
 
Yeah, seeing as some people are complaining they are getting 3080 Ti dies relabeled as 3090s with worse overclocking, I'm not so sure it's worth exchanging this card and going through the hassle. Add in time and shipping: is that really worth 2~3% more performance, tops (if that)?

On the left (or top) is my card set at 107% (450W)... it can hit that and is usually really close; note the PCIe max is not even at 75W. That is why I originally did not think my card had the power issues so many had been reporting; it works perfectly with the factory 450W BIOS. On the right (or bottom) my card is set at 119% (500W XOC BIOS)... it tops out around 485W, but the PCIe will spike to a freaking 79W+! Not sure if that can hurt anything over time or not; it's not sustained, but it will average about 76~77W at the 500W setting.

Clearly the regulator sucks, but I don't think I have it as bad as others who can't even break 420W tops no matter what BIOS they use.

Both tests were run in Time Spy Extreme (as I game at 4K). By using 500W over 450W, I gained 100 points in my graphics score (which was about 10,930); that works out to about a 1% performance gain, rounding up, so at 4K resolution MAYBE 1 FPS, assuming I am hitting 100 FPS in a game.

79W isn't an issue. The biggest risk is blowing the fuse your card has on the PCIe slot input, but at least the 1st-gen version of the card has a fuse that shouldn't blow until 120W (10 amps). Although I saw a post on another forum suggesting the new version of the card may have a smaller fuse (maybe 8 amps, which would be 96W; I can't find the post).
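The amp-to-watt conversion behind those figures is just P = I × V on the slot's 12V rail; a quick sanity check (the 8A figure for the revised board is the unconfirmed hearsay mentioned above):

```python
# Fuse rating (amps) to power limit (watts) on the PCIe slot's 12V rail.
# The 10A figure is the 1st-gen board's fuse; the 8A figure for the
# revised board is unconfirmed hearsay, as noted above.
RAIL_V = 12.0
SLOT_SPEC_W = 75.0  # PCIe CEM spec limit for x16 slot power

for amps in (10, 8):
    fuse_w = amps * RAIL_V
    print(f"{amps}A fuse -> {fuse_w:.0f}W, "
          f"{fuse_w - SLOT_SPEC_W:.0f}W above the {SLOT_SPEC_W:.0f}W spec")
# -> 10A fuse -> 120W, 45W above the 75W spec
# -> 8A fuse -> 96W, 21W above the 75W spec
```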
 
Yeah, seeing as some people are complaining they are getting 3080 Ti dies relabeled as 3090s with worse overclocking, I'm not so sure it's worth exchanging this card and going through the hassle. Add in time and shipping: is that really worth 2~3% more performance, tops (if that)?
Oh no, not you too.
 
79W isn't an issue. The biggest risk is blowing the fuse your card has on the PCIe slot input, but at least the 1st-gen version of the card has a fuse that shouldn't blow until 120W (10 amps). Although I saw a post on another forum suggesting the new version of the card may have a smaller fuse (maybe 8 amps, which would be 96W; I can't find the post).
If you find it, let me know. I have a later revision of the card, so probably 96W. Still, it sounds like I am well below that limit. The card overall does perform quite nicely. Even so, how were people blowing these fuses to begin with? I feel like your bus is in danger pulling as high as 96W; 120W though? Yikes! The fuses sound over-spec'd, unless the PCIe bus is capable of more power draw than we are led to believe... but I doubt it's that high. Maybe +10% tops, I'd think.

Maybe that's where the red light of death was coming from? Has that ever been figured out?


Oh no, not you too.
LOL... I know they are 3090 dies and NVIDIA re-enabled all the features. I think the real debate is whether NVIDIA binned them any differently in terms of speed so as not to outdo the 3090, if only by a slim margin. I imagine these dies were the original 3080 Tis that were supposed to have the same counts on all cores and features, with only the memory bus width and amount different.
 
LOL... I know they are 3090 dies and NVIDIA re-enabled all the features. I think the real debate is whether NVIDIA binned them any differently in terms of speed so as not to outdo the 3090, if only by a slim margin. I imagine these dies were the original 3080 Tis that were supposed to have the same counts on all cores and features, with only the memory bus width and amount different.
There are only 3 bins per part number. 10% are bin '2', which go into the HOF/KPE/Quadro/etc. 60% are bin '1', which is the FE/AIB/etc. 30% are bin '0' for budget cards, OEM implementations, etc. Bins are assigned by part number, not feature level. Feature level is determined by product demand and can be changed before implementation onto a board. So you weren't in a top bin anyway, and you certainly weren't in the bottom bin.
 
There are only 3 bins per part number. 10% are bin '2', which go into the HOF/KPE/Quadro/etc. 60% are bin '1', which is the FE/AIB/etc. 30% are bin '0' for budget cards, OEM implementations, etc. Bins are assigned by part number, not feature level. Feature level is determined by product demand and can be changed before implementation onto a board. So you weren't in a top bin anyway, and you certainly weren't in the bottom bin.
Informative... thanks!
 
If you find it, let me know. I have a later revision of the card, so probably 96W. Still, it sounds like I am well below that limit. The card overall does perform quite nicely. Even so, how were people blowing these fuses to begin with? I feel like your bus is in danger pulling as high as 96W; 120W though? Yikes! The fuses sound over-spec'd, unless the PCIe bus is capable of more power draw than we are led to believe... but I doubt it's that high. Maybe +10% tops, I'd think.

Maybe that's where the red light of death was coming from? Has that ever been figured out?



LOL... I know they are 3090 dies and NVIDIA re-enabled all the features. I think the real debate is whether NVIDIA binned them any differently in terms of speed so as not to outdo the 3090, if only by a slim margin. I imagine these dies were the original 3080 Tis that were supposed to have the same counts on all cores and features, with only the memory bus width and amount different.

I'm not sure how many people have. That's just what everyone always warns of. And it's from people running way more power than the card was designed for, from either shunt modding, running an XOC BIOS, etc... I suspect most boards can handle 120W or they would not have gone with a 10-amp fuse in the first place, but I could be wrong. Not unlike the 8-pin connectors that have a spec wattage of 150W; they can handle closer to 300W. I've run all of my TS runs in this thread recently using the KP XOC BIOS on a 2x8-pin Zotac 3090 pulling 500+ watts, although PCIe never goes above 80W. I guess Zotac got that right. lol (sorry, couldn't resist)
 
I'm not sure how many people have. That's just what everyone always warns of. And it's from people running way more power than the card was designed for, from either shunt modding, running an XOC BIOS, etc... I suspect most boards can handle 120W or they would not have gone with a 10-amp fuse in the first place, but I could be wrong. Not unlike the 8-pin connectors that have a spec wattage of 150W; they can handle closer to 300W. I've run all of my TS runs in this thread recently using the KP XOC BIOS on a 2x8-pin Zotac 3090 pulling 500+ watts, although PCIe never goes above 80W. I guess Zotac got that right. lol (sorry, couldn't resist)
LOL... yeah, EVGA clearly fucked up on the power delivery aspect of the FTW3U card. It runs cool on the stock cooler, so even 500W is a breeze for it (if it can hit it). Sadly, it would appear about 480W is the most I can get out of it (when pushing it hard) because the PCIe hits close to 80W. Clearly EVGA tried to save a few bucks on a cheap component somewhere, since those three 8-pin connectors can pull so much more; why draw from the bus?!
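To put the "why draw from the bus" gripe in numbers: even at spec, three 8-pin connectors plus the slot cover a 500W BIOS with room to spare. The ~300W practical per-connector figure is the informal one mentioned earlier in the thread, not a spec value:

```python
# Nominal input power budget for a triple 8-pin card like the FTW3.
EIGHT_PIN_SPEC_W = 150       # PCI-SIG spec per 8-pin PEG connector
EIGHT_PIN_PRACTICAL_W = 300  # informal real-world capacity cited above
SLOT_SPEC_W = 75             # PCIe x16 slot spec

spec_budget = 3 * EIGHT_PIN_SPEC_W + SLOT_SPEC_W            # 525W
practical_budget = 3 * EIGHT_PIN_PRACTICAL_W + SLOT_SPEC_W  # 975W
print(f"at spec: {spec_budget}W, practical: {practical_budget}W")
```

So a 500W BIOS never needs more than spec from the slot if the load balancing is done right; pulling ~80W there looks like a board-design choice, not a necessity.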

In the end though; I gained a trivial amount of performance running 500W. In some games where it does not push past 470W, it actually gained me higher boost clocks.

Played some COD:CW for around 1.5 hours today and these were the maximum usages I got... clearly not 500W, but my FPS at 4K (fully maxed with RT on, BTW) was always 110 or higher... so maybe the card just did not need to push itself too hard; it looks like I almost never hit the power limit, so I guess that helps some! :)

This is of course just one game; as I posted previously, Timespy pushed things pretty high on the 4K test.

I guess the real question that no one seems to be able to answer is: is 80W on the PCIe actually going to hurt anything if you have a quality motherboard and GPU?

[GPU-Z screenshot attached]
 
You're just sitting on the voltage limit the whole time, which doesn't tell you how high the power will go. Maybe you don't want to push your PCIe power any further, but if you want to see how high it will go, install Quake 2 RTX from Steam. If it doesn't get to 500W with that, it never will.
 
You're just sitting on the voltage limit the whole time, which doesn't tell you how high the power will go. Maybe you don't want to push your PCIe power any further, but if you want to see how high it will go, install Quake 2 RTX from Steam. If it doesn't get to 500W with that, it never will.
Yeah, BFV with RT on Ultra at 4K also gets me to about 480W and will push the PCIe to about 78~79W. COD is clearly really well optimized, or its RT is not that demanding at 4K.

Can't say I want to fry anything though, so I have been monitoring games with GPU-Z... for example, I can bump it up in COD to ensure voltage is the limit (it will go over 450W), which gives me more boost. BFV, on the other hand, pushes the PCIe power higher, and the extra boost is not really worth the possible issues with too much current draw, so I now play that game at the 450W limit (which keeps PCIe under 75W).

Seems silly, I know... but unless I go through the hassle of a return/exchange with EVGA, it's my "safe" option... and even then, is it really going to be worth maybe 1~2 FPS tops... even if I were to luck out with a nice overclocker again with better power delivery?
 
I guess the real question that no one seems to be able to answer is: is 80W on the PCIe actually going to hurt anything if you have a quality motherboard and GPU?
The mobo should be fine. The real problem is the headroom for power. At 80W you are already at the power limit, and once a single power limit is hit, the entire card is power limited. The new boards give you 20W of headroom at stock on PCIe alone.
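That "one rail caps the whole card" behavior amounts to taking the worst-case utilization across all monitored rails; here's a toy sketch with made-up draws and limits (the rail names and numbers are purely illustrative):

```python
# The card throttles as soon as ANY monitored rail reaches 100% of its
# limit, regardless of how much headroom the other rails still have.
rails = {                     # name: (draw_W, limit_W) -- illustrative
    "pcie_slot": (80.0, 80.0),
    "8pin_1":    (140.0, 150.0),
    "8pin_2":    (138.0, 150.0),
    "8pin_3":    (141.0, 150.0),
}

utilization = {name: draw / limit for name, (draw, limit) in rails.items()}
limiting_rail = max(utilization, key=utilization.get)
print(limiting_rail, f"{utilization[limiting_rail]:.0%}")
# -> pcie_slot 100%
```

Here the slot rail alone power-limits the card even though each 8-pin still has ~10W of headroom, which is exactly the situation the old boards put you in.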
 