Show your FIRE STRIKE ULTRA scores

Should this benchmark competition include multi-core processors with more than 4 physical cores?


  • Total voters
    16
  • Poll closed.
Thread isn't dead, just taking a nap. :D

64gb of RAM!?... uh... massive video editing? Or do you just like having 200 apps open at the same time? :)

Sounds like you may want to upgrade your radiator and/or fans if your case is heating up... Since it looks like your cards and CPU are waterblocked, you really shouldn't be having any issues with heat in your case. Perhaps reexamine your case fans and airflow?

Oh, and don't forget to update the RAM in your sig as well!
Yeah, I can't see the whole thing but it looks like the entire loop is going through that single 140mm radiator on the back. Adding another fan to it for push-pull might help a little, but you really need more surface area to cool a loop with a pair of Titan X attached to it.

Actually, looks like there is a radiator on the top. It didn't look like there was enough space for one at that angle. Might just be a skinny one.
 


Ya, there are 3 rads in there: 360mm up top, 420mm in front, and 120mm in rear. So there is quite a bit of surface area.

But I use super low RPM fans (700 rpm or so) to keep the system silent.

The actual components with water blocks on them, i.e. the GPUs and CPU, stay very consistent temp-wise during marathon gaming. But what does happen is the whole "hot box" effect, where the interior of the case heats up like an oven. This doesn't mess with the temps of the water-blocked components, but it does raise the temps of everything else: RAM, mobo, MOSFETs.

So I've had some crashes after an hour or so of a heavy-hitting game like Elite Dangerous (at 3440 x 1440 and 100 fps!) because one of the non-water-cooled parts went above a reasonable temp.

Anyway here's to a thread back from the dead!
 
skypine27, with the super low fan RPMs you are using, have you considered doubling up fans in a push/pull config? Also, with the fans you currently have, are they pushing air out of your case through the radiators, or are they pulling air in? If it's the latter, that would definitely contribute to your case getting heat-soaked over time. It sounds like you just need to tweak your internal case airflow a bit - having all the radiator fans blow outwards would help.
 
Supercharged:

I won't continue to thread-hijack (let's get back to Fire Strike Ultra scores), but most of my fans are intake, so the case does turn into an oven over marathon gaming sessions. Everything seems stable recently, though.

POST YOUR SCORE!
 
Well, I get a 5061 score without my SLI bridge installed as seen here:

firestrikeultra04.PNG


My score goes up to 8739 with the SLI bridge installed, but Ultra then gives me a validation warning as the graphics cards drop from the system information detail.

firestrikeultra05.PNG


Any idea what might be causing that? CPU and GPUs are all on water and I'm running mild overclocks so far.

firestrikeultra06.PNG
 
Ok, switched from using Asus AI to overclock the CPU and RAM to doing it manually in the BIOS. That apparently took care of the issue with Fire Strike Ultra not seeing the GPUs.

So here's my valid score:

firestrikeultra8879.PNG


CPU 4.6GHz @ 1.430V
RAM @ 3333MHz @ 1.35V
GPUs @ 1480MHz @ +50mV
GPU RAM @ 8GHz

Rig:

onwater07.JPG


onwater08.JPG
 
What does the time measurement message mean? How do I fix it?
  • Time measuring inaccurate
    This message indicates funny business with the system clock during benchmark run. In most cases, this means that, no, you cannot cheat in 3DMark by adjusting Windows timers during the benchmark run or otherwise tampering with the measurements done by the benchmark. If this message persists and you have not done anything out of the ordinary, it may indicate a hardware issue with the real time clock in your system or the presence of a background program that somehow twists the time-space continuum of your operating system in such a way that this anti-cheat detection is tripped.
From: Troubleshooting 3DMark and PCMark results | Futuremark

BIOS current? System clock set? Twisting the time-space continuum by using 4 Titan-X's in a single rig? Other than that, drawing a blank here... Nice score BTW!
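Futuremark doesn't publish the exact mechanism, but the general idea behind this kind of anti-cheat check is comparing the wall clock (which a user can adjust) against a monotonic clock (which they can't). A minimal Python sketch of that idea - purely illustrative, not 3DMark's actual check, and the function name and 0.5 s tolerance are made up:

```python
import time

def run_with_clock_check(benchmark, tolerance=0.5):
    """Run a workload and cross-check wall-clock vs. monotonic elapsed time.

    If the system clock is adjusted mid-run, the wall-clock delta diverges
    from the monotonic delta and the result is flagged as invalid.
    Illustrative only -- not Futuremark's actual detection code.
    """
    wall_start = time.time()        # adjustable system clock
    mono_start = time.monotonic()   # immune to clock changes
    result = benchmark()
    wall_elapsed = time.time() - wall_start
    mono_elapsed = time.monotonic() - mono_start
    return result, abs(wall_elapsed - mono_elapsed) < tolerance
```

A flaky RTC or a background utility that nudges the system clock would trip a check like this even on an honest run, which fits the "hardware issue or background program" explanation in the quote above.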
 
I've had the time measurement error a few times myself. No idea what causes it.

I just re-run 3DMark immediately after the report with the error, and it's almost always gone the 2nd run.
 
DarthBeavis, an unstable CPU OC will give those time measurement errors. And judging by your physics score, I think something isn't clicking there. You're giving up a lot on your overall score.

In other news, you guys need to OC those TXs!

I'm ordering 2 x 1080s on launch day from the Nvidia store, so I have kinda given up on OC'ing my Titan Xs :)
 
Looks like most of us destroy a single GTX 1080. Won't be upgrading until non-reference cards are released. Probably August - reference sucks!


 
Ya, it looks like the reference 1080 would be more of a side-grade for those of us with OC'ed 980Ti's. Definitely going to sit this one out until a 1080Ti arrives with HBM2 memory... most likely late 4th quarter or early 1st quarter next year.

The reference 1080 looks to be pretty lackluster in overall gaming performance, at least when compared to a nicely OC'ed 980Ti (like only a 10% performance gain). Will be interesting to see some real benchmarks/tests once the NDAs lift.
 

I am curious to know what SLI will offer for the GTX 1080 at 4K. It might be the first pair of cards to finally pull off a steady 60+ FPS in 4K gaming in SLI.

Anyway, I couldn't care less until there is a 144Hz G-Sync IPS 4K monitor out. Until then I just don't see the need for this card, because I know for sure the 1080 will not do 144 FPS at 4K, or even 80 FPS, in gaming. I would be surprised if it did. Other than that, I think what we really need is a new Intel processor with better IPC than Skylake to satisfy my needs right now.

I'm sure the 1080 will be great for laptops, but that's about it until we see non-reference cards with two 8-pin connectors. I have a feeling these reference cards will suck for overclocking, and I also think they will only be a 5-10 percent improvement over an overclocked non-reference Ti.

I will upgrade once new mobos come out with this new SLI bridge and better PCI-E lanes. Maybe NVLink? I don't know, but I am sure as hell not paying for this stupid new SLI HB cable.

"
According to EVGA, NVIDIA will not support three- and four-way SLI on the GeForce GTX 1080. They state that, even if you use the old, multi-way connectors, it will still be limited to two-way. The new SLI connector (called SLI HB) will provide better performance “than 2-way SLI did in the past on previous series”. This suggests that the old SLI connectors can be used with the GTX 1080, although with less performance and only for two cards.
This is the only hard information that we have on this change, but I will elaborate a bit based on what I know about graphics APIs. Basically, SLI (and CrossFire) are simplifications of the multi-GPU load-balancing problems such that it is easy to do from within the driver, without the game's involvement. In DirectX 11 and earlier, the game cannot interface with the driver in that way at all. That does not apply to DirectX 12 and Vulkan, however. In those APIs, you will be able to explicitly load-balance by querying all graphics devices (including APUs) and split the commands yourself.

Even though a few DirectX 12 games exist, it's still unclear how SLI and CrossFire will be utilized in the context of DirectX 12 and Vulkan. DirectX 12 has the tier of multi-GPU called “implicit multi-adapter,” which allows the driver to load balance. How will this decision affect those APIs? Could inter-card bandwidth even be offloaded via SLI HB in DirectX 12 and Vulkan at all? Not sure yet (but you would think that they would at least add a Vulkan extension). You should be able to use three GTX 1080s in titles that manually load-balance to three or more mismatched GPUs, but only for those games.

If it relies upon SLI, which is everything DirectX 11, then you cannot. You definitely cannot".


With this news, I think I will wait until new mobos come with the SLI cable and possibly better-performing PCI-E lanes. Not impressed with the specs of the new GTX 10xx series reference cards.
Probably best to wait until Dec 2016 for us 980Ti owners, or sell the cards before May 27. I am sure this is great news for 970 owners or AMD owners who want to upgrade in June. Good luck to anyone who thinks they will see non-reference 1080 cards by August.
 
Nvidia's new Pascal GPU is aimed at super computing, packing 15.3-billion transistors, 16-nm FinFET technology and HBM 2.0 memory. Meet the Tesla P100.
During its big Tuesday keynote at GTC 2016, Nvidia announced the Tesla P100 GPU. Based on the new Pascal architecture, which uses 16nm FinFET technology, it packs in 15.3 billion transistors, uses HBM2 memory, and steps up performance with some vigor. The chip is 600 mm² and connects to the memory with 4,000 wires (Maxwell has 384).

It also includes NVLink, a new interconnect capable of 160 GB/s. NVLink is what Nvidia will use to connect the graphics card to the system instead of PCI-Express, which would not offer enough bandwidth. Nvidia claimed a 5x performance improvement with its new interconnect.


The Tesla P100 has 16 GB of CoWoS (Chip-on-Wafer-on-Substrate) 3D-stacked HBM2 memory with an enormous 720 GB/s of memory bandwidth, and a new unified memory architecture. The chip has 14 MB of register files with an aggregate throughput of 80 TB/s, along with 4 MB of L2 cache.
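The headline numbers roughly check out with back-of-the-envelope math. A quick Python sanity check - the per-pin rates and PCIe encoding figures below are my own assumptions from published specs, not from the article:

```python
# PCIe 3.0 x16: 8 GT/s per lane, 16 lanes, 128b/130b encoding (per direction).
pcie3_x16_one_way = 8e9 * 16 * (128 / 130) / 8 / 1e9   # ~15.75 GB/s
pcie3_x16_bidir = 2 * pcie3_x16_one_way                # ~31.5 GB/s aggregate

nvlink_aggregate = 160.0                               # GB/s, as quoted above
ratio = nvlink_aggregate / pcie3_x16_bidir             # ~5.1x -> Nvidia's "5x" claim

# HBM2 on P100: 4 stacks x 1024-bit bus at ~1.4 Gb/s per pin (assumed rate).
hbm2_bw = 4 * 1024 * 1.4e9 / 8 / 1e9                   # ~717 GB/s, i.e. the "720 GB/s"
print(f"{ratio:.1f}x, {hbm2_bw:.0f} GB/s")
```

So the "5x" NVLink claim only holds against bidirectional PCIe 3.0 x16 aggregate bandwidth, and the 720 GB/s figure falls straight out of the 4096-bit-wide HBM2 bus.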



I want an HBM2 Nvidia card, not GDDR5X! I have a feeling this is going to be a boring release for us 980Ti owners.

Btw, the new SLI news for the 1080-series cards was leaked yesterday by an Nvidia employee, then deleted a few hours later on the GeForce forums.
The NDA lifts on May 17.
 
Not sure if you guys have seen this info but:

Any "newer" LED SLI bridge supports the new high-bandwidth SLI tech. It's only the single, cheap ribbon cables that come with mobos that don't. There isn't any new tech in the new SLI bridge itself; the new tech is in the cards, which allows them to use both prongs to talk to each other in 2-way SLI. So all those LED-type 2-way bridges (Asus ROG, EVGA, and Nvidia) already connect both prongs on both cards in 2-way SLI. I forget what website I just read this on (this info was released today), but Nvidia directly stated the above.

So you don't have to wait for any new mobo to support this new high-bandwidth SLI. Nor do you have to order that Batman-looking new SLI bridge from Nvidia. You just need to snag any current nice-looking LED bridge that uses both tabs on both cards.

While one poster mentioned that most of us are already beating the 1080, keep in mind most of us are running SLI and comparing it to a single 1080.

2 X 1080s in SLI is going to destroy anything we have here in this thread.

Even having just said that, I'm not going with reference cards this time. Most websites are saying that it's the single 8-pin power connection that is limiting the overclocking of the 1080. A German site has even already tested it with a better aftermarket (air) cooler and couldn't get any higher clock speeds. Again, they are saying it hits the max power draw that a single 8-pin plus the PCI-E slot is capable of before it temps out. And that's on air. It's going to be an even bigger issue for water.

I'm waiting for the aftermarket boards that have 2 x 8-pin power connectors on them, and then snagging two of them.
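The power-limit argument above comes straight from the connector ceilings in the PCIe spec (75 W from the x16 slot, 150 W per 8-pin, 75 W per 6-pin). A quick sketch of the budgets being discussed:

```python
# Power budget ceilings for a GTX 1080, per PCIe connector spec limits.
pcie_slot_w = 75       # x16 slot, per spec
eight_pin_w = 150      # one 8-pin PEG connector, per spec

reference_w   = pcie_slot_w + eight_pin_w        # 225 W: the reference card's ceiling
aftermarket_w = pcie_slot_w + 2 * eight_pin_w    # 375 W: an 8+8-pin custom board
print(reference_w, aftermarket_w)
```

An 8+8-pin board raises the ceiling from 225 W to 375 W, which is why the aftermarket cards are the interesting ones for water cooling.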
 
While one poster mentioned that most of us are already beating the 1080, keep in mind most of us are running SLI and comparing it to a single 1080.

Nice info on the SLI bridge, and I agree with everything you've stated except for the above. Take a look at some of the solo-card 980Ti scores in this thread - there are quite a few scoring well over the 5000 mark. Granted, an overclocked 1080 is going to beat that by a little, but a stock 1080 isn't. (Scores I've seen peak at just under 5000.) It seems the 980Ti's memory bandwidth helps it out a bit at 4K resolution. Not knocking the 1080 - it's a great piece of kit - it's just that, compared to a nicely overclocked 980Ti, it isn't something that makes me want to rush out and buy it. Going to wait for the Ti variant... or at the very least, a non-reference board that can be put on water and fed adequately to make the OC potential worth pursuing. As is, going after a 10-15% bump with all the expense and hassle of upgrading just isn't worth it... to me at least.
 
Ya, I wish I could find that article again so I could post a link. I'm not having any luck. But it was definitely an Nvidia rep/spokesperson saying that any of the newer LED-style SLI bridges that have connectors for both tabs on both cards will support the new high-bandwidth SLI (for 2 cards obviously, not 3 or 4). I was really happy after reading it, since I think the new Batman SLI bridge from Nvidia looks terrible, and I didn't want to have to wait for Asus ROG to make a nice-looking new one.

The same article also did an SLI test with a single-tab SLI bridge (i.e., one not supporting the new high-bandwidth mode) vs. an LED double-tabbed bridge using the new high-bandwidth mode. They found that it didn't make any difference until you got into 4K gaming, and then having the double-tabbed/high-bandwidth bridge "smoothed out the dips (i.e. low points) in frame rates". It didn't increase the max frame rate from what I remember, but just helped mitigate the low points.

Again, I can't find the article today, but I'll keep looking.

I agree with you on waiting for a non-reference model that packs an 8+8 or 8+6 pin power connector, putting it under water, and getting 2100+ out of it. Here is a blurb from the German article that tested the 1080 with a better air cooler and still found it's power-limited, not temp-limited, when it comes to overclocking:
NVIDIA GeForce GTX 1080 tested with aftermarket cooler | VideoCardz.com
 
Update:

Found the info on "old" dual tab SLI bridges supporting the new high bandwidth SLI on the 1080/1070:

The Display Pipeline, SLI And GPU Boost 3.0 - Nvidia GeForce GTX 1080 Pascal Review

"Nvidia adds that its SLI HB bridges aren’t the only ones able to support dual-link mode. Existing LED-lit bridges may also run at up to 650MHz if you use them on Pascal-based cards. Really, the flexible/bare PCB bridges are the ones you’ll want to transition away from if you’re running at 4K or higher. Consult the chart below for additional guidance from Nvidia:"
 
Some "good" news for those of us who want a 1080 under a custom loop but DON'T want single 8-pin reference cards.

This is straight from EK tech support (I asked them about this):

"thank you for contacting us.

We will make full cover water blocks and backplates for some of custom GTX 1080 but at the moment we can't confirm for which versions we will decide.

Best regards,
Grega"

So, to me, it sounds like we will (soon?) have the option of a double 8-pin 1080 and be able to slap an EK full-cover block and snazzy-looking backplate on it. Just a question of how long. For those that don't want to wait, however, the EVGA GTX 1080 Hydro Copper will definitely fit the bill:
Custom GeForce GTX 1080 Roundup | VideoCardz.com

I will grab two of them if EK fails to deliver in a reasonable amount of time. I prefer the snazzy looks of EK blocks and plates to the really plain black EVGA models, so I'll wait a bit.
 
First run with the 1080 SLI that just arrived. I'll be re-benchmarking once I get the new SLI HB bridge. This score is at default, no overclocking whatsoever. Score: 8979

NVIDIA GeForce GTX 1080 video card benchmark result - Intel Core i7-5960X,ASUSTeK COMPUTER INC. RAMPAGE V EXTREME

*EDIT*

OK, now I have a valid score, and it's actually better than the previous one (9262), even with everything at stock. Probably due to all the unplugging and changing of USB ports when I restarted the computer; I only had time to run one Fire Strike test.

NVIDIA GeForce GTX 1080 video card benchmark result - Intel Core i7-5960X,ASUSTeK COMPUTER INC. RAMPAGE V EXTREME
 
With the 1080 SLI that just arrived. I'll be rebenchmarking once I get the new SLI HB Bridge. This score is default, no overclocking whatsoever. Score: 8979

NVIDIA GeForce GTX 1080 video card benchmark result - Intel Core i7-5960X,ASUSTeK COMPUTER INC. RAMPAGE V EXTREME

It appears that a stock (no OC) 1080 SLI setup is performing pretty much on par with an OC'ed 980Ti SLI setup, then... at least where Fire Strike Ultra 4K is concerned. I highly doubt the newer SLI bridges will produce any further gains. (At least not in a 2-card SLI setup.)

I have a feeling that at 4K resolutions, the smaller bus on the 1080 is starting to hold it back a bit and the wider bus on the 980Ti is helping it keep up.
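The bus hunch checks out on paper: theoretical memory bandwidth is bus width times effective data rate divided by 8. These are the published reference-card specs as I recall them, so treat the figures as approximate:

```python
# Theoretical memory bandwidth in GB/s: bus width (bits) * data rate (Gb/s) / 8.
gtx980ti_bw = 384 * 7e9 / 8 / 1e9    # 384-bit GDDR5 @ 7 Gb/s   -> 336 GB/s
gtx1080_bw  = 256 * 10e9 / 8 / 1e9   # 256-bit GDDR5X @ 10 Gb/s -> 320 GB/s
print(gtx980ti_bw, gtx1080_bw)
```

So despite the 1080's faster GDDR5X, the 980Ti's wider bus actually gives it about 5% more raw memory bandwidth, which would matter most at 4K.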
 
With the 1080 SLI that just arrived. I'll be rebenchmarking once I get the new SLI HB Bridge. This score is default, no overclocking whatsoever. Score: 8979

NVIDIA GeForce GTX 1080 video card benchmark result - Intel Core i7-5960X,ASUSTeK COMPUTER INC. RAMPAGE V EXTREME

What bridge are you using? If it's a newer "light-up" one then it is better than the old O.G.-style.

It appears that a stock (no OC) 1080 SLI setup is performing pretty much on par with a OC'ed 980Ti SLI setup then... at least where Firestrike Ultra 4K is concerned. I highly doubt the newer SLI bridges will produce any further gains. (At least not in a 2 card SLI setup)

I have a feeling that at 4K resolutions, the smaller bus on the 1080 is starting to hold it back a bit and the wider bus on the 980Ti is helping it keep up.

I am thinking the HB SLI bridges will matter in 2-way SLI - especially because NVIDIA now no longer officially supports 3-way and 4-way SLI (it's about time, both are near pointless unless you're a pro bencher).
 
What bridge are you using? If it's a newer "light-up" one then it is better than the old O.G.-style.


I am thinking the HB SLI bridges will matter in 2-way SLI - especially because NVIDIA now no longer officially supports 3-way and 4-way SLI (it's about time, both are near pointless unless you're a pro bencher).

I have the old-style flex-strip SLI cable. The upgrade should definitely help here.
 
NVIDIA GeForce GTX 1080 video card benchmark result - Intel Core i7-5960X,MSI X99S XPOWER AC (MS-7881)

5,415

Intel i7-5960X CPU @ 4GHz (Corsair H110 280mm Liquid CPU cooler with 2x Noctua NF-A14 iPPC-3000), G.SKILL Ripjaws 4 series 32GB 2800MHz DDR4, EVGA NVIDIA GeForce GTX 1080 FE (+200 core, +150 mem), Samsung 950 Pro SSD (512GB), Samsung 850 Pro SSD (1TB), and a 4TB Western Digital Black WD4003FZEX 7200RPM HDD on an MSI X99S XPOWER AC motherboard inside of a Cooler Master CM Storm Trooper full tower – powered by a Corsair AX1500i power supply (1500W; fully modular) with braided cables.
 