bigdogchris
Fully [H]
Joined: Feb 19, 2008
Messages: 18,706
Do we know if the 1070 is going to come with the back plate?
Yes they are allowed to add circuitry/VRMs/additional power inputs... However, if Nvidia is staying the course they've been on for the last few years, the cards will still be voltage locked.
> that's right.. but I think this time, with all the talk in the article about the relationship between clocks and fan speed, it would have been worth maybe another 30 minutes for Brent to test how the card behaves with an aggressive fan profile, or at 100% fan speed. 30 minutes, or 60 for both tests, would have kept us all happy. Instead we now have to go to other review sites to see numbers with an improved fan curve or 100% fan speed.

Has anyone found reviews testing at increased fan speeds (that also note decibel levels, please)?
> Well, at pcgameshardware.de it is averaging 10% faster at 1440p than an MSI Lightning 980 Ti, and just under 15% faster in AC Unity and DA: Inquisition. That is more than I expected. I use the Chrome translation option in settings to read the site: "Geforce GTX 1080 im Test: Der erste 16-nm-König mit 2 GHz im OC-Betrieb" ("GeForce GTX 1080 review: the first 16 nm king, hitting 2 GHz overclocked"). Cheers.

They are using a custom cooler on it, btw: the Accelero Xtreme IV.
So a lot of people suddenly forgot that this card is replacing the GTX 980, not the GTX 980 Ti, which is already a good chunk faster than the GTX 980. We 980 Ti/Titan X owners will get a proper replacement for our cards later. At this point in the market, Nvidia doesn't have any reason to put out the strongest possible GPU. Just as they did with the GTX 680, GTX 780, and GTX 980, they want to put out something faster and milk some money. This time it is even easier because they don't have any competition, and it could get even worse since AMD has stated its focus is on the mid-range/mainstream market, so the performance crown is free for Nvidia.
The GTX 980 was praised at launch for being only marginally faster than the then-current 780 Ti. Now Nvidia launches a 1080 that is 25-35% faster overall than the current 980 Ti, and suddenly every 980 Ti owner's butt hurts and they call it a major disappointment. That's a joke. This card will be at least $100 cheaper in 6-8 months when the new cards and new kings arrive, same as the GTX 780 and GTX 980 before it. History is cyclic.
Would like to ask Kyle what his overall opinion would be on 1080s vs. SLI Titan Xs, especially with the experience you had with the SLI Titan Xs at 4K that you wrote about over time. I've got Titan Xs in SLI and a BenQ 3201 4K, running with a 4790K OC'd. Overall I have enjoyed the system a lot the past year, with just occasional aggravations like the Fallout 4 launch having no 4K profile, etc.
I keep my room temp at 70 year round (the wife has finally adjusted), have reference blowers exhausting the air of the cards out of the case (found better results in SLI with reference blowers), and my Titan Xs overclock quite well with a fairly aggressive fan profile.
While I'd like to get rid of SLI, it looks like one 1080 isn't quite there for 4K performance with zero worries, or am I wrong about that? Would you stick with Titan Xs in SLI, deal with the SLI headaches, and wait for the 1080 Ti or Titan version of it, or do you expect it to be another year before a single-card solution can handle 4K at 60 fps safely?
I could sell mine now and replace them with 1080 SLI; just curious whether it seems worth the hassle or whether I should wait.
> They are using a custom cooler on it, btw: the Accelero Xtreme IV.

That was not just for the 2 GHz core clock?
I'm sorry if this has been answered and I missed it but is this card still restricted to 8 bit colour output like previous gen Geforce cards or has Nvidia lifted the lock for new games that can make use of a full 10 bit colour range and HDR?
So for those who have 980 Tis, is this worth considering an upgrade?
> Pretty sure it's still 10 bit rendering and 8 bit output or whatever.

On page 2 of the review, at the end of the page, it states it can output 12-bit color for 4K HDR.
Does the theory of AMD-optimization for AoTS have any supporting evidence?
On page 2 of the review at the end of the page, it states it can output 12-bit color for 4K HDR.
HDR display support is going to be a big topic this year, especially when AMD Polaris is launched. HDR displays are coming, and they are poised to give us a whole new perspective on image quality.
The GeForce GTX 1080 is capable of 12b color (BT.2020 wide color gamut), SMPTE 2084 (Perceptual Quantization) and HDMI 2.0b and DisplayPort 1.4 for 10/12b 4K HDR. Pascal introduces new features such as 4K@60 10/12b HEVC Decode for HDR video, 4K@60 10b HEVC Encode for HDR recording or streaming, and DisplayPort 1.4 ready HDR Metadata Transport. GeForce GTX 1080 is capable of 7680x4320 @ 60Hz. It will certainly be interesting to see how AMD Polaris compares in display technology and HDR support.
> Pretty sure it's still 10 bit rendering and 8 bit output or whatever.

It can't be 8 bits for HDR. The various HDR formats require more than 8 bits.
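The bit-depth arithmetic behind that point is simple to sketch; a minimal Python illustration of how many code values each output depth gives per channel:

```python
# Levels per color channel at each output bit depth. HDR transfer functions
# like SMPTE 2084 (PQ) need the extra code values to cover a much wider
# luminance range without visible banding.

def levels_per_channel(bits):
    return 2 ** bits

for bits in (8, 10, 12):
    n = levels_per_channel(bits)
    print(f"{bits:2d}-bit: {n:5d} levels/channel, {n ** 3:,} total colors")
```

8-bit gives only 256 steps per channel; 10-bit quadruples that and 12-bit gives 4096, which is why the HDR formats listed in the review require more than 8 bits.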
> With such a die change I would have expected a cooler chip... Something seems off to me a bit.

Smaller die + similar power consumption = hotter chip. There's less surface area of silicon in contact with the heatsink to pull heat through.
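That reasoning is just heat flux: watts divided by die area. A rough sketch using commonly cited die sizes and TDPs (GM204 at roughly 398 mm² / 165 W, GP104 at roughly 314 mm² / 180 W; treat these figures as assumptions, not measurements):

```python
# Rough power-density comparison: a smaller die dissipating similar power
# pushes more watts through each mm^2 of silicon under the heatsink.
# Die sizes and TDPs below are commonly cited figures, used as assumptions.

def power_density(tdp_watts, die_mm2):
    """Average heat flux through the die surface, in W/mm^2."""
    return tdp_watts / die_mm2

gm204 = power_density(165, 398)  # GTX 980  (Maxwell, 28 nm)
gp104 = power_density(180, 314)  # GTX 1080 (Pascal, 16 nm)

print(f"GTX 980:  {gm204:.3f} W/mm^2")
print(f"GTX 1080: {gp104:.3f} W/mm^2")
print(f"Pascal pushes ~{gp104 / gm204:.0%} of Maxwell's heat flux per mm^2")
```

On these numbers the 1080 concentrates roughly a third more heat into each square millimetre than the 980 did, which is consistent with the higher temperatures despite the node shrink.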
I think the conclusion is odd. The benefit of async compute is discounted because it doesn't enable a last gen 28nm card to beat the latest 16nm card? This seems to be an impossibly high bar to set. The question is not whether Fury X with async compute could match Pascal - I don't think anyone expected it would - it's whether Polaris/Vega will show similar gains with async compute that its predecessor does. And whether Pascal's reduced negative gain versus Maxwell is the result of pure brute force or architectural changes that might eventually lead to Pascal showing gains with DX12/async.
The performance speaks for itself; the GTX 1080 is a lot faster in AoTS, as the graphs show. It provides the best gameplay experience, and that is a fact. The wording I used and the analysis are correct: with Async Compute, Fury X is not faster than the GTX 1080 in this game. We don't discount it. We directly tested it, showed the results, and concluded that Fury X with Async is not faster than the GTX 1080 in this game, which is a true statement. I cannot speak to the performance of unannounced and unavailable hardware.
I still think this game and Async Compute are getting blown out of proportion. Async Compute is not the most important graphics feature ever to exist in a GPU. There are other methods to optimize performance that can be used. Async is just one feature out of many; it isn't even required for DX12. People are treating Async Compute like it's the holy grail and Indiana Jones must find it. It's in one game, one, and no one even plays the game as a game, just as a benchmark. Its future use in games is unknown, and from what developers have been saying, probably not a feature many are going to end up using.
So no tests were run with a custom fan profile that took it to 100% by say 70 degrees? That might have made quite a difference on being able to maintain a higher boost clock.
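The profile being asked about is just a temperature-to-duty mapping of the kind you would set in Afterburner. A minimal sketch with illustrative (untested) breakpoints that pins the fan at 100% by 70 C:

```python
# A piecewise-linear fan curve that ramps to 100% duty by 70 C, the kind of
# aggressive profile suggested above. Breakpoints are illustrative only.

CURVE = [(30, 30), (50, 55), (60, 80), (70, 100)]  # (temp C, fan %)

def fan_speed(temp_c):
    """Linearly interpolate fan duty between curve breakpoints."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return 100
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_speed(65))  # interpolates between the 80% and 100% breakpoints
print(fan_speed(75))  # pinned at 100% above the last breakpoint
```

Whether holding higher fan speeds would actually sustain a higher boost clock (and at what noise cost) is exactly what the posters here want measured.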
DX12 performance is all over the board.
GamersNexus got crazy good numbers from both Maxwell and Pascal.
NVIDIA GeForce GTX 1080 Founders Edition Review & Benchmark
Why do people care so much about a backplate? It doesn't do anything.
> Are you suggesting you think the Nvidia reference card (Founders Edition) will outperform AIB manufacturers' reference cards with in-house blowers (i.e., that crappy blower MSI has put out) when it comes to overclocking?

Heat and power are big components of overclocking. I would suggest we will see AIBs with coolers that work better than the FE, with power circuitry that may be a bit more robust.
> While I'd like to get rid of SLI, it looks like one 1080 isn't quite there for 4K performance with zero worries, or am I wrong about that? Would you stick with Titan Xs in SLI, deal with the SLI headaches, and wait for the 1080 Ti or Titan version of it, or do you expect it to be another year before a single-card solution can handle 4K at 60 fps safely?

Titan X SLI is still the tits.